Ruminating over brain damage and other fun things on a Memorial Day weekend


Fighting off a bug this weekend and catching up on my reading. So I’ve been going through a big stack of cognitive/neurology newsletters I get…why I get them I don’t recall. Brain in the News. Interesting stuff, though, all kinds of articles. There’s a lot of talk about brain injuries suffered by Iraq veterans (this was 2008), particularly due to concussions caused by roadside bombs. Not much good news. Very profound aftereffects, with a lot of changes in memory, skills and, most fascinatingly, behavior–victims can get quite nasty, vile and dangerous even. Scary. It is astounding how much damage can be caused to the brain by concussion alone. Which got me to thinking….

I wonder, during the First World War, especially on the Western Front, just how many concussions were inflicted upon the soldiers subjected to all that shell fire? Millions of shells were fired. Hundreds of millions*. Falling in a very small area, upon enormous numbers of men, for four years. Even accounting for all the duds and the non-explosive gas shells, that is an incredible number of explosions in a small area upon a lot of guys over a long period of time. Who knows how many of those soldiers were concussed, and how often? There must have been, then, enormous numbers of men in postwar Europe who suffered from the aftereffects. So…did those aftereffects collectively have an effect on postwar Europe? Did those hundreds of thousands–maybe millions–of concussed soldiers with all those classic symptoms of paranoia and hostility and rage and depression and confusion–were they in numbers sufficient to actually alter the social and political atmosphere of societies and nations in the postwar years? Pre-war and postwar Europe were much different places. Post-WW1 Europe was much more violent, rent by extremism and sociopathic political movements. And I wonder if any of that mass brain damage among the men of Europe in the twenties and thirties somehow made Nazism possible? Would normal people ever have fallen for it? Hell, was Hitler himself a victim of post-concussion effects…is that what made him so evil? He spent time in the trenches, exposed, and was nearly blown to hell a couple times. A lucky bastard…but did he suffer aftereffects?

Just an idea. But the Europe of 1918 to 1945…man, that is one inexplicably berserk place.

Another horrible phenomenon that fascinates me is all these incredibly violent ‘armies’ of boys in Africa. They seem to be raised from the masses of orphans left by AIDS. There must be tens of millions of orphans in Sub-Saharan Africa, and in many places so many adults have died or are dying that the social structure is simply not capable of taking care of them. Hence sociopaths collect them, Manson-like, into armies, and as they’re just kids they make ideal killers. Of course, if HIV becomes less lethal as the more virulent strains kill off their hosts, then, like malaria, it may eventually just incapacitate while leaving hosts capable of bearing and raising children. The supply of AIDS orphans would drop, and those armies would disappear. Funny how that happens.

It’s interesting that the Spanish flu pandemic had a mortality demographic somewhat similar to AIDS in that it killed almost exclusively people in roughly the 18 to 35 year range (I can’t remember the exact age range, but it was young adults. The reason for this, incidentally, is that people above that age and kids below it had a resistance they’d developed after exposure to a flu strain that the vulnerable group in between missed–or perhaps their vulnerability came from a strain of flu they’d had as children.) The flu killed tens of millions worldwide in 1918-19, millions of them in Europe. I am guessing, then, that it left millions of orphans, especially combined with so many fathers having been killed in the war. It’d be interesting to see how many young Nazis in the early thirties were orphans, whether the incidence was higher.

Anyway, a little something to ponder on Memorial Day I guess. War and its leftovers seem to linger long after the shooting stops.


Verbs not adjectives

[This was an email from 2010 that attempted to explain what I had drunkenly been describing to a friend at one of our parties. I failed. But she was terribly nice about it. I found the email later and turned it into a blog post. Then I realized I hadn’t discussed metaphors at all–which I always advise using sparingly because the reading brain tends to trip over them–and worse yet, couldn’t discuss them. So I decided to do more reading up on metaphors. Now, reading this again, I don’t see why I thought this piece failed, as it works fine without the metaphor discussion. Whatever. This is what happens when college dropouts try to think like smart people.]

Ya know, L., I just remembered we were talking at that party about my verbs-instead-of-adjectives thing. Here’s the LA Weekly piece I was describing:

Lockjaw and Prez made him pick up the saxophone. This was New Orleans. There was a teenaged “Iko, Iko”, the very first. By ’63 he’s in L.A., playing Marty’s every night, and players—Sonny Rollins, everybody—dropping by, sitting in. Steady work with Basie and the Juggernaut and Blue Mitchell. Twenty years with Jimmy Smith. A million sessions for Motown and Stax, and first call for a slew of singers—that’s where you refine those ballad skills, with singers. Live he slips into “In A Sentimental Mood” and everything around you dissolves. There’s just his sound, rich, big, full of history, a little bitter, maybe, blowing Crescent City air. He gets inside the very essence of that tune, those melancholy ascending notes, till it fades, pads closing, in a long, drawn out sigh. You swear it’s the most beautiful thing you’ve ever heard, that song, that sound, and you tell him so. He shrugs. “It’s a lifetime of experience,” he says, then calls out some Monk and is gone.*

How that piece happened was I was asked to do a pick on an upcoming Herman Riley gig at Charlie O’s. He was one of the great tenor saxophonists and yet a virtual unknown. I called him up for a few comments before writing. He spilled for a half hour, his whole life. It was overwhelming. I didn’t dare cut him off. He was one of my heroes. So I just dashed off some notes and then pulled them together. I had 200 words, tops, to work with. I probably had 400 words first draft. Kept winnowing. Reducing. Down to the verbs. It’s nearly all verbs, action. I had a real good friend at work I used to instant message all the time, and I became fascinated with how language was used in our messages. By how brief a message could be if reduced to the verbs, and how much of an impact it could still have. It could be quite visceral. I was doing a lot of heavy thinking about neurolinguistics then, how language works mechanically in the brain. It seemed to me that action was much more effective than description, and could get across the same info. Furthermore, when you use verbs instead of adjectives, the brain—by means of mirror neurons—automatically pictures what is being described without any necessary descriptive context—it somehow fills all that in. That means that virtually no description is needed beyond bare hints. I never say where it is I am seeing him play here, even though it is based upon an actual event (a Charlie O’s gig I had just seen). But even so, invariably the response from readers was “I felt like I was there”. Where exactly? It doesn’t matter. All that matters is the action.

The brain is more powerfully affected by action than by anything else in language. It sees something, or even reads about something, and the exact same neurons that are used to actually perform the action are stimulated. I just picked up a glass of water. In your mind the exact same neurons you would use to pick up a glass of water fired off, as if you were picking up the glass yourself and not reading that I had. No matter: whether you do it, watch it, or read about it, the effect is the same. And that is why narratives that are based on action instead of description, that describe movement, things taking place in time, are so powerful. **

That being said, you have to think about perspective, and not from the point of view of the writer but of the reader. I believe it works this way: if you describe something in first person, the reader’s mind has to visualize the action, then interpret it as you taking the action, then reinterpret that action as if they were doing it themselves, then imagine you doing so, etc., etc. That’s a lot of steps. Third person works the same way, pretty much. Second person, in English anyway, is impossible to pull off—you do this, you do that, etc. Very cumbersome. Our language is not grammatically designed to pull that off easily (but that’s another lecture….) So I use an implied second person. Everything I do is first person, but I remove myself (by not using the first person pronoun much) and let the reader get the feeling that it is actually he who is doing/seeing/hearing what I describe. Again, this shortens the steps necessary for the brain to interpret what it is reading into understanding. I’ve noticed that the fewer steps required, the faster the language is retrieved by the brain and the more powerful the impact. And good writing is all about impact. You want to move people, you have to increase the visceral impact of the prose. And to do that you need to think about how it is that the brain turns the words you write into thoughts it can visualize. It’s verbs, baby.

The brain is designed for verbs. It sees in verbs. It has instantaneous perception in verbs. Adjectives take extra thinking. You don’t want extra thinking. You want your words to turn themselves into those units of perception that lie beneath language, that existed before language, and you want that process to occur as quickly as possible. Think of porn. The sexual excitement people get viewing it is not literary. It’s older than that. As is love. And just about anything else important. I mean the hell with language. You have to get beneath it. You have to aim for the centers beyond the brain’s language centers, because that is where the feeling is. That is where you move people. And the most direct way to get there is action, verbs.

It’s interesting to note that vision was initially a matter of detecting motion, indeed a matter of detecting the change in light. If a primitive animal was hiding and suddenly all was light, it meant something might have exposed it in order to eat it. Or if the light suddenly changed to dark, it meant something blocking the light might be there to eat it. Either implies movement. Go forward half a billion years or so and amphibians detect movement and not much else. You can see this in a frog to this day. A fly sits on a leaf three inches from the frog and the frog can’t see it. The fly takes flight and zap, it’s the frog’s lunch. Reptiles can see more than amphibians. Mammals more still ***. But the fundamental basis of recognition remains movement.

Adjectives reflect a much more sophisticated vision and awareness of what is being seen. Adjectives require observation and analysis. There is no equivalent of mirror neurons for the kind of information adjectives describe. You watch a tennis player serve the ball, and the mirror neurons fire in exactly the same way in your brain as they do in the tennis player’s brain even though you are not actually moving. They reflect the motion. But the neurons that fire to let you know that you are seeing a tennis player with red hair, a blue shirt and a green tennis ball do not fire in the same way the tennis player’s do. The only automatic understanding of what you see comes from the parts of the brain that detect motion, that is, action. Those have been firing off in brains for hundreds of millions of years. They are part of the fundamental infrastructure of the brain, much as the hypothalamus, deep at the base of the brain, controls the fundamental 4 F’s (feeding, fighting, fleeing and mating) of human and mammalian behavior, and reptilian and amphibian and fish behavior****. That is, going back over half a billion years (though way back then it was not necessarily a hypothalamus per se in other animals, I believe, but the things that evolved into our own hypothalamus). Mirror neurons are also an ancient part of the brain. Information relayed through mirror neurons is understood instantly. Whatever you write that fires them off will be much more quickly and powerfully–indeed viscerally–responded to by the reader. In fact, you have only to give the barest amount of descriptive detail, since the brain seems to automatically fill the rest in. If you say you walk into a room and sit in a chair, the reader’s brain automatically seems to provide all the details it needs to understand the scene. The reader gets that “it feels like I’m there” feeling. And nothing gets a response from readers more than prose that makes them feel as if they are part of the action.
Makes them feel they are the one actually walking into the room and sitting in a chair. Or sitting in a bar and seeing and hearing a saxophone player.

Plus you should spell good.



Apes with extraordinary cognitive abilities

Once you realize that every single human being there is has inside their skull the most complex thing that we know of in the entire universe, it gets a little weird. There are over 7 billion of these brains out there right now, all over the planet, each vastly more complex in its interconnectedness than the universe it exists in. Dig these numbers: a human brain has about 86 billion neurons, and roughly ten times that many glial cells, or upwards of a trillion. Each of these neurons fires five to fifty times a second, and each has up to ten thousand connections with other neurons. The estimates for the total number of synapses (i.e. the connections) between our neurons run from 100 trillion to 1,000 trillion (or one quadrillion). These synapses connect via dendrites (little filaments that grow from the surface of a neuron), and there are more dendrites than a neuron uses at any given time, so the potential number of connections could be a billion billion (or one quintillion). That difference between the maximum number of actual synapses (one quadrillion) and potential synapses (one quintillion) means the brain hasn’t come close to maximizing its capacity. And it means that the brain will continue to grow in complexity (and size). The human brain currently uses but a tiny fraction of its synaptic capacity. There simply isn’t enough to think about yet to fill it up.
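For the curious, the synapse arithmetic checks out on the back of an envelope. A quick sketch in Python (ballpark figures only, nothing measured):

```python
# Rough sanity check of the synapse estimates quoted above.
neurons = 86_000_000_000            # ~86 billion neurons
connections_per_neuron = 10_000     # up to ~10,000 synapses each

synapses = neurons * connections_per_neuron
print(f"{synapses:,}")              # 860,000,000,000,000 -- 860 trillion

# That lands inside the 100 trillion to 1,000 trillion (one quadrillion)
# range of estimates cited for actual synapse counts.
low, high = 100 * 10**12, 1000 * 10**12
print(low <= synapses <= high)      # True
```

Note the quintillion figure for *potential* connections requires far more unused dendrites than the ten-thousand-per-neuron figure; the point of the sketch is only that the quadrillion-scale estimate for actual synapses is arithmetically plausible.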

83% of your brain is cerebral cortex, the thing that makes you you and people people. That cerebral cortex has grown at an astonishing speed in evolutionary terms. In just a couple million years it has expanded from chimp size (about a pound) to what it is now (about three pounds). Indeed it has grown so fast that it developed the folds you see in a brain in a jar, in order to maximize the number of neurons that could be crammed into the area available inside the skull. These folds increase surface area inside a limited space (or skull size), which increases the number of neurons and synaptic connections between them. The size of our skull is limited by the dimensions of the human female’s birth canal. Indeed, the difficulty of human birth is due entirely to the size of the Homo sapiens sapiens cerebral cortex. Were the woman’s pelvis able to widen further (it can’t, or at least natural selection isn’t capable of widening it at the same rate as a continuously expanding skull)–or were it detachable like a snake’s jaw (it isn’t)–the human skull might be even larger, since apparently skull size is one of the things that can change quickly in our species through time (look at a collection of us and our predecessors to compare).

Now about those glial cells. There can be ten times as many of them because they are so much smaller than neurons. We used to think all these glia simply held neurons in place–it is vital that neurons remain in place to keep the synapses firing correctly, since neurons are not actually fused together at a synapse but just close enough for an electro-chemical signal to cross between them. But these glial cells also help to provide the neurons with nutrition, such as oxygen or minerals like the potassium used in neurotransmission, which neurons exhaust quickly. And glia also help with repairs and supply the myelin which, like the rubber around a wire, insulates the signal running from one neuron toward the next. But now it’s also known that much of the brain’s incredible plasticity is due to glial cells, and that they play a part in communication (and even breathing) and who knows what else. Glial cells, like everything else about the brain, just keep revealing more complexity.

And the complexity of all this is so vast that we are incapable of actually visualizing it. We fall back on huge numbers like quadrillion and quintillion, or compare it to the relative paucity of complexity in the known universe. What we have in our own skulls, and which is our very essence, we can barely understand. But every person you see has something in their heads that is more astonishing than the entire known universe. I can tell you that without truly comprehending it myself, because it is not comprehensible. We can understand it as a fact, an abstraction, but not actually appreciate just what it means. Like how we know what infinity is, but we can’t truly comprehend what it is. Our brains have myriad capabilities beyond our capacity to understand, because our brains are smarter than we are intelligent. After all, we are still just apes. Apes with extraordinary cognitive abilities, but still apes.

Thinking about thinking and vice versa

“A human being might be more a verb than a noun” a friend said, discussing consciousness. A great line, that. There may not be a mind as much as a process, he explained. I’d been reading that theory too. What we think of as us is just the result of a lot of various brain processes. There was an air of mystery to it.  There is no there there, someone else chimed in. I liked that line, too. But neither it nor the verb line quite did it for me. But it got me to thinking, and then to writing, and writing, and writing. Rather than sleeping, sleeping, and sleeping, which would have been a much better idea, running on fumes as I am.

But when discussing the nature of consciousness, I just don’t think there’s a difference between it being a process or a thing. It’s still all neurology, which is just a way of saying physiology. Except we can’t say physiology until we actually know, to a fairly definitive degree, which neurons are doing what to allow consciousness–aka the human mind–to happen. But you can say the same thing about climate. There’s a whole bunch of things that combine in all kinds of varied ways to create what we call “climate”. But very few are a complete mystery to us anymore. They can all be explained and modelled. Debated, yes, and various models drawn up, but they are all models based on data. We’re not there with the mind yet. We know that it’s all neurology, we just don’t know exactly what does what in order to make us conscious. But once we do, we will no longer think of it as a verb. We’ll call it a process, a thing. It’ll be a noun.

But as I said, we aren’t there yet. And since we aren’t there yet, we tend to ascribe to it a sense of mystery that makes it more than a mere thing. But that sense of mystery is just a state of ignorance. That is, we don’t know how it works yet. Once we do, it’ll no longer be a philosopher’s quandary, a mysterious unknowable thing, a process that can’t be reduced to a noun…it’ll be nothing more than a process that can be described like any other process. Nothing metaphysical about it.

Neurology and cognitive science have advanced at such an incredibly prodigious rate since the 1980s that it’s difficult for non-specialists to even conceive of what all the new knowledge means. There is so much known about us now, about what our neural networks do, about how so much of our behavior can be located in places in the brain that can literally be seen and touched…and yet we still don’t even have ways to understand it except in neurological terms. Yet we really are the result of patterns of electro-chemical responses in our neurons and glia. If your electrolytes run low, a neuron can’t carry the charge it’s receiving from another neuron to the next. No potassium, no thought. My wife’s heart stopped that way, and with it, her brain. (Both were revived, thankfully.) Or too much potassium and a neuron begins firing too many adjacent neurons and you can have epilepsy (which is why I have to be very careful with bananas and other high-potassium foods.) And while these very simple chemical processes are at the root of human consciousness, it’s the astonishingly complex lattice of interactions across 86 billion neurons and hundreds of trillions of possible synaptic connections that creates the thing we call consciousness…a process that as of now is simply too vast and complex and variable for us to really understand as a whole.

Which is a shame, because until we can conceptualize that whole, consciousness–what we are–will be an unfathomable mystery. But we couldn’t discuss the universe a century ago like we do now. Today it’s not only the subject of documentary series, where physicists describe cosmology and even theoretical physics for laymen, but those laymen, millions of them, are able to understand what is being explained to them. They can conceive it. The human brain can, by now, turn all that physics into a model it can see in its mind’s eye. We can’t yet do that with the neurology of consciousness. I like to think we are at the same stage now with our own neurology as we were a hundred years ago with conceptualizing evolution. People then knew that there was such a thing, but they had to think of it as a mysterious process called “evolution”. Now we know how it works, and evolution isn’t a mysterious process at all. It’s something we can conceptualize so readily that unless we’re religious, there’s no mystery to it at all. It’s a Wikipedia entry.

We’re a couple decades away from that as far as human consciousness goes. But it too will stop being a mystery, will stop being so difficult to conceive of. I’m not saying it will all be figured out by then, but it will be seen as a noun, a thing, something that can be explained as a physiological process. By then there will no longer be philosophers involved in the discussions, any more than philosophers are involved in oncology or genetics or climate science now. At some point science becomes mechanics, a study of process.

As far as the mind and consciousness go, we’re not at the point yet where the process has been defined or even discovered. It’s like a 16th century map with big empty spaces unhelpfully labeled terra incognita. And for laymen like us, several steps removed from the state of the science, we’re left in the dark, and almost always a couple years–even a few years–behind the research. And the competing theories are still being battled out in the journals; consensus is far off. But I think that in a generation at most we’ll be working with a model of human consciousness just like we have a working model of evolution today.

By then we’ll no longer be asking if there’s a there there, or thinking that there might not even be a source of consciousness, or debating whether consciousness is a noun or a verb. All our discussions, and certainly an essay like this, will seem terribly quaint. Consciousness will be understood as a process; we’ll know to a much greater degree just where in the brain it’s located and how it happens, and it won’t ruin anything. I think there’s a fear that if we discover the actual mechanics of consciousness it’ll ruin everything somehow. That we need the mystery. But understanding evolution didn’t ruin everything, nor did the discovery of our tiny, insignificant place in the universe. What we are, what makes us people, makes us cognitive beings, is based on much more than theories of mind, universe or genetics. And none of that will be changing soon. After all, we invented the Internet (the only human creation that comes anywhere near the complexity of the human brain) and then filled it with porn. If baboons had invented the internet they would have filled it with porn, too. And if baboons could they’d have TMZ. We’re all primates. Knowing exactly what consciousness is won’t change that a bit.

It’s now 6 am and I’ve spent the whole night writing about thinking. I began the night thinking about sleeping. Which I’d better try to do a little bit before it’s too late.

Neurons of the neocortex–your consciousness is in there somewhere. Photograph by Benjamin Bollmann, from Sebastian Seung’s


Your brain on homophones

I know very well [posted a friend of mine], really by instinct, the proper placement of apostrophes and other punctuation and the usage of words like “there, their, they’re”, “you your you’re” and so on and yet I get to typing so fast that my brain is constantly pulling the wrong one out of my hat. I get so embarrassed when I discover these mistakes later.

I love when that happens, actually, because I think it shows so much about the brain and language.

This is how I think it happens: we type by listening to the inner voice speaking in our heads. Language was strictly spoken for a hundred thousand years or so before we began reading it, and very few people were reading and writing it at all until the last century or two. And typing wasn’t invented until about 150 years ago. When we direct our fingers to type we are actually listening to the narrative voice running through our heads, and then some center of the brain in turn directs the actions of our fingers to type what it is “hearing”…an impressive task, as typing is something much more complex than writing with a pen or pencil (or with a stylus on a clay tablet, as they did when writing was brand new). We sort of type the way a pianist plays, hearing the music in his head and then coordinating his hands and fingers to play it. Of course, in music you can harmonize and use all the fingers at once, but writing has no harmonies (unfortunately), so it’s more like playing trumpet: one note on a trumpet equals one letter on a keyboard. (Both language and speech are more like playing the trumpet, too, but that’s another essay.) So our typing fingers hear the sounds of the words rather than see them. There, their and they’re can be placed interchangeably by our undiscriminating fingers because they are homophones, words that sound alike but have different spellings (though perhaps not in all dialects). Some people could make the same mistake with merry, marry and Mary, though to my regionally accented ears those three sound like different words, unlike there, their and they’re. It doesn’t always happen; perhaps it’s a matter of context or syntax that enables our typing fingers to figure out the correct homophone. Sometimes, though, they mess up, and probably more often than we realize, because we catch ourselves making the mistake as we type. But we don’t always, so quite often a homophone has to be scene–that is read–and not heard–that is written–to be corrected.
And I notice now that I typed scene instead of seen, proving my point. Ha!
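The collision the paragraphs above describe is easy to sketch. A toy illustration (the pronunciations are hardcoded, rough ARPAbet-style guesses for one accent, not pulled from any real dictionary):

```python
# If the "channel" from inner voice to typing fingers carries only sound,
# same-sounding spellings are indistinguishable to it. Pronunciations
# below are illustrative assumptions, not dictionary lookups.
pronounce = {
    "there": "DH EH R",
    "their": "DH EH R",
    "they're": "DH EH R",
    "merry": "M EH R IY",    # distinct in some regional accents...
    "marry": "M AE R IY",
    "Mary": "M EY R IY",     # ...merged in others
}

def sound_alike(a, b):
    # Two words collide when the inner voice renders them identically.
    return pronounce[a] == pronounce[b]

print(sound_alike("there", "they're"))  # True: the fingers can't tell
print(sound_alike("merry", "marry"))    # False in this assumed accent
```

In a dialect where merry, marry and Mary merge, you’d give all three the same pronunciation string and they’d collide exactly the way there/their/they’re do.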

Basically, we are typing by dictating to ourselves. We hear but don’t see the words…and if we DO see the words it immediately breaks up our train of thought, as reading and writing are two completely separate processes in the brain and we can’t do both simultaneously. So those their/they’re/there’s will keep popping up and their/they’re/there’s not much we can do about it except proofread–and even then, if you proofread too soon your brain still has the short-term memory of “hearing” what you wrote and your eyes will not always catch the mistake. Sometimes, if writing a story, it’s good to let the draft sit for an hour before proofing it, by which time that short-term memory will be gone and you are actually reading what you wrote and not remembering what you narrated.

I also typed are as our….making at least two that made it past two drafts (and spell check) before I noticed them. In a digital format it’s no big deal. But years ago, when they got past me, spellcheck, my editor and at least one copy editor to wind up fast and permanent on paper in my jazz column in the LA Weekly, I would invariably be admonished by readers. You should really learn how to write, somebody would say. Usually from the safety of an email, but sometimes in person, at a jazz club, if they were drunk enough. You should learn how to write, they’d say, then totter off.

Chick magnet

(Email from years ago.)

There was a kid here at work a few years ago, a nice kid, good looking, sensitive, Appalachian, smart as hell, woodsy, a musician. He’s a member of a local comedy troupe that is popular in Hollywood, as those things go. One day he came to me looking depressed and scared and said he’d been diagnosed with multiple sclerosis. So sad. Welcome to the short bus, motherfucker, I said. I always say that. One of those cognitive-issues in-jokes. He laughed. Being that I was the big brotherly epileptic guy in the office, he asked for advice about how to remember to take his meds and how not to lose his car keys, things that seem minor unless your memory is degenerating. I filled him in as best I knew on sundry disability things like the Americans with Disabilities Act, which helped him get a permanent medical leave, and where to leave those car keys (the exact same spot every night). About how leaving notes around doesn’t seem to work and all you do is find notes around later that you forgot were there. I told him how to make himself a little boring, unchanging, because deeply seated memories, rote things, seem to have a lasting power that regular memory doesn’t. I went over some tricks with names you can’t remember and faces you can’t recall. I tried my best to cheer him up, but the poor kid was so depressed. I said look, you’ll adapt, your life will continue. He said no, it’s over, no job, no more playing guitar and no more comedy. Certainly no girlfriend. Not even a date…what girl would want to go out with a guy with MS? I said dude, you don’t get it: you are now a Chick Magnet. He shook his head. I said Yup. You’ll bring it out in them. They’ll love you to death. He was skeptical. No way, he said, and moped off to wallow in the thoughts the brain-impaired wallow in.

Fast forward a few months and he’s still playing guitar and still doing improv. And he’s dating a big six-figure television star. She’s rich, successful and nuts about him. Even talked all about him on late night TV, in front of god and studio execs and everybody. He blushed. I laughed. If it hadn’t been for his mildly damaged brain, his MS, she would not have paid attention to him at all. Well, maybe, but he would have been just another cognitively fully functioning guy, and you know how dull they are. Dime a dozen, perfectly self-sufficient and always seeming to know where their car keys are.

Play your cards right and a little brain damage might be the best thing that ever happened to you. Remember that. Of course you all can remember that. But we have to leave little Post-it notes around the house, and you know how well that works.

Words and pictures

This is one of those incredible photographs that cannot possibly be described in writing. It is so frustrating how you can’t recreate what we see in words, and indeed, almost nothing we see requires words to understand. It’s all pre-language. Vision is over half a billion years old. Language a couple hundred thousand. Written language maybe five thousand years old. The speech and language thing in the brain is new, very primitive, extremely limited. And when I see a picture like this that tells a whole story, and realize that I could spend hours trying to tease out a few sentences that would do the same, I begin to hate photographers. I mean, think about it: someone invented the camera less than two centuries ago, and within two or three decades unbelievable photos, these perfect images, begin to appear. Iconic things. Hell, when we think of the Civil War we see Mathew Brady’s dead strewn in a field, not any of the word images in thousands and thousands of histories, memoirs, poems, and novels. That was the secret of Ken Burns’ Civil War–the photos that the television camera would play across, giving an illusion of animation to still life. The narrators would recite passages from people who were there, and the theme would swell, fade and disappear, but what we remember are the images. Not the words; we default instead to the ancient vision centers there in the back of the brain. You don’t need to explain anything to those vision centers. They get it automatically (which is why it is so easy to fool us with trick photography–the vision centers believe what they see). Photos nail us. We have no defense. And I just wrote a couple hundred words trying to say that.

This is why I’m mean to photographers. Not to angels like the one in this picture, though. They can melt your heart.

“A girl dressed as an angel walks home after marching in the annual Easter procession during traditional Holy Week festivities in Ouro Preto, Brazil.”   (Photo by Mario Tama/Getty Images)



There’s nothing like accidentally posting a random collection of notes to your blog and then having to go into all the social media sites and delete it. This didn’t happen when this stuff was all analog, with an analog pen and analog paper and analog edits and analog scratching out and analog illegibility. Not to mention the lost art of margin doodling. Times were simpler then. Messier, but simpler. I almost miss ink stained hands.

I have a whole box full of analog words like that. Page after analog page. I like looking at the edits. The sentences lined out and rewritten in the margins. The paragraphs lifted up and dropped onto a whole other page. Sometimes there are entire pages scratched out that I really like now. This was a much younger brain; I wonder what it thought when it saw this stuff. And this was before email, before instant messages, before texting and tweets and Facebook posts. Before the comments sections on news sites. Before blogging. This was a different universe. In that universe none of you people would be reading this. In fact none of you would have read anything I wrote unless you picked up a West Coast Review of Books or an obscure rock zine or two.

But that universe was pure creativity, a lab, a mass of failure, the occasional gem. Rhymes even. Certainly a lot of epilepsy. I keep thinking I ought to drag that box out of the closet and zap some of that stuff into the digital universe. But there’s so much. It’s a helluva lot of work, transcribing. And it feels weird going back in time like that. You begin to feel the way you felt decades and decades ago. That fresh, unwrinkled skin. The raging testosterone. The stupidity, on one hand, and then all those brain cells long since gone. What would it feel like to be dropped into my twenty-five year old body with a brain a quarter again as big as mine now? Would it be noticeable? How could it not be? Like moving into a sprawling ranch house from a two bedroom apartment. All this snuggly comfort would be gone in all those rooms, but think of the views you’d have. Views you’d given up as your life got smaller, narrower, quieter. Even if the brain is only 15% smaller in volume, there are all sorts of synaptic paths you’ve abandoned. Like that big ranch house full of nooks and crannies you no longer use. A back door you haven’t opened in decades. The kids’ room, left as it was. A garage stacked with inaccessible boxes full of things you forgot you ever had. Neurons have settled into comfortable patterns. Some are passed by, ignored. Some have drifted into other areas of responsibility. Much has been sorted into piles, some you need, and some like those boxes in the garage. You just don’t get excited about so much anymore, not like you did when you were in your twenties, because your brain is so set in its sensory and concept reception ways. It’s gotten comfortable, in sort of the cognitive equivalent of a favorite chair, watching old movies.

Our brains are at their maximum size in our twenties…after that the brain doesn’t bother replacing the cells–neurons and glia both–it doesn’t think are necessary. We don’t have a choice, it does it for us, it economizes. Such a shame. We’ll never know exactly what we’ve lost, but we know we lost something. I lost all those analog thoughts and memories. I’d love to have them back. Or maybe I don’t. Digital is easier, editing so simple. Mistakes so easily hidden. Things, worthless or not, so easy to save. I guess that’s a good thing.

So I’ll put off pulling out that box again and live in the now. It’s easier that way. As much as I reminisce about the analog universe, this digital one is much easier, while it lasts. Civilization is on the cusp of the next step. You can feel it. Something beyond this even, something beyond the written word. And people like me will be museum pieces then. Historical oddities. We wrote. You what? Wrote. What was that? This. That? Yeah this. Why?

Why? I have no idea why. We just did.


The plasticity of the brain defies belief…a woman born without a cerebellum who was not diagnosed until age 24. Which means that even though the genes that would normally be involved in cerebellum development were missing (actually they were there; they were just unable to perform this set of tasks), other genes were able to use other parts of the brain to perform the cerebellum’s tasks. The brain is able to function the way a drummer lays down a complex web of polyrhythms. That is, a drummer has only two hands and two feet, but can create varying patterns by striking in different but simultaneous combinations of the same limbs. And the brain can use the same neurons in varying patterns by making varying combinations of them…a firing neuron can fire off other neurons simultaneously, so one neuron can be part of various chains of neurons performing various functions. It’s insanely complex.

Neurons firing off in real-time. From ‘The dynamic mind: Stunning 3D ‘glass brain’ (Daily Mail)