Among humanities scholars, one of the more controversial claims of the OHM Thesis, perhaps best represented by the title of one of Ong’s articles, “Writing is a Technology that Restructures Thought,” is the idea that technologies such as writing, the printing press, etc., can actually restructure consciousness. “Brains don’t change,” one well-known linguist told me during a WPA-L discussion a few years ago. “But they do,” I argued, pointing to research from cognitive science.
With that preface, I offer this passage from a Daily Mail article, written by Susan Greenfield and adapted from her book ID: The Quest For Identity In The 21st Century:
Anyone who doubts the malleability of the adult brain should consider a startling piece of research conducted at Harvard Medical School.
There, a group of adult volunteers, none of whom could previously play the piano, were split into three groups.
The first group were taken into a room with a piano and given intensive piano practise for five days. The second group were taken into an identical room with an identical piano – but had nothing to do with the instrument at all.
And the third group were taken into an identical room with an identical piano and were then told that for the next five days they had to just imagine they were practising piano exercises.
The resultant brain scans were extraordinary. Not surprisingly, the brains of those who simply sat in the same room as the piano hadn’t changed at all.
Equally unsurprising was the fact that those who had performed the piano exercises saw marked structural changes in the area of the brain associated with finger movement.
But what was truly astonishing was that the group who had merely imagined doing the piano exercises saw changes in brain structure that were almost as pronounced as those that had actually had lessons.
“The power of imagination” is not a metaphor, it seems; it’s real, and has a physical basis in your brain.
Alas, no neuroscientist can explain how the sort of changes that the Harvard experimenters reported at the micro-cellular level translate into changes in character, personality or behaviour. [Read more.]
From a media ecology perspective, the implications of this are vast. And it raises a number of important questions, such as: Should we include visualization of writing and researching as part of composition pedagogy? What, if any, noetic changes might come about from asking students to imagine themselves within a text? What did it mean to have a memory palace through which one walked to recall information? Or, for that matter, what effects have our metaphors of memory had on noetic structures? While classical and medieval metaphors of memory as a container, such as a dovecote, have fallen out of favor, this research at least implies that this conception of memory as a container may have had tangible noetic effects.
While the article itself points to some interesting research and asks some important questions, such as:
What worries me is that if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and therefore some presumably minor change in the way the aspiring player performs, what changes might long stints playing violent computer games bring about?
I find it a bit too pessimistic, or, perhaps the better term, too fearful. Consider, for instance, the intro:
Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis.
It is a crisis that would threaten long-held notions of who we are, what we do and how we behave. It goes right to the heart – or the head – of us all.
This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals.
And it’s caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world.
Or this passage:
We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world.
The intro seems to suggest that the noetic structures of our current techno-cultural milieu are somehow natural and that new technologies will dehumanize us.
And that second passage, about a hedonistic generation detached from the “real world,” seems too commonplace to even take seriously. How often have I heard that sentiment in other contexts? Playing (non-computer) role-playing games, which, for most, just meant Dungeons and Dragons, put me at risk of becoming detached from the real world, or so I was told by a number of people. The same was said of reading fantasy. On the advice of my parents, a relative gave me a number of Lancer paperback Conan books as a gift during the summer between 8th and 9th grade. Said relative took me aside to assure himself that I understood that I lived in 20th-century America and wasn’t in danger of doing whatever it was he imagined Conan did.
I don’t want to make light of these concerns. I’m sure that violent video games are having some effect on who we are and the way we think, but that doesn’t make gamers a hedonistic generation “in distinct danger of detaching themselves from what the rest of us would consider the real world.” This smacks of cliché just as much as my desire to counter it by nodding to Plato’s complaint against writing in the Phaedrus is cliché. For Plato, the real world, or, really, our ability to access it, resided in dialogue, which writing did not allow. However, that my nod to Plato is clichéd does not mean that it isn’t appropriate or accurate. And, likewise, just because Susan Greenfield’s concern over digital technologies seems clichéd does not mean we should brush aside her concerns. In making my nod to Plato, I am pointing out that in a newspaper article that draws upon a fair amount of scientific fact, the clichéd warnings seem to be gut reactions rather than concerns rooted in research.
Yes, these technologies are changing us, and yes, once we have fully interiorized digital technologies we will not be the same people we were when print was the height of modern technology, any more than the people of eighteenth-century Britain were the same people as the creators of Stonehenge and Skara Brae.
Despite the dire warnings and ominous tone throughout the piece, Greenfield sort of lightens up at the end. Ultimately, her point seems to be Marshall McLuhan’s point (although, admittedly, McLuhan was much more suspicious of technology than Ong was): we need to create an inventory of effects so that we can be in control of our environment. As McLuhan puts it in The Medium is the Massage, “There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.”
What Greenfield seems to be asking in both this article and her book is whether or not we will remain human. And that’s a loaded question. (( See, for instance, legendary science fiction editor John W. Campbell’s 1959 essay “What Do You Mean … Human?” or any number of science fiction stories, novels, movies, and TV shows, for that matter. I was going to point to Karel Čapek’s R.U.R., the science fiction play in which Čapek coined the word robot, as an early example of science fiction tackling this issue, but we find the theme front and center in Frankenstein, if not earlier. )) Many of the sadder chapters of human history are predicated on a definition of human as “just like us.” Just because we’re digital–either corporeal humans living in a fully interiorized digital techno-cultural milieu or humans who have uploaded or downloaded their consciousness into something other than their born-into body–doesn’t mean we’ll stop being human. We won’t be the humans we are now, but, then again, we are not the humans that our grandparents or our grandparents’ grandparents were any more than they were the same humans as those who lived 500 or 1,000 years ago.
Maybe I’m just more optimistic than Greenfield is, more of an Ong to her McLuhan (although she’s much more openly suspicious of technology than I’ve ever found McLuhan to be). In mulling this over, I can’t help but wonder if my optimism stems from my enculturation in science fiction thinking. This is not to suggest that all science fiction is optimistic–cyberpunk, which is largely dystopic and cautionary, is one of my favorite science fiction genres. Rather, I think that as an avid reader of science fiction (and a scholar whose interests reside in the historical and comparative study of traditions), my conception of human is far more contingent than Greenfield’s seems to be.