On a screen, where all data are created seemingly equal, where does she get her idea of authority?
I came across the article "The End of Civilization as We Know It? A Duke Magazine Forum explores the future of reading" and was quite amused by the transparency of some of the stances and their clearly ideological drippings. Take, for instance, this argument presented by Sven Birkerts (described in the article as “author of The Gutenberg Elegies, an acclaimed book about the lure and cultural significance of reading”):
The system of the printed book has always been premised on individual authorship, on systematized classification, and on cumulative progress along a timeline, at least where scholarship is concerned. The library has physically embodied this. Given that books were costly and scarce and that most individuals could only possess a very few, the library’s purpose from the start has been centralized access. But, in serving that practical function, libraries also acquired a powerful symbolic status. As much as the university, they’ve been our culture’s way of putting an institutional imprimatur on the life of the mind. But things have clearly changed.[…]
Think of the student who has more or less grown up in our electronic culture, who already uses books differently, far more sparingly than those in the generation ahead of her did. Imagine this student placed now in an environment stripped of books, which offers only the power of the technology and the near infinity of data within keystroke reach. Where does she find her primary idea of context, of the principles of relatedness? What paradigm of knowledge does she hold, and by what sanction does she hold it? On a screen, where all data are created seemingly equal, where does she get her idea of authority? What is her developing picture of knowledge and its many branches? […]
In the process, we are rewriting the literal and conceptual relation of self to society. I can’t start to theorize what this implies about power, control.
(emphasis above and across the board, mine)
This week I already took jabs at Nicholas Carr’s idea that electronic data dissemination is making us dumb, and now, coincidentally, I see the same argument, this time presented by a seemingly well-intentioned concern troll. Where will this hypothetical student learn about authority? How are we going to teach her which authority she needs to respect? (As an aside, I cannot help but find it extra amusing that this author, in his effort to be, or sound, inclusive and gender-aware, uses a female pronoun to emphasize his idea of enforcing authority; I don’t believe in the innocence or naivete of language, particularly when it comes from a published and supposedly acclaimed author.) So that’s the root of this discussion, isn’t it? If all information were to be equal and equally accessible, how are we going to determine who’s in charge?
Well, maybe, just maybe, that’s how paradigm shifts occur. They don’t happen like a sudden realization, like an idea that comes to mind and injects itself into the collective. I would say these shifts occur by slowly changing our thought processes. If we learn to live without the percolated scholarly knowledge that Mr. Birkerts, above, would like us to hold on to, maybe we will also start seeing the pyramid power structures for what they really are (the control and normalization they imply in our daily lives, the oppression, etc.). Learn to think in rhizomatic, non-hierarchical models in terms of information and maybe that will translate into more rhizomatic, non-hierarchical models of social organization. Thought precedes social models, after all.
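To make the contrast concrete: a hierarchical model forces every piece of information under a single parent, so getting from one leaf to another means climbing back up through the pyramid, while a rhizomatic model lets any node link laterally to any other. A minimal sketch of that difference (the topic names and links here are my own illustration, not anything from the article):

```python
from collections import defaultdict, deque

# Hierarchical model: every topic hangs off a single parent, edges point
# downward only -- the "pyramid" with a privileged root.
tree = {
    "knowledge": ["sciences", "humanities"],
    "sciences": ["biology"],
    "humanities": ["literature"],
}

# Rhizomatic model: the same topics, but any node may link to any other,
# bidirectionally, with no root at all.
rhizome = defaultdict(list)
for a, b in [("biology", "literature"), ("literature", "sciences"),
             ("sciences", "biology")]:
    rhizome[a].append(b)
    rhizome[b].append(a)

def hops(graph, start, goal):
    """Breadth-first search: how many links from start to goal (None if unreachable)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# In the tree, biology cannot reach literature at all without an authority
# above it; in the rhizome it is one direct lateral hop.
print(hops(tree, "biology", "literature"))
print(hops(rhizome, "biology", "literature"))
```

The point of the toy example is only structural: in the tree, every connection is mediated from above, while in the rhizome connections are made wherever they are useful.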
And then, there was this other tidbit, that clearly does away with Nicholas Carr’s premise that “the internet is making us dumb”:
Julie Tetel Andresen (associate professor of English at Duke and author of Linguistics Reimagined: Language Study for the 21st Century): I recently ran across an article that touted seven ways to prevent cognitive decline as we age, and I’m going to give them to you in reverse order. The seventh was meditate, then do puzzles, then brush and floss, drink alcohol sparingly, eat blueberries, then exercise. The number-one way to prevent cognitive decline is to surf the Internet. The article went on to say that scientists had discovered that surfing the Internet may be more stimulating than reading. Interested, and perhaps alarmed, I tracked down two articles that seemed to speak to this issue.
The first was by a team of psychologists at Indiana University; this appeared in August 2008 in Psychological Science. It posited a relationship between “spatial foraging” and internal cognitive search. And it cited evidence that the neural mechanisms that evolved for the purpose of modulating between exploration and exploitation in spatial foraging have been subsequently adapted in later species for the purpose of modulating attention. In fact, modulating attention, and this would be goal-directed cognition, is exactly what is problematic in human pathologies such as attention-deficit hyperactivity disorder, drug addiction, Parkinson’s disease, obsessive-compulsive disorder, schizophrenia, and certain autistic behaviors.
The psychologists hypothesized that if particular spatial tasks could be made to have long-lasting effects on the generalized cognitive-search processes—for example, by exposing individuals to tasks during development—this could provide useful hints toward non-pharmacological treatments for disorders of attention. That seemed to suggest why both doing puzzles, number six on the list, and surfing the Internet, number one on the list, might be good for cognition—because they’re both foraging tasks.
I was satisfied by that understanding until I remembered the claim that surfing the Internet may be more mentally stimulating than reading. So I found a second article by a team from UCLA, “Your Brain on Google,” published this past February in the American Journal of Geriatric Psychiatry. In their study of normal fifty-five-to-seventy-six-year-olds, they found that the patterns of brain activation during fMRI scanning while subjects performed a novel Internet search task were greater than on the control task of reading text on a computer screen formatted to simulate the prototypic layout of a printed book, where the content was matched in all respects. They had two groups: the Internet-savvy and the Internet-naïve. There was a two-fold increase in brain activation in the Internet-savvy in regions mediating decision-making, complex reasoning, and vision.
Both Birkerts’s and Carr’s arguments seem to be based on this privileged and fear-mongering idea that we need authority, that we need to be told what is good and what is worthy, that such decisions cannot be left to individuals (unsurprisingly, both authors seem to deride the concept of the “wisdom of the masses”). I might be overly utopian, but I contend that the opposite is true: not only are we going to find what information best serves our needs, but this new template is going to impact much more than just academic pursuits or scholarly research. I might not live long enough to see how these new thought processes are going to affect social models of organization, but if a couple of million years of evolution are to be taken into account, we might be moving towards a less centralized, more horizontal framework for all of our world views. I can see why many would feel threatened by such a thing. For me, it cannot come soon enough.