“Intensive multitaskers are ‘suckers for irrelevancy,’ commented Clifford Nass, the Stanford professor who led the research. ‘Everything distracts them.’” The Shallows, p. 142
People have long felt that creative ideas come to us when we have time to think, optimally in a quiet, natural environment. In The Shallows, Nicholas Carr explores this idea by digging into the research. He looks at what it means to be constantly hyperlinked to information that is extraneous to the task at hand; what it means to be constantly inundated with tweets, with dings that display incoming emails and chat messages; and how multitasking is different from concentrated thought not only in work outcome but in its effect on the human brain and its growth.
Although the title of this book is apt, I still find it unfortunate because it seems to scream “angry Luddite who can’t accept modern technology!” And this isn’t the case at all. Carr obviously uses computers and the Internet. And he details some of the positive things that Internet availability has done for people in their daily lives—it may help older people keep their minds sharp; gamers have improved visual fields as well as increased speed in shifting visual focus. He also discusses positive changes that using the Internet can make to the human brain, the plasticity of which is astonishing.
And yet. The question. Are we losing something important in human culture, perhaps even the most vital aspects of being human, such as compassion and empathy?
Carr begins with the boons that the Internet has given us and specifically discusses Google as a search engine. We can find lots of information about anything at any time and we don’t have to remember it. We will be able to retrieve it again and again. The problem, though, is that the brain grows and changes in response to the way information is accessed. Using the Internet dulls our capacity for concentration and contemplation. “Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.” Well-educated folks admit in these pages that they can no longer read extensive works and are irritated by having to stop and read text rather than click on hyperlinked words.
The Internet certainly isn’t the first technology to change the way the brain develops. Carr has interesting discussions of clocks and maps, of the printing press, and how these things created both gain and loss for users. However, a hallmark of what is now called “deep reading” is that readers become more attentive, able to “concentrate intently over a long period of time, to ‘lose oneself’ in the pages of a book.” Doing this isn’t the natural state of the brain. (If a cave man could have gotten lost in a book, he also could have been eaten by a saber-toothed tiger. He needed his brain, his awareness, to be constantly jumping around.) But the great advantage of being able to read deeply is that the brain expands and sets off other intellectual paths in the reader—who will be able to make his own associations, draw his own inferences and analogies, and foster his own ideas. In other words, deep reading is deep thinking.
There is plenty of research to back Carr’s claims. Many studies are discussed in the text; in addition there are 30 pages of endnotes and a few pages of suggested further reading. Since I hear some people claim that reading fiction is no longer a very important task, I was particularly interested in a study showing that “readers mentally simulate each new situation encountered in a narrative. Details about actions and sensations are captured from the text and integrated with personal knowledge from past experiences.” Further studies discussed in The Shallows indicate that the new knowledge isn’t just intellectual. It’s also emotional, giving rise to compassion and empathy as character traits. (OK, if you are a life-long deep reader, you know this intrinsically—but research and evidence are quite handy in discussing it.)
Carr argues that deep reading becomes difficult when the platform changes from a print book to a screen—say in a Kindle, Nook, or iPad. This surprised me, as I often read books on those platforms. The problem is that people click out of the books to see the comments of other readers and the like. (I get so involved in good books that I don’t do that, so maybe I’m going to be OK 😉 ) The studies on this phenomenon make me worry about textbooks, which will all be online soon. I’ve been welcoming the new form of textbooks because they have such cool extras—videos showing a lab experiment, links to art museums, and so on. But if students are clicking on the links and videos embedded in the book, they’ll have the same problem that they’ll have on a webpage—the inability to immerse themselves in the deep reading that generates deep thinking. (And then people will say the schools are failing and teachers are getting lazier and lazier. When we don’t understand the cause, it’s always safe to blame the teacher!)
A related problem is how authors will start tailoring their books—the words they use—to get hits on search engines. They’ll be seeking visitors (as opposed to readers?), who in turn will be seeking ‘groupiness’ rather than enlightenment.
Carr moves beyond the deep thinking argument to studies on memory. Put quite simply, readers of hypertext don’t remember very much of what they read. I wonder, again, what this means for our students, who are young enough to have done almost all reading with hypertext. When teachers complain that students don’t remember as well as they used to, it may be more than crankiness that drives that perception. It may be true. Even the way the eye moves on an Internet screen (an F pattern; users scan rather than actually read the text) differs from reading print and offers no practice in it. We can keep making scapegoats of teachers, if it makes us feel better. But it may be that in fifty years, remembering and deep thinking will be specialized activities, perhaps done only by chosen people. Sort of like the YA novel The Giver that is so popular with students, with a twist. Everyone else will be buried in tons of information that is of immediate interest to them, whether it matters or not.
Carr finally takes on artificial intelligence and our incorrect assumption that it is modeled on human intelligence. The brain does not work like a computer; it’s not a simple data storage and retrieval system. Some of the research here—begun after studies on the brains of boxers (who had taken concussive blows to the head) and on post-seizure epileptics—is some of the most interesting stuff in the book because it proves that interruptions of short-term memories keep them from taking hold in long-term memory and that the two types of memory require different biological processes. Long-term memory is responsive to learning; it’s not a discrete data-bit hotel, but keeps processing, in numerous ways, the information it receives. We don’t need to free up space in our brains by storing information on the Internet because the brain doesn’t run out of space. In fact, the opposite appears to be true. Building up a store of memories sharpens the brain and makes it easier for us to learn new ideas and skills. It increases intelligence.
And how does the brain make the choice to convert working memory into long-term memory? Yup. Attentiveness. On the other hand, multitasking trains the brain for distractedness. With multitasking, neural pathways that process information quickly will grow in abundance, but pathways that create something new with that information won’t. And this doesn’t stop when you leave the computer because this is the new structure of your brain. We are giving up a lot in order to “mechanize the messy processes of intellectual exploration and even social attachment.”
Are we sacrificing the ability to read and to think deeply? The research says yes. Will it matter when everyone’s mind works like a computer works, when we use the Internet as a substitute for personal memory, when, following the path of artificial intelligence, we ironically imitate the computer’s ‘brain,’ which was itself meant to imitate human intelligence? Carr thinks so.
Although The Shallows doesn’t take an alarmist tone, I found it pretty scary. As Carr cleverly put it, “The brighter the software, the dimmer the user.” And as if that weren’t bad enough, I read the epilogue, which is about using artificial intelligence to grade essays. I agree with Carr that software cannot “discern those rare students who break from the conventions of writing not because they’re incompetent but because they have a special spark of brilliance.” I suppose brilliant students will get lousy scores and stop showing their brilliance.
Is there any hope? “A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition.”
So—check out this book. Then, go do a Thoreau, and make your way to the woods.