Googling while Memory Burns?

For new readers, one of the enduring themes of this blog, and of my teaching, is that I am a big fan of facts and a doubter of the efficacy of teaching skills (21st century or otherwise). At the same time, I am a nerd, early adopter, and big fan of the internet. Sometimes this seems like a conflict, because a lot of nerds like me think that Google frees us from ever having to learn any facts, since we know where to look for them. We can skip the facts and go straight to the good stuff: the critical thinking skills. I disagree, and I am going to use the study du jour to talk about where I stand on this issue.

The recent study, published in Science, is from Betsy Sparrow and colleagues from Columbia University. If you are unfamiliar with this and want to learn a little more, I suggest reading the always amazing Ed Yong. You could also read the press release from Columbia, and watch the little video interview with Betsy Sparrow, the first author. I also liked Adam Clark Estes’ take on the Atlantic Wire as well as Matthew Ingram’s defense of memory outsourcing. There is also Jonah Lehrer, or Mr. “Shallows” himself: Nicholas Carr. Also, I think the best background for all of this is Adam Gopnik’s piece in the New Yorker, describing the three camps: the Never Betters, the Better Nevers, and the Ever Was’ers.

Where do I stand on this? First, I think it is useless to talk about whether Google/twitter/internet is making us stupid or smart in general. Does agriculture make us healthier? Was the discovery of the New World a good thing? Sure, we could try to answer these questions, and people did (I am reading about an essay contest for the second question, in Lee Dugatkin’s wonderful “Jefferson and the Giant Moose”), but the questions are too broad for the answers to be anything but the author’s prejudices, loosely clinging to the often shoddy available scientific evidence. That said, here’s my attempt to integrate these questions with what I know and care about.

So first, let’s get some better questions, and try to stick a little closer to the current research, as well as the past research that gives it context. How does the widespread use of web search engines change our expectations of memory for certain trivia? We may not be able to generalize all the way up to “making us stupid,” but how much can we generalize? Even though the Sparrow study used general trivia questions, about flags, movies, etc., does this apply to other domains of knowledge? Does relying on Google mean we have less knowledge ourselves? Or does this reliance on external memory, as Carr suggests, result in a “loss” of internal memory?

We have to start with a simple fact: Brains be thinking. You CANNOT STOP a brain from thinking. When Carr ends his piece with “But as memory shifts from the individual mind to the machine’s shared database, what happens to that unique ‘cohesion’ that is the self?” he is assuming that some sort of unitary memory is leaving our own heads and residing in the cloud. We are shifting some part of memory, but we don’t have empty heads, and we never will. We will pick something else to think about. Yes, we lost the art of memory, memorizing long tracts the way the ancients did, but that loss was replaced by something different. For me, the critical point is that we will always think about things we find interesting, and we will seek to learn more about those things. Nicholas Carr finds the interaction between Google and human cognition interesting, so he seeks information (and yes, facts) about this. Some of that information he even finds on the internet (shh, don’t tell him the study was published online, and that thousands more people know of it, and have deeper knowledge of it, because of the internet). He thinks that knowledge is now “shallow,” but he has no good definition of either “deep” or “shallow.”

How does this fit in with existing research on learning and memory? Lehrer has covered it pretty well, here and in his book, Proust Was a Neuroscientist. But I think Dan Willingham cuts to the point better: Memory is the residue of thought. We remember what we think about. If we don’t think about it (like that report I did in sixth grade on Belgium) we don’t remember it. If we do think about it, we do. We remember facts that we use again. We remember facts that we like to play with in our heads. Jared Diamond says that “primitive” peoples are actually more intelligent than we are, since they have to keep in mind where the poisonous plants are and where the predators are, and don’t have external memory aids. But we just fill our brains with other stuff. On second thought, this is misleading… we have no indication that our brains can fill up; as far as we can tell, our memory capacity is limitless. What IS limited is time. We fill our time with other thoughts.

So I think Carr is wrong to fear what we are “losing.” But I also think that Sparrow and others are wrong when they diminish the value of facts:

“Perhaps those who teach in any context, be they college professors, doctors or business leaders, will become increasingly focused on imparting greater understanding of ideas and ways of thinking, and less focused on memorization,” said Sparrow. “And perhaps those who learn will become less occupied with facts and more engaged in larger questions of understanding.” from the press release

The problem is that “larger questions of understanding” are often inextricably bound up with knowing a lot of facts. Critical thinking about art history is simply out of my reach, because I do not know enough facts about art history, whether it is the difference between the Impressionists and the Surrealists, or what it meant to depict commoners instead of the royal family as subjects for art. Any ability I have to fake critical thinking about art history comes from facts (I really liked learning about Cezanne, because of what he tells us about perception of form, and everything I learned about surrealism comes from reading about Duchamp’s art and trying to recreate his rotoreliefs for demonstrations in my classes on motion perception), not from critical thinking skills. You simply cannot do critical thinking without a solid foundation of factual and conceptual knowledge. Put another way, I am a far more effective googler than most of you when it comes to finding good information on cognitive psychology, not simply because I know that Wikipedia is not always reliable, but because I know how to fit new information into the set of things I already know, and which sources are better in my field. There are very few shortcuts to “larger questions of understanding.”

So, if facts are actually really important, then isn’t our reliance on Google a bad thing? It ain’t necessarily so. (The things that you’re liable to read in the Bible, ain’t necessarily so.) The take-home message for me is this: Google is only bad if it reduces our overall interest in learning new facts. I don’t think Google does this. I am happy to let my wife remember how to make her wonderful gazpacho, she is happy to let me remember how to make bread, and we are both happy to let our phones remember phone numbers. But we both feel passionate about education and education reform. We collect facts, and talk about them at night. She uses twitter to learn hundreds of names of education journalists, teacher bloggers, policy wonks, and activists. We talk about who is an ideologue, who is worth listening to, and what evidence informs what perspective. We wonder what makes Hanover County work so well while our alma mater, District of Columbia Public Schools, is so dysfunctional. And I don’t see Google as dimming that interest, that curiosity, in us.

I also don’t see Google dimming my kids’ curiosity, or that of my students. I do worry about high-stakes standardized testing dimming students’ curiosity and teachers’ creativity. When my son uses Google Earth to explore the Egyptian pyramids, it doesn’t quench his curiosity, but whets his appetite for more knowledge. But when he comes home and says “I like reading, but I hate reading tests,” that’s when I worry. Ironically, it is not Google that is making us stupid, but our insistence on always knowing exactly how stupid we are, that is inhibiting what seems to be inevitable cognitive progress.