Ferris Jabr’s article “The Reading Brain in the Digital Age: The Science of Paper versus Screens” in Scientific American (April 11, 2013) revisits the themes raised in Maryanne Wolf’s Proust and the Squid, mentioned in the previous posting. Jabr highlights a wealth of insightful writing on the neuroscience of reading, on which more in a bit. He begins, however, with a “haptic” anecdote that will resonate with parents and grandparents of children who are learning to read now or have learned in the last three to five years.
In a viral YouTube video from October 2011 a one-year-old girl sweeps her fingers across an iPad’s touchscreen, shuffling groups of icons. In the following scenes she appears to pinch, swipe and prod the pages of paper magazines as though they too were screens. When nothing happens, she pushes against her leg, confirming that her finger works just fine—or so a title card would have us believe.
Earlier the same year, I was lying in bed with an iPad reading Death and the Penguin by Andrey Kurkov. As the story drew me in and admittedly as the hour grew late, I found myself repeatedly reaching into the upper right-hand corner of the screen with my left forefinger and thumb to pick up and “turn the page.” I had not developed the habit of “sweeping” or “tapping” to move through the book. These real-life mirror images of the haptic habits of a young soon-to-be reading brain and an old reading brain bring Wolf’s speculations alive.
Numerous studies cited by Jabr suggest that different areas of the brain are at work in screen reading than in print reading, and connect that difference to poorer retention and comprehension on screen than in print. But one of the more recent studies (“Metacognitive regulation of text learning: On screen versus on paper,” by Ackerman and Goldsmith) shows that where readers
studied expository texts of 1000–1200 words in one of the two media and for each text […] provided metacognitive prediction-of-performance judgments with respect to a subsequent multiple-choice test[,] [u]nder fixed study time (Experiment 1), test performance did not differ between the two media, but when study time was self-regulated (Experiment 2) worse performance was observed on screen than on paper. The results suggest that the primary differences between the two study media are not cognitive but rather metacognitive—less accurate prediction of performance and more erratic study-time regulation on screen than on paper.
So the reading brain may not be rewiring itself, but print and screen do demand different strategies of reading and study. Might the “haptic habits” of physically turning the page, or of recalling three-dimensionally the place in the book and on the page where a sentence occurs (or of pinching, swiping and prodding), be clues to how we learn to learn what we read? What we may be seeing in the one-year-old are the beginnings of the metacognitive cues that will raise the performance of tomorrow’s screen reading brains; and in Ackerman’s and Goldsmith’s subjects, the familiarity of today’s reading brains with the metacognitive cues so key to studying from print that the students print out the relevant ebook chapter.
As Jabr concludes, “When it comes to intensively reading long pieces of plain text, paper and ink may still have the advantage. But text is not the only way to read.”
Which harks back to the conclusion of the previous post and Jerome Bruner’s apt observation of Vygotsky’s fondness for Bacon’s epigram, “Nec manus, nisi intellectus, sibi permissus, multum valent” (“Neither hand nor intellect, left each to itself, is worth much”) (247). Perhaps neither print nor digital, left each to itself, is sufficient.