Suppose you were commissioned to write a short book about Johannes Gutenberg’s invention of the printing press and its impact on Western society. What would it look like? You might studiously inventory its various effects on government, religion, science, the arts, and so forth, but in the end you would concede that its ultimate consequence was something mysterious. It was the ineffable and immeasurable change that made the modern mind of an entirely different order of being than the medieval mind.

Now imagine that Gutenberg’s invention had been instantly accessible to anyone, anywhere, free of charge—not only with text but with images and even sound. Instead of taking centuries to work its way through society, rippling outward from Mainz, its effects would have been felt immediately, at once at the level of individual psychology and soon in the shape of society itself. Such is the technological revolution that is now supplanting the modern mind with what we might call the digital mind. So Christine Rosen warns in The Extinction of Experience, whose bleak subject matter is only made bearable by its author’s feisty sense of irony.

Rosen’s critique of the digital age avoids the usual suspects, from the comprehensive restructuring of our economy and the rise of a new class of technocratic robber barons to the inevitable physical consequences of persistent hunching over a small screen. Instead, she narrows her scope to just one aspect—its effect on the nature of human experience. By this she means our sensory and tactile engagement with our world and the people in it, which is atrophying at a distressing rate. Recent trends such as the decline in face-to-face conversation or “the quiet disappearance of handwriting from our lives” seem harmless enough when taken by themselves but in their totality are cataclysmic:

Many children now grow up in a world where their first experiences—of the natural world, play, music, and words—are mediated through the screen or some other form of technology. Their toys talk to them and record them; their baby monitors watch them; their devices track and monitor them; their parents create online identities and Instagram pages for them from birth. They will grow up in a culture that values the digital image above all. … This is our world. Do we want to live here?

Do we have a choice?

_____________

A book like this is likely to divide readers according to where they fall on the digital fault line. To those of us who came of age when the computer was less a handheld device than a small building into which one fed punch cards, there are no great revelations here. You read it as you might read the formal police report of an accident that you lived through, with a kind of detached chagrin, surprised only at the amount of forensic detail. Those of a later generation, particularly those who grew up with the smartphone—meaning anyone under 30—are likely to find it impossibly quaint. (Revive cursive handwriting? Why not buggy whips and the codpiece?) And yet this is precisely the cohort that suffered most intensely from the effects of digital dependence during the coronavirus pandemic, America’s “large-scale experiment in distance learning for nearly 50 million K–12 students.” They are the ones who returned to school to make only 70 percent of the gains in reading they would have achieved in a normal year, and less than 50 percent in math. If anybody should recognize that something has gone badly wrong, they should.

For Rosen (this magazine’s Social Commentary columnist and a senior fellow at the American Enterprise Institute), virtual learning is “disembodied” learning, literally disregarding the fact that we are bodily beings inhabiting physical spaces, about which our senses continually inform us. She cites the remarkable research of Patricia Kuhl, who studied how babies acquire language, recording their head movements as they sat on their mothers’ laps and watched their mothers speak. Kuhl then systematically repeated the experiment, this time with the mother speaking on a television screen, and with chilling results: no learning whatsoever.

Kuhl merely demonstrated scientifically what we should know instinctively—that we learn with our whole bodies, just as we experience the world itself. This has long been known in the Western world, where young children were once whipped to make them remember important events, transactions, or property boundaries, a practice that survives in milder form in such rituals as the dubbing of a knight or a bishop’s symbolic slap at confirmation in the Catholic Church.

Kuhl’s research was published in 2010, a decade before the pandemic, but did nothing to constrain our national experiment with remote learning. Too much self-interest was at play, from “technology billionaires’ foundations and technology companies” to political leaders who could not resist the allure of the quick fix (“Zoom classes for all!”), none of whom would be held responsible for the long-term damage. Also culpable, although Rosen does not mention them, were America’s public-school teachers, most of whom must have soon recognized that online teaching was not working. Any seasoned teacher knows that the classroom is a social place and that curiosity is contagious; a student’s receptivity to new things (and retention of them) is accelerated by mutual reinforcement with those around her, when everyone suddenly feels a shared awareness of synchronized bodies, a kind of quiet mental stampede. Those moments are indelible as lived experience, while we watch detached speakers on Zoom with not much more attention than the babies paid in Kuhl’s experiment.

_____________

Rosen’s most fascinating, and most heartbreaking, chapter concerns that sublime instrument of communication, the human face. Among mammals, ours is the one furless and hairless face, so that every tightening or slackening of facial muscles can be read clearly and in each of its myriad combinations (one scholar has evidently mapped out “over two hundred uniquely different expressions of anger”). If the human face has evolved to be an elastic message board, so has the human eye, whose unusually large sclera (the white) makes its movements much easier to detect. This helps to calibrate the precise moment of eye contact, which plays an important role in social synchronization—the process by which brain waves start to move sympathetically in tandem between two individuals. To read facial expressions properly, you must be cognizant of subtle unspoken signals, slight turns of the body and tilts of the head, little movements of the mouth and lips. And all this must be distilled into a conclusion, for example that you may kiss someone, or send the message that you are willing to be kissed.

When my colleagues and I returned to campus after our bout of remote teaching, we were at first relieved to be together with our students in a classroom, only to be dismayed by how difficult it was to connect with them behind their masks. “I just can’t read the room,” one colleague lamented. Another was distressed by how students increasingly tend to avoid eye contact. Rosen shows us that his distress was not imaginary but has a physiological basis. She introduces us to the vagus nerve, which links the brain to the heart and other internal organs, and which is affected by our ability to read facial expressions and tone of voice: “Increases in one’s ‘vagal tone,’ which is measured by studying heart-rate variability, are related to one’s capacity for connection.” And the more attuned you are to others, as Barbara Fredrickson’s research shows, the healthier you will be.

And vice versa, alas. Already by 2012, in the first flush of the smartphone, there was a precipitous drop in face-to-face contact. It was worst among young Americans. A massive study of some 3,500 young girls, ages eight to 12, showed them spending 6.9 hours a day on electronic media and just 2.1 hours interacting in person with someone. This was 13 years ago. While some of that media use took place on platforms such as FaceTime and was ostensibly personal, young people tend to multitask and not give their onscreen friend their full attention; digital interaction is no substitute for its face-to-face counterpart. In any event, the greater the use of electronic media, the worse one’s “negative social well-being,” while “face-to-face communication was strongly associated with positive social well-being.”

Sooner or later vices tend to exact a penalty, and if a certain loss of “social well-being” were simply the penalty for enjoying digital media, you might make your own cost-benefit analysis. But as with tobacco or alcohol, there are other hidden costs. The most alarming statistic in The Extinction of Experience is that the last generation of college students has seen a significant drop in its capacity for empathy. From about 1980 to 2010, it declined by 40 percent, the drop accelerating since the introduction of the smartphone, and it is hardly likely to have reversed course since then. That might help explain why polls at several colleges showed that up to half of their students felt the murder of UnitedHealthcare CEO Brian Thompson was “acceptable” or “completely justified.” Or worse, why at least three people could watch a psychopath burn a woman to death in the New York subway without making a move to help—and at least one of them recorded it on video.

_____________

Some years ago, I listened with envy as a former student described his experience of hiking the Appalachian Trail and how his initial discomfort and frustration gave way, after a week or two, to a totally unfamiliar but thoroughly pleasurable state of mind. He described it as a kind of serenity where body and mind were at peace with each other, with that clarity that comes with total freedom from distraction. Whatever that state of mind was, I remember thinking, I keenly desired it.

Of course, up to about 10,000 years ago, life for all of us was essentially a perpetual hike on the Appalachian Trail. However much we have evolved since then, evolution has left us defenseless against the nonstop drip of digital distraction. Were it foisted on us by Mexican cartels, and not by tech billionaires, digital addiction would be studied as sedulously as heroin or OxyContin addiction. Pornography is only one category of possible digital addiction; the platforms’ finely calibrated algorithms ensure that we remain as long as possible on Instagram or YouTube or TikTok, swiping away, creating that saddest of contemporary sights, those “parents staring raptly at their smartphones while pushing their children in strollers.” Until recently, one spoke of the “30-million-word gap,” the notion that by the age of three children in upper-class households had heard that many more words spoken than their lower-class compatriots. One suspects that a repeat of that study would show a much smaller gap between social classes, so much is digital addiction an equal-opportunity temptation.

For the dread of boredom, Rosen shows, is one of the great motivating factors of modern life. The smartphone means that one need never experience boredom while waiting in line, and because of that, we have lost our tolerance for it. “Unmediated interstitial time,” as she drily calls those fallow moments, “is going extinct”; there is no time interval so short that it cannot be filled with a look at the phone, including the amount of time it takes for the stoplight to turn green. But, she asks, could boredom serve some useful function—that is, besides its admonitory role of teaching us patience?

As part of her research, Rosen found her own version of the distraction-free Appalachian Trail at a Trappist monastery in Kentucky, where she spent a week in silence. (One of the pleasures of this book is learning what an engagingly intrepid researcher Rosen is, as when she deliberately bumped into rush hour commuters on a sidewalk in Washington, D.C., where she learned that those merely hurrying could be mollified with an apologetic smile while those talking on their phones would respond with an angry curse.) She has much to say about the value of boredom, how “a culture without boredom undermines the act of daydreaming,” which is where the creativity of the wandering mind is set free. Likewise, childhood creativity thrives on “unstructured, unmediated time,” which would account for the falling scores on the Torrance Test, which measures creativity in children—yet another of the book’s ominous statistics.

A test of a thoughtful work of close social observation is its explanatory power, its ability to help us make sense of otherwise baffling phenomena. Recently, I was startled to find young viewers mystified by A Charlie Brown Christmas, the celebrated television special of 1965; I had assumed it had a permanent place in the pantheon of childhood classics, up there with The Cat in the Hat and The Wizard of Oz. But Rosen helped me realize that the problem was its pervasive tone of melancholy. Caught in that narrow register between depression and boredom—equally unhappy conditions that demand treatment—melancholy has effectively been squeezed out of existence to the point where it is no longer part of the emotional life of modern youth.

The digital mind, if we may call it that, a mind that does not know melancholy, cannot abide boredom, and thrives on distraction, seems to be a mind that has trouble concentrating. The absolute focus and sustained attention that high-level tasks demand are not something that one learns on the job; they derive from the habits of a lifetime. Is it any wonder that as those habits have changed, and in barely one generation, we find ourselves increasingly concerned about our “crisis of competence”?

Christine Rosen’s The Extinction of Experience is an essential book for our time, if a melancholy one. Her good humor and abundant human sympathy make her a welcome companion as she takes us on what is a tragic journey. She tells us much about the workings of the human heart and mind—including our own—in our digital age. She refrains from taking the next step into speculating how a society of digitally minded citizens, impoverished in both empathy and creativity, will reshape itself in the coming years. She has given us the psychological analysis. The sociological will have to wait.

Photo: Nevit Dilmen via Wikimedia Commons

