'It explains why our ability to focus has gone to hell': Screens are assaulting our Stone Age brains with more information than we can handle

The silhouette of a head outlined against many blue computer screens
(Image credit: Bevan Goldswain via Getty Images)

We often joke that our attention spans have dropped significantly in recent years with the rise of digital technologies and screen-centric entertainment, but there is sound science to back up this observation. In fact, a shorter attention span is simply one side effect of a recent explosion of screen distractions, as neurologist and author Richard E. Cytowic argues in his new book, "Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload" (MIT Press, 2024).

In his book, Cytowic discusses how the human brain has not changed significantly since the Stone Age, which leaves us poorly equipped to handle the influence and allure of modern technologies — particularly those propagated by big tech companies. In this excerpt, Cytowic highlights how our brains struggle to keep up with the lightning-fast pace at which modern technology, culture and society are changing.


From an engineering perspective, the brain has fixed energy limits that dictate how much work it can handle at a given time. Feeling overloaded leads to stress. Stress leads to distraction. Distraction then leads to error. The obvious solutions are either to staunch the incoming stream or alleviate the stress.

Hans Selye, the Hungarian endocrinologist who developed the concept of stress, said that stress "is not what happens to you, but how you react to it." The trait that allows us to handle stress successfully is resilience. Resilience is a welcome trait to have because all demands that pull you away from homeostasis (the biological tendency in all organisms to maintain a stable internal milieu) lead to stress.

Screen distractions are a prime candidate for disturbing homeostatic equilibrium. Long before the advent of personal computers and the internet, Alvin Toffler popularized the term “information overload” in his 1970 bestseller, Future Shock. He promoted the bleak idea of eventual human dependence on technology. By 2011, before most people had smartphones, Americans took in five times as much information on a typical day as they had twenty-five years earlier. And now even today’s digital natives complain about how stressed their ever-present tech is making them.

Visual overload is more of a problem than auditory overload because today eye-to-brain connections anatomically outnumber ear-to-brain connections by about a factor of three. Auditory perception mattered more to our earliest ancestors, but vision gradually took prominence. It could bring what-if scenarios to mind. Vision also prioritized simultaneous input over sequential input: hearing unfolds in time, so there is always a delay from the moment sound waves hit your eardrums to the moment the brain understands what you are hearing. Vision’s simultaneous input means that the only lag in grasping it is the one-tenth of a second it takes a signal to travel from the retina to the primary visual cortex, V1.

Smartphones easily win out over conventional telephones for anatomical, physiological, and evolutionary reasons. The limit to what I call digital screen input is how much information the lens of each eye can transfer to the retina, the lateral geniculate, and thence to V1, the primary visual cortex. The modern quandary into which we have engineered ourselves hinges on flux, the flow of radiant energy that bombards our senses from far and near. For eons, the only flux human sense receptors had to transform into perception involved sights, sounds, and tastes from the natural world. From that time to the present we have been able to detect only the tiniest sliver of the total electromagnetic radiation that instruments tell us is objectively there. Cosmic particles, radio waves, and cellphone signals pass through us unnoticed because we lack the biological sensors to detect them. But we are sensitive, and highly so, to the manufactured flux that started in the twentieth century and lies on top of the natural background flux.

Our self-created digital glut hits us incessantly, and we cannot help but notice and be distracted by it. Smartphone storage is measured in tens of gigabytes and the hard drive of a computer in terabytes (1,000 gigabytes), while data volumes are calculated in petabytes (1,000 terabytes), zettabytes (1,000,000,000,000 gigabytes), and beyond. Yet humans still have the same physical brain as our Stone Age ancestors. True, our physical biology is amazingly adaptive, and we inhabit every niche on the planet. But it cannot possibly keep up with the breathtaking speed at which modern technology, culture, and society are changing. Attention spans figure prominently in debates about how much screen exposure we can handle, but no one considers the energy cost involved.
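To make the scale of those units concrete, here is a minimal back-of-the-envelope sketch in Python, not from the book, that expresses the decimal storage prefixes mentioned above in gigabytes:

```python
# Illustrative sketch (not from the book): express each decimal storage
# unit named in the paragraph above in gigabytes.

GIGABYTE = 10**9  # bytes, using decimal (SI) prefixes

units_in_bytes = {
    "terabyte": 10**12,   # 1,000 gigabytes
    "petabyte": 10**15,   # 1,000 terabytes
    "zettabyte": 10**21,  # 1,000,000,000,000 gigabytes
}

for name, size in units_in_bytes.items():
    print(f"1 {name} = {size // GIGABYTE:,} gigabytes")
```

Run as-is, it prints the three conversions, which match the parenthetical figures in the paragraph above.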

A much-cited study conducted by Microsoft Research Canada claims that attention spans have dwindled to below eight seconds, less than that of a goldfish, and this supposedly explains why our ability to focus has gone to hell. But that study has shortcomings, and “attention span” is a colloquial term rather than a scientific one. After all, some people’s Stone Age brains have the capacity to compose a symphony, monitor the data stream from a nuclear reactor or the space station, or work out heretofore unsolvable problems in mathematics. Individual differences exist in the capacity and ability to cope with stressful events. To give California its due, Gloria Mark at the University of California, Irvine, and her colleagues at Microsoft measured attention spans in everyday environments. In 2004, people averaged 150 seconds before switching from one screen to another. By 2012 that time had fallen to 47 seconds. Other studies have replicated these results. We are determined to be interrupted, says Mark, if not by others, then by ourselves. The drain from all this switching is "like having a gas tank that leaks." She found that a simple chart or digital timer that prompts people to take periodic breaks helps a lot.

Neuroscience distinguishes sustained attention, selective attention, and alternating attention. Sustained attention is the ability to focus on something for an extended period. Selective attention speaks to the aptitude for filtering out competing distractions to stick with the task at hand. Alternating attention is the capacity to switch from one task to another and back again to where you left off. In terms of the energy cost incurred by repeatedly shifting attention throughout the day, I fear we have hit the brain’s Stone Age limit. Exceeding it results in foggy thinking, reduced focus, thought blocking, and memory lapses.

As with precision calipers, any tool quickly comes to feel like an extension of oneself. The same applies to smart devices. Two centuries ago, when the first steam locomotives reached a blistering speed of thirty miles per hour, alarmists warned that the human body could not withstand such speeds. Since then, ever-faster cars, communication methods, jet planes, and electronics have diffused into the culture and become absorbed into daily life. In earlier times fewer new technologies appeared per decade, fewer people were alive, and society was much less connected than it is today.

By contrast, the invention, proliferation, and evolution of digital technology have put the status quo in constant flux. Unlike analog counterparts such as a landline telephone or a turntable, smart devices repeatedly demand and command our attention. We have conditioned ourselves to respond to texts and incoming calls the moment they arrive. Admittedly, sometimes jobs and livelihoods do depend on an immediate response. Yet we pay a price in terms of energy cost incurred by constantly shifting and refocusing attention.

Disclaimer

This excerpt has been edited for style and length. Reprinted with permission from "Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload" by Richard E. Cytowic, published by MIT Press. All rights reserved.


Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload — $28.91 on Amazon

The human brain hasn’t changed much since the Stone Age, let alone in the mere thirty years of the Screen Age. That’s why, according to neurologist Richard Cytowic — who, Oliver Sacks observed, “changed the way we think of the human brain” — our brains are so poorly equipped to resist the incursions of Big Tech.

Richard E. Cytowic
Clinical Professor of Neurology at George Washington University

Richard E. Cytowic, MD, MFA, is best known for returning synesthesia to mainstream science after decades of disbelief. Dr. Cytowic speaks to cultural institutions and performance venues worldwide. He holds an MFA in Creative Writing from American University and is an alumnus of Duke, Wake Forest, and George Washington Universities, along with London’s National Hospital for Nervous Diseases.