You know how certain technologies can be dangerous together? Texting and cars. Cigarettes and filling stations. Blood thinners and surgery.
It turns out that data science and neuroscience are like that. Those barely-overlapping disciplines each understand complementary things about human brains. Unfortunately, when these insights combine, the consequences can be catastrophic for mental health. What data science understands could be called “optimized persuasion.” What neuroscience understands is called “natural statistics.” Not many people are familiar with both ideas because expertise in either data science or neuroscience is rare, and finding both together in one brain is even rarer.
So lucky me — my PhD combined data science and neuroscience, and I’ve worked for over a decade in each field simplifying, inventing, explaining, analyzing and even helping save Microsoft’s NT5. My lucky accident of double training lets me see how brains work and also how technology affects them. As a bonus, I’m married to a fiercely independent thinker who understands mediated communication and story structure (“narratology”). Our combined expertise covers a wide swath of science and the humanities, and the peer-reviewed research report from our three-year sabbatical was published in 2017.
An Epic Battle
Here’s the gist I want you to know, as a fellow warm-blooded human who cares about those close to you: We’re in the midst of an epic battle between two historic forces, neither of which seems likely to retreat. In one corner, the triumph of data science and algorithms over our sensory and attentional systems — i.e., effective automated persuasion on a global scale. We humans have been hacked. In the other corner, the response of those sensory and attentional systems to a novel, 24/7 data flow containing weird regularities and interruptions, a statistical ensemble that looks nothing like the natural environment in which that hyper-sensitive system is able to self-calibrate. The irresistible force vs. our brittle brains.
The key neuroscience concept behind the brittleness of brains is “natural statistics.” Animal brains evolved to operate in natural environments, and they also learn best in those same environments. For example, a few decades ago, my long-time colleagues Bruno Olshausen and Dan Ruderman showed that the information-processing properties of the mammalian visual system should be tuned to the forest-and-bush-heavy statistics of natural scenes, and that real neurons are in fact tuned that way.
In other words, our eyes expect natural-looking fine detail everywhere. The natural world is filigree. That’s what the visual system processes and learns from. Saying that nature looks like nature is now a quantifiable claim, and it has become a crucial concept for understanding how brains work. For example, lab animals that grow up in cages, which are flat, boring and lit by flickering lights, have worse vision than animals that live outdoors.
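For readers who want a taste of what “quantifiable” means here: one well-documented hallmark of natural scenes is that their power falls off with spatial frequency roughly as a power law. The sketch below is an illustrative toy in Python, not the analysis pipeline from any of the papers mentioned. It builds a synthetic image with a 1/f amplitude spectrum, the kind of statistics photographs of foliage tend to show, and then recovers the expected log-log power-spectrum slope of about -2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256

# Build a synthetic image whose amplitude spectrum falls off as 1/f,
# a statistical regularity measured in photographs of natural scenes.
# (This is a toy construction for illustration, not real scene data.)
fx = np.fft.fftfreq(n)[:, None]
fy = np.fft.fftfreq(n)[None, :]
f = np.sqrt(fx**2 + fy**2)
f[0, 0] = 1.0  # avoid division by zero at the DC term
phase = rng.uniform(0, 2 * np.pi, (n, n))
spectrum = (1.0 / f) * np.exp(1j * phase)
image = np.real(np.fft.ifft2(spectrum))

# Measure the radially averaged power spectrum of the image and fit
# its log-log slope; a 1/f amplitude spectrum means power ~ 1/f^2,
# so the fitted slope should come out close to -2.
power = np.abs(np.fft.fft2(image)) ** 2
radius = np.round(f * n).astype(int)
mask = (radius > 0) & (radius < n // 2)
sums = np.bincount(radius[mask], weights=power[mask])
counts = np.bincount(radius[mask])
radial = sums[1 : n // 2] / counts[1 : n // 2]
freqs = np.arange(1, n // 2) / n
slope = np.polyfit(np.log(freqs), np.log(radial), 1)[0]
print(f"log-log power-spectrum slope: {slope:.2f}")
```

A photograph of foliage yields a similar slope; a flat, uniformly lit wall does not. That difference is one concrete way the gap between natural and artificial visual environments can be measured.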
Neuroscientists already know that hours a day of fake, distracting, unnatural input is bad for animal brains in general; why would it not be true for human brains?
On the other hand, as we know, electronic technology is ever-more addictive, intrusive and profitable, now consuming half of our waking hours and social lives, more or less. The problem isn’t that technology is necessarily bad — it isn’t — but that the specific kinds of technologies we find most captivating are bad, precisely because we find them captivating.
Our informational appetites evolved in the bush, where interesting things and dopamine are hard to find. Now, we have quick hits everywhere. Any kind of animal tends to get addicted when appetizing things once rare in nature — cocaine-levers for rats, laser dots for cats, treadmills for mice — become suddenly common. Really unusual stimuli violate nature’s statistical contract simply by existing.
You don’t need to know the technical details of our scientific article (I’ll explain it if you want). The gist is threefold: 1) any technology that influences our sensory interactions affects our brains in numerous ways; 2) most of the damage is unconscious; and 3) our usual response to the damage is to seek even more of the media causing it.
From Cave Art to TV
Of course, this has always been true of representational technologies, from cave art to novels to TV. As we fogies keep saying, don’t look too long at the flat thing because it is nothing like the real thing. But smooth sidewalks and flat walls are also unnatural and also deprive us of the natural variability we need. So, in a deep sense, the problem we face is as old as material culture itself — we build things we find fascinating, then we become fascinated by them.
But now the things we build are a million times faster and smarter than us. Technology has outstripped the human sensory system. We are simple beings, whose nervous systems evolved to work in the simple medium of air. Air is multi-sensory, high-fidelity, dropout-free, symmetrical and honest. It is in every way the opposite of the digital world. Persuasive representational technologies, on the other hand, have never before been so well-informed, so integrated into daily life, so interactive, so instantaneous, so “smart,” and so focused. As a species, machines have won the evolutionary bandwidth war hands-down. And they’re getting better all the time.
The good news: Just get back in 3D space and you’ll be fine. “Natural statistics” is why hikes outdoors make us feel better and why laughter in a crowded pub beats watching videos by yourself. From technology’s point of view, the onslaught of tech taking over human minds seems unstoppable. But from a human point of view, just turn it off and look around. You’re still here, after all.
*[Big tech has done an excellent job telling us about itself. This column, dubbed Tech Turncoat Truths, or 3T, goes beyond the hype, exploring how digital technology affects human minds and bodies. The picture isn’t pretty, but we don’t need pretty pictures. We need to see the truth of what we’re doing to ourselves.]*
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.