Newspapers were already good business, way back when they carried actual news, on paper. Gather some news, write it down, stamp cheap ink on cheap wood-pulp and presto! — scalable revenue, with no obvious upper bound.
In fact, the more people read printed news, the more valuable it is, being both trusted by many and known by many. With rewards like those, of course some news outlets grow to national, even international, scale, almost like natural monopolies, and provide whole societies with standardized, synchronized information. With a subscription, any person or library could have their own complete, permanent record of what matters — and mattered — to society.
Now update printed news to the digital age. The cost of copying and delivering information goes from cheap to free, meaning more profits for the outlet but also more uncaptured revenue — because each reader can now copy and deliver too. Furthermore, each specific reader can be profiled, so that specific items and ads can be tailored for each reader. It’s almost the perfect business.
An Almost Perfect Business

“Almost,” because while digital platforms give far more fine-grained control over readers, they lose control over money. It’s actually too easy to copy digital content, and too easy for readers to find free versions, so the old pay-for-paper model, with its chokepoint at the retailer, doesn’t work anymore. One solution is extra advertising, bound so tightly with the news, paragraph by paragraph, that you can’t escape it. Unfortunately, such relentless interruption undermines reading comprehension and drives readers crazy.

Is Technology Neutral?
The alternative is to charge for digital access, with a paywall. A paywall not only captures revenue otherwise lost but, by making the information harder to access, it reaps the rewards of scarcity, because hard-to-get things are more valuable. Furthermore, a paywall can charge different rates to different people and give different things in return, so money and attention can be most effectively squeezed out of readers. In fact, if you ask the simple business question about how to turn an existing news source into money, a paywall seems the best answer. And it is the best answer, but to the wrong question.
The right question asks about the opposite influence: How do paywall revenues influence the information being supplied?
That depends on what you mean by “information.” Economics and computer science both understand information, but in different ways. As a general rule, economics treats information as part of the essential infrastructure, like air: necessary, neutral and freely available. Or rather, economists use the concept of “free information” to prove theorems about stable equilibria in free markets. But as American laissez-faire economist Milton Friedman famously put it, “There is no such thing as a free lunch.” He could just as well have said, “There is no such thing as free information.” Information costs money and is worth money. The rarer it is, the more it’s worth — and the more it’s worth to copy.
That insight is so deep, even computer scientists understand it. In fact, one of them proved it mathematically. Claude Shannon, the founder of information theory, proved that the more unlikely a message is, the more information (bits or bytes) it carries. But there’s a catch: If you copy the information even once — much less a million times — you change those probabilities, and thus change the actual information carried by the message, even if the apparent information (the content) remains the same. So copying, just by itself, corrupts information. A stock tip on the front page isn’t worth as much.
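Shannon’s point can be sketched in a few lines of Python. The probabilities below are illustrative, not from the article: self-information is simply the negative log of a message’s probability, so the more copies circulate (raising the probability of encountering the message), the fewer bits each copy carries.

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon self-information: bits carried by a message of probability p."""
    return -math.log2(p)

# A rare message carries more information...
rare = surprisal_bits(1 / 1024)   # a 1-in-1024 message carries 10 bits
# ...than a common one.
common = surprisal_bits(1 / 2)    # a coin-flip message carries just 1 bit

# Copying the message makes encountering it more probable, so each copy
# carries less information, even though its text is unchanged.
after_copies = surprisal_bits(1 / 4)  # now 4x as likely: only 2 bits
```

In this framework, the drop in surprisal is exactly why a widely copied stock tip is worth less: the content is the same, but the improbability that made it information is gone.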
Now add in moral hazards, the economist’s term for foxes guarding henhouses. For hundreds of years newspapers tilted news toward advertisers and those in power, but the tools were coarse, and proof lingered on paper. Online advertising now commands millions of times more data, so the moneybags have even more power to ensure the news serves their own interests. And since there is no solid paper archive, there is no way to record whatever chicanery keeps those sponsors happy. Biased news helps defray the bills, leaving little lingering trace, save on balance sheets. In the online world, unbiased news is more expensive to write — and harder to sell.
Moral hazards show up inside newsrooms too. The New York Times is reputed to earn $600 million a year from its paywall — over half a billion dollars from people reading news on screens. With that much revenue at stake, how likely is it to report on the undisputed technological proof that screen use damages the human nervous system? Economically, its paywall forces it to keep quiet about a dangerous technology.
The root problem is a bit of business wisdom I learned as a data scientist at a large aggregator of online sales “leads.” As I wrote the automated fraud-detection system, I was told to devalue so-called incentivized leads — “fill out this form, get a free phone” — as the least reliable ones in our entire ecosystem. The general rule is embarrassingly obvious: If monetary forces have a chance to influence information according to their specific bias, they will. It’s their job. Thus, incentives undermine trust.
To be sure, business pressures and human trust coexisted for millennia, at least until recently. So obviously, incentives all by themselves don’t cancel human trust. But human trust, formed over not just millennia but millions of years, grew out of eye contact, proximity, handshakes and long-term relationships — multiple forms of high-bandwidth sensory information entirely missing online. In the absence of that potent human glue, trust inevitably erodes, and it erodes much faster when business pressures work their magic.
In short, business doesn’t understand trust once computers are involved. Fortunately, computer scientists do. They know two things.
First, they know that errors cascade. One becomes two becomes four becomes eight, and so on. That’s why even a single bit-flip can crash a computer. Second, they know that distributed processors must cooperate. If different parts of a computer start competing with each other, especially undermining each other’s communications the way warring nations do, the system must fail. So digital computers trust every bit, because every bit is perfect. It has to be.
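Both points can be illustrated with a toy sketch (the numbers are mine, purely for illustration): a single flipped bit in plain data is only a small numeric error, but once the machine interprets the corrupted value — here as an array index — the error cascades into a failure.

```python
def flip_bit(value: int, position: int) -> int:
    """Flip one bit of an integer (XOR with a single-bit mask)."""
    return value ^ (1 << position)

# In plain data, one flipped bit is merely a wrong number:
assert flip_bit(2, 4) == 18   # 0b00010 -> 0b10010

# But when the machine *interprets* that number, say as an index,
# the single-bit error cascades into a crash:
data = list(range(16))
index = flip_bit(2, 4)        # 2 becomes 18, past the end of the list
try:
    data[index]
except IndexError:
    print("one flipped bit, and the lookup fails")
```

This is why hardware and protocols spend so much effort on error detection and correction: every bit must be trusted, so every bit must be checked.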
To understand distributed human trust, imagine our base hardware in its original analog “paleo” configuration: pre-verbal Homo sapiens. Before words, our bipedal ancestors foraged and hunted in tight-knit groups, communicating entirely with grunts, hoots, grimaces and back-slaps. That is, their interactive, distributed communications channel was live vibration, like elaborate tuning forks linked over meters and seconds, down to micrometers and microseconds. As vibrating jelly bags, they stayed “in tune” with each other without using words or categories. That’s how we natively collaborate.
The good news is, we know the human kind of distributed computing works, or we wouldn’t be here. The bad news is, our version needs in-your-face reciprocity, which doesn’t work online. Imagine news, back in the day. An incident seen by a human being, and recounted around the campfire in the presence of other human beings, counted as trustworthy “news” to a human. Physical newspapers were still somewhat trustworthy, because a newspaper published over years by a known citizen had real people as publishers and editors, and a medium — paper — that was public, transparent and enduring. An article flickering on a screen under a logo is far less trustworthy, being devoid of permanence and human presence. Paywall-enabled news is at its worst when messages appear differently to each reader, tailored by ads and algorithms, and can disappear whenever powerful interests wish, replaced by a sanitized version under the same title.
So the very features of digital news which help paywalls make money — their abilities to track readers, to rewrite and repost at will, to prevent unauthorized access and attribution — also undermine both individual trust and public trust. Paywall outlets do provide whole societies with information, but “personalization” means it is now neither standardized nor synchronized, and thus not useful for solving social problems. Unlike subscription newspapers, paywalls contribute noise and bias more than signal. In the neutral terms of computer science and data science, paywalls and ad tech undermine trust. It’s time for real innovators to find a real solution.
*[The articles in this column present a set of permanent scientific truths that interlock like jigsaw pieces. They span physics, technology, economics, media, neuroscience, bodies, brains and minds, as quantified by the mathematics of information flow through space and time. Together, they promote the neurosafe agenda: that human interactions with technology harm neither the nervous system’s function nor its interests, as measured by neuromechanical trust.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.