Culture

Facebook Rebrands Itself After a Fictional Dystopia

The social network renames itself after the Metaverse, a fictional immersive environment that first showed how awful commercial virtual worlds can be.
By William Softky


November 23, 2021 10:42 EDT

By rebranding itself as Meta, Facebook named itself after an evil empire. It did so on purpose, and quickly enough to express its long-time motto, “Move fast and break things,” which, investors take note, is the antithesis of a long-term strategy. Moving too fast is probably why Facebook ran such open-access internal systems, a systematic sloppiness that enabled whistleblowers to swipe confidential papers documenting deliberate deception and damage.

Those documents show a growing global body count, including sad teenage girls, angry flag-wavers, sex traffickers and foreign dissidents. They prove that Facebook can’t be trusted even with its own security, much less with anyone else’s.




Imagine if, a few years ago, the weirdly-lettered Google had renamed its parent company not with the bland abstraction Alphabet but with the chilling number 1984. George Orwell’s novel “1984,” along with Aldous Huxley’s “Brave New World,” came to define the dystopian genre of literature. “1984” in particular deployed high-bandwidth surveillance and technological manipulation, both now Google specialties, to show how awful such a world would be. Naming a company 1984 would have officially espoused the evil Google once forswore.

Manipulative Interference

In his 1992 novel “Snow Crash,” Neal Stephenson imagined digital mind-worms and a manipulative, immersive, corporate-sculpted social network called the Metaverse. “Snow Crash,” with its virtual avatars, skins and real-estate booms, showed how monetized technological interactions make people miserable. A major point of the book is that people are easy to program. Stephenson’s for-profit Metaverse was the original evil social network, ruthlessly converting human despair into corporate revenue.

The Facebook version will be yet more dangerous, if the company can actually build it. At the moment, Epic’s game “Fortnite” is closer to Stephenson’s high-bandwidth, synchronous space. Facebook’s version will add wireless connectivity, geo-tracking, news feeds, face-reading and biometric scanning, all intrusive technologies that magnify exploitation a hundredfold beyond what Stephenson imagined 30 years ago.


The inevitable result, as mathematical analysis predicted and medical evidence continues to confirm, is that technological socialization inflicts deep psychic damage. The medical case is simple: Humans evolved to communicate directly through in-born, high-bandwidth sensory systems. Any damage to that data flow damages felt connection, but unconsciously, like lead poisoning. Monetized and manipulative interference causes the most harm.

Worse, trying to socialize through technology has horrific impacts on children’s emotional health. Even Stephenson’s original Metaverse, evil as it was, didn’t strip-mine children’s attentional systems for short-term revenue, as Instagram Youth will likely do.  

Unfortunately, those activities are still legal in the US, while lying to investors and government officials is not. Until social media’s damage to children’s mental health became clear recently, those activities were incentivized, nay required, in the name of “adding shareholder value.” Blame for past deeds will be legally murky.

Preventing Future Damage

Preventing future damage is more important, and governments already possess the know-how. If Facebook were a person, its relentless, breakneck, reckless disregard for law and safety over decades would make it by any reasonable standard an incorrigible repeat offender, an addict, a sociopath. Its data flows, incentives and resulting behavior show it to be adept at growing fast and making money, but only by means of a toxic business model and lies.

The growing body count from Facebook and similar technologies presents a blunt challenge to governance of any kind. Even if exactly the right laws were passed, how could such a company obey? Obedience is not in its DNA, its workflow or its charter. How can governments deal with hardwired intransigence when public health is at stake?


The bad news is, the bad stuff is easy to hide. There are myriad curlicue ways that a complex, real-time technology company like Facebook can make obvious short-term money while creating hidden long-term damage. The common thread is algorithms that track money but not damage. 

Such systems can be so well automated that humans may not even know the harm is happening at all. The good news is, all the information necessary to find and solve the problem is right there in the database, if you know where and how to look. (Spying solo into numbers on behalf of CEOs was my main job for several years; database queries don’t lie.)

Better yet, governments already protect public health just fine in other domains, through regular inspections. Elevators, building codes, sprinkler systems, water quality and restaurant sanitation all invoke the same common-sense principle: if some technology endangers human beings, the thing itself must be inspected by a neutral party. Inspect the thing in question, not the paperwork that might be fudged. Governments know that when saving money costs lives, legalistic disclaimers and self-regulation don’t work. Random inspections do.

A crack team of independent data scientists, given support and read-only access to the Facebook databases and file systems, could in short order provide public health answers the public needs but that Facebook executives themselves don’t know.
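
For the technically minded, here is a toy sketch of what one such read-only query might look like, written in Python against SQLite. Every file, table and column name below is hypothetical; nobody outside the company knows Facebook’s actual schema, so this illustrates the posture of an inspection, not its contents.

    import sqlite3  # stand-in for whatever read-only interface inspectors would get

    # Open the metrics store read-only: inspectors look, they don't touch.
    conn = sqlite3.connect("file:metrics.db?mode=ro", uri=True)

    # Hypothetical schema: one row per user per day of engagement metrics.
    query = """
        SELECT age_bracket,
               AVG(minutes_on_device)      AS avg_minutes,
               AVG(wellbeing_survey_score) AS avg_wellbeing
        FROM engagement
        WHERE age_bracket IN ('13-15', '16-17')
        GROUP BY age_bracket
    """

    for age_bracket, avg_minutes, avg_wellbeing in conn.execute(query):
        # Exposure printed next to outcome: the numbers a health inspector needs.
        print(age_bracket, avg_minutes, avg_wellbeing)

The point is not this particular query but the arrangement: a neutral party, read-only access, and numbers pulled straight from the source rather than from a press release.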

Because damage done in ignorance is less culpable, some sort of amnesty will be necessary to grease the informational wheels: say, immunity from prosecution for past deeds in return for new transparency, the way South Africa’s Truth and Reconciliation Commission did. Right now, putting numbers on the body count and stopping it matter more than finding whom to blame.

Inspecting safety numbers is the same essential public-health service a city inspector performs by checking the temperature on a restaurant dishwasher. Toxic social-media metrics like “time on device” and “cost per conversion” are hurting children right now, and they are already tracked. Best of all, their algorithmic influence can be turned off in an instant. Volunteers?
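
What “turned off in an instant” could look like in practice: if a feed ranks posts by a weighted sum of tracked metrics, zeroing a metric’s weight removes its influence without rebuilding anything. Here is a minimal sketch in Python, with invented weights and with metric names taken only from the paragraph above:

    # Hypothetical ranking weights; setting one to zero switches it off.
    WEIGHTS = {
        "time_on_device": 0.0,       # was, say, 0.6 before the inspection
        "cost_per_conversion": 0.0,  # was, say, 0.3
        "reported_wellbeing": 1.0,   # a signal a regulator might want rewarded
    }

    def score(post_metrics: dict) -> float:
        # Rank a post by whichever signals the weights still reward.
        return sum(WEIGHTS.get(name, 0.0) * value
                   for name, value in post_metrics.items())

    # A post engineered to maximize screen time no longer outranks
    # one that users actually report feeling good about.
    print(score({"time_on_device": 42.0, "reported_wellbeing": 0.2}))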

*[The articles in this column present a set of permanent scientific truths that interlock like jigsaw pieces. They span physics, technology, economics, media, neuroscience, bodies, brains and minds, as quantified by the mathematics of information flow through space and time. Together, they promote the neurosafe agenda: that human interactions with technology harm neither the nervous system’s function nor its interests, as measured by neuromechanical trust.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
