In 1954, when television sets were just becoming widespread in American homes, Alfred Hitchcock made “Rear Window.” The film portrays L. B. “Jeff” Jeffries, a magazine photographer played by James Stewart, stuck at home in a wheelchair with a broken leg, whose only entertainment is gazing through his window and observing his neighbors’ private lives in their homes. “First I watched them to kill time,” Stewart tells the viewer, “but then I couldn’t take my eyes off them, just as you wouldn’t be able to.”
The film is a subtle allusion to the new medium of television that would soon change the world. Using the window as a metaphor, Hitchcock depicts the effects of television on privacy and, most importantly, on people’s curiosity about each other. He anticipated the cultural shifts that television would bring to American society. Watching one’s neighbors through a rectangular window is not much different from watching TV from a couch. In this sense, Hitchcock anticipated the couch potato, a figure that has since entered the English idiom.
How Social Media Is Changing Our World
But Hitchcock’s artistic foresight went further than that. He foresaw something more profound: the intersection between human curiosity and a new communication technology, television. Today, in the age of the smartphone and social networks, this intersection is even more visible and can fairly be called the curiosity economy.
Since the heyday of television, human curiosity has driven technology further. The post-TV age made entertainment portable and put it in our pockets. There is a reason smartphone screens have been getting larger, not smaller, over the years. If television meant watching random, unknown people on a wide screen, the smartphone allows us to observe our neighbors on social media. This makes social media much more engaging and personal than television. Teens today can survive without TV, but they can barely do without their phones. A study last year found that people would have to be paid around $1,000 to quit Facebook, even after the Cambridge Analytica scandal that exposed the internet giant’s abuse of user data.
The smartphone era is sometimes referred to as the attention economy, in which tech companies treat human attention as a scarce commodity and bombard us with push notifications and updates. But our attention is fueled first and foremost by our curiosity and the desire to know what others do and say. The curiosity economy is at the heart of our “global village,” a phrase coined by the media philosopher Marshall McLuhan. As early as the 1960s, McLuhan presciently remarked that “the global village is a world in which you don’t necessarily have harmony. You have extreme concern with everybody else’s business and much involvement in everybody else’s life.”
Reading a book used to be the most private and discreet way of accessing and interpreting information. It is not a public medium like TV, social media, radio or even the newspaper. It is a fully private dialogue between the writer and the reader, completely desynchronized from the public. But even the book is losing this privacy in the curiosity economy. Amazon’s e-reader Kindle shows the most popular highlights throughout a book and recently introduced a button for Goodreads, a social media website for rating books. In the curiosity economy, it is no longer enough to interpret the content of a book for oneself — Kindle now allows you to do it “together” with other people. It gives clues as to what the public, not the individual reader, perceives as worth noting. Even listening to music is not always private. The music-streaming platform Spotify keeps its “social” option on by default, so friends can see what music you are listening to. If you want to listen in complete privacy, you have to keep switching the function off.
If privacy is one of the biggest concerns of our times, then curiosity is the other side of the same coin: The former is under threat because the latter has no limits.
Who Killed the Video Star?
The shift from passive TV-viewing to the more engaging smartphone use is also seen on the political scene. If telegenic JFK was the first TV president, then Donald Trump is the first Twitter president. The TV age was mostly about presentability and image. The social media age is more about engagement and entertainment. In the TV age, one had to be physically present in a specific place at a specific time to be able to tune in to a lengthy political debate or a presidential address to the nation, which sometimes could last for hours. This made the engagement between the voter and the politician less frequent but more substantial.
The portable smartphone changes that. The curious and impatient smartphone voter expects more frequent updates from politicians than the TV voter did. And the line between the politician and the influencer is increasingly blurred.
President Trump is known for starting his day by tweeting and admits he uses Twitter mainly to “keep people interested.” And he is not alone in doing that. The former Democratic presidential candidate Beto O’Rourke once live-streamed his visit to the dentist, suggesting that “If it is not on Instagram, it didn’t happen.” Ukraine’s President Volodymyr Zelensky successfully used Instagram for his electoral campaign, and it is not uncommon for him to address voters on Instagram directly from the gym.
In Brazil, President Jair Bolsonaro largely avoids TV and focuses on engaging with voters on social media instead. One analyst told The Economist that Bolsonaro is perceived as more sincere across social media networks because there, he is usually seen among friends and family. The former Italian Deputy Prime Minister Matteo Salvini used Facebook as part of the “every selfie a vote” strategy, as described by The Atlantic. Even the old-guard presidential candidate, former New York Mayor Michael Bloomberg, almost entirely bet his campaign on social media in order to defeat Trump, using the president’s own tactics of simplistic communication through memes.
In short, in the post-TV age the politician has to drop the suit and tie and behave more like your next-door neighbor. The public figure must provide the voter with a constant stream of information where the emphasis often is on quantity rather than quality.
In the Information Age, this constant stream of updates does not go to waste. It is now a valued resource we call data. Just as the flow of a river is converted into energy, we now use this flow of information to create intelligence — artificial intelligence. The enormous amount of information we generate is turned into a commodity that is now officially more valuable than oil. We have become the hunter-gatherers of information, and this new gold is no longer found underground but on the servers of tech companies.
But how did we arrive at this global village? How could the value of Facebook — a website originally intended for college students to see each other’s pictures — become bigger than the GDP of Argentina? What drove curiosity to become such an important pillar of today’s tech-based society?
One of the answers is that curiosity is deeply ingrained in our survival instincts. It is a human trait interlinked with prudence and the fear of the unknown. The word comes from the Latin cūriōsus — a careful, diligent person — with the root word cura, or care. Since prehistoric times, human beings have never been truly safe in their villages or caves, so they had to explore their immediate surroundings and expand the known territory against possible threats from outside. Attack was always the best form of defense. Exploring and conquering distant lands was a form of protection from the unknown. So was conquering nature. Driven by curiosity, every scientific discovery exposed nature’s secrets and, as a result, its threats. It is telling that NASA’s rover currently exploring Mars is called Curiosity.
It seems that we are curious about each other for the sake of connection as much as protection. If, as the French philosopher Jean-Paul Sartre famously quipped, hell is other people, then being curious about each other also means keeping a close eye on each other. In the end, it is Stewart’s curiosity in “Rear Window” that saves the day when he eventually discovers a murderer among his neighbors.
In “Man and Technics,” historian Oswald Spengler observes that in the animal kingdom, keeping a close eye on each other is essential for survival. Carnivores higher up the food chain usually have their eyes set at the front of the skull, allowing them to fix on moving prey, the herbivores. In turn, many herbivores have their eyes set sideways, which allows them to spot lurking predators while they graze.
The human eye is even more complex. Scientists suggest that humans are the only living creatures with a large visible white area — the sclera — around the iris, which allows us to track the direction of each other’s gaze with remarkable precision. Just like animals, we rely on information accessed by sight, smell or hearing. As the saying goes, information is power. But in addition, we have something that animals don’t have, which is speech. This makes us information predators, preying on each other in our own particular way.
Curiosity and the need to stay informed have pushed humanity to constantly improve its communication methods by preserving and extending speech across space and time. When speech evolved into writing, we could, via a piece of paper, put spoken words in our pocket or send a message overseas. The technology of writing made speech portable across space and durable across time. We did something similar with the smartphone: We put the stationary TV set, the typewriter and the telephone into a single device that fits in our pockets. Just like the piece of paper containing speech, the smartphone encapsulates all our communication devices in a portable form, easily accessible at all times. Thus, the ancient goal of expanding our communication capability across space and time has remained the same.
But the question that arises more and more often these days is whether we now have too much information. If information is our new oil, then a simple rule of economics says that an increase in quantity means a decrease in value. It is not the tech industry that experiences the decrease, however, since the more input the AI machine has, the better. It is rather in the socio-political sphere that the depreciation is more visible. The desire for information for information’s sake risks turning politics into entertainment. One cannot have one’s cake and eat it too. “He knows a lot about them by now,” a character in “Rear Window” sums up Stewart’s curiosity. “Too much perhaps.”
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.