On social media sites, teens protect their privacy, transmit hidden messages and even hijack advertisers’ algorithms.
Back in the early days of social media, Danah Boyd was asked to participate on a panel alongside some representatives from various consumer brands. A fellow panelist who worked at Coca-Cola commented with satisfaction that his company was the most popular brand on MySpace. Without meaning to, Boyd (who writes her name in all lowercase letters) laughed audibly. At the moderator’s prompting, she explained that she, too, had noticed how popular Coke was on the site, and investigated. The most popular “brand” turned out to be not the soft drink, but cocaine.
Web-savvy brand managers, marketers, programmers and data analysts would never make that kind of mistake today — or would they? Boyd, an internationally recognized authority on social media — the Financial Times has dubbed her the “high priestess” of social networks — told the audience at the recent Wharton Web Conference that it is becoming more and more difficult even for web professionals to crack the ever-shifting code of people’s online interactions.
She described the increasingly sophisticated ways that users (especially teenagers, who are businesses’ future customers) craft posts to safeguard their privacy, transmit hidden messages to selected recipients and even hijack advertisers’ carefully crafted algorithms that determine which ads are sent to consumers. Boyd also addressed how the very act of planning and building technology entangles us in larger cultural and political questions, the implications of which we are only beginning to understand.
Kids Don’t Care About Privacy, Right?
Trained as an ethnographer, Boyd has spent the past decade traveling around the US talking to young people about how they use social technologies. “I had been overwhelmingly told, ‘Kids these days don’t care about privacy,’” she noted. “And yet when I wandered around talking to young people, I found that young people care deeply about privacy, even in an online environment.” But, Boyd added, how they strive to achieve that privacy is sometimes puzzling to outsiders.
She offered examples of teenagers trying to use social media in ways that not only protect their privacy across different groups of friends, but also shield them from authority figures such as parents, teachers and social workers. In a classic scenario familiar to most parents of teens, a girl who was interviewed on public radio complained about her mother reading all her posts. The teen insisted that she “needed her privacy.” The mother’s response was that since her daughter put the information on the Internet “for the world to see,” the mother was entitled to look, too. The girl’s response, ultimately, was to move to a different social media site and abandon the one her mother had become comfortable with.
Boyd noted that the prevalence of social media has altered our very notions of “public” and “private.” She gave the example of two people having a conversation in a hallway. That conversation is private unless one of the people decides to repeat parts of it to someone else. Online, the default is reversed: conversations are public, retained in full for anyone to read, and the person posting must make a deliberate effort to keep them private.
Young people, Boyd said, choose to privatize certain material, either because they think it is embarrassing or because it will change the dynamics of their relationships. At the same time, they expect their audience to pay attention to the context in which they are operating — for example, the young woman who expected her mother to understand it was not appropriate to read her daughter’s posts. “I see quotes over and over again from young people saying, ‘Why are [adults] on my social media site? They don’t belong here. Don’t they understand?’ Or, ‘I wouldn’t look at their content; why are they looking at mine?’”
Privacy is about much more than just solving technical issues of access control, Boyd stated. “That is not how people live and experience privacy. Privacy is in many ways about controlling the social situation.” She described the case of a 14-year-old African-American boy who became very frustrated with Facebook. The boy explained to Boyd that he goes to an upscale school but characterized his family as having a lower socioeconomic status. His two identities, said Boyd, regularly collide on the social media site: For example, each group makes fun of the other’s video game preferences. “He thinks it should be obvious who he’s talking to, but he’s doing a lot of code-switching,” noted Boyd, referring to the practice of alternating between languages in a single conversation. “This is one of the struggles I see over and over again with young people. They use the technology, they try to separate these worlds, and it doesn’t work.”
Another teen that Boyd encountered chose to routinely delete Facebook posts, a practice the girl referred to as “white-walling.” The purpose was to keep her network from dredging up past events to start “drama” in the present. But a much more prevalent way to achieve privacy, said Boyd, is to “hide something in plain sight.” She gave the example of 17-year-old Carmen, who wanted to communicate with friends, but not family, about a painful breakup with her boyfriend. The girl posted a reference to the song “Always Look on the Bright Side of Life,” which is from a Monty Python movie and means the exact opposite of what it seems to. Carmen knew the British-American cultural reference would not be picked up by her Argentine family, but would be crystal-clear to her peers. The girl managed to signal different meanings to different audiences simultaneously, said Boyd, adding: “Figuring out how to navigate these public fora becomes really challenging.”
According to Boyd, this challenge is why many teenagers prefer to use multiple sites such as Twitter, Tumblr and Instagram in an effort to segregate the different groups of people in their lives. About Facebook, she quipped: “We can’t live with everybody we’ve ever met in the same room. It’s awkward.” She suggested that “the era of Facebook as the single platform for everybody was a complete anomaly” and added she is surprised that it lasted as long as it did. “The environment of fragmenting your services makes sense, in the same way that when you are in public spaces, you socialize in different bars or venues with different groups of people.”
How Snapchat Changes the Game
Issues of privacy and social control are transformed yet again in a service like Snapchat, according to Boyd. The fact that Snapchat makes messages ephemeral is a “shift in practice” that is “actually far more significant than most people realize,” she noted. While “the 20-somethings are sending sexy images,” she said, the teenagers are not. For teenagers, Snapchat offers a way to avoid getting in trouble over things that may be brought up or taken out of context later.
While acknowledging that there has been a publicized battle about whether Snapchat images actually disappear in the time specified, Boyd pointed out that they “socially disappear.” She distinguished snaps from the barrage of tweets and Instagrams that many people receive, which she said most of us cannot possibly keep up with. When you receive a snap, Boyd noted, “you sit there and think, ‘Do I have those seven seconds to really pay attention?’ because if I start looking at it, it will go away.” She called these communications “a beautiful focus of attention in a world where everything has become about constant streams.”
Messing with the Algorithm
Boyd also highlighted some ways that young people are “messing with the stream of content” just for fun, which has an impact on marketers’ and advertisers’ data analysis. Some teenagers have figured out that if they put brand names into posts, those posts show up higher on followers’ news feeds. “It makes no sense when you read them — [that is, if you] ‘human-read’ them,” Boyd noted. “But algorithmically, it looks like Nike suddenly became really important.” Beyond product names, teens also test random pictures and links to see whether they make posts more prominent. And, she added, if you are a 15-year-old boy, “nothing is funnier” than using Gmail in a way that will trigger advertisers to send your friends diaper ads.
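The effect Boyd describes, posts “algorithmically looking important” because of brand mentions, can be sketched with a toy ranking function. This is entirely hypothetical: no real news-feed algorithm works this simply, and the keyword list and scoring weights here are invented for illustration. It only shows why keyword-stuffing could move a nonsense post up a ranked feed.

```python
# Hypothetical toy model of a keyword-sensitive feed ranker.
# Assumption: posts mentioning "brand" keywords get a fixed score boost.
BRAND_KEYWORDS = {"nike", "coke", "apple"}  # invented boost list

def rank_score(post: str, base_score: float = 1.0) -> float:
    """Return a relevance score; each brand mention adds a fixed boost."""
    words = post.lower().split()
    boost = sum(0.5 for w in words if w.strip(".,!?") in BRAND_KEYWORDS)
    return base_score + boost

posts = [
    "going to the movies later",
    "Nike Nike Nike going to the movies later",  # keyword-stuffed
]
# Sort the feed by descending score: the stuffed post ranks first,
# even though, human-read, it is nonsense.
ranked = sorted(posts, key=rank_score, reverse=True)
```

Under this toy model, the keyword-stuffed post scores 2.5 against the plain post’s 1.0, which is the gaming behavior the teenagers discovered.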
Just as individuals manipulate firms online, companies, of course, also manipulate individuals online. Boyd talked about some of the socially problematic issues that arise from companies’ tracking and using people’s personal information. A much-publicized example occurred in 2012 when Target basically “knew” a 16-year-old girl was pregnant before her own father did. The store’s predictive analytics identified her as pregnant based on certain buying patterns and started sending her ads for baby products. Her father noticed the “mistake,” complained angrily to the store manager and eventually found out the truth. Boyd said the story is illustrative of the fact that companies often make marketing and business decisions without thinking about the social or cultural implications.
She cited another story that hit the media, this one in June, about Facebook and “emotion contagion.” Facebook tested 689,003 unknowing users to see if changing what was posted in their news feed could alter the emotional outlook reflected in their own posts (it did). Boyd noted that she could see how, from a business decision standpoint, the practice that Facebook engaged in was of a piece with its normal activities of “curating and organizing information. They do this on a daily basis; they try to make you happy so you’ll stick with the service.” The real issue, according to Boyd, was people’s cultural discomfort with feeling a lack of control. That type of social backlash could have been seriously damaging for a business not as entrenched in people’s lives as Facebook.
From Marketing to Policing
Boyd predicted that an increasing number of organizations will run into social and policy-related issues if they do not devote more attention to the possible impact of their digital innovations. The stakes become even higher when technology is recruited in the service of crime and punishment, she said, as with the current use in some states of “predictive policing.” Using technology to anticipate where crimes will occur and sending the bulk of the police force to those areas, Boyd noted, is problematic because it may baselessly increase the number of searches and arrests in those areas. Another knotty issue is cataloguing people’s DNA, which contains information not only about the individual, but also about his or her biological relatives, even distant ones. She asked the audience to consider that a police database can now make assessments about a particular person’s connections.
Boyd told the audience of technology professionals that they were no longer just building technology, they were building a core aspect of society. She asked them to consider the larger effects of their day-to-day activities on issues of fairness, privacy, politics and culture. “We’re all implicated, even when we’re building front-end systems, even when we’re building things that have nothing to do with large datasets.” Just because we can create something, Boyd asked, does that mean we should?
[This article was originally published by Knowledge@Wharton.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.