Human attention is a scarce yet commercially valuable resource, and social media competes for access to it. The wealth of digital connectivity and advances in browsing and filtering give internet users more options and increase their perceived control over cyberspace. However, this is only part of the truth.
Tech companies often enjoy exclusive access to data characterizing online behavior, which is then processed to infer preferences and customize content. Such personalization can save us hours of searching, for instance by presenting us with news matched to our geographical area and previous browsing. However, it can also be used to promote certain content, such as advertising and particular agendas, in ways that prevent us from seeing the big picture.
Many social media platforms combine their intelligent algorithms with influence elements, borrowed from games and psychology, to grab our attention and persuade us to stay active online. In addictive domains such as gambling, total immersion and losing any sense of real-world time and space are common: casinos famously have no clocks or windows. This is not far from software like TikTok, where each swipe, like the spin of a slot machine, presents you with unpredictable, concentrated content. The immersive design, the uncertainty and the social proof (for instance, how many people liked and commented on a sketch) are all carefully designed influence elements.
Welcome to the Curiosity Economy
One notable difference from gambling is the absence of warning measures and limit-setting tools. Taken together, all of the above can and should make us seriously question whether we are as free and conscious in our online choices as we think we are.
Human beings are social creatures, and we tend to be influenced by the opinions and actions of others. We seek to belong to groups and are curious to understand how others perceive us. It helps us to learn and shapes our behavior and decisions. At the same time, we desire autonomy, independence and distinctiveness. While it is natural that social media is designed to host that social interaction, its density, continuity and novel means are different in nature and scale. In a connected online world, it is always possible to socialize, and this arguably leads to “social obesity,” leaving us with little time for self-reflection.
The continuous nature of online social spaces may also trigger negative experiences such as the “fear of missing out” (FOMO) and lead to frequent checking of social media and seeking validation. FOMO leads to distraction, cognitive overload and other risks such as accessing social media during a lecture or even while driving. Some claim to be able to multitask, but this can, in many cases, just be denial, which is one symptom of behavioral addiction.
If we accept that observation, we need effective countermeasures and policies to make our digital consumption more conscious and informed, but, to date, such protective resources remain lacking. Education about the persuasive and addictive design elements of these spaces, and techniques to combat them, could help build resilience to temptation and unconscious usage.
Education includes explaining basic artificial intelligence (AI) concepts such as collaborative filtering, as well as phenomena such as echo chambers, in which personalization of one's online content produces a potentially biased and partial view of the (online) world. It can also include techniques and soft skills for building confidence and resisting peer pressure, such as expectation management and self-talk. Prevention is the best cure, and education can also focus on how to post wisely and avoid unhealthy interactions that lead to anxiety and regret. Other useful techniques are enhancing self-esteem and using checklists to analyze a post before publishing it.
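To make the collaborative filtering concept concrete, here is a minimal, illustrative sketch of the user-based variant: recommend items that the most similar other user liked. The data and function names are hypothetical teaching aids, not any platform's actual system, which would operate at vastly larger scale.

```python
# Minimal user-based collaborative filtering sketch (illustrative only).
# Users are rows, items are columns; 1 = liked, 0 = not seen.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two interaction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(ratings, user_idx):
    """Suggest items the most similar other user liked but this user has not seen."""
    target = ratings[user_idx]
    best = max(
        (i for i in range(len(ratings)) if i != user_idx),
        key=lambda i: cosine(target, ratings[i]),
    )
    return [item for item, (mine, theirs) in enumerate(zip(target, ratings[best]))
            if mine == 0 and theirs == 1]

# Three users, four items: users 0 and 1 share similar tastes.
likes = [
    [1, 1, 0, 0],  # user 0
    [1, 1, 1, 0],  # user 1
    [0, 0, 0, 1],  # user 2
]
print(recommend(likes, 0))  # -> [2]: item 2, liked by the similar user 1
```

The echo-chamber effect follows directly from this logic: because recommendations come from users who already resemble you, the content you are shown keeps narrowing toward your existing preferences.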
However, it would be much more meaningful if measures were suggested while the problematic behavior is occurring. For example, a person found posting heavily and rashly could be presented with infographics on how loss of control and provocation happen online and how to use limit-setting. This could be done in a non-obtrusive style, and personalizing such an intervention is now more feasible using the same data collected for marketing and other purposes. For instance, users could be shown behavioral information on where and how their own loss of control and distraction happen, and how this correlates with the sentiment of their posts and with location and time of day.
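A just-in-time intervention of this kind could, in principle, be a simple rule over posting frequency and sentiment. The sketch below is purely illustrative: the thresholds, field names and sentiment scores are assumptions for demonstration, not any existing platform's API or policy.

```python
# Illustrative rule-based trigger for a digital wellbeing nudge.
# All thresholds and data fields here are hypothetical assumptions.
from datetime import datetime

def should_nudge(posts, window_minutes=30, max_posts=5, sentiment_floor=-0.3):
    """Suggest an intervention when many posts with negative average
    sentiment appear within a short time window."""
    if len(posts) < max_posts:
        return False
    recent = sorted(posts, key=lambda p: p["time"])[-max_posts:]
    span = (recent[-1]["time"] - recent[0]["time"]).total_seconds() / 60
    avg_sentiment = sum(p["sentiment"] for p in recent) / len(recent)
    return span <= window_minutes and avg_sentiment <= sentiment_floor

posts = [
    {"time": datetime(2024, 1, 1, 22, m), "sentiment": -0.6}
    for m in range(0, 25, 5)  # five negative posts in 20 minutes, late at night
]
print(should_nudge(posts))  # -> True under these illustrative thresholds
```

A real system would of course need consent, privacy safeguards and far richer behavioral models, but even this toy rule shows that the data already collected for advertising could instead power a limit-setting nudge.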
Regrettably, tech companies do not allow third-party applications or services, even those a user explicitly authorizes, easy access to user data, claiming that sharing it would put them at a competitive disadvantage. The self-control tools that tech companies themselves offer are very basic, typically limited to calculating one's screen time, which is a simplistic measure of problematic internet usage. We need fair access to such data and a more open model for providing digital wellbeing services.
*[This article is submitted on behalf of the author by the Hamad Bin Khalifa University’s Communications Directorate. The views expressed are the author’s own and do not necessarily reflect the university’s official stance.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.