TikTok and Our History of Addiction
TikTok and other social media apps have created user engagement that most products could only dream of. But at what cost?
TikTok plans to limit accounts for teenagers to 60 minutes daily. When the limit is reached, a parent or guardian must enter a passcode to extend their screen time. TikTok finds itself at the intersection of engagement and addiction.
ByteDance has created a product with engagement numbers most apps would kill for. The average user spends 95 minutes in the app daily, while 17% of its users spend at least 5 hours every day!
TikTok users are also highly engaged with ads on the platform. The average ad engagement rate on TikTok is 4.24%, more than 7x that of the nearest competitor, Instagram. This, in turn, has helped ByteDance grow into a 30,000-person behemoth.
The flip side of this engagement is addiction. Approximately 6.4% of TikTok users are considered “at-risk” of addiction; those most at risk tend to be users affected by loneliness and extraversion. While 6.4% seems small, remember that the platform has 1 billion monthly active users. That equates to 64 million people at risk of TikTok addiction, or roughly the entire population of Italy!
The Science of Addiction and Behavior Design
The tobacco industry added bronchodilators, medicines that make breathing easier, to cigarettes so that tobacco smoke could more easily flow into the lungs. They genetically engineered their crops to double the amount of nicotine. They also added ammonia to ensure the nicotine travels faster to the brain.
Junk food is drenched in sugar, salt, and fat to release dopamine and keep people craving more. While it takes cigarette smoke 10 seconds to stir the brain, sugar on the tongue will trigger a reward response in just over 0.5 seconds!
Casinos are littered with bright and flashing lights to trigger and remind gamblers of excitement, even as they lose money. They don’t have clocks or windows, so gamblers lose track of time. They run their games with chips to detach people from the reality that they are playing with real money.
So, given corporations’ track record of investing in the addictiveness of their products, it comes as no surprise to see the same thing occurring in the digital age. Apps, particularly social media apps, are designed to exploit emotions. They help you feel connected when you are lonely, for example. They use emails and push notifications to trigger you to open the app, and they implement features like auto-play and infinite scrolling to keep you engaged.
Much of this design is born from a field called “Behavior Design,” developed by psychologist BJ Fogg. Fogg’s Behavior Model is based on the idea that three things must be true for a person to do anything: they need to be motivated, they must have the ability to act, and they must encounter a trigger that prompts them to act.
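The model above can be caricatured in a few lines of code. This is an illustrative sketch only: the numeric threshold (the “action line”) and the multiplicative combination of motivation and ability are assumptions made for the example, not part of Fogg’s formal model.

```python
# Illustrative sketch of Fogg's Behavior Model: a behavior occurs when
# motivation and ability together cross an "action line" at the moment
# a trigger (prompt) fires. Threshold and scoring are invented here.

def behavior_occurs(motivation: float, ability: float, prompted: bool,
                    action_line: float = 0.5) -> bool:
    """Return True if the modeled behavior would occur.

    motivation, ability: normalized to [0, 1].
    prompted: whether a trigger (push notification, email) fired.
    """
    if not prompted:
        # Without a trigger, no behavior occurs, however motivated the user.
        return False
    return motivation * ability >= action_line

# A motivated user facing an easy action (one tap) acts when prompted:
print(behavior_occurs(0.9, 0.9, prompted=True))   # True
# The same user takes no action without a trigger:
print(behavior_occurs(0.9, 0.9, prompted=False))  # False
```

This is why push notifications matter so much to app designers: motivation and ability alone do nothing until a prompt arrives.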
History Repeats Itself
In the early ’90s, a media campaign was launched urging people to “Just Say ‘No!’ to TV Addiction.” The campaign claimed TV addiction had become North America’s No. 1 mental health problem, and it focused in particular on the consumer culture being created by TV addiction. At the time, the average American was hit with 3,000 ads daily. The “Adbusters” campaign from the ’90s even purchased TV ads to promote “Buy Nothing Day.”
Humans have had a long history of fear when it comes to media. Each generation has feared the effects of new technologies on the population’s mental health. People raised concerns in the 16th century about the overwhelming amount of data produced by printing presses. Radio was accused of distracting children from reading and diminishing school performance. In 2005, CNN reported that email was more damaging to people’s IQ than marijuana!
Fighting the Algorithm
A key ingredient that makes products like TikTok and Instagram so addictive is their algorithms. These apps’ “For You” feeds are infinite-scrolling, auto-play-fuelled rabbit holes, driven by algorithms that have become incredibly effective at surfacing the content that generates the most engagement.
Social media algorithms don’t just keep their users hooked; they do so by serving content that is likely to elicit emotional responses. Often, this means presenting divisive content and misinformation. The result is a “one-two punch”: users stay glued to the app for hours while consuming content that exacerbates mental health issues.
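The core incentive problem can be sketched in a few lines. In this caricature, candidate posts are ranked purely by a predicted-engagement score; the posts and scores below are invented, and real systems use learned models, but the objective — sort by expected engagement with no penalty for divisiveness or misinformation — is the point.

```python
# Caricature of an engagement-maximizing feed ranker. Post titles and
# predicted_engagement scores are invented for illustration.

posts = [
    {"title": "calm nature clip",        "predicted_engagement": 0.11},
    {"title": "divisive hot take",       "predicted_engagement": 0.67},
    {"title": "sensational misinfo",     "predicted_engagement": 0.54},
    {"title": "friend's birthday photo", "predicted_engagement": 0.23},
]

# Rank purely by predicted engagement. Nothing in the objective
# penalizes divisive or misleading content, so it floats to the top.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(post["title"])
```

Because emotionally charged content tends to earn the highest engagement scores, a ranker like this promotes it by construction, without any deliberate editorial choice.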
These algorithms are unique to the social media generation. In past iterations of the media addiction debate, regulators could target the publishers of the media (e.g., TV stations and newspapers). Social media companies, however, are not publishers and are shielded by Section 230 of the US Communications Decency Act.
Some government organizations have been testing the waters around forcing greater transparency from social media companies. Recently, the Cyberspace Administration of China (CAC) forced the country’s largest tech companies, including Tencent, Alibaba, and ByteDance, to publish details of their algorithms.
At the time of writing, Elon Musk has indicated that Twitter will open-source its algorithm very soon. Several other companies, like Tumblr and Flickr, have also stated that they plan to add support for ActivityPub, an open, decentralized social networking protocol.
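ActivityPub servers exchange JSON activities over HTTP. A minimal “Create a Note” activity, following the shapes in the W3C ActivityPub and Activity Streams specs, looks roughly like this; the actor URL is a placeholder, not a real endpoint.

```python
import json

# A minimal ActivityPub "Create" activity wrapping a "Note" object,
# per the W3C ActivityPub / Activity Streams 2.0 specs. The actor URL
# below is a placeholder for illustration.
create_note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/alice",
    "object": {
        "type": "Note",
        "content": "Hello, federated world!",
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

# A federating server would POST this JSON to a follower's inbox.
print(json.dumps(create_note, indent=2))
```

Because the protocol is open, any compliant server can receive and display this activity, so no single company’s ranking algorithm sits between author and reader.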
If Twitter does open-source its algorithm, this could pave the way for better analysis and more effective tools to combat social media addiction.