
Wasn’t TikTok supposed to be fun?


No social app remains free of arguments over politics and social issues once it becomes popular.

By Shira Ovide


There is a predictable trajectory for social media apps. Many of them start out as helpful or even pure fun. But when they get popular enough, just about every app becomes a place for consequential discussions about politics and social issues, too. And with that comes both meaningful conversations and a litany of nastiness.


This reality has come for TikTok. An app better known for viral dance videos has become a significant source of political and social misinformation, as my colleague Tiffany Hsu detailed in a recent article.


Ahead of Kenya’s recent presidential election, a widely shared TikTok post showed an altered, violent image of one of the candidates with a caption that described him as a murderer. (The post was eventually removed.) Falsehoods about diets and school shootings have easily spread on the app, Tiffany reported, as have variations on the PizzaGate conspiracy theory.


And on the serious, even if not terrible, side, American politicians and their allies are embracing TikTok to spread their campaign messages and promote policies such as the Child Tax Credit.


This may not be exactly what TikTok has in mind. Executives have continued to describe TikTok as an entertainment app. And sure, most people use TikTok, Facebook, Pinterest, Nextdoor, YouTube and Twitch in fun, productive and informative ways.


But it is inevitable that apps must plan for what will go wrong when online conversations eventually encompass the full scope of human interest. That will include political information and social activist movements, as well as nasty insults, incitements to violence and the hawking of bogus products for financial gain.


“It’s the life cycle of a user-generated content platform that once it reaches a critical mass, it runs into content moderation problems,” said Evelyn Douek, an assistant professor at Stanford Law School whose research focuses on online speech.


The tricky part, of course, is how to manage apps that evolve from “We’re just for fun!” to “We take our responsibility seriously.” (TikTok said that almost verbatim in its blog post on Wednesday.)


Pinterest is best known for pretty posts for wedding planning or meal inspiration, but it also has policies to weed out false information about vaccines, and it steers people to reliable sources when they search for terms related to self-harm. Roblox is a silly virtual world, but it also takes precautions, such as exhorting people to “be kind,” in case children and young adults try to use the app for harmful behavior such as bullying someone.


TikTok knows that people use the app to discuss politics and social movements, and with that come potential risks. On Wednesday, TikTok laid out its plans to protect the 2022 U.S. elections from harmful propaganda and unsubstantiated rumors.


Maybe more so than other apps, TikTok doesn’t start with a presumption that each post is equally valid or that what becomes popular should be purely the will of the masses. TikTok creates trending hashtags, and reporters have found that the app may have tried to steer people away from some material, such as posts about Black Lives Matter protests.


(TikTok is owned by the Chinese technology company ByteDance. And posts on Douyin, ByteDance’s version of TikTok in China, are tightly controlled, as all sites in China are.)


Whether TikTok is more or less effective at managing humans than Facebook or YouTube is open to debate. So is the question of whether Americans should feel comfortable with an app owned by a Chinese company influencing people’s conversations.


To put it frankly, it stinks that all apps must plan for the worst of the human condition. Why shouldn’t Twitch just be a place to enjoy watching people play video games, without fans abusing the app to stalk its stars? Why can’t neighbors coordinate school bus pickups on Nextdoor without the site also harboring racial profiling or vigilantism? Can’t TikTok just be for fun?


Sorry, no. Mixing people with computerized systems that shine attention on the most compelling material will amplify our best and our worst.


I asked Douek how we should think about the existence of rumors and falsehoods online. We know that we don’t believe every ridiculous thing we hear or see, whether it’s in an app or in conversations at our favorite lunch spot. And it can feel exhausting and self-defeating to cry foul at every manipulated video or election lie online. It’s also counterproductive to feel so unsure about what to believe that we don’t trust anything. Some days it all feels awful.


Douek talked me out of that fatalism and focused on the necessity of a harm reduction plan for digital life. That doesn’t mean our only choices are either every single app becoming full of garbage or Chinese-style government control of internet content. There are more than two options.


“As long as there have been rules, people have been breaking them. But that doesn’t mean platforms shouldn’t try to mitigate the harm their services contribute to and try to create a healthier, rather than unhealthier, public sphere,” Douek said.
