OPINION: Misinformation migrates from Facebook to TikTok, impacting the youth vote
In 2016, Facebook and Twitter drew a slew of criticism over the massive amount of fake news that peppered users' feeds. That misinformation had a significant impact on the 2016 presidential election, mostly in former President Donald Trump’s favor: much of the false information users saw targeted his opponent, former Secretary of State Hillary Clinton. Both platforms remained under fire through the 2020 election as well. For instance, Meta, the parent company of Facebook, Instagram and WhatsApp, accepted paid advertisements that claimed the 2020 election was rigged.
Older users were the ones sharing much of this false information. According to a study published by the National Library of Medicine, users over 50 were responsible for 80% of the fake news shared on Twitter. In that same study, researchers found that users over 65 shared seven times more links to false media sites than younger users did. But though older users may be more likely to share false information, younger users are more likely to believe it, according to the Center for Countering Digital Hate.
TikTok, a platform especially popular among younger users, has faced significant backlash for failing to remove posts containing blatantly false information, keeping it at the center of media scrutiny. Platforms built around short-form video, like TikTok, however, find it harder to review content that may contain false information. Video is far more difficult to moderate than text: instead of using keywords to flag likely misinformation, moderators must comb through entire videos to determine whether they contain anything false. TikTok users can also create videos up to 60 minutes long, which makes content on the platform even harder to police. TikTok’s moderators have even sued the company, claiming they “suffered immense stress and psychological harm” from moderating videos.
Reece Young and Ashley Velez say they were subjected to 12-hour shifts assessing videos for TikTok. They claim they had only 25 seconds to assess each video and sometimes had to watch multiple videos at once to meet quotas set by the company. TikTok’s format also differs sharply from Facebook’s and Twitter’s, which follow a social graph model: the content users see is based on who they follow or are friends with on the platform. TikTok, by contrast, shows users videos from people outside their social circle, selected based on their interactions with other users and videos. If a user interacts with one video containing false information, that opens the door to seeing additional videos with misleading content.
According to the New York Times, more than 60 percent of videos with harmful information were viewed by TikTok users before being taken down in the first quarter of 2022. Another study, conducted by NewsGuard, found that roughly one-fifth of TikToks contained misinformation. And according to Pew Research Center data, a growing number of adults say they get their news from TikTok: in 2020, the figure was only three percent, but this year it has more than quintupled to 17 percent. Many videos containing false information concern the election, and many are targeted at Vice President Kamala Harris’ campaign.
For example, one video, from @douglasmducotesr47, shows a photo of Harris posing with a young girl. The video zooms in on a phone screen to reveal the girl wrapping her arms around nothing. The caption reads, “Damn…the Harris campaign is getting really sloppy.” The original photo, with Harris still in it, was found on X, formerly Twitter. This video, containing blatant misinformation, has been viewed by more than 500,000 people; it has been up since Sept. 7 and has yet to be taken down. Another post, from @dirtcreekroad, shows a clearly AI-generated photo of Harris sitting with Hitler, captioned “#MAGA”. That post has been up since the beginning of September and, again, has not been taken down.
About 56 percent of U.S. adults say they use TikTok, according to Pew Research. That matters in an election year: voters ages 18-34 will make up one-fifth of the electorate, according to Tufts. This bloc of voters could determine the outcome of the election, and if it is being fed misinformation, the results could be swayed in unexpected ways.
TikTok, with its quick, viral content and massive reach, has become a breeding ground for the rapid spread of misinformation, often unchecked and unverified. As the next election approaches, we must recognize the profound impact this could have on our democracy. False information about candidates, policies and the voting process itself can skew perceptions and influence decisions in ways that undermine that democracy. It is important that we educate voters, especially young ones, about how to discern fact from fiction on social media. The rising spread of misinformation on TikTok should concern you: it poses a grave threat to the integrity of our democratic elections. We have already witnessed its effects on older generations of Americans in 2016 and 2020, and now we are seeing it with younger ones. The stakes are too high to ignore.