Either the TikTok algorithm is truly making me concerned for the future of media consumption, or today is just another “no bones” day.
Anyone who understood that reference probably spends a good chunk of their time exploring the infinite scroll of their TikTok For You Page. Or, should I say, trying to stop exploring?
TikTok is intentionally addictive, as are most social media platforms. What sets this one apart from the rest, though, is its fine-tuned algorithm. By paying attention to users’ hashtags, sounds, liked posts, followed accounts and even the amount of time spent watching each video, TikTok curates personalized content to a chilling degree.
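For readers curious what that curation might look like under the hood, here is a deliberately simplified sketch in Python. The signal names and weights are invented for illustration only; TikTok has never published its actual ranking code.

```python
# Toy sketch, not TikTok's actual system: combining engagement
# signals into a single relevance score per video.

from dataclasses import dataclass

@dataclass
class EngagementSignals:
    watch_fraction: float   # share of the video watched, 0.0 to 1.0
    liked: bool             # user tapped the heart
    shared_hashtags: int    # hashtags in common with the user's history
    follows_creator: bool   # user follows the account

def relevance_score(s: EngagementSignals) -> float:
    """Weighted sum of signals; the weights are invented for illustration."""
    score = 2.0 * s.watch_fraction        # watch time: the heaviest signal
    score += 1.5 if s.liked else 0.0
    score += 0.3 * s.shared_hashtags
    score += 1.0 if s.follows_creator else 0.0
    return score

# Videos resembling past high scorers get surfaced first.
print(relevance_score(EngagementSignals(0.95, True, 3, False)))  # 4.3
```

Even a crude weighting like this, applied across millions of videos, is enough to make a feed feel uncannily personal.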
According to TechCrunch, TikTok began collecting faceprints, voiceprints and other forms of biometric metadata in the summer of 2021. TikTok does not clearly explain what these identifiers are or why they are necessary. While the company does state that its information collection is bound by U.S. law, it does not specify whether that means federal or state law. This is concerning because only a handful of states have passed biometric privacy laws.
The user experience design also keeps people engaged. The icons are small, advertisements are limited and the content itself takes up the phone’s entire screen. All this adds to the time-cancelling vortex of the app.
From talking to Gen-Z kids, and being one myself, I know cybersecurity is not the most pressing issue on our minds. I often hear people ask why they should care about their information being shared when they have nothing to hide.
Personally, I am most disturbed by the way TikTok’s content can become so personalized that what the user sees defines what the user likes. This is completely backwards.
People somewhat harmlessly talk about which side of the app they are on. These sides have evolved far beyond the original “straight TikTok” vs. “alt TikTok” debate. They can be as specific as “axolotl-Tok,” or a corner devoted entirely to Andrew Garfield’s angry monologue from “The Social Network.”
Because our spaces on the app are thoroughly tailored to what we want to see, we stop questioning things. We take what is given to us. This constant stream of individually tailored content is pushing young users to extreme corners of both the app and political ideology.
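A toy simulation makes that feedback loop concrete. This is not TikTok’s system, just a minimal sketch under two assumptions: the feed shows topics in proportion to past engagement, and each view reinforces that engagement.

```python
# Toy feedback-loop simulation, invented for illustration: small
# early leanings snowball because every view tilts future sampling.

import random

random.seed(1)
engagement = {"politics": 1.0, "pets": 1.0, "sports": 1.0}  # start neutral

for _ in range(1000):
    topics = list(engagement)
    weights = [engagement[t] for t in topics]
    shown = random.choices(topics, weights=weights)[0]  # feed favors past hits
    engagement[shown] += 0.1                            # watching reinforces it

print(engagement)  # one topic typically ends up dominating the feed
```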
One article from Media Matters details a small study showing how quickly TikTok users can be shown hateful and violent content. The study used transphobia as the “gateway prejudice”: by initially interacting with transphobic content, the researcher behind the account settled into a new echo chamber. From there, TikTok began suggesting content that was homophobic, misogynistic and so on. The study concluded that a user could be presented with white supremacist imagery, fascist rhetoric and incitement of violence in only four hours of scrolling.
As someone who has jokingly tweeted the words, “why would I date when nobody will ever know me like the TikTok algorithm,” I cannot pass judgment on everyone who uses the app. Still, I believe the luxury of razor-sharp, personalized content does not outweigh the genuine danger that accompanies it.