TikTok’s radical pipeline, also known as the FYP
May 22, 2023
Have you ever watched a documentary about a cult and thought to yourself, “How could you believe this? They’re taking advantage of you”? While it’s easy to look back in hindsight and believe that cults are easily detectable and that you would never fall for their tricks, the truth is that no one is immune to cults; even the smartest people can be taken in. The same goes for radicalization: no matter how much you study, whether you lean right, left, or somewhere else politically, you can still be radicalized.
Radicalization is the process by which someone’s thoughts and opinions become more extreme, or “radical,” in response to some source of stimuli. For most teens nowadays, that source is TikTok’s “For You Page,” better known by its acronym, “FYP.”
Radicalization survives and thrives on misinformation and disinformation. Only 39% of Americans say they can recognize and detect fake news, and 64% report that fake news has left them confused about basic facts. Cognitive dissonance also plays a role, making it harder to climb out of a pipeline once inside. The most we can do is pay attention to our friend groups and to which parts of the internet we spend time in.
“Radicalization comes, and grows, from people having cognitive dissonance where even when presented with facts that go contrary to their own, they refuse to believe it. The internet, forum boards (4chan, Reddit, etc.), and social media have only exacerbated radicalization,” said Mr. Cotton, an IT professional and high school teacher.
TikTok’s FYP is a system that uses signals like the content you like, the content you’ve commented on, the content you’ve shared, how much time you spend on each video, and more that we don’t fully know. It uses this information to show you videos TikTok predicts you will like, creating an endless cycle of addictive short-form videos for people to consume.
This is a problem because the FYP can slowly introduce more and more radical ideas to a user, who is then influenced by every video the FYP shows them. This is particularly frightening because 1 in 4 TikTok users are under 20 years old, so this predatory algorithm actively shapes young minds, which research shows are more susceptible to outside influence.
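As a rough illustration of how engagement signals can narrow a feed over time, here is a minimal, purely hypothetical sketch in Python. The signal names, weights, and the “extremity” label are assumptions made for illustration only; TikTok’s actual ranking system is not public.

```python
# Hypothetical sketch of an engagement-driven recommendation loop.
# Nothing here reflects TikTok's real system; all names and weights
# are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Video:
    topic: str
    extremity: float  # 0.0 = mainstream, 1.0 = extreme (hypothetical label)

@dataclass
class User:
    interest: dict = field(default_factory=dict)  # topic -> affinity score

def watch(user: User, video: Video, watch_time: float) -> None:
    """Engagement (time spent watching) nudges the user's affinity
    toward the video's topic."""
    prev = user.interest.get(video.topic, 0.0)
    user.interest[video.topic] = prev + 0.1 * watch_time

def rank(user: User, candidates: list) -> Video:
    """Recommend the candidate whose topic the user engages with most."""
    return max(candidates, key=lambda v: user.interest.get(v.topic, 0.0))

# One long watch is enough to tilt future recommendations.
user = User()
pool = [Video("cooking", 0.1), Video("fringe_politics", 0.9)]
watch(user, pool[1], watch_time=5.0)
pick = rank(user, pool)  # the loop now favors the topic the user lingered on
```

The point of the sketch is the feedback: every watch updates the affinity scores, and every recommendation is drawn from those same scores, so a single lingering view can compound into a progressively narrower feed.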
The radical ideas on social media platforms aren’t just one thing. Radical ideas come in many forms; the most popular and widespread are the far left and the far right. These two ways of thought are wildly different: where the far left emphasizes ideas like progress and freedom, the far right emphasizes tradition and authority.
Far-right ideas seem to be more prevalent on the modern-day internet, and most extremist content on platforms like TikTok leans far right. But the prevalence of far-right content does not mean far-left ideas are absent from the internet.
“Their peers, friends, and even their families play a role in how people think and act. Parents especially play a key role in their children’s patterns of behavior and thought processes. Parents who do not discipline their kids, tend to let them do whatever they want without repercussions, and who do not intervene when their child is doing something detrimental to their development are just as guilty of radicalizing their children as the children themselves,” said Mr. Cotton.
Speaking of radical thought, Andrew Tate is a familiar name nowadays after his social media content gained notoriety for its bold and extreme views on topics like women’s rights. One reason the algorithm seemed to favor Andrew Tate was that many different people would take clips of him and turn them into short-form videos to post on social media.
The algorithms of many social media platforms pushed his extreme ideas into people’s line of sight, which made more people want to make videos about him or repost his clips, which in turn pushed Andrew Tate onto even more people, creating a digital ouroboros of social media. Even though he and his brother were arrested in December 2022, his arrest produced a Streisand effect: his influence only grew, and his followers took it as proof that he was right about “The Matrix,” a term that frequently pops up in his videos and those of his followers.
“I am okay with it on its surface, but there need to be changes made.” Mr. Cotton said. “Regulating the algorithm of sites like Youtube, TikTok, etc. would mean regulating free speech which would violate the 1st Amendment, people need to be open to opposing opinions, and that comes from educators, role models, and leaders doing that as well.”
Most people didn’t pay much attention to Andrew Tate and his radical views, but those who did listen soon found their feeds filled with ever more extreme ideas from Tate and his followers.
TikTok is not the only social media platform that can radically influence its users. Apps and websites like Facebook, Instagram, and Twitter also contain pockets where radical groups and content spread and influence people. This pattern of different platforms pushing radical ideas onto users is very dangerous and has the power to create disastrous events.
The January 6th insurrection was one of these disastrous events. It was spurred on by former U.S. President Donald Trump and his tweets on the social media platform Twitter: Trump tweeted a kind of “call to action” to his followers, which helped trigger an insurrection that ended with many arrested and several people dead.
This incident was fostered by Twitter and the fact that this “call to action” was allowed to stay on the platform. Trump’s tweets likely wouldn’t have caused anything if Twitter had had a better grasp on the radical ideas circulating on its platform and a better way of containing them before they could spark a dangerous event like the Jan. 6th insurrection.
While it’s easy to say we should simply restrict and control these sites’ algorithms, doing so would also violate the 1st Amendment and would give the government greater control over the internet. Another possible violation of the 1st Amendment would be the TikTok ban, if it passes. While there’s evidence that China could be spying on U.S. citizens, there’s no outright public proof of this at the moment.
It’s also important to address disinformation and how it’s used to fuel hate groups and radicalism. While it’s easy to mix up misinformation with disinformation, misinformation is false or inaccurate information spread without malicious intent, whereas disinformation is deliberately malicious and includes hoaxes, propaganda, and the like, sowing distrust among the population and further fueling radicalism and hate groups.
“I always use the phrase ‘Trust but verify,’ meaning if you trust what you’ve read, now go verify it to be real. Doing your own research is the best way to not only root out misinformation but to also educate yourself on a variety of topics,” Mr. Cotton said.
Even though there’s no way to be immune to radicalization, there are steps you can take to become more aware of its presence and stay away from radical circles, such as keeping healthy friend groups, peers, and family relationships, as well as monitoring your time online. A child’s environment also matters: their home life, how their parents monitor and filter the content they access, and how their parents themselves act can all push a child toward or away from radicalization. In an NIH study on parental presence, adolescents without a parent present showed higher behavioral and neural dysregulation than adolescents with a parent present.