TikTok is widely regarded as one of the hottest social media platforms of 2021, but all of that attention also brings a fair share of criticism, the latest being a new report that slams TikTok for promoting far-right accounts to its users. All social networks have faced ample backlash in recent months and years for how they handle extremist content. Facebook, Twitter, TikTok, and other sites receive an enormous volume of posts and videos on any given day, and while all of them have moderation policies in place, harmful content is bound to slip through the cracks.

If and when that happens, it's up to those social networks to make sure it reaches as few eyes as possible. For example, if someone goes on YouTube and watches a video about a conspiracy theory, there's a chance they'll be recommended other videos about that conspiracy to watch next. These recommendations are meant to be harmless, but when extremist content is thrown into the mix, they can play a big role in further radicalizing the person on the other end. TikTok's For You page is notoriously good at determining a user's interests and feeding them videos based on what they like, and as this latest report shows, it has the potential to lead people down the rabbit hole when it's fed dangerous videos.

Related: TikTok: Why YouTube Shorts Is The Platform To Dethrone Short Video Royalty

In a study conducted by MediaMatters, the publication followed accounts from the For You page and then looked at which other accounts TikTok's Suggested Accounts feature would recommend. In one example, MediaMatters followed a QAnon account from the For You page and, shortly after, was recommended another QAnon account and then a Three Percenter account (a far-right, anti-government militia group). Five other scenarios played out very similarly, with MediaMatters following a single extremist account and then being fed heaps of related ones to keep the momentum going.

How TikTok's Recommendations Work


Similar to a lot of other apps, an algorithm is at the heart of TikTok's recommendations. It analyzes the types of videos a user watches and interacts with, and based on that data, it makes personalized recommendations for that user. That's fine if someone is watching videos about cute puppies, comedy bits, or anything else innocuous, but the same system can clearly go from harmless to dangerous depending on the content in question.
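TikTok hasn't published the details of its recommendation system, but the general idea described above can be illustrated with a minimal, purely hypothetical sketch: tally the topics a user has engaged with, then rank candidate videos by how much they overlap with that interest profile. The tags, video IDs, and scoring here are invented for illustration only, not TikTok's actual method.

```python
from collections import Counter

# Hypothetical watch history: each entry is the set of topic tags on a video
# the user watched or interacted with (liked, shared, rewatched).
watch_history = [
    {"puppies", "pets"},
    {"puppies", "comedy"},
    {"comedy", "skits"},
]

# Hypothetical pool of candidate videos the platform could serve next.
candidates = {
    "video_a": {"puppies", "training"},
    "video_b": {"cooking"},
    "video_c": {"comedy", "pets"},
}

def build_interest_profile(history):
    """Count how often each tag appears in the user's interaction history."""
    profile = Counter()
    for tags in history:
        profile.update(tags)
    return profile

def score(candidate_tags, profile):
    """Score a candidate by how strongly its tags overlap the user's profile."""
    return sum(profile[tag] for tag in candidate_tags)

profile = build_interest_profile(watch_history)

# Rank candidates by affinity to the inferred interests. The more a user
# engages with one topic, the more that topic dominates what gets served
# next, which is how an interest can snowball into an echo chamber.
ranked = sorted(candidates.items(), key=lambda kv: score(kv[1], profile), reverse=True)
for video_id, tags in ranked:
    print(video_id, score(tags, profile), sorted(tags))
```

Even in this toy version, the feedback loop is visible: every interaction with a topic increases the odds of seeing more of it, which is harmless for puppy videos but troubling when the topic is extremist content.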

While this isn't a problem that exists exclusively on TikTok (echo chambers can form on any social platform), it has the potential to be more harmful here than on some of its competitors. In addition to how powerful TikTok's For You feature is, TikTok's user base skews extremely young and impressionable. Per a Statista report from June 2020, 32.5 percent of TikTok users are between the ages of 10 and 19. Pairing users that young with readily available extremist videos is a recipe for trouble.

It's one thing if someone is actively seeking out extremist content; things get dangerous when these videos appear on someone's For You page unprompted, they watch them, and they are then nudged to keep digging deeper and deeper. Those are the most worrying cases, and right now, it's difficult to know what the solution is.

Next: Instagram Reels Vs. YouTube Shorts: Best TikTok Alternative

Source: MediaMatters, Statista