Snap CEO Evan Spiegel says the company is working on parental controls for Snapchat that will let guardians oversee their child's social media activity and help ensure they have a safe experience on the platform. The big three platforms catering to young users (Instagram, TikTok, and Snapchat) have marketed aggressively to woo that demographic. But in courting an audience that is just stepping into the world of social media, these companies have often neglected the safeguards needed to keep young users protected online.

Less than a month ago, the United States Drug Enforcement Administration rebuked Snapchat and its rivals over the online sale of fake drugs laced with lethal chemicals, which have caused multiple overdose deaths in the country. Authorities have described the situation as an overdose crisis and have criticized these platforms for allowing online drug marketplaces to thrive. The sale of drugs is not the only problem these platforms face, however, and Snapchat appears to be a step behind its rivals on a key safety feature: parental controls.

Related: How To Stop Random People Adding You On Snapchat

That is about to change. Speaking to The Wall Street Journal, Snapchat co-founder Spiegel discussed an in-development parental control feature that he describes as a kind of 'family center' within the app. The feature will give parents visibility into their children's Snapchat activity, such as who they are talking to and how their privacy settings are configured. Spiegel added that it should at least help start a conversation between parents and their children and let parents guide them through tricky situations, like communication requests from strangers.

Snapchat Has Evaded Heat So Far, But It's Time To Be Proactive

[Image: Friends list in the Snapchat app]

Spiegel did not provide specifics about the parental control tools coming to the platform, nor did he reveal when they will arrive on Snapchat. Earlier this month, the UK's broadcasting and telecommunications regulator, Ofcom, released a set of guidelines for video-sharing platforms like Snapchat to protect users under the age of 18 from harmful content, such as material depicting sexual abuse or inciting violence. It is surprising that Snap has gone this long without any meaningful parental controls, despite having a predominantly young audience and a massive user base, especially in the US. By comparison, TikTok and Instagram have kept adding parental control features over the past couple of years in the wake of related controversies.

Back in 2019, TikTok was fined $5.7 million by the United States Federal Trade Commission for collecting information about its young users without their parents' consent. A year later, the regulator asked TikTok to disclose its data collection practices and how its content-pushing strategy affects young users. In 2021, TikTok was sued in the UK and EU for collecting data on users who did not even have an active account and selling it to third parties. Instagram has also drawn criticism for years over the platform's ill effects. A few weeks ago, leaked internal documents revealed that Facebook was well aware of Instagram's harmful effect on teen users' mental health, but the company prioritized profits and overlooked those worrying findings.

Next: How To Make Free Money On Snapchat With Spotlight Challenges

Sources: WSJ / Twitter, FTC