Discord is testing a suite of parental controls that will allow for increased oversight of its younger user base.
The company introduced a new “Family Center” feature for users during a live test running in Discord’s iOS app in the US, according to a report by TechCrunch.
With the new feature, parents can configure tools to see the names and avatars of their teen’s recently added friends, the servers the teen has joined or participated in, and the names and avatars of users they’ve messaged or engaged with in group chats.
However, in order to preserve minors’ right to privacy, the popular chat app said parents cannot view the content of their teen’s messages or calls.
Snapchat implemented a similar approach to parental controls last year, giving parents insight only into whom their teen is talking to and friending, not what they have typed or the media they have shared.
Discord is a voice over internet protocol (VoIP) and instant messaging social platform, currently used by 150 million people every month.
How to Activate Discord’s New Parental Control Feature
Users can access Family Center under the app’s User Settings section, below the Privacy & Safety and Profiles sections.
Parents can read an overview of the Family Center features and click the “Get Started” button to begin setting up the controls.
Discord explains on this screen that it “built Family Center to provide you with more content on how your teen uses Discord so you can work together on building positive online behaviors.”
The company added that parents need to scan a QR code provided by the teen to put the account under their supervision.
Discord is working on a new tab for parental controls and tools. #discord #datamining pic.twitter.com/HyYkC2ql2r
— abbie (@number0x01) February 27, 2023
Discord has reportedly confirmed that the parental control feature is under development but didn’t offer a timeline for a full rollout.
“We’re always working to improve our platform and keep users safe, particularly our younger users,” a Discord spokesperson said. “We’ll let you know if and when something comes of this work.”
Social Media Platforms Under Scrutiny Over Teen Mental Health Issues
A number of major social media companies, such as Facebook, Twitter, Instagram, Snap, YouTube, and TikTok, have had to testify before Congress about the potential harm to minors from social media use.
This comes as the impact of social media on teens has been a hot topic of debate among lawmakers and researchers across the US. Hopefully, Discord’s new suite of parental controls can help reduce these effects, if only a little.
In a recent report, the Child Mind Institute, an independent nonprofit focused on children struggling with mental health, claimed that social media has a negative impact on teenagers’ mental health and can “lead to low self-esteem, anxiety, and depression.”
In fact, some states like Arkansas and Utah have already passed laws requiring age verification and parental permission for social media use.
As reported, Utah approved new laws in late March that prohibit anyone under 18 from using social media platforms such as TikTok, Instagram, and Facebook without parental consent.
More recently, Senators Bill Cassidy and Ed Markey put forward “COPPA 2.0” (the Children and Teens’ Online Privacy Protection Act), which is intended to expand on the original 1998 law imposing certain requirements on operators of websites and online services directed at minors.