Facebook has placed the creation of a kids’ version of Instagram, aimed at youngsters under the age of 13, on hold in order to address concerns about the vulnerability of younger users.
“I still believe it’s a good thing to build a version of Instagram that’s designed to be safe for tweens, but we want to take the time to talk to parents, researchers, and safety experts and get a better understanding of how to move forward,” Instagram CEO Adam Mosseri said on Monday on NBC’s “Today” show.
The announcement comes after The Wall Street Journal published an investigative series suggesting that Facebook was aware that some teenage girls’ use of Instagram was associated with mental health issues and anxiety.
However, the expansion of Instagram to a younger audience was met with widespread criticism almost immediately.
In March, Facebook announced the creation of an Instagram Kids app, stating that it was “exploring a parent-controlled experience.” Two months later, a bipartisan coalition of 44 attorneys general wrote to Facebook CEO Mark Zuckerberg, urging him to abandon the project, citing the safety of minors.
They highlighted increased cyberbullying, potential vulnerability to online predators, and Facebook’s “mixed record” in protecting minors on its services as reasons for their concerns. When Facebook debuted the Messenger Kids app in 2017, it was hailed as a method for kids to interact with relatives and friends who had been approved by their parents.
Fairplay, a children’s digital advocacy group, on Monday urged the company to permanently shut down the app, according to Josh Golin, its executive director. A group of Democratic members of Congress felt the same way.
“Facebook is listening to our pleas to halt plans to launch a children’s version of Instagram,” Senator Ed Markey of Massachusetts said on Twitter. “However, a pause isn’t enough. Facebook should cancel this effort totally.”
The Senate had already scheduled a hearing with Antigone Davis, Facebook’s global safety chief, for Thursday to discuss what the firm knows about how Instagram affects the mental health of young users.
Mosseri stated on Monday that the firm believes it is best for children under the age of 13 to have a dedicated platform for age-appropriate content, and that other companies such as TikTok and YouTube have apps for that age range.
In a blog post, he argued that a version of Instagram where parents can oversee and regulate their children’s experience is preferable to relying on the company’s ability to verify whether children are old enough to use the app.
Instagram for kids, according to Mosseri, is intended for children aged 10 to 12, not younger. It will be ad-free, with age-appropriate content and features, and will require parental permission to join. Parents will be able to see how much time their children spend on the app, as well as who messages them, who follows them, and whom they follow.
While development of Instagram Kids is paused, the company will offer opt-in parental monitoring tools for accounts of teens aged 13 and older. According to Mosseri, further details about these tools will be released in the coming months.
This isn’t the first time Facebook has faced criticism over a children’s product. Child development experts urged Facebook to shut down its Messenger Kids app in 2018, arguing that it was not responding to a “need,” as Facebook claimed, but rather creating one.
In that instance, Facebook went ahead and released the app anyway.