Instagram head confirms work on a version for kids under 13


Adam Mosseri, the head of Instagram, confirmed that work is underway on a version of the popular photo-sharing app for children under 13. The Facebook-owned company knows that many kids want to use Instagram, Mosseri said, but there is no detailed plan yet, according to BuzzFeed News.

“But part of the solution is to create an Instagram version for teenagers or children where parents have transparency or control,” Mosseri told BuzzFeed News. “This is one of the things we’ve discovered.” Instagram’s current policy prohibits children under the age of 13 on the platform.

“Kids are increasingly asking their parents if they can join apps that help them keep up with their friends,” Facebook spokesperson Joe Osborne told The Verge in an email. “Right now there aren’t many options for parents, so we are working to develop parent-led, kid-friendly products, as we did with Messenger Kids. We are exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”

BuzzFeed News also posted a message from an internal message board in which Instagram’s vice president of product, Vishal Shah, said that youth work has been identified as a priority for the company. Shah wrote in his message that the Community Product Group will focus on privacy and safety issues “to ensure the safest possible experience for young people.” The project will be overseen by Mosseri and vice president Pavni Diwanji, who ran YouTube Kids while at Google.

Instagram also published a blog post earlier this week outlining its efforts to make the platform safe for its youngest users. However, the post did not mention a new version for children under the age of 13.

Targeting children under the age of 13 with online products raises not only privacy concerns but also legal issues. In September 2019, the US Federal Trade Commission fined Google $170 million for tracking children’s viewing history to serve them ads on YouTube, in violation of the Children’s Online Privacy Protection Act (COPPA). Musical.ly, the previous version of TikTok, was fined $5.7 million for violating COPPA in February 2019.

In 2017, Facebook released an ad-free version of its Messenger chat platform designed for children ages 6-12. Child-health advocates criticized it as harmful to children and urged CEO Mark Zuckerberg to shut the app down. Later, in 2019, a bug in Messenger Kids allowed children to join group chats with strangers, exposing thousands of children to conversations with unauthorized users. Facebook quietly closed those unauthorized chats, which it said affected “few” users.
