Facebook has long struggled with content moderation involving kids on Instagram. Last year, EU regulators even opened an inquiry into the social network's handling of children's data.
To address these issues, the company is now looking to build a special version of Instagram for users under the age of 13.
BuzzFeed News reported last night that in an internal memo, Instagram VP of Product Vishal Shah said the company had made youth safety a priority and planned to build a version of the platform for young children:
I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list. We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.
For now, Instagram requires users to be at least 13 years old, and users can report accounts they suspect belong to children under that age.
The internal post also noted that Pavni Diwanji – a former Google executive who oversaw several children’s products, including YouTube Kids – will lead this new project.
To protect teens, the company rolled out an AI-powered feature earlier this week that prevents adult strangers from sending DMs to them.
This isn’t Facebook’s first app aimed at kids. In 2017, it released a kid-friendly version of Facebook Messenger with parental controls. In 2019, however, the company had to fix a bug that allowed children to chat with adults their parents hadn’t approved.
As more and more Facebook products integrate, the company will need to ensure that children’s apps remain in a separate secure silo.
Did you know that we have a newsletter dedicated to consumer technologies? It’s called Plugged In – and you can subscribe to it here.
Published March 19, 2021 – 08:35 UTC