Bowing – perhaps only for a moment – to pressure from lawmakers, critics, the media and child development experts, Facebook said on Monday that it will “pause” its work on a kids’ version of its photo- and video-oriented Instagram app.
Here’s everything you need to know about the saga:
What is this Instagram for kids plan?
Adam Mosseri, the head of Instagram, said in a Monday statement that this new Instagram platform would be for those aged 10 to 12. It would require parental permission to join, would not carry ads, and would offer age-appropriate content and features.
“Parents can supervise the time their children spend on the app and oversee who can message them, who can follow them and who they can follow. The list goes on,” he said.
“We started this project to address an important problem seen across our industry: kids are getting phones younger and younger, misrepresenting their age, and downloading apps that are meant for those 13 or older.”
What’s the outrage?
In mid-September, The Wall Street Journal published an explosive article that found Facebook knew from its own research that Instagram was harming some teens, especially girls, leading to mental health and body image problems and in some cases eating disorders and suicidal thoughts.
In public, however, Facebook has consistently played down the app’s negative side and until now has barrelled ahead with the kids’ version despite alarms from experts, lawmakers and its own research. It has also relentlessly criticised the Journal article, which was based on internal research leaked by a whistle-blower at the company, as cherry-picking from Facebook’s research, though it did not dispute the facts.
So is Instagram for Kids cancelled?
Facebook has very specifically not said that it will abandon the project. Instead, Mosseri said in the Monday statement that the company would use its pause time “to work with parents, experts and policymakers to demonstrate the value and need for this product”.
Translation: Expect Facebook to sharpen its message on the “benefits” of Instagram for Kids in hopes that the furore will die down.
Who are the experts working with Facebook?
Four years ago, Facebook said it gathered a group of experts in the fields of online safety, child development and children’s media to “share their expertise, research and guidance”. The group, which it calls Youth Advisors, includes some well-known and some lesser-known non-profit groups, among them the Family Online Safety Institute, Digital Wellness Lab, MediaSmarts, Project Rockit and the Cyberbullying Research Center.
All of these groups receive some form of funding from Facebook, according to their websites. Meanwhile, some of the best-known children’s online advocacy groups – and Facebook’s biggest critics on this matter – such as Common Sense Media and Fairplay (formerly known as the Campaign for Commercial-Free Childhood) are notably absent.
What have critics said about the problems with Instagram?
Critics acknowledge that many of the experts working with Facebook mean well, but say their influence has been negligible.
“Facebook has shown time and time again that it is incapable of governing or advising itself with any integrity,” said Kyle Taylor, programme director for the Real Facebook Oversight Board, a group critical of the social network.
“Facebook’s funding of research and civil society is hugely problematic, and prevents the kind of direct, open process that is required for any real change to occur.”
When Facebook seeks feedback for its projects, Taylor added, “the decks are always stacked with experts who have a financial interest or who will never criticise Facebook’s core issues – their algorithm and their profit margin.”
What about other platforms?
Facebook, of course, is not the only tech platform whose products have caused ripples of concern about the well-being of children.
And creating kids’ versions in the face of these concerns is a popular response. After getting in trouble with US regulators for violating children’s privacy rules, for instance, TikTok created a “limited, separate app experience” for users who are under 13.
They can’t share videos, comment on other people’s videos or message people. But as with any other app, if kids enter a fake birth date when they register with the app, they can get around that provision.
YouTube has a kids’ version too. Lawmakers earlier this year called it a “wasteland of vapid consumerist content” and launched an investigation that is still ongoing.