As the US Senate examines Meta’s protection of children, the firm calls for a federal law requiring app stores to obtain parental approval for app downloads by under-16s
Meta Platforms has sought to pass some of the responsibility for protecting children onto parents and the app stores themselves.
The US Senate has begun to investigate Meta’s actions to protect children, after many US states recently sued the firm, alleging its Instagram and Facebook platforms are harming children’s mental health.
This comes despite the fact that children under 13 are already prohibited from creating accounts and downloading apps without a parent’s go-ahead.
But now in a blog post entitled ‘Parenting in a Digital World Is Hard. Congress Can Make It Easier’, Antigone Davis, Meta’s global head of safety, called for federal legislation that would mandate app stores to notify parents whenever a child between the ages of 13 and 16 downloads an app, and to require the parents’ approval.
The blog called for legislation “so all apps teens use can be held to the same standard.”
“Parents should approve their teen’s app downloads, and we support federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps,” Davis wrote.
The blog cited recent Pew research suggesting that 81 percent of US adults support requiring parental consent for teens to create a social media account.
It bemoaned that US states are passing a patchwork of different laws, many of which require teens (of varying ages) to get their parent’s approval to use certain apps, and for everyone to verify their age to access them.
But Meta’s Davis said there is a better way.
There was no specific mention of either Google or Apple – operators of the two biggest app stores in the world – in the blog post.
“With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase,” wrote Davis. “Parents can decide if they want to approve the download. They can also verify the age of their teen when setting up their phone, negating the need for everyone to verify their age multiple times across multiple apps.”
“This way parents can oversee and approve their teen’s online activity in one place,” Davis added. “They can ensure their teens are not accessing adult content or apps, or apps they just don’t want their teens to use. And where apps like ours offer age-appropriate features and settings, parents can help ensure their teens use them.”
Davis’s call came as the Senate judiciary committee sent a letter to Mark Zuckerberg, requesting that he “provide documents related to senior executives’ knowledge of the mental and physical health harms associated with its platforms, including Facebook and Instagram”.
The letter asks for the documents by 30 November.
Last week Arturo Bejar, a former engineering director at Instagram, testified before the Senate about the harm Instagram can do to children, including his own daughter.
Bejar alleged that Meta’s leaders had ignored his concerns when he raised them internally.
Bejar’s allegations mirror those of the famous whistleblower Frances Haugen, who in 2021 leaked internal documents to the US government showing that company executives had ignored warnings about the detrimental effects of social media use on teen girls.
She told the Senate in October 2021 that Facebook puts “astronomical profits before people”. She has also claimed that Facebook knows it is harming people, and that the social network weakens democracy.
CEO and co-founder Mark Zuckerberg, however, hit back at Haugen’s allegations at the time, arguing that many of them did not make sense.
“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” Zuckerberg wrote at the time. “If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space – even ones larger than us?”
“At the heart of these accusations is this idea that we prioritise profit over safety and well-being,” wrote Zuckerberg. “That’s just not true.”