Meta To Hide Eating Disorder, Suicide Posts From Teenagers

instagram, meta, social media

Posts about suicide and eating disorders will be hidden from the Instagram and Facebook feeds of teenagers

Meta Platforms has announced new protections, which it says will give teenagers “more age-appropriate experiences on our apps.”

In a blog post on Tuesday, Meta confirmed it will “begin hiding more types of content for teens on Instagram and Facebook, in line with expert guidance.”

Meta said it is automatically placing all teenagers into the most restrictive content control settings on Instagram and Facebook, and is also restricting additional terms in Search on Instagram.

Teenage content policies

Meta said that it has developed more than 30 tools and resources to support teens and their parents, and it is now announcing “additional protections that are focused on the types of content teens see on Instagram and Facebook.”

The platform said it regularly consults with experts in adolescent development, psychology and mental health “to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens.”

And one of the big concerns around social media and teenagers is the issue of self-harm, as evidenced by the tragic case of Molly Russell.

A London coroner ruled that social media was “likely” to have played a role in the death of 14-year-old Molly Russell in November 2017.

That ruling increased the pressure on social media companies to do more to protect children.

In October last year, 33 US states filed legal action against Meta, alleging its Instagram and Facebook platforms were harming children’s mental health.

The US states alleged Meta contributes to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

That federal suit was reportedly the result of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont.

Self-harm

Meta said the issue of self-harm is a complex topic, and has now confirmed that it will hide posts about suicide and eating disorders from teens’ Instagram and Facebook feeds.

It also said it would remove “other types of age-inappropriate content.”

“Meta is evolving its policies around content that could be more sensitive for teens, which is an important step in making social media platforms spaces where teens can connect and be creative in age-appropriate ways,” Dr. Rachel Rodgers, associate professor in the Department of Applied Psychology at Northeastern University, was quoted as saying.

“These policies reflect current understandings and expert guidance regarding teens’ safety and well-being. As these changes unfold, they provide good opportunities for parents to talk with their teens about how to navigate difficult topics,” said Dr. Rodgers.

Meta said that it is automatically placing teens into the most restrictive content control setting on Instagram and Facebook. It pointed out that it already applies this setting to new teens when they join Instagram and Facebook, and is now expanding it to teens who are already using these apps.

Of course, all of this depends on teenagers not having lied about their age when they signed up for Instagram or Facebook.

Avoiding regulation?

Critics said Meta’s moves don’t go far enough.

“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram,” Josh Golin, executive director of the children’s online advocacy group Fairplay, was quoted as saying by the Associated Press.

“If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?”

The issue of protecting children and teenagers from social media has dogged Meta for years now, as evidenced by the lawsuit from US states.

Prior to that, in October 2021, the head of Instagram confirmed the platform was ‘pausing’ the development of the “Instagram Kids” app, after the Wall Street Journal (WSJ) had reported on leaked internal research which suggested that Instagram had a harmful effect on teenagers, particularly teen girls.

Facebook had previously said it would require Instagram users to share their date of birth, in an effort to improve child safety.

Following the first reports, a consortium of news organisations published their own findings based on leaked documents from whistleblower Frances Haugen, who has testified before Congress and a British parliamentary committee about what she found.