Instagram to alert parents if teens search for suicide or self-harm content

The new feature is set to roll out next week for parents using Instagram’s supervision tools in the U.S., U.K., Australia, and Canada, with additional regions to be added later this year.

Meta, the parent company of the social media platform Instagram, announced Thursday that it will soon begin notifying parents if their teenage children repeatedly attempt to search for suicide or self-harm content over a short period of time.

The Silicon Valley-based company said the move adds another safeguard to Instagram’s Teen Accounts and parental supervision features. Meta said the feature will begin rolling out "in the coming weeks" and will include expert resources to help parents navigate potentially sensitive conversations with their children.


"The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support," Meta said in a statement. "These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen."

How it will work
Beginning next week, parents and teens enrolled in supervision will be notified about the new alerts. Instagram says searches that would trigger an alert include phrases promoting suicide or self-harm, statements suggesting a teen wants to harm themselves, and terms like “suicide” or “self-harm.”

Meta said the alerts will be sent to parents via email, text, or WhatsApp — depending on available contact information — as well as through in-app notifications. Tapping the alert opens a full-screen message explaining that the teen has repeatedly tried to search for terms associated with suicide or self-harm in a short period of time. Parents can also access expert resources provided by Meta to help guide follow-up conversations.


The new feature will roll out next week to parents using Instagram’s supervision tools in the U.S., U.K., Australia, and Canada, with additional regions to be added later this year. Meta said it also plans to extend similar alerts to certain AI experiences.

"These will notify parents if a teen attempts to engage in certain types of conversations related to suicide or self-harm with our AI," the company said. "This is important work and we’ll have more to share in the coming months."