Thu, 26 Feb 2026

Instagram to alert parents if teens search for self-harm and suicide content

Safety campaigners say Meta is "passing the buck" with its new feature for parents using Instagram's teen supervision tools.

* Instagram's parent company Meta will start sending alerts to parents whose teenagers search for suicide or self-harm related terms on the platform.
* The alerts, which will be rolled out globally after a trial in four countries, are designed to notify parents if their child has searched for this type of content repeatedly within a short space of time.
* However, several charities have expressed concerns that these alerts could cause more harm than good and leave parents feeling panicked and unprepared to support their children.
* The Molly Rose Foundation, which was established by the family of a teenager who took her own life after viewing self-harm content on Instagram, has said that the alerts are "clumsy" and will not be effective in preventing suicides.
* Meta has defended its new system, saying it is designed to provide parents with expert resources to help them navigate difficult conversations about mental health with their children.
* Critics argue that Meta's announcement is an admission that more needs to be done to protect children on Instagram, and that the company should focus on making its systems age-appropriate by design and default.

