Instagram to Send Alerts to Parents if Teenagers Repeatedly Search for Terms Related to Suicide and Self-Harm on the App
Instagram is adding a new safety measure for teenagers who use the app. The company said on Thursday that it will notify parents if a teen repeatedly searches for terms related to suicide or self-harm within a short period. The alerts are meant to flag when a child might be in distress, giving families a chance to talk before things get worse. Only parents who use Instagram's supervision tools will receive these new alerts.

The app already blocks harmful content from appearing in search results for young users. Now it is going a step further by sending messages to parents via email, WhatsApp, or text, along with a notification on the parents' own Instagram account, so that adults see the warning quickly. Meta is also working on extending this to its AI tools: if a teen talks to the AI about hurting themselves, parents will be notified.

This change comes as Meta faces significant legal pressure. Many families and schools are suing the company, arguing that its apps are designed to be addictive for children and lack adequate safeguards against harmful content. Two major trials on these issues are underway in the United States. The feature will launch first in countries including the UK and Canada, and expand to other countries later this year. Meta said it wants to avoid sending too many alerts so that the warnings remain useful for families. The company describes the feature as an important part of its plan to improve child safety online, and experts note that early intervention can help prevent suicide.