The social media platform Instagram will introduce a feature that notifies parents when a teen repeatedly searches for terms related to suicide or self-harm.
According to a CNBC report on the 26th (local time), Instagram said the feature was "designed so that parents can recognize their child's situation and help."
If a teen searches within a short period for phrases that encourage suicide or self-harm, or for related terms, parents will be alerted via email, text message, or Instagram notification. However, both the parent and the teen must enroll in the feature.
The warning feature is scheduled to roll out next week, beginning in the United States and followed by the United Kingdom, Australia, and Canada.
CNBC noted that Instagram added the warning feature amid ongoing lawsuits alleging that social media is as harmful to teens as cigarettes.
Earlier, Instagram's parent company Meta Platforms also announced plans for a feature that alerts parents when teens attempt conversations about suicide or self-harm with its artificial intelligence (AI) chatbots.