Meta will introduce a new policy on Instagram that blocks adult search terms and potentially harmful content to strengthen protections for teens. The company said the move aims to allow teens to see only content at the PG-13 rating level.
On the 14th (local time), Meta said in a press release that teen Instagram accounts will operate at the PG-13 level of the film rating system, and that it will hide, or stop recommending, content that could negatively affect teens, such as posts showing marijuana paraphernalia or containing heavy swearing.
In addition to existing sensitive topics such as suicide, self-harm, and eating disorders, search terms associated with adult themes such as "alcohol" and "violence" will be added to the block list. As a result, teen users will not be able to search for or be exposed to related content.
Accounts that repeatedly post content inappropriate for minors will also be restricted, so that teens can neither follow them nor exchange messages with them.
Meta said the update will help parents manage their children's exposure to Instagram content more precisely. The policy will be applied first to teen accounts in the United States, the United Kingdom, Australia, and Canada, expanded through the end of the year, and then rolled out worldwide in phases.
However, civic groups remain skeptical of Meta's move. Josh Golin, executive director of the nonprofit Fairplay, said, "The move appears to be a publicity tactic to avoid regulatory legislation," and noted, "Announcements like this alone cannot keep kids safe; genuine accountability and transparency are needed."
Ailen Arreaza, who heads the parents' group ParentsTogether, also said, "Meta has made similar promises many times, but actual follow-through has been lacking," and emphasized, "This time, too, independent verification and the disclosure of concrete results must follow."