Instagram is launching a global update that will automatically place all teen accounts under a PG-13 content filter. This means users under 18 will see only content deemed appropriate for their age group; posts containing profanity, dangerous behavior, drug-related material, and other adult themes will be blocked. The update is part of Meta's ongoing effort to improve safety for younger users on the platform.
Even if a teen searches for restricted topics, Instagram will not show results, and the system will detect common spelling variations to prevent workarounds. A new parental control called "Limited Content" will let guardians further restrict the posts their children can see, interact with, or receive in messages. Teens won't be able to change these settings without a parent's approval.
To enforce its age policies, Instagram will expand its use of artificial intelligence to detect users who may have falsified their age when creating an account. Those identified will be moved into the content settings appropriate for their actual age group.
The update builds on existing safety features such as private-by-default teen accounts, filters for sensitive content, reduced notifications during nighttime hours, and limits on interactions with unknown adults. Instagram will also block teen accounts from following profiles known for frequently sharing inappropriate or mature content; if a teen already follows such an account, the connection will be removed automatically.
The changes will begin rolling out in the US, UK, Canada, and Australia, with global expansion expected by the end of the year.