Amid what many are calling a teen safety crisis on its platform, Instagram is rolling out its latest proposed fix: a PG-13-style content rating system. The new feature, announced by Meta, will automatically apply stricter filters to all accounts belonging to users under 18.
The system’s core is a “13+” setting that will serve as the new default experience for teenagers. This protective bubble can only be removed if a parent or guardian explicitly grants permission, a move designed to foster more parental involvement and control.
The PG-13 mode will filter more comprehensively. It aims to hide posts with strong language, risky stunts, and content that promotes harmful behaviors, such as posts featuring marijuana paraphernalia. Furthermore, Instagram will now block searches for sensitive terms, even when misspelled, to close a common loophole.
This action comes on the heels of a damning independent report which concluded that Instagram’s current safety tools are largely ineffective. While Meta has disputed this, the company is clearly feeling the pressure from the public, researchers, and regulators like the UK’s Ofcom to implement more meaningful protections.
The feature will launch in the US, UK, and other key markets first, with a global expansion to follow. However, advocacy groups are not celebrating yet. They are demanding that Meta allow independent researchers to test the system to verify that it provides real, effective protection for young users.
Is This the Fix? Instagram Tries PG-13 Rating to Address Teen Safety Crisis