Meta Platforms is rolling out new privacy and parental controls for Instagram users under 18, addressing growing concerns about social media's impact on children.

The company announced that all Instagram accounts for minors will automatically be converted to "Teen Accounts," which will default to private settings.

PHOTO | COURTESY Instagram

Under the new guidelines, teen accounts can only be messaged or tagged by accounts they already follow or are connected with. Sensitive content will default to the most restrictive setting, and users under 16 will need parental permission to change it.

Parents will also have access to tools to monitor their children’s app activity and interactions and set screen time limits.

This move comes amid increased scrutiny of social media platforms like Meta's Instagram, ByteDance's TikTok, and Google's YouTube, which have been linked to rising levels of depression, anxiety, and other mental health concerns among young users.


These platforms face numerous lawsuits from parents and school districts over their addictive nature. In 2023, 33 U.S. states, including California and New York, sued Meta for downplaying the risks of its platforms.

Meta had previously considered creating a version of Instagram specifically for teens but scrapped the idea after pressure from lawmakers and child safety advocates. This latest effort aims to create a safer experience for younger users.


As part of the updates, Instagram will prompt users under 18 to close the app after 60 minutes of daily use, and their accounts will automatically enter sleep mode overnight, silencing notifications.

The changes will roll out in the U.S., UK, Canada, and Australia within 60 days, with the European Union to follow. A global rollout is expected in January.