Instagram's parent company, Meta, announced that it will test features that hide messages containing nudity, in a bid to protect minors.

The move is part of Instagram's effort to ease concerns about harmful content in its apps.


Meta is under increasing scrutiny in the United States and Europe following reports that its apps were addictive and exacerbated mental health concerns among young people.

Meta said the protection feature for Instagram's direct messages will use on-device machine learning to determine whether an image sent through the service contains nudity.


The feature will be turned on by default for users under 18, and Meta will show adults a notification encouraging them to enable it as well.

"Because the images are analysed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won't have access to these images – unless someone chooses to report them to us," the company said.

Unlike Meta's Messenger and WhatsApp apps, direct messages on Instagram are not end-to-end encrypted, although the company has said it plans to roll out encryption for the service.

Meta also said it was developing technology to help identify accounts involved in sextortion scams and was testing new pop-up messages for users who may have interacted with such accounts.


In January, the social media giant said that it would conceal more information from minors on Facebook and Instagram, making it more difficult for them to discover sensitive topics such as suicide, self-harm, and eating disorders.

Attorneys general from 33 states sued the company in October, alleging that it regularly deceived the public about the hazards of its platforms.

The European Commission has also sought information on how Meta protects minors from illegal and harmful content in Europe.