Meta Faces Legal Challenge over Child Safety Concerns

New Mexico sued Meta for failing to protect younger users. Test accounts set up by the attorney general’s office, posing as preteens or teenagers, were bombarded with explicit messages, images, and sexual propositions, and Meta’s algorithms recommended sexual content to them. The suit accuses Meta of allowing its platforms to become a marketplace for child predators and claims that CEO Mark Zuckerberg is personally liable for increasing risks to children.

Investigators provided adult dates of birth to get around Meta’s age restrictions but signaled through the accounts’ activity that they were being used by children. Inappropriate material the investigators flagged was often deemed permissible by Meta’s reporting systems.

Meta claimed to prioritize child safety and invest in safety teams, saying it reports content to the National Center for Missing and Exploited Children and shares information with other companies and law enforcement. The company also said it works to stop malicious adults from contacting children on its platforms.

Earlier reports indicated that Instagram’s algorithms helped accounts that commissioned and bought underage-sex material find each other. The New Mexico lawsuit is part of a group of suits filed by 41 states and the District of Columbia alleging that Meta designed addictive features harmful to young users and misled the public about safety on its platforms.