Teen Influencers’ Followers Exposed to Child-Sexualizing Reels on Instagram

After the controversy over ads running alongside antisemitic content on X, Meta is now under scrutiny for its recommendation algorithm. An experiment by The Wall Street Journal found that Instagram’s Reels video service served “risqué footage of children and overtly sexual adult videos” to test accounts that followed only teen and preteen influencers, including young gymnasts and cheerleaders. Such content is supposed to be banned on Meta’s platforms.

The report revealed that this salacious content was interspersed with ads from major US brands, including Disney, Walmart, Pizza Hut, Bumble, Match Group, and The Wall Street Journal itself. The Canadian Centre for Child Protection reported similar results from its own tests.

Bumble, Match Group, Hims (a seller of erectile-dysfunction drugs), and Disney have either pulled their ads from Meta’s platforms or pressed the company to address the issue. After the X controversy, advertisers are particularly sensitive about the content that appears next to their ads; Disney is especially exposed, having now been caught up in both the X and Instagram incidents.

In response, Meta said it was investigating and would pay for brand-safety auditing services, but it offered no timeline and no details on how it would prevent a recurrence.

Although some argue that such tests do not necessarily reflect the typical user experience, current and former Meta employees interviewed by the WSJ said Instagram’s tendency to aggregate child-sexualizing content was a known internal problem even before Reels launched.

These employees suggested that overhauling the recommendation algorithms that push related content to users would be an effective fix. However, internal documents seen by the WSJ indicate that Meta has made it difficult for its safety team to implement changes of that scale, apparently because traffic performance takes priority at the social media giant.