In the wake of X’s controversial dispute over ads appearing next to antisemitic content, the spotlight has shifted to Meta, which now faces scrutiny over its own content moderation practices. A recent investigation by The Wall Street Journal found that Instagram’s Reels feature was serving “risqué footage of children and overtly sexual adult videos” to test accounts that primarily followed young influencers, such as teen gymnasts and cheerleaders. The findings are especially troubling because content of this kind is prohibited on Meta’s platforms, pointing to a serious gap in the company’s content monitoring.
Compounding the issue, this explicit content was reportedly displayed alongside advertisements for major U.S. brands, including Disney, Walmart, Pizza Hut, Bumble, Match Group, and even The Wall Street Journal itself. The Canadian Centre for Child Protection conducted similar experiments and found comparable results, indicating a systemic problem. These findings underline the pressing need for stricter content regulation on social media platforms, particularly those frequented by younger audiences.
In light of these revelations, brands like Walmart and Pizza Hut declined to comment, while others, such as Bumble, Match Group, Hims (a provider of erectile dysfunction medications), and Disney, either pulled their advertisements from Meta or urged the company to take immediate action. Given the recent controversies surrounding X, advertisers are clearly more cautious about the content that appears alongside their ads, a concern that is especially acute for Disney, whose ads have now run next to objectionable content on both X and Instagram.
In response to the outcry, Meta said it is investigating the matter and that it “would pay for brand-safety auditing services to determine how often a company’s ads appear beside content it considers unacceptable.” However, the company has not provided a timeline for these changes or shared details about future preventive measures, leaving many advertisers and other stakeholders eager for more transparency.
While some may argue that these investigations do not reflect the typical user experience (a defense tech companies often raise), concern about Instagram’s tendency to promote sexualized content involving minors is not new. According to current and former Meta employees interviewed by the WSJ, the issue was known internally even before Reels launched.
The same insiders suggested that a genuine fix would require overhauling the algorithms that curate the content shown to users. Yet internal documents reviewed by the WSJ indicate that Meta has made it difficult for its security team to implement changes of that scale, as preserving traffic performance appears to take precedence for the social media giant.