The Future. WSJ found that ads from brands like Disney, Walmart, Pizza Hut, Bumble, and even WhatsApp itself were served between videos portraying child sexualization and adult film content on Reels… and users could surface this content simply by following youth-oriented accounts. The tests show that the race to catch up with TikTok (and a moderation policy that protects user-engagement metrics above all else) may be making even the most mainstream platforms on the internet a dangerous place for people, companies, and, most importantly, kids.
Swipe at your own risk
The results of WSJ’s investigation into Reels’ algorithm highlight the danger brands face in advertising on short-form video platforms… especially ones racing to become the market leader.
- WSJ tested the efficacy of content moderation on Reels by creating test accounts that innocently followed “only young gymnasts, cheerleaders, and other teen and preteen influencers active on the platform.”
- Reels quickly started serving those accounts a mix of sexual adult content, pedophilic content, and advertising from major brands.
- When those test accounts followed adult users who also followed those youth accounts, they were served even more “disturbing recommendations.”
- The Canadian Centre for Child Protection ran a similar test and got the same results.
Why did this happen? According to experts, adults, especially men, who followed these accounts “had demonstrated interest in sex content related to both children and adults,” per WSJ. So, Meta’s algorithm was just giving them more of what they wanted. Current and former staffers at Meta, including its former head of youth policy, say that, internally, this has been a known problem… but the company pushed forward with the rollout regardless.
Meta says that it’s recently introduced new brand-safety tools and it’ll investigate the claims… but some brands, like Match Group and Bumble, have already pulled their ads off the platform.