Investigation finds YouTube is serving mindless AI slop to toddlers and preschoolers

A new investigation from The New York Times reveals how quickly YouTube floods feeds with bizarre AI-generated videos aimed at the youngest viewers. After a single CoComelon video, more than 40 percent of the recommended Shorts in a 15-minute session contained synthetic visuals.

The algorithm pushes content from channels claiming to teach toddlers about the alphabet and animals. But the clips themselves are often nonsensical, featuring warped faces, extra body parts and garbled text. None run longer than 30 seconds.

Experts say that format leaves no room for repetition or narrative structure, both essential for young children learning from media. Yet the videos pull millions of views.

Creators, many operating anonymously, have turned AI tools into a reliable income stream. The barrier is low, the payoff is high and the flood of uploads keeps coming.

The algorithm prioritizes quantity over quality

Reporters conducted the analysis over several weeks, watching popular channels like CoComelon and Ms. Rachel from a private browser. Then they scrolled through recommended YouTube Shorts in 15-minute intervals to see what surfaced.

In one session following a “Wheels on the Bus” video, over 40 percent of the recommendations showed signs of AI generation. Some clips carried YouTube’s own “altered or synthetic content” label. Others required an AI detector to confirm because the visuals were seamless enough to evade casual detection.

The same videos and channels popped up repeatedly across multiple sessions. That suggests the algorithm actively boosts this content instead of filtering it out. Many accounts produce these clips multiple times a day, optimized for maximum views with minimum effort.

Inside the creator economy fueling the feed

Many of the YouTube accounts producing AI-generated children’s content operate anonymously. They list no contact information and offer few identifiable details about who runs them. The barrier to entry is remarkably low.

Creators teach themselves using readily available tools like Google’s Whisk and Runway, often following online tutorials. Some channels present themselves as educational, featuring animated animals and sing-along songs designed to appeal to parents seeking learning content for toddlers.

The financial incentive drives rapid production. One Halloween video featuring spooky animals amassed more than 370 million views. The formula works: grab attention fast, keep it short and let the algorithm handle distribution.

YouTube reacts, but parents are left to police the feed

After the Times shared examples with YouTube and requested comment, the platform suspended all five cited channels from its Partner Program. Those accounts can no longer earn ad revenue or appear on YouTube Kids. The company also removed three hyperrealistic videos from the kids’ app and took down one clip for violating child safety policies.

But the response was reactive, not proactive. YouTube requires creators to disclose AI-generated realistic content, but that rule doesn’t apply to animated videos for children. So the burden falls to parents, a task even experts find daunting as the tools improve.

Some families now create their own playlists of vetted content or remove the app entirely. The American Academy of Pediatrics advises parents to avoid AI-generated or highly sensationalized content. Spotting it remains the hard part.
