Children Stream on Twitch—Where Potential Predators Find Them

A WIRED investigation found dozens of channels that appear to belong to children under 13, and anonymous chat participants sending them inappropriate messages.

Last weekend, a young girl held up her smartphone and hit the Go Live button on Twitch’s mobile app. Her stream appeared under Twitch’s Just Chatting section, where people livestream themselves talking to their viewers. She explains that she’s about to do her morning routine. Within minutes, 11 viewers file in, including her best friend and several strangers. One viewer asks her age. They stick around after she says she’s 10.

As she gets out her toothbrush, one anonymous viewer asks in an adjoining chat window whether she has WhatsApp and says she is cute. Another tells her that she is beautiful. Her friend types into chat, “Bro I do not like this at all” and logs off. Later in the stream, the girl notes that a friend of hers got 15 followers in an hour and, comparing her own follower count, asks whether she’s ugly or weird. (The channel was active and had 15 followers until yesterday.)

According to Twitch’s Terms of Service, you have to be 13 to stream on the platform. But a WIRED investigation turned up dozens of Twitch accounts seemingly operated by children under that age, including another girl who admitted to being 11. In their videos, which crop up every few minutes under Twitch’s Just Chatting section, apparent children livestream themselves talking while playing games like Fortnite, performing dances popular on TikTok, or sitting at home and communicating with a small number of viewers. WIRED has viewed several messages from viewers to these apparent children containing inappropriate comments, questions, or demands, and identified some accounts that follow multiple apparent children.

“Both our desktop and mobile apps prevent users from creating accounts if they enter a date of birth that indicates they are under 13,” a Twitch spokesperson told WIRED. Twitch pointed to its reporting system as a way to fight inappropriate behavior toward children, adding that reports of streamers who appear too young make up “an extremely small proportion of the reports we receive.” Twitch did not directly answer whether it has dedicated resources to combat these incidents. “We take action on content that is reported to us when it violates our rules, including issuing warnings, removing the content, and suspending accounts for various lengths of time, including and up to indefinitely,” the spokesperson said.

While over half of US children own smartphones by age 11, the Children’s Online Privacy Protection Rule (COPPA) prevents apps from collecting data from children under 13 without parental consent. Most social media platforms place age limits on who can sign up in the first place. Last year, Google and YouTube paid $170 million to settle allegations brought by the Federal Trade Commission for violating COPPA. The FTC alleged that some YouTube content was aimed at children, and because children watched, the company collected their data. “At the end of the day it comes down to, are the platforms doing everything they’re legally supposed to do to limit that from happening?” says Brad Shear, a lawyer who specializes in social media and privacy. In his view, expeditiously responding to reports about improper behavior on the platform qualifies as a “responsible” approach to COPPA.

Twitch’s mobile app has relatively few barriers for children who know to input an older age in the sign-up form. They can create an account and stream within minutes after a quick email verification. Over the past few days, a half-dozen viewings of Twitch’s recently started live channels in the Just Chatting section have all turned up at least one apparent child within the top five or 10 entries. Many of them appear to be on mobile. Live videos on Twitch are archived, then disappear after 14 days.

Similar spot checks of Twitch’s less popular competitors, YouTube Gaming and Facebook Gaming, turned up far fewer instances of apparent children livestreaming. To stream on YouTube via mobile, a user must have more than 1,000 subscribers. Facebook Live doesn’t have a comparable restriction, but its live channel discovery sections for Facebook Gaming and Facebook Live appear more curated or moderated than Twitch’s. (Facebook also works with about 15,000 content moderators in the US alone.) That doesn't mean those platforms are faultless; Facebook Live in particular has struggled publicly with moderating violent or dark livestreams. And issues with child predation and exploitation extend well beyond livestreaming; The New York Times reported earlier this year that instances of media related to online child sexual abuse increased 50 percent in 2019, including 60 million photos and videos flagged by Facebook.

The dozens of active accounts WIRED discovered on Twitch sometimes contain harrowing conversations between apparent children and strangers. In some instances, the strangers “dare” young streamers for their entertainment, including asking young girls to flip their hair or kiss their friend on camera. Other times, strangers ask for young streamers’ contact information on other apps such as Facebook-owned Instagram or WhatsApp. (Twitch also has an integrated private chat feature.) They also pretend to donate money, making a chat message appear like a verified donation, or post inappropriate ASCII art in chat. The streamers themselves are by and large unsupervised.

WIRED shared dozens of apparent children’s accounts with Twitch. Some have since been deactivated. (Following this report’s publication, Twitch removed two search functions in the “Just Chatting” section where WIRED discovered apparent children streaming. “We take the safety of our community extremely seriously and are investigating these claims,” Twitch said in a statement. “In the immediate term, we are taking steps to make it more difficult to target streams with low viewership in categories where abuse has been reported to us. We are working to determine the scope of the problem and the appropriate long term solutions to best protect our community.”)

“The safety of our global community is a priority for Twitch and one in which we are perpetually investing,” a Twitch spokesperson told WIRED. “We are continuously working to ensure all members of the community experience Twitch in the way we intend and are investing in technologies to support this effort. In addition, we regularly assess and update policies and practices to ensure we are appropriately addressing emerging and evolving behaviors.” The spokesperson says that Twitch has a dedicated law enforcement response team, and that it works with parent company Amazon’s law enforcement team. When appropriate, the company flags violations to law enforcement and works with the Technology Coalition and the National Center for Missing and Exploited Children.

Martha Kirby, the child safety online policy manager at the UK’s National Society for the Prevention of Cruelty to Children, says that Covid-19-related lockdowns have exacerbated the risk of online sexual abuse “like never before.”

“Poor design choices on livestreaming sites can be exploited by groomers to abuse children in real time,” she says. “Tech firms have consistently failed to design their sites with child safety in mind, allowing offenders to easily watch livestreams of children and message them directly.”

In one video, archived three days ago, an apparent child describes herself as “boredd” and asks people to talk to her. She sits in her driveway eating ice cream, making small talk with a stranger. “I don’t really have much stuff to do,” she says before asking the stranger where they live. At least a half-dozen other videos livestreamed in the last day feature apparent children referencing boredom.

Security expert and Savvy Cyber Kids founder Ben Halpert also notes that, in quarantine, children often go unsupervised as they spend increasing time online. “Kids feel connection with other people when they’re livestreaming and communicating on things like Twitch,” says Halpert. At the same time, it is notoriously difficult to moderate live content—especially when there’s a lot of it. According to analytics firm Arsenal, hours watched of Twitch’s Just Chatting section increased from 86 million in January to 167 million in June.

“The problem that tech companies are confronting is that there is so much new content put on platforms that humans can’t keep up with,” says Halpert, “and the technology needs to mature so it can become more accurate and help the safety of children.”

While the incentives to scale a livestreaming platform are enormous, it’s clear that in too many cases content moderation hasn’t kept pace.

Updated 7-31-20, 11:10 am ET: This article has been updated to include actions Twitch took after publication.

