When Jennifer Watkins got a message from YouTube saying her channel was being shut down, she wasn’t initially worried. She didn’t use YouTube, after all.
Her 7-year-old twin sons, though, used a Samsung tablet logged into her Google account to watch content for children and to make YouTube videos of themselves doing silly dances. Few of the videos had more than five views. But the video that got Watkins in trouble, which one son made, was different.
“Apparently it was a video of his bottom,” said Watkins, who has never seen it. “He’d been dared by a classmate to do a nudie video.”
Google-owned YouTube has artificial intelligence-powered systems that review the hundreds of hours of video that are uploaded to the service every minute. The scanning process can sometimes go awry and tar innocent individuals as child abusers.