A Disney Channel child star has told Sky News that she “broke down in tears” after learning a criminal had used artificial intelligence (AI) to create sexual abuse images using her face.
Kaylin Hayman, who is 16 years old, returned home from school one day to a phone call from the FBI. An investigator told her that a man living thousands of miles away had sexually violated her without her knowledge.
Kaylin’s face, the investigator said, had been superimposed on images of adults performing sexual acts.
“I broke down in tears when I heard,” Kaylin says. “It feels like such an invasion of my privacy. It doesn’t feel real that someone I don’t know could see me in such a manner.”
Kaylin starred for several seasons in the Disney Channel TV series Just Roll With It, and was victimised alongside other child actors.
“My innocence was just stripped away from me in that moment,” she adds. “In those images, I was a 12-year-old girl and so it was heartbreaking, to say the least. I felt so lonely because I didn’t know this was actually a crime that was going on in the world.”
But Kaylin’s experience is far from unique. There were 4,700 reports of images or videos of the sexual exploitation of children made by generative AI last year, according to figures from the US National Center for Missing & Exploited Children (NCMEC).
AI-generated child sex abuse images are now so realistic that police experts are compelled to spend countless, disturbing hours discerning which of these images are computer-simulated and which contain real, live victims.
That is the job of investigators like Terry Dobrosky, a specialist in cyber crimes in Ventura County, California.
“The material that’s being produced by AI now is so lifelike it’s disturbing,” he says. “Someone may be able to claim in court, ‘oh, I believed that that was actually AI-generated. I didn’t think it was a real child and therefore I’m not guilty.’ It’s eroding our actual laws as they stand now, which is deeply alarming.”
Sky News was granted rare access to the nerve centre for the Ventura County cyber crimes investigations team.
Mr Dobrosky, a District Attorney investigator, shows me some of the message boards he is monitoring on the dark web.
“This individual right here,” he says, pointing at the computer screen, “he goes by the name of ‘love tiny girls’… and his comment is about how AI quality is getting so good. Another person said he loves how AI has helped his addiction. And not in a way of overcoming the addiction – more like fuelling it.”
Creating and consuming sexual images using artificial intelligence is not just happening on the dark web. In schools, there have been instances of children taking pictures of their classmates from social media and using AI to superimpose them onto nude bodies.
At a school in Beverly Hills, California, five 13 and 14-year-olds did just that and were expelled while a police investigation was launched.
But in some states – like California – it’s not yet designated a crime to use AI to create child sex abuse images.
Rikole Kelly, deputy district attorney for Ventura County, is trying to change that, with a proposal to introduce a new law.
“This is technology that is so accessible that a middle schooler [10 to 14 years of age] is capable of utilising it in a way that they can traumatise their peers,” she says. “And that’s really concerning because this is so accessible and in the wrong hands, it can cause irreparable damage.”
“We don’t want to desensitise the public to the sexual abuse of children,” she adds. “And that’s what this technology used in this way is capable of doing.”