
Former child actor Mara Wilson has spoken out about the horrific experience of her likeness being used in pornography before she reached high school.
She has issued a warning amid the controversy around X's AI feature Grok generating sexualized images of real people, suggesting 'the worst may be yet to come'.
Wilson, who starred as the titular character in the iconic 1996 adaptation of Roald Dahl's Matilda when she was just nine years old, has shared the 'nightmare' she went through when she realized predators were using her image to create child sexual abuse material (CSAM).
Now 38, the actor is warning the public about the threat posed by generative AI, which is increasingly being used to create sexualized images of women and children without their consent.
Writing in the Guardian, Wilson used stories from her own difficult childhood to show how pervasive and dangerous this issue is.
She began by making an important point: it was not the Hollywood studios that sexually exploited her image, but members of the public themselves.

She shared in her column a pearl of wisdom that she has learned in the three decades since starring alongside Danny DeVito and Rhea Perlman in the smash-hit family comedy.
Wilson said: “Hollywood throws you into the pool... but it’s the public that holds your head underwater.”
That is because Wilson was already seeing fake sexualized images of herself before she had even reached high school.
"I’d been featured on fetish websites and Photoshopped into pornography. Grown men sent me creepy letters," the actor shared.
She went on to say that, despite only starring in family-friendly films and not being, in her own words, a 'beautiful girl', creeps would still turn her image into vile sexual material. Wilson said this came down to one thing.
She explained: "But I was a public figure, so I was accessible. That’s what child sexual predators look for: access. And nothing made me more accessible than the internet.
"It didn’t matter that those images 'weren’t me', or that the fetish sites were 'technically' legal. It was a painful, violating experience; a living nightmare I hoped no other child would have to go through."

This experience meant the actor 'feared the worst' when advanced generative AI emerged in the last few years, with an increasing number of women becoming victims of the new technology.
Wilson went on to share that she had left X after many years on the platform, at a time when many users were asking the platform's AI feature Grok to alter women's personal pictures to show what they would look like in a bikini. Children were also affected by this sick trend.
Elon Musk was forced to scale back the feature, promising to use geoblocks to prevent Grok users from digitally undressing real people in countries where it is illegal.
But Wilson worried this could only be the tip of the iceberg, with some companies mulling open-source AI models, which would mean 'anyone can access the code behind it'.
She said this would be a 'disaster for children's safety' and that regulatory measures are the only way to stop the kind of exploitation she experienced from becoming even more widespread.
She said: "We need to be the ones demanding companies that allow the creation of CSAM be held accountable.
"We need to be demanding legislation and technological safeguards. We also need to examine our own actions: nobody wants to think that if they share photos of their child, those images could end up in CSAM.
"But it is a risk, one that parents need to protect their young children from, and warn their older children about."
Topics: Celebrity, Film and TV, Artificial Intelligence