Emma Watson fakes
CHILDHOOD photos of actress Emma Watson are being used by pervs on a porn forum dedicated to creating fake celebrity sex videos, The Sun has learned.
The actress, who may be as young as 10 in one photo, has been consistently targeted by users of the "deepfakes" app, which uses AI to let anyone make their own phoney smut clips.
Law professor Andrew Murray told The Sun that creating these videos using underage photos of Emma Watson is an "offence under the Protection of Children Act", and that doing so could lead to jail time.
To create the videos, users first track down an XXX clip featuring a porn star who looks like the actress they're targeting.
They then feed an app hundreds – and sometimes thousands – of photos of the victim's face, and a machine-learning algorithm swaps out the faces frame by frame until it spits out a realistic, but fake, sex tape.
Users have been sharing these fake porn videos, possibly illegally, on the Reddit internet forum, although some hosting services have been trying to squash the dodgy material.
To help other users create these videos, pervs are uploading "facesets", which are huge computer folders filled with a celebrity's face that can be easily fed through the "deepfakes" app.
A folder filled with photos of Harry Potter film star Emma Watson contained several images where the actress is clearly underage.
One image of Emma's face had been pulled from Harry Potter and the Philosopher's Stone, from the scene where her character Hermione meets Harry and Ron for the first time on the Hogwarts Express.
The movie was filmed between September 2000 and July 2001, which means Emma – who was born in April 1990 – would've been just 10 or 11 years old at the time.
We spoke to Andrew Murray, a Professor of Law at the London School of Economics, who specialises in technology and media law.
Murray said that using these underage pics to create porn is "almost certainly" illegal.
He explains that the Protection of Children Act 1978 prohibits the production of an indecent image of a child, including "by superimposing the child's face onto images of an adult".
He says that the law covers "pseudo-photographs", which include images made using computer software that look like genuine photographs.
"If the image appears to portray a child in an indecent way then this is covered by the Act," he tells us.
According to Murray, the creation of these videos carries a maximum jail sentence of 10 years, while possession is punishable by up to five years in jail in the UK.
However, he says that people supplying the underage photos of child stars like Emma Watson probably aren't committing a criminal offence, but are likely committing "civil wrongs, copyright infringement for one".
"Authorities may want to attempt to charge those who supply the original images with some form of incitement offence," he added.
One Reddit user commenting on a link to Emma Watson's "faceset" wrote: "If some of these images were used to train the model's face, would or wouldn't it be considered kiddie porn? I'm gonna say 'yes'".
Another said: "Yeah, it's pretty f***ed up to put pictures of celebs when they are underage."
But despite some minor backlash, forum users have "upvoted" – or liked – the Watson "faceset" more than 70 times.
We've asked Reddit for comment and will update this article with any response.
Since finding fame at the tender age of ten, through her role as Hermione Granger in the Harry Potter movie franchise, Emma Watson has become a household name. She's also, sadly, become one of the most deepfaked celebrities online.
This word – 'deepfake' – is slowly embedding itself into our everyday lexicon, as concerned campaigners speak out about deepfakes (a form of AI-generated synthetic media that can make it look like anyone is doing anything in a video, or still image) to raise awareness and encourage strict laws to be put in place. But, at this early stage, where deepfakes are still trickling into mass public consciousness, they're mostly associated with political takedowns or women being edited into pornographic scenarios – largely without their consent.
But there's more we should be concerned about, and talking about, besides deepfake videos – namely, deepfaked audio.
Can you deepfake someone's voice?
Over 90% of deepfaked content online is sexual in nature and features a female victim (be they a celebrity or a member of the general public), and most of it is non-consensual. Equally alarming is the fact that many of the famous faces popular on deepfake websites – sites that receive millions of hits a month – also entered the spotlight at a young age, and sometimes images of them whilst underage are used to generate pornographic videos.
Today, in 2023, we're generally all aware that Photoshopped images can be (and often are) created and posted without any disclaimer. We know to look out for them on Instagram, via curvy door frames or wobbly floor tiles, and the Kardashians are often called out for their Photoshop fails. We're also slowly coming around to the idea that videos can be distorted too, thanks to the likes of Nicki Minaj's 'slimming filter' glitch and ITVX's new series, Deep Fake Neighbour Wars, which sees the likes of 'Idris Elba', 'Greta Thunberg' and 'Kim Kardashian' bickering as argumentative neighbours. (The celebs' faces are AI-generated and combined with the bodies of impersonators, who also do the voiceovers.)
How on earth ITVX has been able to get away with using the likenesses of these celebrities is curious and creepy in equal measure, but, presumably, the sheer lack of solid legislation surrounding deepfaking someone's image, with or without their consent (something the government has promised will be partially addressed in the new Online Safety Bill), is part of the reason why. When asked, a spokesperson from ITVX told Cosmopolitan UK: "Comedy entertainment shows with impressionists have been on our screens since television began; the difference with our show is that we're using the very latest AI technology to bring an exciting fresh perspective to the genre." The broadcaster also runs a disclaimer at the start of each episode saying that none of the celebs involved have consented, and keeps a 'Deep Fake' watermark on screen throughout.
But when it comes to the rules surrounding deepfaked audio, they're flimsier still – and we're already seeing the damage voice-cloning can do. Sadly, it's something Emma Watson has once again been targeted with, on a terrifying scale. Recently, an uncanny AI-generated voice clip of 'her' reading out Adolf Hitler's Mein Kampf (in which the murderous dictator outlines his political manifesto) was posted on 4chan, a chat forum known for its dark undercurrents and strong incel community, showing just how powerful (and accessible) this tech is becoming. On a bigger scale, in 2020, well-versed criminals even used deepfake audio to trick a bank manager in Hong Kong into authorising a $35 million transfer.
Concerns have also been raised about the use of deepfaked audio (or 'deep voicing') when it comes to blackmailing individuals, or the possibility of falsified clips being put forward as evidence during legal trials. So, what can we do about it all?
"Criminals even used audio to trick a bank manager into authorising a $35 million transfer"
Technology site Motherboard suggests that the clip of Watson was made using the voice-simulating programme from ElevenLabs, an "AI speech platform promising to revolutionize audio storytelling" that is still in beta testing. Other publications have backed this up, with The Verge reporting that, within minutes, it was able to puppeteer Joe Biden's voice into reading a script it had written. In response to these concerns being raised in the press and on social media, ElevenLabs later posted a thread on Twitter saying that it would be clamping down on those wanting to use its voice-copying AI anonymously, or for harmful reasons.
"We’ve always had the ability to trace any generated audio clip back to a specific user. We'll now go a step further and release a tool which lets anyone verify whether a particular sample was generated using our technology and report misuse," a statement posted 31 January read. "Almost all of the malicious content was generated by free, anonymous accounts. Additional identity verification is necessary. For this reason, VoiceLab will only be available on paid tiers. This change will be rolled out ASAP."
It went on to say: "We're tracking harmful content that gets reported to us back to the accounts it originated from and we're banning those accounts for violating our policy [...] We will continue to monitor the situation."
As for possible positive uses for deepfaked audio? Some have suggested it could help transform scripts into podcasts or radio plays, be beneficial for creating audiobooks, and be of use for visually impaired users. The likes of ElevenLabs also have non-celebrity voices available to deliver any text required with human-sounding tone, intonation and speech patterns (which is to say: robots simply do not sound like robots anymore) – but they're far from the only company dabbling in this field, or offering users the chance to create their own deep-voiced scenarios.
This is just another big societal wake-up call: deepfake technology – and AI generally – will become further entrenched in every aspect of our lives, and that makes it more important than ever to remain vigilant to it. The powers that be in the tech sphere must better control the beasts they have built, and the government must hold them accountable for doing so.