An app which claimed to be able to ‘undress women’ has been taken offline by its creators over fears people would misuse it.
The app ‘DeepNude’ cost $50 (£40) and allowed users to create nude photos from pictures of fully dressed women, with creators calling it the ‘superpower you always wanted’ on Twitter.
The program reportedly used AI-based neural networks to remove clothing to produce realistic naked shots by matching skin tone, lighting and shadows and filling in estimated physical features.
The team behind DeepNude said the app was created for ‘entertainment’ a few months ago, and after it was featured on the site Motherboard, the app owner’s website crashed as people rushed to download the software.
Users could access two versions of the program: a free one which put large watermarks over the images, and a paid version which put a small ‘fake’ stamp in the corner, though this would be easy to edit or crop out.
However, despite the precautions taken to identify the photos as fake, Katelyn Bowden, founder of anti-revenge porn campaign group Badass, called the app ‘terrifying’.
The most disturbing thing I read about today. An app which allowed men to digitally undress women via AI enable photoshop. How was this allowed to happen in the first place? Are the creators so naive to think it won't be misused? #DeepNude https://t.co/a6Lw0E7QR8
— Yamini P.Bhalerao (@yamini_pb) June 29, 2019
Speaking to Motherboard, she said:
Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.
Although the bodies shown wouldn’t actually be that of the person in the photo, giving users the opportunity to create naked photos of women, without their knowledge, is incredibly disrespectful and violating.
Take a look at one example of the app here:
The app only worked on photos of women; Motherboard reports that when the app was used on an image of a man, it replaced his clothes with a vulva.
A creator told Motherboard ‘I’m not a voyeur, I’m a technology enthusiast’, but in reality DeepNude served no purpose other than providing users with voyeuristic pleasure at the expense of women, most of whom likely would not have consented to having their photo used.
The app created the impression users could see women naked at their command, something which could be threatening if transferred to the real world.
Just to confirm, these guys aren't sorry. And they aren't that stupid. They think the power to make any woman you want appear naked is a superpower.
They're problem#deepnude pic.twitter.com/FcPwCqfgSl
— Jane Wilson (@janewilson90) June 27, 2019
Earlier this week, the creators took to Twitter to announce their decision to take the app offline.
They wrote:
We thought we were selling a few sales every month in a controlled manner. Honestly, the app is not that great, it only works with particular photos.
We greatly underestimated the request. Despite safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.
The world is not yet ready for DeepNude.
— deepnudeapp (@deepnudeapp) June 27, 2019
Anyone who already owns the app but has not yet updated it will be given a refund, though the software will still work for those who have it downloaded.
While the company admitted the app might be shared online from other sources, it made clear it would not release other versions, and that downloading the app from another source would be against the terms of its website.
If you have a story you want to tell, send it to [email protected].
Emily Brown first began delivering important news stories aged just 13, when she launched her career with a paper round. She graduated with a BA Hons in English Language in the Media from Lancaster University, and went on to become a freelance writer and blogger. Emily contributed to The Sunday Times Travel Magazine and Student Problems before becoming a journalist at UNILAD, where she works on breaking news as well as longer form features.