DeepNude app used to create realistic fake nudes of S'pore women, explained

The fake nudes have been circulated on porn sites.

Tanya Ong | July 12, 2019, 08:18 PM

DeepNude is an app that creates nude images from a photo of a clothed person.

A user only needs to input a photo of a woman, and the app, using artificial intelligence, will generate an image of the same woman, but nude.

DeepNude app creates fake nudes

It went viral in June 2019 after Motherboard by Vice discovered the app.

The app only works on images of women.

In their article, Motherboard described how the results of the doctored nudes would "vary dramatically" after testing the app on different images of women in varying states of dress (and undress).

However, they were of the opinion that some fake nude images were "passably realistic":

"The results vary dramatically, but when fed a well lit, high resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic."

Free and paid-for versions were available.

The basic version of the app churns out photos with a large watermark saying "FAKE".

However, the paid versions of the app feature photos with a small watermark in the upper-left corner, which can be easily cropped out.

Women all over the world affected, including Singapore

Since the app went viral, scores of women have had their photos doctored by the app.

Some of these photos have also been circulated on the internet, and more horrifyingly, uploaded to pornographic sites.

In Singapore, some women are also believed to have fallen victim to the DeepNude app.

According to The New Paper, a woman discovered that a fake nude photo of her had appeared on a sex forum.

The original photo of her was uploaded on social media a year ago.

Apart from this victim, dozens of other women in Singapore have had their photos doctored and uploaded to the same forum.

Pulled the plug on the app

DeepNude is believed to have been created in early 2019 by developers who listed their location as Estonia.

In a tweet explaining the "brief history" of DeepNude, they said they created it purely for "user's entertainment".

However, after the app went viral, the developers acknowledged that the probability of misuse was too high, and eventually pulled the plug on it.

Since then, DeepNude has not released any other versions nor granted anyone its use.

"We don't want to make money this way," they said.

Deep fakes have been around, and they are a problem

The obvious problem with such technology is that people do not give their consent for their photos to be used and circulated in such a manner.

However, the problem of doctored nude photos is not a new one.

And women also seem to suffer disproportionately from the prevalence of such technology.

Previously, Vox reported on the subreddit r/CelebFakes.

It is a community devoted to doctored photos of celebrities that has reportedly existed on Reddit since 2011.

In this subreddit, celebrities are photoshopped to appear nude, and the images have been spread to various porn sites.

However, the conversation surrounding deepfakes -- the technique used to manipulate images of people -- blew up sometime in 2017, when the technology resulted in videos where the faces of porn stars were swapped with those of celebrities.

Celebrities affected by these "eerie approximations" include Emma Watson, Emilia Clarke, Sophie Turner and Ariana Grande.

But apart from perpetuating a horrifying culture of misogyny, such doctored media can also threaten the political realm, as another Vox article explained, because of their potential to "sow misinformation".

A few examples include a fake video of Mark Zuckerberg saying things he never said, and digitally inserting Donald Trump into random videos.

Laws surrounding it

There are, thankfully, laws in Singapore that help to safeguard against apps like DeepNude.

According to TNP, doctoring photos to make people appear naked is a criminal offence.

Under the Films Act, anyone who creates such pictures can be fined up to S$40,000, jailed for up to two years, or both.

Perpetrators can also be charged with insulting the modesty of a woman, which carries a jail term of up to a year, a fine, or both.

Victims may also use the Protection from Harassment Act (POHA) to take out protection orders against online users.

Top photo via Unsplash