The Deepfake Videos Used In Porn Could Get More Dangerous (HBO)


Last December a Reddit user named "deepfakes" posted a pornographic video with Gal Gadot's face superimposed onto a porn star's body. In the video, Gadot's face doesn't blink, but it was otherwise a reasonably convincing fake. This precipitated a brief but highly publicized blitz of amateur graphic artists and trolls posting face-swapped videos online.

Now, just about anyone can be made to appear in almost anything, and there's no clear solution to the serious consequences of it.

Reddit, Twitter, and other websites banned these "involuntary" pornographic videos. But "deepfakes" videos, as they have come to be known, remain online on mainstream sites, mostly of the humorous variety. Videos featuring Nicolas Cage's face, for example, are particularly popular.

While celebrities and public figures are rightly concerned about the implications of this technology, its roots are not in the dark corners of the internet.

Last week in Vancouver, the world of computer-generated graphics gathered at the SIGGRAPH conference to discuss its trade. It is an annual conference where the motion picture industry, computing giants like Google and Adobe, and leading computer science research institutions present their latest research.

Researchers at the Technical University of Munich, along with colleagues from Stanford, presented their work to make deepfakes even more advanced. Their work is intriguing, but when an audience member asked them how they planned to confront the ethical implications of these easily replicable fake videos, nobody had a satisfying answer. It was as if the researchers hadn’t considered how the technology they were developing could be misused.

The U.S. government thinks that's a problem. This year DARPA will spend more than $28 million developing ways to detect and debunk deepfake videos. But even if DARPA helps fund technology that can detect a fake video, that doesn't mean people won't be able to make one, and distribute it quickly, anyway.

Subscribe to VICE News here: http://bit.ly/Subscribe-to-VICE-News

Check out VICE News for more: http://vicenews.com

Follow VICE News here:
Facebook: /vicenews
Twitter: /vicenews
Tumblr: /vicenews
Instagram: /vicenews
More videos from the VICE network: https://www.fb.com/vicevideo