Dear MEL Topic Readers,
AI means anyone can be a victim of deepfake porn. Here’s how to protect yourself
Deepfakes are images, videos, or audio clips that are edited or generated with artificial intelligence tools to depict real or non-existent people. As AI becomes more capable and agentic, anyone could become the victim of a nonconsensual deepfake. AI tools can easily superimpose someone's face onto a nude body or manipulate an ordinary photo into a pornographic image or video. So a deepfake photo or video of you, your child, or your friend might one day appear on the Internet without your consent or knowledge. Though you may not be able to prevent such a malicious act, you can report it to platforms like Google or Meta and request the removal of the images. To do that, you need evidence, such as a screenshot, even though you may hate to look at it. In some countries, government agencies or non-profit organizations also help facilitate the removal of such images. How soon will AI be able to prevent such deepfakes or remove them from the Internet?
Read the article to learn more about AI-created deepfake images and how to protect yourself.
https://edition.cnn.com/2024/11/12/tech/ai-deepfake-porn-advice-terms-of-service-wellness/index.html