A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but the account is particularly notable because it plainly and openly demonstrates one of the most severe harms of generative AI tools: how easy it is to create nonconsensual pornography of ordinary people.

  • T156@lemmy.world · 9 months ago

    At the same time, that does introduce an additional layer of work. Most people aren’t going to bother, given the extra effort involved, in much the same way that people today won’t trace an image back to its original source, but usually just go by the version they saw.

    Especially people who aren’t so cryptographically or technologically inclined that they know what a hash is, where to find one, and how to compare it (without just opening both images and checking by eye).
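
    For anyone curious, a minimal sketch of what that manual check involves, in Python; the file name and reference digest below are hypothetical placeholders:

    ```python
    import hashlib

    def sha256_of_file(path: str) -> str:
        """Compute the SHA-256 digest of a file, reading in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical: a digest published by the original source,
    # and a local copy of the image you want to verify.
    published_hash = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
    local_hash = sha256_of_file("downloaded_image.jpg")

    # A single changed byte produces a completely different digest.
    print("match" if local_hash == published_hash else "mismatch")
    ```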

    • dysprosium@lemmy.dbzer0.com · 9 months ago

      Sure, but that’s no problem if software did it automatically for users of big (news) sites: browsers on desktop and apps on phones.
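
      A rough sketch of what that automation might look like, assuming (hypothetically) the publisher serves each image’s SHA-256 digest as a sidecar file next to the image; both URLs below are made up:

      ```python
      import hashlib
      import urllib.request

      def fetch(url: str) -> bytes:
          """Download a resource; a real client would add timeouts and error handling."""
          with urllib.request.urlopen(url) as resp:
              return resp.read()

      # Hypothetical convention: digest lives alongside the image.
      image_url = "https://news.example.com/photos/story-1234.jpg"
      hash_url = image_url + ".sha256"

      image_bytes = fetch(image_url)
      published = fetch(hash_url).decode().strip().lower()
      computed = hashlib.sha256(image_bytes).hexdigest()

      # A browser or app could run this check before rendering,
      # flagging any image whose digest doesn't match the publisher's.
      print("verified" if computed == published else "tampered or modified")
      ```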