• rottingleaf@lemmy.world
    5 months ago

    Creating and distributing anything should be legal if no real person suffers during its creation and if it’s not intended for defamation, forgery, or the like.

    • AstralPath@lemmy.ca
      5 months ago

      Bruh, how is creating and distributing a non-consensual nude-ified picture of a young girl not a cause of suffering for the victim? Please, explain that to the class.

      Did you just not go to school as a kid? If so, that would explain your absolute ineptitude on this topic. Your opinion is some real “your body, my choice” kind of energy.

      • krashmo@lemmy.world
        5 months ago

        There’s a legitimate discussion to be had about harm reduction here. You’re approaching this topic with an all-or-nothing mindset, but there’s quite a bit of research indicating that’s not really how it works in practice. Specifically, as it relates to child pornography, the argument goes that prohibiting artificial material leads to an increase in the production of actual child pornography, which obviously means more real children are harmed than would be if other forms weren’t controlled in the same fashion. The same sort of logic could be applied to revenge porn, stolen selfies, or whatever else we’re calling the kind of thing this article refers to. It may not be an identical scenario, but I still think it’s fair to say that an AI-generated image is not as damaging as a real one.

        That is not to say that nothing should be done in these situations. I haven’t decided what I think the right move is, given the options in front of us, but I think there’s quite a bit more nuance here than your comment indicates.