A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: the easy creation of nonconsensual pornography of ordinary people.

    • conciselyverbose@sh.itjust.works

      Doesn’t mean distribution should be legal.

      People are going to do what they’re going to do, and the existence of this isn’t an argument to put spyware on everyone’s computer to catch it or whatever crazy extreme you can take it to.

      But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

      • cley_faye@lemmy.world

I’m not familiar with US law, but… isn’t it already some form of crime to distribute nudes of someone without their consent? That shouldn’t change whether or not AI is involved.

        • T156@lemmy.world

          It might depend on whether fabricating them wholesale would legally count as a nude of the victim at all. Arguably the “nude” is a different person, since you’re putting the victim’s face on top of someone else’s body, or the whole image is a fabrication made by a computer.

          It’s also unclear whether it would still count if the body really was someone else’s and the sender was lying about it being the victim - for example, passing off a headless mirror nude sent by someone else as having been sent by the victim.

      • ITGuyLevi@programming.dev

        While I agree in spirit, any law surrounding it would need to be very clearly worded, with certain exceptions carved out. Which I’m sure wouldn’t happen.

        I could easily see people thinking something was of them, when in reality it was of someone else.

      • treadful@lemmy.zip

        Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

    • goldteeth@lemmy.dbzer0.com

      “Djinn”, specifically, being the correct word choice. We’re way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We’re back into fuckin’… shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

    • roscoe@lemmy.dbzer0.com

      As soon as anyone can do this on their own machine with no third parties involved all laws and other measures being discussed will be moot.

      We can punish nonconsensual sharing but that’s about it.

      • CeeBee@lemmy.world

        As soon as anyone can do this on their own machine with no third parties involved

        We’ve been there for a while now

        • roscoe@lemmy.dbzer0.com

          Some people can, I wouldn’t even know where to start. And is the photo/video generator completely on home machines without any processing being done remotely already?

          I’m thinking about a future where simple tools are available where anyone could just drop in a photo or two and get anything up to a VR porn video.

          • CeeBee@lemmy.world

            And is the photo/video generator completely on home machines without any processing being done remotely already?

            Yes

              • HeyListenWatchOut@lemmy.world

                Stable Diffusion has been easy to install and run locally on any decent GPU for two years at this point.

                Combine that with Civitai.com for easy-to-download-and-run models of almost anything you can imagine - IP, celebrity, concepts, etc… - and the possibilities have been endless.
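
                To give a sense of how little glue code that takes: below is a minimal sketch using the diffusers library, assuming a single-file Stable Diffusion checkpoint downloaded from Civitai ("model.safetensors" is a placeholder filename) and a CUDA-capable GPU.

                ```python
                import torch
                from diffusers import StableDiffusionPipeline

                # Load a locally downloaded single-file checkpoint (placeholder name).
                pipe = StableDiffusionPipeline.from_single_file(
                    "model.safetensors", torch_dtype=torch.float16
                ).to("cuda")

                # One call produces an image; no remote processing involved.
                image = pipe("a lighthouse on a cliff at sunset").images[0]
                image.save("out.png")
                ```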

                • T156@lemmy.world

                  Tensor processors/AI accelerators have also been a thing on new hardware for a while. Mobile devices have them, Intel/Apple include them with their processors, and it’s not uncommon to find them on newer graphics cards.

                  That would just make it easier compared to needing quite a powerful computer for that kind of task.

        • yildolw@lemmy.world

          You may be sued for damages if you sell those nude paintings of Rihanna at a large enough scale that Rihanna notices

  • JackGreenEarth@lemm.ee

    That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.
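
    For context on the cost claim: img2img is a single pipeline call in the diffusers library. A minimal sketch, assuming the publicly available runwayml/stable-diffusion-v1-5 weights and placeholder file names, restyling an ordinary photo:

    ```python
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    # Download/load the img2img variant of the Stable Diffusion pipeline.
    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # img2img starts from an existing image instead of pure noise.
    init = Image.open("input.jpg").convert("RGB").resize((512, 512))

    result = pipe(
        prompt="a watercolor painting of this scene",
        image=init,
        strength=0.6,        # 0 = keep the source image, 1 = mostly ignore it
        guidance_scale=7.5,  # how strongly to follow the prompt
    ).images[0]
    result.save("output.png")
    ```

    A few seconds of GPU time per image is why the marginal cost is pennies.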

    • Khrux@ttrpg.network

      I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.

      I wish everyone involved in this industry a very awful day.

    • sugar_in_your_tea@sh.itjust.works

      IDK, $10 seems pretty reasonable to run a script for someone who doesn’t want to. A lot of people have that type of arrangement for a job…

      That said, I would absolutely never do this for someone, I’m not making nudes of a real person.

    • IsThisAnAI@lemmy.world

      Scamming is another thing. Fuck these people selling.

      But fuck, dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that if you have money, you can pay people to do shit like clean your house or do an oil change.

      NOBODY on that side of the equation is being exploited 🤣

    • M500@lemmy.ml

      Wait, this is a tool built into Stable Diffusion?

      As for people doing it themselves, it might be a bit too technical for some to set up. But I’ve never tried Stable Diffusion.

        • BlackPenguins@lemmy.world

          It depends on the models you use, too. There are specially trained models out there, and all you need to do is give one a prompt like “naked” and it’s scary good at making something realistic in 2 minutes. But yeah, there is a learning curve in setting everything up.

        • bassomitron@lemmy.world

          Img2img isn’t always spot-on with what you want it to do, though. I was making extra pictures for my kid’s bedtime books that we made together and it was really hit or miss. I’ve even goofed around with my own pictures to turn myself into various characters and it doesn’t work out like you want it to much of the time. I can imagine it’s the same when going for porn, where you’d need to do numerous iterations and tweaking over and over to get the right look/facsimile. There are tools/SD plugins like Roop which does make transferring over faces with img2img easier and more reliable, but even then it’s still not perfect. I haven’t messed around with it in several months, so maybe it’s better and easier now.

        • M500@lemmy.ml

          Thanks for the link. I’ve been running some LLMs locally, and I’ve been interested in Stable Diffusion. I’m not sure I have the specs for it at the moment, though.

          • TheRealKuni@lemmy.world

            By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like civitai. They host an enormous number of models, and many of them work with the site’s built-in generation.

            Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.

          • TheRealKuni@lemmy.world

            An iPhone from 2018 can run Stable Diffusion. You can probably run it on your computer. It just might not be very fast.

    • OKRainbowKid@feddit.de

      In my experience with SD, getting images that aren’t obviously “wrong” in some way takes multiple iterations with quite some time spent tuning prompts and parameters.

    • echo64@lemmy.world

      The people being exploited are the ones who are the victims of this, not people who paid for it.

      • Dkarma@lemmy.world

        No one’s a victim, no one’s being exploited. Same as taping a head onto a porno mag.

        • Vanth@reddthat.com

          How are the perpetrators victims?

          I could see an argument for someone in need of money making AI generated porn of themselves. Like, don’t judge sex workers, they’re just trying to make money. But taking someone else’s image without their consent is more akin to Tate coercing his “girlfriends” into doing cam work and taking all the money and ensuring they can’t escape. He’s not a victim nor a sex worker, he’s a criminal.

            • Vanth@reddthat.com

              I would love to assume Lemmy users are intelligent enough to realize text-only sarcastic jokes about sex criminals are almost never a good idea, but alas, I’ve been on the internet longer than two weeks.

              • Sentient Loom@sh.itjust.works

                Some people just don’t have a sense of humor.

                And those people are YOU!!

                Thanks for the finger-wagging, you moralistic rapist!

        • sbv@sh.itjust.works

          It seems like there’s a news story every month or two about a kid who kills themselves because videos of them are circulating. Or they’re being blackmailed.

          I have a really hard time thinking of the people who spend ten bucks making deep fakes of other people as victims.

          • Sentient Loom@sh.itjust.works

            I have a really hard time thinking

            Your lack of imagination doesn’t make the plight of non-consensual AI-generated porn artists any less tragic.

  • ulterno@lemmy.kde.social

    I say stop antagonizing the AI.
    The only difference between a skilled artist making it with Photoshop and someone using a neural net is the amount of time and effort put into creating the instance.

    If there are to be any laws against these, they need to apply to any and every fake that’s convincing enough, no matter the technology used.

    • Ultragigagigantic@lemmy.world

      The laws that oppress us on a daily basis suck ass, I’ll give y’all that for fucking sure… but downvoting someone for wishing the law were applied equally to all?

      Maybe I should go back to 4chan.

      • ulterno@lemmy.kde.social

        OooOo!
        That’s some high number of dwnv0t3s!
        I wouldn’t have realised if you hadn’t replied here.

        Nice, but it’s also good that everyone is at least free to downvote and to see the number of downvotes, unlike on YouTube.

  • bigkahuna1986@lemmy.ml

    This business is going to get out of control. It’s going to get out of control and we’ll be lucky to live through it.

  • Cris@lemmy.world

    God, generative AI is such a fucking caustic technology. I honestly can’t see anything this tech enables that is positive and not disgusting.

    • Mubelotix@jlai.lu

      You can call people disgusting over what they do with a tool, but the tool itself is just a tool, it can’t be disgusting

      • Cris@lemmy.world

        The distinction is that I can see worthwhile use cases for non-generative ai, and not for generative ai, and generative ai is built on theft of creative labor

        I’m not angry at people who use generative ai, I’m angry at the people who built it by stealing from creatives to build a commercial tool that can seemingly only be used in awful ways.

  • anticurrent@sh.itjust.works

    We are acting as if, throughout history, we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.

  • guyrocket@kbin.social

    This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.

    And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

    I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

    • dysprosium@lemmy.dbzer0.com

      Exactly this. And instead trust cryptographically signed images, comparing hashes against the one supplied by the owner. Then it’s rather a question of trusting a specific source for a specific kind of content. A news photo of the war in Ukraine by the BBC? Check the hash on their site. Their reputation is finished if a false image is found.
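
      A minimal sketch of the hash-comparison half of that idea, assuming the publisher posts a SHA-256 digest next to the image (the digest below is a hypothetical placeholder; a full provenance scheme would also cryptographically sign it, which this sketch omits):

      ```python
      import hashlib
      import hmac

      def sha256_file(path: str) -> str:
          """Compute the SHA-256 digest of a file, reading in chunks."""
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(8192), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      # Hypothetical digest copied from the publisher's site.
      published = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

      # compare_digest does a constant-time comparison.
      if hmac.compare_digest(sha256_file("news_photo.jpg"), published):
          print("exact copy of the published image")
      else:
          print("hash mismatch: not the published file")
      ```

      Note that any re-encode, resize, or crop changes the hash, so this only verifies exact copies.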

      • T156@lemmy.world

        At the same time, that does introduce an additional layer of work. Most people aren’t going to bother with the extra effort involved, in much the same way that people today won’t track an image back to its original source, but usually just go by the copy that they saw.

        Especially people who aren’t so cryptographically or technologically inclined that they know what a hash is, where to find one, and how to compare it (without just opening both images and checking personally).

    • echo64@lemmy.world

      I hate this: “Just accept it women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.

      We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

      • AquaTofana@lemmy.world

        I don’t know why you’re being downvoted. Sure, it’s unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?

        I’m sorry, putting my face on a naked body that’s not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn that’s actually believable because it’s AI-generated. That is SO much worse and more psychologically damaging if they find out about it.

      • SharkAttak@kbin.social

        It’s not normal, but neither is it new: you could already cut and glue your cousin’s photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone’s muscled body. Today it’s just easier.

      • brbposting@sh.itjust.works

        It’s unacceptable.

        We have legal and justice systems to deal with this.

        For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

        Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

        Telegram got right on it (not). Fuckers.

      • cley_faye@lemmy.world

        How do you propose to deal with someone doing this on their computer, not posting them online, for their “enjoyment”? Mass global surveillance of all existing devices?

        It’s not a matter of willingly accepting it; it’s a matter of looking at what can be done and what cannot. Publishing fake porn, defaming people, and other similar actions are already (I hope… I am not a lawyer) illegal. Asking the technology that exists, is available, will continue to grow, and can be used in a private setting with no witnesses to somehow “stop” because of a law is at best wishful thinking.

        • Ookami38@sh.itjust.works

          There’s nothing to be done, nor should anything be done, about what someone creates individually, for their own individual use, never to see the light of day. Anything else is about one step removed from thought policing - after all, what’s the difference between a personally created, private image and the thoughts in your brain?

          The other side of that is, we have to have protection for people this has been or will be used against. Strict laws on posting or sharing material. Easy and fast removal of abusive material. Actual enforcement. I know we have these things in place already, but they need to be stronger and more robust. The one absolute truth with generative AI versus Photoshop etc. is that it’s significantly faster and easier, so there will likely be an uptick in this kind of material, and thus the need to re-examine current laws.

        • Jrockwar@feddit.uk

          And so is straight male-focused porn. We men seemingly are not attractive, other than for perfume ads. It’s unbelievable gender roles are still so strongly coded in 2024. Women must be pretty, men must buy products where women look pretty in ads. Men don’t look pretty and women don’t buy products - they clean the house and care for the kids.

          I’m aware of how much I’m extrapolating, but a lot of this is the subtext under “they’ll make porn of your sisters and daughters” while leaving your good-looking brother/son out of the thought train, when that’d be just as hurtful for them and for yourself.

          • lud@lemm.ee

            Or your bad looking brother or the bad looking myself.

            Imo people making ai fakes for themselves isn’t the end of the world but the real problem is in distribution and blackmail.

            You can get blackmailed no matter your gender and it will happen to both genders.

        • echo64@lemmy.world

          Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

          • Thorny_Insight@lemm.ee

            Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

            • echo64@lemmy.world

              Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

    • EatATaco@lemm.ee

      I suck at Photoshop, and I’ve tried many times over the years to get good at it. Yet I was able to train a local Stable Diffusion model on my and my family’s faces and create numerous images of us in all kinds of situations in two nights of work. You can get a snap of someone and have nudes of them by tomorrow, for super cheap.

      I agree there is nothing to be done, but it’s painfully obvious to me that it’s the scale and ease of it that makes it much more concerning.

      • T156@lemmy.world

        Also the potential for automation/mass-production. Photoshop work still requires a person to sit down to do the actual photoshop. You can try to script things out, but it’s hardly an easy affair.

        By comparison, generative models are much more hands-free. Once you get the basics set up, you can just have it go, and churn things at rates well surpassing what a single human could reasonably do (if you have the computing power for it).
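
        To make the churn rate concrete: a minimal sketch of unattended batch generation with the diffusers library, assuming the same publicly available v1.5 weights and a placeholder prompt; only the seed changes between images, with no human in the loop.

        ```python
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        ).to("cuda")

        prompt = "a studio photo of a ceramic teapot"  # placeholder prompt

        # Fully hands-off: one image per seed, saved as it is produced.
        for seed in range(100):
            generator = torch.Generator("cuda").manual_seed(seed)
            image = pipe(prompt, generator=generator,
                         num_inference_steps=25).images[0]
            image.save(f"batch_{seed:03d}.png")
        ```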

    • kent_eh@lemmy.ca

      People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

      Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

      • Bob Robertson IX @discuss.tchncs.de

        A kid at my high school in the early 90s would use a photocopier and would literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk and doesn’t require any skills that a 1st grader doesn’t have.

        • ChexMax@lemmy.world

          Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match/copy for the fake, it’s easy to disprove. AI can alter the angle, position, and expression of your face in a believable manner, making it a lot harder to link the photo to the source material.

          • Bob Robertson IX @discuss.tchncs.de

            This was before Google was a thing, much less reverse lookup with Google Images. The point I was making is that this kind of thing happened even before Photoshop. Photoshop made it look even more realistic. AI is the next step. And even the current AI abilities are nothing compared to what they are going to be even 6 months from now. Yes, this is a problem, but it has been a problem for a long time and anyone who has wanted to create fake nudes of someone has had the ability to easily do so for at least a generation now. We might be at the point now where if you want to make sure you don’t have fake nudes created of you, then you don’t have images of yourself published. However now that everyone has high quality cameras in their pockets, this won’t 100% protect you.

      • Vespair@lemm.ee

        no skill from the person doing it.

        This feels entirely non-sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.

    • HubertManne@kbin.social

      This is something I can’t quite get through to my wife. She does not like that I dismiss things to some degree when they don’t make sense. We get into these convos where I’m like, “I have serious doubts about this,” and she’s like, “Are you saying it didn’t happen?” and I’m like, “No, it may have happened, but not in quite the way they say, or it’s being portrayed in a certain manner.” I’m still going to take video and photos for now as being likely true, but I generally want to see it from independent sources - like different folks with their phones, along with CCTV of some kind and such.

    • AstralPath@lemmy.ca

      This kind of attitude toward non-consensual actions is what perpetuates them. Fuck that shit.

  • Kuinox@lemmy.world

    The root problem is governments not enforcing the law on the internet. Deepfakes have existed for years.
    Law enforcement should be more proactive online.

    • Sanctus@lemmy.world

      Pretty sure we will see fake political candidates that actually garner votes here soon.

      • CeeBee@lemmy.world

        FR (facial recognition) is not generative AI, and people need to stop crying about FR being the boogeyman. The harm that FR can potentially cause has been covered and surpassed by other forms of monitoring, primarily smartphone and online tracking.

        • BleatingZombie@lemmy.world

          I wholeheartedly disagree on it being surpassed

          If someone doesn’t have a phone and doesn’t go online then they can still be tracked by facial recognition. Someone who has never agreed to any Terms and Conditions can still be tracked by facial recognition

          I don’t think there’s anything as dubious as facial recognition due to its ability to track almost anyone regardless of involvement with technology

          • neatchee@lemmy.world

            You don’t need to be online or use a digital device to be tracked by your metadata. Your credit card purchases, phone calls, vehicle license plate, and more can all be correlated.

            Additionally, saying “just don’t use a phone” is no different than saying “just wear a mask outside your house”. Both are impractical, if not functionally impossible, in modern society

            I’m not arguing which is “worse”, only speaking to the reality we live in

            • BleatingZombie@lemmy.world

              I am arguing which is worse. There are people in Palestine who don’t have the internet, don’t have a phone, and don’t have a credit card. How are they being tracked without facial recognition?

              I also didn’t say don’t use a phone. I don’t know where you got that

              • neatchee@lemmy.world

                I know what you’re arguing and why you’re arguing it and I’m not arguing against you.

                I’m simply adding what I consider to be important context

                And again, the things I listed specifically are far from the only ways to track people. Shit, we can identify people using only the interference their bodies create in a wifi signal, or their gait. There are a million ways to piece together enough details to fingerprint someone. Facial recognition doesn’t have a monopoly on that bit of horror

                • BleatingZombie@lemmy.world

                  I didn’t say “real threat” either. I’m not sure where you’re getting these things I’m not saying

                  I think facial recognition isn’t as much of a “buzzword” as much as it is just the most prevalent issue that affects the most people. Yes there are other ways to track people, but none that allow you to easily track everybody regardless of their involvement with modern technology other than facial recognition

                  (Just to be clear I’m not downvoting you)

  • Mastengwe@lemm.ee

    As long as there are simps, there will always be this bullshit. And there will always be simps, because it isn’t illegal to be pathetic.

  • GrymEdm@lemmy.world

    To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (also known as Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also the risk of feeling depressed, angry, ashamed, etc. The analogy given is that it’s like watching video, the next day, of yourself undergoing sex without consent, as if you’d been drugged.

    I’ll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately, and I avoid it as much as I possibly can. I believe porn can be done correctly, with participant protection and respect. Regarding deepfakes/revenge porn, though, that statistic about suicidal ideation puts it outside of anything healthy or ethical. Obviously I can’t make that decision for others or purge the internet, but the fact that there’s such regular and extreme harm to the (what I now know are) victims of non-consensual porn makes it immoral to me. Not because of religion or society, but because I want my entertainment to be at minimum consensual, and hopefully fun and exciting - not killing people or ruining their happiness.

    I get that people say this is the new normal, but it’s already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.

    • lud@lemm.ee

      once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves.

      Not saying that they are justified or anything but wouldn’t people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.

      • eatthecake@lemmy.world

        The analogy given is that it’s like watching video, the next day, of yourself undergoing sex without consent, as if you’d been drugged.

        You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you’ll get a whole lot of complex PTSD instead.

        • stephen01king@lemmy.zip

          People used to think their lives were over if they were caught alone with someone of the opposite sex they weren’t married to. That is no longer the case in western countries, due to normalisation.

          The thing that makes them want to die is societal pressure, not the act itself. In this case, if the societal pressure from having fake nudes of yourself spread around were removed, most of the harm done to people would be neutralised.

          • eatthecake@lemmy.world

            The thing that makes them want to die is societal pressure, not the act itself.

            That’s an assumption that you have no evidence for. You are deciding what feelings people should have by your own personal rules and completely ignoring the people who are saying this is a violation. What gives you the right to tell people how they are allowed to feel?

          • too_much_too_soon@lemmy.world

            Agreed.

            "I’ve been in HR since '95, so yeah, I’m old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don’t remember but got posted? If they’re at least a decade old, they’re not as big a deal now. But if it was super illegal, immoral, or harmful, you’re still in trouble.

            As for nudes, they can be both the problem and the solution.

            To sum it up, like in the animate movie ‘The Incredibles’: ‘If everyone’s special, then no one is.’ If no image can be trusted, no excuse can be doubted. ‘It wasn’t me’ becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many.

            Of course, this is oversimplifying things in the real world but society will adjust. People won’t kill themselves over this. It might even be a good thing for those on the cusp of AI and improper real world behaviours - ‘Its not me. Its clearly AI, I would never behave so outrageously’.

      • Drewelite@lemmynsfw.com

        I think this is realistically the only way forward: to delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it’ll still have an effect. Like social media: even though it’s now normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

    • Regrettable_incident@lemmy.world

      I’m wondering if this may already be illegal in some countries. Revenge porn laws now exist in some places, and I’m not sure the legislation specifies how the material has to be produced to qualify. And if the image is based on a minor, that’s often going to be illegal too - in some places, I hear, even pornographic cartoons are illegal if they feature minors. In my mind, people who do this are doing something pretty similar to putting hidden cameras in bathrooms.

    • spez_@lemmy.world

      The technology will become available everywhere and run on every device over time. Nothing will stop this