• jaschen@lemm.ee · ↑43 ↓31 · 7 days ago

    Nobody except pedos will argue that child abuse is okay. AI cartoon porn, in my opinion, is fine. It’s a victimless crime; literally nobody gets hurt. There are no studies confirming that someone who watches pedo cartoons will end up committing real-life child abuse, and in fact some studies show the opposite effect.

    I welcome AI porn, cartoon or realistic-looking. Zero real women get taken advantage of, and we get to pick whatever kink we want knowing that nobody was hurt in the making of it.

    • HubertManne@moist.catsweat.com · ↑18 ↓2 · 7 days ago

      I agree. It’s always a tough stance. It’s like how, ultimately, I want Nazis to be able to speak freely as long as they don’t actually do the stuff they spout. As far as I’m concerned, when you try to ban things that don’t exist in reality, you’re in the realm of trying to ban thought.

      • jaschen@lemm.ee · ↑28 ↓1 · 7 days ago

        I mean, Japan has had CSAM cartoons for decades, and they have a lower CSA rate compared to the USA. Not saying it’s totally related, but it doesn’t seem like someone who has access to cartoon CSAM will normalize it and do it in real life.

      • Lightor@lemmy.world · ↑4 · 6 days ago

        Sure, the same way video games normalize stealing cars. Or the same way movies normalize killing people. I mean at some point you gotta stop blaming media.

        • jaschen@lemm.ee · ↑16 ↓4 · 7 days ago

          People who downvoted you are lazy; do a quick Google search on the topic.

      • LifeInMultipleChoice@lemmy.dbzer0.com · ↑3 ↓11 · 7 days ago

        So if legalized porn reduces rapes, as studies show, how do we figure out whether this existing leads to less abuse of kids, or whether it spawns long-term interest?

        • jaschen@lemm.ee · ↑18 · 7 days ago

          Cartoon CSAM has been legal in Japan for decades, and they have a lower rate of CSA per capita than the USA.

          There are some brain studies showing that the area of the brain responsible for caring for children sits right next to the part responsible for sexual pleasure. The studies suggest there might be a misfiring of the connections between these regions that could cause someone to be a pedo. These people don’t experience sexual pleasure without thinking about kids. It’s literally a disability.

          My opinion is that we don’t know whether removing AI-generated CSAM would make things worse for real-life kids or not. But flat-out banning it without proper research would be irresponsible.

          • BlackLaZoR@fedia.io · ↑10 · 7 days ago

            But flat-out banning it without proper research would be irresponsible.

            I think the whole argument is moot. AI image generation is available to pretty much everyone, so it’s impossible to control what people are doing with it.

              • BlackLaZoR@fedia.io · ↑10 · 7 days ago

                Self-hosting is trivial these days: any modern NVIDIA card will do, and there are hundreds of models available online.

          • LifeInMultipleChoice@lemmy.dbzer0.com · ↑7 · edited · 7 days ago

            Thanks for the thoughts. The way people were only downvoting originally, without providing any actual explanation why, had me thinking it had just been dumb to ask.

      • Krauerking@lemy.lol · ↑9 ↓2 · 7 days ago

        That’s like saying the only people who bake wedding cakes are bakers…

        I mean, yeah. But what of it? Or are you already attaching an implication of abuse, and that connotation, to the mental disorder?

        • Klear@lemmy.world · ↑1 ↓1 · 6 days ago

          Bakers are people who bake for people who don’t bake for themselves…

        • Wogi@lemmy.world · ↑2 ↓9 · 7 days ago

          The act of baking does indeed make you a baker. Definitionally.

          Just because you aren’t going pro doesn’t mean you aren’t making a cake.

      • jaschen@lemm.ee · ↑19 ↓4 · 7 days ago

        If a pedophile sexualizes fake AI children in his basement but is a productive member of society and never acts on it in real life, do you think this person deserves to be in jail?

        • Wogi@lemmy.world · ↑3 ↓18 · 7 days ago

          A: all models are trained on something.

          2: you’re building your own straw man here. You’ve set up an extremely narrow condition under which this particular type of pedophilia is acceptable. Prove to me that that’s the norm, that it’s a typical use scenario, and that people looking at that crap are exclusively looking at loli and not images meant to look like real people, and there’s a debate to be had there. But if you think any of that is true, you’re lying to yourself. Sexualization of others is not going to happen in a vacuum under sterile conditions; it’s going to bleed into real life.

          • jaschen@lemm.ee · ↑16 ↓2 · 7 days ago

            Prove to me that removing this won’t make things bleed into real life even more than they do now. You can’t either.

            What I can prove is that Japan has had CSAM cartoons for decades and has less CSA per capita than the USA. Is it possible the Japanese know something we don’t? Who knows.

            Can you prove to me that the AI models were trained on real CSAM material? If so, not reporting it to the FBI seems irresponsible.

          • ILikeTraaaains@lemmy.world · ↑3 ↓1 · 6 days ago

            Generative models don’t work like that. If they did, how do you explain that I can generate a picture of a purple six-legged cat shooting lasers from its eyes in space?

            In a very, very simplified way, the models are trained to start from noise and denoise it until the image is “restored”. One part of the model learns to remove noise until a drawing of a child is restored, another learns to restore a drawing of a nude woman. Basically, when you tell the model to restore a drawing of a nude child from noise, it combines the two processes (it is also trained to combine things in a way that makes sense).

      • LifeInMultipleChoice@lemmy.dbzer0.com · ↑5 ↓3 · 7 days ago

        Sure, but like I asked above: if porn reduces rapes, how do we know that this (gross as it is) doesn’t reduce children being sexually assaulted? I can’t think of a single safe way it could be tested or monitored to find the lesser long-term evil.

    • sorrybookbroke@sh.itjust.works · ↑6 ↓29 · edited · 7 days ago

      The “kink” you are picking is drawn child porn. I don’t care if nobody was directly hurt by your consumption of drawn child porn; you are consuming child porn. You are a pedophile: somebody attracted to children sexually.

      I don’t care if studies show that pedophiles who watch drawn child porn aren’t likely to offend. They are pedophiles. I know it’s a wild thing to state, but I don’t like pedophiles. The debate on legality due to harm reduction is another thing altogether, but at no point did I bring that up. I only asked that we not support or make AI porn of fictional children.

      Your support of a subset of child porn, particularly AI-generated and drawn, is noted though. Thank you for stating as much.

      • 𝙲𝚑𝚊𝚒𝚛𝚖𝚊𝚗 𝙼𝚎𝚘𝚠@programming.dev · ↑32 ↓2 · 7 days ago

        They are pedophiles. I know it’s a wild thing to state but I don’t like pedophiles.

        This makes sense and all, but a pedophile who hasn’t harmed a child hasn’t caused any harm. These people have a disorder that should be treated, but this isn’t always easy. If this can give them some outlet that prevents any actual harm being done to children, then that can easily be argued to be a net positive.

        I’d rather these people jack off to AI porn than to real child porn, or worse, turn to actual sexual abuse of children. What’s wrong with preventing child abuse?

        • sorrybookbroke@sh.itjust.works · ↑4 ↓11 · edited · 7 days ago

          I would agree if we saw some meta-analysis suggesting this, but the evidence for the effect is thin. The studies you cite in other comments are inconclusive, are not the majority, and only show mild effects. This is not scientific fact yet, and all the evidence shows a mild effect at best.

          Even if it did, though, they are still a pedophile. They are masturbating to child porn. We should not accept that as a positive thing, and we should not support people who make child porn. These are the people who most need to seek help. If part of that help is jacking it to drawn child porn, so be it, but let it be under the care of a professional.

          The fact that one doesn’t offend only stops one from being a monster: a child molester or child rapist. A pedophile is still immoral.

          My issue is that child porn is inherently wrong. It is a fundamental negative whether drawn or generated. Some things are not about material harm; they are about base morality. Sexualizing children is a fundamental wrong.

          If the only thing stopping you from raping, molesting, or otherwise harming a child is drawn child porn, you are not a good person. That is terrifying and disgusting.

          Lastly, our brains are neuroplastic. Anyone can develop a fetish through constant exposure to something in a positive sexual setting. Something may disgust you, say poop, but if you jack off to the thought long enough you will develop a fetish. This, unlike the claim that drawn child porn is helpful, is well known. Harm to children or not, this creates more pedophiles: people who think of children in a sexual manner.

          • jaschen@lemm.ee · ↑14 ↓1 · 7 days ago

            No sane person is denying what you’re saying. With children of my own, I want to do anything and everything possible to protect them.

            That said, there is research on people who consume cartoon CSAM and have never committed real-life abuse. They have a problem. Taking away something that doesn’t hurt anyone might not improve the protection of our children, and might even make things worse.

      • jaschen@lemm.ee · ↑14 ↓2 · 7 days ago

        If a pedophile sexualizes fake AI children in his basement but is a productive member of society and never acts on it in real life, do you think this person deserves to be in jail?