A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being put to nefarious uses.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of officers leading McCorkle, still in his work uniform, out of the theater in handcuffs.

  • ReallyActuallyFrankenstein@lemmynsfw.com

    It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (beyond “obscenity,” not a specific child abuse statute?), because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

    I completely get the “lock them all up and throw away the key” visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger; do these laws bar them from work that would be legal for others who merely look older? If an AI was trained exclusively on those over-18 people, would the outputs then not be CSAM, even if the images produced features that looked under 18?

    I’m at least all for a “fruit of the poisoned tree” theory - if an AI model’s training data includes actual CSAM, then the model and its outputs can and should be made illegal. Intentionally deepfaking real people under 18 is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it’s going to be impossible to tell AI from real imagery soon, and allowing that would undermine enforcement of vital anti-real-CSAM laws.

    The real hard case is producing and retaining images of fully fake people, with no real CSAM in the training data, solely locally (possession crimes). That’s really tough. Not only does its creation not directly hurt anyone, there’s a possible benefit: it could diminish the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and could divert the producer’s impulse away from preying on children around them out of unfulfilled desire.

    Could, because I don’t think there are studies that answer whether those things are true.

    • mpa92643@lemmy.world

      I mostly agree with you, but a counterpoint:

      Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I’ve read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn’t enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.

      CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model trained on non-abusive content probably isn’t exploitative in itself, but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

      Not an easy question for sure, and it’s one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.

      • Cryophilia@lemmy.world

        but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

        Why should that be a question at all? If it causes harm, ban it. If not, don’t. Being “associated with” should never be grounds for a legal statute.

      • 9bananas@lemmy.world

        generally a very good point, however i feel it’s important to point out some important context here:

        the pedophiles you’re talking about in your comment are almost always members of tight knit communities that share CSAM, organize distribution, share sources, and most importantly, indulge their fantasies/desires together.

        i would think that the correlation that leads to molestation is not primarily driven by the CSAM itself, but rather the community around it.

        we clearly see this happening in other similarly structured and similarly isolated communities: nazis, incels, mass shooters, religious fanatics, etc.

        the common factor in radicalization and development of extreme views in all these groups is always isolation and the community they end up joining as a result, forming a sort of parallel society with its own rules and ideals, separate from general society. over time people in these parallel societies get used to seeing the world in a way that aligns with the ideals of the group.

        nazis start to see anyone not part of their group as enemies, incels start to see “females” instead of women, religious fanatics see sinners…and pedophiles see objects that exist solely for their gratification instead of kids…

        I don’t see why molesters should be any different in this aspect, and would therefore argue that it’s the communal aspect that should probably be the target of the law, i.e.: distribution and organization (forums, chatrooms, etc.)

        the harder it is for them to organize, the less likely these groups are to produce predators that cause real harm!

        if on top of that there is a legally available outlet where they can indulge themselves in a safe manner without harming anyone, I’d expect rates of child molestation to drop significantly, because, again, there’s precedent from similar situations (overdoses in drug addicts, for example)

        i think it is a potentially fatal mistake to think of pedophiles as “special” cases, rather than just another group of outcasts, because in nearly all cases of such pariahs the solutions that prove to work best in the real world are the ones that make these groups feel less like outcasts, which limits avenues of radicalization.

        i thought these parallels were worth pointing out.

      • MagicShel@programming.dev

        The problem is that the empirical data cannot be gathered morally or ethically. You can’t show a bunch of people porn and then statistically observe whether those shown child porn are more likely to assault children. So we have to go forward without that data.

        I will anecdotally observe that anal sex, oral sex, and facials have become more common between partners as their prevalence in porn has gone up. That suggests, but does not prove, a direct statistical harm caused by even “ethically produced CSAM.”

        • usualsuspect191@lemmy.ca

          I will anecdotally observe that anal sex, oral sex, and facials have become more common between partners as their prevalence in porn has gone up. That suggests, but does not prove, a direct statistical harm caused by even “ethically produced CSAM.”

          Can we look at trends between consenting adults (who are likely watching porn of real people by the way) as an indicator of what pedophiles will do? I’m not so sure. It’s not like step sibling sex is suddenly through the roof now with it being the “trend” in porn.

          Looking specifically at fake rape porn and seeing whether it increases rates of rape in the real world might be a better indicator.

          • LustyArgonianMana@lemmy.world

            Do you think people like Andrew Tate have caused more rapes to occur? Like do you think his rhetoric encourages a rapist mindset in his listeners?

          • MagicShel@programming.dev

            That’s fair. I tried to make clear that my interpretation is not in any way scientific or authoritative. Better correlations are probably possible.

            ETA on further thought: I wonder if prevalence of incest porn has had an effect on actual incest rates. That might be a much closer correlation due to the similar social taboo. But I’m not sure we have good data on that, either.

        • mpa92643@lemmy.world

          True, it wouldn’t be ethical to conduct an experiment, but we can (and probably do) collect lots of observational data that can provide meaningful insight. People are arrested at all stages of CSAM-related offenses: possession, distribution, solicitation, and active abuse.

          While observations and correlations are inherently weaker than experimental data, they can at least provide some insight. For example: what percentage of those in possession of only artificially generated CSAM for at least one year go on to solicit minors, versus those possessing “real” CSAM?

          If it seems that artificial CSAM is associated with a lower rate of solicitation, or if it ends up decreasing overall demand for “real” CSAM, then keeping it legal might provide a real net benefit to society and its most vulnerable even if it’s pretty icky.

          That said, I have a nagging suspicion that the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all. There’s also the risk that artificial CSAM reduces the taboo of CSAM and can be an on-ramp to more harmful materials for those with pedophilic tendencies that they otherwise are able to suppress. But it’s still way too early to know either way.

          • Cryophilia@lemmy.world

            True, it wouldn’t be ethical to conduct an experiment

            I think it would be ethical for researchers to go onto the boards of these already-existing CP distribution forums and conduct surveys. But then the surveyors would be morally obligated to report that board to the authorities to get it shut down. Which means no one would ever answer the surveyors’ questions, because they’d know the board would soon be shut down and they’d just move to a new site. Ugh…

            Yeah nvm I don’t see any way around this one

          • HelixDab2@lemm.ee

            the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all.

            Perhaps. But what about when they can’t tell the difference between real and virtual? It seems like the allure of all pornography is the fantasy, rather than the reality. That is, you may enjoy extreme BDSM pornography, and enjoy seeing a person flogged until they’re bleeding, or see needles slowly forced through their penis, but do you really care that it’s a real person that’s going to end the scene, take a shower, and go watch a few episodes of “The Good Place” with their dog before bed? Or is it about the power fantasy that you’re constructing in your head about that scene? How important is the reality of the scene, versus being able to suspend your disbelief long enough to get sexual gratification from it? If the whole scene was done with really good practical effects and CG, would your experience, as a user–even if you were aware–be different?

      • HelixDab2@lemm.ee

        CSAM possession is illegal because possession directly supports creation

        To expound on this: prior to this point, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harms have occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm–that is, children are not being sexually abused in order to create the drawings–then how different is a computer-generated image that isn’t based on any specific person or event? It seems to me that whether or not a pedophile might eventually decide they want more than AI-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.

        Allow me to float a second possibility that will certainly be less popular.

        Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile–a person exclusively sexually attracted to pre-pubescent children–does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic, though, and that many child molesters are opportunistic sexual predators rather than pedophiles.) Secondly, rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would ask is: would wide availability of AI-generated CSAM–CSAM that didn’t cause any real, direct harm to children–actually decrease rates of child sexual assault?

        • RandomlyNice@lemmy.world

          With regards to your last paragraph: pedophiles can indeed be straight, gay, or bi. Pedophiles may also never become molesters, and molesters of children may not be pedophilic at all. It seems you understand this. I mentioned ITT that I read a newspaper article many years ago that was commissioned to show that access to CP would increase child abuse; it seemed to show the opposite.
          If people could use AI to generate their own porn of their own personal fantasies (whatever those might be) and NOT share that content, what then? Canada allows this for text (maybe certain visuals? Audio? IDK). I don’t know the current ‘obscenity’ laws in the USA; however, I do recall reading about an art exhibit in NY which featured an upside-down urinal that was deemed obscene, then later deemed a work of art. I also recall seeing (via an internet image) a sculpture of what seemed to be a circle of children with penises as noses. Porn? Art? Comedy?

          • HelixDab2@lemm.ee

            My understanding was that ‘pure’ pedophiles–ones that have no interest at all in post-pubescent children or any adults whatsoever–tend to be less concerned with sex/gender, particularly because children don’t have defined secondary sex characteristics. I don’t know if this is actually correct though. I’m not even sure how you could ethically research that kind of thing and end up with valid results.

            And honestly, not being able to do solid research that yields valid results makes it really fuckin’ hard to find solutions that prevent as many children from being harmed as possible. In the US, at least, research about sex and sexuality in general–much less deviant sexualities–seems to be taboo, and very difficult to get funding for.

        • 2xsaiko@discuss.tchncs.de

          Hard to say. I generally agree with what you’ve said though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.

          However part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that otherwise it helps people facilitate/organize distribution of real CSAM, which increases demand for it. That’s what I’ve heard at least and it makes sense to me. And I feel like that would apply to AI generated as well.

          • HelixDab2@lemm.ee

            It’s obvs. very hard to get accounts of what pedophiles are doing; the only ones that you can survey are ones that have been caught, which isn’t necessarily a representative sample. I don’t think that there are any good estimates on the rate of pedophilic tendencies.

            the reason loli/whatever is also illegal to distribute

            From a cursory reading, it looks like possession and distribution are both felonies. Lolicon hentai is pretty widely available online, and prosecutions appear to be very uncommon when compared to the availability. (Low priority for enforcement, probably?)

            I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though. To put it another way, I’m sure that increasing the supply of gay porn would increase consumption of gay porn, but I am pretty sure that it’s not going to make more people gay. And people that aren’t gay (or at least bi-) aren’t going to be interested in gay porn, regardless of how hard up (heh) they might be for porn, as long as they have any choices at all. There’s a distinction between fetishes/paraphilia, and orientations, and my impression has been that pedophilia is much more similar to an orientation than a paraphilia.

            • 2xsaiko@discuss.tchncs.de

              I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though.

              No, but allowing people to organize increases demand, because those who want CSAM then have a place to look for it and ask for it where it’s safe for them to do so, and maybe even pay for it to be created. It’s rather the other way around: the demand increases the supply, if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or anything like that, at all.

              • HelixDab2@lemm.ee

                I guess where I come down is that, as long as no real people are being harmed–either directly, or because their likeness is being used–then I’d rather see it out in the open than hidden. At least if it’s open you can have a better chance of knowing who is immediately unsafe around children, and easily using that to exclude people from positions where they’d have ready access to children (teachers, priests, etc.).

                Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around them.

      • ObjectivityIncarnate@lemmy.world

        Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them.

        But this is like the arguments used to claim that weed is a “gateway drug”: people point out that those strung out on harder drugs almost always have done weed as well, while ignoring everyone who uses only weed. And this case is even hazier, because we literally have no real idea how many people consume that stuff but never ‘escalate’.

        I remember reading about some research out of Japan finding that child molesters consume less porn overall than the average citizen. That seems counter-intuitive, but may not be if you consider the possibility that the material (in this case, primarily manga with anime-style drawings of kids in sexual situations) is actually curbing the incidence of the ‘real thing’: the ones actually touching kids in the real world are reading those mangas less.

        I’m also reminded of people talking about sex dolls that look like kids, and if that’s a possible ‘solution’ for pedophiles, or if it would ‘egg on’ actual molestation.

        I think I lean on the side of ‘satiation’, from the limited bits of idle research I’ve done here and there. And if that IS in fact the case, then regardless of if it grosses me out, I can’t in good conscience oppose something that actually reduces the number of children who actually get abused, you know?

        • LustyArgonianMana@lemmy.world

          It’s less that these materials are like a “gateway” drug and more that they could be considered akin to advertising. We already have laws restricting advertising precisely because it’s so effective, including for cigarettes and prescription drugs.

          Second, the role that CP plays in most countries is complicated. It is used for blackmail. It is also used to generate money for countries. And it’s used as advertising for actual human trafficking organizations. Similar organizations exist for snuff and gore, btw, and of course animal abuse, and any combination of those three. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?

          So it’s important not to let these advertisers run their most famous monkey torture video through enough AI that they can claim it’s AI-generated when it’s really just an ad for their monkey torture productions. And they do the same with CP, rape, gore, etc.

          • tamal3@lemmy.world

            People, please don’t just downvote with no comment. Why is this being downvoted? The comparisons to advertisements have validity. If you disagree, be productive and tell us why.

            • LustyArgonianMana@lemmy.world

              Because a huge percentage of Lemmy is sexist and I am openly a woman. You’ll know because this comment will get nuked also.

    • Cryophilia@lemmy.world

      I’m at least all for a “fruit of the poisoned tree” theory - if an AI model’s training data includes actual CSAM, then the model and its outputs can and should be made illegal.

      By that standard, all AI is now illegal: it’s trained by scraping the internet, which will include CP along with every other kind of image.

    • ObjectivityIncarnate@lemmy.world

      I don’t know if it’s still a thing, but I’m reminded of some law or regulation that was passed a while back in Australia, iirc, that barred women with A-cup busts from working in porn, the “reasoning” being that their flatter chests made them look too similar to prepubescent girls, lol…

      Not only stupid but also quite insulting to women, imo.

    • HelixDab2@lemm.ee

      Because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

      To the best of my knowledge, calling drawn works obscene has been upheld in courts, most often because the artist(s) lack the financial ability to fight the charges effectively. The artist for the underground comic “Boiled Angel” had his conviction for obscenity upheld–most CSAM work falls under obscenity laws–and ended up giving up the fight to clear his name.

      • ReallyActuallyFrankenstein@lemmynsfw.com

        Oh, for sure. I’m talking about laws specifically targeted to minors. “Obscenity” is a catch-all that is well-established, but if you are trying to protect children from abuse, it’s a very blunt instrument and not as effective as targeted abuse and trafficking statutes. The statutory schemes used to outlaw virtual CSAM have failed to my knowledge.

        For example: https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition

        That case was statutorily superseded in part by the PROTECT Act, which attempted to differentiate itself by…relying on an obscenity standard. So it’s a bit illusory that it does anything new.

        • HelixDab2@lemm.ee

          The PROTECT Act has been, so far, found to be constitutional, since it relies on the obscenity standard in regards to lolicon hentai. Which is quite worrisome. It seems like it’s a circular argument/tautology; it’s obscene for drawn art to depict child sexual abuse because drawings of child sexual abuse are obscene.

    • brbposting@sh.itjust.works

      simulated CSAM

      When I used this phrase, someone told me it described a nonexistent concept, and that the CSAM term existed in part to differentiate between content where children were harmed to make it versus not. I didn’t wanna muddy any waters but do you have an opposing perspective?

      Intentionally deepfaking real people under 18 is also not black and white

      Interesting. Sounds real bad. See what you mean about harm factor though.

    • snooggums@midwest.social

      Even worse, you don’t need CSAM to start with. If a model’s training set includes regular porn plus nude reference photography of people under 18 (the kind used for drawing anatomy), it has enough information to combine the two. Hell, it probably doesn’t even need the under-18 subjects to be nude.

      Hell, society tends to assume any nudity under 18 to be CSAM anyway, because someone could see it that way.

  • beepnoise@piefed.social

    Honestly, I don’t care whether it’s AI and not real; I’m glad the man was arrested. He needs some serious help for sexualising kids.

          • Samvega@lemmy.blahaj.zone

            You don’t think that reducing testosterone, and therefore sex drive, will change offending rates? That runs contrary to research, which has reliably found this to be the most effective therapy in terms of reducing recidivism.

            • macniel@feddit.org

              That guy didn’t even commit anything beyond having AI imagery depicting children.

              That guy has a mental problem that you can’t treat by chemical castration alone. He needs more than that.

              • Samvega@lemmy.blahaj.zone

                That does not change the fact that chemical castration is the most successful treatment we have to stop CSA recidivism at present.

                 

                That guy didn’t even commit anything beyond having AI imagery depicting children.

                Possessing and distributing images that sexually objectify children may be a crime, even if generated by AI.

            • Farid@startrek.website

              Cutting off their testicles and straight-up executing them would also reduce the offending rates. Even more effectively than chemical castration, I’m sure. But we wouldn’t be calling that helping the offender, would we? And the comment above was specifically talking about helping them.
              What we have now is more of a middle ground between the amount of damage caused to the patient and the safety guarantees for society. We obviously prioritize society’s safety, but we should be striving for less damage to the patient, too.

              • Samvega@lemmy.blahaj.zone

                …we should be striving for less damage to the patient, too.

                Can you make someone just not sexually interested in something they find arousing? As far as I know, conversion therapy for non-heterosexual people doesn’t have good success rates. Also, those therapies also tended to involve some form of harm, from what I’ve heard.

                • redfellow@sopuli.xyz

                  It’s not about making someone want something less, but about helping them never act on those urges.

                  Computer generated imagery could in theory be helpful, so the itch gets scratched without creating victims and criminals.

                  I’d call that a win-win in terms of societal well-being, as fewer funds are wasted on police work, jailing perpetrators, and therapy for victims.

                • Farid@startrek.website

                  Can you make someone just not sexually interested in something they find arousing?

                  No, I can’t. Doesn’t mean that we (as a society) shouldn’t be working on finding ways to do it or finding alternative solutions. And it’s necessary to acknowledge that what we have now is not good enough.

                  those therapies also tended to involve some form of harm

                  They probably did. But nobody here is claiming those were good or helping the patients either.

      • treefrog@lemm.ee

        Depending on the state, yes actually.

        I did time in a medium security facility that also did sex offender treatment (I was there on drug charges). I still have friends that went through that program.

        The men who were actually invested in getting better got better. The ones invested in staying well are still well.

    • Cosmonauticus@lemmy.world

      You and I both know he’s not going to get it. I have a kind of sympathy for ppl who are attracted to kids but refuse to act on it. They clearly know it’s not normal and recognize the absolute life-destroying damage they could cause if they acted on it. That being said, there aren’t many places you can go to seek treatment. Any institution that advertised treatment would have ppl outside with pitchforks and torches.

      Before anyone tries to claim I’m pro pedo you can fuck right off. I just wish it was possible for ppl who are attracted to kids and not out touching them to get some kind of therapy and medication to make them normal (or destroy their sex drive) before something terrible happens.

      • Samvega@lemmy.blahaj.zone

        to get some kind of therapy and medication to make them normal

        Hi, Psychologist here. Does society have strong evidence that therapeutic interventions are reducing rates of, say, the most common disorders of anxiety and depression? Considering that the rates of these are going up, I don’t think we can assume there’s a hugely successful therapy to help those attracted to CSA images to change. Psychology is not a very good science principally because it offers few extremely effective answers in the real world.

        In terms of medication, androgen antagonists are generally used, because lowering testosterone generally leads to a lower sex drive. Here is an article about those drugs, including an offender who asked for them: https://www.theguardian.com/society/2016/mar/01/what-should-we-do-about-paedophiles

        TW: the article contains discussion of whether offenders are even psychologically disordered, when set within a historical cultural context of child-marriage. This paragraph is two above the illustration of people trapped within concentric circular walls, and starts “In the 2013 edition …”.

        Collis began to research the treatment and decided that it was essential to his rehabilitation. He believes he was born a paedophile, and that his attraction to children is unchangeable. “I did NOT wake up one morning and decide my sexual preference. I am sexually attracted to little girls and have absolutely no interest in sex with adults. I’ve only ever done stuff with adults in order to fit in with what’s ‘normal’.” For Collis, therefore, it became a question of how to control this desire and render himself incapable of reoffending.

        […]

        Many experts support Aaron Collis’s self-assessment, that paedophilia is an unchangeable sexual preference. In a 2012 paper, Seto examined three criteria – age of onset, sexual and romantic behaviour, and stability over time. In a number of studies, a significant proportion of paedophiles admitted to first experiencing attraction towards children before they had reached adulthood themselves. Many described their feelings for children as being driven by emotional need as well as sexual desire. As for stability over time, most clinicians agreed that paedophilia had “a lifelong course”: a true paedophile will always be attracted to children. “I am certainly of the view,” Seto told me, “that paedophilia can be thought of as a sexual orientation.”

        Brain-imaging studies have supported this idea. James Cantor, a psychiatry professor at the University of Toronto, has examined hundreds of MRI scans of the brains of paedophiles, and found that they are statistically more likely to be left-handed, shorter than average, and have a significantly lower density of white matter, the brain’s connective tissue. “The point that’s important for society is that paedophilia is in the brain at all, and that the person didn’t choose it,” Cantor told me. “As far as we can tell, they were born with it.” (Not that this, he emphasised, should excuse their crimes.)

        […]

        Clinical reality is a little more complicated. “There’s no pretence that the treatment is somehow going to cure them of paedophilia,” Grubin told me. “I think there is an acceptance now that you are not going to be able to change very easily the direction of someone’s arousal.” Grubin estimates that medication is only suitable for about 5% of sex offenders – those who are sexually preoccupied to the extent that they cannot think about anything else, and are not able to control their sexual urges. As Sarah Skett from the NHS put it: “The meds only take you so far. The evidence is clear that the best treatment for sex offending is psychologically based. What the medication does is help people have a little bit of control, which then allows them to access that treatment.”

         

        Some research on success rates:

        Prematurely terminating treatment was a strong indicator of committing a new sexual offense. Of interest was the general improvement of success rates over each successive 5-year period for many types of offenders. Unfortunately, failure rates remained comparatively high for rapists (20%) and homosexual pedophiles (16%), regardless of when they were treated over the 25-year period. [https://pubmed.ncbi.nlm.nih.gov/11961909/]

        Within the observation period, the general recidivism and sexual recidivism rates were 33.1% and 16.5%, respectively, and the sexual contact recidivism rate was 4.7%. [https://journals.sagepub.com/doi/abs/10.1177/0306624X231165416 - this paper says that suppressing the sex drive with medication was the most successful treatment]

        Men with deviant sexual behavior, or paraphilia, are usually treated with psychotherapy, antidepressant drugs, progestins, and antiandrogens, but these treatments are often ineffective. Selective inhibition of pituitary–gonadal function with a long-acting agonist analogue of gonadotropin-releasing hormone may abolish the deviant sexual behavior by reducing testosterone secretion. [https://www.nejm.org/doi/full/10.1056/nejm199802123380702 - this paper supports that lowering testosterone works best]

          • LustyArgonianMana@lemmy.world

            Not always. There are people with brain injuries who suddenly develop an attraction towards kids and it’s not really due to power dynamics or anything else.

          • Rai@lemmy.dbzer0.com

            For people actually abusing? Spot on, most of the time.

            For non-offending paedos? Nah… a horrible affliction.

        • LustyArgonianMana@lemmy.world

          I don’t understand why we haven’t used inhalable oxytocin as an experimental drug for people attracted to children and animals. It seems intuitive: children and animals generate oxytocin in humans automatically, and it’s possible some people need a stronger stimulus to release oxytocin, or may not have much endogenous oxytocin. Oxytocin can be compounded at a pharmacy and has been used successfully for social anxiety.

        • z3rOR0ne@lemmy.ml

          Thank you for such a well laid out response and the research to back it up. I rarely see people approaching the subjects of pedophilia, and how best to treat pedophiles, rationally and analytically.

          It’s understandable, considering the harm they can cause society, that most people can only ever view them as monsters. And indeed, those who are incapable of comprehending the harm they cause, or of empathizing with those they have harmed or could harm, are IMHO some of the more loathsome individuals.

          That said, I think people are too often willing to paint those whose proclivities are alien and antithetical to our own not only as monsters, but as monsters not worth understanding with any degree of nuance. In doing so, we ultimately do ourselves and future generations a disservice by not even attempting to address the issue at hand, in the hopes that the most harmful parts of our collective psyche might be treated, palliatively, to the best of our ability.

          Your annotated sources indicate that there is not nearly as clear a path forward as the “pedophiles are simply monsters and there’s no reason to look into their motives further” camp would like to believe, while the very existence of the attempted treatments shows that there is more work to be done toward a more lasting and successful rate of treatment.

          Like many of the psychological ailments plaguing societies today, you cannot simply kill and imprison the problem away. That is always a short-term (albeit at times temporarily effective) solution. Greatly reducing the occurrence of pedophilia will ultimately require more of this kind of research, and more analysis and study toward achieving such ends.

          Again, I thank you for your nuanced post, and commend you for taking your nuanced stance as well.

  • hexdream@lemmy.world

    If this thread (and others like it) has taught me anything, it’s that, facts be damned, people are opinionated either way. Nuance means nothing, and it’s basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study said 100% that AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed any real children as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

    • Eezyville@sh.itjust.works

      My man. Go touch some grass. This place is no good. Not trying to insult you but it’s for your mental health. These Redditors aren’t worth it.

      • SynopsisTantilize@lemm.ee

        A lot of the places I’ve been to start conversation have been hostile and painful. If there is one thing that stands out that’s holding Lemmy back it’s the shitty culture this place can breed.

          • SynopsisTantilize@lemm.ee

            I accidentally went to Hexbear the other day… But yea I guess. Just wish there was more participation and less negativity

        • NauticalNoodle@lemmy.ml

          I’m convinced that a lot can be inferred from the type of reactions and the level of hostility one receives when trying to present a calm and nuanced argument on a wedge topic, even if it’s not always enjoyable. At the very least, it shows others that they may not be interacting with rational actors when one gets one’s opponents to go full mask-off.

          • SynopsisTantilize@lemm.ee

            Agreed. And I’ve had my share of “being a dick” on the Internet here. But by the end of the interaction I try to at least jest. Or find a middle ground…I commented on a Hexbear instance by accident once…

    • NauticalNoodle@lemmy.ml

      I was hoping to comment on this post multiple times today after I initially lost track of it, and now I see you’ve covered about 75% of what I wanted to say. I’ll post the rest elsewhere out of politeness. Thank you.

  • BilboBargains@lemmy.world

    If no children were involved in the production of porn, how is it pedophilic? That’s like claiming a picture of water has the same properties as water.

    • derpgon@programming.dev

      However, a picture of water makes me thirsty. But then again, there is no substitute for water.

      I am not defending pedos, or defending Florida for doing something like that.

      • Sarmyth@lemmy.world

        That might be a you thing. Pictures of water don’t make me thirsty. I get the metaphor you’re attempting to make, though.

    • Revan343@lemmy.ca

      It’s pedophilic because it’s sexual images of children; fake or not does not change that. Drawn images of child pornography are still pedophilic.

      The more important question is, is it CSAM? Whether drawn images that represent no real child are or not depends on the legal jurisdiction. Should drawn and AI generated child porn be legal or banned? I think the actual answer to that would require significant research into whether its existence acts as an outlet to prevent pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.

      Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-à-vis pedophiles harming actual children is as unlikely to happen as any other research into pedophilia.

      • aesthelete@lemmy.world

        It’s already illegal to do basically most of the things leading up to a murder. You’re not allowed to conspire to commit one, stalk your target, break into a building, torture animals, kidnap, hire a hitman, etc.

        • garpujol@discuss.online

          TV and movies should not be able to show crimes, because images depicting crimes should be illegal.

          (I’m just illustrating the slippery slope criminalizing “artistic” renderings)

          • aesthelete@lemmy.world

            I’m not advocating for what you’re saying here at all.

            So there you go, your slope now has gravel on it.

            EDIT: This dude was arrested using today’s laws, and I’m pretty sure the series Dexter is still legal to write, direct, film, and view. So your slippery slope is a fallacious one (as most of them tend to be in my experience).

              • aesthelete@lemmy.world

                It should be illegal for a number of reasons. One is a simple practical one: as the technology advances towards increasing levels of realism it’ll become impossible for law enforcement to determine what material is “regular” CSAM versus what material is “generated” CSAM.

                So, unless you’re looking to repeal all laws against possession of CSAM, you’ll have a difficult time crafting a cut-out for generated CSAM.

                And honestly, why bother? What’s the upside here? To have pedos get a more fulfilling wank to appease them and hope they won’t come after your kids for real? I really doubt the premise behind that one.

                • Cryophilia@lemmy.world

                  And honestly, why bother? What’s the upside here?

                  Creating victimless crimes simply because a group is undesirable sets a terrible precedent. We can’t ban things just because they make us uncomfortable, or because they’d make law enforcement’s job easier.

        • homura1650@lemmy.world

          But you are allowed to stage a murder; hire a film crew to record your staged murder; pay television stations and websites to show a preview of your staged murder, and sell a recording of your staged murder to anyone who wants to buy. Depending on how graphic it is, you might be required to put an age advisory on your work and not broadcast it on public airwaves; but that is about the extent of the regulation.

          You can even stage murders of children and show that.

          Even if the underlying murder is real, there is still no law outlawing having a recording of it. Even producing a recording of a murder isn’t technically illegal; although depending on the context you might still be implicated in a conspiracy to commit murder.

          Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

          • aesthelete@lemmy.world

            Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

            Okay. Why should I care that this is an exception case?

            Laws aren’t derived from clean general principles by the gods on Mt. Olympus.

            • aesthelete@lemmy.world

              Another thing that you “can’t we think of the poor, helpless pedos that want to nut?” people don’t seem to consider: if AI CSAM grows increasingly realistic and we carve out an exception for it, how can you enforce laws against non-generated CSAM? You’d have to involve some computer-forensics expert in every case to prove whether the images are generated, which would likely produce a chilling effect over time on regular CSAM cases, and all of this for the “societal benefit” of allowing pedos a more gratifying wank.

              • Cryophilia@lemmy.world

                For the societal benefit of not ceding our rights to government.

                Today it’s pedophilia, but what if Trump wins and legally designates trans people as pedophiles?

                This is a power we cannot allow the government to have.

                • aesthelete@lemmy.world

                  For the societal benefit of not ceding our rights to government.

                  The right to create realistic looking CSAM is not a right I give a shit about having.

                  Today it’s pedophilia, but what about if Trump wins and legally designates trans people as pedophiles?

                  What if a lawless dictator does crazy things? I’m not sure the law (or lack thereof) will have anything to do with that scenario.

                  The whole idea of “if X is illegal, then the government will use the law against X against us” only matters in the context of a government following the law.

                  If they’re willing to color outside the lines and the courts say the law doesn’t apply to them then it doesn’t matter one fucking bit.

  • JaggedRobotPubes@lemmy.world

    Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

    Depending on which way it goes, it could be massively helpful for protecting kids. I just don’t have a sense for what the effect would be, and I’ve never seen any experts weigh in.

    • PhilMcGraw@lemmy.world

      In Australia, cartoon child porn is treated under the law the same way as actual child porn. Not that it answers your question, but it’s interesting.

      I’d imagine for your question “it depends”: some people who would have acted on their urges may get their jollies from AI child porn, while others who had never considered being pedophiles might find the AI child porn (assuming it were legal) and realise it’s something they’re into.

      I guess it may lower the production of real child porn which feels like a good thing. I’d hazard a guess that there are way more child porn viewers than child abusers.

      • redfellow@sopuli.xyz

        In Australia a 30 year old woman cannot be in the porn industry if she has small breasts. That, and the cartoon ban both seem like overcompensating.

        • Queue@lemmy.blahaj.zone

          Nothing says “we’re protecting children” like regulating what adult women can do with their bodies.

          Conservatives are morons, every time.

          • Cryophilia@lemmy.world

            They’re not morons.

            Any time anyone ever says they want to do anything “to protect the children” you should assume it’s about control. No one actually gives a shit about children.

    • Thespiralsong@lemmy.world

      I seem to remember Sweden did a study on this, but I don’t really want to google around to find it for you. Good luck!

    • Cryophilia@lemmy.world

      Real question: “do we care if AI child porn is bad?” Based on most countries’ laws, no.

    • 31337@sh.itjust.works

      Wikipedia seems to suggest research is inconclusive whether consuming CSAM increases the likelihood of committing abuse.

    • Pankkake@lemmy.world

      Depending on which way it goes, it could be massively helpful for protecting kids

      Weeeelll, only until the AI model needs more training material…

      • JaggedRobotPubes@lemmy.world (+4/-6) · 2 months ago

        I’m not sure if that is how it would work? But this is exactly the kind of thinking we need. Effects: intended plus unintended equals ???

      • Saledovil@sh.itjust.works (+6/-1) · 2 months ago

        You need more training material to train a new AI. Once the AI is there, it can produce as many pictures as you want. And you can get good results even with models that can be run locally on a regular computer.

    • ZILtoid1991@lemmy.world (+12) · 2 months ago

      There’s like a lot of layers to it.

      • For some, it might actually work in the opposite direction, especially if paired with the wrong kind of community around it. I used to moderate anime communities; the number of loli fans wanting to lower the age of consent to 12 or even lower was way too high, but they only called people opposed to loli “real predators”, because they liked their middle-school-tier arguments (which just further polarized the fandom when the culture wars started).
      • Even worse, more realistic depictions might actually work against that goal, while with (most) loli stuff, at least it’s obvious it’s drawn.
      • An often overlooked issue is data laundering: just call your real CP AI-generated, or add some GAI artifacts to your collection. Hungary bans overly realistic drawings and paintings of that kind, because people did this even with traditional means, by creating tracings as realistic as possible (at least calling CP “artistic nudes” didn’t work out here).
    • barsquid@lemmy.world (+19/-6) · 2 months ago

      I’d like to know what psychologists think about it. My assumption is the former, it escalates their fantasizing about it and makes them more likely to attack a child.

      There seems to be no way to conduct that experiment ethically, though.

    • ObjectivityIncarnate@lemmy.world (+35/-2) · 2 months ago

      Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

      From bits/articles I’ve seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

      I’m reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So…that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

    • Todd Bonzalez@lemm.ee (+6/-3) · 2 months ago

      There are literally mountains of evidence suggesting that normalizing child abuse in any fashion increases the rate at which children are actually abused, but that never stops a highly upvoted comment from suggesting that jacking it to simulated kids is somehow a “release valve” for actual pedophilia, which makes absolutely no fucking sense given everything we know about human sexuality.

      If this concept were true, hentai fans would likely be some of the most sexually well-adjusted people around, having tons of experience releasing their real-world sexual desires via a virtual medium. Instead, we find that these people objectify the living shit out of women, because they’ve adopted an insanely overidealized caricature of what a woman should look and act like that is completely divorced from reality.

    • Maggoty@lemmy.world (+15/-10) · 2 months ago

      You’re missing the point. They don’t care what’s more or less effective for helping kids. They want to punish people who are different. In this case nobody is really going to step up to defend the guy for obvious reasons. But the motivating concept is the same for conservatives.

    • mckean@programming.dev (+4) · 2 months ago

      There definitely is opportunity in controlled treatment. But I believe that outside of that there are too many unknowns.

  • BonesOfTheMoon@lemmy.world (+58/-2) · 2 months ago

    Could this be considered a harm reduction strategy?

    Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

    I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.

    • pregnantwithrage@lemmy.world (+14/-1) · 2 months ago

      You would think so, but you’re basically making a patchwork version of the actual illicit media, so it’s a dark, dark gray area for sure.

        • medgremlin@midwest.social (+8/-4) · 2 months ago

          Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI generated CSAM…it had to have been fed some amount of CSAM at some point or it had to be heavily manipulated to generate the images in question.
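
          To illustrate what “prediction” means here, a toy sketch in Python (purely illustrative; this is not any real image model, just a word counter that predicts whichever continuation appeared most often in its training text):

              # Toy next-word predictor: it can only echo patterns from its training data.
              from collections import Counter, defaultdict

              corpus = "the cat sat on the mat the cat ate the fish".split()

              # Count which word follows each word in the training text.
              following = defaultdict(Counter)
              for prev, nxt in zip(corpus, corpus[1:]):
                  following[prev][nxt] += 1

              def predict(word):
                  """Return the continuation seen most often after `word`."""
                  return following[word].most_common(1)[0][0]

              print(predict("the"))  # 'cat' -- the most frequent continuation in the data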

          • BonesOfTheMoon@lemmy.world (+3) · 2 months ago

            Ok, makes sense. Yuck, my skin crawls. I got exposed to CSAM via Twitter years ago; thankfully it was just a shot of nude children I saw and not the actual deed, but I was haunted.

          • CommanderCloon@lemmy.ml (+9/-1) · 2 months ago

            so to get AI generated CSAM…it had to have been fed some amount of CSAM

            No, actually, it can combine concepts that aren’t present together in the dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without having ever had CSAM in its dataset. See the corn dog comment as an argument.

            Edit: corn dog
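
            For what it’s worth, this kind of concept composition is easy to try with the open-source diffusers library. A minimal sketch (the model name and prompt are illustrative only; the classic “astronaut riding a horse” demo stands in for any two concepts that plausibly never co-occur in the training set):

                # Sketch: a pretrained diffusion model combining two learned concepts
                # at sampling time, even if they never appeared together in training.
                import torch
                from diffusers import StableDiffusionPipeline

                pipe = StableDiffusionPipeline.from_pretrained(
                    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
                )
                pipe = pipe.to("cuda")  # assumes a CUDA GPU is available

                image = pipe("a photograph of an astronaut riding a horse").images[0]
                image.save("composition.png")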

              • emmy67@lemmy.world (+1/-2) · 2 months ago

                A dumb argument. “Corn” and “dog” were in the dataset. But what it produced is not a corn dog like what we expect when we think “corn dog”.

                Hence it can’t get at what we know a corn dog is.

                You have proved the point for us, since it didn’t generate a corn dog.

              • medgremlin@midwest.social (+4) · 2 months ago

              Some of the image generators have attempted to put up guard rails to prevent generating pictures of nude children, but the creators/managers haven’t been able to eradicate it. There was also an investigation by Stanford University that showed that most of the really popular image generators had a not insignificant amount of CSAM in their training data and could be fairly easily manipulated into making more.

                The creators and managers of these generative “AIs” have done slim to nothing in the way of curation and have routinely tried to fob off responsibility onto their users, the same way Tesla has been doing with its “full self driving”.

    • RandomlyNice@lemmy.world (+41/-3) · 2 months ago

      Many years ago (about 25) I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). This article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.

      Here’s the problem with even AI-generated CP: it might lower abuse in the beginning, but with increased access it would ‘normalise’ the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

      This is all very complex. A solution isn’t simple. Shunning things won’t help though, and that seems to be the currently most popular way to deal with the issue.

      • Cryophilia@lemmy.world (+18) · 2 months ago

        “Normalized” violent media doesn’t seem to have increased the prevalence of real world violence.

        • Spacehooks@reddthat.com (+4) · 2 months ago

          I actually think video games reduce crime in general. Bad kids are now indoors getting their thrills.

      • Facebones@reddthat.com (+27/-2) · 2 months ago

        Actual pedophiles (a lot of CSA is abuse of power, not pedophilia; though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it’s as fucked up as everyone else does. Of course we can’t just say “sure, AI material is legal now”, but I could imagine a regulated system, accessed via doctors, akin to how controlled substances work.

        People take this firm “kill em all” stance, but these people just feel the way they do, the same as I do toward women or a gay man does toward men. It just is what it is; we all generally agree being gay isn’t a choice, and this is no different. As long as they don’t act on it, I think we should be sympathetic and be open to helping them live a less tortured life.

        I’m not 100% saying this is how we do it, but we should be open to exploring the issue instead of full-stop demonization.

        • fine_sandy_bottom@lemmy.federate.cc (+6) · 2 months ago

          I agree for the most part, particularly that we should be open minded.

          Obviously we don’t have much reliable data, which I think is critically important.

          The only thing I would add is that I’m not sure treating a desire for CSAM would be the same as treating substance abuse. “Weaning an addict off CSAM” seems like a strange proposition to me.

          • Facebones@reddthat.com (+3) · 2 months ago

            Maybe I was unclear, when I relate potential generated material to controlled substances, I mean in relation to how you obtain it.

            You go see a psych, probably go through some therapy or something, and if they feel it would be beneficial, you would be able to get material via strictly controlled avenues, like how you need a prescription for Xanax and it’s a crime to sell or share it.

            (And I imagine some sort of stamping, whether in the imagery or in the files, to trace any leaked material back to the person who shared it, but that’s a different conversation.)

        • HonorableScythe@lemm.ee (+12/-1) · 2 months ago

          Dan Savage coined the term “gold star pedophile” in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they’re going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.

          There’s a pretty good article by James Cantor talking about dealing with pedophiles in a therapeutic context here.

          Don’t get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He’s no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.

          • Liz@midwest.social (+11/-3) · 2 months ago

            We really gotta flip the standard and make therapy sessions 100% confidential. We should encourage people to seek help in stopping their bad behavior, no matter what it is, and they’re less likely to do that if they think a therapist could report them.

            • LustyArgonianMana@lemmy.world (+5/-2) · 2 months ago

              You’re asking therapists to live with that information. It’s not so easy to hear that a child is being actively raped and not be legally allowed to report it.

              We already lose tons of social workers in CPS because they can’t do much to help those kids or save them. Most normal adults can’t really mentally handle child torture without doing something about it. How many unreported child abuse cases before a therapist kills themselves?

              Let alone that you’re sentencing a child to live in a rape nightmare, something most adults can’t tolerate, all so their abuser can maybe get some help. Wonder how many kids will kill themselves. What the actual fuck. Here’s a hint: kids are slaves, so passing laws that disempower them even more is really fucked up.

              • Liz@midwest.social (+2) · 2 months ago

                I mean, how many children get abused because people are too afraid to seek help? It’s not an area with an easy answer, and I don’t have hard numbers on how much harm either scenario would produce.

                • LustyArgonianMana@lemmy.world (+1/-1) · 2 months ago

                  Well, if all the therapists kill themselves, then that system will be worse than the current one because no one will be getting help

    • xta@lemmy.world (+3/-3) · 2 months ago

      By the same metric, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.

    • Todd Bonzalez@lemm.ee (+3/-1) · 2 months ago

      Think of it this way: what if the government said one day, “All child porn made before this date is legal; all child porn made after this date is illegal”?

      You would end up with a huge corpus of “legal” child porn that pedophiles could use as a release, but you could become draconian about the manufacture of new child porn. This would, theoretically, discourage new child porn from being created, because the risk is too high compared to the legal stuff.

      Can you see the problem? That’s right, in this scenario, child porn is legal. That’s fucked up, and we shouldn’t do that, even if it is “simulated”, because fuck that.

    • JovialMicrobial@lemm.ee (+7) · 2 months ago

      I guess my question is does access to regular porn make people not want to have real sex with another person? Does it ‘scratch the itch’ so to speak? Could they go the rest of their life with only porn to satisfy them?

      It depends on the person. I feel like most people would be unsatisfied with only porn, but that’s just anecdotal.

      I honestly think AI-generated CSAM isn’t something the world needs to be producing. It’s not contributing to society in any meaningful way, and pedophiles who don’t offend or hurt children need therapy, while the ones who do need jail time (and therapy, but I’m in the US, so that’s a whole other thing). They don’t ‘need’ porn.

      My own personal take is that giving pedophiles csam that’s AI generated is like showing alcohol ads to alcoholics. Or going to the strip club if you’re a sex addict. It’s probably not going to lead to good outcomes.

    • Phoenixz@lemmy.ca (+1) · 2 months ago

      If from now on all child porn were created artificially instead of by abusing children, wouldn’t that maybe be a good thing?

      Not trying to defend anything here, but where there is a want in the world, there is a market; you can’t stop that. If artificial material means that even one less child is abused, I think it’s worth having a discussion at least.

  • spicystraw@lemmy.world (+25/-16) · 2 months ago

    I must admit, the number of comments defending AI images as not being child porn is truly shocking.

    In my book, sexual images of children are not okay, AI generated or otherwise. Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

    I truly do believe that AI images should be subject to same standards as regular images in what content we deem appropriate or not.

    Yes, this can be used to wrongfully prosecute innocent people, but it does not mean that we should freely allow AI-CP.

    • OutsizedWalrus@lemmy.world (+5/-4) · 2 months ago

      You’re not kidding.

      The only possible way I could see a defense if it were something like “AI CSAM results in a proven reduction of actual CSAM”.

      But. The defenses aren’t even that!

      They’re literally saying that CSAM is okay. I’m guessing a lot of these same comments would argue that deepfakes are okay as well. Just a completely fucked up perspective.

    • Landless2029@lemmy.world (+6/-6) · 2 months ago

      Can’t speak for others, but I agree that AI-CP should be illegal.

      The question is how we define the crime with our current laws. It does seem like we need a new law to address AI images, both for things like AI-CP, revenge porn, and slanderous/misleading photos. (The Communist Harris and Trump-with-black-people photos.)

      Where do we draw the line?
      How do we regulate it?
      Forced watermarks/labels on all tools?
      Jail time? Fines?
      Forced correction notices? (Doesn’t work for the news!)

      This is all a slippery slope, but what I can say is I hope this goes to court. He loses. Appeals. Then it goes all the way up to the federal level so we can have a standard to point to.

      The shit’s wrong.
      Step one in fixing shit.

    • WormFood@lemmy.world (+9/-13) · 2 months ago

      The number of people willing to go to bat for this on Lemmy is truly disturbing. What do they think these AI models are trained on?

      • ZeroHora@lemmy.ml (+12/-1) · 2 months ago

        Not necessarily trained on CP; it could be trained on images of children (already fucked up: who gave them that permission?) plus pornography.

        • kaffiene@lemmy.world (+5/-2) · 2 months ago

          The article pointed out that Stable Diffusion was trained using a dataset containing CSAM.

    • 31337@sh.itjust.works (+13) · 2 months ago

      I generally think if something is not causing harm to others, it shouldn’t be illegal. I don’t know if “generated” CSAM causes harm to others though. I looked it up and it appears the research on whether CSAM consumption increases the likelihood of a person committing child abuse is inconclusive.

    • tron@midwest.social (+14/-13) · 2 months ago

      Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

      I mean, 30-40 years ago you could replace the word “pedophile” with “homosexual” and a vast majority of people would agree. I’m not defending pedophilia here, but it’s important to remember these people are born the way they are. Nothing is going to change that; new pedophiles are born every day. They will never go away, the same way you can’t change gay or transgender people. Repressing sexual desire never works; look at the priests in the Catholic Church. A healthy outlet such as AI-generated porn could save a lot of actual children from harm. I think that should be looked into.

      • nickiwest@lemmy.world (+1/-2) · 2 months ago

        I would like to know what source you have for claiming that pedophiles are “born the way they are.”

        We understand some of the genetic and intrauterine developmental reasons for homosexuality, being trans, etc. That has scientific backing, and our understanding continues to grow and expand.

        Lumping child predators in with consenting adults smacks of the evangelical slippery slope argument against all forms of what they consider to be “sexual deviance.” I’m not buying it.

      • spicystraw@lemmy.world (+5/-1) · 2 months ago

        Look, I get what you are saying, and I do agree. However, I don’t think comparing pedophilic relations to LGBTQ struggles is fair. One is a consensual relationship between adults; the other is exploitation, with a high probability of setting someone up for lifelong mental struggles from a young age.

    • filcuk@lemmy.zip (+9/-3) · 2 months ago

      Agreed, especially considering it will eventually become indistinguishable.

  • Mubelotix@jlai.lu (+29/-10) · 2 months ago

    It’s not really children in these pics. We can’t condemn people for things that are not illegal yet.

    • Nuke_the_whales@lemmy.world (+10) · 2 months ago

      I’ve always wondered the same when an adult cop pretends to be a kid only to catch pedos. Couldn’t a lawyer argue that because there actually wasn’t a child, there wasn’t a crime?

    • Todd Bonzalez@lemm.ee (+4/-3) · 2 months ago

      It’s not really children on these pics.

      You are certain about this? If so, where are you getting that info, because it’s not in the article?

      Generative image models frequently are used for the “infill” capabilities, which is how nudifying apps work.

      If he was nudifying pictures of real kids, the nudity may be simulated, but the children are absolutely real, and should be considered victims of child pornography.

    • Microw@lemm.ee (+16/-1) · 2 months ago

      It’s Florida. They will simply book him and then present him with a deal for “only x years prison”, which he’ll take, preventing this from going to court and actually being ruled upon.

  • hightrix@lemmy.world (+17/-3) · 2 months ago

    He wasn’t arrested for creating it, but for distribution.

    If dude just made it and kept it privately, he’d be fine.

    I’m not defending child porn with this comment.

    • Todd Bonzalez@lemm.ee (+4/-17) · 2 months ago

      I’m not defending child porn with this comment.

      This is one of those cases where, even if you’re technically correct, you probably shouldn’t say out loud how you personally would get away with manufacturing child porn, because it puts people in the position of imagining you manufacturing child porn.