• _cryptagion [he/him]@lemmy.dbzer0.com · ↑8 · 20 hours ago

      It says “this hidden site”, meaning it was a site on the dark web. It probably took them a while to figure out where the site was located so they could shut it down.

      • Schadrach@lemmy.sdf.org · ↑2 · 7 hours ago

        it says “this hidden site”, meaning it was a site on the dark web.

        Not just on the dark web (which technically is anything not indexed by search engines) - hidden sites are specifically a Tor thing (though Freenet/Hyphanet has something similar under a different name). A Tor hidden site usually has a URL that ends in .onion, and the Tor protocol has a structure for routing .onion addresses.
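
As a concrete illustration of that .onion structure: current (v3) hidden-service addresses are 56 base32 characters followed by the .onion suffix. A minimal structural check might look like the sketch below (illustrative only - Tor itself additionally verifies a checksum and version byte embedded in the address):

```python
import re

# v3 onion addresses are 56 characters from the base32 alphabet (a-z, 2-7),
# followed by the ".onion" suffix. This is a rough shape check only; Tor
# also validates an embedded checksum and version byte.
ONION_V3 = re.compile(r"[a-z2-7]{56}\.onion")

def looks_like_onion(host: str) -> bool:
    """Return True if host is shaped like a v3 Tor hidden-service address."""
    return ONION_V3.fullmatch(host.lower()) is not None
```

For example, `looks_like_onion("a" * 56 + ".onion")` matches the v3 shape, while an ordinary clearnet hostname does not.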

    • danny161@discuss.tchncs.de · ↑36 · ↓8 · 2 days ago

      That’s unfortunately (not really sure) probably the fault of Germany’s approach to this. It usually doesn’t take these websites down but tries to find the people behind them and arrest them. The argument is: they would just use a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deleting is very effective. The German approach also completely ignores the victims’ side. They have to deal with old men masturbating to them getting raped online. Very disturbing…

      • Schadrach@lemmy.sdf.org · ↑1 · 7 hours ago

        They have to deal with old men masturbating to them getting raped online.

        The moment it was posted to wherever, they were going to have to deal with that forever. It’s not like they can ever know for certain that every copy ever made has been deleted.

      • Dr. Moose@lemmy.world · ↑4 · 19 hours ago

        I used to work in netsec and unfortunately government still sucks at hiring security experts everywhere.

        That being said, hiring here is extremely hard - you need to find someone with below-market salary expectations willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more, and I’m sure every European citizen wouldn’t mind a 0.1% tax increase for a more effective investigation force.

        • lennivelkant@discuss.tchncs.de · ↑1 · 5 hours ago

          Most cases of “we can’t find anyone good for this job” can be solved with better pay. Make your opening more attractive, then you’ll get more applicants and can afford to be picky.

          Getting the money is a different question, unless you’re willing to touch the sacred corporate profits…

        • Geetnerd@lemmy.world · ↑2 · 17 hours ago

          Discovery of this kind of thing is as old as civilization.

          Someone runs their mouth, or you catch someone with incriminating evidence on them. Then you lean on them to tell you where to go.

        • Ledericas@lemm.ee · ↑1 · 16 hours ago

          They probably make double or triple in the private sector; I doubt the government can match that salary. Facebook probably paid even more, before they started using AI to sniff out CP.

          • Dr. Moose@lemmy.world · ↑1 · 16 hours ago

            I’m a senior dev and tbh I’d take a lower salary given the right cause, though having to work with this sort of material is probably the main bottleneck here. I can’t imagine how people working this can even fall asleep.

      • recall519@lemm.ee · ↑13 · 1 day ago

        This feels like one of those things where couch critics aren’t qualified. There’s a pretty strong history of three letter agencies using this strategy successfully in other organized crime industries.

        • Geetnerd@lemmy.world · ↑2 · 17 hours ago

          Like I stated earlier, someone was caught red-handed, and snitched to get a lesser sentence.

      • taladar@sh.itjust.works · ↑9 · 1 day ago

        Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off - but it is certainly not an easy choice to make.

        • Geetnerd@lemmy.world · ↑2 · edited · 16 hours ago

          Well, some pedophiles have argued that AI-generated child porn should be allowed, so that no real humans are harmed or exploited.

          I’m conflicted on that. Naturally, I’m disgusted, and repulsed. I AM NOT ADVOCATING IT.

          But if no real child is harmed…

          I don’t want to think about it, anymore.

          • misteloct@lemmy.world · ↑1 · 14 hours ago

            Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.

          • Ledericas@lemm.ee · ↑4 · 16 hours ago

            That is still CP, and distributing CP still harms children; eventually they want to move on to the real thing, as porn no longer satisfies them.

          • ZILtoid1991@lemmy.world · ↑6 · 14 hours ago

            Issue is, AI is often trained on real children, sometimes even real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.

            Also since AI can generate photorealistic imagery, it also muddies the water for the real thing.

          • quack@lemmy.zip · ↑2 · 6 hours ago

            Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.

            • 🎨 Elaine Cortez 🇨🇦 @lemm.ee · ↑2 · 6 hours ago

              I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but it’s pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.

        • yetAnotherUser@discuss.tchncs.de · ↑5 · edited · 1 day ago

          It doesn’t though.

          The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.

          If everything posted to these forums is deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted than letting it stay up for years to catch a handful of site admins.

          Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.

          Also, these sites don’t produce CSAM themselves. They just spread it - most of the CSAM exists already and isn’t made specifically for distribution.

          • taladar@sh.itjust.works · ↑3 · ↓1 · 21 hours ago

            Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content - the ones doing the part that is orders of magnitude worse than anything else about this.

            • yetAnotherUser@discuss.tchncs.de · ↑3 · 18 hours ago

              I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads, because that’s magnitudes easier.

              Of the 1400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t get my hopes up that many of those caught ever committed child abuse themselves.

              I mean, 1400 identified out of 1.8 million really isn’t a whole lot to begin with.

                • yetAnotherUser@discuss.tchncs.de · ↑1 · 9 hours ago

                  Not quite. Reuploading is at the very least an annoying process.

                  Uploading anything over Tor is a gruelling process. Downloading already takes much time, and uploading even more so. Most consumer internet plans aren’t symmetric either, with significantly lower upload than download speeds. Plus, you need to find a direct-download provider which doesn’t block Tor exit nodes and where uploading/downloading is free.

                  Taking something down is quick. A script scraping these forums which automatically reports the download links (any direct-download site quickly removes reports of CSAM by the way - no one wants to host this legal nightmare) can take down thousands of uploads per day.

                  Making the experience horrible leads to a slow death of those sites. Imagine if 95% of videos on [generic legal porn site] lead to a “Sorry! This content has been taken down.” message. How much traffic would the site lose? I’d argue quite a lot.
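
The scrape-and-report workflow described above could be sketched roughly as below. Note this is a hypothetical illustration (`group_links_by_host` and the example URLs are made up for the sketch); actually submitting a report would go through each host's abuse form or an established hotline such as the IWF's or NCMEC's:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_links_by_host(links):
    """Group scraped direct-download links by hosting domain, so that
    abuse reports can be filed with each provider in one batch rather
    than one report per link. (Sketch only - report submission itself
    is provider-specific and not shown here.)"""
    by_host = defaultdict(list)
    for link in links:
        host = urlparse(link).netloc.lower()
        if host:  # skip malformed links with no recognizable host
            by_host[host].append(link)
    return dict(by_host)
```

Grouping by host first means each provider receives one consolidated report covering all offending links it is hosting, which matches the comment's point that hosts remove reported material quickly once notified.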

      • TheProtagonist@lemmy.world · ↑23 · edited · 1 day ago

        I think you are mixing up two different aspects of this and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites and to put them in jail.

        In this specific case they were observing this platform (together with prosecutors from other countries, in an orchestrated operation) to gather as much data as possible about its structure, the payment flows, the admins and the users before moving into action and getting them arrested. Meanwhile, the site was taken down.

        If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.

    • Eugene V. Debs' Ghost@lemmy.dbzer0.com · ↑17 · 18 hours ago

      “See, we caught these guys without doing it - think of how many more we can catch if we do! Like all the terrorists America has caught by violating their privacy. …Maybe some day they will.”

    • Ronno@feddit.nl · ↑10 · 14 hours ago

      Basically the only reason I read the article was to find out whether they needed a “backdoor” in encryption. Guess they don’t need one, like everyone with a little bit of IT knowledge always told them.

  • OsrsNeedsF2P@lemmy.ml · ↑117 · ↓1 · 2 days ago

    On average, around 3.5 new videos were uploaded to the platform every hour, many of which were previously unknown to law enforcement.

    Absolutely sick and vile. I hope they honeypotted the site and that the arrests keep coming.

  • TJC@lemmy.world · ↑5 · 6 hours ago

    Maybe Jeff Bezos will write an article about him and editorialize about “personal liberty”. I have to keep posting this because every day another MAGA lover - religious bigot or otherwise pretend-upstanding community member - is indicted or arrested for heinous acts against women and children.

  • GnuLinuxDude@lemmy.ml · ↑12 · ↓25 · 1 day ago

    Every now and again I am reminded of my sentiment that the introduction of “media” onto the Internet is a net harm. Maybe 256 dithered color photos like you’d see in Encarta 95 and that’s the maximum extent of what should be allowed. There’s just so much abuse from this kind of shit… despicable.

    • Blackmist@feddit.uk · ↑28 · 1 day ago

      Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13-year-old right up to the 1800s, and in some places you still can.

      As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They’d get it in magazines from countries where it was still legal.

      I suspect it’s far less prevalent now than it’s ever been. It’s now pretty much universally seen as unacceptable, which is a good start.

      • mic_check_one_two@lemmy.dbzer0.com · ↑17 · 1 day ago

        The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s… It ended up being used as evidence against Eva’s mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only 5 or 6 years old, under the guise of “art”. It wasn’t until the Playboy shoot that authorities started digging into the mother’s portfolio.

        But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.

    • adhdplantdev@lemm.ee · ↑5 · 1 day ago

      I think it just shows all the hideousness of humanity in all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown out of its barbaric ways.

    • TankovayaDiviziya@lemmy.world · ↑7 · 1 day ago

      It is easy to feel very disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. It is also important to remember that technology is not inherently bad; it is a neutral tool that people can use for either good or bad purposes.

  • Doctor_Satan@lemm.ee · ↑23 · 7 hours ago

    Goddam what an obvious fucking name. If you wrote a procedural cop show where the child traffickers ran a site called KidFlix, you’d be laughed out of the building for being so on-the-nose.

  • j0ester@lemmy.world · ↑67 · 20 hours ago

    They also seized 72,000 illegal videos from the site and personal information of its users, resulting in arrests of 1,400 suspects around the world.

    Wow

  • clearedtoland@lemmy.world · ↑65 · ↓1 · 2 days ago

    With everything going on right now, the fact that I still feel physically sick reading things like this tells me I haven’t gone completely numb yet. Just absolutely repulsive.

    • unphazed@lemmy.world · ↑17 · ↓2 · 2 days ago

      I know I’m not heartless yet because I am still traumatized by the brick in the window video…

        • FauxLiving@lemmy.world · ↑13 · ↓1 · 2 days ago

          To sanitize the traumatic video as much as possible: A man is driving under an overpass and a brick is dropped through the passenger side window instantly killing his wife. He reacts in horror.

        • HappyFrog@lemmy.blahaj.zone · ↑4 · 1 day ago

          They’re probably referencing the video where a woman was killed after a brick flew through the windshield. I haven’t watched it, but it is on YouTube and I’ve heard that the husband’s cries are not so nice.

          I don’t remember if it was kids throwing bricks off of a bridge or if it was something else.

      • BumpingFuglies@lemmy.zip · ↑8 · ↓1 · 2 days ago

        I’ll probably regret asking, but I’m out of the loop and insatiably curious.

        Brick in the window video?

        • Miles O'Brien@startrek.website · ↑21 · ↓2 · 2 days ago

          If it’s what I’m thinking of, camera footage of a vehicle interior.

          Driving down the highway, going under an overpass when a brick gets tossed by some kids and goes through the window.

          Passenger hit, husband is driving and screams.

          You know that scream they mention in The Princess Bride? That “only someone experiencing ultimate suffering” can make?

          If you know, you know.

          • aviationeast@lemmy.world · ↑16 · ↓1 · 2 days ago

            I’ve never seen that video but I can hear that scream from the husband. That’s some fucked up shit.

          • BumpingFuglies@lemmy.zip · ↑14 · ↓1 · 2 days ago

            Oh no. I remember that video now. I didn’t need to remember that video. Why did I have to ask?!

          • Liz@midwest.social · ↑14 · edited · 2 days ago

            It’s not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.

            Edit: oh hurray, there’s two different brick videos.

            • Miles O'Brien@startrek.website · ↑3 · 1 day ago

              Well, I know what other video I’m never watching.

              And people wonder why I don’t like being around any vehicle that carries things…

    • T156@lemmy.world · ↑10 · 14 hours ago

      Or a Netflix for children/video editing app for primary schoolers in the early 2000s/late 1900s.

  • The_Caretaker@lemm.ee · ↑11 · 4 hours ago

    The name of it sounds like a streaming service for children’s movies and TV shows. Like, Netflix for kids. In the past 5 years I have seen at least 3 deepweb social communities that started out normally, with a lot of people talking shit and enjoying anonymous free speech. Then I log in a couple weeks or months later to find CP being posted and no mods doing anything to stop it. In all those cases, I reported the site to the FBI anonymously and erased my login from my password manager.

  • taladar@sh.itjust.works · ↑164 · ↓5 · 2 days ago

    Does it feel odd to anyone else that a platform for something this universally condemned in any jurisdiction can operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system and nearly six figure number of videos? I mean even if we assume that some of those 4 years were intentional to allow law enforcement to catch as many perpetrators as possible this feels too similar to fully legal operations in scope.

    • x00z@lemmy.world · ↑59 · ↓6 · 2 days ago

      It’s a side effect of privacy and security. The one side effect they’re trying to use to undermine all of the privacy and security.

      • TheProtagonist@lemmy.world · ↑13 · ↓1 · 1 day ago

        This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their “businesses” from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.

        In the real world, criminals (e.g. drug dealers) also use cars, so you could argue that drug trafficking is a side effect of people having cars…

        • x00z@lemmy.world · ↑4 · ↓4 · 1 day ago

          This platform used Tor. And because we want to protect privacy, they can make use of it.

          • sleen@lemmy.zip · ↑8 · 1 day ago

            This particular platform used Tor, but that doesn’t mean all platforms use privacy-centric anonymous networks. There are incidents of people using Kik, Snapchat, Facebook and other clearnet services for criminal activity such as drugs or CP.

        • Cethin@lemmy.zip · ↑14 · 1 day ago

          Well, it does have to do with privacy and security; it just doesn’t matter to them whether it’s legal or not. These people (in the US) always make a point that criminals will buy guns whether it’s legal or not, but then they’ll argue they need to destroy privacy because criminals are using it. It doesn’t make sense, but it doesn’t need to, because honesty and consistency aren’t important to them.

    • swelter_spark@reddthat.com · ↑7 · ↓1 · 22 hours ago

      It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they’re doing. I’ve often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.

      • Cryophilia@lemmy.world · ↑6 · ↓5 · 21 hours ago

        If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.

        • Ledericas@lemm.ee · ↑1 · ↓1 · 16 hours ago

          Most definitely not clean, lmao. You’re just not actively searching for it, or stumbling onto it.

        • swelter_spark@reddthat.com · ↑6 · 19 hours ago

          I don’t know about that.

          I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.

          When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.

          Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.

          I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.

          • Cryophilia@lemmy.world · ↑3 · ↓1 · 15 hours ago

            typical file-sharing networks

            Tox messaging network

            Matrix channels

            I would consider all of these to be trawling dark waters.

              • Cryophilia@lemmy.world · ↑1 · ↓1 · 3 hours ago

                This ain’t the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.

            • Schadrach@lemmy.sdf.org · ↑3 · 6 hours ago

              …and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.

        • veeloth@lemm.ee · ↑3 · 18 hours ago

          not stumbled upon it but I’ve met a couple people offering it on mostly normal discord servers

        • LustyArgonian@lemmy.world · ↑1 · ↓1 · edited · 17 hours ago

          Search “AI woman porn miniskirt,” and tell me you don’t see questionable results in the first 2 pages, of women who at least appear possibly younger than 18. Because AI is so heavily corrupted with this content en masse, this has leaked over to Google searches in most porn categories being corrupted with AI seeds that can be anything.

          Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.

          There are very weird, very large movements regarding illegal content (whether you intentionally search it or not) and blackmail and that’s all I will point out for now

          • Cryophilia@lemmy.world · ↑1 · ↓1 · 15 hours ago

            Search “AI woman porn miniskirt,”

            Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point but I think we should acknowledge the difference between a picture of a real child actively being harmed vs a 100% fake image. I didn’t find any AI CP, but even if I did, it’s in an entire different universe of morally bad.

            r/jailbait

            That was, what, fifteen years ago? It’s why I said “in the last decade”.

            • LustyArgonian@lemmy.world · ↑1 · ↓1 · edited · 5 hours ago

              “Clearly in their late teens,” lol, no. And since AI doesn’t have an age, it’s possible that was seeded with the face of a 15-year-old and that they really are 15 for all intents and purposes.

              Obviously there’s a difference between AI porn and the real thing; that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses images to seed its new images, including CSAM.

              • Cryophilia@lemmy.world · ↑2 · ↓2 · 3 hours ago

                It’s fucking AI, the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.

                • LustyArgonian@lemmy.world · ↑1 · ↓2 · edited · 1 hour ago

                  We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.

                  It’s meant to LOOK LIKE a 14-year-old because it is SEEDED OFF 14-YEAR-OLDS, so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE, per the original commenter’s claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE? Now imagine who the CHILD MODELS are seeded from.

                  You’re one of the people I am talking about when I say Lemmy has a lot of creepy pedos on it FYI to all the readers, look at their history

      • Ledericas@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        ·
        edit-2
        16 hours ago

        It can hide in plain sight, and then when you dig into someone’s profile, it can lead to someone or a group discussing CSAM and bestiality, not just CP - like a site similar to r/pics, or a porn site. Yeah, sometimes you stumble into a site like that, but it seems to happen when people search for porn outside of Pornhub and its affiliate sites. Remember, PH sanitized their site because of this. Last decade there was an article about an obscure site that was taken down; it had Reddit-like porn subs, etc. People were complaining about the CSAM, and nothing was done about it. It was eventually taken down for legal reasons not related to the CSAM.

        • swelter_spark@reddthat.com
          link
          fedilink
          English
          arrow-up
          1
          ·
          3 hours ago

          I can definitely see how people could find it while looking for porn. I don’t understand how people can do this stuff out in the open with no consequences.

        • Rob1992@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          16 hours ago

          Pick any country where child marriage is legal and where women are objects the man owns

        • surewhynotlem@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          4
          ·
          1 day ago

          Context is important I guess. So two things.

          Is something illegal if it’s not prosecuted?

          Is it CSA if the kid is 9 but that’s marrying age in that country?

          If you answer yes, then no, then we’ll not agree on this topic.

          • taladar@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            7
            ·
            1 day ago

            I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more widespread bans on videos that include sex of any kind.

            As for prosecution: yes, it is still illegal even if it is not prosecuted. There are many reasons not to prosecute something, ranging from resource constraints to intentionally turning a blind eye, and only a small minority of them would lead a country to actively sabotage a major international investigation, especially after the trade-offs are considered (such as the loss of international reputation from refusing to cooperate).

    • lumony@lemmings.world
      link
      fedilink
      English
      arrow-up
      10
      ·
      19 hours ago

      It would feel odd, but you have to remember we live in a world where Epstein was allowed to get away with what he did until the little people found out.

      • Geetnerd@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        16 hours ago

        Epstein was very smart, and figured out early on there were many, many rich pedophiles.

        So, he got buddy buddy with them, supplied young girls to them.

        BUT, he filmed the encounters in secret, and blackmailed the shit out of these people.

        He was smart enough to become obscenely rich on Wall Street legitimately, but he liked to bang little girls, found others who did too, and then extorted them.

        There’s an anecdote about how when Epstein was holding court with other Aristos, they would bring up any random subject, to get his opinion.

        What would he say? “What does that have to do with pussy?”

        Many, many people have verified that. But because he was filthy rich, everyone just laughed and blew it off.

        Epstein was murdered. I’m not a conspiracy nut; it’s just blatantly obvious. The 2 guards on duty admitted to fucking off (bribed) and were acquitted.

        https://www.nbcnews.com/news/us-news/case-dropped-jail-guards-duty-night-epstein-died-rcna10557

        • Ledericas@lemm.ee
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          16 hours ago

          Trump was his most frequent guest, and Trump had his goons do everything in their power to get rid of the evidence while Epstein was still alive. A lot of politicians from different countries are part of it, as are Hollywood execs; Weinstein was probably the most infamous one.

          • Geetnerd@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            16 hours ago

            This kind of thing is rampant. And it’s not just little girls.

            Heard from Bryan Singer lately? Corey Feldman’s story about him and Corey Haim confirms it.

    • deegeese@sopuli.xyz
      link
      fedilink
      English
      arrow-up
      105
      arrow-down
      2
      ·
      2 days ago

      Illegal business can operate online for a long time if they have good OpSec. Anonymous payment systems are much easier these days because of cryptocurrencies.

    • sleen@lemmy.zip
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      1
      ·
      edit-2
      1 day ago

      With the number of sites that are easily accessed on the dark net through the Hidden Wiki and similar directories, this might have been a honeypot from the start.

      On the contrary, why would they announce that they seized the site? To cause more panic, and to exaggerate the actual situation?

      In addition, that last point should be considered because even if they ran these types of operations, honeypotting would still be illegal. So ultimately, what is stopping the supreme power from abusing that power against other people?

      • quack@lemmy.zip
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        6 hours ago

        No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material - meaning children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, which undoubtedly protected far more children than it harmed, and even that was considered too far by many.

        • sleen@lemmy.zip
          link
          fedilink
          English
          arrow-up
          1
          ·
          4 hours ago

          “That would be an unspeakable atrocity,” yet there is a contradiction in your final sentence. The issue is: what evidence is there to prove such an operation actually works? As my last point implied, what stops the government from abusing this sort of operation? With “covert” operations like this, the outcome can be catastrophic for everyone.

    • prole@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      30
      arrow-down
      3
      ·
      1 day ago

      with a catchy name clearly thought up by a marketing person

      A marketing person? They took “Netflix” and changed the first three letters lol

      • imetators@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        15 hours ago

        Exactly! There is a plethora of *flix sites out there, including adult ones. It doesn’t take much marketing skill to name a site like this.

  • Gaxsun@lemmy.zip
    link
    fedilink
    English
    arrow-up
    7
    arrow-down
    1
    ·
    21 hours ago

    If that’s the actual splash screen that pops up when you try to access it (no, I’m not going to go to it and check, I don’t want to be on a new and exciting list) then kudos to the person who put that together. Shit goes hard. So do all the agency logos.

    • quack@lemmy.zip
      link
      fedilink
      English
      arrow-up
      5
      ·
      7 hours ago

      Feds have been stepping up their seized website banner game lately. The one for Genesis Market was pretty cool too.

  • PerogiBoi@lemmy.ca
    link
    fedilink
    English
    arrow-up
    117
    arrow-down
    1
    ·
    2 days ago

    Fuck man. I used to use a program called “Kidpix” when I was a kid. It was like ms paint but with fun effects and sounds.

  • quack@lemmy.zip
    link
    fedilink
    English
    arrow-up
    1
    ·
    7 hours ago

    Excellent work. That’s an unimaginable amount of abuse material.