An international operation against child sexual exploitation, supported by Europol and led by the State Criminal Police of Bavaria (Bayerisches Landeskriminalamt) and the Bavarian Central Office for the Prosecution of Cybercrime (ZCB), has resulted in the shutdown of Kidflix, one of the largest paedophile platforms in the world.
If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.
I don’t know about that.
I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.
When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.
Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.
I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.
I would consider all of these to be trawling dark waters.
…and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.
Correct. And even then, there’s vanishingly little CP on Lemmy.
File-sharing and online chat seem like basic internet activities to me.
This ain’t the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.
I’ve been able to explain it to people pretty easily as “like Discord, but without Discord administration getting to control what’s allowed, only whoever happens to run that particular server.”
not stumbled upon it but I’ve met a couple people offering it on mostly normal discord servers
most definitely not clean lmao, you’re just not actively searching for it, or stumbling onto it.
That’s…what I said.
Search “AI woman porn miniskirt,” and tell me you don’t see questionable results in the first 2 pages, of women who at least appear possibly younger than 18. Because AI-generated imagery is so saturated with this content, it has leaked into Google searches, with most porn categories now polluted by AI images whose source material could be anything.
Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.
There are very weird, very large movements regarding illegal content (whether you intentionally search it or not) and blackmail and that’s all I will point out for now
Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed vs a 100% fake image. I didn’t find any AI CP, but even if I did, it’s in an entirely different universe of morally bad.
That was, what, fifteen years ago? It’s why I said “in the last decade”.
“Clearly in their late teens,” lol no. And since AI doesn’t have age, it’s possible that was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.
Obviously there’s a difference between AI porn and real, that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses existing images, including CSAM, to seed its new images
That’s…not how AI image generation works? AI image generation isn’t building a collage from random images in a database - the model doesn’t contain a database of images at all - just statistical weightings and a net configuration that amount to a statistical model for classifying images, told to produce whatever input maximizes an output resembling the prompt, starting from a seed. It’s not “seeded with an image of a 15 year old”; it’s seeded with white noise and iteratively nudged toward something resembling (in this case) “woman porn miniskirt”, repeating until the resulting image is stable.
Unless you’re arguing that somewhere in the millions of images tagged “woman” being analyzed to build that statistical model is probably at least one person under 18, and that any image of “woman” generated by such a model is necessarily underage because the weightings were impacted however slightly by that image or images, in which case you could also argue that all drawn images of humans are underage because whoever drew it has probably seen a child at some point and therefore everything they draw is tainted by having been exposed to children ever.
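The loop being described can be sketched as a deliberately toy bit of Python (not a real diffusion model - the model’s prediction is faked as a fixed target here, and all names are made up - the point is just that the starting “seed” is noise, not anybody’s photo):

```python
import random

def toy_denoise(target, steps=50, seed=0):
    # Toy sketch of the iterative loop described above, NOT a real
    # diffusion model. Start from white noise (the actual "seed") and
    # repeatedly nudge each pixel toward what the model predicts. In a
    # real model that prediction comes from learned statistical weights;
    # here it is faked as a fixed target so the loop's shape is visible.
    rng = random.Random(seed)
    img = [rng.gauss(0.0, 1.0) for _ in target]  # pure noise, no source photo
    for _ in range(steps):
        img = [px + 0.2 * (t - px) for px, t in zip(img, target)]
    return img

target = [0.5] * 16        # stand-in for "whatever the prompt scores highly"
out = toy_denoise(target)
print(max(abs(a - b) for a, b in zip(out, target)) < 0.01)  # True: converged
```

The takeaway is that every run begins from random noise and converges toward whatever the learned weights reward, which is why “seeded with a specific person’s face” mischaracterizes the default process (deliberate face-matching exists, but it’s a separate technique layered on top).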
It’s fucking AI, the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.
We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.
It’s meant to LOOK LIKE a 14 yr old because it is SEEDED OFF 14 YR OLDS, so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE per the original commenter’s claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE. Now imagine who the CHILD MODELS are seeding from
You’re one of the people I am talking about when I say Lemmy has a lot of creepy pedos on it FYI to all the readers, look at their history
I’m offended by stupid-ass shit and feel compelled to call it out. Thinking AI-generated fake images is just as harmful as actual children getting abused is perhaps the most obvious definition of “stupid-ass shit” since the invention of stupid-ass shit.
Removed by mod
No one is out here making stupid-ass claims around child marriage.
Why are YOU so passionate about AI porn? Why do you people make this an issue at all when there are real actual kids out there being harmed?