[–] Kusimulkku@lemm.ee 41 points 1 month ago (3 children)

Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there are no actual children involved.

It mentions sexual extortion, and that's definitely something that should be illegal, same for spreading AI-generated explicit stuff about real people without their consent, whether it involves children or adults, but idk about the case mentioned here.

[–] HappySkullsplitter@lemmy.world 19 points 1 month ago

It's certainly creepy and disgusting

It also seems like we're half a step away from thought police regulating any thought or expression a person has that those in power do not like

Exactly. If there's no victim, there's no crime.

[–] Korhaka@sopuli.xyz 5 points 1 month ago* (last edited 1 month ago) (1 children)

It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don't count.

[–] Kusimulkku@lemm.ee 14 points 1 month ago (2 children)

It sounds like a very iffy thing to police. Since drawn stuff doesn't have actual age, how do you determine it? Looks? Wouldn't be great.

[–] JuxtaposedJaguar@lemmy.ml 12 points 1 month ago

Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.

[–] jacksilver@lemmy.world 3 points 1 month ago (1 children)

I mean, that's the same issue with AI-generated content. It's all trained on a wide range of real people, so how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.

Exactly. Any time there's subjectivity, it's ripe for abuse.

The law should punish:

  • creating images of actual underage people
  • creating images of actual non-consenting people of legal age
  • knowingly distributing one of the above

Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn't have any clearly identifiable victim.

Don't make laws to make prosecution easier, make laws to protect actual people from becoming victims or at least punish those who victimize others.

[–] BrianTheeBiscuiteer@lemmy.world 36 points 1 month ago (2 children)

On one hand I don't think this kind of thing can be consequence-free (from a practical standpoint). On the other hand... how old were the subjects? You can't look at a person to determine their age, and someone who looks like a child but is actually an adult wouldn't be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.

This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn't use underage subjects to train or influence the output.

I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

So could I, but that doesn't make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.

Creating a work about a fictitious individual shouldn't be illegal, regardless of how distasteful the work is.

[–] General_Effort@lemmy.world 2 points 1 month ago (4 children)

It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

17-year-olds who exchange nude selfies are engaging in child porn. I know there have been convictions in the US; I'm not sure about Europe. I know that teachers have been prosecuted when minors sought help because their selfies were being passed around in school: the students sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

[–] FauxLiving@lemmy.world 40 points 1 month ago (1 children)

There's not an epidemic of child porn.

There's an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to "fight child porn".

So you're going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.

"You can't possibly oppose these privacy destroying laws, after all you're not on the side of child porn are you?"

[–] turnip@sh.itjust.works 4 points 1 month ago (1 children)

Same with misinformation, where anything they disagree with, in good faith or not, is labeled misinformation.

[–] FauxLiving@lemmy.world 5 points 1 month ago

It's all part of 'manufacturing consent'.

There's plenty of material out in academia about it (as always, check your sources), if you want to get into the weeds.

[–] duisgur@sh.itjust.works 21 points 1 month ago

Legality is not the same as morality.

[–] BrianTheeBiscuiteer@lemmy.world 12 points 1 month ago (2 children)

It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn.

So a person who is 18 years old, depicted in the nude, is still a child pornographer if they don't look their age? This gives judges and prosecutors too much leeway, and I could guarantee there are right-wing judges who would charge a 25-year-old because it could be believed they were 17.

In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

Is it though? I don't know about the penalties in Germany but in the US a 17yo that takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I'm not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.

[–] barsoap@lemm.ee 4 points 1 month ago* (last edited 1 month ago)

That's a directive, not a regulation, and the directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in actually applicable law - which directives very much are not. Germany, for example, splits the whole thing into under 14 and 14-18.

We certainly don't arrest youth for sending each other nudes:

(4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.

...their own nudes, that is. Not that of classmates or whatnot.

[–] Allero@lemmy.today 36 points 1 month ago (4 children)

I'm afraid Europol is shooting themselves in the foot here.

What should be done instead is developing better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise they may just turn to the real materials, and as ongoing investigations suggest, there's no shortage of supply or demand on that front. If everything is equally illegal, and some outlet is needed anyway, it's easier to escalate, and that's dangerous.

As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

[–] turnip@sh.itjust.works 9 points 1 month ago (3 children)

You can download the models and run them yourself; banning this will be about as effective as the US government was at banning encryption.

[–] Fungah@lemmy.world 4 points 1 month ago (1 children)

I haven't read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I've been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.

[–] Allero@lemmy.today 19 points 1 month ago

As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms with any sort of evidence that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with such statements, but for the time being we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible - and we can make AI CSAM exactly that.

For now, people are being criminalized for a "crime" with zero evidence that it's even harmful, while I tend to look quite positively on what it could bring to the table instead.

Also, pedophiles are not human trash, and this line of thinking is also harmful, making more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.

They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct way is celibacy and refusal of any source of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.

[–] drmoose@lemmy.world 4 points 1 month ago (1 children)

This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who do you think would ever study this to help us know? Can you imagine the scientists who'd have to lead studies like this - an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.

[–] aidan@lemmy.world 3 points 1 month ago

IIRC there was actually a study, and pedophiles with access to synthetic CSAM were less likely to victimize real children.

[–] raptir@lemmy.zip 3 points 1 month ago (5 children)

What would stop someone from creating a tool that tagged real images as AI generated?

Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

[–] Xanza@lemm.ee 23 points 1 month ago (3 children)

I totally agree with these guys being arrested. I want to get that out of the way first.

But what crime did they commit? They didn't abuse children... the images are AI-generated and the children do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children, is that also a crime?

Does that open artists to the interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content is questionable, but it's not exactly predatory. And if you add safeguards for these people, could the predators not just claim artistic expression?

It just seems entirely unenforceable and an entire goddamn can of worms...

[–] Allero@lemmy.today 24 points 1 month ago (3 children)

I actually do not agree with them being arrested.

While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.

By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party not to go to a CSAM site and watch actual children getting abused, perpetuating the cycle and leading to more real-world victims.

It's strange to me that it is referred to as CSAM. No people are involved, so no one is being sexually assaulted. It's creepy, but calling it that implies to me that a drawing is a person.


Exactly, which is why I'm against your first line. I don't want them arrested, specifically because of artistic expression. I think they're absolutely disgusting and should stop, but they're not harming anyone, so they shouldn't go to jail.

In my opinion, you should only go to jail if there's an actual victim. Who exactly is the victim here?

[–] JuxtaposedJaguar@lemmy.ml 13 points 1 month ago (1 children)

Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because it would make limiting actual CP impossible.

As long as it's clearly fictional though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting, yet they shouldn't be illegal.

[–] DoPeopleLookHere@sh.itjust.works 3 points 1 month ago (2 children)

The only way to generate something like that is to teach it something like that from real images.

[–] ifItWasUpToMe@lemmy.ca 14 points 1 month ago (1 children)

I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

[–] DoPeopleLookHere@sh.itjust.works 2 points 1 month ago (2 children)

That's not how these image generators work.

How would it know what an age-appropriate penis looks like without, you know, seeing one?

[–] Allero@lemmy.today 9 points 1 month ago (21 children)

That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed any CSAM material, but it seems to be good enough for people to get off - which is exactly what matters.

[–] lime@feddit.nu 6 points 1 month ago* (last edited 1 month ago) (5 children)

no, it sort of is. considering style transfer models, you could probably just draw or 3d model unknown details and feed it that.

[–] sugar_in_your_tea@sh.itjust.works 12 points 1 month ago (8 children)

only way

That's just not true.

That said, there's a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability because there's a good chance they don't understand how they work, but the creators of the model should absolutely know where they're getting the source data from.

Prove that the models use illegal material and go after the model creators for that, because that's an actual crime. Don't go after people using the models who are providing alternatives to abusive material.

[–] badbytes@lemmy.world 7 points 1 month ago (2 children)

If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.

[–] Muscle_Meteor@discuss.tchncs.de 3 points 1 month ago

Followed swiftly by operation jizzberworld
