this post was submitted on 02 Jul 2025
381 points (100.0% liked)

Technology


Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] [email protected] 12 points 2 days ago (10 children)

This definitely will not add in any way to how women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can now quickly generate photorealistic images of.

Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but these are logically not the same.

In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.

If the person in the image is underage then it should be classified as child pornography.

No, it shouldn't be. It's literally a photorealistic drawing based on a photo (plus a dataset used to build the generative model). No children were abused to produce it. Laws work literally.

If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, the subject of some existing laws. There are no new fundamental legal entities involved.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.

There are already existing laws for such actions, the same ones that would apply to combining a photo of the victim with a pornographic photo using paper, scissors, pencils, and glue. Or, if you think the situation is radically different, new punishable crimes should be introduced.

Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.

[–] [email protected] 4 points 2 days ago (9 children)

Hey so, at least in the US, drawings can absolutely be considered CSAM

[–] [email protected] 3 points 1 day ago (8 children)

Well, US laws are all bullshit anyway, so makes sense

[–] [email protected] 2 points 1 day ago (1 children)

Normally yeah, but why would you want to draw sexual pictures of children?

[–] [email protected] 4 points 1 day ago (1 children)

Suppose I'm a teenager attracted to people my age. Or suppose I'm clinically a pedophile, which is not in itself a crime, and then I would need that.

In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".

[–] [email protected] 2 points 1 day ago (1 children)

I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?

[–] [email protected] 2 points 1 day ago (1 children)

Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.

It seems a very obvious thing: your nose doesn't belong there, and you shouldn't stick it there.

But no, I’m not gonna let you get away that easily.

I don't need to get away from you; you're nothing.

[–] [email protected] 1 points 1 day ago (1 children)

No. That's not a good enough excuse to potentially be abusing children.

I can't think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.

[–] [email protected] 2 points 1 day ago (1 children)

No. That’s not a good enough excuse to potentially be abusing children.

It's good enough for the person whose opinion counts, and yours doesn't. And there's no such potential.

I can’t think of a single good reason to draw those kinds of things. Like at all.

Too bad.

Please, give me a single good reason.

To reinforce that your opinion doesn't count is in itself a good reason. The best of them all really.

[–] [email protected] 1 points 1 day ago (1 children)

Okay, so you have no reason. Which is because having sexually explicit images of children, drawn or otherwise, is gross and weird and disturbing. And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.

Please don't respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.

[–] [email protected] 2 points 1 day ago

People don't need reasons to do things that are gross or disturbing or whatever to you in their own space.

And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.

Thankfully that's not your concern, and trying to do it yourself would land you in jail. Also I'm too lazy for my porn habits to be secret enough, LOL.

Please don’t respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.

I don't think you understand. You're the fiend here. The kind of obnoxious shit that thinks it's their right to police others' morality.

I wonder: if I tried to report you and someone actually followed through (unlikely, of course, without anything specific to report), what instances of stalking and privacy violations would they hypothetically find?

You really seem the kind.
