this post was submitted on 02 Jul 2025
383 points (100.0% liked)

Technology

Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

(page 2) 50 comments
[–] [email protected] 27 points 2 days ago (1 children)

Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
And men (pretend to) wonder why we distrust them.

Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.

[–] [email protected] 14 points 2 days ago

Yeah there’s some nasty shit here. Big yikes, Lemmy.

[–] [email protected] 16 points 2 days ago (6 children)

Welp, if I had kids, they would have one of those scramble suits like in A Scanner Darkly.

It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

[–] [email protected] 71 points 2 days ago (9 children)

Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

[–] [email protected] 51 points 2 days ago (6 children)

A school setting generally means underage individuals are involved, which makes any such content CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

[–] [email protected] 23 points 2 days ago (9 children)

Disagree. Not CSAM when no abuse has taken place.

That's my point.

[–] [email protected] 8 points 2 days ago* (last edited 2 days ago)

Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child's identity.

Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material... CSHAM, or maybe just CSAM, you know, to remember it more easily.

[–] [email protected] 21 points 2 days ago (1 children)

I think generating and sharing sexually explicit images of a person without their consent is abuse.

That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

[–] [email protected] 37 points 2 days ago* (last edited 2 days ago) (44 children)

Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives, with groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

[–] [email protected] 28 points 2 days ago* (last edited 2 days ago)

I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.

The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court, if we can even afford a lawyer to do so, and then be offered a judgement that probably won't be paid or won't cover the damage done by an image that can never be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.

[–] [email protected] 1 points 1 day ago (4 children)

anyone using any kind of AI either doesn't know how consent works, or they don't care about it.

a horrifying development in the intersection of technofascism and rape culture

[–] [email protected] 23 points 2 days ago (5 children)

So is this a way to take away rights by making it about kids?

I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were on the property. Schools can hold students accountable for conduct with other students.

The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.

The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.

[–] [email protected] 17 points 2 days ago (5 children)

My mama always told me that if someone makes a deepfake of you, then you make a deepfake of them right back!

[–] [email protected] 11 points 2 days ago (4 children)

I don't understand fully how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

[–] [email protected] 10 points 1 day ago (4 children)

I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

Step 1: Basically, you take a bunch of photos and videos of a specific person and blur their face out in every image.

Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.
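
To make step 2 concrete, here's a minimal sketch of that training loop in PyTorch. Everything in it is a placeholder made up for illustration: the tiny network, the pooling-based "blur", and the random stand-in tensors would be replaced by a much bigger model and real face crops in an actual tool, but the reconstruct-the-face objective is the same idea.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Unblur(nn.Module):
    """Tiny encoder-decoder stand-in; real deepfake models are far larger."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def blur(img):
    # Crude blur: downsample 8x, then upsample back to the original size.
    small = F.avg_pool2d(img, 8)
    return F.interpolate(small, size=img.shape[-2:], mode="bilinear",
                         align_corners=False)

# Stand-in data: in reality, these would be face crops of ONE person.
faces = torch.rand(64, 3, 64, 64)

model = Unblur()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Step 2: teach the network to turn blurred faces back into sharp ones.
for step in range(200):
    batch = faces[torch.randint(0, len(faces), (16,))]
    pred = model(blur(batch))      # try to reconstruct the sharp face
    loss = F.l1_loss(pred, batch)  # pixel-wise reconstruction error
    opt.zero_grad()
    loss.backward()
    opt.step()

# Step 3 then amounts to model(blur(someone_elses_photo)):
# the network "restores" the only face it ever learned.
```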
