this post was submitted on 21 May 2024
533 points (100.0% liked)

Technology

[–] [email protected] 175 points 1 year ago (21 children)

Mhm, I have mixed feelings about this. I know this entire thing is fucked up, but isn't it better to have generated stuff than actual stuff that involved actual children?

[–] [email protected] 116 points 1 year ago (2 children)

A problem that I see getting brought up is that AI-generated images make it harder to notice photos of actual victims, making it harder to locate and save them.

[–] [email protected] 16 points 1 year ago (3 children)

And doesn't the AI learn from real images?

[–] [email protected] 23 points 1 year ago

It does learn from real images, but it doesn't need real images of what it's generating to produce related content.
As in, a network trained with no exposure to children is unlikely to be able to easily produce quality depictions of children. Without training on nudity, it's unlikely to produce good results there as well.
However, if it knows both concepts it can combine them readily enough, similar to how you know the concept of "bicycle" and that of "Neptune" and can readily enough imagine "Neptune riding an old-fashioned bicycle around the sun while flaunting its top hat".

Under the hood, this type of AI is effectively a very sophisticated "error correction" system. It changes pixels in the image to try to "fix it" to match the prompt, usually starting from a smear of random colors (static noise).
That's how it's able to combine different concepts from a wide range of images to create things it's never seen.
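A toy sketch of that loop, purely illustrative: this is not a real diffusion model, and the "denoiser" here is faked as a pull toward a fixed target array, whereas a real model uses a neural network that predicts the error from the prompt. It just shows the shape of the idea, start from noise and repeatedly remove a fraction of the predicted error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: "target" plays the role of "an image matching the prompt",
# and "image" starts as pure static noise.
target = rng.uniform(0.0, 1.0, size=(8, 8))
image = rng.normal(0.5, 1.0, size=(8, 8))

for step in range(50):
    # A real model *predicts* this error from the prompt; here we cheat.
    predicted_error = image - target
    # Remove a fraction of the predicted error each step.
    image = image - 0.2 * predicted_error

# After many small corrections, the noise has been "fixed" into the target.
print(np.abs(image - target).max())
```

The point is only that generation is iterative correction of noise, which is why the model can blend concepts it has seen separately into an image it has never seen.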

[–] [email protected] 7 points 1 year ago

Basically, if I want to create ... (I'll use a different example for obvious reasons, but I'm sure you could apply it to the topic)

... "an image of a miniature denim jet with Taylor Swift's face on the side of it", the AI generators can do it despite no such thing existing in the training data. It may take multiple attempts and effort with the text prompt to get exactly what you're looking for, but you could eventually get a convincing image.

AI takes loads of preexisting data on airplanes, T.Swift, and denim and combines it all into something new.

[–] [email protected] 6 points 1 year ago

True, but by their very nature their generations tend to create anonymous identities, and the sheer amount of them would make it harder for investigators to detect pictures of real, human victims (which can also include indicators of crime location).

[–] [email protected] 6 points 1 year ago (1 children)

Well that, and the idea of cathartic relief is increasingly being dispelled. Behaviour once thought to act as a pressure relief for harmful impulsive behaviour is more than likely just a pattern of escalation.

[–] [email protected] 13 points 1 year ago (1 children)

Source? From what I’ve heard, recent studies are showing the opposite.

[–] [email protected] 6 points 1 year ago (12 children)

Catharsis theory predicts that venting anger should get rid of it and should therefore reduce subsequent aggression. The present findings, as well as previous findings, directly contradict catharsis theory (e.g., Bushman et al., 1999; Geen & Quanty, 1977). For reducing anger and aggression, the worst possible advice to give people is to tell them to imagine their provocateur’s face on a pillow or punching bag as they wallop it, yet this is precisely what many pop psychologists advise people to do. If followed, such advice will only make people angrier and more aggressive.

Source

But there are a lot more studies that have essentially said the same thing. The cathartic hypothesis is mainly a byproduct of the Freudian era of psychology, when a hypothesis mainly just had to sound good to someone on too much cocaine.

Do you have a source of studies showing the opposite?

[–] [email protected] 12 points 1 year ago (5 children)

your source is exclusively about aggressive behavior...

it uses the term "arousal", which is not referring to sexual arousal, but rather a state of heightened agitation.

provide an actual source in support of your claim, or stop spreading misinformation.

[–] [email protected] 90 points 1 year ago (3 children)

The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out it muddies the water for CP of real children, but it also potentially would allow pedophiles easier ways to network in the open (if the images are legal they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

As a society we should never allow the normalization of sexualizing children.

[–] [email protected] 42 points 1 year ago (2 children)

Interesting. What do you think about drawn images? Is there a limit to how good the artist can be at drawing/painting? Stick figures vs. lifelike paintings. Interesting line to consider.

[–] [email protected] 32 points 1 year ago (1 children)

If it was photoreal and difficult to distinguish from real photos? Yes, it's exactly the same.

And even if it's not photo real, communities that form around drawn child porn are toxic and dangerous as well. Sexualizing children is something I am 100% against.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

It feels like driving these people into the dark corners of the internet is worse than allowing them to collect in clearnet spaces where drawn csam is allowed.

[–] [email protected] 31 points 1 year ago (2 children)

networking between abusers absolutely emboldens them and results in more abuse.

Is this proven or a common sense claim you’re making?

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

I wouldn't be surprised if it's a mixture of the two. It's kind of like if you surround yourself with criminals regularly, you're more likely to become one yourself. Not to say it's a 100% given, just more probable.

[–] [email protected] 20 points 1 year ago (1 children)

So... it's just a claim they're making and you're hoping it has actual backing.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

I'm not hoping anything, haha wtf? The comment above me asked if it was a proven statement or common sense and I said I wouldn't be surprised if it's both. I felt confident that if I googled it, there would more than likely be studies backing up a common sense statement like that, as I've read in the past how sending innocent people or people who committed minor misdemeanors to prison has influenced them negatively to commit crimes they might not have otherwise.

And look at that, there are academic articles that do back it up:

https://www.waldenu.edu/online-bachelors-programs/bs-in-criminal-justice/resource/what-influences-criminal-behavior

Negative Social Environment

Who we’re around can influence who we are. Just being in a high-crime neighborhood can increase our chances of turning to crime ourselves. But being in the presence of criminals is not the only way our environment can affect our behaviors. Research reveals that simply living in poverty increases our likelihood of being incarcerated. When we’re having trouble making ends meet, we’re under intense stress and more likely to resort to crime.

https://www.law.ac.uk/resources/blog/is-prison-effective/

Time in prison can actually make someone more likely to commit crime — by further exposing them to all sorts of criminal elements.

Etc, etc.

Turns out that your dominant social group and environment influences your behavior, what a shocking statement.

[–] [email protected] 4 points 1 year ago (1 children)

But you didn't say you had proof with your comment, you said it was probable. Basically saying it's common sense that it's proven.

Why are you getting aggressive about actually having to provide proof about something when saying its obvious?

Also, that seems to imply that locking up people for AI offenses would then encourage truly reprehensible behavior by linking them with those who already engage in it.

Almost like lumping people together as one big group, instead of having levels of grey area, means people are more likely to just go all in instead of sticking to something more morally defensible.

[–] [email protected] 8 points 1 year ago (1 children)

Because it's a casual discussion, I think it's obnoxious when people constantly demand sources to be cited in online comments section when they could easily look it up themselves. This isn't some academic or formal setting.

And I disagree, only the second source mentioned prisons explicitly. The first source mentions social environments as well. So it's a damned if you do, damned if you don't situation. Additionally, even if you consider the second source, that source mentions punishment reforms to prevent that undesirable side effect from occurring.

I find it ironic that you criticized me for not citing sources and then didn't read the sources. But, whatever. Typical social media comments section moment.

[–] [email protected] 7 points 1 year ago (1 children)

I think it's obnoxious when people constantly demand sources to be cited in online comments section when they could easily look it up themselves.

People request sources because people state their opinions as fact. If that’s how it’s presented, then asking for a source is OK. It’s either ask for a source or completely dismiss the comment.

[–] [email protected] 4 points 1 year ago

Again, in casual conversation where no one was really debating, it's obnoxious. When you're talking to friends in real life and they say something, do you request sources from them? No, because it'd be rude and annoying. If you were debating them in earnest and you both disagreed on something, sure, that would be expected.

But that wasn't the case here, the initial statement was common sense: If pedophiles are allowed to meet up and trade AI generated child sex abuse material, would that cause some of them to be more likely to commit crimes against real kids? And I think the answer is pretty obvious. The more you hang around people who agree with you, the more an echo chamber is cultivated. It's like an alcoholic going into a bar without anyone there to support them in staying sober.

Anyway, it's your opinion to think asking for sources from strangers in casual conversation is okay, and it's mine to say it can be annoying in a lot of circumstances. We all have the Internet at our fingertips, look it up in the future if you're unsure of someone's assertion.

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago) (2 children)

Actually, that's not quite as clear.

The conventional wisdom used to be, (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists decided to look into that. Slowly, over time, they've become more and more convinced that (normal) porn availability in fact reduces sexual assault.

I don't see an obvious reason why it should be different in case of CP, now that it can be generated.

[–] [email protected] 36 points 1 year ago (2 children)

Did we memory hole the whole ‘known CSAM in training data’ thing that happened a while back? When you’re vacuuming up the internet you’re going to wind up with the nasty stuff, too. Even if it’s not a pixel by pixel match of the photo it was trained on, there’s a non-zero chance that what it’s generating is based off actual CSAM. Which is really just laundering CSAM.

[–] [email protected] 33 points 1 year ago (1 children)

IIRC it was something like a fraction of a fraction of 1% that was CSAM, with the researchers identifying the images through their hashes but they weren't actually available in the dataset because they had already been removed from the internet.

Still, you could make AI CSAM even if you were 100% sure that none of the training images included it since that's what these models are made for - being able to combine concepts without needing to have seen them before. If you hold the AI's hand enough with prompt engineering, textual inversion and img2img you can get it to generate pretty much anything. That's the power and danger of these things.

[–] [email protected] 12 points 1 year ago (1 children)
[–] [email protected] 8 points 1 year ago

Fair but depressing, it seems like it barely registered in the news cycle.

[–] [email protected] 28 points 1 year ago (10 children)

Yeah, it’s very similar to the “is loli porn unethical” debate. No victim, it could supposedly help reduce actual CSAM consumption, etc… But it’s icky so many people still think it should be illegal.

There are two big differences between AI and loli though. The first is that AI would supposedly be trained with CSAM to be able to generate it. An artist can create loli porn without actually using CSAM references. The second difference is that AI is much much easier for the layman to create. It doesn’t take years of practice to be able to create passable porn. Anyone with a decent GPU can spin up a local instance, and be generating within a few hours.

In my mind, the former difference is much more impactful than the latter. AI becoming easier to access is likely inevitable, so combatting it now is likely only delaying the inevitable. But if that AI is trained on CSAM, it is inherently unethical to use.

Whether that makes the porn generated by it unethical by extension is still difficult to decide though, because if artists hate AI, then CSAM producers likely do too. Artists are worried AI will put them out of business, but then couldn’t the same be said about CSAM producers? If AI has the potential to run CSAM producers out of business, then it would be a net positive in the long term, even if the images being created in the short term are unethical.

[–] [email protected] 24 points 1 year ago (3 children)

Just a point of clarity, an AI model capable of generating csam doesn't necessarily have to be trained on csam.

[–] [email protected] 21 points 1 year ago (4 children)

You know what's better? Having none of this shit.

[–] [email protected] 18 points 1 year ago

Did you just fix mental health?

[–] [email protected] 14 points 1 year ago

Yeah as I also said.

[–] [email protected] 7 points 1 year ago

Better for whom and why?

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

Nirvana fallacy

Yeah, would be nice. Unfortunately it isn't so, and it's never going to be. Chasing after people generating distasteful AI pictures is not making the world a better place.

[–] [email protected] 14 points 1 year ago (2 children)

I have trouble with this because it's like 90% grey area. Is it a pic of a real child but inpainted to be nude? Was it a real pic but the face was altered as well? Was it completely generated but from a model trained on CSAM? Is the perceived age of the subject near to adulthood? What if the styling makes it only near realistic (like very high quality CG)?

I agree with what the FBI did here mainly because there could be real pictures among the fake ones. However, I feel like the first successful prosecution of this kind of stuff will be a purely moral judgement of whether or not the material "feels" wrong, and that's no way to handle criminal misdeeds.

[–] [email protected] 15 points 1 year ago

If it's not trained on CSAM or inpainted, but fully generated, I can't really think of any other real legal arguments against it except for: "this could be real". Which has real merit, but in my eyes not enough to prosecute as if it were real. Real CSAM has very different victims and abuse, so it needs different sentencing.

[–] [email protected] 9 points 1 year ago (2 children)

It feeds and evolves a disorder which in turn increases risks of real life abuse.

But if AI generated content is to be considered illegal, so should all fictional content.

[–] [email protected] 30 points 1 year ago* (last edited 1 year ago) (1 children)

Or, more likely, it feeds and satisfies a disorder which in turn decreases risk of real life abuse.

Making it illegal so far helped nothing, just like with drugs

[–] [email protected] 6 points 1 year ago (2 children)

Two things:

  1. Do we know it fuels the urge to get at real children? Or do we just assume that through repetition, like the myth of "gateway drugs"?
  2. Since no child was involved and harmed in the making of these images... On what grounds could it be forbidden to generate them?
[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

An alternative perspective: does watching normal porn make heterosexual men more likely to rape women? If not, then why would it be different in this case?

The vast majority of pedophiles never offend. Most people in jail for child abuse are just plain old rapists with no special interest towards minors; children are just an easy target. Pedophilia just describes what they're attracted to. It's not a synonym for child rapist. It usually needs to coincide with psychopathy to create the monster that most people think about when hearing that word.

[–] [email protected] 9 points 1 year ago

Apparently he sent some to an actual minor.

[–] [email protected] 3 points 1 year ago

It reminds me of the story of the young man who realized he had an attraction to underage children and didn't want to act on it, yet there were no agencies or organizations to help him, and that it was only after crimes were committed that anyone could get help.

I see this fake cp as only a positive for those people. That it might make it difficult to find real offenders is a terrible reason against.

[–] [email protected] 3 points 1 year ago

Better only means less worse in this case, I guess
