this post was submitted on 22 May 2024
304 points (100.0% liked)

News


Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good-faith argumentation only; accusing another user of being a bot or paid actor counts as bad faith. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (URL) that is as reliable and unbiased as possible, and must contain only one link.


Obviously right- or left-wing sources will be removed at the mods' discretion. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the title of the source article.


Posts whose titles don't match the source won't necessarily be removed, but the AutoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the AutoMod will leave a message. Please remove your post if the AutoMod is correct. If the matching post is very old, refer to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The AutoMod will contact you if a link shortener is detected; please delete your post if it is correct.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 2 years ago
[–] [email protected] 18 points 10 months ago (1 children)

I think that’s a bit of a stretch. If it was being marketed as “make your fantasy, no matter how illegal it is,” then yeah. But just because I use a tool someone else made doesn’t mean they should be held liable.

[–] [email protected] 2 points 10 months ago (2 children)

Check my other comments. My point was a comparison to a hammer.

Hammers aren't trained to act or respond on their own from millions of user inputs.

[–] [email protected] 10 points 10 months ago (1 children)

Image AIs also don't act or respond on their own. You have to prompt them.

[–] [email protected] 2 points 10 months ago (1 children)

And if I prompted the AI for something inappropriate, and it gave me a relevant image, then that means the AI had inappropriate material in its training data.

[–] [email protected] 13 points 10 months ago (2 children)

No, you keep repeating this but it remains untrue no matter how many times you say it. An image generator is able to create novel images that are not directly taken from its training data. That's the whole point of image AIs.

[–] [email protected] 3 points 10 months ago (1 children)

What it's able and intended to do is beside the point if it's also capable of generating inappropriate material.

Let me spell it out more clearly. AI wouldn't know what a pussy looked like if it were never exposed to that sort of data set. It wouldn't know other inappropriate things if it weren't exposed to that data set either.

Do you see where I'm going with this? AI only knows what people allow it to learn...

[–] [email protected] 10 points 10 months ago (1 children)

You realize that there are perfectly legal photographs of female genitals out there? I've heard it's actually a rather popular photography subject on the Internet.

> Do you see where I'm going with this? AI only knows what people allow it to learn...

Yes, but the point here is that the AI doesn't need to learn from any actually illegal images. You can train it on perfectly legal images of adults in pornographic situations, and also perfectly legal images of children in non-pornographic situations, and then when you ask it to generate child porn it has all the concepts it needs to generate novel images of child porn for you. The fact that it's capable of that does not in any way imply that the trainers fed it child porn in the training set, or had any intention of it being used in that specific way.

As others have analogized in this thread, if you murder someone with a hammer that doesn't make the people who manufactured the hammer guilty of anything. Hammers are perfectly legal. It's how you used it that is illegal.
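
To make that concept-composition point concrete, here's a minimal sketch using the Hugging Face diffusers library and a public Stable Diffusion checkpoint; neither is mentioned in this thread, and both are illustrative stand-ins for whatever model is at issue. The prompt combines concepts the model learned from separate images, so no single training image needs to contain the exact scene:

```python
# Rough sketch of concept composition in a text-to-image model, assuming
# the Hugging Face `diffusers` library and an illustrative public checkpoint
# (neither is named in the thread above).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint, not the model at issue
    torch_dtype=torch.float16,
).to("cuda")

# "Astronaut", "horse", and "moon" are learned as separate concepts from
# separate images; the sampler composes them into one novel scene at
# generation time, even if no training image ever showed all three together.
image = pipe("an astronaut riding a horse on the moon").images[0]
image.save("novel_composition.png")
```

The composition happens at sampling time, which is the comment's point: the combined scene doesn't have to exist anywhere in the training data.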

[–] [email protected] 2 points 10 months ago (1 children)

Yes, I get all that, duh. Did you read the original post title? CSAM?

I thought you could catch a clue when I said inappropriate.

[–] [email protected] 8 points 10 months ago (2 children)

Yes. You're saying that the AI trainers must have had CSAM in their training data in order to produce an AI that is able to generate CSAM. That's simply not the case.

You also implied earlier on that these AIs "act or respond on their own", which is also not true. They only generate images when prompted to by a user.

The fact that an AI is able to generate inappropriate material just means it's a versatile tool.

[–] [email protected] 2 points 10 months ago

Alright, well let's play an innocent hypothetical here.

Let's pretend all you know is some magic word model (which, by the way, doesn't exist without thousands or millions of images).

But anyways, let's say you're the AI. Now, with no vision of the world, what would you, as an AI, say if I asked you about how crescent wrenches and channel locks reproduced?

Now try the same hypothetical question again. This time, you actually have a genuine set of images of clean new tools, plus information that tools can't reproduce.

And now let's go to the modern day, where AI has zillions of images of rusty redneck toolboxes and a bunch of janky dialogue...

After all that, then where do crowbars come from?

AI is just as dumb as the people using it.

[–] [email protected] 1 points 10 months ago (1 children)
[–] [email protected] 5 points 10 months ago

3,226 suspected images out of 5.8 billion. About 0.00006%. And probably mislabeled to boot, or they would have been caught earlier. I doubt they had any significant impact on the model's capabilities.
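
For reference, that percentage checks out; here is a throwaway Python sanity check (the two numbers are the ones quoted above, nothing else comes from the thread):

```python
# Sanity check on the figure quoted above: 3,226 suspected images
# out of roughly 5.8 billion total training images.
suspected = 3_226
total = 5_800_000_000

share = suspected / total * 100
print(f"{share:.5f}%")  # prints 0.00006%, i.e. roughly 6 suspected images per 10 million
```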

[–] [email protected] 1 points 10 months ago

> An image generator is able to create novel images that are not directly taken from its training data. That's the whole point of image AIs.

I just want to clarify that you've bought the Silicon Valley hype for AI, but that is very much not the truth. It can create nothing novel; it can merely combine concepts and themes and styles in an incredibly complex manner... but it can never create anything novel.

[–] [email protected] 6 points 10 months ago (1 children)

I learned how to write by reading. The AI did the same, more or less, no?

[–] [email protected] 1 points 10 months ago (1 children)

The AI didn't learn to draw or generate photos from blind words though...

[–] [email protected] 7 points 10 months ago (1 children)

Oh, it learned from art? Like how human artists learn?

[–] [email protected] 1 points 10 months ago (1 children)

AI hasn't exactly kicked out a Picasso with a naked young girl missing an ear yet, has it?

I sure hope not!

But if it can, then that seriously indicates it must have some bad training data in the system...

I won't be testing these hypotheses.

[–] [email protected] 3 points 10 months ago (1 children)
[–] [email protected] 2 points 10 months ago

Thank you for posting a relevant link. It's disappointing that such data is part of any public AI system... ☹️