this post was submitted on 09 Jul 2025

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

top 6 comments
[–] [email protected] 6 points 2 days ago* (last edited 2 days ago) (1 children)

This definitely has some "Won't somebody PLEASE think of the children?" vibes to it. But the article says it's a "multifaceted" issue and gives some more details.

[Simpsons video snippet]

Please excuse my somewhat more nuanced opinion here, since this might not be the right community for it.

Eric Hartford wrote a good blog article on this very issue.

The main question is: do we want AI to be shaped opaquely by big corporations, and have them shape society and us however they like? Or do "the people" get to take part in this?

And the same argument can be applied to other tech as well: "Linux should be illegal because people can use it to hack computers and send spam." Or: the entire internet can be used as a tool for criminal activity, so what does that tell us about the internet? This in itself isn't very straightforward, in my opinion. It needs to be factored in and regulated. But it's not the same question as whether AI should be part of the world at all. And outlawing ordinary people from taking part while other entities still can comes with severe implications.

[–] [email protected] 4 points 2 days ago (1 children)

And the same argument can be applied to other tech as well: "Linux should be illegal because people can use it to hack computers and send spam." Or: the entire internet can be used as a tool for criminal activity, so what does that tell us about the internet? This in itself isn't very straightforward, in my opinion. It needs to be factored in and regulated. But it's not the same question as whether AI should be part of the world at all.

I was thinking something similar when reading the article. By the logic it proposes, the GIMP development team should be responsible for whatever images get mashed together using it.

[–] [email protected] 5 points 2 days ago* (last edited 2 days ago) (1 children)

Yes. I think it's kind of a non sequitur argument. We also don't ban the internet, knives, axes, or cars on the basis that some people misuse them. The argument either needs a different conclusion, one that addresses the misuse, or we need a different argument to prohibit something in general. As it stands, it's just a fallacy. And the obvious (false) conclusion (banning it for individuals without taking it away from companies as well) would be harmful to everyone. So out of all the possibilities for addressing the problem, please don't pick that one.

[–] [email protected] 3 points 2 days ago (1 children)

I'm going to disagree here. We do regulate some of those things: plenty of things are illegal on the internet, and many governments, including the US, will hold you accountable for hosting illegal content. Gun makers have been sued on multiple occasions over their unsafe practices. Car manufacturers are responsible for selling vehicles that meet minimum safety standards.

The creator of a tool should bear some level of responsibility for it, even if someone else uses their creation nefariously. I'm not saying Sam Altman should be sued for manslaughter when ChatGPT tells someone to kill themselves (there are plenty of other things I think he belongs in jail for, but I digress). I just think that being open source doesn't absolve you of the responsibility of putting up guard rails. Not every invention deserves to see the light of day.

[–] [email protected] 2 points 2 days ago* (last edited 2 days ago) (1 children)

Right. I'm not sure I even disagree with that. I completely agree the manufacturer is responsible for their products. And we definitely need guardrails and regulations in place. I'd go even further than current lawmakers and mandate watermarking etc. for AI services, and content filters for this specific case. I've already reported face-swapping services that violate the law (sadly, nothing ever came of it).

My main point is that it's very important to get this right. I've linked the best blog article I know on the subject in my first comment. Addressing it by getting rid of open source is (in my opinion) not going to help much, and it makes the overall situation with AI severely worse.

I can't come up with good analogies here. But I think the solution has to be to address the specific issues and make the tools safer, not turn them into a plaything for certain people only. That's likely to have the opposite effect. And it might not stop the criminals either, depending on how it's done.

I mean, a car manufacturer also shouldn't stop you from replacing the light bulbs or learning how a car works just because someone once used a car in a crime. And we don't remove the knives from my kitchen and replace them with pre-sliced food from the grocery store so that only big companies have knives available... Or outlaw personal websites and user-generated content so that only trusted companies can upload things to the internet. Or outlaw Linux so that Microsoft, Google, and Apple spyware is on all our devices. I just don't think it's the right means of addressing the issue... Sure, we could do it this way, but that mainly harms regular people even more.

But I really don't think these are opposites. We can still address the issues. (And we should!) It just has to be a sane approach that doesn't do the opposite of what it's trying to do.

Edit: I think what this does is let the robot apocalypse (if we ever get there) be shaped by Sam Altman and whatever he likes. We'd just be reinforcing Skynet (from Terminator). Plus, people are going to use it today, and its answers will carry the biases and stereotypes that Elon Musk etc. like. And they're going to change the world with their propaganda and perspective. As long as AI is disruptive and has an impact on society, that's really, really bad. And we're stripping everyone else of any capability to take part in it, to shape it differently, or to do research that contradicts their business motivations.

[–] [email protected] 2 points 2 days ago

I think we're both dancing around the same ideas. If ~~Pandora~~ OpenAI hadn't already opened the box and loosed the horrors upon us, this would be a different conversation. Open source models do take some of that grossly abused power away from the mega-corps, which is always a good move. However, the creators of those models need to be held to a higher standard than I think we hold most projects to online.