submitted 1 week ago by [email protected] to c/[email protected]

Personally seen this behavior a few times in real life, often with worrying implications. Generously, I'd like to believe these people use extruded text as a place to start thinking from, but in practice it seems to me that they tend to use extruded text as a thought-terminating behavior.

IRL, I find it kind of insulting, especially if I'm talking to people who should know better or if they hand me extruded stuff instead of work they were supposed to do.

Online it's just sort of harmless reply-guy stuff usually.

Many people simply straight-up believe LLMs to be genie-like figures, as they are advertised and written about in the "tech" rags. That bums me out, sort of in the same way really uncritical religiosity bums me out.

HBU?

[-] [email protected] 38 points 1 week ago

I respond in ways like we did when Wikipedia was new: “Show me a source.” … “No GPT is not a source. Ask it for its sources. Then send me the link.” … “No, Wikipedia is not a source, find the link used in that statement and send me its link.”

If you make people at least have to acknowledge that sources are a thing you’ll find the issues go away. (Because none of these assholes will talk to you anymore anyway. ;) )

[-] [email protected] 28 points 1 week ago

GPT will list fake sources. Just in case you aren't aware. Most of these things will.

[-] [email protected] 23 points 1 week ago

My municipality used hallucinated references to justify closing down schools.

[-] [email protected] 5 points 1 week ago
[-] [email protected] 6 points 1 week ago

It made up some references allegedly showing that larger schools were better for learning than smaller schools, among other things. But the main reason for the restructuring was to save money.

[-] [email protected] 2 points 1 week ago

It's government. Reason doesn't enter into it

[-] [email protected] 2 points 1 week ago

Hallois tjallabais!

[-] [email protected] 10 points 1 week ago

Tracing and verifying sources is standard academic writing procedure. While you definitely can’t trust anything an LLM spits out, you can use them to track down certain types of sources more quickly than search engines. On the other hand, I feel that’s more of an indictment of the late-stage enshittification of search engines, not some special strength of LLMs. If you have to use one, don’t trust it, demand supporting links and references, and verify absolutely everything.

[-] [email protected] 9 points 1 week ago

Yep. 100% aware. That’s one of my points - showing it's fake. Sometimes enlightening to some folks.

[-] [email protected] 8 points 1 week ago

indeed they literally cannot cite sources accurately by way of their function 😬

[-] [email protected] 9 points 1 week ago* (last edited 1 week ago)

I’ll still ask the person shoving AI slop in my face for a source or artist link, just to shame these pathetic attempts to pass along slop and misinformation.

Edit for clarity

[-] [email protected] 2 points 1 week ago

You can ask it for whatever you want; it will not provide sources.

[-] [email protected] 8 points 1 week ago* (last edited 1 week ago)

Ask the person shoving AI slop in my face for their source.

Not going to ask a racist pile of linear algebra for a fake source.

this post was submitted on 15 Jul 2025
157 points (100.0% liked)
