diz

joined 2 years ago
[–] [email protected] 12 points 2 days ago

That's why I say "sack of shit" and not, say, "bastard".

[–] [email protected] 13 points 2 days ago* (last edited 2 days ago) (11 children)

The funny thing is, even though I wouldn't expect it to be, it is still a lot more arithmetically sound than whatever it is that's going on when it claims to use a code interpreter and a calculator to double-check the result.

It is OK at being a calculator (7 out of 12 correct digits), and it is awesome at being a lying sack of shit.

[–] [email protected] 7 points 2 days ago

Incels then: Zuckerberg creates a hot-or-not clone with stolen student data, gets away with it, becomes a billionaire.

Incels now: chatgpt, what's her BMI.

[–] [email protected] 6 points 2 days ago

I think I figured it out.

He fed his post to AI and asked it to list the fictional universes he’d want to live in, and that’s how he got Dune. Precisely the information he needed, just as his post describes.

[–] [email protected] 8 points 2 days ago* (last edited 2 days ago)

I am also presuming this is about purely non-fiction technical books

He has Dune on his list of worlds to live in, though...

edit: I know. he fed his post to AI and asked it to list the fictional universes he'd want to live in, and that's how he got Dune. Precisely the information he needed.

[–] [email protected] 6 points 2 days ago* (last edited 2 days ago)

Naturally, that system broke down (via capitalists grabbing the expensive fusion power plants for their own purposes)

This is kind of what I have to give to Niven. The guy is a libertarian, but he would follow his story all the way into such results. And his series where organs are harvested for minor crimes? It completely flew over my head that he was trying to criticize taxes, and not, say, Republican tough-on-crime policies, mass incarceration, and for-profit prisons. Because he followed the logic of the story, it aligned naturally with its real-life counterpart, the for-profit prison system, even if he wanted to make some completely insane anti-tax argument where taxing rich people is like harvesting organs or something.

On the other hand, the much better regarded Heinlein, also a libertarian, would write up a moon base that exports organic carbon and where you have to pay for the oxygen you convert to CO2, just because he wanted a story inside of which "having to pay for air to breathe" works fine.

[–] [email protected] 12 points 3 days ago (5 children)

Maybe he didn't read Dune; he just had AI summarize it.

[–] [email protected] 2 points 4 days ago* (last edited 4 days ago)

Yolo charging mode on a phone: disable the battery overheating sensor and the current limiter.

I suspect that they added yolo mode because without it this thing is too useless.

[–] [email protected] 3 points 4 days ago* (last edited 4 days ago)

There is an implicit claim in the red button that it was worth including.

It is like Google’s AI overviews. There cannot be a sufficient disclaimer, because the overview sitting at the top of Google search implies a level of usefulness which it does not meet, not even in the “evil plan to make more money briefly” way.

Edit: my analogy to AI disclaimers is using “this device uses nuclei known to the state of California to…” in place of “drop and run”.

[–] [email protected] 3 points 4 days ago

Jesus Christ on a stick, that's some thrice-cursed shit.

Maybe susceptibility runs in families, culturally. Religion does, for one thing.

[–] [email protected] 6 points 4 days ago* (last edited 4 days ago) (2 children)

I think this may also be a specific low-level exploit, whereby humans are already biased to mentally "model" anything as having agency (see all the sentient gods that humans invented for natural phenomena).

I was talking to an AI booster (ewww) in another place, and I think they really are predominantly laymen with their brains fried by this shit. That particular one posted a convo where, out of 4 arithmetic operations, 2 were of the "12042342 can be written as 120423 + 19, and 43542341 as 435423 + 18" variety, combined with AI word salad, and he expected that to be convincing.
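
(For anyone curious, here's a throwaway Python check of those two claimed decompositions, using just the numbers quoted above; the script is mine, not something from that convo:)

```python
# Quick check of the two "decompositions" quoted above.
claims = [
    (12042342, 120423, 19),
    (43542341, 435423, 18),
]

for target, a, b in claims:
    result = a + b
    verdict = "ok" if result == target else "wrong"
    print(f"{a} + {b} = {result}, claimed to equal {target}: {verdict}")
```

Both come out off by roughly a factor of a hundred, which is kind of the point.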

It's not that this particular person thinks it's a genius; he thinks that it is not a mere computer, and the way it is completely shit at math only serves to prove to him that it is not a mere computer.

edit: And of course they care not for any mechanistic explanations, because all of those imply LLMs are not sentient, and they believe LLMs are sentient. The "this isn't it, but one day some very different system will be" counterargument doesn't help either.

[–] [email protected] 6 points 4 days ago

Yeah, I think it is almost undeniable that chatbots trigger some low-level brain thing. Eliza has a 27% Turing test pass rate. And long before that, humans attributed weather and random events to sentient gods.

This makes me think of Langford’s original BLIT short story.

And also of the rove beetles that parasitize ant colonies. These bugs are not ants, but they pass the Turing test for ants: they tap antennae with an ant, the handshake comes out right, and they are identified as ants from this colony rather than unrelated bugs or ants from another colony.
