Phanatik

joined 2 years ago
[–] [email protected] 3 points 11 months ago

Are you really asking why someone would buy a game on Steam that they never play?

[–] [email protected] 36 points 11 months ago

Even the Wayback Machine has limits to what is available.

[–] [email protected] 4 points 11 months ago (1 children)

What you're alluding to is the Turing test, and it hasn't been proven that any LLM would pass it. At this moment, there are people who have failed the inverse Turing test, being unable to ascertain whether what they're speaking to is a machine or a human. Fooling a human can be done, and has been done, by things far less complex than LLMs, so it isn't proof of an LLM's capabilities over more rudimentary chatbots.

You're also suggesting that I'm minimising the complexity of its outputs. My determination is that what we're getting is the limit of what it can achieve. You'd have to prove that any allusion to higher intelligence can't be attributed to coercion by the user, or to the model simply hallucinating an imitation of artificial intelligence as portrayed in media.

There are elements of the model that are very fascinating, like how it organises language into these contextual buckets, but this is still a predictive model. Understanding that certain words appear near each other in certain contexts is hardly intelligence; it's a sophisticated machine learning algorithm.
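The "contextual buckets" idea can be sketched with plain co-occurrence counts. This is a toy illustration only (the corpus and window size are invented, and real models learn dense embeddings from billions of tokens rather than raw counts):

```python
from collections import Counter, defaultdict

# Toy corpus; real models train on vastly larger text.
corpus = "the cat sat on the mat the dog sat on the rug".split()

window = 2  # how many neighbours on each side count as "context"
cooc = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            cooc[word][corpus[j]] += 1

# "cat" and "dog" end up with overlapping context counts ("sat", "on",
# "the"), which is the statistical sense in which such models group
# words into the same bucket.
print(cooc["cat"])
print(cooc["dog"])
```

Words that share contexts get similar count vectors; no comprehension is involved, just tallying.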

[–] [email protected] 4 points 11 months ago (3 children)

I mainly disagree with the final statement, on the basis that LLMs are just more advanced predictive text algorithms. The way they've been set up, with a chat box where you're interacting directly with something that attempts human-like responses, gives the misconception that the thing you're talking to is more intelligent than it actually is. It gives a strong appearance of intelligence, but at the end of the day it predicts the next word in a sentence based on what was said previously, and it doesn't do that good a job of comprehending what exactly it's telling you. It's very confident when it gives responses, which also means that when it's wrong, it's very confidently delivering the incorrect response.

[–] [email protected] 1 points 1 year ago

Tbf it's a compounding issue. It breaks Linux support because Vanguard demands kernel-level access, which Linux will never give it.

[–] [email protected] 8 points 1 year ago (3 children)

And why I stopped playing

[–] [email protected] 8 points 1 year ago (1 children)

Ah, gotcha, so he just never leaves Israel, or no one ever acts on the arrest warrant.

[–] [email protected] 7 points 1 year ago (3 children)

So... who is going to be marching into Israel to make the arrest?

[–] [email protected] 2 points 1 year ago

The age rating is for who can use the app, not how long it's been up.

[–] [email protected] 8 points 1 year ago (4 children)

Well, it's because he's an old fuck already so his heinous crimes result in him spending the rest of his worthless life in prison. If he's lucky, he'll die before he reaches 100.

[–] [email protected] 1 points 1 year ago (1 children)

I always mix up roguelike and roguelite so thank you for explaining.

I don't know how they plan to manage the Hideout if they want to have this open world.

[–] [email protected] 33 points 1 year ago (7 children)

It amazes me that people don't notice this is the exact same formula for roguelikes.

 
 