this post was submitted on 14 Jul 2025
TechTakes
Here's an example of normal people using Bayes correctly (rationally assigning probabilities and acting on them) while rats Just Don't Get Why Normies Don't Freak Out:
(Dude then goes on to try to game-theorize this, I didn't bother to poke holes in it)
The thing is, genocides have happened, and people around the world are perfectly happy to advocate for them in diverse situations. Probability-wise, the risk of genocide somewhere is very close to 1, while the risk of "omnicide" is much closer to zero. If you want to advocate for eliminating something, working to eliminate the risk of genocide is much more rational than working to eliminate the risk of everyone dying.
At least one commenter gets it:
(source)
Edit: never read the comments (again). The commenter referenced above obviously didn't feel like a pithy one-liner adhered to the LW ethos, and instead added an addendum wondering why people were more upset about police brutality killing people than traffic fatalities. Nice "save", dipshit.
Hmm, should I be more worried and outraged about genocides that are happening at this very moment, or some imaginary scifi scenario dreamed up by people who really like drawing charts?
One of the ways the rationalists try to rebut this is through the idiotic dust specks argument. Deep down, they want to smuggle in the argument that their fanciful scenarios are actually far more important than real life issues, because what if their scenarios are just so bad that their weight overcomes the low probability that they occur?
(I don't know much philosophy, so I am curious about philosophical counterarguments to this. Mathematically, I can say that the more they add scifi nonsense to their scenarios, the more that reduces the probability that they occur.)
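The mathematical point above is just the conjunction rule: every extra assumption a scenario needs multiplies its probability by a factor of at most 1, so piling on details can only make the whole story less likely. A minimal sketch, with made-up step probabilities (only the direction of the effect matters):

```python
from functools import reduce

# Hypothetical probabilities for the steps an elaborate sci-fi
# scenario requires; the numbers are invented for illustration.
steps = [0.5, 0.4, 0.3, 0.2]

def joint_probability(probs):
    """P(all steps hold), assuming the steps are independent.

    Each additional conjunct multiplies by a factor <= 1, so the
    joint probability can only shrink as the scenario grows.
    """
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Every prefix of the scenario is at least as likely as the full story.
for i in range(1, len(steps) + 1):
    print(i, joint_probability(steps[:i]))
```

(If the steps are correlated rather than independent the arithmetic changes, but the joint probability still can't exceed that of any single step.)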
reverse dust specks: how many LWers would we need to permanently deprive of access to internet to see rationalist discourse dying out?
What’s your P(that question has been asked at a US three letter agency)
it either was, or wasn't, so 50%
You know, I hadn't actually connected the dots before, but the dust speck argument is basically yet another ostensibly-secular reformulation of Pascal's wager. Only instead of Heaven being infinitely good if you convert there's some infinitely bad thing that happens if you don't do whatever Eliezer asks of you.
Yes, this is why people think that. This is a normal thought to think others have.
Why do these guys all sound like Death Note, but stupid?
because they cribbed their ideas from Death Note, and they're stupid
Here's my unified theory of human psychology, based on the assumption most people believe in the Tooth Fairy and absolutely no other unstated bizarre and incorrect assumptions no siree!
I mean if you want to be exceedingly generous (I sadly have my moments), this is actually remarkably close to the "intentional acts" and "shit happens" distinction, in a perverse Rationalist way. ^^
That's fair, if you want to be generous; if you are not going to be, I'd say there are still conceptually large differences between the quote and "shit happens". But yes, you are right. If only they had listened to Scott when he said "talk less like robots".