this post was submitted on 18 Jan 2024
203 points (93.6% liked)
Technology
But it should drive cars? Operate strike drones? Manage infrastructure like power grids and the water supply? Forecast tsunamis?
Too little too late, Sam. 
Yes on everything but drone strikes.
A computer would be better than humans in those scenarios. Especially driving cars, which humans are absolutely awful at.
deleted
Teslas aren't self driving cars.
deleted
Well, yes. Elon Musk is a liar. Teslas are by no means fully autonomous vehicles.
deleted
Here's the summary for the Wikipedia article you mentioned in your comment:
No true Scotsman, or appeal to purity, is an informal fallacy in which one attempts to protect their generalized statement from a falsifying counterexample by excluding the counterexample improperly. Rather than abandoning the falsified universal generalization or providing evidence that would disqualify the falsifying counterexample, a slightly modified generalization is constructed ad-hoc to definitionally exclude the undesirable specific case and similar counterexamples by appeal to rhetoric. This rhetoric takes the form of emotionally charged but nonsubstantive purity platitudes such as "true", "pure", "genuine", "authentic", "real", etc. Philosophy professor Bradley Dowden explains the fallacy as an "ad hoc rescue" of a refuted generalization attempt.
Drive cars? As advanced cruise control, yes. Strike drones? No, though in practice it changes little, since humans can bomb civilians just fine themselves. Infrastructure and tsunami forecasting? Yes and yes.
If we're not talking about LLMs, which are basically computer slop stitched together from books and websites pretending to be a brain, then using statistical tools to analyze a shitload of data (like optical, acoustic, and mechanical data to assist driving, or seismic data to forecast tsunamis) is a bit of a no-brainer.
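For illustration only (not something anyone in the thread posted): the kind of statistical analysis that comment is pointing at can be as simple as flagging readings that deviate sharply from the background. Here's a toy z-score detector over made-up seismic amplitudes; every name and number is hypothetical, and real warning systems fuse many sensor types with far more sophisticated models.

```python
import statistics

def flag_anomalies(readings, z_threshold=2.0):
    """Return indices of readings that deviate strongly from the mean.

    Toy z-score detector: with small samples a single outlier can't
    reach a very high z-score, so the threshold here is deliberately low.
    """
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return []
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > z_threshold]

# Mostly quiet background noise with one large spike at index 5.
samples = [0.1, 0.2, 0.1, 0.3, 0.2, 9.5, 0.1, 0.2]
print(flag_anomalies(samples))  # → [5]
```

The point being made in the comment holds either way: this sort of number-crunching over sensor streams is exactly what computers are good at, independent of anything LLM-shaped.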