this post was submitted on 23 Mar 2025
996 points (100.0% liked)
memes
you are viewing a single comment's thread
AP is supposed to disable itself if a fault or abnormality is detected. Pretty much all advanced cruise control systems do this.
I don’t think it’s fair to say the car was hiding evidence of AP being used unless it was intentionally logging the data in a shady way. We’d need to see the logs of the car, and there are some roundabout ways for a consumer to pull those. That would probably be an interesting test for someone on YouTube to run.
These systems disable right before a crash because NHTSA, the US national traffic safety regulator, requires manufacturers to report whether these systems were engaged during an accident.
It is not for safety or because of a malfunction, it's for marketing. Car companies don't want the features they sell for $3,000–8,000 coming up all the time in crash statistics.
Tesla is the biggest offender here, likely due to the number of vehicles sold, but also due to their camera-only system and their aggressively misleading "Full Self-Driving" and "Autopilot" marketing that far over-promises.
Just saying I’d like to see some more data. I get that Musk is not someone who should be trusted, especially when it comes to complying with regulators.
That said, I could see that system being disengaged by some intended safety triggers.
At the very least the system should initiate emergency braking when it disengages like that and there is no conflicting human input.
100% agree. My stupid Volvo does that, and it doesn’t have lidar or a million cameras around it.