To expand on this a bit: AI in medicine is getting super good at cancer screening in specific use cases.
People now heavily associate it with LLMs hallucinating and talking out of their ass, but forget how AI completely destroys people at chess. AI is already getting better than top physics-based models at weather prediction, hurricane path forecasting, protein folding, and a lot of other use cases.
Applied to specific, well-defined problems with a clear outcome, AI can become far more accurate than any human. It's not so much about removing humans as handing them tools that make medicine both more effective and more efficient at the same time.
The problem is using "AI" as a generic term for everything. Algorithms have been around for a while, and I'm pretty sure the AI cancer-detection systems are machine learning models that aren't related to LLMs at all.
Yeah, absolutely. I'm specifically talking about AI in the sense of neural networks, reinforcement learning, machine learning and whatnot. Top-of-the-line weather algorithms are now less accurate than neural networks.
LLMs as doctors are pretty garbage, since they're predicting words instead of classifying a photo into yes/no or detecting which stage of the sleep cycle a sleeping patient is in.
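For the curious, here's a rough sketch of what "classifying a photo into yes/no" looks like in code. It uses PyTorch with random noise standing in for a real scan; the model and numbers are made up for illustration, not any actual screening system:

```python
# Minimal sketch of binary image classification (assumes PyTorch is installed;
# the data is random noise standing in for real scans -- purely illustrative).
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1-channel "scan" -> 8 feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # pool down to one value per feature map
        )
        self.head = nn.Linear(16, 1)                      # single logit: "suspicious" vs "not"

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)                               # raw logit; sigmoid turns it into a probability

model = TinyClassifier()
fake_scan = torch.randn(1, 1, 64, 64)                     # stand-in for a 64x64 grayscale image
prob = torch.sigmoid(model(fake_scan)).item()
print(f"P(suspicious) = {prob:.2f}")
```

The point is the output: a single probability you can threshold into yes/no, rather than a stream of generated text.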
Fun fact: the closer you get to the actual math, the less magical the words become. Marketing says "AI", programmers say "machine learning" or "neural networks", mathematicians say "reinforcement learning".
I guess I worked with a guy who did algorithms and neural networks, so I sorta just equated them. I was very obviously not a CS major.
Maybe it was my CS major talking there. An algorithm is a sequence of steps to reach a desired outcome, such as updating a neural network. The network itself is essentially just a big heap of values you multiply through, if you were curious.
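If you want to see that concretely, here's a tiny NumPy sketch (the shapes and the single training step are made up for illustration): the "network" is literally a few arrays of numbers you multiply through, and the "algorithm" is the sequence of steps that nudges those numbers.

```python
# Minimal sketch of the "big heap of values you multiply through" idea, using NumPy.
# The sizes and the single update step are illustrative, not any particular model.
import numpy as np

rng = np.random.default_rng(0)

# The "network" is just arrays of numbers (weights and biases).
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)   # layer 1: 4 inputs -> 3 hidden units
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)   # layer 2: 3 hidden units -> 1 output

def forward(x):
    h = np.maximum(0, x @ W1 + b1)              # multiply through, apply a nonlinearity (ReLU)
    return h @ W2 + b2                          # multiply through again

# The "algorithm" is the sequence of steps that updates those values,
# e.g. one step of gradient descent on a squared error (gradient shown for W2 only).
x, target = rng.normal(size=4), np.array([1.0])
h = np.maximum(0, x @ W1 + b1)
pred = h @ W2 + b2
error = pred - target
grad_W2 = np.outer(h, 2 * error)                # d(error^2)/dW2
W2 -= 0.01 * grad_W2                            # nudge the heap of values a little
print("prediction before/after:", pred, forward(x))
```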