They’re unprofitable by themselves, and adding extra processing to minimize hallucinations or to add “reasoning” only increases the compute spent per query, which makes the economics worse, not better.
LLMs are a dead end as a business because they cost 2-3x as much to run as they generate in revenue. That is unsustainable for any company in the long run.
We need only look at the evidence to come to this conclusion. There is no need to redefine any terminology.
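To put rough numbers on the 2-3x claim above (a back-of-envelope sketch; the ratio is the comment’s assumption, not measured data, and the revenue figure is arbitrary):

```python
# Back-of-envelope: if serving costs run 2-3x revenue, what's the margin?
# The 2-3x cost-to-revenue ratio is the assumption from the comment above;
# all figures are illustrative.

def operating_margin(revenue: float, cost_ratio: float) -> float:
    """Margin as a fraction of revenue, given cost = cost_ratio * revenue."""
    cost = cost_ratio * revenue
    return (revenue - cost) / revenue

for ratio in (2.0, 3.0):
    print(f"cost = {ratio:.0f}x revenue -> margin = {operating_margin(100.0, ratio):+.0%}")
# cost = 2x revenue -> margin = -100%
# cost = 3x revenue -> margin = -200%
```

In other words, at that ratio every dollar of revenue implies one to two dollars of loss, so scaling up usage only scales up the losses.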