My apologies if it seems "nit-picky". Not my intent. Just that, to my brain, the difference in semantic meaning is very important.
In my thinking, that's exactly what asking "can an LLM achieve sentience?" is, so I can see the confusion. Because I am strict in classification, it is, to me, literally like asking "can the parahippocampal gyrus achieve sentience?" (probably not by itself - though our meat-computers show extraordinary plasticity... so, maybe?).
Precisely. And I suspect that it is very much related to the constrained context available to any language model. The world, and thought as we know it, is mostly not language. Not everyone's internal monologue is verbal/linguistic (some people don't have one at all, and mine tends to be more abstract outside of verbal contexts), so it follows that more than linguistic analysis is necessary.