this post was submitted on 13 Dec 2023
96 points (92.1% liked)

Technology

Mind-reading AI can translate brainwaves into written text: Using only a sensor-filled helmet combined with artificial intelligence, a team of scientists has announced they can turn a person’s thou...

A system that records the brain's electrical activity through the scalp can turn thoughts into words with help from a large language model – but the results are far from perfect

[email protected] 1 points 1 year ago

Autocomplete is not a lossy encoding of a database either; it's a product of a dataset, just as you are a product of your experiences, but it is not wholly representative of that dataset.

A wind tunnel is not intelligent, because it doesn't answer questions or process knowledge; it just produces data. A wind tunnel will not answer the question "is this aerodynamic?", but you can observe a wind tunnel and use your own intelligence to process what it shows and answer that question.

Temperature and randomness don't explain hallucinations; hallucinations are a product of inference. If you turned the temperature down to 0 and asked "what happened in the great Christmas fire of 1934?", the model would still give its best guess at what happened, even though that event isn't in its dataset and it can't look up the answer. A temperature of 0 just means that between runs it would consistently tell the same story, the most statistically probable one, as opposed to a less probable one that got pushed up by randomness.

Hallucination comes from taking a prompt at face value and then trying to explain it. People do this too: if you confidently tell someone a lie and then ask them about it, they will use their intelligence to rationalize a story about what happened.
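To make the temperature point concrete, here's a minimal sketch of how decoding typically works, using made-up logits for three imaginary candidate tokens (the function name and numbers are illustrative, not any real library's API). At temperature 0 the model always picks the highest-scoring token, so the output is the same every run; raising the temperature flattens the distribution and lets less probable tokens through. Note that nothing in this procedure checks whether the chosen token is true — which is the comment's point about hallucination.

```python
import math
import random

def sample(logits, temperature):
    """Pick a token index from raw model scores (logits).

    temperature == 0 -> greedy: always the highest-scoring token,
    so repeated runs give identical output.
    temperature > 0  -> softmax sampling; higher values flatten the
    distribution, so less probable tokens get picked more often.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale logits, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# Toy scores for three candidate next tokens (made-up numbers).
logits = [2.0, 1.0, 0.1]
print(sample(logits, 0))  # greedy: always index 0
```

Whether the prompt describes a real event or an invented one, this sampling step behaves identically: it always emits the most plausible continuation it can, which is why turning the temperature down makes the output deterministic but not more truthful.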