this post was submitted on 26 Dec 2024
117 points (100.0% liked)


Thanks to @[email protected] for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[–] [email protected] 60 points 3 months ago (8 children)

Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.

This piece is garbage.

[–] [email protected] 22 points 3 months ago* (last edited 3 months ago) (2 children)

Speaking, which is conveying thought, also far exceeds 10 bits per second.

There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower information density (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages, at roughly 39 bits per second.
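
As a purely illustrative sketch of that trade-off (the numbers below are invented for the example, not taken from the 2019 study): a denser language spoken more slowly and a lighter one spoken faster can end up at a similar information rate.

```python
# Purely illustrative numbers (not from the 2019 study): information rate
# = (bits per syllable) x (syllables per second).
languages = {
    "denser, slower":  {"bits_per_syllable": 7.0, "syllables_per_second": 5.5},
    "lighter, faster": {"bits_per_syllable": 5.0, "syllables_per_second": 7.8},
}

for name, lang in languages.items():
    rate = lang["bits_per_syllable"] * lang["syllables_per_second"]
    print(f"{name}: {rate:.1f} bits/s")  # both land near ~38-39 bits/s
```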

Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are being communicated more slowly than 39 per second. I'm curious what the underlying Caltech paper linked above says about language processing, since the press release describes deriving the 10 bits/s figure from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik's cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

EDIT: I read the preprint, available here. It purports to measure the externally observable output of human behavior. That's an important limitation: it's not trying to measure the internal richness of unobserved thought.

So it analyzes people performing external tasks, including typing and speech, with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to about 13 bits/s.
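
A quick sketch of that arithmetic (the 5 bits/word entropy is the paper's assumption; the speeds are just the examples above):

```python
# Convert words per minute to bits per second, assuming ~5 bits of entropy
# per English word (the paper's assumed figure).
BITS_PER_WORD = 5

def wpm_to_bits_per_second(wpm: float) -> float:
    return wpm * BITS_PER_WORD / 60

print(wpm_to_bits_per_second(120))  # typing at 120 wpm  -> 10.0 bits/s
print(wpm_to_bits_per_second(160))  # speaking at 160 wpm -> ~13.3 bits/s
```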

The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik's cube solving, memory contests).

It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings roughly 10^9 bits/s of sensory input down by about eight orders of magnitude to ~10 bits/s. If the reduction turns out to be closer to seven and a half orders of magnitude, that doesn't really change the result.

There's also a whole section addressing criticisms of the 10 bits/s number. It argues that claims of photographic memory tend to break down once you account for the longer periods of study involved (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings drawn from 1000 architectural styles works out to about 4 bits/s of memorization). It also argues that the human brain tends to trick itself into perceiving much higher complexity than it is actually processing (known as "subjective inflation"), implicitly arguing that a lot of that is lossy compression that fills in fake details from what it assumes is consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly account for the bits of entropy involved in the less accurate shortcuts the brain takes.
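
One way the arithmetic behind that 4 bits/s figure could work out (my own back-of-the-envelope reading, not necessarily the paper's exact calculation): identifying each of 1000 buildings as one of 1000 possible styles is about log2(1000) ≈ 10 bits per building.

```python
import math

# Back-of-the-envelope reconstruction of the Rome-flyover estimate
# (my reading, not necessarily the paper's exact method).
buildings = 1000
styles = 1000
seconds = 45 * 60                            # 45-minute flight

bits_per_building = math.log2(styles)        # ~10 bits to pick one of 1000 styles
total_bits = buildings * bits_per_building   # ~10,000 bits memorized
print(f"{total_bits / seconds:.1f} bits/s")  # ~3.7, i.e. roughly 4 bits/s
```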

I still think visual processing seems to be faster than 10 bits/s, but I'm now persuaded that it's within an order of magnitude.

[–] [email protected] 4 points 3 months ago

Thanks for the link and breakdown.

It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than conditional, going as far as saying a neural-computer interface would be limited to this rate.

"Thinking speed" is also a poor description for input/output measurement, akin to calling a monitor's bitrate the computer's FLOPS.

Visual processing is multi-faceted. I definitely don't think all of vision can be reduced to 50 bps, but maybe the serial part after the parallel stages have done things like detecting lines, arcs, textures, areas of contrast, etc.

[–] [email protected] 3 points 3 months ago (1 children)

with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to about 13 bits/s.

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information. Normally a bit can only have 2 values; here they are talking about very different types of bits, which AFAIK is not a specific quantity.

the human brain tends to trick itself into perceiving much higher complexity than it is actually processing

This is of course a thing.

[–] [email protected] 3 points 3 months ago

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information

here they are talking about very different types of bits

I think everyone agrees on the definition of a bit (a binary two-value variable), but the active area of debate is which pieces of information actually matter. If information can be losslessly compressed into smaller representations of that same information, then the smaller compressed size represents the informational complexity in bits.

The paper itself describes the information that can be recorded but is ultimately discarded as not relevant: for typing, the forcefulness or duration of each key press doesn't matter (but that exact same data might matter for analyzing someone playing the piano). So in terms of complexity theory, they've settled on 5 bits per English word and simply refer to prior papers that have attempted to quantify the information complexity of English.
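
As a toy illustration of how a per-word figure like that gets estimated (the sample text below is made up; the real estimates behind ~5 bits/word come from large corpora and better language models): Shannon entropy is H = -sum(p · log2 p) over the word frequencies.

```python
import math
from collections import Counter

# Toy unigram estimate of bits per word from word frequencies in a sample.
# The real ~5 bits/word estimates use large corpora and stronger models
# (context reduces the entropy well below the naive log2 of vocabulary size).
text = "the quick brown fox jumps over the lazy dog and the fox sleeps"
words = text.split()
counts = Counter(words)
total = len(words)

entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"{entropy:.2f} bits per word (on this tiny sample)")
```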

[–] [email protected] 7 points 3 months ago (1 children)

You may be misunderstanding the bit measure here. It's not ten bits of stored data, basically a single byte. It's ten binary yes/no decisions, which is enough to distinguish 1024 distinct possibilities.

The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.
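
A small illustration of that framing (the numbers below are just an example): ten yes/no questions are enough to single out one item from 2^10 = 1024 possibilities.

```python
# Ten yes/no questions identify one item among 2**10 = 1024 possibilities,
# e.g. by repeatedly asking "is it in the upper half of what's left?"
lo, hi = 0, 1023
secret = 700          # example value to be identified

questions = 0
while lo < hi:
    mid = (lo + hi) // 2
    questions += 1
    if secret > mid:  # answer "yes": keep the upper half
        lo = mid + 1
    else:             # answer "no": keep the lower half
        hi = mid

print(lo, questions)  # 700 10
```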

[–] [email protected] 6 points 3 months ago (2 children)

What? This is the perfectly normal meaning of bits. 2^10 = 1024.

[–] [email protected] 7 points 3 months ago (1 children)

Only when you are framing it in terms of information entropy. I think many of those misunderstanding the study are thinking of bits as part of a standard byte. It's a subtle distinction, but that's where I think the disconnect is.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago)

Yes, the study is probably fine; it's the article that fails to clarify, before using the term, that they are not talking about bits the way bits are normally understood.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (3 children)

I think we understand a computer can read this text far faster than any of us. That is not the same as conscious thought, though; it's simply following an algorithm of yes/no decisions.

I’m not arguing with anything here, just pointing out the difference in what CPUs do and what human brains do.

[–] [email protected] 5 points 3 months ago* (last edited 3 months ago) (2 children)

Indeed it is. If you want to illustrate the point that silicon and copper are faster than bioelectric lumps of fat, there are lots of ways to do this and it's no contest, but this is not a well-done study.

[–] [email protected] 4 points 3 months ago (2 children)
[–] [email protected] 14 points 3 months ago* (last edited 3 months ago)

There's no plausible way to even encode any arbitrary idea into 10 bits of information.

[–] [email protected] 12 points 3 months ago* (last edited 3 months ago) (1 children)

That doesn't really matter, because 1 bit merely distinguishes between 1 and 0, or some other two-valued quantity.
Just reading a single word, you pick out that word from the roughly 30,000 words you know. That's about 15 bits of information comprehended.
Don't tell me you take more than 1.5 seconds to read and comprehend one word.
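
A quick check of that estimate (the ~30,000-word vocabulary is the assumption above):

```python
import math

# Picking one word out of a ~30,000-word vocabulary carries about
# log2(30000) bits; the vocabulary size is the assumption stated above.
vocabulary = 30_000
bits_per_word = math.log2(vocabulary)

print(f"{bits_per_word:.1f} bits per word")                     # ~14.9, roughly 15
print(f"{bits_per_word / 1.5:.1f} bits/s at 1 word per 1.5 s")  # ~9.9 bits/s
```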

Without having it as text, free thought is CLEARLY much faster, and the complexity of abstract thinking would move the number way up.
One thought is not 1 bit; it can be thousands of bits.

BTW, the mind has insane levels of compression. For instance, if you think "bicycle," it's a concept that covers many parts. You don't have to think about every part; you know it has a handlebar, frame, pedals, and wheels. You also know its purpose, size, weight, range of speed, and many other more or less relevant details. Just thinking "bicycle" is easily way more than 10 bits' worth of information, but it is "compressed" down to only the parts relevant to the context.

Reading and understanding one word is not just understanding a word, but also understanding a concept and putting it into context. I'm not sure how to quantify that, but quantifying it as 1 bit is so horrendously wrong that I find it hard to understand how this can in any way be considered scientific.

[–] [email protected] 5 points 3 months ago (3 children)

You are confusing input with throughput. They agree that the input is much greater. It's the throughput that is so slow. Here's the abstract:

This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ∼10^9 bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.

[–] [email protected] 8 points 3 months ago

Why can we only think about one thing at a time?

Someone tell that to the random tab in my brain who keeps playing music

[–] [email protected] 4 points 3 months ago* (last edited 3 months ago) (1 children)

He's not.

Executive function has limited capacity, but executive function isn't your brain (and there's no reasonable definition that limits it to anything as absurd as 10 bits). Your visual center is processing all those bits that enter the eyes. All the time. You don't retain all of it, but retaining any of it necessarily requires processing a huge chunk of it.

Literally just understanding the concept of car when you see one is much more than 10 bits of information.

[–] [email protected] 8 points 3 months ago (1 children)

I think that we are all speaking without being able to read the paper (and in my case, I know I wouldn't understand it), so I think dismissing it outright without knowing how they are defining things or measuring them is not really the best course here.

I would suggest that Caltech studies don't tend to be poorly-done.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (1 children)

There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

There are hundreds of systems in your brain that are actively processing many, many orders of magnitude more than ten bits of information per second all the time. We can literally watch them do so.

It's possible the headline is a lie by someone who doesn't understand the research. It's not remotely within the realm of plausibility that it resembles reality in any way.

[–] [email protected] 9 points 3 months ago (1 children)

There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

That is quite the claim from someone who has apparently not even read the abstract of the paper. I pasted it in the thread.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (1 children)

It doesn't matter what it says.

A word is more than 10 bits on its own.

[–] [email protected] 9 points 3 months ago (14 children)

You know, dismissing a paper without even taking a minute to read the abstract and basing everything on a headline to claim it's all nonsense is not a good look. I'm just saying.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (1 children)

You are confusing input with throughput.

No I'm not, I read that part. Input is, for instance, hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information, be it music, speech, or a noise that shouldn't be there. It's true that this part is easier to measure, as we can do something similar on computers, although not nearly as well: we can determine not only the content of sounds but also extrapolate from them in real time. The raw sound may only be about 2×22k bits, but the processing required is way higher. And that's even more obviously way, way above 10 bits per second.

This is a very complex function that requires loads of processing, and it can distinguish with microsecond precision when a sound reaches each ear to determine direction.
The same is the case with vision, which, although not at all the resolution we think it is, also requires massive processing to be interpreted into something meaningful.
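
For scale, a rough raw data rate for a digitized stereo sound wave (the sample rate and bit depth below are my illustrative assumptions, not the commenter's exact figure):

```python
# Raw data rate of a digitized stereo audio stream under illustrative
# assumptions: 2 channels, 22,000 samples per second, 16 bits per sample.
channels = 2
sample_rate = 22_000
bits_per_sample = 16

raw_bits_per_second = channels * sample_rate * bits_per_sample
print(f"{raw_bits_per_second:,} bits/s of raw input")  # 704,000 bits/s
```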

Now the weird thing is, why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!

Edit:

Changed nanosecond to microsecond.

[–] [email protected] 6 points 3 months ago (1 children)

As I suggested to someone else, without any of us actually reading the paper, and I know I do not have the requisite knowledge to understand it if I did, dismissing it with words like "moronic" is not warranted. And as I also suggested, I don't think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (2 children)

I'm not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn't worth it. And I understand their claims and argumentation perfectly; they simply don't have a clue about the things they make claims about.
I've been investigating and researching these issues for 40 years from an evidence-based scientific approach, so please piss off with your claims of me not understanding it.

[–] [email protected] 6 points 3 months ago (1 children)

Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one. Especially since you decided to label it "moronic." That's quite a claim.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (1 children)

It's 100% moronic, they use terminology that clearly isn't fit for the task.

[–] [email protected] 4 points 3 months ago (1 children)

"100% moronic" is an even bolder claim for someone who has not evaluated any of the claims in the paper.

One might even say that calling scientific claims "100%" false is a not especially scientific approach.

[–] [email protected] 3 points 3 months ago (1 children)

If the conclusion is moronic, there's a pretty good chance the thinking behind it is too.
They did get the thing about thinking about one thing at a time right though. But that doesn't change the error of the conclusion.

[–] [email protected] 3 points 3 months ago (1 children)

Again, I would say using the "100%" in science when evaluating something is not a very good term to use. I think you know that.

[–] [email protected] 3 points 3 months ago (1 children)

Yeah OK that's technically correct.

[–] [email protected] 2 points 3 months ago (9 children)

It's also been pointed out that they are using 'bit' in a way people here are not thinking they are using it: https://lemmy.world/comment/14152865

[–] [email protected] 3 points 3 months ago (1 children)

What is your realm of research? How have you represented abstract thought by digital storage instead of information content?

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago)

Mostly philosophical, but since I'm also a programmer, I've always had the quantized elements in mind too.

In the year 2000 I estimated human-level or general/strong AI by about 2035. I remember because it was during a very interesting philosophy debate at Copenhagen University, where to my surprise there were also a number of physics majors.
That's supposed to be an actually conscious AI. I suppose the chances of being correct were slim at the time, but now it does seem more likely than ever.

[–] [email protected] 3 points 3 months ago (1 children)

You're misunderstanding the terminology used then.

In information theory, "bit" doesn't mean "bitrate" like you'd see in networks, but something closer to "compressed bitrate."

For example, let's say I build a computer that only computes small sums, where the input is two positive numbers from 0-127. However, this computer only understands spoken French, and it will ignore anything that's not a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits. The extra processing of understanding French is waste and ignored for the purposes of calculating entropy.
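
A quick sketch of the bit counting in that example (treating the inputs as uniform and just counting distinct values; the exact output entropy depends on the input distribution):

```python
import math

# Bit accounting for the toy "French-speaking adder" example above:
# two inputs, each one of 128 values (0-127), and their sum as the output.
input_bits = 2 * math.log2(128)          # 14 bits of input
output_values = 255                      # the sum ranges over 0..254
output_bits = math.log2(output_values)   # at most ~8 bits of output

print(input_bits, round(output_bits, 2))  # 14.0 7.99
```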

The article also mentions that our brains take in billions of bits of sensory data, but that's ignored for the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.

[–] [email protected] 3 points 3 months ago

You'd be surprised how little information you need to do so much. 10 bits seems a little low to me too, but that's splitting things into 1/1024th every second. Not bad.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (3 children)

Try to read this next part without understanding it. If you know English, you will find it impossible NOT to find meaning in these letters displayed in a row. That's more like subconscious processing. If you're learning to read English, then there's likely an active "thought" appearing in your experience. See a difference?

[–] [email protected] 3 points 3 months ago (4 children)

I absolutely do, which is why I concentrated on the THOUGHT part, as in understanding. You obviously can't have understanding without thought. That's the difference between data and information.
Please, I have 40 years of experience with philosophical issues regarding intelligence and consciousness, also from a programmer's perspective.
