It seems like this is generally compatible with the free energy principle: the idea that the part of the brain this preprint refers to as the “outer brain”—the part that processes raw sense perceptions in parallel—maintains a predictive running model of the self and its environment, and only passes to the consciousness (what the paper calls the “inner brain”) the most salient discrepancies between its model and its perceptions. So the “inner brain” is only concerned with the differential between prediction and perception, which (depending on the accuracy of the model) has a much lower bit rate than raw perception.
This feels so much like how thoughts seem to work to me: often on autopilot, where we seem to be waiting for something unexpected.
That’s a clever thought. Thanks.
Nights of winter turn me cold – fears of dying, getting old. We ran the race and the race was won by running slowly.
If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted.
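For what it's worth, here is a back-of-the-envelope sketch of where that number comes from (the 2-second figure below is my assumption, not the paper's):

```python
import math

# 20 yes/no questions can, at best, distinguish 2**20 items.
candidates = 2 ** 20
print(candidates)                       # 1048576, i.e. roughly 1 million

# If the thinker settles on an item in, say, 2 seconds (an assumed figure),
# the implied selection rate is log2(candidates) / seconds.
seconds = 2.0
print(math.log2(candidates) / seconds)  # 10.0 bits/s
```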
I’m not sure this premise is sound. Are there not infinitely more than 2^20 permutations of the game?
This would be true if the questions were preset, but in reality the game requires the guesser to make choices as it progresses. These choices can be quite complex, relying on a well-developed theory of mind and shared cultural context. Not all of the information is internal to the mechanics of the game.
The unspoken rules of the game also require the thinker to pick something that can plausibly be solved. Picking something outlandishly obscure would be frowned upon. The game is partly cooperative in that sense.
If you were to reduce the game to “guess the number I’m thinking of between 0 and infinity”, then it wouldn’t be very fun, it would not persist across time and cultures, and you wouldn’t be studying it. But you might get close to a 0% win rate (or…maybe not?).
I’d guess that most of the “few seconds” the thinker spends is actually to reduce the number of candidates to something reasonable within the context of the game. If that’s true, it says nothing whatsoever about the upper bound of possibilities they are capable of considering.
Idea for further research: establish a “30 questions” game and compare win rates over time. Hypothesis: the win rate in 30 questions would fall to similar levels as with “20 questions” as players gained experience with the new mechanics and optimized their internal selection process.
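To make the hypothesis concrete, here is a toy Python sketch of an *ideal* guesser that halves the candidate pool with every question; it ignores theory of mind, cooperation, and question quality entirely, so it is only an upper bound, not the proposed experiment:

```python
def ideal_guesser_wins(pool_size: int, questions: int) -> bool:
    """Best case: every question splits the remaining candidates in half."""
    remaining = pool_size
    for _ in range(questions):
        remaining = (remaining + 1) // 2  # optimal binary split
    return remaining == 1

print(ideal_guesser_wins(1_000_000, 20))  # True: 2**20 just covers a million items
print(ideal_guesser_wins(1_000_000, 30))  # True: the extra 10 questions are slack
print(ideal_guesser_wins(10**9, 20))      # False: the pool is too big for 20 questions
```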
our brain will never extract more than 10 bits/s
Aren’t there real recorded cases of eidetic memory? E.g. The Mind of a Mnemonist. I have not re-read that book with a mind toward information theory, so perhaps I am overestimating/misremembering the true information content of his memories.
Does it apply to any kind of human brain, including neurodivergent? I am likely a neurodivergent person, and it feels to me like my brain is constantly flooded by a sea of thoughts. And it feels like it’s way more than just 10 bits of information.
It’s an estimate of conscious thought. The brain receives and processes vastly more data, but that happens unconsciously.
10 bits means ten yes/no decisions, or choosing 1 of 1024 possible answers.
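A tiny sketch makes that equivalence explicit (purely illustrative):

```python
from itertools import product

# Every distinct sequence of 10 yes/no answers selects exactly one item.
answer_sequences = list(product((True, False), repeat=10))
print(len(answer_sequences))  # 1024 == 2**10
```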
Which is actually an insane number of possibilities to be evaluated in a very short time. The 10 bits is a measure related to information entropy, not the typical bit units used in common computing parlance. I think that’s where some of the confusion is coming from in other comments on this study.
So thanks for the explanation, I think it addresses that common misunderstanding well.
Ooooooooohhhhhhhhh, I was confused when I heard all the news articles on this, which were using bits as a reference; I couldn't understand how the brain could be thousands of times slower than dial-up, haha. It's way more interesting now that I get it!
The 10 bits is a measure related to information entropy, not the typical bit units used in common computing parlance.
Those are the same bits, the same units. It’s like the difference between a file containing a screenshot of a text, and a file containing the text as text. It’s a matter of encoding; of what one considers important or not. What is noise and should be discarded, and what is the message and should be reconstructed?
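To put rough numbers on the screenshot analogy (the screenshot size below is an assumption: an uncompressed 1920x1080 RGB image):

```python
# Same message, two encodings. Sizes are illustrative assumptions:
# plain UTF-8 text vs. an uncompressed 1920x1080 RGB screenshot of it.
message = "The quick brown fox jumps over the lazy dog."
text_bytes = len(message.encode("utf-8"))
screenshot_bytes = 1920 * 1080 * 3

print(text_bytes)                      # 44 bytes as text
print(screenshot_bytes)                # 6220800 bytes as raw pixels
print(screenshot_bytes // text_bytes)  # ~141000x more bits for the same message
```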
That wouldn’t make sense though right? 10 bits is barely more than a single byte, if we can only process the equivalent of a single text character per second, how do we… exist?
I should make one thing more clear: Information is **not** an absolute thing. It's **not** like mass or temperature.
You can pick up a rock and determine its mass. You cannot determine its information content. The rock's information is whatever facts about it you want to record or communicate to other people.
Even if you only wanted to record the rock's mass, that's still not an absolute amount of information. If you measure small pebbles and huge boulders, you have to record whether the numbers are in grams or tons. And that's assuming it's clear that the numbers give the mass of rocks and not something completely different. The record of that rock's mass then carries more information because of the different context, not because of anything about the rock itself.
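A small sketch of that point: the number of bits needed to record the mass depends on the range and precision you choose, not on the rock itself (the ranges and precisions below are made up):

```python
import math

def bits_needed(max_value: float, precision: float) -> int:
    """Bits to distinguish max_value / precision equally spaced values."""
    return math.ceil(math.log2(max_value / precision))

print(bits_needed(1_000, 1))      # pebble up to 1 kg, to the nearest gram: 10 bits
print(bits_needed(1_000_000, 1))  # boulder up to a tonne, same precision: 20 bits
print(bits_needed(1_000, 100))    # same pebble, to the nearest 100 g: 4 bits
```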
Think about a video game. There's a lot going on. There are millions of pixels on the screen. But when you know the game, your decisions are based on a very high-level understanding of what's going on. Maybe you recognize which NPC enemy you are facing and what move they are making. If there are 16 different enemies, then that's only 4 bits of information. If every NPC has 8 different moves, then that's only another 3 bits.
You make the high-level decisions based on very few bits of information. You implement the decisions by pushing buttons. When you are skilled, you don’t have to think about the buttons, or what your fingers do. Some part of your brain is taking in a lot of information about what your fingers sense via touch, the angle of the joints (proprioception), and so on. Then a large number of muscles is controlled in perfect harmony to hit the keys.
The paper gives the estimate that our senses deliver 1 billion bits/s to the brain. But the higher level thinking has a throughput of only a little over 10 bits/s. The paper uses the terms “inner brain” and “outer brain” for that.
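Plugging the comment's own numbers in (16 enemies, 8 moves each, and the paper's ~1 billion bits/s of sensory input):

```python
import math

enemy_bits = math.log2(16)  # 4.0 bits to identify the enemy
move_bits = math.log2(8)    # 3.0 bits to identify its move
decision_bits = enemy_bits + move_bits
print(decision_bits)        # 7.0 bits for "which enemy, doing what"

sensory_bits_per_s = 1e9    # the paper's estimate for raw sensory input
print(sensory_bits_per_s / decision_bits)  # ~1.4e8: almost everything is discarded
```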
I have been thinking about this question in some depth.
It basically applies to any human brain and it has nothing to do with neurodivergence.
Think about a video game. There's a lot going on. There are millions of pixels on the screen. But when you know the game, your decisions are based on a very high-level understanding of what's going on. Maybe you recognize which NPC enemy you are facing and what move they are making. If there are 16 different enemies, then that's only 4 bits of information. If every NPC has 8 different moves, then that's only another 3 bits.
You make the high-level decisions based on very few bits of information. You implement the decisions by pushing buttons. When you are skilled, you don’t have to think about the buttons, or what your fingers do. Some part of your brain is taking in a lot of information about what your fingers sense via touch, the angle of the joints (proprioception), and so on. Then a large number of muscles is controlled in perfect harmony to hit the keys.
The paper gives the estimate that our senses deliver 1 billion bits/s to the brain. But the higher level thinking has a throughput of only a little over 10 bits/s. I called it conscious thought but that’s not exactly right. The paper talks about the “inner brain” and “outer brain”.
Feeling flooded by information may have to do with lacking the necessary skill in preprocessing to extract the relevant information. In the video game example I gave above, that would mean not knowing the possible enemies or their moves. It may also reflect a failure to prioritize appropriately, that is, to make only the necessary information available to the decision-making process. That is said to be a factor in ADHD.
That feeling might also simply be an illusion, like a déjà vu.
Individual differences in throughput are probably related to intelligence, but that's not related to neurodivergence. People who are perceived as intelligent, for example, use a bigger vocabulary. That is, they choose their words from a larger number of possibilities, which implies a higher information throughput, all else equal. However, in terms of bits, that difference is certainly small.
10 bits means picking 1 out of 1024 possibilities. 11 bits allow 1 out of 2048 possibilities. Every bit doubles the number. More bits/s would allow either more possible choices per decision or more decisions per second. Some people seem to think faster than others or seem to consider a bigger space of solutions in the same time. If their throughput were 3 bits higher than average, they would be able to think 8 times faster or consider 8 times more possibilities. Thinking about how fast people talk/write or what vocabulary they use, I think there's hardly anyone who's that much above average.
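The doubling and the vocabulary point in numbers (the vocabulary sizes are made-up examples):

```python
import math

# Every extra bit doubles the number of options a single decision can pick from.
for bits in (10, 11, 13):
    print(bits, 2 ** bits)  # 10 -> 1024, 11 -> 2048, 13 -> 8192

# Choosing each word from 40,000 candidates instead of 10,000
# adds only 2 bits per word.
print(math.log2(40_000) - math.log2(10_000))  # 2.0
```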
I got the flood in check in childhood, and I think I do parallel thought? Or at least I can focus on multiple related things at the same time, depending on working memory (which grew after I was forced to work with Java, despite it being a bad match for my way of thinking).
That sounds plausible. My understanding of one of the key differences between autists and allists is that the autistic brain processes info from the ground up, which is to say that all the details are collected and the interpretation of the data is done by considering the sum of those details. A non-neurodivergent brain pre-summarizes and you get the gist of the data without seeing all the details. This is what makes autistic folks so good at processing large amounts of detail, and so bad/slow at interpretation of social situations.
From what I've read, the understanding of ADHD is trending towards being seen as similar to autism, as it largely boils down to very similar information-processing differences.