• meyotch@slrpnk.net · 2 days ago

    Which is actually an insane number of possibilities to be evaluated in a very short time. The 10 bits is a measure related to information entropy, not the typical bit units used in common computing parlance. I think that’s where some of the confusion is coming from in other comments on this study.

    So thanks for the explanation, I think it addresses that common misunderstanding well.

    • Obinice@lemmy.world · 2 days ago

      Ooooooooohhhhhhhhh, I was confused when I heard all the news articles on this, which were using bits as a reference; I couldn't figure out how the brain could be thousands of times slower than dial-up, haha. It's way more interesting now that I get it!

    • General_Effort@lemmy.world (OP) · 2 days ago

      > The 10 bits is a measure related to information entropy, not the typical bit units used in common computing parlance.

      Those are the same bits, the same units. It's like the difference between a file containing a screenshot of some text and a file containing the text as text. It's a matter of encoding, of what one considers important or not: what is noise and should be discarded, and what is the message and should be reconstructed?
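
      A minimal sketch of that point (my own illustration, not from the paper): the same short message costs very different numbers of bits depending on the encoding. The 8x16-pixel-per-character "screenshot" figure below is an arbitrary assumption.

```python
import math
from collections import Counter

message = "attack the left flank"

# 1) Stored as plain UTF-8 text: 8 bits per byte.
utf8_bits = len(message.encode("utf-8")) * 8

# 2) Entropy estimate: bits per character implied by the character
#    frequencies of this particular message (Shannon entropy).
counts = Counter(message)
n = len(message)
entropy_per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())
entropy_bits = entropy_per_char * n

# 3) A 1-bit-per-pixel "screenshot" of the text, assuming each character
#    is rendered into an 8x16 pixel cell (arbitrary assumption).
screenshot_bits = n * 8 * 16

print(f"UTF-8 text:       {utf8_bits} bits")
print(f"entropy estimate: {entropy_bits:.0f} bits")
print(f"1-bit screenshot: {screenshot_bits} bits")
```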

      • Obinice@lemmy.world · 2 days ago

        That wouldn’t make sense though, right? 10 bits is barely more than a single byte. If we can only process the equivalent of a single text character per second, how do we… exist?

        • General_Effort@lemmy.world (OP) · 1 day ago

          I should make one thing clearer: information is **not** an absolute thing. It’s **not** like mass or temperature.

          You can pick up a rock and determine its mass. You cannot determine its information content. The rock’s information is whatever facts about it you want to record or communicate to other people.

          Even if you only wanted to record the rock’s mass, that’s still not an absolute amount of information. If you measure small pebbles and huge boulders, you have to record whether the numbers are in grams or tons, and that’s assuming it’s already clear the numbers give the mass of rocks and not something completely different. The record of the rock’s mass then contains more information because of the extra context it has to carry, not because of anything about the rock itself.
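
          To make the bit counting concrete, here is a rough sketch (my own numbers, purely for illustration): how many bits the mass message needs depends on how much context sender and receiver already share, for example whether the unit is agreed in advance.

```python
import math

distinct_mass_values = 1024   # assume we only distinguish 1024 mass levels
possible_units = 4            # e.g. mg, g, kg, t (arbitrary assumption)

# Shared context: "the next number is a mass in grams, one of 1024 levels".
bits_with_shared_unit = math.log2(distinct_mass_values)

# Less shared context: the unit has to be transmitted along with the value.
bits_with_unit_field = math.log2(distinct_mass_values) + math.log2(possible_units)

print(f"unit agreed in advance: {bits_with_shared_unit:.0f} bits")   # 10 bits
print(f"unit sent alongside:    {bits_with_unit_field:.0f} bits")    # 12 bits
```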

        • General_Effort@lemmy.world (OP) · 2 days ago

          Think about a video game. There’s a lot going on. There are millions of pixels on the screen. But when you know the game, your decisions are based on a very high-level understanding of what’s going on. Maybe you recognize which NPC enemy you are facing and which move they are making. If there are 16 different enemies, that’s only 4 bits of information. If every NPC has 8 different moves, that’s only another 3 bits.
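
          The arithmetic behind those numbers is just log2 of the number of equally likely options; a quick sketch using the hypothetical 16 enemies and 8 moves from above:

```python
import math

enemy_types = 16      # hypothetical number of NPC enemy types
moves_per_enemy = 8   # hypothetical number of moves per enemy

print(f"which enemy: {math.log2(enemy_types):.0f} bits")       # 4 bits
print(f"which move:  {math.log2(moves_per_enemy):.0f} bits")   # 3 bits
print(f"both:        {math.log2(enemy_types * moves_per_enemy):.0f} bits per decision")
```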

          You make the high-level decisions based on very few bits of information. You implement the decisions by pushing buttons. When you are skilled, you don’t have to think about the buttons, or what your fingers do. Some part of your brain is taking in a lot of information about what your fingers sense via touch, the angle of the joints (proprioception), and so on. Then a large number of muscles is controlled in perfect harmony to hit the keys.

          The paper estimates that our senses deliver about 1 billion bits/s to the brain, but higher-level thinking has a throughput of only a little over 10 bits/s. The paper uses the terms “inner brain” and “outer brain” for that distinction.
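
          Taken at face value, those two estimates imply a compression factor of roughly a hundred million between the “outer brain” and the “inner brain”; a trivial check of the arithmetic:

```python
sensory_bits_per_s = 1_000_000_000   # paper's estimate: ~1 billion bits/s from the senses
cognitive_bits_per_s = 10            # paper's estimate: ~10 bits/s of high-level throughput

print(f"ratio: ~{sensory_bits_per_s / cognitive_bits_per_s:.0e}")   # ~1e+08
```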