• General_Effort@lemmy.world · ↑5 ↓2 · 4 days ago

        Someone who can’t tell from the headline what kind of “Bit” is meant is probably not going to be helped by that comment.

    • givesomefucks@lemmy.world · ↑8 ↓6 · 4 days ago

      All of the AI craze is predicated on “all it takes is simulating all the neurons”.

      So when we got close to that, people dropped billions into AI and kept insisting it’s right around the corner.

      The problem is we don’t know what consciousness is yet, so how the fuck are we gonna re-create it?

      The smartest living physicist has spent 40 years insisting there’s a quantum component, but up until 5 months ago we didn’t see any way for a quantum process to stay stable in a human brain.

      We now know microtubules can form a larger tube that functions like a fiber-optic cable, allowing quantum superposition to be sustained much longer than previously thought.

      We probably have a century, at least, before real AI.

      There’s a reason Elmo’s “robot” was just a Mechanical Turk.

      • Buffalox@lemmy.world · ↑7 ↓2 · 4 days ago (edited)

        I don’t see any requirement for a quantum element for consciousness to work. I also don’t agree that we don’t know what it is.
        If the brain has such elements, remember that it’s purely by chance, and there may very well be other ways to achieve similar functionality.
        When trying to create a strong AI, we have the advantage that we are not nearly as energy- or space-restricted as nature.
        It can weigh a ton and take a megawatt to run, and we will still call it a success.
        The idea that the uncertainty principle is a requirement for free will is nonsense. But so is most of the debate about free will, because most can’t agree on how to define it.
        There is no id or whatever the fuck other weird concepts religious philosophers use, which generally just means they think it can’t exist without a god. And they think we have consciousness separate from the brain somehow? Because they are superstitious idiots who think argument from ignorance counts.
        Spoiler: there is no soul, and there are no gods either, except the ones created by man.
        Yet we have both consciousness and free will.

        • BearOfaTime@lemm.ee · ↑6 ↓3 · 4 days ago

          I also don’t agree that we don’t know what it [consciousness] is.

          … most can’t agree on how to define it.

          Which is it?

      • atrielienz@lemmy.world · ↑3 · 4 days ago

        Oh, I’m not saying that we think at 10 bits per second. I think it’s bunk. And there’s at least one comment on that post that goes into depth about why the reasoning is flawed. I was just pointing out that it’s already been posted once, and there’s a very… controversial thread of replies with a lot of back-and-forth discussion.

  • irotsoma@lemmy.world · ↑17 ↓3 · 4 days ago

    I just skimmed it, but it’s starting with a totally nonsensical basis for calculation. For example,

    “In fact, the entropy of English is only ∼ 1 bit per character.”

    Um, so each character is just 0 or 1 meaning there are only two characters in the English language? You can’t reduce it like that.

    I mean, just the headline is nonsensical. 10 bits per second? A second is a really long time. So even if their hypothesis that a single character is a bit were right, we could only consider 10 unique characters in a second? I can read a whole sentence with more than ten words, let alone ten characters, in a second, while also retaining what music I was listening to, what color the page was, how hot it was in the room, how itchy my clothes were, and how thirsty I was during that second, if I pay attention to all of those things.

    This is all nonsense.

    • GamingChairModel@lemmy.world · ↑2 · 3 days ago

      I think the fundamental issue is that you’re assuming information theory measures entropy in terms of uncompressed data, when it actually refers to the amount of data assuming ideal/perfect compression.

      Um, so each character is just 0 or 1 meaning there are only two characters in the English language? You can’t reduce it like that.

      There are only 26 letters in the English alphabet, so a meaningful character space fits in 5 bits (2^5 = 32). Morse code, for example, encodes letters in at most 4 bits per letter (the most common letters use fewer, and the longest use 4). A typical sentence will reduce down to an average of 2–3 bits per letter, plus the pause between letters.
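
      A quick sketch of that fixed-width point (illustrative Python, not from the article):

          # 26 letters fit in 5 bits because 2**5 = 32 >= 26.
          for ch in "hello":
              code = ord(ch) - ord("a")        # map 'a'..'z' to 0..25
              print(ch, format(code, "05b"))   # one 5-bit codeword per letter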

      And because the distribution of letters in any given English text is nonuniform, there’s less meaning per letter than it takes to strictly encode things by individual letter. You can assign values to whole words and get really efficient that way, especially using variable encoding for the more common ideas or combinations.
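
      To put a number on that nonuniformity, here’s a minimal sketch; the frequencies are rounded from standard published tables, so treat the exact values as assumptions:

          import math

          # Approximate relative frequencies of English letters (rounded).
          freq = {
              "e": 0.127, "t": 0.091, "a": 0.082, "o": 0.075, "i": 0.070,
              "n": 0.067, "s": 0.063, "h": 0.061, "r": 0.060, "d": 0.043,
              "l": 0.040, "c": 0.028, "u": 0.028, "m": 0.024, "w": 0.024,
              "f": 0.022, "g": 0.020, "y": 0.020, "p": 0.019, "b": 0.015,
              "v": 0.010, "k": 0.008, "j": 0.002, "x": 0.002, "q": 0.001,
              "z": 0.001,
          }

          # Shannon entropy H = -sum(p * log2(p)): letter-by-letter this lands
          # near 4.2 bits, already below the 4.7 bits of 26 equally likely letters.
          h = -sum(p * math.log2(p) for p in freq.values())
          print(f"{h:.2f} bits/letter")

      Modeling pairs, words, and longer context pushes that number down further, which is how you get toward the ~1 bit per character figure.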

      If you scour the world of English text, the 15-character string “Abraham Lincoln” will be far more common than even the 3-letter string “xqj,” so those multi-character expressions convey a much smaller number of bits of entropy per character. So it might take someone longer to memorize a truly random 10-character string, including case sensitivity, symbols, and numbers, than to memorize a 100-character sentence that actually carries meaning.
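
      You can see that effect directly with an off-the-shelf compressor (a rough sketch; exact byte counts will vary run to run):

          import random
          import string
          import zlib

          english = (
              "Abraham Lincoln led the United States through the Civil War and "
              "delivered the Gettysburg Address, one of the best known speeches "
              "in American history. Familiar names and common words repeat, so "
              "a compressor can squeeze them down considerably."
          )
          noise = "".join(random.choice(string.printable) for _ in range(len(english)))

          for label, text in (("english", english), ("random", noise)):
              raw = text.encode()
              packed = zlib.compress(raw, 9)
              # Meaningful English compresses noticeably below 8 bits/char;
              # random characters barely compress at all.
              print(f"{label}: {len(raw)} -> {len(packed)} bytes, "
                    f"~{8 * len(packed) / len(raw):.1f} bits/char")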

      Finally, once you actually get to reading and understanding, you’re not meticulously remembering literally every character. Your brain is preprocessing some stuff and discarding details without actually consciously incorporating them into the reading. Sometimes we glide past typos. Or we make assumptions (whether correct or not). Sometimes when tasked with counting basketball passes we totally miss that there was a gorilla in the video. The actual conscious thinking discards quite a bit of the information as it is received.

      You can tell when you’re reading something that is within your own existing knowledge, and how much faster it is to read than something that is entirely new, on a totally novel subject that you have no background in. Your sense of recall is going to be less accurate with that stuff, or you’re going to significantly slow down how you read it.

      I can read a whole sentence with more than ten words, let alone ten characters, in a second, while also retaining what music I was listening to, what color the page was, how hot it was in the room, how itchy my clothes were, and how thirsty I was during that second, if I pay attention to all of those things.

      If you’re preparing to be tested on the recall of each and every one of those things, you’re going to find yourself reading a lot slower. You can read the entire reading passage but be totally unprepared for questions like “how many times did the word ‘the’ appear in the passage?” And that’s because the way you actually read and understand is going to involve discarding many, many bits of information that don’t make it past the filter your brain puts up for that task.

      For some people, memorizing the sentence “Linus Torvalds wrote the first version of the Linux kernel in 1991 while he was a student at the University of Helsinki” is trivial and can be done in a second or two. For many others, who might not have the background to know what the sentence means, they might struggle with being able to parrot back that idea without studying it for at least 10-15 seconds. And the results might be flipped for different people on another sentence, like “Brooks Nader repurposes engagement ring from ex, buys 9-carat ‘divorce ring’ amid Gleb Savchenko romance.”

      The fact is, most of what we read is already familiar in some way. That means we’re processing less information than we’re taking in, discarding a huge chunk of what we perceive on the way to what we actually think. And when we encounter things that we didn’t expect, we slow down or we misremember things.

      So I can see how the 10-bit number comes into play. The paper cites various studies showing that image/object recognition tends to operate in the high 30s of bits per second, and that many memorization or video-game-playing tasks involve processing in the 5–10 bit range. Our brains are just highly optimized for image processing and language processing, so I’d expect those tasks to be higher performance than other domains.
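
      For what it’s worth, here’s the kind of back-of-envelope arithmetic that produces a number like 10 bits/s (my own illustrative figures, not the paper’s exact ones):

          # A fast typist producing ordinary English text.
          words_per_minute = 120
          chars_per_word = 5        # common rule-of-thumb average
          bits_per_char = 1.0       # the ~1 bit/character entropy estimate

          print(words_per_minute * chars_per_word * bits_per_char / 60, "bits/s")  # 10.0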

      • RememberTheApollo_@lemmy.world · ↑3 · 3 days ago

        So in other words - if we highly restrict the parameters of what information we’re looking at, we then get a possible 10 bits per second.

        • GamingChairModel@lemmy.world · ↑1 · 2 days ago

          if we highly restrict the parameters of what information we’re looking at, we then get a possible 10 bits per second.

          Not exactly. More the other way around: that human behaviors in response to inputs are only observed to process about 10 bits per second, so it is fair to conclude that brains are highly restricting the parameters of the information that actually gets used and processed.

          When you require the brain to process more information and discard less, it forces the brain to slow down, and the observed rate of speed is on the scale of 5-40 bits per second, depending on the task.
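
          As a rough illustration of that scale (my numbers, chosen just for the arithmetic):

              import math

              # Hypothetical task: hearing and repeating back random digits.
              digits_per_second = 3           # a comfortable listening/speaking rate
              bits_per_digit = math.log2(10)  # ~3.32 bits per uniformly random digit

              print(f"{digits_per_second * bits_per_digit:.1f} bits/s")  # ~10 bits/s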

        • General_Effort@lemmy.world · ↑1 · 2 days ago

          Not quite. Information always depends on context. It is not a fundamental physical quantity like energy. When you have a piece of paper with english writing on it, then you can read and understand it. If you don’t know the script or language, you won’t even be able to tell if it’s a script or language at all. Some information needs to be in your head already. That’s simply how information works.

          You take in information through the senses and do something based on that information. Information flows into your brain through your senses and then out again in the form of behavior. The throughput is throttled to something on the order of 10 bits/s. When you think about it for a bit, you realize that a lot of things are predicated on that. Think of a video game controller. There’s only a few buttons. The interface between you and the game has a bandwidth of only a few bits.
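
          A rough sketch of that controller arithmetic (the button count and input rate are made up, but plausible):

              import math

              buttons = 12             # distinct inputs on a simple gamepad
              presses_per_second = 3   # a busy input rate for a human player

              bits_per_press = math.log2(buttons)  # ~3.6 bits if each press were equally likely
              print(f"{presses_per_second * bits_per_press:.1f} bits/s")  # ~10.8 bits/s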

      • irotsoma@lemmy.world · ↑4 ↓3 · 3 days ago (edited)

        Regardless of how you define a “bit”, saying 10 in a second, when most people easily process hundreds of pieces of information in every perceivable moment, let alone every second, is still ridiculous. I was only using characters because that was one of the ridiculous things the article mentioned.

        Heck, just writing this message I’m processing the words I’m writing, listening to and retaining bits of information from what’s on the TV. Being annoyed that I have the flu and that my nose, ears, throat, and several other parts are achy in addition to the headache. Noticing the discomfort of the way my butt is sitting on the couch, but not wanting to move because my wife is also sick and lying in my lap. Keeping myself from shaking my foot, because it is calming but will annoy said wife. Etc. All of that data is being processed and reevaluated consciously in every moment, all at once. And that’s not including the more subconscious stuff that I could pay attention to if I wanted to, like breathing.

        • finley@lemm.ee · ↑1 ↓1 · 3 days ago

          It seems like you might be confusing the concept of transmission speed with available bandwidth. And also sounds like maybe you should recuperate from the flu and feel better. Getting upset about this isn’t worth it.

          • irotsoma@lemmy.world · ↑2 ↓3 · 3 days ago

            Not exactly. I just think trying to apply a single-threaded, cyclical processing model to a process that is neither threaded nor executed in measurable cycles is nonsensical. On a very, very abstract level it’s similar to dividing a pie between a group of people. If you think in terms of the object you give each person needing to be recognizable as pie, then maybe a 9-inch pie can be divided 20 or 30 times. But if you stop thinking about the pie, and start looking at what the pie is made of, you can divide it so many times that it’s unthinkable. I mean, sure, there’s a limit. At some point there’s got to be some three-dimensional particle of matter that can no longer be divided, but it just doesn’t make sense to use the same scale or call it the same thing.

            Anyway, I’m not upset about it. It’s just dumb. And thinking about it is valuable, because companies are constantly trying to assign a monetary value to a human brain so they can decide when they can replace it with a computer. But we offer very different value: true creativity and randomness, pattern recognition, and true multitasking, versus fast remixing of predefined blocks of information and raw, linear calculation speed. There can be no fair comparison between a brain and a computer, and there are different uses for both. And the “intelligence” in modern “AI” is not the same as human intelligence, and likely never will be with digital computers.

            • finley@lemm.ee · ↑3 ↓3 · 3 days ago

              I’m not reading all of that.

              I hope you feel better

  • HipsterTenZero@dormi.zone · ↑4 · 3 days ago

    no way this is true. it might be only 240p but I can stream video in this wrinkly meat computer without buffering.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · ↑3 ↓1 · 4 days ago (edited)

    Isn’t that a bit … slow?

    A single letter is 8 bits. I can think and speak out loud a whole sentence in just a second or two. Where did they come up with these numbers? In what world or system is a letter only 1 bit?

    • Echo Dot@feddit.uk · ↑7 ↓2 · 3 days ago

      The human brain isn’t binary, so the choice to describe processing speed in terms of bits is bizarre.

    • General_Effort@lemmy.world · ↑1 · 3 days ago

      Cn y ndrstnd this?

      You probably can. Human language has built-in redundancy. You can drop parts and still be understood. That’s useful for communicating in noisy environments or with people hard of hearing. So you could say that the actual information content is less than 1 letter per letter, so to speak.
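
      You can generate sentences like that mechanically (a toy sketch):

          # Drop the vowels from a sentence; most English stays readable,
          # which shows how much redundancy the spelling carries.
          sentence = "Can you understand this sentence without its vowels?"
          stripped = "".join(ch for ch in sentence if ch.lower() not in "aeiou")
          print(stripped)  # Cn y ndrstnd ths sntnc wtht ts vwls?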

      Properly, information content is measured in bits. A more detailed analysis gives a value of about 1 bit per character.

      Sidenote: You shouldn’t ask technical or scientific questions in this community. I don’t know why information theory denialism is so hot right now, but it obviously is.

  • General_Effort@lemmy.world · ↑1 ↓2 · 4 days ago

    The Bit is a unit of information, like the Liter is a unit of volume. The Bit may also be called a Shannon, though I do not know where that is commonly done.

    When people talk about a liter, they are often thinking about a liter of milk, or gas. That is, they are thinking of a quantity of a certain substance rather than of a volume in the abstract. People may also say liter to mean a specific carton or bottle, even if it no longer contains a liter of whatever.

    Similarly, people will say “bit” when they mean something more specific than just a unit of measurement. For example, the least significant bit, or the parity bit, and so on. It may refer to a lot of things that can contain 1 Bit of information.

    The fact that the headline is talking about bits/s makes clear that this is talking about how much information goes through a human mind per time unit.