No way this is true. It might be only 240p, but I can stream video on this wrinkly meat computer without buffering.
I just skimmed it, but it starts from a totally nonsensical basis for calculation. For example:
“In fact, the entropy of English is only ∼ 1 bit per character.”
Um, so each character is just 0 or 1 meaning there are only two characters in the English language? You can’t reduce it like that.
I mean, just the headline is nonsensical. 10 bits per second? A second is a really long time. So even if we accept their premise that a single character is a bit, we could only consider 10 unique characters in a second? I can read a whole sentence with more than ten words, let alone ten characters, in a second, while also retaining what music I was listening to, what color the page was, how hot it was in the room, how itchy my clothes were, and how thirsty I was during that second, if I pay attention to all of those things.
This is all nonsense.
Regardless of how you define a “bit”, saying 10 per second when most people easily process hundreds of pieces of information in every perceivable moment, let alone every second, is still ridiculous. I was only using characters because that was one of the ridiculous things the article mentioned.
Heck, just writing this message I’m processing the words I’m writing; listening to and retaining bits of what’s on the TV; being annoyed that I have the flu and that my nose, ears, throat, and several other parts are achy on top of the headache; noticing the discomfort of how I’m sitting on the couch but not wanting to move because my wife is also sick and lying in my lap; and keeping myself from shaking my foot, which is calming but would annoy said wife. Etc. All of that data is being processed and reevaluated consciously in every moment, all at once. And that’s not including the more subconscious stuff I could pay attention to if I wanted to, like breathing.
It seems like you might be confusing the concept of transmission speed with available bandwidth. And also sounds like maybe you should recuperate from the flu and feel better. Getting upset about this isn’t worth it.
Not exactly. I just think trying to apply a single-threaded, cyclical processing model to a process that is neither threaded nor executed in measurable cycles is nonsensical. On a very, very abstract level, it’s like dividing a pie between a group of people. If you think in terms of the object you give each person needing to be recognizable as pie, then maybe a 9-inch pie can be divided 20 or 30 times. But if you stop thinking about the pie and start looking at what the pie is made of, you can divide it so many times it’s unthinkable. Sure, there’s a limit; at some point there’s got to be some three-dimensional particle of matter that can no longer be divided. But it just doesn’t make sense to use the same scale or call it the same thing.
Anyway, I’m not upset about it. It’s just dumb. And thinking about it is valuable, because companies are constantly trying to assign a monetary value to a human brain so they can decide when to replace it with a computer. But we offer very different value: true creativity and randomness, pattern recognition, and true multitasking, versus fast remixing of predefined blocks of information and raw, linear calculation speed. There can be no fair comparison between a brain and a computer, and there are different uses for both. And the “intelligence” in modern “AI” is not the same as human intelligence, and likely never will be on digital computers.
I’m not reading all of that.
I hope you feel better
Isn’t that a bit … slow?
A single letter is 8 bits. I can think and speak out loud a whole sentence in just a second or two. Where did they come up with these numbers? In what world or system is a letter only 1 bit?
Cn y ndrstnd this?
You probably can. Human language has built-in redundancy. You can drop parts and still be understood. That’s useful for communicating in noisy environments or with people who are hard of hearing. So you could say the actual information content is less than 1 letter per letter, so to speak.
Properly, information content is measured in bits. A more detailed analysis gives a value of about 1 bit per character.
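To see roughly where that number comes from, here’s a toy sketch in Python (my own illustration, not from the article; the sample string is just a stand-in for a longer English corpus). Even a simple single-character frequency model lands around 4 bits per character, well below the 8 bits of an ASCII byte, and Shannon’s ~1 bit/character figure comes from models that also exploit longer-range context:

```python
import math
from collections import Counter

def entropy_bits_per_char(text: str) -> float:
    """Shannon entropy in bits per character, estimated from
    single-character frequencies (a unigram model)."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Stand-in sample; any longer English text gives the same order of magnitude.
sample = "the quick brown fox jumps over the lazy dog"
print(round(entropy_bits_per_char(sample), 2))  # ~4.4 bits/char, not 8
```

A unigram model still overestimates the true entropy, because knowing the preceding characters makes the next one much more predictable; that predictability is exactly the redundancy described above.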
Sidenote: You shouldn’t ask technical or scientific questions in this community. I don’t know why information theory denialism is so hot right now, but it obviously is.
The human brain isn’t binary so the choice to describe processing speed in terms of bits is bizarre.
This has already been posted and apparently is very controversial here.
TL;DR: The “bit” in the headline is actually a “shannon”.
Someone who can’t tell from the headline what kind of “bit” is meant is probably not going to be helped by that comment.
All of the AI craze is predicated on “all it takes is simulating all the neurons”.
So when we got close to that, people dropped billions into AI, and they keep insisting it’s just around the corner.
The problem is we don’t know what consciousness is yet, so how the fuck are we gonna re-create it?
The smartest living physicist has spent 40 years insisting there’s a quantum component, but up until 5 months ago we didn’t see any way for a quantum process to be stable in a human brain.
We now know microtubules can create a larger tube that functions like a fiber-optic cable, allowing quantum superposition to be sustained much longer than previously thought.
We probably have a century, at least, before real AI.
There’s a reason Elmo’s “robot” was just a Mechanical Turk
I don’t see any requirement for a quantum theory element for consciousness to work. I also don’t agree that we don’t know what it is.
If the brain has such elements, you must remember that it’s purely by chance, and there may very well be other ways to have similar functionality.
When trying to create a strong AI, we do have the advantage that we are not nearly as energy- or space-restricted as nature was.
It can weigh a ton and take a megawatt to run, and we will still call it a success.
The idea that the uncertainty principle is a requirement for free will is nonsense. But so is most of the debate about free will, because most can’t agree on how to define it.
There is no “id” or whatever the fuck other weird concepts religious philosophers use, which generally just mean they think it can’t exist without a god. And they think we have consciousness separate from the brain somehow? Because they are superstitious idiots who think argument from ignorance counts.
Spoiler, there is no soul, and there are no gods either. Except the ones created by man.
Yet we have both consciousness and free will.

“I also don’t agree that we don’t know what it [consciousness] is.”

“… most can’t agree on how to define it.”
Which is it?
What?
Consciousness and free will are not the same thing.
Oh, I’m not saying that we think at 10 bits per second. I think it’s bunk. And there’s at least one comment on that post that goes into depth about why the reasoning is flawed. I was just pointing out that it’s already been posted once and there’s a very … controversial thread of replies with a lot of back and forth.
The bit is a unit of information, like the liter is a unit of volume. The bit may also be called a shannon, though I don’t know where that is commonly done.
When people talk about a liter, they are often thinking of a liter of milk or gas. That is, they are thinking of a quantity of a certain substance rather than a volume in the abstract. People may also say “liter” to mean a specific carton or bottle, even if it no longer contains a liter of whatever.
Similarly, people will say “bit” when they mean something more specific than just a unit of measurement: the least significant bit, the parity bit, and so on. It can refer to a lot of things that can contain 1 bit of information.
The fact that the headline is talking about bits/s makes clear that this is talking about how much information goes through a human mind per time unit.
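To make the “container vs. quantity” distinction concrete, here’s a small sketch (my own toy example, not from the comment above): the parity bit and the least significant bit are both specific places that each hold exactly 1 bit of information about a byte.

```python
def parity_bit(byte: int) -> int:
    """Even-parity bit: 1 if the byte has an odd number of 1-bits, else 0."""
    return bin(byte & 0xFF).count("1") % 2

def least_significant_bit(byte: int) -> int:
    """The LSB: a fixed position in the byte that stores 1 bit."""
    return byte & 1

b = 0b1011_0010
print(parity_bit(b), least_significant_bit(b))  # 0 0 (four 1-bits -> even parity)
```

Either way, each of those containers holds exactly 1 bit in the unit-of-measurement sense, which is the sense the headline is using.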