Thanks to @General_Effort@lemmy.world for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Information theory is an accepted field. Entropy in information theory is named after, and analogous to, entropy in thermodynamics, but it isn't actually just thermodynamics; it's a field of study in its own right. I know this because of all the debate around that correcthorsebatterystaple xkcd.
I’m not sure if you’re making a joke or also making a point. But boy, that xkcd is spot on. 😋 👍
I think thermodynamics works within its field, but it’s so widely abused outside that field that I’ve become sick of hearing about it from people who just parrot it.
I have not seen anything useful come out of information theory, mostly just nonsense about information not being able to get lost in black holes, and exaggerated interpretations of entropy.
So my interest in information theory is near zero; I discarded it as rubbish decades ago.
For one, password security theory that actually works (instead of just “use a special character”) is based on information theory and its concept of entropy.
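The xkcd math is really just counting equally likely possibilities and taking a log. A minimal Python sketch (the 2048-word list and the 94 printable ASCII characters are assumptions on my part, and it assumes every pick is uniformly random and independent):

```python
import math

def entropy_bits(pool_size: int, length: int) -> float:
    # Entropy of a secret built from `length` symbols, each drawn
    # uniformly and independently from `pool_size` choices:
    # log2(pool_size ** length) = length * log2(pool_size)
    return length * math.log2(pool_size)

print(entropy_bits(2048, 4))  # 4 words from a 2048-word list: 44.0 bits
print(entropy_bits(94, 8))    # 8 random printable ASCII chars: ~52.4 bits
```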
OK, I don’t think information theory is actually needed for that. Just a bit of above-average intelligence, apparently.
Yes, it’s true some use the term entropy instead of just the statistical number of combinations. And obviously, forcing a special character instead of merely allowing it as an option makes the number of possibilities smaller, decreasing the uncertainty, which is what they choose to call entropy. Counterintuitively, IMO, that requirement is then sold as *increased* entropy.
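To put rough numbers on that point, here’s a quick Python sketch (the 94 printable ASCII characters, 62 of them alphanumeric, and the uniformly-random-password assumption are mine):

```python
import math

# Search-space size for an 8-character password over 94 printable
# ASCII characters, 62 of which are letters and digits.
optional = 94 ** 8            # special characters allowed but not required
required = 94 ** 8 - 62 ** 8  # at least one special character forced
                              # (total minus the all-alphanumeric passwords)

print(math.log2(optional))  # ~52.44 bits
print(math.log2(required))  # ~52.38 bits: forcing the rule *lowers* entropy
```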