That’s why consciousness still seems “magical.” If neurons basically just do IF logic, how does that become consciousness?
And the same with memory. It can seem to boil down to a single cell reacting to one specific input, an idea known as “the grandmother cell.” Is there really just one cell that holds the memory of your grandmother? And if that one cell gets damaged or dies, do you lose the memory of her?
And ultimately, if thinking is just IF logic, does that mean every decision and thought is predetermined and can be computed, given a big enough computer and all the exact starting values?
You’re implying that physical characteristics are inherently deterministic, when we know they’re not.
Your neurons are analog and noisy, sensitive to tiny fluctuations of random atomic noise.
Beyond that, they don’t do “if” logic; it’s more like complex combinatorial arithmetic in which every input simultaneously modifies future outputs.
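A toy sketch of that difference (the noise scale and the Hebbian-style weight nudge below are arbitrary stand-ins, purely for illustration): a pure “IF” threshold gate returns the same answer for the same inputs every time, while a unit with analog jitter that also adjusts its weights on every input gives varying answers and is already a slightly different unit afterwards.

```python
import random

def if_neuron(inputs, weights, threshold=1.0):
    """The caricature: a pure threshold gate. Same inputs, same output, every time."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

class NoisyAdaptiveNeuron:
    """Toy unit: noisy analog summation, plus weights that shift with every input,
    so each input also changes how the unit will respond next time."""

    def __init__(self, weights, threshold=1.0, noise=0.1, learn_rate=0.05):
        self.weights = list(weights)
        self.threshold = threshold
        self.noise = noise            # stand-in for thermal/atomic-scale jitter
        self.learn_rate = learn_rate  # stand-in for synaptic plasticity

    def fire(self, inputs):
        # Analog summation plus random jitter: the same inputs can land on
        # either side of the threshold from one trial to the next.
        total = sum(w * x for w, x in zip(self.weights, inputs))
        total += random.gauss(0.0, self.noise)
        out = 1 if total >= self.threshold else 0
        # Hebbian-style nudge: being driven at all changes future behaviour.
        for i, x in enumerate(inputs):
            self.weights[i] += self.learn_rate * x * out
        return out

if __name__ == "__main__":
    inputs = [0.6, 0.5]
    print("pure IF gate, 5 trials:  ", [if_neuron(inputs, [1.0, 1.0]) for _ in range(5)])
    unit = NoisyAdaptiveNeuron([1.0, 1.0])
    print("noisy/adaptive, 5 trials:", [unit.fire(inputs) for _ in range(5)])
    print("weights after 5 inputs:  ", [round(w, 3) for w in unit.weights])
```

Run it a few times: the second unit’s outputs differ from trial to trial even on identical inputs, and its weights have drifted by the end.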
Though I should point out that the virtual neurons in LLMs are also noisy and sensitive, and the randomness they use ultimately traces back to tiny fluctuations of atomic noise too.
True.
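For what it’s worth, one concrete place randomness enters an LLM’s output is token sampling. A minimal sketch of that step (the toy vocabulary, logits, and temperature are made up for illustration; the seed is drawn from the OS entropy pool, which typical systems feed from physical noise sources):

```python
import math
import os
import random

def sample_token(logits, temperature=0.8, rng=None):
    """Temperature sampling: softmax the logits, then draw a token at random.
    Higher temperature flattens the distribution and adds more variety."""
    rng = rng or random.Random()
    scaled = [l / temperature for l in logits]
    m = max(scaled)                               # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

if __name__ == "__main__":
    vocab = ["cat", "dog", "grandmother", "neuron"]   # made-up toy vocabulary
    logits = [2.0, 1.5, 1.4, 0.3]                     # made-up "model outputs"
    # Seed the sampler from the OS entropy pool rather than a fixed value.
    rng = random.Random(int.from_bytes(os.urandom(8), "big"))
    print([vocab[sample_token(logits, rng=rng)] for _ in range(10)])
```

Swap in a fixed seed and the exact same sequence comes out every run, which is the determinism question again, just one level up.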