I'm Murrin wrote: They're deterministic systems that respond to stimuli. It's remarkable that these chemical processes have developed into such complicated forms, that these emergent systems have appeared at a high level of many interactions, but still, just the results of chemistry. To say otherwise you may as well be suggesting that living things have souls, an essence distinct from their physical matter. We are our minds, and our minds are matter, nothing more.
That's a very interesting rebuttal, and I'm sorry I missed it when the thread flipped to a new page; I only caught it when Av quoted it. You're dipping into the mind/body problem, and fairly criticizing me by pointing out that my position seems to lead to dualism. I'll deal with that in a moment.
First, I don't think we know enough about the brain to say with certainty that brains are "just the results of chemistry." As Av's article points out, we still don't know much about either the global or the fine-detail workings of the brain. It's quite possible that its complexity goes well beyond the level of neurons, down into sub-atomic structures. In fact, in Shadows of the Mind, the theoretical physicist Roger Penrose argues that this is the case, and backs up the idea with recently discovered micro-features of the brain (I've forgotten the term for them, and don't have his book handy at the moment). Some have suggested that the brain acts more like a Bose-Einstein condensate than a mere set of chemical reactions: a form of matter in which quantum effects become apparent on a macroscopic scale.
While this might sound more like pseudo-science than science, it's a fair point that we still think about the brain in terms of classical, Newtonian physics. And it's possible that by doing so we're missing the key features of what makes us conscious.
Computers are really very simple when you get down to their basic operations. They repeat those operations over and over, and perform them very fast, but they're still simple algorithmic operations. Our brain doesn't operate this way. Its structure is highly interconnected and parallel, and its more interesting functions aren't algorithmic at all. (Not to mention that humans can construct things like Gödel's Theorem, which no computer could ever do, because it transcends any particular algorithmic system.)
Perhaps one day a mechanical device will be built that captures all the organization and detail of the brain, but it most likely won't be a computer--not the way we currently understand computers, which are Universal Turing Machines. Some of us might still naively call these new machines "computers," but their function and structure will be entirely different.
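To make the "simple operations" point concrete, here's a minimal sketch (in Python, purely for illustration; the little machine and its rules are made up, not anything from Av's article or Penrose) of the kind of thing a Turing machine does: read a symbol, look up a rule, write, move, repeat.

# transition table: (state, symbol) -> (symbol to write, head move, next state)
RULES = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", "_"): ("_",  0, "halt"),  # blank cell: nothing left to do
}

def run(tape, state="flip", head=0):
    tape = list(tape) + ["_"]          # "_" marks the end of the input
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]  # look up the rule
        tape[head] = write                               # write a symbol
        head += move                                     # move the head
    return "".join(tape).rstrip("_")

print(run("10110"))  # prints 01001: every bit flipped, one cell at a time

A real CPU is vastly faster and has a richer instruction set, but the character of the work is the same: discrete, rule-following steps. That's all I mean by "algorithmic."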
So, getting back to your point about mind/body: my position doesn't necessarily imply dualism, and certainly not a spirit. I think of mind and body as more of a continuum, and I acknowledge that mind arises from certain organizations of matter; but I dispute the claim that computers (at least UTMs) could ever achieve this level of organization, no matter how large or fast we build them, because that structure is most likely quantum in nature, and certainly not algorithmic in function. Nor can this barrier be crossed by changing the instructions we give them. There is no computer command to "feel" or to "value" (for instance), because there is no algorithm which codifies feeling or valuing. [It's arguable whether or not these are necessary for consciousness, but I believe a sense of being personally invested in the world, such that your being matters to you, is one of the key features of consciousness. But that's another issue.]