Skyweir wrote: Or I might be mixing too many metaphors and navigated myself into an irreversible black hole ____

Yeah, it's getting thick. Let me try another angle.
Simulating consciousness is easy. The Eliza program dates back to the mainframe days. Because, remember, this task is merely making a program that SEEMS conscious. It passes a Turing test.
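To make the "SEEMS conscious" point concrete, here is a toy Eliza-style responder in Python. This is a minimal sketch of the pattern-match-and-reflect trick, not Weizenbaum's actual script; the rules and phrasings are made up for illustration. It has no understanding at all, yet its replies can feel conversational:

```python
import re

# Toy ELIZA-style rules (hypothetical, for illustration):
# each rule pairs an input pattern with a reflecting response template.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"because (.*)", "Is that the real reason?"),
]

def respond(text: str) -> str:
    """Match the input against the rules and reflect it back.

    No model of meaning anywhere -- just string surgery.
    """
    text = text.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.fullmatch(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."  # catch-all keeps the conversation going

print(respond("I feel lost in metaphors"))  # Why do you feel lost in metaphors?
```

A few dozen such rules were enough for ELIZA's users to attribute understanding to it, which is exactly the gap between seeming conscious and being conscious.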
If we are after a real consciousness, what we are simulating is a matrix - the raw materials - from which an actual consciousness can emerge. It doesn't just LOOK conscious, it IS conscious. Therefore, it's not simulated.
Which is where first order, second order, etc. effects come in, and where the idea of a good simulation comes in. What we want to do is simulate the first order effects necessary for consciousness, and then hope that the second, third, etc. effects replicate real activities so well that the topmost effect, consciousness, appears.
(This is why I think the gravity analogy is right for the wrong reason. Gravity is matrix material. First order. It is pure simulation, completely unreal. Nor does it need to be real in any way. In other words, it only needs to SEEM like gravity, it doesn't need to IS gravity. Consciousness is a higher order phenomenon. And we aren't searching for SEEMS conscious, but IS conscious. So gravity is not a good analogy to consciousness.)
Skyweir wrote: Our physical, emotional, intellectual make up or sensory structures are immersed with data at constant and regular rates. Flooded with data human consciousness sorts it and makes meaning of it.

Frank Herbert described the purpose of consciousness as a filter for making sense of the sensory flood. Which is interesting to me, insofar as it proposes that self-awareness is a side effect of the necessity of trying to understand what we perceive. A side effect!
The biggest mystery of self-awareness is: why do we need it at all? We would survive just as well being non-self-aware but otherwise intelligent. So maybe an accidental side effect makes sense.