How is it not data? Sensory input is 'converted' into electrochemical signals. That's information: a signal sent to the brain from the sensors. Then the neural networks process that information.
Electrochemical signals aren't in themselves information (though we can use electrochemical signals to transmit information).
Are the electrical currents in your home's wiring information? No. Information is a symbolic representation. Electrical signals can be used to symbolize and model information, but information isn't in electrical signals.
My car receives physical input from my steering wheel. I turn it, and a complex series of actions translates turning-the-steering-wheel into the-front-tires-turning. But this transference of physical action doesn't mean that my steering wheel is processing information. It's merely transferring a physical action, like dominoes falling. The same thing happens with our retina and optic nerves.
Information contains semantic meaning. Electrical signals don't have semantics; they have only form and structure (syntax).
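Here's a rough Python sketch of what I mean (the byte values are arbitrary, picked only for illustration): the very same pattern of bits "says" completely different things depending on which interpretation an outside reader decides to apply. The meaning sits in the interpretation, not in the signal.

[code]
# A rough sketch (arbitrary byte values, chosen only for illustration):
# the same bit pattern "means" different things depending entirely on the
# interpretation an outside reader imposes on it.

raw = bytes([0x48, 0x69, 0x21, 0x00])     # just a fixed pattern of bits

# Read as ASCII text (first three bytes):
as_text = raw[:3].decode("ascii")          # -> "Hi!"

# Read the very same bytes as a little-endian 32-bit integer:
as_int = int.from_bytes(raw, "little")     # -> 2189640

# Read them as two 16-bit numbers instead:
as_pair = (int.from_bytes(raw[:2], "little"),
           int.from_bytes(raw[2:], "little"))

print(as_text, as_int, as_pair)
# The signal never changed; only the semantics we chose to impose on it did.
[/code]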
Are you suggesting a soul or something beyond the brain? What evidence do you have for higher processing beyond the cortical layers of the brain? :)
I'm not suggesting a soul. I'm saying that information only becomes meaningful to a mind. The computer may be able to display a picture of a sunset on its monitor. But just because it can process the correct electrical signals to produce this picture doesn't mean that it understands that it is producing a picture of a sunset. Yet we do understand what our electrical signals are "saying." This understanding is itself something extra, something more than the electrical signals. Otherwise, you'd have to say that our neurons themselves understand the meaning of the signals they are transferring. If you don't allow for a holistic phenomenon--a mind--then you must admit that the individual neurons which transmit the physical impact of a photon upon the retina KNOW that this photon represents a piece of the sun. That's an amazing neuron you've got there.
Computers process data because that's how we design them. We explicitly trace out circuit boards so that electrical currents model logical patterns. There is a purpose in their design, and this purpose is explicitly meaningful. There is no such designed purpose behind our own brains. They just convert one type of signal (light, for example) into another type of signal (electrical). This conversion in itself can't account for meaning. Electricity is no more meaningful than light, and turning one into the other doesn't create meaning. Something else, something extra, is happening.
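A toy sketch of that design point (the voltage levels and threshold below are made-up numbers, not any real hardware spec): the very same physical gate behaviour counts as AND under one labelling convention and as OR under the opposite one. The logic lives in the convention the designer imposes, not in the electricity.

[code]
# Toy model of a physical gate: it only relates voltages to voltages.
# Whether it "computes AND" or "computes OR" is decided by the labelling
# convention a designer imposes on those voltages. (Values are illustrative.)

HIGH, LOW = 5.0, 0.0   # supply rails, in volts

def gate(a_volts: float, b_volts: float) -> float:
    """Purely physical behaviour: output goes HIGH only if both inputs are HIGH."""
    return HIGH if (a_volts > 2.5 and b_volts > 2.5) else LOW

# Convention 1 (active-high): HIGH = True, LOW = False  ->  the gate is AND.
def active_high(v: float) -> bool:
    return v > 2.5

# Convention 2 (active-low): LOW = True, HIGH = False  ->  the same gate is OR.
def active_low(v: float) -> bool:
    return v <= 2.5

for a in (HIGH, LOW):
    for b in (HIGH, LOW):
        out = gate(a, b)
        print(f"active-high: {active_high(a)} AND {active_high(b)} = {active_high(out)}   "
              f"active-low: {active_low(a)} OR {active_low(b)} = {active_low(out)}")
[/code]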
Malik23 wrote:
Irrational processes do not derive from computation.
Yes, they do. It's called biased thinking, or a lack of thinking. Research in psychiatry and psychology points to flawed logical processes behind irrational thoughts (e.g. depression is characterised by biased processing). The brain receives sensory data and processes it based on the nature of the data and a knowledge base. Basic cognitive science.
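A toy sketch of what such a biased process could look like in code (the numbers and the "appraisal" rule are invented purely for illustration, nothing like a clinical model): run the same ambiguous input through an unbiased process and a negatively biased one, and the second reaches systematically distorted conclusions.

[code]
# Toy illustration only: a "biased process" expressed as ordinary computation.
# The bias value and events below are invented; this is not a clinical model.

def appraise(event_valence: float, negativity_bias: float) -> str:
    """Interpret an ambiguous event (-1 = clearly bad .. +1 = clearly good)."""
    perceived = event_valence - negativity_bias   # the bias distorts the input
    return "threat" if perceived < 0 else "benign"

ambiguous_events = [0.1, 0.3, -0.2, 0.05]

# The same sensory data, run through an unbiased and a biased process:
print([appraise(e, negativity_bias=0.0) for e in ambiguous_events])
print([appraise(e, negativity_bias=0.4) for e in ambiguous_events])
# The biased process reads mildly positive input as threatening --
# systematically flawed conclusions produced by a perfectly ordinary algorithm.
[/code]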
If irrational processes can derive from computation, I'd love to see those computations. How do you program a computer to have "flawed logical processes"???
To make artificial minds, one has to try to replicate consciousness. As far as I know, there is no scientific law against it.
Unless you believe that God created all life, you have to accept that eventually we will develop thinking machines. Evolution managed to produce thinking organisms, and so far no one has given a single reason why cognition should reside only in 'organic structures'.
I've said many times here that I do think it will be possible some day to create conscious machines. But they won't be computers. Computers are mere child's toys compared to the machines that will eventually become conscious. This isn't an emotional argument I'm making. It is rooted in mathematics and logic (see the other thread for my reasoning on Gödel's theorem).
Yes, evolution managed to produce thinking organisms. But it didn't do so by making Turing machines. For us to think that we can build conscious, thinking machines without even knowing how nature managed it seems a much greater prejudice than what I'm being accused of. That would be like thinking we can build a flying machine without understanding anything about aerodynamics, lift, and drag--or how birds manage it.
Of course there's no scientific law against creating conscious machines. But there's a clear logical "law" that proves we can't create conscious machines by merely running algorithms.