I think this kind of thing really undermines the plausibility argument for "zombies" (creatures indistinguishable from humans, but who lack qualia)--and plausibility is the only argument there is that it's a coherent notion. We think that we can imagine seeing the color red but having a completely different (in some hard-to-specify sense) feeling of what it's like to see red than someone else who also sees that color and tells us "yep, that's red." From this the argument goes that it's conceivable that someone could say "that's red" whenever we would, but who has nothing that it feels like to see red; there's just a bunch of nerves firing and chemicals changing, but nothing subjective going on. And if it's conceivable, so it is said, then it's possible.
I think, though, that it's harder (maybe even impossible) to imagine a zombie being fooled by this optical illusion; the illusion is, after all, precisely that you're losing the sense of what it is to feel like you're seeing one or more of the yellow dots, even though physically the photons are still hitting your eyes, the nerves are firing, etc. The lights are on, but you're intermittently not home to Mr. Yellow-dot Qualia. Is it conceivable that the zombie sees but isn't really aware of seeing the yellow dot, and also sees but isn't really aware of not seeing the yellow dot, and yet somehow can still distinguish objectively between the two states (so it can describe the illusion) just like someone with a mind? Or does your brain seize up in a kind of concept-induced blindness trying to picture it?
I'm not entirely sure that I find his line of reasoning convincing, however. It seems to me that any qualia-less 'zombie' would still have a sort of conscious/sub-conscious division to his mind. But 'conscious' here refers only to what information is available (i.e. directly accessible) to the central processing module (to use a computing analogy) of his mind. It is important to note that this is a distinct matter from whether phenomenological 'conscious experiences' (qualia) occur.
Given this distinction, it strikes me as entirely plausible that a 'zombie' could have sensory apparatus that detects the yellow dots, and yet have this information restricted to the lower sub-modules of his mind, so that his central module would be entirely unaware of it. Thus a zombie could be just as fooled by the illusion as us, without any need for qualia.
P.S. For any readers with a basic knowledge of computer science, you may find it easier to follow this argument if you imagine a robot, with a specific function responsible for the parsing of perceptual input. This function would pick up all the raw data (including the light from the yellow dots), but the complicated parsing algorithm - fooled by the illusion - may fail to correctly interpret the raw data. Thus, when a different function (say, the speech one, if we want the robot to tell us what it sees) refers to this interpreted data, it will be unaware of any yellow dots.
So, yeah... it seems entirely plausible that a simple robot (let alone a full-blown zombie!) could be fooled by such illusions.
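For the computationally inclined, the robot in the P.S. can be sketched in a few lines of code. Everything here (the function names, the "yellow dot" tokens) is a hypothetical illustration of the analogy, not any real robotics API:

```python
# A minimal sketch of the robot described in the P.S. above. All names
# here are hypothetical illustrations of the analogy.

def parse_perception(raw_data):
    """The perceptual-parsing function: it receives ALL the raw data
    (including the yellow dots), but its interpretation step can be
    'fooled' by the illusion and drop them."""
    # The illusion happens here: the dots are lost during interpretation.
    return [item for item in raw_data if item != "yellow dot"]

def speech_function(interpreted_data):
    """The speech function only ever consults the *interpreted* data,
    so it is 'unaware' of anything the parser discarded."""
    if "yellow dot" in interpreted_data:
        return "I see a yellow dot."
    return "I see no yellow dots."

raw = ["blue background", "yellow dot", "blue background"]
# The raw data contained the dot, yet the robot's report denies it:
print(speech_function(parse_perception(raw)))  # prints "I see no yellow dots."
```

The point of the sketch is just that the speech function has no access to `raw` at all; it can only report on what survived interpretation.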
[Copied from old comments thread]
You have a robot that has conscious awareness of sensory events iff we would have qualia in that situation... so why doesn't that robot have qualia? If qualia isn't having a sensation coupled with the awareness of having that sensation, then what could it possibly be? Some weird metaphysical epiphenomenon, I suppose, but I have trouble grasping why that's a live possibility let alone a compelling answer.
Joshua Macy | Email | Homepage | 5th Apr 04 - 2:34 pm | #
--------------------------------------------------------------------------------
Hi Joshua, good to hear from ya!
I'm afraid I might not have been clear enough about the necessary distinction between two quite different possible meanings of "conscious".
1) Sentience / subjective consciousness - this is the mysterious one, the one you're thinking of, where qualia are involved. The robot is NOT conscious in this sense.
2) Availability of information - even the most basic computer program can partition its data so that some lower-level info is hidden from the decision-making procedures.
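In code terms, sense #2 is just ordinary information hiding. A toy illustration (all names here are hypothetical, invented for this example):

```python
# Toy illustration of sense #2: 'consciousness' as mere availability of
# information. All names are hypothetical, invented for this example.

class ZombieMind:
    def __init__(self, raw_signal):
        self._raw_signal = raw_signal          # lower-level info, kept hidden
        self.available = "nothing unusual"     # all the decision code consults

    def decide_what_to_say(self):
        # The decision procedure consults only the available data,
        # never the hidden raw signal.
        return f"I notice {self.available}."

mind = ZombieMind(raw_signal="yellow dot detected")
print(mind.decide_what_to_say())  # prints "I notice nothing unusual."
```

Nothing here involves qualia; the "division" is just which variables a given procedure reads.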
I must emphasise that when I spoke of a zombie having "a sort of conscious/sub-conscious division", I was talking about the #2 definition above. This use of the word 'conscious' is really just metaphorical, so I apologise if my use of the term was misleading.
If you're interested, Jack Copeland (I think the book's title is "Artificial Intelligence: A Philosophical Introduction") briefly explores three different understandings of "consciousness", of which these are two.
Anyway, the central point to note is that any simple robot (as could be built today) could easily have 'hidden' information (e.g. raw visual data) which is not made available to its other decision-procedures.
That is, its camera would capture the photons (same as our eyes)... but the interpreting algorithm could be fooled into not noticing any yellow dots (same as our brain).
No qualia or "consciousness" (in the #1 sense) necessary.
Richard | 5th Apr 04 - 7:50 pm |