X-Message-Number: 17964
Date: Sat, 17 Nov 2001 19:39:16 -0800
From: Dave Shipman <>
Subject: More on consciousness (plus doubts)

Hello again,

Scott Badger (#17951) writes:

    "I thought before that you were saying that being aware
    of the environment was being conscious, but it now
    sounds like you're more precisely saying that
    appreciating the environment is what makes one
    conscious..."

No, I still mean by consciousness the notion of subjective awareness. The 
various "types" of consciousness are really different types of processes 
going on in our brains that we can be aware of. Our brains are doing the 
computations and we are somehow consciously aware of the results of (some 
of) these computations. I assume we cannot consciously experience anything 
that has not otherwise been physically manifested in the brain. The mystery 
of consciousness is really the question of how these physical states and 
our mental states become correlated. My beach story included elements of 
basic sensory consciousness (smells), higher level sensory awareness (image 
analysis), and feelings of overall pleasantness ("Life is good"). The 
latter was no doubt based on my previous beach experiences, but maybe also 
on some primitive human instinct to enjoy hanging out at the beach.

Desires, emotions, aversions, motivations and so forth are also physically 
based. As implemented in our bodies, they have a strong hormonal component 
as well as a neurological component. I see no reason in principle why they 
could not be coded up in a computer. But that doesn't mean the computer 
would necessarily "feel" them. As Freud pointed out, many of our desires 
and motivations remain unconscious.

Sorry to keep beating a dead horse, but my point is that the computation 
and the qualia associated with the computation are not the same. I might 
eval the following expression in a LISP interpreter:

	(setq goodness-of-life 10)

but the PC would not suddenly start to feel good. And I don't see how that 
would cease to be true even when the system includes lots of associated 
software, such as a "B-brain" subroutine monitoring the goodness-of-life 
variable, or subroutines making "decisions" on the basis of that variable's 
current value. Even if the machine starts to whistle "Zippity Do Da", 
that's just a behavior; it doesn't mean the machine is really happy.
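
To make the point concrete, here is a minimal Common Lisp sketch of the 
sort of "B-brain" monitor I have in mind. The names (goodness-of-life, 
check-mood, whistle, grumble) are purely illustrative, not anything anyone 
has actually built:

	(defvar goodness-of-life 0)

	(defun whistle (tune)
	  (format t "whistling ~a~%" tune))

	(defun grumble ()
	  (format t "grumbling~%"))

	;; The "B-brain": a subroutine that watches goodness-of-life and
	;; "decides" which behavior to emit based on its current value.
	(defun check-mood ()
	  (if (>= goodness-of-life 8)
	      (whistle "Zippity Do Da")
	      (grumble)))

	;; (setq goodness-of-life 10)
	;; (check-mood)   ; prints: whistling Zippity Do Da

Every one of these forms is evaluated by shuffling symbols around. The 
machine emits the whistling behavior, but nothing in that evaluation, as 
far as I can tell, feels good about life.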

Finally, I must admit to my insecurities. My previous postings have been 
written in a sort of "manifesto" style, because I've been trying to hammer 
home a point that I see so many people missing. But I do sometimes worry 
that twenty years from now I will say to myself, "Geez, I can't believe I 
fell for the old consciousness trap. What could I possibly have been 
thinking?" Many people much smarter than I, and who have struggled with 
these problems for much longer, hold opposing views. Still, the conviction 
that consciousness must be reckoned with is unshakable. I find that most 
books with the word "consciousness" in the title are actually explanations 
of brain processes, the physical correlates of our experiences, rather than 
about the subjective experiences themselves and how they come to exist. The 
philosopher David Chalmers, whose site Scott links, refers to this 
distinction as "the easy problem" versus "the hard problem". I believe that 
many thinkers are constrained by the prevailing scientific worldview. 
Within this worldview the notion of consciousness is impossible to define 
or specify. That's why I use stories and metaphors to get across what I 
mean by consciousness. If it could be specifically defined, then we'd just 
write a computer program for it and we'd be done. As I've argued though, 
that can't be right. But then again, I am haunted by the fact that we are 
able to talk and write about our qualia, and how could we do that unless 
the qualia themselves were embedded in the neural substrate? So despite my 
manifestos, I am really at a loss as to what it all means.

	-- Dave Shipman
