Let's start off with a longish quote from the neuroscientist, writer and broadcaster Susan Greenfield:
“The idea [of a conscious robot] is ridiculous. Consciousness entails an interaction between the brain and the body, trafficking a myriad of chemicals between the two. To reproduce that, you would have to build a body with a whole range of chemicals, and the three-dimensionality of the brain would have to be preserved to the very last connection.”
Susan Greenfield puts what can be called the case for biologism when it comes to consciousness. The way Greenfield expresses this position seems to make her an even stronger proponent of biologism than John Searle (who'll be discussed in a moment).
Her argument appears to be excruciatingly simple. It's this:
In order to create artificial consciousness we would need to build a biological brain and a biological body.
Or less extremely:
In order to create artificial consciousness one would need to replicate a biological brain and a biological body.
Or even more simply:
In order to replicate some feature y of a thing x, one would need to replicate everything about x – down to its material constitution.
You could ask what would be the point of replicating biological brains and biological bodies when we already have them. (Though the replication of brains and bodies - regardless of consciousness - would be an incredible thing.) This would be equivalent to a scientific model that literally replicated every aspect of that which it is modelling.
Another way of putting this is to say that Susan Greenfield's position is the exact opposite of functionalism. That is, it's not functions (or computations, algorithms, etc.) which matter to consciousness: it's the material constitution (or “substrate”) which underpins it.
Isn't that why Greenfield talks in terms of “trafficking a myriad of chemicals between” the brain and body and then goes on to say that one would have “to build a body with a whole range of chemicals”, and that “the three-dimensionality of the brain would have to be preserved to the very last connection”?
Here Greenfield goes beyond material constitution (which would include biochemicals) to stress the “three-dimensionality of the brain”: not just the brain's substance, but also its shape and structure. (In any case, three-dimensionality - at least in the abstract - would be easy to replicate.)
Again, it would seem that the replication of consciousness would require the replication of both the brain and the body in their entirety. As a means of replicating consciousness, that would be pointless; though as a means of creating artificial life with consciousness, it would be earth-shattering.
Despite all that, the Chalmers, Penroses and dualists among us may still ask Susan Greenfield and others the following question:
What if we carried out this act of perfect replication (of both brains and bodies) and it still turned out that the replica didn't have consciousness?
Even though that's a subject for another day, I suspect that some philosophers would discount this possibility in an a priori manner.
John Searle
Many philosophers and scientists have called John Searle a dualist. He, in return, says that those who stress function and ignore biology are effectively creating a non-material Cartesian reality populated with functions or computations (rather than with Cartesian “ideas” or “thoughts”). Searle himself writes:
“I believe we are now at a point where we can address this problem as a biological problem [of consciousness] like any other. For decades research has been impeded by two mistaken views: first, that consciousness is just a special sort of computer program, a special software in the hardware of the brain; and second that consciousness was just a matter of information processing. The right sort of information processing – or on some views any sort of information processing – would be sufficient to guarantee consciousness... it is important to remind ourselves how profoundly anti-biological these views are. On these views brains do not really matter. We just happen to be implemented in brains, but any hardware that could carry the program or process the information would do just as well. I believe, on the contrary, that understanding the nature of consciousness crucially requires understanding how brain processes cause and realize consciousness.”
Searle continues:
“Perhaps when we understand how brains do that, we can build conscious artifacts using some nonbiological materials that duplicate, and not merely simulate, the causal powers that brains have. But first we need to understand how brains do it.”
It can be said that there could be an artificial mind without an artificial (human) brain. However, isn't that precisely the claim that's being disputed?
To John Searle, it's all about what he calls “causal powers”.
This refers to the ostensible fact that a certain level of complexity is required to bring about those causal powers which are necessary for intentionality, mind and consciousness.
Despite that, Searle never says (as far as I know) that biological brains are the only things capable - in principle - of bringing about consciousness and intentionality (and therefore semantics, in Searle-speak). He only says that biological brains are the only things known to be complex enough to do so.
So it really is all about the biological and physical complexity of brains and therefore their causal powers.
His basic position on this (like Greenfield's) is that if computationalists or functionalists, for example, ignore the physical biology of brains and focus exclusively on syntax, computations or functions (the form/role rather than the physical embodiment), then that will surely lead to a kind of dualism. What he means by this is that a radical disjunction is created between the actual physical reality of the brain and how these philosophers explain - or account for - intentionality, mind and consciousness.
Again, Searle doesn't believe that only brains can give rise to minds. Searle's position is that only brains do give rise to minds. He's emphasising an empirical fact; though he's not denying the logical and metaphysical possibility that other things can bring forth minds.
Gerald Edelman also holds the position that the mind “can only be understood from a biological standpoint, not through physics or computer science or other approaches that ignore the structure of the brain”.
Then Edelman - in order to demonstrate his point - cites the seemingly extreme position of “functionalists” (such as Marvin Minsky) who “say they can build an intelligent being without paying attention to anatomy”.
So if one says that biology matters, one's also saying that functions aren't everything (though not that functions are nothing).
Finally, according to Francis Crick, psychologists (as well as philosophers) “have treated the brain as a black box, which can be understood in terms merely of inputs and outputs rather than of internal mechanisms”.
Thus, to Greenfield, Searle, Edelman and Crick, consciousness really is all about biological brains.
References
Crick, Francis. (1996) quoted in The End of Science, by John Horgan
Edelman, Gerald. (1996) quoted in The End of Science, by John Horgan
Greenfield, Susan. (1999) quoted in Predictions: 30 Great Minds on the Future (edited by Sian Griffiths)
Searle, John. (1999) 'Consciousness'