Believers in Strong AI hold (to put it very simply) that if computers behave in certain ways, then they have intelligence and even minds.
Not only can you be a functionalist when it comes to the mind, you can also be a functionalist when it comes to life itself.
According to John Horgan, Christopher Langton, of the Santa Fe Institute, “described himself as a functionalist, who believed life was characterised by what it did rather than by what it was made of” (200).
Horgan elaborates:

“If a programmer created molecule-like structures that, following certain laws, spontaneously organised themselves into entities that could seemingly eat, reproduce, and evolve, Langton would consider those entities to be alive – 'even if they're in a computer'” (200).
One can ask here why Horgan uses the words “seemingly eat” rather than the simple “eat”. If artificial beings eat, then they eat. That is, they gain some kind of energy or nutrition from what they eat – even if what they eat isn't organic.
In addition, why would artificial life automatically need to evolve? Since it would be artificial, there's no automatic reason why evolution should also apply to artificial life. Then again, there's no automatic reason why such artificial entities shouldn't evolve either. It depends on the nature of the artificial beast.
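To make Langton's scenario a little more concrete, here is a minimal toy sketch of the kind of simulation he describes – my own illustration, not Langton's actual work, and every name and parameter in it (Creature, efficiency, REPRODUCTION_COST and so on) is invented for the example. The entities “eat” by drawing energy from a shared food supply, “reproduce” by copying themselves when they have enough energy, and “evolve” because offspring inherit a slightly mutated trait.

import random

# A hypothetical sketch of digital entities that "eat", "reproduce",
# and "evolve" in the purely functional sense under discussion.
# Nothing here is Langton's code; names and numbers are illustrative.

REPRODUCTION_COST = 10.0   # energy spent to produce one offspring
MUTATION_SPREAD = 0.05     # how far a child's trait can drift from its parent's

class Creature:
    def __init__(self, efficiency):
        # 'efficiency' is the heritable trait: what fraction of each
        # unit of food is converted into usable energy.
        self.efficiency = efficiency
        self.energy = 5.0

    def eat(self, food):
        # "Eating": the creature gains energy from non-organic 'food'.
        self.energy += food * self.efficiency

    def try_reproduce(self):
        # "Reproduction": if energy permits, spawn a mutated copy.
        if self.energy < REPRODUCTION_COST * 2:
            return None
        self.energy -= REPRODUCTION_COST
        child_trait = self.efficiency + random.gauss(0, MUTATION_SPREAD)
        return Creature(max(0.0, min(1.0, child_trait)))

def run(steps=50, food_per_step=100.0):
    population = [Creature(random.uniform(0.1, 0.9)) for _ in range(20)]
    for _ in range(steps):
        if not population:
            break
        share = food_per_step / len(population)  # food is split evenly
        offspring = []
        for c in population:
            c.eat(share)
            c.energy -= 3.0                      # metabolic upkeep
            child = c.try_reproduce()
            if child:
                offspring.append(child)
        # Creatures that run out of energy "die".
        population = [c for c in population if c.energy > 0] + offspring
    if population:
        avg = sum(c.efficiency for c in population) / len(population)
        print(f"survivors: {len(population)}, mean efficiency: {avg:.2f}")

if __name__ == "__main__":
    run()

Run for enough steps, the heritable efficiency trait tends to drift upwards, since more efficient creatures reproduce more often – a crude analogue of selection. Whether entities like these thereby count as alive is, of course, exactly the point at issue.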
Of course these artificial entities could do all the things mentioned above and still not be conscious or have minds. They could eat, reproduce and evolve without having minds or displaying conscious states. Eating, reproducing and evolving simply don't entail mind or consciousness.
However, Langton himself seems to think that such things do entail consciousness – or at least the possibility of pain. He says:
“I like to think that if I saw somebody sitting next to me at a computer terminal who is torturing these creatures.... I would try to get this guy some psychological help!”
I presume that if these 'creatures' can feel pain, then they must also display that pain. How would they do so? Again, artificial eating, artificial evolution and artificial reproduction don't entail consciousness or mind, and therefore they don't entail pain. How would he (or we) know that his artificial creatures felt pain? (How would we know that even if they displayed 'pain behaviour'?)
Horgan goes into more detail as regards Langton's life-functionalism. He writes that Langton

“wanted people to realise that life might be a process that could be implemented by any number of arrangements of matter, including the ebb and flow of electrons in a computer” (200).
Horgan then quotes Langton:

“At some level the actual physical realization is irrelevant to the functional properties. Of course there are differences. There are going to be differences if there's a different material base. But are the differences fundamental to the property of being alive or not?”
It seems quite incredible that Langton should argue that the 'material base' isn't fundamental. Or at least he says that it may not be fundamental. Then again, it may well be fundamental. After all, it's a simple fact that all the living things we know of are organic, not artificial. The inductive evidence supports the position that physical constitution is important and fundamental. That just seems obvious.
Indeed, isn't it the case that, functionally speaking, we've already replicated many of the things about life and mind that we wanted to replicate? So why haven't we actually got life or mind at this juncture? What's the missing ingredient? The functional or computational realities of computers and whatnot are already highly complex – so what's missing? Is the missing link biology – or the special qualities of the organic – after all?
Perhaps instead of replicating functions (such as computations, etc.), the scientists of artificial life and artificial mind should attempt to replicate biological matter (or the brain). Though of course that would be fiendishly complex, and it's not in sight at the moment. And that's partly why functions (rather than material bases) are emphasised so much in the AI and AL literature.