Saturday, 18 October 2014

Searle on As-If Intentionality & Linguistic Meaning




According to John Searle, intentionality and consciousness are inextricably linked. Nevertheless, from the following it can be seen that some things can have intentionality without having consciousness. More precisely, if intentionality comes along with consciousness, then such things have "as-if intentionality" without consciousness; they don't have (pure) intentionality without consciousness. In other words, as-if intentionality is parasitical on real intentionality, and intentionality depends on consciousness. Thus as-if intentionality is parasitical on consciousness: it's parasitical on human beings or, more precisely, on persons.

Searle gives "linguistic meaning" as an example of intentionality, even if it is only as-if intentionality. Linguistic meaning is "a real form of intentionality"; though it's not "intrinsic" - it is extrinsic or as-if. It's dependent upon the intentionality of those who use linguistic meanings. That is, on us. Thus linguistic meaning is also dependent upon consciousness, which has intrinsic intentionality and is intrinsic in and of itself.

Searle gives other examples of as-if - or extrinsic - intentionality, which display themselves in the perceptual, epistemic and cognitive metaphors we apply to inanimate objects. For example:


"I say about my thermostat that it perceives changes in the temperature; I say of my carburettor that it knows when to enrich the mixture; and I say of my computer that its memory is bigger than the memory of the computer I had last year." (500)


The as-if-ness of thermostats, carburettors and computers is "derived" from the fact that these inanimate objects have been designed to perceive, know and memorise. Yet this is only as-if perception, as-if knowledge or as-if memory: it's dependent on human perception, knowledge and memory. More precisely, such objects are designed to replicate or mimic - though not actually be - perceiving or knowing things, or things capable of memorising. Perception, knowledge and memory require real - or intrinsic - intentionality, not as-if intentionality. Thermostats, carburettors and computers have a degree of as-if intentionality, derived from (our) intrinsic intentionality. Nonetheless, despite all these qualifications, as-if intentionality is still "real" intentionality, albeit derived (just as a game is derived from its parts yet is nonetheless real). This is especially true in the case of linguistic meaning.

As Searle puts it:


"There is nothing metaphorical or as-if about saying that certain sentences mean certain things or certain maps are correct representations of the state of California or the certain pictures are pictures of Winston Churchill. These forms of intentionality are real, but they are derived from the intentionality of human agents." (500)


There's nothing "metaphorical" about the fact that linguistic meanings actually "mean certain things" because meanings were never supposed to have real - or intrinsic - intentionality or aboutness in the first place. (What about non-linguistic or abstract meanings?) No one ever thought that words or linguistic meanings have consciousness or intentionality. (Some people do, however, when it comes to computers - some even when it comes to thermostats!) So the word ‘cat’ means cat in a thoroughly intentional sense; though the word’s (or meaning’s) intentionality is derived or as-if, not intrinsic. The meaning of ‘cat’ depends on the consciousness of the human beings who use the word ‘cat’ and think about cats. The same can be said about "maps of California" or "pictures of Winston Churchill" (500). In the case of pictures and maps, the as-if intentionality is perhaps more obvious or direct because maps and pictures are depictive, unlike words and linguistic meanings. Though that doesn’t alter the issue here.

Another way of looking at thermostats, carburettors and computers (and perhaps linguistic meanings, maps and pictures, though less so) is to say that we can take an "intentional stance" towards them (501). That is, we can treat them - or take them - as intentional though inanimate objects; or we can take them as as-if intentional objects. It's not clear whether it makes as much sense to take an intentional stance towards linguistic meanings, maps or pictures as it does towards thermostats, carburettors or computers. We're more likely to see a computer - or even a thermostat - as an intentional object than we are to take linguistic meanings, maps or pictures that way. It would make little sense to take an intentional stance towards linguistic meanings, maps or pictures because, primarily, they aren't concrete things as such. They're about things (as with meanings) and of things (as with maps and pictures); they aren't themselves things in the way computers, thermostats and carburettors are concrete things.

