While thinking about the implications of a recent post it occurred to me that philosophy is almost an entirely syntactic enterprise, and pays little more than lip service to semantics. To me, this was a surprising realization. No doubt it explains why my own ideas are very different from those expressed by philosophers. For I have long considered semantics to be the primary concern.
For perspective, let’s start with the Shannon-Weaver model of communication:
- We start with something meaningful, which we describe as semantic.
- The meanings or semantics are encoded into a stream of symbols, often taken to be binary digits. This is the encoding step.
- The stream of symbols is what Shannon referred to as “information”, and is often called “Shannon information”. Shannon’s information theory deals with the communication and manipulation of the symbol stream, and whatever meaning was encoded in that symbol stream is not relevant to the use of the Shannon information. The symbol stream is usually considered to be syntactic, with the symbols as syntactic elements.
- The final receiver(s) of the symbol stream decode it in order to recover the semantics.
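The four steps above can be sketched as a toy program. This is only an illustrative sketch, not anything from Shannon's own work: it encodes a text message into a stream of binary symbols, and the point to notice is that the channel and the decoder manipulate only the symbols themselves; whatever meaning the sender attached to "hello" never enters the code at all.

```python
def encode(message: str) -> str:
    """Encode a message into a stream of binary symbols (bits)."""
    return "".join(format(byte, "08b") for byte in message.encode("utf-8"))

def decode(bits: str) -> str:
    """Recover the original message from the bit stream."""
    data = bytes(int(bits[i : i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stream = encode("hello")          # the "Shannon information": a symbol stream
assert set(stream) <= {"0", "1"}  # the channel sees only syntactic symbols
assert decode(stream) == "hello"  # the receiver recovers the encoded string
```

Note that what the receiver recovers is the original string of symbols, not the meaning; the semantics remains with the humans at either end of the channel, which is exactly the division the Shannon-Weaver model makes.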
When we consider natural language communication, the natural language statements fit the requirements for Shannon information. For written communication, the alphabetic letters are the symbols, while for spoken communication the phonemes are the symbols.
Much of philosophy deals with representations, which are normally taken to be something like natural language statements and are variously referred to as propositions, statements, beliefs, or mental attitudes. These representations are said to be intentional, which implies that they are about something. That is, the representations are taken to result from the encoding process in step 2 of the Shannon-Weaver model. However, what is encoded appears to play no role. Philosophers normally see representations as being used with logic. But while the syntactic strings may encode semantics, logical inference operates on form alone, so the encoded semantics is not actually relevant. Fodor's "Methodological solipsism" thesis helps to make this clear.
Truth is considered an important property of some representations, and one might wonder whether truth is an implicit reference to the semantics. However, truth is usually taken to be “correspondence with the facts” where facts are in turn taken to be metaphysical entities that are in the form of representations (i.e. are syntactic entities).
Philosophers do, of course, discuss semantics. However, they generally try to treat meanings as defined by the truth conditions of statements. That amounts to attempting to provide a syntactic account of semantics.
When knowledge is defined as justified true belief, it is usually treated as if knowledge amounts to stored representations, which would be an entirely syntactic account of knowledge. Perception is typically taken as a system that delivers mostly true representations.
The idea of AI (Artificial Intelligence) fits quite well here. If everything cognitively important is syntactic, then it ought to be possible to automate that on a computer.
My own contrasting view
I see the mind as primarily semantic. I see perception as providing us access to the semantic world. This is more or less consistent with J.J. Gibson’s ideas on direct perception. Thus perception gives us access to the semantics of the world, and we ourselves then select from that and encode our selection into syntactic form for communication with speech. Of course the brain is used for this encoding, but it would be parts of the brain associated with language, rather than parts of the brain associated with perception. I see knowledge as mostly in the form of abilities. These would include the abilities to perceive those semantic aspects of the world that are of interest, as well as the ability to encode that into syntactic form to use in speech.
Related to this, I see truth as not being a criterion for perception because truth is a criterion for syntactic representations rather than for unrepresented semantics. Similarly, I see no role for the storage or retrieval of representations and I am skeptical of the possibility of AI.