The compositionality principle is a strong constraint frequently invoked to explain systematic aspects of semantics (FODOR 1994). According to this principle, the meaning of an expression is entirely deducible from the meanings of its components. The principle alone, however, does not say how composed meanings are formed. A classical way to fill the gap is to assign semantic structures to lexical entities that carry all the instructions necessary for successful composition to occur (JACKENDOFF 1983; 1990; PUSTEJOVSKY 1995; SOWA 1984). The mechanism, metaphorically, is not unlike biochemical combination, in which complex molecules combine highly selectively into larger assemblages held together by electrostatic forces. Similarly, lexical meanings would consist of rich structures that assemble into larger meaningful structures through a mere matching mechanism. Artificial intelligence provides a variety of formalisms that implement such combinatorial machinery. The problem of meaning composition would thus be easily solved... Unfortunately, we must face a list of difficulties.
- Assigning semantic structures to lexical entities presupposes that identical mechanisms are at work at the supra-lexical and intra-lexical levels of meaning. While supra-lexical meaning structures are directly motivated by linguistic combination through syntax, intra-lexical meaning structures, as generally proposed, are arbitrary solutions to internal constraints of the adopted model, with little guarantee that they mirror any cognitive reality.
- Structural compositionality is monotonic. Meaning composition predicts ever-growing structures, which become increasingly complex as discourse interpretation proceeds, with no systematic way to merge such structures back into simpler ones.
- Lexical semantic structures should be grounded in perception. This implies that perceptual structures be replicated as conceptual structures. Such replication is necessarily partial and inaccurate, in a way that linguistic performance refutes. Meaning is omnipotent: it can be projected onto any aspect of perception. Meaning is unbounded: every perceptual subtlety can be conceptualised. By contrast, lexical conceptual structures are necessarily finite in variety and in precision.
- This way of explaining meaning composition amounts to a mere translation from linguistic expressions into conceptual structures. This second, mental, language needs a lexicon, i.e. a conceptual ontology. Attempts to define the list of basic elements of such an ontology run into considerable difficulties, and nothing would explain their systematic behaviour (FODOR 1998).
Most attempts to solve the compositionality problem have led authors to assign fixed conceptual structures to words, using formalisms such as logical expressions, feature structures, frames, schemas, or graphs. Our claim is that any such enterprise must fail (GHADAKPOUR 2003).
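To make concrete what such a fixed-structure approach looks like, here is a minimal sketch of the "matching mechanism" discussed above, implemented as recursive unification of feature structures. This is a generic textbook construction, not any of the cited authors' formalisms; the feature names and values are our own illustrative assumptions.

```python
def unify(a, b):
    """Unify two feature structures (nested dicts); return None on clash."""
    if isinstance(a, dict) and isinstance(b, dict):
        result = dict(a)
        for key, value in b.items():
            if key in result:
                merged = unify(result[key], value)
                if merged is None:
                    return None          # incompatible features: composition fails
                result[key] = merged
            else:
                result[key] = value      # new feature: simply accumulate it
        return result
    return a if a == b else None         # atomic values must match exactly

# Hypothetical example: a verb slot that selects a liquid object,
# and a noun ("water") that satisfies the constraint.
verb_slot = {"cat": "NP", "sem": {"type": "liquid"}}
noun      = {"cat": "NP", "sem": {"type": "liquid", "ref": "water"}}
print(unify(verb_slot, noun))
# A clashing noun, e.g. {"sem": {"type": "solid"}}, yields None instead.
```

Note how the mechanism is purely monotonic: successful unifications only ever accumulate features, which is precisely the growth problem raised in the second difficulty above.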
We suggest another solution. We accept the possibility that meaning representations are perceptual by nature. Meaning construction then requires a conceptual interface between symbolic representations (words, grammatical structures, logical operations) on one side and perceptual qualities on the other. We propose to implement this interface through a systematic device consisting of a contrast operator and a minimal set of topological grids (GHADAKPOUR 2003). Permanent concepts, in our model, are nothing but an illusion. Conceptual representations are ephemeral: they are constructed “on the fly” by the conceptual interface, and they do not survive the context that gave birth to them.
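The flavour of such ephemeral, contrast-driven representations can be suggested by a toy sketch. This is our own illustrative construction, not the actual model of GHADAKPOUR (2003): a contrast operator that characterises a target only by the features distinguishing it from the current context, so that the resulting representation exists only relative to that context and is never stored.

```python
def contrast(target, context):
    """Return the features of `target` shared by no item in `context`."""
    return {
        (feature, value)
        for feature, value in target.items()
        if all(item.get(feature) != value for item in context)
    }

# Hypothetical perceptual descriptions of three objects on a table.
mug   = {"shape": "cylinder",   "colour": "red",   "size": "small"}
bowl  = {"shape": "hemisphere", "colour": "red",   "size": "small"}
plate = {"shape": "disc",       "colour": "white", "size": "large"}

# Against this context, colour and size do not discriminate the mug; shape does.
print(contrast(mug, [bowl, plate]))   # {('shape', 'cylinder')}
```

Change the context and the representation changes with it: nothing about the mug is conceptualised once and for all, which is the sense in which permanent concepts are an illusion.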
JERRY A. FODOR, 1998, Concepts: Where Cognitive Science Went Wrong, Oxford University Press.
LALEH GHADAKPOUR, 2003, Le système conceptuel, à l'interface entre le langage, le raisonnement et l'espace qualitatif : vers un modèle de représentations éphémères, PhD Dissertation, École Polytechnique.
RAY JACKENDOFF, 1983, Semantics and Cognition, MIT Press.
RAY JACKENDOFF, 1990, Semantic Structures, MIT Press.
JAMES PUSTEJOVSKY, 1995, The Generative Lexicon, MIT Press.
JOHN F. SOWA, 1984, Conceptual Structures: Information Processing in Mind and Machine, Addison-Wesley.