Complexity in linguistic theorizing
- Source: The Mental Lexicon, Volume 9, Issue 2, Jan 2014, pp. 144–169
Abstract
The general notion of ‘complexity’ is discussed on the basis of foundational ideas by Herbert Simon and Nicholas Rescher. An analytic overview is provided of the ways in which language complexity has been treated in linguistic theories during the past 200 years. The Schlegel brothers, Humboldt, and Schleicher developed the first theory of complexity with their tripartition of languages into progressively complex morphological types. Humboldt also formulated the principle of One Meaning – One Form (Humboldt’s Universal), which has turned out to be a widespread tendency in the simplification of morphological complexity. Jespersen’s Ease Theory has been influential in highlighting many instances of phonological change. The main contribution of Structuralism was Markedness Theory, the idea that language at all levels is built on minimal oppositions in which one term (e.g. voiceless or singular) is more basic than its complex counterpart (e.g. voiced or plural). Syntactic complexity came to the fore with the theory of Immediate Constituents, which provided tools for measuring syntactic depth. Generative linguistics tried to measure syntactic complexity by devising Evaluation Measures based on symbol counting. Current linguistics offers a plethora of empirical studies at all levels, invoking considerations of system, processing, and cognitive complexity alike.