Evolutionary Linguistic Theory - Volume 3, Issue 2, 2021
Nonderived environment blocking and input-oriented computation
Author(s): Jane Chandlee
pp. 129–153
Abstract: This paper presents a computational account of nonderived environment blocking (NDEB) which indicates that the challenges it has posed for phonological theory do not stem from any inherent complexity of the patterns themselves. Specifically, it makes use of input strictly local (ISL) functions, which are among the most restrictive (i.e., lowest computational complexity) classes of functions in the subregular hierarchy (Heinz 2018), and shows that NDEB is ISL provided the derived and nonderived environments correspond to unique substrings in the input structure. Using three classic examples of NDEB from Finnish, Polish, and Turkish, it is shown that the distinction between derived and nonderived sequences is fully determined by the input structure and can be achieved without serial derivation or intermediate representations. This result reveals that such cases of NDEB are computationally unexceptional and lends support to proposals in rule- and constraint-based theories that make use of its input-oriented nature.
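To make the key idea concrete, here is a minimal sketch (not the paper's formalism) of an input strictly local map, assuming a toy Finnish-style assibilation rule (t becomes s before i) that applies only across a morpheme boundary, marked here with a hypothetical "+" symbol in the input. Each output decision consults only a bounded window of the input, never earlier outputs, which is the sense in which the computation is input-oriented.

```python
# A minimal, hypothetical sketch of an input strictly local (ISL) function:
# each output symbol is decided by a bounded window over the INPUT alone.
# Toy rule (an assumption, not the paper's analysis): t -> s before i, but
# only in a derived environment, marked here by a "+" morpheme boundary.

def isl_assibilation(word: str, window: int = 3) -> str:
    out = []
    for i, ch in enumerate(word):
        ctx = word[i:i + window]      # bounded lookahead into the input
        if ch == "t" and ctx.startswith("t+i"):
            out.append("s")           # derived environment "t+i": rewrite
        elif ch != "+":
            out.append(ch)            # nonderived "ti" is left untouched
        # the boundary symbol itself is deleted from the output
    return "".join(out)

print(isl_assibilation("halut+i"))  # halusi  (derived environment, applies)
print(isl_assibilation("tila"))     # tila    (nonderived environment, blocked)
```

Because the derived and nonderived environments correspond to distinct input substrings ("t+i" versus "ti"), a single pass with a fixed window suffices; no serial derivation or intermediate representation is needed.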
The computational unity of Merge and Move
Author(s): Thomas Graf
pp. 154–180
Abstract: Based on a formal analysis of the operations Merge and Move, I provide a computational answer to the question why Move might be an integral part of language. The answer is rooted in the framework of subregular complexity, which reveals that Merge is most succinctly analyzed in terms of the formal class TSL. Any cognitive device that can handle this level of complexity also possesses sufficient resources for Move. In fact, Merge and Move are remarkably similar instances of TSL. Consequently, Move has little computational or conceptual cost attached to it and comes essentially for free in any grammar that expresses Merge as compactly as possible.
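As a rough illustration of what a TSL (tier-based strictly local) dependency looks like, the sketch below checks a toy string language: project a designated tier and impose strictly local constraints on the projected tier only. The tier alphabet (a hypothetical mover "w" and landing site "C") and the constraint that every "w" must be followed by a "C" on the tier are illustrative assumptions, not Graf's actual analysis of Merge and Move.

```python
# Hypothetical sketch of a tier-based strictly local (TSL) check: erase all
# symbols outside the tier, then ban certain adjacent pairs on what remains.
# Toy constraint (an assumption): a mover "w" must be followed on the tier
# by a landing site "C" before the next "w" or the end of the string.

def tsl_ok(string, tier=("w", "C"), forbidden={("w", "w"), ("w", "#")}):
    projected = ["#"] + [s for s in string if s in tier] + ["#"]
    bigrams = zip(projected, projected[1:])   # strictly 2-local on the tier
    return all(pair not in forbidden for pair in bigrams)

print(tsl_ok("a w b v C c".split()))  # True: w is matched by a later C
print(tsl_ok("a w b v c".split()))    # False: w never reaches a C
```

Material outside the tier is invisible to the constraint, so a dependency that is unbounded in the string remains strictly local on the tier; this is the general mechanism that lets Merge-like and Move-like dependencies receive the same kind of treatment.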
Variation in mild context-sensitivity
Author(s): Robert Frank and Tim Hunter
pp. 181–214
Abstract: Aravind Joshi famously hypothesized that natural language syntax was characterized (in part) by mildly context-sensitive generative power. Subsequent work in mathematical linguistics over the past three decades has revealed surprising convergences among a wide variety of grammatical formalisms, all of which can be said to be mildly context-sensitive. But this convergence is not absolute. Not all mildly context-sensitive formalisms can generate exactly the same stringsets (i.e. they are not all weakly equivalent), and even when two formalisms can both generate a certain stringset, there might be differences in the structural descriptions they use to do so. It has generally been difficult to find cases where such differences in structural descriptions can be pinpointed in a way that allows linguistic considerations to be brought to bear on choices between formalisms, but in this paper we present one such case. The empirical pattern of interest involves wh-movement dependencies in languages that do not enforce the wh-island constraint. This pattern draws attention to two related dimensions of variation among formalisms: whether structures grow monotonically from one end to the other, and whether structure-building operations are conditioned by only a finite amount of derivational state. From this perspective, we show that one class of formalisms generates the crucial empirical pattern using structures that align with mainstream syntactic analysis, and another class can only generate that same string pattern in a linguistically unnatural way. This is particularly interesting given that (i) the structurally inadequate formalisms are strictly more powerful than the structurally adequate ones from the perspective of weak generative capacity, and (ii) the formalisms based on derivational operations that appear on the surface to align most closely with the mechanisms adopted in contemporary work in syntactic theory (merge and move) are the ones that fail to align with the analyses proposed in that work when the phenomenon is considered in full generality.
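The weak/strong distinction the authors rely on can be illustrated with a deliberately simple, hypothetical pair of grammars: both generate exactly the strings a^n (they are weakly equivalent), but one assigns right-branching and the other left-branching structural descriptions. The paper's own comparison concerns mildly context-sensitive formalisms and wh-movement, not this toy case.

```python
# Toy illustration (not from the paper) of weak vs. strong equivalence:
# two grammars generate exactly the same strings a^n, yet assign them
# different structural descriptions.

def right_branching(n: int) -> str:
    # G1: S -> a S | a   (structure grows at the right edge)
    return "a" if n == 1 else f"(a {right_branching(n - 1)})"

def left_branching(n: int) -> str:
    # G2: S -> S a | a   (structure grows at the left edge)
    return "a" if n == 1 else f"({left_branching(n - 1)} a)"

print(right_branching(3))  # (a (a a))
print(left_branching(3))   # ((a a) a)
```

The same string "aaa" is derived either way; only the structural descriptions differ, and it is precisely this kind of difference that the paper uses to bring linguistic evidence to bear on the choice between formalisms.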
Mixed computation
Author(s): Diego Gabriel Krivochen
pp. 215–244
Abstract: Proof-theoretic models of grammar are based on the view that an explicit characterization of a language comes in the form of the recursive enumeration of strings in that language. That recursive enumeration is carried out by a procedure which strongly generates a set of structural descriptions Σ and weakly generates a set of strings S; a grammar is thus a function that pairs an element of Σ with elements of S. Structural descriptions are obtained by means of context-free phrase structure rules or via recursive combinatorics, and structure is assumed to be uniform: binary branching trees all the way down. In this work we analyse natural language constructions for which such a rigid conception of phrase structure is descriptively inadequate and propose a solution to the problem of phrase structure grammars assigning too much or too little structure to natural language strings: we propose that the grammar can oscillate between levels of computational complexity in local domains, which correspond to elementary trees in a lexicalised Tree Adjoining Grammar.
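As a very rough, hypothetical illustration of mixing complexity levels in local domains, the sketch below builds one domain as a flat, iteratively generated sequence (finite-state-like, with no internal hierarchy) and another as a uniformly binary-branching tree (context-free-like), rather than forcing binary branching everywhere. The example phrases are placeholders, not the constructions analysed in the paper.

```python
# Hypothetical sketch: two local domains built at different levels of
# structural complexity, instead of binary branching "all the way down".

def flat_domain(words):
    # Finite-state-like iteration: one flat constituent, no internal hierarchy.
    return tuple(words)

def binary_domain(words):
    # Context-free-like recursion: uniformly binary right-branching structure.
    return words[0] if len(words) == 1 else (words[0], binary_domain(words[1:]))

# A string whose parts receive different amounts of structure.
structure = (binary_domain(["the", "old", "dog"]),
             flat_domain(["barked", "and", "growled", "and", "howled"]))
print(structure)
# (('the', ('old', 'dog')), ('barked', 'and', 'growled', 'and', 'howled'))
```

The point of the toy is only that neither representation is forced on the other domain: assigning the flat domain a nested binary parse would add structure with no descriptive payoff, while flattening the nested domain would lose it.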
Review of De Smedt & De Cruz (2020): The Challenge of Evolution to Religion
Author(s): Carlo Brentari
pp. 245–254
This article reviews The Challenge of Evolution to Religion (De Smedt & De Cruz 2020).