Lexical functional grammar


Lexical functional grammar is a constraint-based grammar framework in theoretical linguistics. It posits two separate levels of syntactic structure: a phrase structure grammar representation of word order and constituency, and a representation of grammatical functions such as subject and object, similar to dependency grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the transformational grammar that was then current. It mainly focuses on syntax, including its relation with morphology and semantics; there has been little LFG work on phonology.

Overview

LFG views language as being made up of multiple dimensions of structure. Each of these dimensions is represented as a distinct structure with its own rules, concepts, and form. The two primary structures that have figured in LFG research are c-structure (constituent structure), a phrase structure tree representing word order and phrasal constituency, and f-structure (functional structure), an attribute-value representation of grammatical functions and features.
For example, in the sentence The old woman eats the falafel, the c-structure analysis is that this is a sentence which is made up of two pieces, a noun phrase and a verb phrase. The VP is itself made up of two pieces, a verb and another NP. The NPs are also analyzed into their parts. Finally, the bottom of the structure is composed of the words out of which the sentence is constructed. The f-structure analysis, on the other hand, treats the sentence as being composed of attributes, which include features such as number and tense or functional units such as subject, predicate, or object.
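As a rough illustration, the two structures for this sentence can be sketched as plain Python data: the c-structure as a nested tree of category-labelled nodes, and the f-structure as a nested attribute-value mapping. The feature names used here (PRED, SUBJ, OBJ, TENSE, NUM, DEF, ADJUNCT) follow common LFG notation, but the exact feature inventory is an assumption of this sketch rather than a fixed part of the theory.

```python
# Sketch of the LFG analysis of "The old woman eats the falafel".

# C-structure: a phrase structure tree, here as nested (label, children) pairs.
c_structure = (
    "S", [
        ("NP", [("D", ["the"]), ("A", ["old"]), ("N", ["woman"])]),
        ("VP", [
            ("V", ["eats"]),
            ("NP", [("D", ["the"]), ("N", ["falafel"])]),
        ]),
    ],
)

# F-structure: an attribute-value matrix, here as nested dictionaries.
f_structure = {
    "PRED": "eat<SUBJ, OBJ>",          # the predicate and the functions it governs
    "TENSE": "present",
    "SUBJ": {"PRED": "woman", "DEF": True, "NUM": "sg", "ADJUNCT": ["old"]},
    "OBJ":  {"PRED": "falafel", "DEF": True, "NUM": "sg"},
}
```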
Other structures have also been hypothesized in LFG work, and the various structures can be said to be mutually constraining.
The LFG conception of linguistic structure differs from Chomskyan theories, which have always involved separate levels of constituent structure representation mapped onto each other sequentially, via transformations. The LFG approach has had particular success with nonconfigurational languages, languages in which the relation between structure and function is less direct than it is in languages like English; for this reason LFG's adherents consider it a more plausible universal model of language.
Another feature of LFG is that grammatical-function-changing operations like passivization are relations between word forms rather than between sentences. This means that the active-passive relation, for example, is a relation between two types of verb rather than two trees. Active and passive verbs involve alternative mappings of the participants to grammatical functions.
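As a rough sketch of this idea, the active and passive forms of a verb can be represented as two lexical entries that differ only in how they map participant roles onto grammatical functions. The role and function labels below (agent, patient, SUBJ, OBJ, OBL_AG) are illustrative assumptions, not a specific published LFG analysis.

```python
# Active-passive as a lexical relation: two entries for one verb, differing
# only in the mapping from participant roles to grammatical functions.
active_eat = {
    "form": "eats",
    "mapping": {"agent": "SUBJ", "patient": "OBJ"},
}

passive_eaten = {
    "form": "is eaten",
    # the agent, if expressed at all, surfaces as an oblique by-phrase
    "mapping": {"patient": "SUBJ", "agent": "OBL_AG"},
}

def grammatical_function(entry, role):
    """Return the grammatical function a participant role is mapped to."""
    return entry["mapping"].get(role)

assert grammatical_function(active_eat, "patient") == "OBJ"
assert grammatical_function(passive_eaten, "patient") == "SUBJ"
```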
Through the positing of productive processes in the lexicon and the separation of structure and function, LFG is able to account for syntactic patterns without the use of transformations defined over syntactic structure. For example, in a sentence like What did you see?, where what is understood as the object of see, transformational grammar puts what after see in "deep structure", and then moves it. LFG analyzes what as having two functions: question-focus and object. It occupies the position associated in English with the question-focus function, and the constraints of the language allow it to take on the object function as well.
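A minimal sketch of this analysis, again using nested Python dictionaries as stand-ins for f-structures: the fronted element what is a single f-structure whose value is shared between the FOCUS and OBJ attributes (structure sharing), rather than being moved from an object position. The attribute names are conventional LFG labels; the details of the analysis are an assumption of the sketch.

```python
# Structure sharing in the f-structure of "What did you see?":
# the same f-structure serves as both question-focus and object of "see".
what = {"PRED": "what", "PRONTYPE": "wh"}

f_structure = {
    "PRED": "see<SUBJ, OBJ>",
    "TENSE": "past",
    "SUBJ": {"PRED": "you"},
    "FOCUS": what,   # the fronted, question-focused constituent ...
    "OBJ": what,     # ... is token-identical to the object of "see"
}

# FOCUS and OBJ point to one and the same structure, not to two copies.
assert f_structure["FOCUS"] is f_structure["OBJ"]
```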
A central goal in LFG research is to create a model of grammar with a depth that appeals to linguists while at the same time being efficiently parsable and having the formal rigor that computational linguists require. Because of this, computational parsers have been developed, and LFG has also been used as the theoretical basis of various machine translation tools, such as AppTek's TranSphere and the Julietta Research Group's Lekta.