Syntax (linguistics)
In linguistics, syntax[1] is the study of the rules, or 'patterned relations', that govern the way words combine to form phrases and phrases combine to form sentences. The combinatory behaviour of words is governed, to a first approximation, by their part of speech (noun, adjective, verb, and so on), a categorization that goes back in the Western tradition to the Greek grammarian Dionysios Thrax. Modern research into natural language syntax attempts to systematize descriptive grammar and, for many practitioners, to find general laws that govern the syntax of all languages. It is not concerned with prescriptive grammar (see Prescription and description).
There are many theories of formal syntax, and they have risen and fallen in influence over time. Most theories of syntax share at least two commonalities. First, they group subunits hierarchically into constituent units (phrases). Second, they provide a system of rules to explain patterns of acceptability (grammaticality) and unacceptability (ungrammaticality). Most formal theories of syntax also offer explanations of the systematic relationship between syntactic form and semantic meaning. Within the study of signs, syntax is the first of three subfields: the study of the interrelation of the signs themselves. The second subfield is semantics (the study of the relation between the signs and the objects to which they apply), and the third is pragmatics (the relation between the sign system and its users).
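To make the first of these commonalities concrete, the short Python sketch below represents a constituent analysis as nested groupings and recovers the linear string of words from it. The sentence, the labels, and the particular grouping are illustrative assumptions, not the analysis of any specific theory.

```python
# A minimal sketch (plain Python, no libraries) of hierarchical constituency:
# words grouped into nested constituents rather than a flat string.
# "the dog chased the cat", bracketed as [S [NP the dog] [VP chased [NP the cat]]]
constituents = ("S",
                ("NP", "the", "dog"),
                ("VP", "chased",
                 ("NP", "the", "cat")))

def leaves(node):
    """Collect the words of a constituent in left-to-right order."""
    if isinstance(node, str):          # a word
        return [node]
    label, *children = node            # a labelled constituent
    words = []
    for child in children:
        words.extend(leaves(child))
    return words

print(leaves(constituents))            # ['the', 'dog', 'chased', 'the', 'cat']
```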
In the framework of transformational-generative grammar (of which Government and Binding Theory and Minimalism are recent developments), the structure of a sentence is represented by phrase structure trees, also known as phrase markers or tree diagrams. Such trees provide information about the sentences they represent by showing the hierarchical relations between their component parts.
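The following sketch, again in plain Python, prints such a phrase marker with indentation so that the hierarchical (dominance) relations between its component parts are visible. The category labels (S, NP, VP, Det, N, V) and the analysis are illustrative assumptions.

```python
# A phrase marker as a nested tuple; indentation in the output reflects
# how deeply each category or word is embedded in the tree.
tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"),
               ("NP", ("Det", "the"), ("N", "cat"))))

def show(node, depth=0):
    if isinstance(node, str):          # a word (leaf of the tree)
        print("  " * depth + node)
        return
    label, *children = node            # a phrasal or lexical category
    print("  " * depth + label)
    for child in children:
        show(child, depth + 1)

show(tree)
```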
There are various theories of how best to design grammars so that, by systematic application of their rules, one can arrive at every phrase marker in a language (and hence at every sentence of the language). The most common are phrase structure grammars and ID/LP grammars, the latter having a slight explanatory advantage over the former.[2] Dependency grammar is a class of syntactic theories, separate from generative grammar, in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar has no phrasal categories. Algebraic syntax is a type of dependency grammar.
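As a rough illustration of what systematic application of the rules means, the sketch below encodes a tiny phrase structure grammar as rewrite rules and enumerates every bracketed phrase marker it generates. The rules and lexicon are illustrative assumptions, not a fragment of English; the comment at the end contrasts this with the head-and-dependent representation that dependency grammar uses instead of phrasal nodes.

```python
from itertools import product

# A toy phrase structure grammar: phrasal rewrite rules plus a small lexicon.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": ["the"],
    "N":   ["dog", "cat"],
    "V":   ["slept", "chased"],
}

def expand(symbol):
    """Yield every bracketed phrase marker derivable from `symbol`."""
    if symbol in LEXICON:                        # lexical category: pick a word
        for word in LEXICON[symbol]:
            yield f"({symbol} {word})"
        return
    for rhs in RULES[symbol]:                    # phrasal category: apply a rule
        daughter_options = [list(expand(child)) for child in rhs]
        for combo in product(*daughter_options): # one expansion per daughter
            yield f"({symbol} " + " ".join(combo) + ")"

for marker in expand("S"):
    print(marker)

# A dependency grammar, by contrast, has no phrasal categories: each word simply
# points to its head, e.g. for "the dog chased the cat":
# {"the(1)": "dog", "dog": "chased", "chased": None, "the(2)": "cat", "cat": "chased"}
```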
A modern approach to combining accurate description of the grammatical patterns of a language with an account of their function in context is systemic functional grammar, an approach originally developed by Michael A.K. Halliday in the 1960s and now pursued actively on all continents. Systemic functional grammar is related both to feature-based approaches such as Head-driven phrase structure grammar and to the older functional traditions of European schools of linguistics, such as British Contextualism and the Prague School.
Tree adjoining grammar is a grammar formalism with interesting mathematical properties that has sometimes been used as the basis for the syntactic description of natural language. In monotonic and monostratal frameworks, variants of unification grammar are often the preferred formalisms.
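The following sketch illustrates the core operation that variants of unification grammar share: merging two feature structures when their values are compatible and failing when they clash. The feature structures, feature names, and values here are illustrative assumptions, not the feature geometry of any particular framework.

```python
# Feature structures as plain nested dicts; unification merges them or fails.
def unify(a, b):
    """Return the unification of two feature structures, or None on a clash."""
    if isinstance(a, dict) and isinstance(b, dict):
        result = dict(a)
        for key, value in b.items():
            if key in result:
                merged = unify(result[key], value)
                if merged is None:             # conflicting values: failure
                    return None
                result[key] = merged
            else:
                result[key] = value
        return result
    return a if a == b else None               # atomic values must match exactly

subject       = {"cat": "NP", "agr": {"num": "sg", "per": 3}}
verb_requires = {"agr": {"num": "sg"}}
clash         = {"agr": {"num": "pl"}}

print(unify(subject, verb_requires))  # succeeds: agreement is compatible
print(unify(subject, clash))          # None: singular vs plural clash
```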
Footnotes
See also
- Amphiboly
- Grammar
- Linguistics
External links
- AllSyntax.com Programming Languages
- The syntax of natural language: an online introduction using the Trees program by Beatrice Santorini & Anthony Kroch, University of Pennsylvania.