An utterance is normally produced by a speaker in linear time, and the hearer normally identifies the speaker's intention correctly, in linear time and incrementally. This is hard to understand within a standard competence grammar, since languages are highly ambiguous and context-free parsing is not linear. Deterministic utterance generation from intention, combined with n-best Bayesian interpretation based on the production grammar and the prior probabilities that must in any case be assumed for perception, does much better. The proposed model uses symbolic grammar and derives symbolic semantic representations, but treats interpretation as just another form of perception. Removing interpretation from grammar is not only empirically motivated, but also makes linguistics a much more feasible enterprise.
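The idea of inverting a deterministic production grammar with Bayesian priors can be illustrated with a minimal sketch. All names and the toy grammar below are illustrative assumptions, not the book's actual formalism: production maps each intention to exactly one form, and interpretation simulates production over candidate intentions, ranking the matches by their prior probability to yield an n-best list.

```python
# Toy sketch: interpretation as Bayesian inversion of deterministic production.
# The grammar and priors are invented for illustration only.

# Deterministic production: each intention yields exactly one surface form.
PRODUCE = {
    "GREET(hearer)": "hello",
    "BANK(river)": "the bank",
    "BANK(money)": "the bank",
}

# Prior probabilities over intentions, supplied by context and world knowledge.
PRIOR = {
    "GREET(hearer)": 0.5,
    "BANK(money)": 0.3,
    "BANK(river)": 0.2,
}

def interpret(utterance, n=2):
    """n-best Bayesian interpretation: keep the intentions whose simulated
    production reproduces the utterance, ranked by prior probability."""
    candidates = [m for m, form in PRODUCE.items() if form == utterance]
    candidates.sort(key=lambda m: PRIOR[m], reverse=True)
    return candidates[:n]

print(interpret("the bank"))  # ['BANK(money)', 'BANK(river)']
```

Note that ambiguity is handled entirely by the priors: the grammar itself never has to be parsed in reverse, which is how the model avoids the non-linear cost of context-free parsing.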
The importance of Henk Zeevat's new monograph cannot be overstated. Its combination of breadth, formal rigor, and originality is unparalleled in work on the form-meaning interface in human language...Zeevat's is the first proposal which provides a computationally feasible integrated treatment of production and comprehension for pragmatics, semantics, syntax, and even phonology. I recommend it to anyone who combines interests in language, logic, and computation with a sense of adventure. David Beaver, University of Texas at Austin
Henk Zeevat (PhD 1992, University of Amsterdam) is a senior lecturer in Amsterdam. He has worked on semantics, pragmatics, and formal grammar and their implementation, especially on discourse semantics and optimality-theoretic pragmatics.
Table of contents
1 INTRODUCTION
1.1 Aristotelian Competence Grammars
1.1.1 Against ACG: Ambiguity
1.1.2 Against ACG: Time complexity
1.1.3 Against ACG: The gap between production and interpretation
1.2 Production Grammar
1.2.1 The primacy of production
1.3 Strategies for Coordination
1.4 Bayesian Interpretation
1.4.1 Simulated production in interpretation
1.4.2 Mirror neurons
1.5 Conclusion
1.6 The Other Chapters
2 SYNTAX
2.1 Optimality Theory
2.1.1 Reversing Production
2.2 Optimality-theoretic Syntax
2.2.1 Optimality-theoretic syntax for word order in Dutch
2.2.2 Provisional German
2.2.3 Provisional English
2.3 The Production Algorithm
2.3.1 Procedural interpretation of the constraints
2.4 Higher Level Generation
2.5 Other Issues
2.5.1 More Dutch
2.5.2 A worked example
2.5.3 Incremental syntax checking in interpretation
2.5.4 Quantification
2.6 Conclusion
3 SELF-MONITORING
3.1 Optional Discourse Markers
3.1.1 General Self-Monitoring
3.2 Word order freezing
3.3 Pronouns and Ellipsis
3.4 Differential Case Marking
3.5 A Case for Phonological Self-Monitoring?
3.6 Conclusion
4 INTERPRETATION
4.1 The Interpretation Algorithm
4.2 Vision and Pragmatics
4.2.1 Vision
4.2.2 Other cues
4.2.3 Pragmatics
4.2.4 Clark buys some nails
4.2.5 Scalar implicatures
4.2.6 Relevance implicatures
4.3 Conclusion
5 MENTAL REPRESENTATION
5.1 From links to representation structures
5.2 Logic
5.2.1 Logical operators
5.3 Mental Representations in Philosophy
5.4 Belief
5.5 Definiteness
5.6 Comparison with Discourse Semantics
5.6.1 From Contexts into Discourse Representation Theory
5.7 Conclusion
6 FINAL REMARKS
6.1 Rounding off
6.2 Computational Linguistics
6.3 Pragmatics
6.4 Semantic Compositionality
6.5 LFG 3.0 and PrOT 2.0
6.6 Language Evolution
6.7 Conceptual Glue
The intended audience comprises syntacticians, computational linguists, psycholinguists, semanticists, pragmaticists, AI researchers in language or vision, and philosophers of language who want their own work to be interpretable in all of these fields.