**Tolerant, classical, strict**. (with Pablo Cobreros, Paul Egre and Dave Ripley, Journal of Philosophical Logic, 2012) pdf file

In this paper we investigate a semantics for first-order logic originally proposed by R. van Rooij to account for the idea that vague predicates are tolerant, that is, for the principle that if $x$ is $P$, then $y$ should be $P$ whenever $y$ is similar enough to $x$. The semantics, which makes use of indifference relations to model similarity, rests on the interaction of three notions of truth: the *classical* notion, and two dual notions simultaneously defined in terms of it, which we call *tolerant* truth and *strict* truth. We characterize the space of consequence relations definable in terms of those and discuss the kind of solution this gives to the sorites paradox. We discuss some applications of the framework to the pragmatics and psycholinguistics of vague predicates, in particular regarding judgments about borderline cases.

**Tolerance and mixed consequence in a super/sub-valuationist setting**. (with Pablo Cobreros, Paul Egre and Dave Ripley, to appear in Studia Logica) pdf file

In a previous paper we investigated a semantic framework to deal with the idea that vague predicates are tolerant, namely that small changes do not affect the applicability of a vague predicate even if large changes do. Our approach there rests on two main ideas. First, given a classical extension of a predicate, we can define a strict and a tolerant extension depending on an indifference relation associated with that predicate. Second, we can use these notions of satisfaction to define mixed consequence relations that capture non-transitive tolerant reasoning. The present paper explores the possibility of defining mixed notions of consequence in a super/sub-valuationist setting and examines to what extent any of these notions captures non-transitive tolerant reasoning.

**Reaching transparent truth**. (with Pablo Cobreros, Paul Egre and Dave Ripley) pdf file

This paper presents and defends a way to add a transparent truth predicate to classical logic, such that T⟨A⟩ and A are everywhere intersubstitutable, where all T-biconditionals hold, and where truth can be made compositional. A key feature of our framework, called STT (for Strict-Tolerant Truth), is that it supports a nontransitive relation of consequence. At the same time, it can be seen that the only failures of transitivity STT allows for arise in paradoxical cases.

**Vagueness, Signaling and Bounded Rationality**. (with Michael Franke and Gerhard Jaeger, in proceedings of LENLS 2010) pdf file

Vagueness is a pervasive feature of natural language, yet one that is troubling for leading theories in semantics and language evolution. We focus here on the latter, addressing the challenge of how to account for the emergence of vague meanings in signaling game models of language evolution.

**Measurement, and interadjective comparisons**. (in Journal of Semantics, 2010) pdf file

This paper shows the relevance of measurement theory for the linguistic analysis of comparative statements. In particular, the paper focusses on interadjective comparatives like '$x$ is $P$-er than $y$ is $Q$' and comparatives involving multidimensional adjectives. It is argued that Bale's (2008) recent proposal to account for such comparatives is rather limited, and just one way to account for interadjective comparison. In fact, it is shown that we can make use of measurement-theoretic techniques recently developed in political economy to handle interpersonal comparisons of utility in order to account for interadjective comparatives as well. This paper also discusses how to *construct* the desired scales, if one starts with a delineation approach to comparatives.

**Implicit versus explicit comparatives**. (in Egre & Klinedinst (eds.), Vagueness and Language Use, 2011) pdf file

It is natural to assume that the explicit comparative -- John is taller than Mary -- can be true in cases where the implicit comparative -- John is tall compared to Mary -- is not. This is sometimes seen as a threat to comparison-class-based analyses of the comparative. In this paper it is claimed that the distinction between explicit and implicit comparatives corresponds to the difference between (strict) weak orders and semi-orders, and that both can be characterized naturally in terms of constraints on the behavior of predicates among different comparison classes.

**Strategic Vagueness, and appropriate contexts**. (with Kris de Jaegher, in Meaning and Game Theory, edited by A. Benz, C. Ebert, G. Jaeger, and R. van Rooij, 2011) pdf file

This paper brings together several approaches to vagueness, and ends by suggesting a new approach. The common thread in these approaches is the crucial role played by context. We argue that the most plausible application of these models to vagueness in natural language is one where the listener only imperfectly observes the context in which the speaker makes her utterances. Yet it is clear that not all vagueness can be accounted for by conflicts of interest. This is why the rest of the paper looks at the case of common interest. First, vagueness is seen as an application of Horn's pragmatic rule that (un)marked states get an (un)marked expression. Then we argue that the Sorites paradox arises from the use of vague predicates in an inappropriate context. Finally, we follow prospect theory and assume that contexts directly enter agents' utility functions in the form of reference points, with respect to which agents think in terms of gains and losses. The rationale for vagueness here is that vague predicates allow players to express their valuations without necessarily uttering the context, so that the advantage of vague predicates is that they can be used across contexts.

**Vagueness and Linguistics**. (new version) (in G. Ronzitti (ed.), The Vagueness Handbook, 2011) pdf file

This paper is a long (and biased) overview of vagueness in linguistics. I argue, among other things, (i) that semi-orders are crucial for vagueness and (ii) that the Sorites paradox is best solved by putting constraints on the contexts in which vague predicates can be used appropriately, and I discuss (iii) the relation between vagueness and matters of grain-size.

**Revealed preference and Satisficing Behavior**. (in Synthese, 2011) pdf file

A much discussed topic in the theory of choice is how a preference order among options can be derived from the assumption that the notion of 'choice' is primitive. Assuming a choice function that selects elements from each finite set of options, Arrow (1959) already showed how we can generate a weak ordering by putting constraints on the behavior of such a function such that it behaves as a utility maximizer. Arrow proposed that rational agents can be modeled by such choice functions. Arrow's standard model of rationality has been criticized in economics and gave rise to approaches of *bounded rationality*. Two standard assumptions of rationality will be given up in this paper: first, the idea that agents are utility *optimizers* (Simon); second, the idea that the relation of 'indifference' gives rise to an equivalence relation. To account for the latter, Luce (1956) introduced semi-orders. Extending some ideas of Van Benthem (1982), we will show how to derive semi-orders (and so-called interval orders) based on the idea that agents are utility *satisficers* rather than utility optimizers.

**Comparatives and Quantifiers**. (in Bonami and Cabredo Hofherr (eds.), Empirical Issues in Syntax and Semantics 7) pdf file

A traditional issue in the analysis of comparatives is whether or not degrees are essential. In the first part of this paper I discuss the traditional analyses that account for comparatives with (Seuren, von Stechow) and without (Klein) degrees, and remind the reader that these are very similar to each other. A more recent issue is how to account for quantifiers in the *than*-clause. The traditional analyses account well for Negative Polarity Items in comparative clauses, but have problems with conjunctive quantifiers. The strength of the proposals of Larson (1988) and Schwarzschild & Wilkinson (2002), on the other hand, goes exactly in the opposite direction. I will discuss two types of strategies to account for both types of quantifiers: (i) one based on the traditional analysis, but making use of more coarse-grained models or of intervals, and (ii) one where comparatives are taken to be ambiguous between the traditional reading and the Larson reading, and where the actual reading is selected with the help of the strongest meaning hypothesis.
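As a minimal illustrative sketch of the degree-free (Klein-style) analysis mentioned above -- not code from the paper, and with the above-average delineation and all function names being my own simplifying assumptions -- the comparative "x is taller than y" can be modeled as: some comparison class containing both makes x count as tall while y does not.

```python
from itertools import combinations

# Hypothetical toy delineation: within a comparison class, count as
# "tall" iff strictly above the class average. (Klein's own account
# places further cross-class constraints on delineations; this is only
# one crude choice for illustration.)
def tall_in(person, comparison_class, heights):
    avg = sum(heights[p] for p in comparison_class) / len(comparison_class)
    return heights[person] > avg

# Klein-style, degree-free comparative: "x is taller than y" holds iff
# some comparison class containing both x and y makes x tall and y not.
def taller(x, y, domain, heights):
    people = list(domain)
    classes = [set(c) for r in range(2, len(people) + 1)
               for c in combinations(people, r)]
    return any(x in c and y in c
               and tall_in(x, c, heights) and not tall_in(y, c, heights)
               for c in classes)

heights = {"John": 185, "Mary": 170, "Sue": 160}
print(taller("John", "Mary", heights, heights))  # True
print(taller("Mary", "John", heights, heights))  # False
```

The point of the sketch is that no degrees appear anywhere: the comparative is defined purely from how a context-dependent positive predicate behaves across comparison classes.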

E-mail: R.A.M.VanRooijATuva.nl