STEPHEN P. SCHWARTZ

VAGUENESS AND INCOHERENCE: A REPLY TO BURNS*
In a recent paper, 'Vagueness and Coherence',1 Professor Linda Burns claims to solve the sorites paradox. Her larger purpose is to vindicate the Wittgensteinian view of vagueness according to which vague languages are perfectly in order as they are. She attempts to do this by defending the coherence of ordinary vague predicates against the attacks of Wright and Dummett. Briefly, Dummett and especially Wright claim that ordinary vague predicates are tolerant and that such tolerance leads to paradox and contradiction.2 Furthermore, Dummett and Wright claim that tolerance is ineliminable without fundamentally changing the nature of language. In this paper I will argue that although Burns has shown that we must be careful in the way we state tolerance rules, her claims to have solved the sorites paradox are premature. In general, Burns has not shown that ordinary vague predicates do not suffer from radical incoherence.
1.
According to Burns, the sorites paradox depends on an existential assumption.

It seems then, that the assumption about the series on which the sorites argument rests should be stated in a form which employs an existential quantifier. The induction step holds provided it is assumed that for any vague predicate it is possible to find some series which exhibits apparently continuous variation in the respects that matter for the application of that predicate. (p. 493)

The heart of Burns's argument is a demonstration that no such series could exist. Thus the sorites paradox is defused in much the same way as the barber paradox. There simply is no paradoxical barber who shaves all and only those who do not shave themselves. Likewise such a series as the sorites paradox presupposes is shown to be impossible. Burns shows that there could be no such series in the context of a discussion of tolerance rules.

Synthese 80: 395-406, 1989. © 1989 Kluwer Academic Publishers. Printed in the Netherlands.

Burns distinguishes between strict
tolerance rules and loose tolerance rules. Strict tolerance rules, she agrees, lead to paradox but are obviously false. Loose tolerance rules are true but do not generate paradox. Burns gives three examples of strict tolerance rules.

If one thing is a heap and a second has just one less grain, the second is a heap also. If one person is bald and a second has one more hair than the first, the second is bald also. If one person is a child then any other indistinguishable from the first in terms of apparent maturity is a child also. (p. 490)
These tolerance rules would work to generate paradox by supporting the induction steps of sorites arguments. Such a sorites might go as follows: A man with 0 hairs is bald. If a man with n hairs is bald, then a man with n + 1 hairs is bald. Therefore all men are bald. Burns easily shows that such strict tolerance rules are false and thus that their associated induction steps are likewise false. I will quote her argument here at length since it will be the focus of my discussion.

Differences that could not be detected just at a glance may also alter the applicability of a predicate such as "child". Suppose a twenty-year-old dwarf was so like his twelve-year-old nephew in appearance that they were always taken to be identical twins. One but not the other would be a child. Similar counterexamples can be found to the tolerance rules for the predicates "heap" and "bald". Where one thing in a smoothly varying series is a heap, the next may not be even though it contains only one less grain. For shape determines the predicate "heap", as well as numbers of grains. If one member of a series consists of many grains "heaped up" in a single mass, and the next almost the same number raked out flat, the first will be a heap and the second not. Nor does shape alone determine the predicate, for in terms of just shape a pinch might be indiscernible from a genuine heap. We could also imagine circumstances in which one man is bald but a second with only one more hair is not. Suppose the first genuinely bald man has no hairs at all on the top of his head but quite a few around the sides and back. The second has only one more hair but has had a hair transplant, and his hairs are now distributed evenly over the top of his scalp. He is happily nonbald, despite having only one more hair than someone clearly bald. (pp. 496-97)
More generally:

There could be a series which varied continuously with respect to φ, but where a
predicate determined by this property was true of some member but clearly not true of the next. No matter how smoothly the series varied with respect to one determining property there could be large differences with respect to another; differences which would justify the drawing of sharp boundaries. (p. 497)
In order to avoid these obvious sorts of counterexamples to strict tolerance rules we must add a ceteris paribus clause. Such amended tolerance rules are loose tolerance rules. I will quote one example of a loose tolerance rule.

If one thing is a heap and a second differs from it in containing only one less grain, and in any other ways dependent on this minor difference, but the two do not differ detectably in any other respects relevant to the application of the predicate 'heap', then the second is a heap also. (p. 498)3
Burns says: "Principles of this loose sort differ from the strict versions of the tolerance rules in containing an exception clause (italicized in the examples above)" (p. 493). The rest of Burns's argument is quite ingenious.

Tolerance rules understood in this loose way are surely true. ... If an observational predicate applies to the one member of a series it must also apply to the next if we cannot by observation discover any relevant difference between them. But now doubts begin to arise about the existence of such series. ... (p. 499)
The problem is that where S1 is indiscernible from S2 and S2 is indiscernible from S3 but S3 is discernible from S1, there is a difference between S1 and S2. Namely, S1 but not S2 is discernible from S3. This is a relevant difference. Thus no series could be such that the end points are discernible and it varied smoothly in every respect. There will be breaks in relative indiscernibilities.4 Burns summarizes her argument as follows:

The source of the problem lies at the assumption about the series on which the Sorites reasoning is based. There are no such series of the kind required for the paradoxical argument to work. The reason why the assumption about the series on which the argument is based is false has to do with two incompatible features which such series would be required to have. To fit the argument a series would have to exhibit perfect continuity in every respect relevant to the application of the predicate to be projected, and also nontransitive indiscernibility in all those respects from member to member. But those indiscernibilities are themselves relevant to the application of the predicate, and nontransitivity inevitably produces breaks in the continuity of the series. How things compare with other members of the series matters for the application to them of observational predicates, and the nontransitivity of indiscernibility guarantees that there will be observable differences from member to member. (p. 510)
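Burns's nontransitivity point can be made concrete with a small sketch. The model below is my own illustration, not Burns's formalism: it treats observational indiscernibility as "differing by less than a detection threshold", a standard toy model, and shows that adjacent members of a series can be pairwise indiscernible while the end points are discernible, so that S1 and S2 do differ in a relevant respect after all.

```python
# Toy model (assumption mine): indiscernibility as sub-threshold difference.
THRESHOLD = 1.0  # smallest observationally detectable difference, in toy units

def indiscernible(a, b):
    """True when a and b cannot be told apart by observation."""
    return abs(a - b) < THRESHOLD

# Three consecutive stages of a smoothly varying series.
s1, s2, s3 = 0.0, 0.6, 1.2

print(indiscernible(s1, s2))  # True: adjacent stages match
print(indiscernible(s2, s3))  # True: adjacent stages match
print(indiscernible(s1, s3))  # False: the relation is nontransitive

# The relevant difference between S1 and S2 that Burns exploits:
# S1 is discernible from S3, while S2 is not.
relevant_difference = indiscernible(s2, s3) and not indiscernible(s1, s3)
print(relevant_difference)  # True
```

The threshold value and the particular stage values are arbitrary; any smoothly varying series with a finite discrimination threshold behaves the same way.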
2. According to Burns, then, the problem with the strict tolerance rules cited by her is that in order to avoid counterexamples they need a ceteris paribus clause, and once we 'loosen' the rules in this way they can no longer be satisfied by sorites-generating series. I agree that once loosened the rules cannot be so satisfied, but unfortunately this does not constitute a solution to the sorites paradox, because Burns has misconstrued the strict tolerance rules involved with the vague predicates she discusses. Although she successfully refutes the strict tolerance rules that she attacks, other strict tolerance rules are implied by vague concepts - strict tolerance rules that are not susceptible to the kind of counterexample Burns proposes and that do not need a fatal ceteris paribus clause. Thus Burns's arguments regarding loose tolerance rules are irrelevant to the sorites paradox. Burns says: "All I am claiming is that the tolerance rules must be interpreted as loose rules, and when they are the paradox may be resolved" (p. 501). I propose to show that the tolerance rules need not, indeed should not, be interpreted as loose rules. The strict tolerance rules that Burns refutes are so easily refutable because they each have a misplaced universal quantifier. They say that everything related to the original thing in a certain way deserves the predicate. For example, her version of the strict tolerance rule for 'bald' says that any man with one more hair than a bald man is also bald. This is implausible on the face of it. It seems to me, on the other hand, that the strict tolerance rules involved with our vague concepts make a much weaker claim. In general, such a strict tolerance rule asserts only that if something deserves a (vague) predicate and it is altered in some small way relevant to the predicate, it (not everything like it in that respect) still deserves the predicate.
In other words (and oversimplifying), if a bald man had just one more hair than he has, he would still be bald.5 Burns has shown that some care must be taken to state precisely the strict tolerance rules for vague predicates. When this is done, none should be refutable by the kinds of counterexamples she uses.

3. Burns refutes the strict tolerance rule that she proposes for 'child' by supposing a childlike dwarf. It seems to me, however, that 'child' can
be shown to be paradoxical in a way quite different from that which Burns considers. The following sorites argument is apparently paradoxical, despite everything that Burns has said:

(1) A person exactly one second old is a child.
(2) Any person who is a child when he is n seconds old is a child when he is n + 1 seconds old.

Therefore anyone at any age is a child.6
The idea behind (2) is that there is no specific second in a person's life when he ceases being a child. To deny (2) would be to assert that there is a second n in a person's life such that at the nth second of his life he is a child but at the (n + 1)th second of his life he is no longer a child. The idea that there might be such a second in any living person's history is preposterous and is inconsistent with the vagueness of 'child'. Premise (2) would depend on the following strict tolerance rule:

(3) If a person is a child when he is n seconds old, then he is a child when he is n + 1 seconds old.
This rule is meant to reflect the fact that the predicate 'child' tolerates aging of one second. Note that this tolerance rule for 'child', unlike the one cited for 'child' by Burns, says nothing about indistinguishability or apparent maturity and thus is not refuted by the twenty-year-old dwarf. It might be thought that (3) is, nevertheless, very similar in form to the strict tolerance rules that Burns gives for 'bald' and 'heap', and thus to be refutable by counterexamples similar to those that Burns uses to refute them. To see that it is not, consider a version of a strict tolerance rule that is false and is similar in form to the strict tolerance rules that Burns gives for 'heap' and 'bald' (but is different from (3)).

(4) If one person is a child, then any other person exactly one second older than that person is also a child.
I suppose that (4) would be adequately symbolized in the following way:

(4') (Ex)(Px & Cx & Oxn) → (y)[(Py & Oy(n + 1)) → Cy]
Here 'Px' stands for 'x is a person', 'Cx' stands for 'x is a child', and
'Oxn' stands for 'x is exactly n seconds old'. The sense of (4') could also be given by the English sentence 'If someone who is exactly n seconds old is a child, then anyone who is exactly n + 1 seconds old is also a child.' (4'), and thus (4), is easily refuted by counterexample, however, because the consequent has such extreme (and unnecessary) generality. Suppose someone n seconds old is a child. He is very small and immature for his age. There is no doubt someone else, exactly one second older, who is perhaps large and mature for his age and thus would not be considered a child. Although the two only vary by one second in age, they vary greatly in other characteristics which are determining for the predicate 'child'. This is a refutation of (4') and (4) in the style of Burns's counterexamples. Note, however, that (3) is different from (4) and cannot be refuted by the same sort of counterexamples. The correct symbolization of (3) is:

(3') (x)[(Px & Cxt) → Cx(t + 1 second)]
Here 'Cxt' stands for 'x is a child at t'. In English (3') would read: 'For all x, if x is a person and x is a child at t, then x is a child at t plus one second'. (3') cannot be refuted by the same sort of counterexamples as refute (4'), because it is not making a claim about everyone who is one second older than some child. It seems evident that the predicate 'child' contains a tolerance rule like (3'). Given an adult human we simply cannot believe (even when confronted by paradox) that there was some particular second of his life that was the last second of his childhood - the last second when he was a child. The meaning of the predicate 'child' is such that it seems to rule out the very possibility that there be such a last second of childhood in one's life.7 This implies that the use of the predicate 'child' is governed by a tolerance rule like (3'). If no particular second of one's life is the last second when one was a child, then every particular second when one was a child was followed by another second of childhood. Since (3') is not refuted by counterexamples in the way that (4') is, it is not necessary to add a ceteris paribus clause, and thus it is not necessary to replace (3') with a loose tolerance rule. Thus Burns's arguments regarding loose tolerance rules become irrelevant. (3') is a strict tolerance rule, and it leads to paradox. The predicate 'child' is incoherent as claimed by Wright.
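The contrast between (4') and (3') can be pictured in a small sketch. This illustration is my own, with invented toy attributes (a single 'mature' flag standing in for all the characteristics relevant to 'child'); it is not an analysis of the predicate, only a picture of where the universal quantifier sits in each rule.

```python
# Sketch (toy attributes mine): (4') quantifies over every person one second
# older than some child; (3') tracks the very same person one second later.
from dataclasses import dataclass, replace

@dataclass
class Person:
    age_seconds: int
    mature: bool  # stand-in for the other characteristics relevant to 'child'

def is_child(p: Person) -> bool:
    return not p.mature

small_child = Person(age_seconds=378_000_000, mature=False)
mature_peer = Person(age_seconds=378_000_001, mature=True)

# An instance of (4') as a material conditional: refuted, Burns-style, by a
# mature person who happens to be exactly one second older than a child.
rule_4_instance = (not is_child(small_child)) or is_child(mature_peer)
print(rule_4_instance)  # False

# An instance of (3'): age the same person one second; nothing else changes.
one_second_later = replace(small_child, age_seconds=small_child.age_seconds + 1)
rule_3_instance = (not is_child(small_child)) or is_child(one_second_later)
print(rule_3_instance)  # True
```

The point mirrored here is that the counterexample to (4') requires a second, independently varying person, and (3') simply provides no such person to vary.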
The situation with the predicate 'heap' is somewhat more complicated. The strict tolerance rule that Burns gives for 'heap' is (to repeat):

(5) If one thing is a heap and a second has just one less grain, the second is a heap also.
The symbolization of (5) must be something like:

(5') (Ex)(Lx & Hx & Gxn) → (y)[(Ly & Gy(n - 1)) → Hy]
Here 'Lx' stands for 'x is a collection of grains', 'Hx' stands for 'x is a heap', and 'Gxn' stands for 'x consists of exactly n grains'. Again, we can easily find counterexamples to (5') because of the misplaced universal quantifier. Some collection of n grains may correctly be called a heap, and yet not every collection of n - 1 grains is a heap. (Not even every collection of n + 1 grains is a heap!) Due to the generality of its consequent, (5') (and thus (5)) is not the correct strict tolerance rule for 'heap'. Unlike with 'child', however, it is not immediately evident how to state the correct rule. The gist has got to be that if we start with a heap and subtract a grain from that heap, then we will have a heap left. But now the problem is that when we remove the grain we may disturb the other grains so that no heap is left. Surely it is not every time we remove a grain from a heap that we have a heap left. The intuitive idea behind the correct strict tolerance rule for 'heap' is not that any time or in any way we remove a grain from a heap we have a heap left, but that there is a way to remove a grain such that there is a heap left. The tolerance rule would then be something like:

(6) If there is a heap and we remove a grain from that heap in the way most conducive to a heap remaining after the removal, then there is still a heap left, only with one less grain.8
The key point here is that there is a way so to remove a grain from any heap, and our symbolization of (6) ought to reflect this. Let 'Px' stand for 'x is a procedure' and let 'Fxzy' be a function sign that stands for 'z is the result of applying y to x'. Then our symbolization is:

(6') (x){(Hx & Gxn) → (Ey)(Ez)[(Py & Fxzy) & Hz & Gz(n - 1)]}
In English (6') would read: 'For every x, if x is a heap and x consists of exactly n grains, then there exists a procedure y and a z such that z is the result of applying y to x and z is a heap and z consists of one less grain than x'. Again I claim that (6') cannot be refuted by the sorts of counterexamples Burns uses against (5'). Further, (6') is true, or at least (acknowledging the paradox to which it leads) it is impossible for us to imagine how (6') could be false. To do so we would have to suppose there is a heap of grains such that there is no possible way to remove even one grain such that a heap remains after the removal. We see that (6') involves an existential claim. Indeed it seems to me that most correct strict tolerance rules will involve some such existential claim. For example, we suppose that there is always some way to remove a hair from a nonbald man's head such that he remains nonbald after the removal. Likewise the tolerance rule for 'bald' involves the idea that for every bald man there is a way to add a single hair to his head such that after the addition he is still bald.9 Thus Burns is right that the paradoxes generated by tolerance rules involve an existential claim. But the existential claims in (6') and the other correct strict tolerance rules are very different from the ones Burns supposes are involved in sorites paradoxes. In particular they do not involve the assumption that there is the sort of series that she has shown to be impossible. Rather they involve the assumption that there is a way of minutely altering something along a dimension such that the relevant predicate continues to apply after the alteration. Burns has given us absolutely no reason to doubt these existential claims. Again, as with 'child', it seems clear that the predicates 'heap' and 'bald' contain the kind of strict tolerance rule that I have been claiming they contain.
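The existential form of (6') can be pictured with a small search over removal procedures. The heap criterion and the candidate procedures below are my own crude stand-ins, not the paper's; the point is only that the rule claims some procedure leaves a heap, rather than claiming, as the misquantified (5') in effect does, that every removal result counts.

```python
# Toy model (criterion and procedures are my stand-ins, not the paper's).
def is_heap(grains: int, heaped_up: bool) -> bool:
    """Crude criterion: enough grains, and the right shape."""
    return grains >= 2 and heaped_up

def one_grain_removals(grains: int, heaped_up: bool):
    """Candidate procedures that each remove a single grain."""
    yield grains - 1, heaped_up  # lift one peripheral grain gently
    yield grains - 1, False      # rake the whole pile out flat

def tolerant_step(grains: int, heaped_up: bool) -> bool:
    """(6')-style claim: there EXISTS a procedure whose result is a heap."""
    return any(is_heap(g, h) for g, h in one_grain_removals(grains, heaped_up))

print(tolerant_step(1000, True))   # True: the careful removal exists
print(tolerant_step(1000, False))  # False: no removal turns a flat pile into a heap
```

The raking procedure corresponds to Burns's counterexample to the strict rule; the search over procedures corresponds to the existential quantifier in (6'), which is why the counterexample no longer bites.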
If we see some heap diminished grain by grain until only one grain is left, we believe that it started out a heap, ended up a nonheap, but we refuse to believe that there was some particular single grain removal when it changed from heap to nonheap (or even from heap to borderline heap). The natural, and I think correct, explanation of why we cannot believe that there was some particular grain removal when it changed from heap to nonheap is that 'heap' is vague and, being vague, the predicate 'heap' contains a strict tolerance rule that excludes the possibility of there being a precise point of passing from heap to nonheap (or heap to borderline heap). Likewise
if we consider a man going bald, we would refuse to say that there was some particular strand of hair such that before he lost that strand it was not the case that he was bald but after he lost it he was bald.

4. As I have construed the strict tolerance rules, they are universal statements, and thus refutable by even a single counterexample. For example, (3) is rendered false if there is even one person who ceases to be a child at a particular second of his life. Someone might wonder how we can be so sure that there is no such person. It is important to keep in mind that there is nothing sacred about the time unit one second. If that unit is too large, we can always choose a smaller unit of time as long as the unit chosen has some finite duration. Similarly with, e.g., 'heap'. Someone may wonder whether there couldn't be a heap that would not survive a one grain loss. If so, then we can pick a smaller unit for removal - a molecule, say, or an atom. All that is necessary for our tolerance rule is that there be a unit such that for each heap there is a way to remove that unit and have a heap left. For those who are still troubled by the possibility of exceptions, say heaps that would not survive even a one atom loss or people whose childhood ends at a given nanosecond of their lives, the tolerance rules will have to be made somewhat more complicated. The idea would have to be that, e.g., (3) would be true of normal people, people who exhibit a normal growth pattern. The supposed counterexample to (3) would have to be someone who goes through an instantaneous transformation from child to nonchild. Such a transformation is abnormal and in fact represents only a logical rather than a physical possibility. So we must understand a strict tolerance rule as applying to a normal instance of the predicate or under normal circumstances, where 'normal' is taken in a very broad sense and is only meant to exclude remote logical possibilities.
In particular I do not mean by 'normal' that the instances exhibit the kind of absolute indiscernibility from stage to stage that Burns has shown to be impossible. The fact is that in order to be susceptible to paradox the instances need not exhibit that absolute indiscernibility from stage to stage, they just need to undergo the sort of gradual transformation that things normally undergo.10
Another qualm might be that I have surreptitiously smuggled in a
ceteris paribus clause with the procedure - that is, that I am supposing that the procedure removes a grain while everything else relevant to the predicate 'heap' remains the same. Such a ceteris paribus clause would make the principles that I have been arguing for in effect 'loose'. No such ceteris paribus clause is presupposed, however. I am not presupposing that the heap remain the same in every other respect after the removal. All I am presupposing is that what remains is a heap. I am unconcerned with how little or how much it is altered in other ways (although I imagine it will not be very much altered). So the tolerance rule says that for each (normal) heap there is a way to remove a grain (or some smaller unit) such that a heap remains, not necessarily an indiscernible heap - just a heap.11
5. The tolerance rules that I have been proposing can generate paradoxical sorites arguments in various ways but, it seems, these sorites arguments cannot contain induction steps. Since the tolerance rules do not make the same existential assumptions about a series of indiscernible objects, they will not support the standard induction steps that one finds in sorites arguments. Instead of using an induction premise in the sorites argument we could use a series of conditional premises, one for each removal. Such an argument would be rather lengthy - much longer than the induction version - but that is all right since it is still finite. The first premise asserts that some heap consists of, say, 10^8 grains. Each conditional premise (we'll need 10^8 of them) will say that if we have a heap with exactly n grains, then we can (by some procedure or other) remove a grain and have a heap with exactly n - 1 grains. We can then prove that there are heaps consisting of exactly i grains for each i, including 1 and 0.12 But this is absurd. One grain is not a heap and yet our tolerance rule for heap forces us to say that it is. This is similar to a version of the paradox given by Unger:

We may begin by supposing that there are heaps, and that a million beans typically arranged gives us an instance of that concept. But, then, removing a single peripheral bean gently from such a typical heap, it seems, will not leave us with no heap before us. Hence, we must conclude that even when we have but one bean left, or none at all, we still have a heap of beans. But this is absurd.
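The chained, non-inductive version of the argument can be simulated directly. The sketch below is my own illustration; it uses 10,000 grains rather than 10^8 only to keep the loop small, and each loop iteration plays the role of one conditional premise applied by modus ponens.

```python
# Sketch (mine): one conditional premise per grain removal, chained by
# repeated modus ponens from the premise that a large heap exists.
def sorites_chain(start_grains: int) -> set:
    """Derive 'a heap of n grains exists' for every n down to zero."""
    established = {start_grains}       # first premise: this many grains form a heap
    for n in range(start_grains, 0, -1):
        if n in established:           # premise n: heap of n -> heap of n - 1
            established.add(n - 1)
    return established

heap_facts = sorites_chain(10_000)     # the text imagines 10^8 such premises
print(0 in heap_facts and 1 in heap_facts)  # True: the absurd conclusion
```

Every step is an individually plausible conditional, and the argument form is plainly valid, which is exactly why the derived conclusion is paradoxical rather than a reductio of any single premise.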
Another version of the sorites paradox could use something like the
least number principle. Pick any typical adult human. We believe that our typical adult human was a child but that he no longer is a child. Because of the tolerance of our concept of child we will refuse to say that there was a last second of his childhood. If there was no last second of his childhood, then given any second of his life when he was a child, he was a child at the next second of his life (otherwise it was the last second of his childhood after all). Thus at the present time we are committed to saying both that he is a child and that he is not a child. We've arrived at a contradiction and, as Wright would say, all we've done is follow the rules.

NOTES

* I would like to thank Terence Horgan, William Throop and especially my wife, Diane Schwartz, for helpful suggestions for improving this paper and for valuable and insightful discussions on the topic of vagueness. I would also like to thank Ithaca College for generously supporting the work on this paper with a Summer Research Grant.
1 Linda Burns: 1986, 'Vagueness and Coherence', Synthese 68, 487-513. All page references are to this article.
2 See Crispin Wright: 1975, 'On the Coherence of Vague Predicates', Synthese 30, 325-65.
3 Strictly speaking there are problems with the wording of this rule, since it implies that any two collections of grains that differ in only one grain will both be heaps (since any other difference between them will be due to the one grain difference). I think the loose tolerance rule should be amended by deleting the clause '... and in any other ways dependent on this minor difference ...'.
4 I am not attempting to capture the subtlety of Burns's argument here.
I am satisfied that Burns has shown that there cannot be such a series. In the text of her article and in the footnotes she addresses many problems and objections that might arise. It would take us too far afield to attempt to summarize all of the details here.
5 Just for the record, here is a recent statement by Crispin Wright regarding tolerance:

A predicate is prima facie susceptible to [the sorites paradox] just in case it is tolerant; that is, sufficiently small variations in some associated parameter are apparently insufficient to affect the justice with which it can be applied to something, whereas sufficiently large variations are always so sufficient. (1987, Realism, Meaning, and Truth, Basil Blackwell, p. 108)

6 Of course, there is a sense in which everyone at any age is a child - a child of his/her parents. Parents do not have fewer children as their children become adults. The ordinary predicate 'child' is usually used in the sense of 'young child' and is so used here. Another point is that, of course, if someone dies during childhood, then there is a last second of his childhood. I am assuming throughout that we are talking about living people or people who have not died during childhood.
7 That there are borderline cases makes no difference. We find it just as impossible to believe that there was a last second when he was definitely a child or a first second when he was a borderline child. On another related point: of course, it is logically possible that someone go through a miraculous instantaneous transformation from child to adult. This remote logical possibility is irrelevant to my point. (See the discussion of normality in Section 4.)
8 My wording here is influenced by Peter Unger's careful versions of the sorites paradox in: 1979, 'There Are No Ordinary Things', Synthese 41, 117-54.
9 A minor point is that someone may claim these strict tolerance rules are not rules used in the teaching and learning of the predicates. This may be so but is irrelevant to my purpose. Burns claims to have a solution to the sorites paradox. I deny that she does. If (6) is true, then the paradox remains whether or not (6) is a strict tolerance rule or even a rule at all. Keep in mind that these procedures need not be natural, uniform, or even physically practical, nor need we have any notion what they might be, although usually we will have a very good notion. It is only necessary that there be some possible procedure (possibly different) for each head and each condition of hair.
10 I do not think that it is unfair or overly limiting to restrict the tolerance rules to normal instances. I agree with Burns that our ordinary predicates exhibit open texture in the sense that we can imagine cases where the rules just don't tell us what to do. The rules that define the usage of our ordinary concepts are meant for normal circumstances.
11 Throughout this discussion I have avoided questions about identity. Someone might doubt that after a grain removal the same heap remains. I do not suffer from these doubts, but in order to avoid unnecessary contentions I say 'a heap remains', thus not necessarily the same heap.
What I believe is that the same heap remains although somewhat altered. Another detail that I have not emphasized but is obvious upon reflection is that when I say 'removal of a single grain' I mean 'net removal of a single grain'. Again, I refer the reader to Peter Unger's paper 'There Are No Ordinary Things' for a discussion of these and many other difficulties and fussy details involved in a most careful version of these paradoxes. Unger, for example, discusses the problem that counterfactuals are being used or are implied, that the discussion relies on intensional (merely possible) entities, and many other things as well.
12 The argument would be of the form:

p
p → q
q → r
...
y → z
Therefore z

Any such argument is, of course, valid.

Department of Philosophy and Religion
Ithaca College
Ithaca, NY 14850
U.S.A.