Workshop in Honour of John Williams: To Celebrate His Retirement from SMU
Date: 30 May 2018, Wednesday
Time: 10am to 4pm
Venue: Philosophy Resource Room (AS3-05-23)
10am to 11.30am : John Williams, SMU, “Once You Think You’re Wrong, You Must Be Right: New Versions of the Preface Paradox”
Some take the view that the so-called ‘preface paradox’ shows that rationality may allow you to have inconsistent beliefs, in contradiction of orthodox views of justification. Here I argue for the conclusion that rationality may require you, as a real human thinker, to have inconsistent beliefs, even when you recognize the inconsistency. Perhaps the most vigorous opposition to my conclusion comes from classical and insightful objections by Doris Olin. After preliminary clarification, I first discuss three versions of the paradox: Makinson’s Original Version, my World Capitals and Olin’s Fallibility. For each version, I consider objections to the effect that my conclusion is not established. None of the three versions is entirely free from objection. I show that there is an important mistake in Makinson’s logic that seems to have long gone unnoticed, with the result that his original case is one in which your beliefs are not inconsistent. The case may be modified to evade this difficulty, but then it is doubtful that it is realistic. World Capitals avoids this difficulty but is vulnerable to Olin’s objection (one that she makes against her own version, Fallibility) that accepting the possibility of justified inconsistent beliefs saddles you with a pair of justified beliefs that are in explicit contradiction. However, I present Modesty, a version that supposes that you believe that at least one of your beliefs (excluding this very belief) is false. I argue that this version escapes all the objections that could trouble the other versions, as well as some interesting general objections that Olin makes. I conclude that this is a live and everyday case in which rationality requires you to have inconsistent beliefs even while you recognize that your beliefs are inconsistent. I also argue, more tentatively, for the same verdict for Modesty*, a version that supposes that you believe that at least one of your beliefs (including this very belief) is false.
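Schematically (an editorial gloss; the notation is not part of the abstract): in Modesty you believe each of the claims $p_1, \ldots, p_n$ and also believe that at least one of them is false, so your beliefs include the jointly unsatisfiable set

\[
\{\, p_1,\; p_2,\; \ldots,\; p_n,\; \neg (p_1 \wedge p_2 \wedge \cdots \wedge p_n) \,\},
\]

since no assignment of truth values makes every member true, even though each member may be individually well justified.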
About the Speaker:
John N. Williams (PhD Hull) works primarily in epistemology. He also works in philosophy of language and applied ethics. He has published in Mind, Analysis, the Journal of Philosophy, the Australasian Journal of Philosophy, Synthese, Philosophical Studies, Acta Analytica, Philosophia, Philosophy East and West, American Philosophical Quarterly, Philosophy, Philosophy Compass, the Journal of Philosophical Research, Religious Studies, Theoria, Social Epistemology Review and Reply Collective, and Logos and Episteme. He is co-editor, together with Mitchell Green, of Moore’s Paradox: New Essays on Belief, Rationality and the First Person (Oxford University Press). He researches and teaches in the School of Social Sciences at Singapore Management University; before that he taught in the Philosophy Department at the National University of Singapore and was Head of the Unit of Philosophy at the University of the West Indies. In August he will take up a Professorship in Philosophy at Nazarbayev University in Kazakhstan.
11.30am to 12.30pm : Robert Beddor, NUS, “Modal Conditions on Knowledge and Skilled Performance”
In this talk I examine two prominent analyses of knowledge in the current literature. One is a modal analysis, which identifies knowledge with safe belief. The other is a virtue-epistemological analysis, which identifies knowledge with a type of apt performance. These two approaches are usually viewed as rivals; this talk offers a path to reconciliation. I outline a new form of virtue epistemology, which combines an analysis of knowledge as skillful performance with a modal analysis of skillfulness. I argue that the resulting view, “Modal Virtue Epistemology”, preserves the main benefits of both analyses.
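As a rough gloss on the modal condition mentioned here (a common textbook formulation, offered as an editorial aid rather than the speaker’s own wording), a belief is safe roughly when it could not easily have been false:

\[
\text{Safe}_S(p) \;\approx\; \text{in all nearby worlds } w \text{ in which } S \text{ believes } p \text{ (on the same basis)},\; p \text{ is true at } w .
\]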
2pm to 3pm : Tang Weng Hong, NUS, “Moore’s Paradox and Degrees of Belief”
It is absurd to assert or to believe the following:
(1) It’s raining, and I do not believe that it’s raining.
(2) It’s raining, and I believe that it’s not raining.
But is merely assigning a degree of belief greater than 0.5 to (1) or to (2) absurd? I maintain, along with Adler and Armour-Garb (2007), that (a) assigning a degree of belief greater than 0.5 to (1) need not be absurd. But I also maintain that (b) assigning a degree of belief greater than 0.5 to (2) is indeed absurd. What explains this discrepancy? Adler and Armour-Garb think that (a) can be explained by their view that full beliefs are transparent whereas partial beliefs are not. But such a view does not explain (b). In my talk, I consider John’s account of why it is absurd to believe (1) and to believe (2). (See, in particular, ‘Moore’s Paradoxes, Evans’s Principle and Self-Knowledge’.) I also consider how John’s account may be supplemented to help us account for the aforementioned discrepancy.
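In symbols (an editorial gloss; ‘B’ for ‘I believe that’ and ‘Cr’ for my degree of belief, neither notation taken from the abstract), (1) and (2) are the often so-called omissive and commissive Moorean conjunctions

\begin{align*}
(1)\quad & p \wedge \neg B p, \\
(2)\quad & p \wedge B \neg p,
\end{align*}

and the question above is whether it can be rational to have $\mathrm{Cr}(p \wedge \neg B p) > 0.5$ or $\mathrm{Cr}(p \wedge B \neg p) > 0.5$.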
3pm to 4pm : Ben Blumson, NUS, “Knowability and Believability”
Moore’s paradox in belief and Fitch’s paradox of knowability are very closely related: whereas the first concerns whether truths of the form “p and I don’t believe p” are believable (without absurdity), the second concerns whether truths of the form “p and it’s not known that p” are knowable. In this paper, I consider how and whether responses to Moore’s paradox constrain the correct response to Fitch’s paradox, and vice versa. Finally, I discuss implications for metaphysical anti-realism.
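For reference, here is the standard derivation behind Fitch’s paradox (a textbook reconstruction added editorially, not taken from the abstract). Let $Kp$ mean ‘$p$ is known (by someone at some time)’ and let (KP) be the knowability principle that every truth is knowable.

\begin{align*}
&\text{(KP)}\quad p \rightarrow \Diamond K p \quad \text{for every } p.\\
&\text{Suppose some truth is unknown: } q \wedge \neg K q. \text{ By (KP), } \Diamond K(q \wedge \neg K q).\\
&\text{But } K(q \wedge \neg K q) \vdash K q \wedge K \neg K q \vdash K q \wedge \neg K q \quad \text{(distribution over } \wedge\text{, factivity)},\\
&\text{so } \neg K(q \wedge \neg K q) \text{ is a theorem, hence } \neg \Diamond K(q \wedge \neg K q), \text{ a contradiction.}\\
&\text{So, given (KP), there is no unknown truth: } p \rightarrow K p \text{ for every } p.
\end{align*}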
All are welcome