Speaker: J. Michael Dunn, Founding Dean Emeritus, Professor Emeritus of Informatics and Computer Science
When: Wednesday, September 6, 4:00 PM
Where: Ballantine Hall 217
Topic: The Paradox of the Two Firefighters: A Paraconsistent Logic Solution and a Probability Solution
Abstract: This talk is based on joint work with Nicholas Kiefer (Departments of Economics and Statistical Sciences, Cornell Univ.).
Bar-Hillel and Carnap (1953, p. 229) wrote: "It might perhaps, at first, seem strange that a self-contradictory sentence, hence one which no ideal receiver would accept, is regarded as carrying with it the most inclusive information. ... A self-contradictory sentence asserts too much; it is too informative to be true." Floridi (2011) advocates a "strong semantic theory of information" that requires informative sentences to be true, and thus avoids what he calls the "Bar-Hillel-Carnap Paradox" (BCP). Floridi holds that contradictions contain zero information, and that "inconsistent information is obviously of no use to a decision maker."
Various paraconsistent logicians (including myself) have argued for theories of information wherein contradictions can contain information. But the “Paradox of the Two Firefighters” below occurred to me only recently as a “common-sense” example to illustrate this.
Suppose you are awakened in your hotel room on the 30th floor by a fire alarm. You look around and see that your room is on fire. You open the door. You see three possible ways out: left, right, straight ahead.
Scenario 1: You see two apparently experienced and well-intentioned firefighters. One says there is exactly one safe route, and it is to your left. The other says there is exactly one safe route, and it is to your right. Contradictory information!

Scenario 2: You find no one to give directions. Incomplete information! Nothing!

Question: Which scenario would you prefer?
I think it is obvious that a rational agent would prefer to be in Scenario 1. While the two firefighters are giving you contradictory information, they are also both giving you the useful information that there is a safe way out and that it is not straight ahead. Three choices have in effect been reduced to two, thereby increasing your odds of survival. Now you have to pick one of those two and run, hoping to find the exit.
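The improvement in the odds can be made explicit with back-of-the-envelope arithmetic (my own illustration, assuming exactly one of the three routes is safe and that, lacking better information, you pick uniformly at random among the candidate routes):

```python
# Illustrative arithmetic only: assume exactly one of the three routes is
# safe and the guest guesses uniformly among the candidates.

# Scenario 2: no directions at all, so guess among all three routes.
p_no_info = 1 / 3

# Scenario 1: the contradictory reports still agree that the safe route
# is not straight ahead, so guess between left and right only.
p_contradictory = 1 / 2

assert p_contradictory > p_no_info  # contradictory information beats none
```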
In my talk I shall consider various reactions to the Paradox of the Two Firefighters and defend the view that it is an example of contradictory information being better than nothing (and relevant to decision making). I shall present two solutions: one a (paraconsistent) logical solution, the other a probabilistic approach.
The first uses the apparatus of Dunn (2009), where the "Belnap-Dunn 4-valued Logic" (Truth, Falsity, Neither, Both) was embedded into a context of subjective probability generalized to allow for degrees of belief, disbelief, and two kinds of uncertainty --- that in which the reasoner has too little information (ignorance) and that in which the reasoner has too much information (conflict). Jøsang's (1997) "Opinion Triangle" was thus expanded to an "Opinion Tetrahedron" with the 4-values as its vertices.
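As a toy encoding of the tetrahedron idea (my own sketch, not the formal apparatus of Dunn (2009); the particular coordinates assigned below are hypothetical), an opinion can be taken as a point whose non-negative coordinates for belief, disbelief, ignorance, and conflict sum to 1, with the four Belnap-Dunn values at the vertices:

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """A point in the Opinion Tetrahedron: degrees of belief, disbelief,
    ignorance (too little information), and conflict (too much)."""
    belief: float
    disbelief: float
    ignorance: float
    conflict: float

    def __post_init__(self):
        total = self.belief + self.disbelief + self.ignorance + self.conflict
        assert abs(total - 1.0) < 1e-9, "coordinates must sum to 1"

# The four Belnap-Dunn values sit at the vertices:
TRUE    = Opinion(1, 0, 0, 0)
FALSE   = Opinion(0, 1, 0, 0)
NEITHER = Opinion(0, 0, 1, 0)   # Scenario 2: no information at all
BOTH    = Opinion(0, 0, 0, 1)   # pure contradiction

# Scenario 1, for the proposition "the safe route is to your left":
# the reports conflict, but they are not worthless, so the opinion sits
# in the interior of the tetrahedron rather than at a vertex.
left_is_safe = Opinion(belief=0.25, disbelief=0.25, ignorance=0.0, conflict=0.5)
```

The point of the encoding is that contradictory reports land at a coordinate with high conflict but low ignorance, which is a different location (and a different epistemic state) from the NEITHER vertex of Scenario 2.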
But there is an alternative solution, also based on subjective probability but of a more standard type. This solution builds upon “linear opinion pooling” (Stone (1961)). Kiefer (2007, 2011) developed apparatus for assessing risk using expert opinion, and this influences the second solution.
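A linear opinion pool in Stone's sense is simply a weighted average of the experts' probability assignments over the same outcomes. A minimal sketch (the equal weights below are a hypothetical choice, not part of Kiefer's apparatus):

```python
def linear_pool(distributions, weights):
    """Stone-style linear opinion pool: a weighted average of the
    experts' probability distributions over the same outcome space."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    n = len(distributions[0])
    return [sum(w * dist[i] for w, dist in zip(weights, distributions))
            for i in range(n)]

# Outcomes: (left, right, straight ahead).
firefighter_1 = [1.0, 0.0, 0.0]   # "the one safe route is to your left"
firefighter_2 = [0.0, 1.0, 0.0]   # "the one safe route is to your right"

# With equal weights the pooled opinion rules out "straight ahead"
# entirely, matching the common-sense reading of the paradox.
pooled = linear_pool([firefighter_1, firefighter_2], [0.5, 0.5])
# pooled == [0.5, 0.5, 0.0]
```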
Kiefer and I had hoped to find some "isomorphism" between our two solutions, but so far all we have is two different solutions – the minimal number required for any paradox.
The talk shall also consider "Big Data" and the World Wide Web. Large data sets are by their nature likely to be inconsistent, especially when they are as open as the WWW. Suppose you want to find an answer to a certain yes/no question on the WWW. Which of the following scenarios do you prefer? (A) You google and get no (relevant) response. (B) You google and get multiple conflicting responses. We have all found in our googling that it is often better to find contradictory information on a search topic than to find no information at all.

Some of the reasons why this is so include finding that there is at least active interest in the topic, and being able then to appraise the credentials of the informants, count their relative number, assess their arguments and evidence, try to reproduce their experimental results, discover authoritative sources, etc. Any or all of these might apply to a particular case, and together they allow us to assign the contradictory information a coordinate in the Opinion Tetrahedron different from merely False. There is again, though, the alternative of assessing the provenance of the information using linear opinion pooling.
References:
J. Michael Dunn (2009), "Contradictory Information: Too Much of a Good Thing," Journal of Philosophical Logic, 39, 425-452.
Yehoshua Bar-Hillel and Rudolf Carnap (1953), "Semantic Information," The British Journal for the Philosophy of Science, 4, 147-157.
Luciano Floridi (2011), The Philosophy of Information, Oxford University Press.
Audun Jøsang (1997), "Artificial Reasoning with Subjective Logic," Proceedings of the Second Australian Workshop on Commonsense Reasoning, Perth.
Nicholas M. Kiefer (2007), "The Probability Approach to Default Probabilities," Risk, 146-150.
Nicholas M. Kiefer (2011), "Default Estimation, Correlated Defaults and Expert Information," Journal of Applied Econometrics, 26, 173-192.
Mervyn Stone (1961), "The Opinion Pool," Annals of Mathematical Statistics, 32, 1339-1342.