We present an approach to combining three areas of research which we claim are all based on information theory: knowledge representation in Artificial Intelligence and Cognitive Science using prototypes, plans, or schemata; formal semantics in natural language, especially the semantics of the `if-then' conditional construct; and the logic of subjunctive conditionals first developed using a possible worlds semantics by Stalnaker and Lewis. The basic premise of the paper is that both schema-based inference and the semantics of conditionals are based on Dretske's notion of information flow and Barwise and Perry's notion of a constraint in situation semantics. That is, the connection between antecedent $A$ and consequent $B$ of a conditional `if $A$ were the case then $B$ would be the case' is an informational relation holding with respect to a pragmatically determined utterance situation. The bridge between AI and conditional logic is that a prototype or planning schema represents a situation type, and the background assumptions underlying the application of a schema in a situation correspond to channel conditions on the flow of information. Adapting the work of Stalnaker and Lewis, the semantics of conditionals is modeled by a refinement ordering on situations: a conditional `if $A$ then $B$' holds with respect to a situation if all the minimal refinements of the situation that support $A$ also support $B$. We present new logics of situations, information flow, and subjunctive conditionals based on three-valued partial logic that formalize our approach, and conclude with a discussion of the resulting theory of conditionals, including the "paradoxes" of conditional implication, the difference between truth conditions and assertability conditions for subjunctive conditionals, and the relationship between subjunctive and indicative conditionals.
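The refinement-ordering clause for conditionals stated above can be sketched as follows (our notation, not verbatim from the paper: $s \sqsubseteq s'$ is assumed to mean that $s'$ refines $s$, and $s \models A$ that situation $s$ supports $A$):

```latex
% Sketch of the minimal-refinement truth condition (assumed notation):
% a conditional A > B holds at s iff every minimal A-supporting
% refinement of s also supports B.
s \models (A > B)
  \;\iff\;
  \forall s' \in \min\nolimits_{\sqsubseteq}
    \{\, s'' \sqsupseteq s : s'' \models A \,\} :\;
  s' \models B
```

This parallels the Stalnaker-Lewis possible-worlds clause, with minimal refinements of a partial situation playing the role of closest antecedent-worlds.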
"An Information-Based Theory of Conditionals." Notre Dame J. Formal Logic 41 (2) 95 - 141, 2000. https://doi.org/10.1305/ndjfl/1038234607