The process [by which any individual settles into new opinions] is always the same. The individual has a stock of old opinions already, but he meets a new experience that puts them to a strain…. The result is an inward trouble to which his mind till then had been a stranger, and from which he seeks to escape by modifying his previous mass of opinions. He saves as much of it as he can, for in this matter of belief we are all extreme conservatives. So he tries to change first this opinion, and then that (for they resist change very variously), until at last some new idea comes up which he can graft upon the ancient stock with a minimum of disturbance of the latter, some idea that mediates between the stock and the new experience and runs them into one most felicitously and expediently.
The new idea is then adopted as the true one. It preserves the older stock of truths with a minimum of modification, stretching them just enough to make them admit the novelty, but conceiving that in ways as familiar as the case leaves possible. (William James, Pragmatism, 1907)
The Egli-Milner power-ordering is used to define verisimilitude orderings on theories from preference orderings on models. The effects of the definitions on constraints such as stopperedness and soundness are explored. Orderings on theories are seen to contain more information than orderings on models. Belief revision is defined in terms of both types of orderings, and conditions are given which make the two notions coincide.
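For reference, the Egli-Milner lifting of a preorder $\le$ on models to sets of models (and hence, via their model classes, to theories) is standardly defined as follows; the orderings discussed in the abstract build on this lifting.

```latex
A \sqsubseteq_{EM} B
\quad\text{iff}\quad
(\forall a \in A\ \exists b \in B:\ a \le b)
\ \wedge\
(\forall b \in B\ \exists a \in A:\ a \le b)
```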
This paper continues the power ordering approach to verisimilitude. We define a parameterized verisimilar ordering of theories in the finite propositional case, both semantically and syntactically. The syntactic definition leads to an algorithm for computing verisimilitude. Since the power ordering approach to verisimilitude can be translated into a standard notion of belief revision, the algorithm thereby also allows the computation of membership of a belief-revised theory.
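To make the idea concrete, here is a minimal, purely illustrative sketch (not the paper's actual algorithm) of comparing two finite propositional theories by the Egli-Milner lifting of a "closer to the truth" preorder on models. The atom set, the designated true world, and the symmetric-difference preorder are all assumptions of this sketch.

```python
from itertools import product

# A model is a frozenset of true atoms; model a counts as at least as close
# to the truth as model b when a's symmetric difference with the designated
# true world is contained in b's.

ATOMS = ("p", "q", "r")
TRUTH = frozenset({"p", "q"})  # assumed actual world (an assumption of this sketch)

def models(theory):
    """All valuations (frozensets of true atoms) satisfying the predicate `theory`."""
    worlds = (frozenset(a for a, v in zip(ATOMS, bits) if v)
              for bits in product((False, True), repeat=len(ATOMS)))
    return {w for w in worlds if theory(w)}

def closer_or_equal(a, b):
    """Underlying preorder on models: a is at least as close to TRUTH as b."""
    return (a ^ TRUTH) <= (b ^ TRUTH)

def em_at_least_as_verisimilar(A, B):
    """Egli-Milner lifting: every model of B is matched by a model of A that is
    at least as close to the truth, and every model of A is at least as close
    to the truth as some model of B."""
    return (all(any(closer_or_equal(a, b) for a in A) for b in B) and
            all(any(closer_or_equal(a, b) for b in B) for a in A))

T1 = models(lambda w: "p" in w and "q" in w)   # theory: p and q
T2 = models(lambda w: "p" not in w)            # theory: not p

print(em_at_least_as_verisimilar(T1, T2))  # → True
print(em_at_least_as_verisimilar(T2, T1))  # → False
```

In the finite propositional case the comparison is decidable by brute force, as here; the paper's syntactic definition is what makes a practical algorithm possible.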
In this paper we discuss Gabbay's idea of basing nonmonotonic deduction on semantic consequence in intuitionistic logic extended by a consistency operator and Turner's suggestion of replacing the intuitionistic base system by Kleene's three-valued logic. It is shown that a certain counterintuitive feature of these approaches can be avoided by using Nelson's constructive logic N instead of intuitionistic logic or Kleene's system. Moreover, in N a more general notion of consistency can be defined and nonmonotonic deduction can thus be based on a logical system satisfying the Deduction Theorem.
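The Deduction Theorem referred to in the final sentence is the usual one:

```latex
\Gamma, \varphi \vdash \psi
\quad\text{iff}\quad
\Gamma \vdash \varphi \to \psi
```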
We consider the connections between belief revision, conditional logic and nonmonotonic reasoning, using as a foundation the approach to theory change developed by Alchourrón, Gärdenfors and Makinson (the AGM approach). This is first generalized to allow the iteration of theory change operations to capture the dynamics of epistemic states according to a principle of minimal change of entrenchment. The iterative operations of expansion, contraction and revision are characterized both by a set of postulates and by Grove's construction based on total pre-orders on the set of complete theories of the belief logic. We present a sound and complete conditional logic whose semantics is based on our iterative revision operation, but which avoids Gärdenfors's triviality result because of its severely restricted language of beliefs and the consequently weakened scope of our extended postulates. In the second part of the paper, we develop a computational approach to theory dynamics using Rott's E-bases as a representation for epistemic states. Under this approach, a ranked E-base is interpreted as standing for the most conservative entrenchment compatible with the base, reflecting a kind of foundationalism in the acceptance of evidence for a belief. Algorithms for the computation of our iterative versions of expansion, contraction and revision are presented. Finally, we consider the relationship between nonmonotonic reasoning and both conditional logic and belief revision. Adapting the approach of Delgrande, we show that the unique extension of a default theory expressed in our conditional logic of belief revision corresponds to the most conservative belief state which respects the theory; however, this correspondence is limited to propositional default theories.
Considering first order default theories, we present a belief revision algorithm which incorporates the assumption of independence of default instances and propose the use of a base logic for default reasoning which incorporates uniqueness of names. We conclude with an examination of the behavior of an implemented system on some of Lifschitz's benchmark problems in nonmonotonic reasoning.
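The Grove-style construction mentioned above is usually presented as follows (a sketch of the standard formulation, not necessarily the paper's iterative variant): given a total pre-order $\le$ on worlds (complete theories) in which the worlds of $K$ are minimal, and writing $[\varphi]$ for the set of worlds satisfying $\varphi$,

```latex
K * \varphi \;=\; \mathrm{Th}\bigl(\min{}_{\le}\,[\varphi]\bigr)
```

that is, the revised theory is what holds in the most plausible worlds satisfying the new information.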
A representation theorem is obtained for contraction operators that are based on Levi's recent proposal that selection functions should be applied to the set of saturatable contractions, rather than to maximal subsets as in the AGM framework. Furthermore, it is shown that Levi's proposal to base the selection on a weakly monotonic measure of informational value guarantees the satisfaction of both of Gärdenfors' supplementary postulates for contraction. These results indicate that Levi has succeeded in constructing a well-behaved operation of contraction that does not satisfy the postulate of recovery.
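The recovery postulate that Levi's contraction is free to violate is, in standard AGM notation (with $+$ denoting expansion and $\div$ contraction):

```latex
K \;\subseteq\; (K \div \varphi) + \varphi
```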
Alchourrón, Gärdenfors and Makinson have developed and investigated a set of rationality postulates which appear to capture much of what is required of any rational system of theory revision; we refer to their approach as the AGM paradigm. This set of postulates describes a class of revision functions; however, it does not provide a constructive way of defining such a function. There are two principal constructions of revision functions, namely epistemic entrenchment and systems of spheres. We provide a new constructive modeling for a revision function based on a nice preorder on models, and furthermore we give explicit conditions under which a nice preorder on models, an epistemic entrenchment, and a system of spheres yield the same revision function. Moreover, we provide an identity which captures the relationship between revision functions and update operators (as defined by Katsuno and Mendelzon).
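For orientation, the standard Katsuno-Mendelzon semantic contrast between revision ($\circ$) and update ($\diamond$), which is background for the identity mentioned above rather than the paper's specific result, is: revision selects the minimal models of the new information under a single ordering determined by the whole belief base, while update does so pointwise for each model of the base.

```latex
\mathrm{Mod}(\psi \circ \varphi) \;=\; \min{}_{\le_\psi} \mathrm{Mod}(\varphi),
\qquad
\mathrm{Mod}(\psi \diamond \varphi) \;=\; \bigcup_{w \,\in\, \mathrm{Mod}(\psi)} \min{}_{\le_w} \mathrm{Mod}(\varphi)
```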
This paper presents logics for reasoning about extension and reduction of partial information states. This enterprise amounts to nonpersistent variations of certain constructive logics, in particular the so-called logic of constructible falsity of Nelson. We provide simple semantics, sequent calculi, and completeness and decidability proofs.
In this paper we describe two approaches to the revision of probability functions. We assume that a probabilistic state of belief is captured by a counterfactual probability or Popper function, the revision of which determines a new Popper function. We describe methods whereby the original function determines the nature of the revised function. The first is based on a probabilistic extension of Spohn's OCFs, whereas the second exploits the structure implicit in the Popper function itself. This stands in contrast with previous approaches that associate a unique Popper function with each absolute (classical) probability function. We also describe iterated revision using these models. Finally, we consider the point of view that Popper functions may be abstract representations of certain types of absolute probability functions, but we show that our revision methods cannot be naturally interpreted as conditionalization on these functions.
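For context, a Popper function takes conditional probability as primitive: $P(\cdot \mid \cdot)$ is defined for every pair of propositions and agrees with ratio conditionalization whenever the condition has positive absolute probability,

```latex
P(A \mid B) \;=\; \frac{P(A \wedge B \mid \top)}{P(B \mid \top)}
\qquad \text{whenever } P(B \mid \top) > 0
```

which is what lets Popper functions represent belief conditional on propositions of absolute probability zero, i.e. counterfactual suppositions.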