In 1986, Edward Nelson published a book defending an extreme formalist view of mathematics, according to which there is an impassable barrier in the totality of exponentiation. On the positive side, Nelson embarks on a program of investigating how much mathematics can be interpreted in Raphael Robinson's theory of arithmetic Q. In the shadow of this program, a number of people have produced some very nice logical investigations and results, concerning not only what can be interpreted in Q but also what cannot be so interpreted. We explain some of these results and rely on them to discuss Nelson's position.
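For orientation, Robinson's Q is a finitely axiomatized theory of arithmetic; one common presentation (equivalent formulations exist, and the paper's exact axiomatization may differ) is:
\begin{align*}
&Sx \neq 0, \qquad Sx = Sy \rightarrow x = y, \qquad x \neq 0 \rightarrow \exists y\,(x = Sy),\\
&x + 0 = x, \qquad x + Sy = S(x + y), \qquad x \cdot 0 = 0, \qquad x \cdot Sy = (x \cdot y) + x.
\end{align*}
Notably, Q has no induction axioms at all, which is what makes the question of how much mathematics can nonetheless be interpreted in it delicate and interesting.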
We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention to measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial-segment complexity of reals and (b) the power of reals to compress strings when they are used as oracles. We put the results into context and draw connections with several central issues in modern algorithmic randomness and computability.
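Two standard reducibilities of this kind, corresponding to (a) and (b) respectively, are the following (here $K$ denotes prefix-free Kolmogorov complexity and $K^{X}$ its relativization to oracle $X$; which precise notions the survey treats is a matter for the paper itself):
\begin{align*}
A \leq_K B &\iff \exists c\,\forall n\ \ K(A \upharpoonright n) \leq K(B \upharpoonright n) + c,\\
A \leq_{LK} B &\iff \exists c\,\forall \sigma\ \ K^{B}(\sigma) \leq K^{A}(\sigma) + c.
\end{align*}
The first says that the initial segments of $B$ are at least as hard to describe as those of $A$; the second that $B$ is at least as powerful a compressor as $A$ when used as an oracle.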
In this paper we isolate a notion that we call “formalism freeness” from Gödel's 1946 Princeton Bicentennial Lecture, which asks for a transfer of the Turing analysis of computability to the cases of definability and provability. We suggest an implementation of Gödel's idea in the case of definability, via versions of the constructible hierarchy based on fragments of second-order logic. We also trace the notion of formalism freeness through the broader development of mathematical logic in the 20th century.
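As a sketch of the construction being varied (the standard definition; the paper's versions modify it as indicated), Gödel's constructible hierarchy is generated by first-order definability:
\begin{align*}
L_0 &= \emptyset,\\
L_{\alpha+1} &= \mathrm{Def}(L_\alpha) \quad \text{(subsets of } L_\alpha \text{ first-order definable with parameters)},\\
L_\lambda &= \bigcup_{\beta < \lambda} L_\beta \ \ \text{for limit } \lambda, \qquad L = \bigcup_{\alpha \in \mathrm{Ord}} L_\alpha.
\end{align*}
The versions considered in the paper replace $\mathrm{Def}$ by definability in a fragment of second-order logic, so that the resulting hierarchy, and the inner model it generates, may differ from $L$.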