I came across a PDF version of this book online. It contains a number of fine essays, including the ones excerpted below. A recurring question concerning Godel's incompleteness results is whether they impact "interesting" mathematical questions.

This volume also includes Paul Cohen's essay (chapter 19) on his work on the Continuum Hypothesis and his interactions with Godel. See also Horizons of Truth.

CHAPTER 21, The Godel Phenomenon in Mathematics: A Modern View: ... Hilbert believed that all mathematical truths are knowable, and he set the threshold for mathematical knowledge at the ability to devise a “mechanical procedure.” This dream was shattered by Godel and Turing. Godel’s incompleteness theorem exhibited true statements that can never be proved. Turing formalized Hilbert’s notion of computation and of finite algorithms (thereby initiating the computer revolution) and proved that some problems are undecidable – they have no such algorithms.

Though the first examples of such unknowables seemed somewhat unnatural, more and more natural examples of unprovable or undecidable problems were found in different areas of mathematics. The independence of the continuum hypothesis and the undecidability of Diophantine equations are famous early examples. This became known as the Godel phenomenon, and its effect on the practice of mathematics has been debated since. Many argued that though some of the inaccessible truths above are natural, they are far from what is really of interest to most working mathematicians. Indeed, it would seem that in the seventy-five years since the incompleteness theorem, mathematics has continued thriving, with remarkable achievements such as the recent settlement of Fermat’s last “theorem” by Wiles and the Poincare conjecture by Perelman. Are there interesting mathematical truths that are unknowable?

The main point of this chapter is that when knowability is interpreted by modern standards, namely, via computational complexity, the Godel phenomenon is very much with us. We argue that to understand a mathematical structure, having a decision procedure is but a first approximation; a real understanding requires an efficient algorithm. Remarkably, Godel was the first to propose this modern view in a letter to von Neumann in 1956, which was discovered only in the 1990s.

Meanwhile, from the mid-1960s on, the field of theoretical computer science has made formal Godel’s challenge and has created a theory that enables quantification of the difficulty of computational problems. In particular, a reasonable way to capture knowable problems (which we can efficiently solve) is the class P, and a reasonable way to capture interesting problems (which we would like to solve) is the class NP. Moreover, assuming the widely believed P ≠ NP conjecture, the class NP-complete captures interesting unknowable problems. ...
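The P/NP distinction in the excerpt can be made concrete with a toy sketch (my own illustration, not from the chapter): for Boolean satisfiability, *verifying* a proposed solution takes time polynomial in the formula size, while the only known general way to *find* one is search over all 2^n assignments.

```python
# Toy illustration of the P vs. NP distinction: fast verification,
# exponential search. A CNF formula is a list of clauses; each literal
# is (variable_index, is_positive). This encodes
# (x0 or x1) and (not x0 or x2) and (not x1 or not x2).
from itertools import product

formula = [[(0, True), (1, True)],
           [(0, False), (2, True)],
           [(1, False), (2, False)]]

def verify(formula, assignment):
    """Polynomial-time check that an assignment satisfies every clause."""
    return all(any(assignment[v] == pos for v, pos in clause)
               for clause in formula)

def brute_force_sat(formula, n_vars):
    """Exponential search over all 2**n assignments -- the hard part."""
    for bits in product([False, True], repeat=n_vars):
        if verify(formula, list(bits)):
            return list(bits)
    return None

print(brute_force_sat(formula, 3))   # → [False, True, False]
```

The gap between the cost of `verify` and the cost of `brute_force_sat` is exactly the gap the chapter identifies between having a decision procedure and having an efficient algorithm.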

My recent interest in this topic parallels a remark by David Deutsch, quoted further below.

Cohen: ... I still had a feeling of skepticism about Godel's work, but skepticism mixed with awe and admiration.

I can say my feeling was roughly this: How can someone thinking about logic in almost philosophical terms discover a result that had implications for Diophantine equations? ... I closed the book and tried to rediscover the proof, which I still feel is the best way to understand things. I totally capitulated. The Incompleteness Theorem was true, and Godel was far superior to me in understanding the nature of mathematics.

Although the proof was basically simple, when stripped to its essentials I felt that its discoverer was above me and other mere mortals in his ability to understand what mathematics -- and even human thought, for that matter -- really was. From that moment on, my regard for Godel was so high that I almost felt it would be beyond my wildest dreams to meet him and discover for myself how he thought about mathematics and the fount from which his deep intuition flowed. I could imagine myself as a clever mathematician solving difficult problems, but how could I emulate a result of the magnitude of the Incompleteness Theorem? There it stood, in splendid isolation and majesty, not allowing any kind of completion or addition because it answered the basic questions with such finality.

The reason why we find it possible to construct, say, electronic calculators, and indeed why we can perform mental arithmetic, cannot be found in mathematics or logic. The reason is that the laws of physics "happen" to permit the existence of physical models for the operations of arithmetic such as addition, subtraction and multiplication.

That suggests the primacy of physical reality over mathematics (usually the opposite assumption is made!): the parts of mathematics which are simply models or abstractions of "real" physical things are most likely to be free of contradiction or misleading intuition. Aspects of mathematics which have no physical analog (e.g., infinite sets) are prone to problems in formalization or mechanization. Physics (models which can be compared to experimental observation; actual "effective procedures") does not ever *require* infinity, although it may be of some conceptual convenience. Hence one suspects, along the lines above, that mathematics without something like the "axiom of infinity" might be well-defined. Is there some sort of finiteness restriction (e.g., an upper bound on Godel number) that evades Godel's theorem? If one only asks arithmetical questions about numbers below some upper bound, can't one avoid undecidability?
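One half of that question has an easy answer worth making explicit: any arithmetical statement whose quantifiers range only over numbers below a fixed bound is decidable by finite enumeration, with no theory required. The catch is cost, which brings back the complexity theme. A minimal sketch (the Fermat-style equation here is just a convenient example of a bounded Diophantine question):

```python
# A bounded arithmetical question is decidable by brute force:
# does x**3 + y**3 == z**3 have a solution with 1 <= x, y, z < bound?
# (It doesn't, by Fermat's last theorem -- but the *bounded* check
# needs no such theorem, only bound**3 arithmetic operations.)
from itertools import product

def bounded_solution_exists(bound):
    for x, y, z in product(range(1, bound), repeat=3):
        if x**3 + y**3 == z**3:
            return True
    return False

print(bounded_solution_exists(20))   # → False
```

Decidability is restored, but the cost grows as bound³ here, and worse for more quantifiers, which is the modern (complexity-theoretic) form of the Godel phenomenon discussed above.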

## 20 comments:

Proof Envy http://rjlipton.wordpress.com/2014/06/11/proof-envy/

That last paragraph you wrote was brilliant.

"primacy of primacy of physical reality over mathematics over mathematics"

However, "physics" is the mind's abstraction of the physical reality, and thus it is on par with math.

"Physics does not ever require infinity"

However, it does encompass or give rise to singularity.

I can't imagine why anyone would find the continuum hypothesis uninteresting.

It's not just convenience, but coherence, which is lost if one tries to wholly exclude the infinite from physics. Nor have intuitionism, still less finitism or ultrafinitism, shown themselves to be useful mathematical tools, despite the intrinsic interest of showing that many things still work in their new context.

Search Aristotelian mathematics Franklin, interesting stuff & new book on similar topics.

I am not a physicist. I've been trying to find out the relationship between the incompleteness theorem and the uncertainty principle for quite some time without success. The only reference I found was a footnote in a book that said that while Feynman was still a graduate student he approached Godel personally to ask about that and he was promptly kicked out by Godel. No further explanation was given.

This BBC program explained Godel's 'dangerous' mental state which might be why that Feynman episode happened.

http://topdocumentaryfilms.com/dangerous-knowledge/

Apparently Godel was trying, without success, to prove that the incompleteness theorem did not apply to the creative mind, and that drove him literally mad. Feynman trying to link the incompleteness theorem to something physical must have touched a raw nerve.

Recursion comes at the cost of decidability.

That is similar to something I wrote on the Ultranet list back in 2004. If ZF axioms include the axiom of infinity, then the existence of infinity cannot be deduced from the other axioms. The maximum physically significant number at any given time might be something like the number of Planck 4-volumes in the past light-cone raised to the power of the number of possible permutations of particles in the universe, or perhaps the number of permutations of possible particles (for instance, converting all mass to microwave-background-temperature photons). So if the universe is open (no big crunch), then the maximum possibly physically significant number grows without bound, but is never a completed infinity.

The continuum problem has a length-scale cutoff well above putting any of the above numbers in the length denominator. Probing below a certain resolution will involve using so much energy in such a small volume that persistent black holes will form.
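The ingredients of such a "maximum physically significant number" can be put into rough figures (my back-of-envelope sketch; all values are order-of-magnitude approximations, not precise constants):

```python
# Order-of-magnitude inputs for the "largest physically significant
# number" idea above. All constants are approximate.
PLANCK_TIME = 5.4e-44        # seconds
PLANCK_LENGTH = 1.6e-35      # meters
AGE_UNIVERSE = 4.4e17        # seconds (~13.8 Gyr)
RADIUS_OBSERVABLE = 4.4e26   # meters (~46.5 Gly comoving)

planck_times = AGE_UNIVERSE / PLANCK_TIME                  # ~8e60
planck_volumes = (RADIUS_OBSERVABLE / PLANCK_LENGTH) ** 3  # ~2e184
print(f"{planck_times:.1e} Planck times, {planck_volumes:.1e} Planck volumes")
```

Raising numbers of this size to the power of a particle-permutation count gives something vastly larger still, yet, as the comment says, always finite at any given time.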

That idea fundamentally conflicts with quantum mechanics. Only infinite-dimensional spaces support the commutation relationships observed in quantum systems. You do not have the Heisenberg uncertainty relationship in finite-dimensional spaces; there would always be at least one QFT field mode where both momentum and position were simultaneously observable.

You can prove this yourself from the basic properties of the determinant on finite dimensional spaces.
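The commenter points at the determinant; the textbook one-line version of the same no-go result goes through the trace: for any finite n×n matrices, tr(AB − BA) = tr(AB) − tr(BA) = 0, whereas the canonical commutation relation [X, P] = iħI would require trace iħn ≠ 0. A quick numerical check of both sides (illustrative sketch, mine, with ħ set to 1):

```python
# For ANY finite-dimensional A, B: trace of the commutator is exactly 0,
# but [X, P] = i*hbar*I would need trace i*hbar*n != 0. Hence the
# canonical commutation relation has no finite-dimensional representation.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

commutator = A @ B - B @ A
print(abs(np.trace(commutator)))        # ~0 up to floating-point rounding
print(abs(np.trace(1j * np.eye(n))))    # = n, never 0: the contradiction
```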

That's an interesting point, which I'll have to think about. I suspect that there aren't any observable differences between literal infinity and being without bound. Renormalization's unsatisfactory meaning seems like a reason to prefer finite concepts, but physicists seem to have developed a taste for it, despite perhaps not being too clear on what they are actually doing.

Yeah, the real problem is with putting together any kind of theory that looks like physics as we know it which does not involve infinities. People have tried -- some of them exceedingly smart people -- with no outcome that is convincing. Intuitionist and similar kinds of mathematical approaches produce their own highly counterintuitive implications.

Certainly physics as we know it involves continuous functions and therefore the continuum, so the question of the continuum hypothesis is a real one in the standard form of physics.

No actual calculation in physics requires infinity. The easiest way to see this is to note that calculations required to compare theory to experiment can be done on finite computing machines. The continuum is an idealization and we are not sure, due to quantum gravity, whether the structure of spacetime is actually continuous. (The same applies to quantum fields and even to Hilbert space.) Some *theories* of physics may invoke infinity, but careful consideration reveals that the necessity of infinity will never be *experimentally testable* (i.e., there are related theories which do not invoke infinity which cannot be excluded by experiment).

Intuitionism in mathematics captures a good deal of what you are getting at here -- it focuses on "mental constructions" as the basis for mathematics, which is a reasonable surrogate for "calculations" as you are using the term here. Intuitionism likewise rejects the concept of an actual infinity.

http://en.wikipedia.org/wiki/Intuitionism

The problem is that intuitionism displays well known conceptual anomalies (or what are at least perceived as anomalies by many), such as the failure of the law of the excluded middle. And if one is even more restrictive than the usual intuitionism, then only more such anomalies crop up.

Rather than go down those messy paths, though, I think there might be a useful thought experiment for you to clarify your thinking. Do you really believe that there aren't an infinite number of locations between, say, the tip of your nose and the top of your head? Even if any calculation involving those two points involves only a finite number of steps, don't you believe that in fact, out in the real world, there are an infinite number of intermediary locations? Indeed aren't there a continuum number of such points out there? If not, why not? What does calculation have to do with the number of such points?

I think the evidence is quite strong that there is a "minimal length" in spacetime. The continuum is an *idealization* that comports with our intuition because this minimal length (the Planck length) is so much smaller even than the size of atoms.

http://arxiv.org/abs/hep-th/0405033

Newton and Leibniz are rolling in their graves at that statement. To start with, the whole of calculus needs not only infinity but also the continuum, and without calculus you cannot derive any of the functions used in falsification experiments.

As for space-time being discrete, considering how carefully Lorentz invariance has been tested, and the latest gamma-ray observations that severely constrain any frequency dependence of the speed of light (which would occur if space-time were discrete), my money is on both infinities and the continuum. First because there have been no experiments to falsify those assumptions, second because of the enormous utility of the assumptions in theory and calculation.

Lorentz invariance could be an emergent symmetry. It doesn't require anything about discreteness at small scales -- see the lattice (where Euclidean invariance is emergent at long distances), for example.

Obviously, there is a discrete version of calculus (see numerical analysis).
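As a minimal sketch of what "discrete calculus" means here (my illustration): a central finite difference approximates a derivative using only finite arithmetic, with error shrinking like h².

```python
# Central finite difference: a discrete surrogate for the derivative,
# with truncation error O(h**2). Everything here is finite arithmetic.
import math

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

for h in (1e-1, 1e-2, 1e-3):
    error = abs(central_diff(math.sin, 1.0, h) - math.cos(1.0))
    print(h, error)   # error shrinks roughly as h**2
```

The catch, as the reply below notes, is that the *meaning* of that shrinking error is usually stated as a limit, i.e., with infinity in the background.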

Then I challenge you to, for example, derive Maxwell's equations, or the stress-energy of a non-Riemannian manifold, using only finite difference equations; and remember, by your assumptions you have to state a cutoff for the calculations a priori - you are not allowed to cheat and say "we will do such and such in finite steps, but take the steps off to infinity."

As for Lorentz invariance being discretely emergent, the subgroup of one-dimensional boosts is not cyclic, so the smallest discrete subgroup of the Lorentz symmetries would be isomorphic to the integers. Unless you want to claim that the boosts just stop at some level, but given the momenta of protons in the LHC, electrons at SLAC, and cosmic rays, that level is very large.

Among other things infinity is the only meaningful way to understand approximation and error. That is, it provides meaning to statements of the form "if you do X for N steps we will be within Epsilon of the solution, and doing M more steps will reduce the error by Delta"
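That epsilon/N statement can be made fully concrete with bisection (my sketch): after N halvings, a bracketing interval of width b − a has shrunk to (b − a)/2^N, so N = ⌈log₂((b − a)/ε)⌉ steps guarantee the root is known to within ε.

```python
# Bisection with an explicit a-priori step count: N = ceil(log2((b-a)/eps))
# steps guarantee the bracketing interval is narrower than eps.
import math

def bisect(f, a, b, eps):
    n_steps = math.ceil(math.log2((b - a) / eps))
    for _ in range(n_steps):
        mid = (a + b) / 2
        if f(a) * f(mid) <= 0:   # root stays in the left half
            b = mid
        else:                    # root stays in the right half
            a = mid
    return (a + b) / 2, n_steps

root, steps = bisect(lambda x: x * x - 2, 1.0, 2.0, 1e-6)
print(root, steps)   # ≈ sqrt(2), in exactly 20 steps
```

Note that stating *why* this works ("for every eps there exists N...") is precisely the kind of quantification over all eps > 0 that the comment says infinity underwrites.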

Also check out Rudy Rucker's Conversations with Kurt Gödel and a follow-up post that is linked near the top of that page. Rudy writes: "I scanned my mostly handwritten notes on my conversations with the great logician Kurt Gödel during the years 1972-1977, and saved the scans into a single PDF."

“Is there some sort of finiteness restriction … that evades Gödel’s theorem?”

Consider 4 ideas:

DNA makes RNA makes protein.

Experimental physics trumps theoretical physics trumps mathematics trumps philosophy.

Milgrom is the Kepler of contemporary cosmology.

Nature is finite and digital. (Zuse and Fredkin)

See also: "The Fate of the Quantum" by Gerard 't Hooft, 2013; the dark matter crisis; MOND and the Photon Underproduction Crisis; Lambda-VDM Model: a Testable Modification of Lambda-CDM; "What is Measurement? Why Does Measurement Exist?"
