I am a big proponent of there being only a finite number of numbers. Yes, I know the argument for there being an infinite number of natural numbers: assume not, take the largest one, and add one to it. And it really is that infinity which lets in all the rest, as far as I can tell.
Here is a counterexample: computer mathematics. I like to program. I like math. Putting them together is a sweet deal. But one has to admit that there is not an infinite number of numbers that can be represented on a computer, or even on all the computers in use. You can set up systems that get rather large, perhaps arbitrarily large, but there will always be a finite largest number. And yet, we can model everything we need with the computer just fine. We do our calculations with these machines all the time.
So in some sense, embracing the computer leads us to contemplate a mathematical world of finite extent. Let’s start with what goes wrong with the above argument about infinity: it assumes we can know the largest number. And that is false on the computer system. There is a largest number that will ever be represented on a computer, at least with our current technology; I do not know of any speculative technology that disputes this idea either. If there are a finite number of particles in the universe, we are pretty much stuck. But that largest number we do not know. If we did, we could just add one to it, even on a computer. For the standard number representation on a computer, adding one to the largest number (which is known for any machine/language) will either produce an error or cycle around to some other number. Notice that the computer model is not a model of all the arithmetic axioms. In particular, addition is not closed in the system and/or does not preserve the ordering. Also take note that we are free to represent numbers differently than the standard representation: we could use arrays to do numbers in base a million, we could use logarithms to represent a much larger range of numbers easily, we could use strings directly.
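The cycling behavior, and the failure of addition to preserve ordering, can be sketched in a few lines of Python. This is my own illustration, not anything from a particular machine: `add8` is a hypothetical helper that simulates 8-bit unsigned machine addition, where results wrap modulo 2**8.

```python
# Simulate fixed-width (8-bit unsigned) machine arithmetic.
# The largest representable number is 255; adding one cycles back around.
def add8(a, b):
    return (a + b) % 256

print(add8(255, 1))   # prints 0: the "largest number plus one" cycles to 0
print(add8(100, 100)) # prints 200: small sums behave as expected

# Ordering is not preserved: normally a < a + 1, but here 255's successor is 0.
print(add8(255, 1) < 255)  # prints True
```

So inside the 8-bit system, addition is closed in a trivial sense (you always get some 8-bit number back), but it no longer satisfies the usual axioms: the successor of the largest element is not larger.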
What brought this up for me? Newton’s method. I was coding it up, ignoring the calculus of taking derivatives. Just approximate by fitting a secant to the function instead of the actual tangent line. So we need to take two points fairly close to each other. Aye, there’s the rub. They can only get so close. For example, if the computer can hold 4 digits for a number, then 3.124 and 3.125 are adjacent numbers. There is nothing in between. Adding .0001 to 3.124 will lead to 3.124. So the delta x is exactly 0. In reality, we can easily get something like 10 digits of accuracy, which is enough for anything. But still, there is a limit.
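The method described above can be sketched as follows. This is a minimal illustration of my own, with hypothetical names (`newton_secant`, the step `h`, the tolerances) chosen for the example: instead of the true derivative, the slope comes from a secant through two nearby points.

```python
# Newton's method with the tangent slope replaced by a secant slope.
def newton_secant(f, x0, h=1e-6, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        # Secant through (x, f(x)) and (x + h, f(x + h)) approximates f'(x).
        # If h were below the machine's resolution at x, x + h would equal x
        # and this numerator would be exactly 0 -- the rub described above.
        slope = (f(x + h) - f(x)) / h
        if slope == 0:
            break
        x_new = x - f(x) / slope
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Find sqrt(2) as a root of x^2 - 2.
root = newton_secant(lambda x: x * x - 2, 1.0)
print(root)  # approximately 1.41421356...
```

The choice of `h` is the whole game: too large and the secant is a poor stand-in for the tangent; too small and the two points collapse into one representable number.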
Calculus, on the other hand, allows us to compute the derivative exactly. We get a function which we can then use to get the slope, without running into the problem of resolving the differences and quotients of the numerical derivative. So it is in this way that the infinite world of normal mathematics really does make a difference in the finite world. It is perfectly acceptable to treat it as an outside trick, a rule that works without an underpinning.
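To see the contrast concretely, here is a small illustration of my own (not from the post): for f(x) = x², calculus hands us the exact derivative 2x, while the numerical quotient collapses once the step drops below the machine's resolution.

```python
f = lambda x: x * x      # the function
dfdx = lambda x: 2 * x   # exact derivative, known from calculus

x = 1.0
h = 1e-20  # far below double-precision resolution near 1.0
# x + h rounds back to x, so f(x + h) - f(x) is exactly 0.
numerical = (f(x + h) - f(x)) / h
print(numerical, dfdx(x))  # prints 0.0 2.0
```

The finite machine garbles the quotient entirely, while the formula from the infinite world is untroubled.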
So do I believe in the infinite? Much like my view of God, I think of it being there as a useful guide, but without really committing myself.
Drafted on 6/15/11.