I am reading The Number Devil by Hans Magnus Enzensberger. It’s kind of cute and will (hopefully) help a lot of young children with their arithmetic anxiety. I am only about halfway through it and I find it kind of fun. I am hoping that some of my grandchildren will enjoy it and that it will help with their math skills.
I have however hit a snag. When he talks about a diagonal drawn in a square that is 1 unit on each side, it turns out that the length of the diagonal is the square root of two. Now the square root of two is an irrational number, or as Hans Magnus calls it, an unreasonable number.
So far so good: we can easily draw a square, or, just in case we are the nit-picking kind, we can at least imagine that we draw a perfect square. And then we can draw a perfect imaginary diagonal inside that square. Using our imaginary measuring stick we get some number that is only close to the actual length of the true diagonal, because the real length of the diagonal is an irrational number, one that cannot be written exactly as a fraction or a finite decimal. Any measurement we take is an approximation, never the thing itself.
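A tiny sketch (my own illustration, not from the book) makes this concrete. By the Pythagorean theorem the diagonal of a 1-by-1 square has length √2, and we can trap that length between ever-closer fractions by bisection. The interval shrinks forever, but neither endpoint ever squares to exactly 2:

```python
from fractions import Fraction

# sqrt(2) lies somewhere between 1 and 2.
lo, hi = Fraction(1), Fraction(2)

# Bisect: halve the interval 20 times, keeping sqrt(2) inside it.
for _ in range(20):
    mid = (lo + hi) / 2
    if mid * mid < 2:
        lo = mid
    else:
        hi = mid

# Both endpoints crowd in on 1.41421356..., yet as exact fractions
# neither one squares to 2 -- no fraction ever will.
print(float(lo), float(hi))
assert lo * lo < 2 < hi * hi
```

However long we keep halving, the "measuring stick" of fractions only brackets the diagonal; it never lands on it.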
I can’t believe that I never paid attention to this one simple fact. I mean, when you are trying to count bananas, oranges, or bales of cloth, the Roman numeral system is just fine. But then you want to build pyramids and the like, so you need a zero, and that also works on the surface, but the system is intrinsically limited: it still cannot pin down a length like the square root of two exactly.
Basically, I feel that everything humanity tried to put into symbols up to the publication of Newton’s Principia was wrong. Not totally, just partially wrong. I mean, the calculations were good enough for the pyramids to stand. And yet they were a little bit off the actual true measurements.
I need to work on a better understanding of the Principia, but in the meantime, in light of this realization, I have to say that anything humans attempted to explain before Newton is highly suspect.