“Mathematical concepts are not wired into the human condition. They are learned, acquired through cultural and linguistic transmission.” - Numbers and the Making of Us by Caleb Everett
I was struggling to come up with a topic to explore for our first print. I wanted to start at a basic level but still keep things interesting, so I decided to explore the concept that first drew my attention to the world of math: numbers.
On the surface, this may seem like an entirely trivial concept. After all, it would seem that all humans have a natural-born ability to count. But that is not quite the case. In his book Numbers and the Making of Us, University of Miami anthropology professor Caleb Everett explores the Pirahã, an indigenous Amazonian tribe. Everett and other anthropologists concluded that the Pirahã did not have any consistent way to count or the ability to quantify anything significant. Researchers conducted further tests on this tribe of perfectly healthy and mentally sound folk: they placed a row of batteries on a table and asked tribe members to place the same number of batteries on the other side of the table, parallel to the original set. With 1-2 batteries, the task was easily done. But given four or more, the Pirahã struggled greatly, and researchers noted that as the number of batteries increased, so did the error in placement. Anthropologists concluded that the Pirahã's lack of number comprehension meant trouble discerning quantities above three. This led them to believe that the ability to count past the fingers on our hands is a result of "cultural and linguistic transmission." In other words, writing the knowledge down, preserving it, and passing it on to the next generation made all the difference. Even cavemen had to learn math back in the day!
I've always wondered who those crazy bastards were who thought, "I wonder what happens after I count all the fingers on my hands and toes?" Well, it turns out the first numeric cipher system was introduced by the Egyptians. They certainly must've had a lot of time on their…hands (please hold applause till the end). Then the Greeks took over and decided to match these counting numbers to their alphabet. This led to the inevitable rise of the Roman numeral system, which dominated Europe (and is still used widely across the world by fancy people and clocks). Then came the "real" numbers of the traditional system we have today, thanks to the Hindu-Arabic system, whose brilliance was in developing the number zero. They were certainly not the first to develop it; theirs just stuck around. The Egyptians used nfr (not nft) to represent a zero balance for accounting purposes, while the Greeks just could not wrap their heads around the nature of 0, always asking themselves, "How can nothing be something?" This would go on to spark philosophical debates and even religious zealotry over its interpretation. People were really fighting over nothing back in ye olde daye.
Negative numbers also gave folks the willies back in those days. Originally developed as an abstract concept by the Chinese around 100-50 BC, they didn't find practical use until much later. Europeans haaaatttteeedd them until about the 17th century, though the first observable use of negative numbers in Europe came in the 15th century from the French mathematician Nicolas Chuquet. He used them as exponents but referred to them as "absurd numbers." Silly Nic.
Rational numbers (ratios of whole numbers, whose decimal expansions either end or repeat) came shortly afterward (~300 BC), followed by irrational numbers (decimals that never end or repeat, like the square root of 2). Irrational numbers broke a lot of dudes' minds too. Pythagoras didn't know how to emotionally deal with irrational numbers, so he (allegedly) sentenced men who spread knowledge of them to death by drowning. Math spooked grown men back in the day more than it spooks kids today!
Then came transcendental numbers (like pi and e), which are really just special irrational numbers with the caveat that they cannot be found as the root of any polynomial equation with whole-number coefficients. The square root of 2, for example, is a root of x² − 2 = 0, so it is irrational but not transcendental. And then there are the real numbers, which is just an umbrella term for all the numbers that can be represented on a number line.
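To make the distinction concrete, here's a toy sketch in Python. The polynomial and the tolerance are just illustrative choices on my part, not anything from a formal definition:

```python
import math

# An algebraic (non-transcendental) number: sqrt(2) is a root of x^2 - 2 = 0.
def p(x):
    return x**2 - 2

root = math.sqrt(2)
print(abs(p(root)) < 1e-9)  # True: sqrt(2) satisfies the polynomial

# pi, by contrast, is transcendental: no polynomial with whole-number
# coefficients has pi as a root. This particular one certainly doesn't:
print(p(math.pi))  # nowhere near zero
```

Of course, checking one polynomial proves nothing about pi; the point is only to show what "being a root" looks like for sqrt(2).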
My favorite discovery was the radical conception of infinity. It first appears in ancient Indian texts around 400 BC. Aristotle brought the idea to Western math, but the craziest guy of them all was Georg Cantor. In 1895, Cantor proposed the idea of different sizes of infinity. Frankly, it blows my mind every time I think about it, and it's a topic that deserves its own print. While this is one of the more philosophical concepts in math, scientists and engineers use the notion on a daily basis, as it is a foundation of calculus (more on that in the future).
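A tiny taste of what Cantor meant, sketched in Python with made-up finite data (the real arguments work on infinite sets, so this is only a finite shadow of them):

```python
# "Same size" infinity: every natural number can be paired with an even
# number, one-to-one, with nothing left over on either side.
naturals = range(10)
evens = [2 * n for n in naturals]
print(list(zip(naturals, evens))[:5])  # [(0, 0), (1, 2), (2, 4), (3, 6), (4, 8)]

# Cantor's diagonal trick (finite sketch): given any list of digit
# strings, build one that differs from the k-th string in digit k,
# so it cannot appear anywhere in the list.
rows = ["0110", "1010", "0001", "1111"]
diagonal = "".join("1" if row[k] == "0" else "0" for k, row in enumerate(rows))
print(diagonal)          # "1110": differs from every row somewhere
print(diagonal in rows)  # False
```

Run on an infinite list of decimal expansions, that same trick shows no list of real numbers can ever be complete: a strictly bigger infinity.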
And now for everyone's favorite: imaginary numbers (or, more generally, complex numbers)! French math guy René Descartes coined the term "imaginary" as an insult when he first started working with them. #StopBullyingComplexNumbers. It really wasn't until 1799, when Carl Friedrich Gauss gave a proof of the Fundamental Theorem of Algebra, that imaginary numbers were shown to do real work: every polynomial equation has solutions once you allow complex numbers. So are they imaginary or real, Carl??? Pls make up your mind (on the real though, Gauss was the GOAT).
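Python has complex numbers built in, so here's a quick sketch of imaginary numbers doing that real work. The equation x² + 1 = 0 is my illustrative pick: it has no real solution, but complex numbers deliver one on demand:

```python
import cmath

# The "imaginary unit" i: in Python it's written 1j, and i squared is -1.
print((1j) ** 2)  # (-1+0j)

# The Fundamental Theorem of Algebra guarantees every polynomial equation
# has solutions among the complex numbers. x^2 + 1 = 0 has no real
# solution, but its complex solutions are i and -i.
root = cmath.sqrt(-1)  # the principal solution, i
print(root)            # 1j
print(root ** 2 + 1)   # 0j: it really solves the equation
```

Nothing imaginary about that, Descartes.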
Now this doesn't end the story for numbers. Just as one uses a single word to convey something that could be said in a sentence, mathematicians are constantly looking for new and concise ways to express their thoughts. Math is nothing but a language, and all the different types of numbers are its dialects.
If you’re reading this, thank you for making it all the way through my first post! If you enjoyed this post, please make sure to subscribe for more and share it with those interested! For more on the history of numbers and how a bunch of apes learned to count, check out Caleb Everett’s book Numbers and the Making of Us (not an ad, I just enjoyed this book). Thanks!