Zero, My Hero

By: Gary Arndt

Transcript

This episode is about nothing.

Not in the sense that Seinfeld was a TV show about nothing, but rather this is literally about nothing.

It is about the number zero. 

A number that few people bother to give much thought to, yet without it, modern mathematics wouldn’t exist, and neither would any of the digital technologies you are using to listen to this right now.

Learn more about the fascinating history of zero, and why it took so long to develop on this episode of Everything Everywhere Daily. 

================

Zero is a relatively recent development in the big scheme of human history. 

To understand how it came about, we have to go back to the beginning of mathematics, which was really what we would today call accounting.

People developed numbering systems to describe real things in the real world. You would have 3 chickens or 4 sacks of grain. Numbers weren’t an abstract concept. They were tied to actual things, and written systems of numbers were designed to keep track of things. In other words, it was accounting. 

There was no need to describe zero of something. If you didn’t have something, you just didn’t have it. There was no need to keep records of nothing. 

The idea of not having something isn’t complicated. Children learn about zero as soon as they can learn to count. It is a pretty easy idea to grasp.

However, expressing the concept of not having something as a written number, in the same way that 1 or 2 or 3 are numbers, took some time.

The primary use of zero for us is to represent the absence of a value within a number. The zero in 201 tells us that there is nothing in the tens place.

This use of a symbol to hold a place within a number was where the earliest uses of zero appeared.
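To make the place-value idea concrete, here is 201 written out as a sum of powers of ten:

$$201 = 2 \times 10^2 + 0 \times 10^1 + 1 \times 10^0$$

Without a symbol holding the empty tens place, 201 and 21 would look identical on the page.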

The earliest system of written numbers that we know of comes from ancient Babylon. The Babylonians had what is known as a sexagesimal numbering system, which is base 60, whereas our numbering system is base 10.

The Babylonian system used a space instead of a number to indicate that a position had no value. Eventually, they came to use a double-slash mark to indicate the same thing, but it wasn't really the same as a zero. It was more like punctuation, and they never used it at the end of a number, which could cause a great deal of confusion. The only way you could tell certain numbers apart was by context. It would be like not being able to differentiate 530 from 53 in our number system. The zero is what tells us the magnitude of the number.
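A worked example (mine, not the episode's) makes the problem clear. In base 60, a number with an empty middle place expands like this:

$$1,0,1_{60} = 1 \times 60^2 + 0 \times 60^1 + 1 \times 60^0 = 3601$$

Without a mark for the empty sixties place, 3601 would be written exactly the same as 1,1 in base 60, which is just 61.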

Other cultures independently developed this placeholder system for their written numbers.

The early Mayans in Central America developed a similar placeholder character, which really didn’t function the same as a true zero. 

The early Chinese developed a proto-tool for calculation known as counting rods. Like the Babylonians, they would just leave a space where a value should be, rather than using a number.

The first real record of a true zero comes from India. The Bakhshali manuscript, an ancient mathematical text written on birch bark, uses a dot symbol as a zero. Carbon dating places the text at around 250 CE.

Another Indian text, the Aryabhatiya, contains the first recorded use of a zero in a decimal-based system. It dates back to about 500 CE, and the Aryabhatiya even explains the decimal system by saying: "from place to place each is ten times the preceding."

The person credited with discovering the zero we know today is the Indian mathematician and astronomer Brahmagupta. He developed the earliest concept of zero not just as a placeholder, but as a full-blown number, complete with mathematical rules. He even placed zero into a system with positive and negative numbers.

One place you will notice is totally absent from the early development of zero is Europe. The Europeans were really late to the party when it came to zero, and to adopting a decimal-based number system for that matter. No real independent European concept of zero ever developed.

For the ancient Greeks, the problem was a philosophical one. They didn't have a zero, or even a placeholder value, in their numbering system. They had a very difficult time getting their heads around the idea of nothing. In fact, debates about nothingness and the physical concept of a vacuum lasted well into the Middle Ages.

Most people listening to this are at least somewhat familiar with Roman numerals, which have no zero. Roman numerals are great for numbering things like Super Bowls or WrestleManias, but they are horrible for doing things like multiplication or division.

The way zero came to the modern world is through the Arabs and Persians. They adopted the decimal system created in India and began using it themselves. The Persian mathematician Muhammad ibn Musa al-Khwarizmi wrote the text that became the source for the use of the decimal number system. The book, titled "al-Khwarizmi on the Numerals of the Indians," was eventually translated into Latin and was the primary vehicle for popularizing the Indian numbering system.

Zero and decimal numbers eventually arrived in Europe in the 11th century through Moorish Spain, in Andalusia. Because they were brought to Europe by Arabs, they became known as Arabic numerals, even though the numbers originally came from India.

It was the Italian mathematician Fibonacci, also known as Leonardo of Pisa, who was the first European to really adopt zero and the Hindu-Arabic number system. The fact was, it was just far easier to do math in this system than it was with Roman numerals.

By the 15th century, most mathematicians in Europe were using zero and Hindu-Arabic numbers, but most business people were still using Roman numerals for their bookkeeping. It wasn't until the Renaissance was in high gear in the 16th century that zero and the new number system were fully adopted across Europe.

Zero is a unique number, with properties that no other number has.

Anything multiplied by zero is zero. Anything added to zero is the original number. You can't divide by zero. As one of my math professors liked to say, "dividing by zero will send you to hell." It just doesn't make sense, and no, it doesn't give you infinity if you do it.
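Here is a quick illustrative sketch in Python of those three rules, including the error Python raises when you try the forbidden one:

```python
# The basic arithmetic properties of zero.
x = 7

print(x * 0)   # anything multiplied by zero is zero -> 0
print(x + 0)   # anything added to zero is the original number -> 7

try:
    x / 0      # division by zero is undefined
except ZeroDivisionError as err:
    print(f"dividing by zero raises an error: {err}")
```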

In calculus, a limit of the form zero divided by zero can have a value, depending on the functions involved, but that is well beyond the scope of this podcast. For those of you who took calculus, just think back to L'Hôpital's rule.
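As a refresher (my example, not the episode's), consider the classic zero-over-zero limit; L'Hôpital's rule says to differentiate the top and bottom and take the limit again:

$$\lim_{x \to 0} \frac{\sin x}{x} = \lim_{x \to 0} \frac{\cos x}{1} = 1$$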

Any nonzero number raised to the power of zero is 1, and zero raised to any positive power is zero.

Zero is the only number that is neither positive nor negative. However, zero is an even number. It fulfills all the requirements of an even number: it is divisible by 2 with no remainder, and it sits between two odd numbers, −1 and 1.

There are some cases where mathematicians have defined certain values involving zero by convention. Zero raised to the power of zero is defined to be 1, and zero factorial is also defined to be 1. The square root of zero is just zero.
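Many programming languages bake in these same conventions. Python, for one, agrees on all three, as a quick check shows:

```python
import math

print(0 ** 0)             # 1   -- defined to be 1 by convention
print(math.factorial(0))  # 1   -- zero factorial is defined to be 1
print(math.sqrt(0))       # 0.0 -- the square root of zero is zero
```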

In computer science, zero is also extremely important. Zero is one of the two building blocks of everything digital. The voice you are hearing right now is nothing but sound waves encoded into millions of ones and zeros.
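As a tiny illustration, here is a single hypothetical audio sample, the kind produced thousands of times per second when a voice is digitized, rendered as the ones and zeros that actually get stored:

```python
# A hypothetical 16-bit audio sample value, chosen just for illustration.
sample = 19_532

# Render the sample as the sixteen ones and zeros that get stored.
bits = format(sample, "016b")
print(bits)  # -> 0100110001001100
```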

Spreadsheets and databases have even caused us to have to rethink zero. 

If a cell of a spreadsheet has nothing in it, the value isn't zero; it's null. The difference between null and zero is subtle: zero is the cardinality of the empty set. If there is nothing in a spreadsheet cell, there are zero things in it, but the moment you type a zero into the cell, there is something in the cell, and there are no longer zero elements, even if the only element is the number zero. Got that?
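Got it or not, the distinction is easy to see in code. This short Python sketch mirrors the empty cell versus the cell holding a zero:

```python
# "Nothing in the cell" (null/None) versus "the number zero in the cell".
empty_cell = None  # no value at all
zero_cell = 0      # a value, and that value happens to be zero

print(empty_cell is None)  # True -- there is nothing here
print(zero_cell == 0)      # True -- there is something here: the number 0

# The set view: the empty set has zero elements,
# while a set containing only zero has one element.
print(len(set()))  # 0
print(len({0}))    # 1
```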

So, zero is really important, and it wasn't always something that humans had. So take a moment, some amount of time greater than zero, and give thanks to the number that means nothing, and yet everything.



