Over the course of European history, the naming of numbers has followed a kind of inflationary upwards trajectory.
In ancient Greece, nobody needed a number much above ten thousand, the Greek word for which is “myrioi” (μύριοι), the root of our “myriad”.
It took until the late Middle Ages before demand for higher numbers led Italian bankers to coin a word for a thousand thousands by simply adding a suffix to the existing word for a thousand, “mille”. They chose the augmentative “-one”, meaning “big” (as in “Pepone”, a big guy named Pepe), giving “milione”: a big thousand.
In the 15th century, the term was imported from Italy into France, where a thousand thousands likewise came to be called a “million”. However, the term was often used casually and imprecisely to describe any very large sum of money, so a French “millionnaire” was simply a very rich person.
Later, when the need for even greater numbers arose, the French mathematician Nicolas Chuquet (born 1445 or 1455, died 1488 or 1500; nobody seems to know for sure) dreamed up the system of names that bears his name and whose terms are still in use in America today. In his book Triparty en la science des nombres (1484) he wrote: “Instead of saying a thousand thousand, we will say a million, instead of saying a thousand million, we will say byllion, etc. …, and tryllion, quadrillion … octyllion, nonyllion, and so on with others as far as we wished to proceed.” The “bi” in billion is Latin for two, the “tri” means three, the “quadri” four, and so on. So in Chuquet’s scheme a billion is “the second power of a million” (10¹²), a trillion the third power (10¹⁸), and so on.
Unfortunately, at about the same time a second naming convention appeared in France, this one dreamed up by a Renaissance poet and mathematician named Jacques Pelletier du Mans (1517–1582), who insisted on calling a thousand millions (10⁹) a “milliart”, which by the 18th century had softened to “milliard”. So when a word was needed for a million millions, the same Latin prefix as Chuquet’s was used, namely “bi”. Now the word “billion” could, confusingly, mean either 10⁹ (a thousand million), a usage now called the “short scale”, or 10¹² (a million millions), known as the “long scale”. Go figure.
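The arithmetic behind the two conventions is simple to state: on the long scale each new name is the next power of a million, while on the short scale each new name is only a thousand times the previous one. A minimal sketch in Python (the function names are mine, purely for illustration):

```python
# Illustrative sketch of the two naming conventions.
# Long (Chuquet/Pelletier) scale: the n-th name is the n-th power of a million.
# Short (American) scale: the n-th name is a thousand times the (n-1)-th.

NAMES = ["million", "billion", "trillion", "quadrillion"]

def long_scale(n: int) -> int:
    """Value of the n-th name (1-based) on the long scale: (10**6)**n."""
    return 10 ** (6 * n)

def short_scale(n: int) -> int:
    """Value of the n-th name (1-based) on the short scale: 1000 * 1000**n."""
    return 10 ** (3 * n + 3)

for n, name in enumerate(NAMES, start=1):
    print(f"{name:>12}: short scale 10^{3 * n + 3:<3} long scale 10^{6 * n}")

# A long-scale billion is a thousand short-scale billions;
# the long scale fills that gap with "milliard" (10**9).
assert long_scale(2) == 1000 * short_scale(2)
```

The two scales agree only at the very first name: a million is 10⁶ in both, after which they drift apart by a growing factor of a thousand per name.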
Germans initially stuck to the rather straightforward way of describing a million as “tausend mal tausend” (literally “a thousand times a thousand”), and it appears like that in German dictionaries as late as the 18th century. Alternatively, a million was also sometimes called a “Großtausend” (“big thousand”).
Things stayed that way until 1919, when the Germans, having lost the First World War, were forced by the terms of the Versailles Treaty to pay the unheard-of sum of 20 milliard (20 billion) gold marks in reparations. This was the first widespread use of the word “Milliarde” in German publications, and it came just in time for the hyperinflation of 1923, when banknotes were printed for many Milliarden (billions, sic!) of marks and the exchange rate for a single dollar eventually reached 4.2 Billionen marks, a figure Americans would call 4.2 trillion.
Like most European countries, Germany has followed the Pelletier system and the long scale to this day. The United Kingdom is, as usual, a special case: historically it used the long-scale billion like the rest of Europe, but in 1974 official UK statistics switched to the “American” short scale, which had already come into frequent use in technical writing and journalism since the 1950s. Still, the “European” long-scale definition continues to be used widely. Presumably, that will end with Brexit.
Another notable exception is Italy, which normally follows Chuquet but gives the system a typical Italian twist: besides “bilione” (10¹²), Italians also use “miliardo” for a thousand millions (10⁹), and “miliardo” is often used more loosely to describe any incredibly large number, much like the old “milione”.
If this all sounds mighty confusing to you, that’s because it is. Lacking any international organization powerful enough to force the whole world to abide by a common naming convention, I guess we are stuck with this mixed-up way of counting, which would surely have sounded familiar to the old Babylonians.