Today, temperature seems a straightforward idea that is easy to measure. But like all scientific concepts that are taught in schools, it has not always been so obvious. It took thousands of years, many careful and dedicated researchers, and knowledge exchange across times and cultures before we began to understand what actually lies behind the feeling of warm and cold.

Millennia of research

Ancient genius

The observation that liquids and gases expand with temperature - the basis of most thermometers before the electronic era - goes back to antiquity. The earliest documented thermoscope was described by the Greek engineer Philon of Byzantium (also known as Philo Mechanicus, ca 280-220 BCE), who published an extensive Compendium on Mechanics (Mechanike syntaxis) with descriptions of a wide range of machines, including a kind of “machine gun” (a mechanically driven crossbow that shot arrows automatically until the magazine was empty) and an “automatic theatre” (an apparatus with mechanically driven puppets). One volume, Pneumatika, explains devices driven by air and water. Besides much practical machinery like water mills, it also describes the “dripper”, a device that modern historians call “Philon's thermoscope”. It consists of a tube that connects a hollow sphere with water in a jug. When the temperature rises, the air in the sphere expands and bubbles can be seen in the water. When the temperature falls and the air in the sphere contracts, the water rises in the tube. Whether this thermoscope was an entertaining demonstration experiment or was used in practice to determine temperatures is not known. An improved thermoscope design was described by Heron of Alexandria (ca 10-70 CE), the inventor of the steam engine (Heron's ball or aeolipile), a rotating ball driven by steam. The earliest temperature scale seems to have been devised by the Greek physician Galen of Pergamon (Aelius Galenus, 129/131-199 or 215). For his medical studies he introduced eight degrees of cold and heat, defined by different mixtures of freezing and boiling water.

Islam and the Middle Ages

Philon's Pneumatika and most of Heron's original books and drawings are now lost, but they were translated into Arabic and many manuscripts survive. Islamic scholars were the world-leading experts in astronomy, medicine, physics, chemistry (alchemy) and engineering throughout much of the Middle Ages. While it seems reasonable to assume that Islamic alchemists had a need to measure temperatures, it is unknown whether they used Philon's and Heron's thermoscopes or even developed them further. These early developments are an exciting subject for historians of science, but methods for determining temperature before the Renaissance were probably not accurate and precise enough to be of much practical use for weather or climate studies. Arabic manuscripts are a valuable source for studying past climates, but their weather observations were qualitative (describing droughts, flooding, snowfall, frozen rivers and similar phenomena) and there is no hint that temperatures were measured with any kind of instrument (Vogt et al. 2011, Domínguez-Castro et al. 2012).

Italy, Netherlands, France, England - a pan-European research network, part I

Instrument development really took off in the Renaissance. It is quite unclear who invented the modern thermometer. Cornelius Drebbel (1572-1634), Galileo Galilei (1564-1642), Robert Fludd (1574-1637), Giovanni Francesco Sagredo (1571-1620), Giambattista della Porta (1538-1615), Giuseppe Biancani (1566-1624), Salomon de Caus (1576-1630?), Santorio Santorio (1561-1636) and probably several more contemporary scientists have at some time been named as the inventor of the liquid thermometer around 1600. At that time, researchers all over Europe were engaged in an intense knowledge exchange, published books that were traded afar, and travelled to learn from one another. Many outstanding scholars translated Arabic manuscripts into Latin to make ancient knowledge and Islamic science accessible to Europeans. But modern academic practices of careful citation and attribution of sources had not yet developed, making it difficult to reconstruct the history of important scientific discoveries and inventions.

Good liquid thermometers were probably developed in many centres of excellence at roughly the same time, drawing on the experience from contemporary and older sources; Santorio Santorio (1561-1636), Professor of Theoretical Medicine in Padua and Venice, explicitly mentions that the device is of great antiquity. By the early 1600s, thermometers were precise enough for useful applications: In 1625, Santorio wrote that “the instrument was developed by Heron for different purposes, but I modified it to measure the warm and cold temperature of the air and also of all parts of the body, as well as for testing the degrees of fever in men” (in “Comments on Avicenna”, 1625).

A plethora of temperature scales

Snow and boiling water

Santorio also used snow and boiling water to define a scale for his thermometer, as Galen had done one and a half millennia before. This idea was further refined by several scientists in the 18th century, when thermometers had become precise enough for a finer scale to make sense. They initially defined their calibration points based on various considerations of practicality and usefulness, but generally ended up fixing their scales at the freezing and boiling points of water again, these being the most practical and reproducible calibration points. Besides calibration points, a temperature scale has to define the gradation between them, and the 18th-century scientists used volume changes of various thermometer liquids with different expansion characteristics, so their original scales are nonlinear and hard to convert into one another.

The amazing year 1701

The first documented calibrated temperature scale was devised in 1701 by the Danish astronomer Ole Christensen Rømer (1644-1710), who is more famous for proving that the speed of light is finite. Initially he defined the temperature of freezing brine as 0 degrees - this would be the lowest temperature one could expect to measure, thus avoiding negative values. His second calibration point was boiling water; using a sexagesimal system, he defined its temperature as 60 degrees. However, he soon found that brine was difficult to standardise, so he decided to use pure water for calibration as well, defining its freezing point as 7.5 degrees so that his original scale only had to be modified slightly. In the same year (1701), Isaac Newton (1642-1727) published his scale, which used “the heat of air in winter where water begins to freeze” as “0 degrees heat” (“zero gradus caloris”) and a range of other reference points, some rather ambiguous and difficult to standardise (“the heat of midday in July”), others based on the melting points of miscellaneous materials like wax, bismuth and lead. His scale is particularly impressive as it covers temperatures up to 600 °C. One of his reference points was boiling water at 33 degrees, and the Newton scale was later interpreted as a linear scale between the freezing and boiling points, using linseed oil as the thermometer liquid (Grigull 1984).

Denmark, Germany, France - a pan-European research network, part II

The first temperature scale that found widespread application was introduced in 1724 by the German physicist Daniel Gabriel Fahrenheit (1686-1736). After visiting Ole Rømer in Copenhagen, he wanted to refine the idea of using freezing brine as a calibration point. He found that a mixture of equal parts of water, ice and ammonium chloride is a “frigorific mixture”, i.e. a mixture that reaches an equilibrium temperature independent of the ambient temperature, in this case -17.8 °C; this reproducible temperature he used as the zero point of his scale. He defined two other calibration points: 32 degrees for a 1-to-1 mixture of ice and water (i.e. the freezing point again), and 96 degrees for the highest temperature of the human body. Again, these calibration points were less reproducible than hoped for, so the Fahrenheit scale was soon redefined to use the easily reproducible freezing point (= 32 °F) and boiling point (= 212 °F) of pure water. The Fahrenheit scale was the main temperature scale in English-speaking countries from the 18th century well into the second half of the 20th century. Today, it is still used in everyday life (TV weather reports, cooking recipes etc.) in a handful of countries (the Bahamas, Belize, the Cayman Islands, Palau, and the USA), but in science it has been replaced by the Celsius and Kelvin scales worldwide.

Contemporary with Fahrenheit, in 1730, the French scientist René Antoine Ferchault de Réaumur (1683-1757) introduced another scale that would be used widely. He again set the freezing point of water as the zero point of his temperature scale. The second calibration point was essentially again the boiling point, albeit in an indirect way: he used diluted alcohol for the thermometer and divided the tube into intervals so that one degree was equivalent to 1/1000 of the volume of the bulb up to the zero mark. The alcohol was to be diluted such that it expanded by 8% when moved from a bath of freezing water to boiling water. This effectively defined a scale of 80 degrees between the freezing and boiling points. However, alcohol was inconvenient: it required large and bulky thermometer designs and could evaporate or boil off, so instrument makers preferred mercury. But then the gradation on the thermometer would no longer be linear, as mercury has a different expansion characteristic, which led to some confusion and differing standards. Nonetheless, variations of the Réaumur scale were used widely in Europe, particularly in France, Russia and many German states, in the 18th century and in some regions into the 19th century. Today, it is mostly of historical interest, with its main applications now being in cheesemaking and confectionery manufacturing.

France, Russia, Sweden - a pan-European research network, part III

At roughly the same time, in 1732, the French astronomer Joseph Nicolas de l'Isle (Delisle, 1688-1768) developed a mercury thermometer, using the boiling point of water as the zero of his scale. The gradation was defined by the contraction of mercury: one degree was a volume reduction of 1/100000. A Russian winter - Delisle had been invited to St. Petersburg by Tsar Peter the Great in 1725 - then had around 2500 degrees. The German anatomy professor Josias Weitbrecht (1702-1747), who also worked in St. Petersburg, noted that the freezing point of water was near 1500 degrees on the Delisle thermometer, so in 1738 he recalibrated the thermometer using the freezing point as a calibration point, defined as 150 °De; this redefined scale became known as the Delisle scale and was the predominant temperature scale in Russia for more than a century. To modern physicists, it is counterintuitive that the Delisle scale runs "backwards", with high values for low temperatures. However, without our understanding of temperature as a measure of thermal energy, there is no clear reason to prefer one direction; from a phenomenological point of view it does not matter whether "cold" is measured in high or low values.

Delisle sent his thermometer to colleagues all over Europe, among them the Swedish astronomer, physicist and mathematician Anders Celsius (1701-1744) in Uppsala. Like Weitbrecht, but apparently independently, Celsius found it difficult to calibrate thermometers based on the fractional contraction of mercury, and instead decided in 1742 to use the freezing point of water as the second calibration point, dividing the thermometer scale into 100 "centigrades" (from Latin centum = hundred and gradus = step). Like the Delisle (or Weitbrecht) scale, it ran "backwards", with values increasing as temperatures get colder.

In 1745, the Swedish botanist Carl von Linné (Carolus Linnaeus), who, like Celsius, lived and worked in Uppsala, received his own thermometer, made by the instrument maker Daniel Ekström in Stockholm (an earlier instrument, ordered in 1743 and due to be delivered in 1744, was damaged during transport). Linné had decided to reverse the temperature scale, introducing the now familiar Celsius scale with 100 centigrades between 0 °C for freezing water and 100 °C for boiling water, which he thought more appropriate for botanical measurements. It became known as the Swedish scale, the Celsius novum ("new Celsius") or later simply the Celsius scale. The first documented temperature measurement in "modern" degrees centigrade was recorded on 16 December 1745 in the orangery of the Botanical Garden of Uppsala University (Moberg 2008). Linné was one of the first researchers to investigate the effect of temperature on plants systematically.

From phenomenology to physical understanding

All these scales are phenomenological scales, based on the expansion and contraction of a particular material (mercury or alcohol); the concept of temperature was not yet linked to other fundamental physical concepts. This changed in the 19th century with the development of thermodynamic theory. The key to understanding the physical meaning of temperature came from investigating the behaviour of gases. In 1802, French scientist Joseph Louis Gay-Lussac (1778-1850) published what is now known as Charles's Law, acknowledging unpublished work by Jacques Alexandre César Charles (1746-1823) a decade earlier; similar ideas had been published in 1702 by Guillaume Amontons (1663-1705). This gas law states that the volume of a gas at constant pressure is proportional to temperature, V / (T - T0) = const, with the volume of an ideal gas hypothetically reaching zero at a temperature T0 of around -273 °C. The British physicist William Thomson (1824-1907) developed this concept further in his paper “On an Absolute Thermometric Scale” (Thomson 1848) and proposed a scale with a zero point at “infinite cold”, the “point corresponding to the volume of air being reduced to nothing, which would be marked as -273° of the scale”. William Thomson was ennobled in 1892 for his work in thermodynamics and became 1st Baron Kelvin (after the River Kelvin, which flows near his laboratory in Glasgow), and his temperature scale consequently became known as the Kelvin scale.
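The extrapolation behind this gas law can be illustrated with a few lines of Python. This is a sketch, not historical code: it fits a straight line to the volume of an ideal gas at two temperatures (the volume values below are made-up illustrative numbers, chosen to follow the ideal gas law exactly) and asks at which temperature the volume would shrink to nothing.

```python
# Sketch: linear extrapolation of gas volume vs. temperature (constant
# pressure) down to zero volume, recovering absolute zero near -273 °C.
# The volume values are illustrative, not historical measurements.

def absolute_zero_estimate(t1, v1, t2, v2):
    """Fit V(T) = slope*T + intercept through two points and
    return the temperature T at which V = 0."""
    slope = (v2 - v1) / (t2 - t1)
    intercept = v1 - slope * t1
    return -intercept / slope

# An ideal gas occupying 1.000 L at 0 °C occupies 373.15/273.15 ≈ 1.366 L
# at 100 °C; the fitted line reaches zero volume at about -273.15 °C.
t0 = absolute_zero_estimate(0.0, 1.000, 100.0, 373.15 / 273.15)
print(round(t0, 2))  # -273.15
```

With real, slightly noisy measurements the extrapolated value would scatter around -273 °C, which is essentially what Thomson's "infinite cold" argument formalised.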

A similar scale is the Rankine scale (unit °R, or °Ra when there is potential for confusion with Réaumur). Like the Kelvin scale, it is a thermodynamic scale with its zero point at absolute zero, but its interval is based on the Fahrenheit scale; the freezing point of water is then 491.67 °Ra. This scale was proposed in 1859, a decade after Kelvin's, by the Scottish engineer William John Macquorn Rankine (1820-1872). It was used in some engineering applications in English-speaking countries where the Fahrenheit scale dominated.

The final piece in understanding temperature was provided by Austrian physicist Ludwig Eduard Boltzmann (1844-1906). He developed statistical mechanics, linking the collective behaviour of atoms or molecules to macroscopic physical properties like heat capacity or viscosity; temperature could now be understood as a measure of the mean kinetic energy of molecules or atoms.

On temperature scales and units

A comparison of different scales

The different historical temperature scales and their conversions to the Celsius scale are shown in the table below. Note that the conversion formulae are modern; historically, some scales had regional variations and were based on different thermometer liquids, making the conversion slightly nonlinear.

The convention in the table is as follows: TC is the numerical value (without unit) measured in °C on the Celsius scale; TX in each row is similarly the numerical value measured in degrees on scale X. For example, boiling water has TK = 373.15 (i.e. with units, 373.15 K), so TC = TK - 273.15 = 100.00 (i.e. 100.00 °C). The gradation is the value that a temperature difference of 1 °C has on the respective scale.

Name        Symbol    Conversion to °C: TC = ...    Gradation: 1 °C = ...
Rømer       °Rø       40/21 (TRø - 7.5)             21/40 °Rø
Newton      °N        100/33 TN                     33/100 °N
Delisle     °De       2/3 (150 - TDe)               3/2 °De
Réaumur     °R, °Ré   5/4 TRé                       4/5 °Ré
Fahrenheit  °F        5/9 (TF - 32)                 9/5 °F
Rankine     °R, °Ra   5/9 (TRa - 491.67)            9/5 °Ra
Celsius     °C        TC                            1 °C
Kelvin      K         TK - 273.15                   1 K
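As a quick illustration (not historical code), the conversion formulae in the table can be written as a small Python function; the ASCII scale names used as dictionary keys are ad-hoc labels chosen here:

```python
# Sketch: convert a reading on one of the historical scales to Celsius,
# using the modern linearised formulae from the table above.

def to_celsius(value, scale):
    conversions = {
        "Romer":      lambda t: 40 / 21 * (t - 7.5),
        "Newton":     lambda t: 100 / 33 * t,
        "Delisle":    lambda t: 2 / 3 * (150 - t),
        "Reaumur":    lambda t: 5 / 4 * t,
        "Fahrenheit": lambda t: 5 / 9 * (t - 32),
        "Rankine":    lambda t: 5 / 9 * (t - 491.67),
        "Celsius":    lambda t: t,
        "Kelvin":     lambda t: t - 273.15,
    }
    return conversions[scale](value)

# Sanity check: boiling water on each scale should come out as 100 °C.
for scale, boiling in [("Romer", 60), ("Newton", 33), ("Delisle", 0),
                       ("Reaumur", 80), ("Fahrenheit", 212),
                       ("Rankine", 671.67), ("Kelvin", 373.15)]:
    print(scale, round(to_celsius(boiling, scale), 2))  # each prints 100.0
```

Checking the boiling point of water across all scales is a convenient way to verify that the formulae agree with the calibration points described in the text.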

A note on modern SI units

There is a certain degree of confusion about the correct usage of temperature units, perhaps more than for other physical variables. In the International System of Units (Système International d'Unités, SI), the thermodynamic temperature is one of the seven independent physical quantities. In the current definition (from 1967), the SI base unit for the thermodynamic temperature is the kelvin (in small letters) with the symbol K: “The kelvin, unit of thermodynamic temperature, is the fraction 1/273.16 of the thermodynamic temperature of the triple point of water”, using water of a precisely defined isotopic composition. (The SI will likely be revised in future to be independent of particular materials like water and instead be based on physical constants, but currently there are no plans to change the names or terminology of the fundamental units.)

Therefore, the correct usage is, for example, to say that the triple point of water has a temperature of 273.16 K or “273.16 kelvin”. Before 1968, the unit was called “degree Kelvin” (symbol °K), so you may find this in older sources. The kelvin is used not only for absolute temperatures but also for temperature differences - before 1968, differences were usually expressed in degrees (°). You may find some confusion in the literature and different historical usages, but the current SI convention is actually fairly straightforward: the kelvin is used in exactly the same way as other units like the metre (you wouldn't express length differences, for example, in “degrees metre”). To add a bit more confusion: while the unit is spelled with small letters, the word “Kelvin”, being derived from a proper name, is still capitalised when not used as a unit, for example in the phrase “the Kelvin scale”.

SI also defines the “degree Celsius” (°C) as a derived unit, based on the kelvin. The magnitudes of the degree Celsius and the kelvin are exactly the same, but the Celsius scale is shifted by 273.15 against the Kelvin scale, so that the triple point of water is precisely 0.01 °C = 273.16 K. The degree Celsius (plural: degrees Celsius) is the only SI unit whose full name contains a capital letter - while some other units, like the watt (symbol W) or indeed the kelvin (K), have a capital letter as their symbol, their full names are spelled with small letters. The SI also has a general rule that there is a space between the numerical value and the unit, and as °C is the symbol for degrees Celsius, temperatures should likewise be written with a space, e.g. 0 °C. The exception to this rule is the symbol for angular degrees, where no space is used, e.g. 360° for a full circle. However, other languages or publishing houses may have different style guides for these typographical conventions. To express temperature differences, the same unit “degree Celsius” (°C) is used, this time with the degree (°) sign, as it is an integral part of the unit name.