Tuesday, September 7, 2010

About Temperature

A Brief History of Temperature

Temperature is by far the most frequently measured physical parameter. It affects the physical, chemical and biological world in countless ways. Yet a full appreciation of the complexities of temperature and its measurement has been relatively slow to develop.

Intuitively, people have understood temperature for a long time: fire is hot and snow is cold. Greater knowledge was gained as people worked with metals through the Bronze and Iron Ages. Some of these technological processes required a degree of control over temperature, and to control temperature you need to be able to measure what you are controlling.


Until about 260 years ago, temperature measurement was very subjective. For hot metals, the colour of the glow was a good indicator. For intermediate temperatures, the effect on various materials could be observed: for example, is the temperature high enough to melt sulphur, lead or wax, or to boil water?
In other words, a number of fixed points could be defined, but there was no scale and no way to measure the temperature between these points. It is, however, possible that there is a gap in the recorded history of technology in this regard, as it is difficult to believe that the Egyptians, Assyrians, Greeks, Romans or Chinese did not measure temperature in some way.

Galileo invented the first documented thermometer in about 1592. It was an air thermometer consisting of a glass bulb with a long tube attached. The tube was dipped into a cooled liquid and the bulb was warmed, expanding the air inside. As the air continued to expand, some of it escaped. When the heat was removed, the remaining air contracted, causing the liquid to rise in the tube and indicating a change in temperature. This type of thermometer is sensitive, but because the air is ultimately balanced against the open atmosphere it is also affected by changes in atmospheric pressure.
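
A toy ideal-gas model shows why. In the Python sketch below (the function and the numbers are my own illustration, not from the text), the trapped air obeys the ideal gas law, so the indicated liquid level tracks T/P rather than T alone; a change in barometric pressure then masquerades as a change in temperature.

    # Why an open air thermometer is pressure-sensitive: the trapped air
    # obeys V = nRT/P, so the liquid level tracks T/P, not T alone.
    # Illustrative sketch; the numbers are assumptions, not historical data.

    def apparent_temperature(true_t_kelvin, p_actual_kpa, p_calibration_kpa=101.325):
        """Temperature the instrument appears to read if it was calibrated
        at p_calibration_kpa but the barometric pressure is p_actual_kpa."""
        # V is proportional to T/P, and the scale assumes P is fixed at the
        # calibration value, so the reading is T * (p_calibration / p_actual).
        return true_t_kelvin * p_calibration_kpa / p_actual_kpa

    # A weather-driven pressure drop of about 2 kPa at a true 293 K (20 °C):
    reading = apparent_temperature(293.0, 99.3)
    print(f"apparent reading: {reading:.1f} K ({reading - 273.15:.1f} °C)")
    print(f"error: {reading - 293.0:+.1f} K from pressure alone")
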
The Eighteenth Century: Celsius and Fahrenheit

By the early 18th century, as many as 35 different temperature scales had been devised. In 1714, Daniel Gabriel Fahrenheit invented both the mercury and the alcohol thermometer. Fahrenheit's mercury thermometer consists of a capillary tube which, after being filled with mercury, is heated to expand the mercury and expel the air from the tube. The tube is then sealed, leaving the mercury free to expand and contract with temperature changes. Although the mercury thermometer is not as sensitive as the air thermometer, because it is sealed it is not affected by atmospheric pressure. Mercury freezes at -39°C, so it cannot be used to measure temperatures below this point. Alcohol, on the other hand, freezes at -113°C, allowing much lower temperatures to be measured.

At the time, thermometers were calibrated between the freezing point of salted water and human body temperature. (Salt added to crushed wet ice produced the lowest artificially created temperatures of the day.) The common Flemish thermometers of the day divided this range into twelve points. Fahrenheit further subdivided the range into ninety-six points, giving his thermometers finer resolution and a temperature scale very close to today's Fahrenheit scale. (In fact there appear to have been between 15 and 20 different temperature scales in use at the time, varying with nationality and application.)

Later in the 18th century, Anders Celsius realised that it would be advantageous to use more common calibration references and to divide the scale into 100 increments instead of 96. He chose to use one hundred degrees for the freezing point of water and zero degrees for its boiling point. Sensibly, the scale was later reversed, and the Centigrade scale was born. See Olof Beckman's short History of the Celsius Temperature Scale.
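
Since both modern scales are anchored to the same two fixed points of water (0°C = 32°F and 100°C = 212°F), converting between them is a simple linear map. A minimal Python sketch, with function names of my own choosing:

    # Conversions between the Celsius and Fahrenheit scales. Both are linear,
    # so each is an affine map of the other: F = C * 9/5 + 32.

    def celsius_to_fahrenheit(c: float) -> float:
        return c * 9.0 / 5.0 + 32.0

    def fahrenheit_to_celsius(f: float) -> float:
        return (f - 32.0) * 5.0 / 9.0

    print(celsius_to_fahrenheit(0.0))     # 32.0  (water freezes)
    print(celsius_to_fahrenheit(100.0))   # 212.0 (water boils)
    print(fahrenheit_to_celsius(96.0))    # ~35.6 (Fahrenheit's body-heat point)
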

The Nineteenth Century: A productive era

The early 1800s were very productive in the area of temperature measurement and understanding.
William Thomson (later Lord Kelvin) postulated the existence of an absolute zero. Sir William Herschel discovered that when sunlight was spread into a colour swath using a prism, he could detect an increase in temperature when moving a blackened thermometer across the spectrum of colours. Herschel found that the heating effect increased toward and beyond the red, in the region we now call the infrared. He measured radiation effects from fires, candles and stoves, and deduced the similarity of light and radiant heat. However, it was not until well into the following century that this knowledge was exploited to measure temperature.

In 1821 T J Seebeck discovered that a current could be produced by unequally heating the two junctions of a circuit made of two dissimilar metals: the thermocouple effect. Seebeck assigned constants to each type of metal and used these constants to compute the total amount of current flowing. Also in 1821, Sir Humphry Davy discovered that all metals have a positive temperature coefficient of resistance and that platinum could be used as an excellent resistance temperature detector (RTD). These two discoveries marked the beginning of serious electrical temperature sensing.
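
Both effects remain the basis of today's two workhorse electrical sensors. The Python sketch below illustrates the idea with modern, simplified models: the linear Seebeck coefficient and the IEC 60751 platinum coefficients are present-day values rather than Seebeck's or Davy's, and real thermocouple practice uses standard polynomial tables rather than a single constant.

    # Two simplified sensor models (modern coefficients, for illustration only).

    # 1. Thermocouple: to first order, the open-circuit EMF is proportional
    #    to the temperature difference between the hot and cold junctions.
    SEEBECK_TYPE_K = 41e-6  # V/°C, approximate sensitivity of a type K pair

    def thermocouple_emf(t_hot_c: float, t_cold_c: float) -> float:
        return SEEBECK_TYPE_K * (t_hot_c - t_cold_c)

    # 2. Platinum RTD: resistance rises with temperature. For T >= 0 °C the
    #    IEC 60751 (Callendar-Van Dusen) form is R = R0 * (1 + A*T + B*T^2).
    R0, A, B = 100.0, 3.9083e-3, -5.775e-7  # standard Pt100 values

    def pt100_resistance(t_c: float) -> float:
        return R0 * (1.0 + A * t_c + B * t_c * t_c)

    print(f"{thermocouple_emf(500.0, 25.0) * 1e3:.2f} mV")  # ~19.5 mV
    print(f"{pt100_resistance(100.0):.2f} ohm")             # ~138.51 ohm
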

Gradually the scientific community learnt how to measure temperature with greater precision. For example, Thomas Stevenson (civil engineer and father of Robert Louis Stevenson) realised that air temperature needed to be measured in a space shielded from the sun's radiation and from rain. For this purpose he developed what is now known as the Stevenson Screen, which is still in wide use.

The late 19th century saw the introduction of the bimetallic temperature sensor. These thermometers contain no liquid but operate on the principle of unequal expansion between two metals. Since different metals expand at different rates, a strip of one metal bonded to another will bend in one direction when heated and in the opposite direction when cooled (hence the term Bimetallic Thermometer, or BiMet). This bending motion is transmitted, by a suitable mechanical linkage, to a pointer that moves across a calibrated scale. Although not as accurate as liquid-in-glass thermometers, BiMets are hardier, easier to read and have a wider span, making them ideal for many industrial applications.
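
The amount of bending can be estimated with a classical beam result. The Python sketch below uses the simplified Timoshenko formula for a cantilevered strip of two layers of equal thickness and equal stiffness; the material coefficients and dimensions are typical figures I have assumed, not data from the text.

    # Tip deflection of a cantilevered bimetallic strip. Simplified model:
    # two layers of equal thickness and equal Young's modulus, for which
    # Timoshenko's result reduces to curvature k = 3 * d_alpha * dT / (2 * h).
    # A uniformly curved cantilever of length L deflects by about k * L^2 / 2.

    ALPHA_BRASS = 19e-6  # 1/K, assumed thermal expansion coefficient
    ALPHA_STEEL = 12e-6  # 1/K, assumed thermal expansion coefficient

    def tip_deflection(length_m: float, thickness_m: float, delta_t: float) -> float:
        d_alpha = ALPHA_BRASS - ALPHA_STEEL
        curvature = 3.0 * d_alpha * delta_t / (2.0 * thickness_m)
        return curvature * length_m ** 2 / 2.0

    # A 50 mm strip, 1 mm total thickness, heated by 50 K:
    print(f"{tip_deflection(0.050, 0.001, 50.0) * 1e3:.2f} mm")  # ~0.66 mm

A fraction of a millimetre is easily amplified into a visible pointer movement by the mechanical linkage, which is why the arrangement suits rugged dial thermometers.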

The 20th Century: Further discovery, refinement and recognition

The 20th century saw the development of semiconductor sensors, such as the thermistor and the integrated circuit sensor, along with a range of non-contact and fibre-optic temperature sensors. Lord Kelvin was also finally recognised for his early work in temperature measurement: the increments of the Kelvin scale were changed from degrees to kelvins, so we no longer say "one hundred degrees Kelvin" but "one hundred kelvins". The "Centigrade" scale was renamed the "Celsius" scale in honour of Anders Celsius.

The 20th century also saw the refinement of the temperature scale itself. Temperatures can now be measured to within about 0.001°C over a wide range, although this is not a simple task. The most recent change came in 1990, when the International Temperature Scale was updated to the International Temperature Scale of 1990 (ITS-90). The ITS-90 document also covers the recent history of temperature standards.