The measurement of temperature is a comparatively new concept. Early scientists understood the difference between "hot" and "cold," but they had no method to quantify varying degrees of heat until the seventeenth century. In 1597, the Italian astronomer Galileo Galilei invented a simple water thermoscope: a long glass tube inverted in a sealed jar containing both air and water. When the jar was heated, the air expanded and pushed the liquid up the tube. The water level in the tube could be compared at different temperatures to show relative changes as heat was added or removed. The thermoscope, however, offered no way to assign a numerical value to temperature.
Several years later, the Italian physician and inventor Santorio Santorio improved Galileo's design by adding a numerical scale to the thermoscope. These early thermoscopes led to the development of the fluid-filled thermometers commonly used today. Modern thermometers operate based on the tendency of some fluids to expand when heated. As the fluid inside a thermometer absorbs heat, it expands, occupying a greater volume and forcing the fluid level inside the tube to rise. When the fluid is cooled, it contracts, occupying a smaller volume and causing the fluid level to fall.
Temperature is a measure of the amount of heat energy possessed by an object (see our Energy module for more on this concept). Because temperature is a relative measurement, scales based on reference points must be used to accurately measure temperature. There are three main scales commonly used in the world today to measure temperature: the Fahrenheit (°F) scale, the Celsius (°C) scale, and the Kelvin (K) scale. Each of these scales uses a different set of divisions based on different reference points, as described in detail below.
Daniel Gabriel Fahrenheit (1686-1736) was a German physicist credited with inventing the alcohol thermometer in 1709 and the mercury thermometer in 1714. He developed the Fahrenheit temperature scale in 1724. Fahrenheit originally set 0 degrees as the temperature of an ice-water-salt mixture, 30 degrees as the temperature of an ice-water (no salt) mixture, and 96 degrees as the temperature of the human body. On this scale, the boiling point of water measured 212°F. Fahrenheit later adjusted the freezing point of water from 30°F to 32°F, making the interval between the freezing and boiling points of water an even 180 degrees (and making body temperature the familiar 98.6°F). The Fahrenheit scale is still commonly used in the United States.
Anders Celsius (1701-1744) was a Swedish astronomer credited with inventing the centigrade scale in 1742. Celsius chose the melting point of ice and the boiling point of water as his two reference temperatures, providing a simple and consistent method of thermometer calibration. He divided the interval between the freezing and boiling points of water into 100 degrees (hence the name: centi, meaning one hundred, and grade, meaning degrees). After Celsius's death, the centigrade scale was renamed the Celsius scale, with the freezing point of water set at 0°C and the boiling point at 100°C. The Celsius scale is preferred over the Fahrenheit scale in scientific research because it is more compatible with the base-ten format of the International System (SI) of metric measurement (see our module on The Metric System). In addition, the Celsius scale is commonly used in most countries of the world other than the United States.
William Thomson, Lord Kelvin (1824-1907), was a British physicist who devised the Kelvin (K) scale in 1854. The Kelvin scale is based on the idea of absolute zero, the theoretical temperature at which all molecular motion stops and an object holds no detectable heat energy (see our States of Matter module for more information). In theory, the zero point of the Kelvin scale is the lowest possible temperature in the universe: -273.15°C. The Kelvin scale uses the same size of division as the Celsius scale but shifts the zero point to absolute zero. The freezing point of water is therefore 273.15 kelvins (graduations on this scale are called kelvins; neither the word "degree" nor the symbol ° is used), and its boiling point is 373.15 K. The Kelvin scale, like the Celsius scale, is a standard SI unit of measurement commonly used in scientific work. Because the Kelvin scale has no negative numbers (theoretically, nothing can be colder than absolute zero), it is especially convenient for recording the extremely low temperatures reached in scientific research. (The three scales are compared in Figure 1.)
Although it may seem confusing, each of the three temperature scales discussed allows us to measure heat energy in a slightly different way. A temperature measurement in any of the three scales can be easily converted to another scale using the simple formulas below.
| From | To Fahrenheit | To Celsius | To Kelvin |
| --- | --- | --- | --- |
| °F | °F | (°F - 32) / 1.8 | (°F - 32) * 5/9 + 273.15 |
| °C | °C * 1.8 + 32 | °C | °C + 273.15 |
| K | (K - 273.15) * 9/5 + 32 | K - 273.15 | K |
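The formulas in the table can be written as a short script. This is a minimal sketch for illustration; the function names are our own, not from any particular library:

```python
def fahrenheit_to_celsius(f):
    """Subtract the 32-degree offset, then rescale by 5/9 (i.e., divide by 1.8)."""
    return (f - 32) / 1.8

def celsius_to_fahrenheit(c):
    """Rescale by 9/5 (i.e., multiply by 1.8), then add the 32-degree offset."""
    return c * 1.8 + 32

def celsius_to_kelvin(c):
    """Same division size as Celsius; just shift zero to absolute zero (-273.15 C)."""
    return c + 273.15

def kelvin_to_celsius(k):
    return k - 273.15

def fahrenheit_to_kelvin(f):
    """Convert to Celsius first, then shift to the Kelvin zero point."""
    return (f - 32) * 5 / 9 + 273.15

def kelvin_to_fahrenheit(k):
    return (k - 273.15) * 9 / 5 + 32

# Check against the reference points discussed in the text:
print(celsius_to_fahrenheit(100))  # boiling point of water -> 212.0
print(fahrenheit_to_celsius(32))   # freezing point of water -> 0.0
print(celsius_to_kelvin(0))        # freezing point of water -> 273.15
```

Note that the Kelvin conversion is a pure shift (add or subtract 273.15), while the Fahrenheit conversions also rescale, because a Fahrenheit degree is only 100/180 = 5/9 the size of a Celsius degree or a kelvin.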
This module provides an introduction to the relationship between energy, heat, and temperature. The principle behind thermometers is explained, beginning with Galileo’s thermoscope in 1597. The module compares the three major temperature scales: Fahrenheit, Celsius, and Kelvin. It discusses how the different systems use different references to quantify heat energy.
There are three different systems for measuring heat energy (temperature): Fahrenheit, Celsius, and Kelvin.
In scientific work, temperature is most commonly measured on either the Kelvin or the Celsius scale.
Nothing can be colder than absolute zero, which is the point at which all molecular motion ceases.