Measurement affects many different aspects of our lives. Our admittance to college depends on grades – the measure of our performance in various classes; we assess phone plans by how much data usage they allow; we count calories and our doctors look to see that our blood sugar and cholesterol levels are safe. In almost every facet of modern life, values – measurements – play an important role.
From the earliest documented days in ancient Egypt (see Figure 1), systems of measurement have allowed us to weigh and count objects, delineate boundaries, mark time, establish currencies, and describe natural phenomena. Yet measurement comes with its own set of challenges. From human error to natural variability to the simply unknowable, even the most precise measurements carry some margin of error.
Early history of measurement
Archeological artifacts show us that systems of measurement date back before 2500 BCE – over 4,500 years ago. As ancient civilizations in parts of the world as disparate as Greece, China, and Egypt became more formalized, the acts of dividing up land and trading with others created a need for standardized measuring techniques. Since measurement is largely a matter of comparing one thing to another, it isn't surprising that early systems often began with objects that were common to the community. The weight of one grain of wheat, for example, or the volume of liquid that could be held by one goat skin were used as standards.
Interestingly, many of these systems originated with the human body. For example, the Egyptian "cubit" was defined as the length of a man's forearm from the tip of the middle finger to the elbow (roughly 48 cm, or 19 in). In India's Mauryan period (500 BCE), 1 "angul" was the width of a finger (roughly 1 cm, or 0.4 in). The Ancient Greeks and Romans used the units "pous" and "pes," both of which translate into "foot." Unsurprisingly, this measurement was based on the length of a man's foot from the big toe to the heel (roughly 29.5 cm, or 11.6 in).
However, as any trip to a clothing or shoe store will show, not all bodies are the same. When measuring something small, like a table, the difference between one man’s foot and another’s might not make much difference. However, if what is being measured is much larger – say, a plot of land – those small differences add up (a magnification error that we’ll discuss shortly). In an effort to be fair to all its citizens, many civilizations moved to standardize measurements further. By 2500 BCE, the “royal cubit” in Egypt was determined by the forearm length of the Pharaoh and carved into black marble. It was approximately 52 cm in length (20.5 in) and was further divided into 28 equal segments, approximating the width of a finger. This provided a baseline for others and consistency across the kingdom. Individuals could bring a stick or other object that could be marked, lay it against the marble and, in effect, create a ruler that they could use to measure length, width, or height elsewhere.
As civilizations advanced and measurements became more standardized, systems of measurement were developed with increasing complexity. The ancient Mesopotamians were among the first to measure angles and time, dividing the path of the sun on the celestial sphere into twelve 30-degree intervals (each degree being 1/360 of the circumference of a circle, Figure 2). They also used the new crescent phase of the moon to mark the start of a new month. Celestial objects like the Sun and stars were used to track hours, through the use of sundials or the known seasonal positions of stars. Measurement has a long and complex history.
Measurement: Standardized numbers and units
Measurement gives us a way to communicate with one another and interact with our surroundings – but it only works if those you are communicating with understand the systems of measurement you are using. Imagine you open a recipe book and read the following:
Mix white sugar (10) with flour (1) and water (100). Wait for 1, and then bake for 1.
How would you go about using this recipe? How much sugar do you use? 10 grams? 10 teaspoons? 10 pounds? How much flour or water? Cups? Liters? Milliliters? How long do you wait? Minutes? Hours?
All measurement involves two parameters: the amount present (i.e., the number) and the unit within a system of measurement. The recipe lists the amounts (1, 10, and 100), but not the units. Without both parameters, the information is virtually useless. (To see a recipe with amounts and units, see Figure 3.)
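The idea that a measurement is only meaningful as a number paired with a unit can be sketched in code. This is a minimal illustration (the `Quantity` class and its behavior are invented for this example, not part of any standard library):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A measurement: an amount paired with a unit."""
    amount: float
    unit: str

    def __add__(self, other):
        # Amounts are only comparable when their units match.
        if self.unit != other.unit:
            raise ValueError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.amount + other.amount, self.unit)

sugar = Quantity(10, "g")
more_sugar = Quantity(5, "g")
print(sugar + more_sugar)   # Quantity(amount=15, unit='g')

flour = Quantity(1, "cup")
# sugar + flour  # would raise ValueError: the units differ
```

Just as in the recipe, the number 10 by itself tells us nothing; only the pair (10, "g") is a usable measurement.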
There are many different systems of measurement units in the world, but one commonly used in science is the metric system (described in more detail in our Metric System module). The metric system uses very precise base standards, such as the meter, a unit of length, which is defined as "the length of the path travelled by light in a vacuum during a time interval of 1/299,792,458 of a second."
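The meter's definition is really a small calculation: distance equals speed multiplied by time. Because the speed of light is fixed exactly by the definition, we can verify the arithmetic (using exact fractions to avoid rounding):

```python
from fractions import Fraction

c = 299_792_458                  # speed of light in a vacuum, m/s (exact by definition)
t = Fraction(1, 299_792_458)     # the time interval in the definition, in seconds

distance = c * t                 # distance = speed * time
print(distance)                  # 1 -- exactly one meter
```

Defining the meter this way ties the unit to an unchanging physical constant rather than to a physical artifact that could wear, warp, or be lost.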
Standard units exist in the metric system for a host of things we might want to measure, ranging from the common such as the gram (mass) and liter (volume), to the more obscure such as the abampere, or biot, an electromagnetic unit of electrical current. Despite our best efforts to organize and standardize measurement, there still exist non-standard units that do not fit neatly into any formal system of measurement. This may be because the exact quantity is not known, or because the unit has some historical relevance. For example, horses continue to be measured in the unit of height called the hand (equal to 4 inches) out of tradition. Other examples exist as well, such as the "serving size" we often encounter on pre-packaged food (Figure 4). Serving sizes vary depending on the type of item you are eating. One serving of dry cereal like Cheerios® is listed as one cup, but a serving of potato chips is commonly listed as 1 ounce, and a serving of a snack like a Twinkie® is often listed as a number of objects (for example, two Twinkies®).
How do we measure? Direct versus indirect measurement
The question of how to measure has been the topic of great discussion since antiquity. Many of the systems of measure discussed in the previous section relate to direct measurement. Direct measurement gives us a very clear, quantifiable value of "this-equals-that." I can count the number of minutes or hours until my summer vacation, or the number of miles between my house and my favorite restaurant. But some quantities are not so easily measured. While you might be able to use a ruler to measure the dimensions of your bedroom, or even the distance to a neighbor’s house, you can’t simply use a long ruler to measure the depth of the ocean.
In cases like these, scientists are called upon to make measurements that are challenging or impossible to make in a direct way. Thus, indirect measurements are commonly used in science to determine values for properties that cannot be measured directly. Indirect measurement involves estimating an unknown value by measuring something that is known. For example, the National Oceanic and Atmospheric Administration (NOAA) of the United States government commonly relies on sonar-based measurements to create maps of ocean depth. This technique involves sending out sound waves into the water and then measuring the amount of time it takes for the sound to be reflected back to the instrument. Since the speed of sound is known, by measuring the time between the original transmission and reception of the response, a sonar operator can calculate the distance to the object, and thus the depth of the ocean (Figure 5).
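The sonar calculation described above can be written out directly. One detail matters: the measured time covers the sound's round trip, down to the seafloor and back, so the depth is half the total distance traveled. The numbers here are illustrative; 1,500 m/s is a typical speed of sound in seawater, though the actual value varies with temperature, salinity, and pressure:

```python
def ocean_depth(round_trip_seconds, sound_speed_mps=1500.0):
    """Estimate ocean depth from a sonar echo time.

    The ping travels down and back, so the one-way distance
    (the depth) is (speed * time) / 2.
    """
    return sound_speed_mps * round_trip_seconds / 2

# An echo that returns after 4 seconds implies roughly 3,000 m of water.
print(ocean_depth(4.0))   # 3000.0
```

This is the essence of indirect measurement: we measure a quantity we can observe (elapsed time) and combine it with a known value (the speed of sound) to compute one we cannot observe directly (depth).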
Science has, over time, built a reputation as being objective, careful, and precise – or, at least, as precise as is possible given the latest knowledge and technology. This leads many to believe that when errors in science occur, they are the result of human error. While mistakes certainly do happen, the "error" in measurement error does not mean a mistake has taken place; rather, it refers to the variability around a specific measurement.
All measurements have variability. Think about it: if you take a ruler and try to make a mark six inches from the edge of your paper, a number of things affect the measurement – from the width of the pencil you are using to how you align your mark with the line on the ruler. Or consider the amount of calories in a pre-packaged snack food item, like a Twinkie®. If you look at the nutrition label on the package, you'll see a serving size (77 g, or two Twinkies®), calories per serving (290), and other nutritional details. However, you might imagine that the precise amount of batter or filling used can vary from cake to cake, and these differences will affect the number of calories per serving. The differences aren't large, but there is variation nonetheless.
While the fraction of a calorie difference in the size of a Twinkie® may not make or break your diet, measurement error is compounded across multiple steps of a calculation and can become a problem. Think of all of the measurements needed to send a spacecraft to Mars, for example. We need to accurately know the speed at which the craft will travel, which itself depends on measurements like the force the engines produce, the weight of the craft, the gravitational pull of Earth, etc. Small amounts of uncertainty in each of those measurements can add up to cause significant error in our calculations (Figure 6). This is why scientists not only report on the value of measurements they collect, but they also try to estimate the uncertainty associated with those measurements. For more information on measurement error, see our module Uncertainty, Error, and Confidence.
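A common way to quantify how uncertainties compound is the standard rule that, for quantities multiplied or divided together, independent relative uncertainties combine in quadrature (the square root of the sum of squares). The spacecraft scenario above is only a motivation; the 1% figures below are illustrative, not real mission data:

```python
import math

def combined_relative_uncertainty(*relative_uncertainties):
    """Combine independent relative uncertainties in quadrature.

    Applies to a result formed by multiplying or dividing the
    measured quantities.
    """
    return math.sqrt(sum(u ** 2 for u in relative_uncertainties))

# Suppose a calculation multiplies three measured quantities,
# each known only to within 1% (a relative uncertainty of 0.01).
u = combined_relative_uncertainty(0.01, 0.01, 0.01)
print(f"{u:.4f}")   # 0.0173 -- about 1.7%, larger than any single input
```

Even though each individual measurement is quite good, the uncertainty of the final result is noticeably larger than any one of them, which is exactly why multi-step calculations demand careful error estimates.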
We are constantly measuring the world around us and using that information to make decisions. From the casual decision of which snack to enjoy to the important one of how much medicine to take, we quantify and measure values. And we've been measuring the world since very early times, continuously refining our techniques and discovering new ways to measure. Every measurement, even the most precise, includes some margin of error. But through awareness of these errors and careful attention to values and units, we can achieve very high levels of accuracy. And that is the ultimate goal of measurement – to provide accurate information that everyone can understand and use.