How Are Earthquakes Measured?


1906 San Francisco Aftermath

Aftermath of the San Francisco earthquake on April 18, 1906.

© Library of Congress


Because of the scale at which they take place, natural disasters can be challenging to measure. Consider earthquakes: you can't ask how high an earthquake is, or quantify the weight of the tectonic plates shifting against one another. Instead, seismologists try to measure the energy released by a quake, which you can learn all about at the Museum's Nature's Fury exhibit.

Efforts to detect earthquakes stretch back thousands of years. In 132 CE, the Chinese polymath Zhang Heng crafted what is thought to be the first seismic instrument: a bronze vase-shaped device with eight tubes protruding from it, corresponding to the points of a compass, each holding a bronze ball. When the vase detected an earthquake, a ball would drop from the appropriate tube into a container below, indicating the direction of the quake. Contemporary reports indicate this primitive seismoscope could detect quakes hundreds of miles away, though later attempts to replicate the device couldn't reproduce this degree of accuracy.

Fast-forward to the 20th century. Most Americans are familiar with the Richter scale, developed by seismologist Charles Richter in 1935 at the California Institute of Technology. This scale is based on the amplitude of the largest shock wave recorded by a seismograph 100 km from the earthquake's epicenter (the point on Earth's surface directly above the rupture).

Seismometer

Güralp CMG-3T compact seismometer

Courtesy of Güralp


Initially devised only to compare the strength of moderate quakes along the San Andreas fault in southern California, the Richter scale was eventually generalized to measure earthquakes all over the world. The Richter scale is logarithmic, with each whole-number step up the scale marking a tenfold increase in the amplitude of the recorded waves, which corresponds to roughly 32 times more energy released. A 4.0 quake on the Richter scale, for instance, releases about 32 times the energy of a 3.0 earthquake. The problem was that for large quakes, those above about 7.0 on the scale, the Richter scale was less reliable.
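The arithmetic here can be sketched in a few lines of Python. Note that the tenfold-per-step figure applies to the amplitude of the recorded waves; for the energy released, the standard Gutenberg-Richter energy-magnitude relation (not spelled out in the text above) works out to roughly 32 times more energy per whole-number step:

```python
def amplitude_ratio(m1: float, m2: float) -> float:
    """How much larger the recorded wave amplitude is for a
    magnitude-m2 quake than a magnitude-m1 quake: 10x per step."""
    return 10 ** (m2 - m1)

def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of seismic energy released, using the
    Gutenberg-Richter relation log10(E) = 1.5*M + const."""
    return 10 ** (1.5 * (m2 - m1))

print(amplitude_ratio(3.0, 4.0))         # 10.0
print(round(energy_ratio(3.0, 4.0), 1))  # 31.6
```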

In 1979, as geologists developed more accurate techniques for measuring energy release, a new scale began to replace the Richter scale: the moment magnitude, or Mw, scale, which measures the total energy released by an earthquake. It's also a logarithmic scale and is calibrated to agree with the Richter scale for small and medium quakes (a 5.0 on the Richter scale, for example, is also about a 5.0 Mw quake), but it is better suited to measuring large quakes.

No matter what scale is used, quakes are detected using devices called seismographs, which measure ground motion and produce records showing how these vibrations travel over time. The magnitude of a quake determines how it is classified by organizations such as the U.S. Geological Survey, from "micro" quakes, which are generally too small to be felt by humans, to "great" quakes, which can cause devastation over huge areas.
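The magnitude-to-class mapping can be sketched as a simple lookup. The cutoffs below follow the commonly cited scheme (micro below 2.0, great at 8.0 and above); the exact boundaries are an assumption, since the text does not list them:

```python
def magnitude_class(magnitude: float) -> str:
    """Map a magnitude to a descriptive class like those used by
    the USGS. Boundaries follow the commonly cited scheme and are
    illustrative, not an official USGS table."""
    bands = [
        (2.0, "micro"), (4.0, "minor"), (5.0, "light"),
        (6.0, "moderate"), (7.0, "strong"), (8.0, "major"),
    ]
    for upper_bound, name in bands:
        if magnitude < upper_bound:
            return name
    return "great"

print(magnitude_class(1.5))  # micro
print(magnitude_class(9.5))  # great
```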