The Moment Magnitude Scale (MMS, denoted Mw) is a logarithmic scale used to measure the size of earthquakes. It was developed in the late 1970s by seismologists Thomas C. Hanks and Hiroo Kanamori as a successor to the Richter magnitude scale, which saturates and therefore underestimates the size of large earthquakes.

The MMS is based on the seismic moment of an earthquake, a physical measure of its overall size that is closely related to the total energy released. The seismic moment is the product of the rigidity (shear modulus) of the rock in the fault zone, the area of the fault surface that ruptured, and the average slip across the fault. It is estimated from seismograms recorded at seismic stations, and the resulting value is then converted into the earthquake's magnitude.
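As a minimal sketch of how these quantities fit together, the following Python snippet computes a seismic moment from assumed fault parameters and converts it to a moment magnitude using the standard relation Mw = (2/3)(log10 M0 − 9.1), with M0 in newton-metres. The rigidity, rupture area, and slip values are purely illustrative, not taken from any particular earthquake.

```python
import math

def seismic_moment(rigidity_pa, rupture_area_m2, average_slip_m):
    """Seismic moment M0 = rigidity * rupture area * average slip, in N*m."""
    return rigidity_pa * rupture_area_m2 * average_slip_m

def moment_magnitude(m0_newton_metres):
    """Moment magnitude Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Illustrative (hypothetical) fault parameters:
rigidity = 3.0e10            # shear modulus of crustal rock, about 30 GPa
rupture_area = 50e3 * 20e3   # a 50 km x 20 km rupture surface, in m^2
average_slip = 2.0           # average slip over the fault, in metres

m0 = seismic_moment(rigidity, rupture_area, average_slip)
mw = moment_magnitude(m0)
print(f"M0 = {m0:.2e} N*m  ->  Mw = {mw:.1f}")  # roughly Mw 7.1
```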

The MMS is expressed in whole numbers and decimal fractions. Each whole-number step corresponds to roughly a tenfold increase in the amplitude of recorded ground motion and about a 32-fold (more precisely, 10^1.5) increase in the energy released. For example, an earthquake with a magnitude of 7.0 releases about 32 times more energy than an earthquake with a magnitude of 6.0.
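The energy comparison above follows directly from the 10^(1.5 ΔM) scaling. A short Python sketch of that arithmetic, with the caveat that this gives the approximate ratio of radiated energy, not a precise physical quantity for any specific event:

```python
def energy_ratio(magnitude_a, magnitude_b):
    """Approximate ratio of radiated seismic energy between two magnitudes.

    Radiated energy scales roughly as 10**(1.5 * M), so a one-unit
    magnitude difference is about a 10**1.5 (~31.6-fold, commonly
    rounded to 32-fold) difference in energy.
    """
    return 10 ** (1.5 * (magnitude_a - magnitude_b))

print(f"Mw 7.0 vs Mw 6.0: {energy_ratio(7.0, 6.0):.1f}x the energy")  # ~31.6x
print(f"Mw 9.0 vs Mw 7.0: {energy_ratio(9.0, 7.0):.0f}x the energy")  # ~1000x
```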

The MMS has become the standard measure of earthquake magnitude and is used by seismologists worldwide. Because it does not saturate, it provides a more accurate and consistent measure of earthquake size than earlier scales, particularly for large earthquakes, and it is better suited to earthquake hazard assessment and engineering design.