20 March 2019

Counting down to the SI redefinition: Give us a second!

As the UK’s National Measurement Laboratory for bio and chemical measurements at LGC, we are particularly excited about the redefinition of the International System of Units (SI) agreed last November by measurement scientists from around the world. Each month, we are publishing a blog focusing on one of the seven SI base units in the lead-up to World Metrology Day (20 May), when the redefinition will come into effect.

This month we are taking a peek at the redefinition of the second.

“What’s the time?” is one of the most frequently asked questions and a subject of fascination ever since human societies first tried to organise time. So, let's welcome our March unit of the month: the second, the SI unit of time.

Our civilization today is highly time-dependent. Working hours, school timetables, shop opening hours and train timetables all (more or less) keep to time. Some industries require much higher levels of timing accuracy, such as banking, stock exchanges, telecommunications, Internet communications and satellite navigation. Without accurate timing we might get lost when navigating by GPS, fail to connect with the person we’re trying to call, and be unable to guarantee bank transfers over the Internet or the reliable exchange of information by e-mail.

Time is the most accurately measured quantity in metrology, and very accurate timing is a prerequisite for measuring other quantities, such as determining laser frequencies for measuring length. While such precision isn’t always necessary, most of us come into contact with time measurements in our daily lives that we’d prefer were measured accurately, such as household electricity, gas and other bills.
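A concrete example of the link between time and length: because the metre is defined by fixing the speed of light c, a length measurement can be turned into a frequency measurement, and frequency is exactly what clocks measure best:

\[ \lambda = \frac{c}{\nu}, \qquad c = 299\,792\,458\ \mathrm{m\,s^{-1}} \]

Measure the frequency \( \nu \) of a stabilised laser against an atomic clock and its wavelength \( \lambda \), the practical ruler of precision length metrology, follows immediately.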

Within the NML and at LGC, accurate time is of the utmost importance. Chemical reactions, metabolic rates of pharmaceuticals, partnering with companies to develop rapid tests: time is a factor in so much of the science done around the world.

A short history of time

Humanity has sought to measure time for thousands of years. Natural markers of the passage of time, such as day passing into night and the changing of the seasons, have held huge significance throughout history, but the first civilization we know of to apply astronomical observations to measuring time was the ancient Egyptians, in approximately 2000 BC. They used sundials, tracking the movement of a cast shadow. Designs varied from monumental Egyptian obelisks down to smaller, portable and even pocket-size versions. But solar clocks have some obvious shortcomings: they do not work at night, and their construction must account for latitude (which affects the elevation of the sun).

Mechanical clocks were developed in Europe during the 13th and 14th centuries, making use of mechanical energy stored in a spring or weight. Hundreds of years of innovation culminated in the Shortt clock, the most accurate pendulum clock commercially produced, which set the standard for timekeeping between the 1920s and the 1940s. It used two electrically coupled pendulums: a free "master" pendulum swinging in a vacuum and a "slave" pendulum, kept in step by electromagnets, that drove the clock mechanism.

The era of quartz as the basis for time standards began in the 1930s. Applying a voltage to a quartz crystal makes it vibrate at a precise frequency. Typical modern quartz clocks may be accurate to about half a second per day, while the most accurate may be out by only about a second over 20-30 years.
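As a rough back-of-envelope check, half a second per day corresponds to a fractional accuracy of a few parts per million:

\[ \frac{0.5\ \mathrm{s}}{86\,400\ \mathrm{s}} \approx 6 \times 10^{-6} \]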

Quartz clocks were in turn superseded in precision by atomic clocks, an idea first proposed by Lord Kelvin as far back as 1879. Atomic clocks are based on quantum phenomena: atoms can change energy only in discrete steps of fixed size, accompanied by the emission (or absorption) of electromagnetic radiation at fixed frequencies.

The first accurate atomic clock, based on a transition of the caesium-133 atom, was built in 1955 at the National Physical Laboratory in the UK. The caesium atom remains the most popular choice; the frequency of its transition lies in the microwave band (~9 GHz). Clocks operating at optical frequencies are in development, and these, in theory, promise an accuracy of measurement equivalent to losing or gaining just one second in about 30 billion years.
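To put "one second in 30 billion years" into numbers: a year is about 3.16 × 10⁷ seconds, so this corresponds to a fractional uncertainty of roughly one part in 10¹⁸:

\[ \frac{1\ \mathrm{s}}{30 \times 10^{9}\ \mathrm{yr} \times 3.16 \times 10^{7}\ \mathrm{s/yr}} \approx 1 \times 10^{-18} \]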


How we have defined the second, and decreased its uncertainty with time

In this graph, you can see how the relative standard uncertainty of the definition of the second decreased as our measurement capabilities improved throughout the 20th century, leading us to the modern day and, shortly, the SI redefinition of the second!

For most of its history, the second was defined as a fraction of the average solar day, based on the position of the sun in the sky. One second was exactly 1/86 400 of the average solar day.
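The number 86 400 comes straight from the familiar division of the day:

\[ 24\ \mathrm{h} \times 60\ \mathrm{min/h} \times 60\ \mathrm{s/min} = 86\,400\ \mathrm{s} \]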

In 1956, the International Committee for Weights and Measures (CIPM) proposed a new definition of the second as "the fraction 1/31 556 925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time."

This definition was formally adopted by the General Conference on Weights and Measures (CGPM) in 1960.

In 1967, following the development of atomic clocks, a revolution in the definition of the second took place, and that definition still applies today:

The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.
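Put another way, each period of this caesium radiation lasts 1/9 192 631 770 of a second, so counting exactly that many periods marks off one second:

\[ T = \frac{1}{\Delta\nu_{\mathrm{Cs}}}, \qquad 1\ \mathrm{s} = 9\,192\,631\,770 \times T \]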

However, on 20 May 2019, the latest redefinition of the second will come into force:

The second, symbol s, is the SI unit of time. It is defined by taking the fixed numerical value of the caesium frequency ΔνCs, the unperturbed ground-state hyperfine transition frequency of the caesium 133 atom, to be 9 192 631 770 when expressed in the unit Hz, which is equal to s⁻¹.

So what’s the difference?

From a technical point of view, none. Microwave caesium radiation will still be used, with the same defined frequency. However, the wording of the definition has been changed so that, after the redefinition, all the SI base units are expressed in a consistent way: each is defined by fixing the numerical value of a defining constant.
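To see the equivalence explicitly, fix the numerical value of ΔνCs and rearrange, remembering that Hz = s⁻¹:

\[ \Delta\nu_{\mathrm{Cs}} = 9\,192\,631\,770\ \mathrm{s^{-1}} \;\Longrightarrow\; 1\ \mathrm{s} = \frac{9\,192\,631\,770}{\Delta\nu_{\mathrm{Cs}}} \]

which is exactly the duration of 9 192 631 770 periods of the caesium radiation, just as in the 1967 definition.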