Why Don't We Use Local Mean Time (LMT) Anymore?
Did you know that Local Mean Time (LMT) lets you calculate the time of day based on the Sun's average position in the sky?
Have trouble understanding that? Don’t worry, we’ll break it down into simpler terms.
What Is Local Mean Time (LMT)?
Local mean time is a timekeeping method based on the Sun's movement: it gives the current time of day measured against the length of an average (mean) solar day.
The most familiar solar timekeeping tool is the sundial – which shows true (apparent) solar time, the raw reading that local mean time smooths out.
There are some strange limitations that go along with solar time. For instance, the length of a true solar day varies throughout the year, because the Earth's speed along its orbit around the Sun is not constant and its axis is tilted. This means that some solar days are slightly longer than others, so the rate of true solar time is not constant either.
It all has to do with the equation of time, which tracks the difference between sundial time and clock time over the year. When the equation-of-time curve is above zero, a sundial reads ahead of a mean-time clock (by up to about 16 minutes); when it is below zero, the sundial lags behind (by up to about 14 minutes).
So, local mean time is not actually synced with true solar time. Instead, it averages out these speed variations, making solar time practical for everyday timekeeping.
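To make the correction concrete, here is a minimal Python sketch. It uses a common empirical approximation of the equation of time (the coefficients are a standard textbook approximation, not an official formula), then subtracts it from a sundial reading to recover local mean time:

```python
import math

def equation_of_time_minutes(day_of_year: int) -> float:
    """Approximate equation of time (sundial time minus clock time), in minutes.

    Uses a common empirical approximation, accurate to within a minute
    or so over the year -- good enough to illustrate the idea.
    """
    b = math.radians(360.0 / 365.0 * (day_of_year - 81))
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

def sundial_to_local_mean_time(sundial_minutes: float, day_of_year: int) -> float:
    """Convert a sundial (true solar) reading to local mean time.

    Both values are minutes past local midnight.
    """
    return sundial_minutes - equation_of_time_minutes(day_of_year)
```

In early November the correction is around +16 minutes (the sundial runs ahead of the clock), while in mid-February it is around -14 minutes (the sundial lags behind).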
What Is Mean Solar Time?
Mean solar time is based on, well, a mean – that mean being the average solar day. Remember the mean from math class? It's the average of a series of numbers.
The average solar day is 24 hours long, and mean solar time advances at a constant rate, unlike true solar time.
Why does this matter?
Because local mean time is calculated by identifying the mean solar time for a certain location on Earth. This is easier than it sounds: locations that share the same longitude have the same local mean time.
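The longitude-to-time relationship is simple arithmetic: the Earth turns 360 degrees in 24 hours, so 15 degrees of longitude equal one hour, or 4 minutes per degree. A short sketch (the city longitudes below are approximate, illustrative values):

```python
def lmt_offset_minutes(longitude_deg: float) -> float:
    """Offset of local mean time from Greenwich, in minutes.

    360 degrees / 24 hours = 15 degrees per hour = 4 minutes per degree.
    Positive (east) longitudes run ahead of Greenwich; negative (west)
    longitudes run behind.
    """
    return longitude_deg * 4.0

# Approximate longitudes, for illustration:
bristol = lmt_offset_minutes(-2.59)  # about 10.4 minutes behind Greenwich
paris = lmt_offset_minutes(2.35)     # about 9.4 minutes ahead of Greenwich
```

This is why, in the era of local mean time, clocks in cities only a few degrees of longitude apart disagreed by several minutes.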
Modern Local Mean Time
Although we don’t use local mean time directly, it helps ensure our clocks are synced with the Earth’s rotation.
One version of Universal Time - called UT1 - is a standard based on the local mean time at Greenwich, London. You may recognize this location as the Prime Meridian: the reference line from which time zones and local times across the entire globe are reckoned.
For now, just know that UT1 is the local mean time of the Prime Meridian, and it’s crucial for determining Coordinated Universal Time (UTC). UTC is the timescale we use today to sync up all the local times across the world.
Local Mean Time Used to Be the International Time Standard
Until the 1960s, the local mean time in Greenwich - known as Greenwich Mean Time, or GMT - was the international time standard. Clocks across the world were synced using GMT.
But in 1967, that standard was replaced by the more accurate Coordinated Universal Time (UTC), the time standard we still use today.
Why Was Local Mean Time Abandoned?
During the 19th century, many countries observed local mean time as the standard. This meant that each location – including major cities – kept its own local time, determined by its longitude.
This translated to a difference of about 4 minutes for each degree of longitude. It's easy to see how this would be confusing: it caused scheduling conflicts, especially for anyone traveling long distances.
This became an even bigger problem as transportation and communication sped up during the 19th century. With railways, people could travel greater distances in less time – and encounter a long succession of local time standards, and therefore time changes, along the way.
Local mean time just wasn’t a practical standard to continue using. It’s a good thing we came up with Coordinated Universal Time instead, huh?