Time keeps on ticking! - Kesler Science Weekly Phenomenon and Graph

Written by Chris Kesler | Jul 25, 2025 6:00:00 PM

Ancient astronomers were amazing. They created places like Stonehenge, where the huge rock columns match the Sun’s position during the summer and winter solstices. The Mayan pyramid of Kukulkan is wild, too. During the equinoxes, the shadows on the stairs look like a massive serpent slithering down the side of the pyramid. 🐍 How cool is that?

Clearly, people have been tracking celestial events for ages. The truly amazing thing is, the science of keeping time—the minutes and seconds we live by—is still evolving.

Early cultures, like those who created Stonehenge, would use the Sun's position in the sky to estimate the time of day. About 4,000 years ago, artisans created water clocks by putting a small hole in the bottom of a clay pot. When the pot was filled with water, the water dripped out into a bucket at a regular rate. Markings on the side of the pot, or on the side of the bucket that collected the water, would then show how much time had passed. Pretty ingenious!

Sand and water ruled timekeeping until about 800 years ago, when mechanical clocks driven by heavy falling weights made their debut; later versions were regulated by swinging pendulums. The "tick-tock" of these clocks quickly replaced the "drip-drip" of water clocks. Around 1675, spring-driven mechanisms allowed timekeeping to become portable, even wearable! These portable watches were not as accurate as their bigger, stationary cousins, but they were very convenient.

In the 20th century, scientists learned to use quartz crystals to keep time. A quartz crystal vibrates at a very steady frequency when an electrical current is applied, so it’s great for keeping every second exactly the same length. Pretty clever to use a rock to tell time.

Now, if you walked up to someone today and asked, “What time is it?” they would probably look at you funny, but then they might check their phone. And while our phones do have quartz crystals inside, they’re also synced to GPS satellites and cell towers. That’s how they stay so precise.

Here’s where it gets even wilder: those GPS satellites rely on military-grade atomic clocks to stay in sync. Atomic clocks measure time based on cesium-133 atoms, whose electrons jump between energy levels when zapped by photons. In the U.S., all our phones stay in sync with one super-precise atomic clock in Boulder, Colorado.

And scientists aren’t stopping there. They’re working on nuclear clocks, which measure time using the nucleus at the center of the atom instead of the electrons. The nucleus is more stable than the orbiting electrons, so the clock might lose only 1 second every billion years!

This could bring big changes to navigation. Imagine pairing a nuclear clock with a GPS satellite—your location could be precise down to the millimeter. Spacecraft with nuclear clocks could help us measure distances between planets and moons down to the centimeter. The future of timekeeping is looking pretty mind-blowing!

The graphs below show how timekeeping accuracy has changed in recent history. A clock's accuracy is measured by how many seconds, minutes, or hours it drifts from the true time over the course of a day. A clock might "gain" six minutes (showing 10:15 when it is really 10:09), or it might "lose" four minutes (showing 10:15 when it is really 10:19).
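To make the gain/lose idea concrete, here's a tiny Python sketch of that bookkeeping (the helper name `clock_error_seconds` is just for illustration):

```python
from datetime import datetime

def clock_error_seconds(shown: str, actual: str) -> float:
    """Seconds the clock has gained (positive) or lost (negative).

    Times are "HH:MM" strings, assumed to fall on the same day.
    """
    fmt = "%H:%M"
    delta = datetime.strptime(shown, fmt) - datetime.strptime(actual, fmt)
    return delta.total_seconds()

# Showing 10:15 when it is really 10:09 -> gained six minutes
print(clock_error_seconds("10:15", "10:09"))  # 360.0

# Showing 10:15 when it is really 10:19 -> lost four minutes
print(clock_error_seconds("10:15", "10:19"))  # -240.0
```

Either way, the size of that daily error is what the graphs plot.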

The first graph shows the improvements in timekeeping accuracy on a linear scale - the familiar scale where, in this case, each line represents an increase of 100 seconds over the line below it.

The second graph shows the improvement on a logarithmic scale, where each line on the scale represents a value 10 times greater than the line below it.
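One way to see why the logarithmic scale spreads the data out is to compare the gaps between values on each scale. A minimal Python sketch, using made-up round numbers for daily error (not the actual graph data):

```python
import math

# Illustrative daily errors, in seconds per day, getting 10x better each time
errors = [600, 60, 6, 0.6]

# On a linear scale, the gap between consecutive values shrinks fast:
linear_gaps = [a - b for a, b in zip(errors, errors[1:])]
print(linear_gaps)  # [540, 54, 5.4] -- later improvements look tiny

# On a log scale we plot log10 of each value, so every "10x better"
# step moves the line down by the same amount:
log_positions = [math.log10(e) for e in errors]
log_gaps = [a - b for a, b in zip(log_positions, log_positions[1:])]
print(log_gaps)  # all gaps equal: each 10x improvement is one full step
```

That equal spacing is why a steady "10x better" trend shows up as a steady line on the second graph but flattens out on the first.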

If I brought this pair of graphs to class, here are some questions I'd have to go with them:

💡Around 1675, Huygens created the first watch. Looking at the first graph, what changes correlate with this event? The daily error jumped from about 60 seconds per day to 600 seconds per day - the new watches were portable, but accuracy got worse.

💡Compare the way the data from 1721 to 1921 looks on the first graph and the second graph. The first uses a linear scale, which increases by the same amount at each step. The second uses a logarithmic scale, which multiplies by 10 at each step. How does this difference affect the way the data looks? The linear scale makes the line from those dates look nearly flat. The trend of improving accuracy is much easier to see on the logarithmic scale.

💡Quartz movement, developed in 1927, allows accuracy in which a second might be gained or lost in a month, not a day. Which of the two graphs would better show this accuracy, and why? On the linear scale, the line would continue to look flat. On the logarithmic scale, the line would continue to drop, so the logarithmic scale is the better choice for showing the trend.
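To put numbers on that last question: one second per month is a tiny fraction of a second per day, far too small to see on a scale marked in 100-second steps. A quick back-of-the-envelope in Python (assuming a 30-day month for simplicity):

```python
import math

# One second gained or lost per month, expressed per day
quartz_error_per_day = 1 / 30
print(round(quartz_error_per_day, 3))  # 0.033 seconds per day

# On the linear graph, each gridline is 100 seconds, so this error
# covers only a vanishingly small fraction of one step:
print(quartz_error_per_day / 100)  # about 0.0003 of one gridline

# On the log graph, log10 placement still gives it a visible position:
print(math.log10(quartz_error_per_day))  # about -1.48
```

That is exactly the situation where a logarithmic scale earns its keep.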