How Long Is A Millisecond?

“Time is relative; its only worth depends upon what we do as it is passing.”
― Albert Einstein

Different trades and professions have their favorite units of measurement. For astronomers it’s light years, for farmers it’s acres, and for carpenters it’s inches. For us systems programmers, it’s clearly milliseconds.

Images from a video camera running at 60 FPS arrive every 16.7 milliseconds. An operating system’s time-sliced task scheduler might switch tasks every 50 milliseconds. During full braking, an anti-lock braking system typically applies/releases braking pressure every 75 milliseconds. Milliseconds seem to be perfectly suited for close-to-the-metal programming and for smoothly controlling systems.

But how long is a millisecond, actually? Sure, everybody knows that it’s one thousandth of a second, but I’m rather curious as to how much work can be accomplished in a millisecond. Is it, for instance, a waste of CPU cycles if a thread wakes up every 5 ms to poll a sensor?
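To make the scenario concrete, here is a minimal C sketch of such a polling thread (assuming a POSIX system; read_sensor() is a hypothetical placeholder, not a real API):

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical placeholder for real hardware access. */
    static int read_sensor(void) {
        return 42;
    }

    int main(void) {
        /* Wake up every 5 ms, i.e. every 5,000,000 ns. */
        struct timespec period = { .tv_sec = 0, .tv_nsec = 5L * 1000 * 1000 };

        for (;;) {
            printf("sensor: %d\n", read_sensor());
            /* The thread sleeps here; the CPU is free to run other work. */
            nanosleep(&period, NULL);
        }
    }

Between wake-ups the thread costs essentially nothing; the interesting question is how those 5 ms of “idle” time relate to what the CPU could do with them.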

People often use “dog years” to translate a dog’s age into human terms. We can apply the same idea and use “CPU years” to better understand CPUs.

For us humans, a second is a very important unit of measurement, as the human heart beats (roughly) once every second. As an example, let’s use a fictitious modern-day RISC CPU that runs at a clock rate of one GHz and executes one instruction per clock cycle. If we view the clock frequency as the heartbeat of a CPU, then a CPU’s heart beats every nanosecond[1]. Thus, one millisecond corresponds to one million CPU heartbeats.

One million human heartbeats equal roughly 12 days (i.e. 1,000,000 / 60 / 60 / 24), a time span in which a human being can get a lot of work done[2]. Likewise, for our CPU, a millisecond corresponds to 12 CPU days. A full second corresponds to — lo and behold! — 32 CPU years (i.e. 12,000 CPU days).

Now, you might be coding for an embedded system that runs at “only” 100 MHz, so in this case a millisecond is “just” 1.2 CPU days. Or, you might have a Raspberry Pi 4 running at 1.5 GHz whose Cortex-A72 CPU sports a 3-way out-of-order superscalar pipeline, in which case you can probably multiply the 12 days by three.
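If you want to check the back-of-the-envelope numbers yourself, here is a small C sketch of the same arithmetic (the function name and the clock rates are just for illustration):

    #include <stdio.h>

    /* Convert a clock rate (in Hz, assuming one instruction per cycle)
     * into "CPU days" per millisecond, one heartbeat per cycle. */
    static double cpu_days_per_ms(double clock_hz) {
        double beats_per_ms = clock_hz / 1000.0;    /* cycles in one millisecond */
        return beats_per_ms / 60.0 / 60.0 / 24.0;   /* "seconds" -> days         */
    }

    int main(void) {
        printf("1 GHz:   1 ms = %.1f CPU days\n", cpu_days_per_ms(1e9));    /* ~11.6 */
        printf("100 MHz: 1 ms = %.1f CPU days\n", cpu_days_per_ms(100e6));  /* ~1.2  */
        printf("1 GHz:   1 s  = %.1f CPU years\n",
               cpu_days_per_ms(1e9) * 1000.0 / 365.0);                      /* ~31.7 */
        return 0;
    }

The exact figures, about 11.6 CPU days per millisecond and 31.7 CPU years per second, round neatly to the dozen days and 32 years quoted above.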

While not perfect, this “a millisecond is a dozen CPU days” rule of thumb is a nice reminder that on a modern CPU, a millisecond really stands for “a lot of time”.

________________________________

[1] A nanosecond is such a mind-bogglingly short period of time that even the fastest thing in the universe (i.e. “light”) can only travel about one foot (30 cm) in a nanosecond.

[2] At least theoretically. Definitely not if you work for a large corporation and your schedule is packed with meetings.