Adrian Colyer has a cool way of building intuition for orders of magnitude: translate them into human terms. For example, to get a feel for how long a second is in computer time, upscale nanoseconds to seconds:
If we set up a ‘human time’ scale where one nanosecond is represented by one second, then we have:
1 nanosecond = 1 second
1 microsecond = 16.7 minutes
1 millisecond = 11.6 days
1 second = 31.7 years
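The conversions above are straightforward arithmetic, and a small sketch can reproduce the whole scale. In this hypothetical helper, a duration in nanoseconds is reinterpreted as that many seconds of "human time" and then formatted in the largest sensible unit (using a Julian year of 31,557,600 seconds):

```python
def human_scale(nanoseconds: float) -> str:
    """Map computer time to 'human time': 1 ns of real time = 1 s on this scale."""
    scaled = nanoseconds  # scaled value, now read as seconds
    minute, day, year = 60, 86_400, 31_557_600  # seconds per unit (Julian year)
    if scaled < minute:
        return f"{scaled:.1f} seconds"
    if scaled < day:
        return f"{scaled / minute:.1f} minutes"
    if scaled < year:
        return f"{scaled / day:.1f} days"
    return f"{scaled / year:.1f} years"

print(human_scale(1))      # 1 nanosecond  -> "1.0 seconds"
print(human_scale(1e3))    # 1 microsecond -> "16.7 minutes"
print(human_scale(1e6))    # 1 millisecond -> "11.6 days"
print(human_scale(1e9))    # 1 second      -> "31.7 years"
```

Running it reproduces exactly the figures in the list above.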
That slow page render starts looking different when half a second of wall-clock time means waiting roughly 15 years on this scale…
He has similar analogies for latency and throughput.