I got interested in comparing the experience of human time to computer time. I haven’t verified the exact numbers below, so take them with a grain of salt. I think it’s interesting, if only from a sci-fi perspective.
| Computer Information | Computer Time | Human Time | Human Information |
| --- | --- | --- | --- |
| 1 CPU cycle | 0.3 ns | 1 s | Add two small numbers in your head |
| Level 1 cache access | 0.9 ns | 3 s | A number scribbled in the margin of a piece of paper |
| Level 2 cache access | 2.8 ns | 9 s | Something written a few pages away |
| Level 3 cache access | 12.9 ns | 43 s | A term you have to look up in the index at the back of the book |
| Main memory access | 120 ns | 6 min | Something you wrote in your notebook, but aren’t sure exactly where |
| Solid-state disk i/o | 50 µs | 2 days | A book from the local library |
| Rotational disk i/o | 1 ms | 1 month | A book you have to research, find, order, and have mailed to you |
| Internet: US East to West | 40 ms | 4 years | Something you have to get a college degree to learn |
| Internet: North America to Europe | 81 ms | 8 years | Something you learn after starting your career |
| Internet: North America to Australia | 183 ms | 19 years | Something you learn after a career spent in the field |
| Virtual OS reboot | 4 s | 423 years | It takes generations of people to figure this out |
| Physical system reboot | 5 m | 32 millennia | Since the time people were hunter/gatherers |
| Modern home computers, 1980s to 2020s | 40 y | ~130 billion years | Longer than the Earth has existed |
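The table is built on a single scale factor: stretch one 0.3 ns CPU cycle into one human second (a factor of roughly 3.3 billion), then apply that factor to every latency. A minimal sketch of that conversion, with a few latencies from the table above (the formatting and unit breakpoints are my own assumptions, not the original author’s method):

```python
# Scale factor: one 0.3 ns CPU cycle is stretched to one human second.
SCALE = 1.0 / 0.3e-9  # ~3.33 billion

# A few latencies from the table, in real (computer) seconds.
LATENCIES = {
    "L1 cache access": 0.9e-9,
    "Main memory access": 120e-9,
    "Rotational disk i/o": 1e-3,
    "Internet: US East to West": 40e-3,
}

def human_equivalent(seconds: float) -> str:
    """Convert a computer-time latency to a human-scale duration string."""
    scaled = seconds * SCALE
    for unit, size in [("years", 365.25 * 86400), ("days", 86400),
                       ("hours", 3600), ("minutes", 60)]:
        if scaled >= size:
            return f"{scaled / size:.1f} {unit}"
    return f"{scaled:.1f} seconds"

for name, latency in LATENCIES.items():
    print(f"{name}: {human_equivalent(latency)}")
```

Running this reproduces the table’s rough figures: an L1 cache hit becomes about 3 seconds, a main memory access about 6.7 minutes, and a rotational disk read about 38.6 days, i.e. the "1 month" in the table.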
I’m not exactly sure where this originally came from, but one example introduces it as “a neat trick for understanding how long computer processes take.”