An algorithm runs in O(n log n) time. If it takes 1 second for n = 1000, approximately how long will it take for n = 1,000,000?
- A. ~1,000 seconds
- B. ~2,000 seconds ✓ Correct
- C. ~6,000 seconds
- D. ~1,000,000 seconds
Explanation
n grows by 1000x, and log₂ n roughly doubles (from log₂ 1000 ≈ 10 to log₂ 1,000,000 ≈ 20), so the total work n log n grows by about 1000 × 2 = 2000x. That puts the runtime near 2,000 seconds: more than the linear extrapolation (1,000 s) and far below the O(n²) extrapolation (1,000,000 s).
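The scaling argument can be checked numerically. A minimal sketch, assuming the runtime follows T(n) = c · n · log₂ n and using the stated measurement (n = 1000 takes 1 second) to fix the constant:

```python
import math

def predicted_runtime(n_new, n_ref=1_000, t_ref=1.0):
    """Extrapolate runtime under T(n) = c * n * log2(n).

    n_ref, t_ref: the reference measurement (n = 1000 takes 1 s).
    The constant c cancels, so only the ratio of work matters.
    """
    work = lambda n: n * math.log2(n)
    return t_ref * work(n_new) / work(n_ref)

print(predicted_runtime(1_000_000))  # → 2000.0
```

The ratio comes out exactly 2000 here because log₂ 1,000,000 / log₂ 1,000 = 6/3 = 2, independent of the logarithm's base.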