Moore’s Law gets all the press. It’s easy to present even to non-technical readers, and it’s most often expressed as something like, "computers double in speed every year." That’s a bastardization of the original observation, which actually states that the transistor count of integrated circuits tends to double roughly every two years (often quoted as eighteen months). Still, the popular version succinctly captures how fast computers have gotten in so short a time.
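To see what that kind of doubling compounds to, here is a quick back-of-the-envelope sketch (the ten-year horizon and the two doubling periods are illustrative, not from the original observation):

```python
def growth_factor(years, doubling_period_years):
    """Total growth multiplier after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Doubling every two years over a decade:
print(round(growth_factor(10, 2)))    # -> 32 (a 32x increase)

# The popular "every eighteen months" version compounds much faster:
print(round(growth_factor(10, 1.5)))  # -> 102 (roughly a 100x increase)
```

The gap between the two figures shows why the exact doubling period matters so much when the growth is exponential.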
But integrated circuit density isn’t the only computing technology that has shown extremely rapid progress over the past thirty years. Consider magnetic storage. Modern hard drives are precisely manufactured miracles, the product of billions of dollars and decades of research into magnetism and quantum mechanics, squeezing ludicrously large amounts of data into ludicrously tiny spaces. A hard drive with about three terabytes of capacity can be had for less than $150 today; a PC equipped with two or three of them would have more on-board storage than most large enterprises had in aggregate even a decade ago.
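Working the price and capacity figures above into a unit cost (these are the illustrative numbers from the text, not a market survey) gives a sense of just how cheap magnetic storage has become:

```python
# Back-of-the-envelope unit cost from the figures in the paragraph above.
price_usd = 150                    # approximate price of a ~3 TB drive
capacity_tb = 3
capacity_gb = capacity_tb * 1000   # decimal terabytes, as drive makers count them

print(f"${price_usd / capacity_tb:.0f} per terabyte")  # -> $50 per terabyte
print(f"${price_usd / capacity_gb:.2f} per gigabyte")  # -> $0.05 per gigabyte
```

At five cents per gigabyte, the cost of storing any given file is effectively a rounding error.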
That kind of inexpensive capacity has revolutionized the way people keep and use data, both at home and at work. From complex storage- and compute-intensive tasks like upstream oil and gas processing all the way down to editing a home vacation video, the ability to store and manipulate ever larger volumes of data drives real innovation.