Moore’s Law, which predicts computer power will double roughly every 18 months, actually fares worse as a predictor of technological advance than the little-known Wright’s Law, which states that cost decreases as a function of cumulative production. The analysis was completed by the Santa Fe Institute, which compared “the performance of six technology-forecasting models with constant-dollar historical cost data for 62 different technologies—what the authors call the largest database of such information ever compiled.” The data set naturally includes statistics on computer hardware, but also on energy and chemical products.
What’s the Big Idea?
As a prediction, Moore’s Law is based specifically on the number of transistors that can be fit onto a computer chip and, to be sure, it has generally held true. But researchers discovered that the production of low-tech goods over time functions very similarly to that of high-tech goods like silicon computer chips. “Ironically, Moore’s Law performed particularly poorly in predicting transistor prices; the rapid increase in chip density and soaring numbers of chips manufactured created a ‘superexponential’ increase in cumulative production that Wright’s Law accommodated better than Moore’s.”
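To make the contrast concrete, Wright’s Law is usually written as a power law: unit cost falls as cumulative production grows, C(x) = C₀·x^(−b). The sketch below illustrates the idea in Python; the initial cost, learning exponent, and production volumes are hypothetical values chosen for illustration, not figures from the Santa Fe Institute study.

```python
import math

def wright_cost(cumulative_units, first_unit_cost, progress_exponent):
    """Unit cost after producing `cumulative_units`, per Wright's Law:
    C(x) = C0 * x**(-b)."""
    return first_unit_cost * cumulative_units ** (-progress_exponent)

# A 20% cost reduction per doubling of cumulative output corresponds to
# b = -log2(0.80) ~= 0.322 (a common way the "learning rate" is quoted).
b = -math.log2(0.80)

for units in (1, 2, 4, 8, 16):
    print(units, round(wright_cost(units, 100.0, b), 2))
# Each doubling of cumulative production cuts the unit cost by 20%:
# 100.0 -> 80.0 -> 64.0 -> 51.2 -> 40.96
```

Note that time never appears in the formula: unlike Moore’s Law, which ties progress to the calendar, Wright’s Law ties it to how much has actually been built, which is why a “superexponential” surge in production can outrun a purely time-based forecast.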