As the number of transistors on a single microchip continues to increase, and the power of computing along with it, the energy needed to run the next generation of computers may be too demanding to achieve. According to a paper presented this summer at the International Symposium on Computer Architecture, as more transistors are placed on a single chip, more of them will have to power down to avoid overheating. Overheating first produces incorrect computing results; if it continues, the chip can fuse and become useless.
What’s the Big Idea?
The energy problems now facing technological and computer advancement cast doubt on a famous prediction made in the 1970s by Intel co-founder Gordon Moore: that the number of transistors that could be placed on a single microchip would double every two years. The trend still holds, but if the power needed to run all those transistors isn't available, the exponential rate of computer advancement people have taken for granted ever since Moore made his prediction may slow considerably for the foreseeable future.
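To get a feel for what doubling every two years implies, here is a small illustrative sketch (the starting transistor count is hypothetical, not a figure from the article):

```python
def transistors(n0: int, years: int) -> int:
    """Projected transistor count after `years`, assuming a doubling
    every two years (Moore's prediction)."""
    return n0 * 2 ** (years // 2)

# Starting from a hypothetical chip with 1 billion transistors:
# after 10 years there have been five doublings, i.e. a 32x increase.
print(transistors(1_000_000_000, 10))  # prints 32000000000
```

The exponential curve is the whole point: each doubling demands roughly as much additional capacity as all previous generations combined, which is why the power budget becomes the limiting factor.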