Back in 1965, Gordon Moore’s “10-year prediction that chip makers would squeeze roughly twice as many transistors into the same area of silicon every year” became one of the most well-known laws about microprocessor speed and transistor counts per circuit (Clark). Today, the law has stretched to a two to two-and-a-half-year period, in which chips are expected to hold twice as many transistors as they did two or so years earlier. Steady technological innovation has allowed microprocessors to become more efficient, but recent news suggests that this pace is faltering. Christopher Mims recently wrote an article about “the transforming of specific software tasks into physical silicon chips,” a practice that is taking over the “specialized computing” world (Mims). This new form of computing affects Moore’s Law because, as this specialized software becomes more relevant, central processing units are reaching their physical limits and struggling to keep the pace that Moore’s Law describes.
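As a rough sketch of the arithmetic (my own illustration, not something from the article), the doubling cadence can be written as N(t) = N0 · 2^(t/T), where T is the doubling period. The starting count and the periods below are assumed values chosen only to show how much the slower cadence matters:

```python
# Illustrative sketch of Moore's Law as an exponential doubling curve.
# The starting count n0 and the doubling periods are assumptions for
# illustration, not figures taken from the article.

def transistors(years, doubling_period, n0=2_300):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# Moore's original 1965 cadence: doubling every year.
yearly = transistors(10, 1)      # n0 * 2^10 after a decade
# The modern cadence: doubling roughly every 2.5 years.
slower = transistors(10, 2.5)    # n0 * 2^4 after a decade

print(round(yearly))  # prints 2355200
print(round(slower))  # prints 36800
```

Over the same ten years, the one-year cadence multiplies the count by 1,024 while the two-and-a-half-year cadence multiplies it by only 16, which is why the slowdown is so consequential.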
This article, “How Chip Designers Are Breaking Moore’s Law,” connects well with our class, mostly because we are witnessing the integration of a new form of computing. Specialized computing is taking over all sorts of important applications, drawing work away from CPUs and from the pace of Moore’s Law. Applications such as “Artificial intelligence, image recognition, self-driving cars, virtual reality, bitcoin,” and many more are all built on software that CPUs can’t process at the same speed (Mims). The end of Moore’s Law was always inevitable, because a computer has to have a limit to its speed, and considering how fast computers are today, that end is near. As Daniel Reed, chair of computational science and bioinformatics at the University of Iowa, stated, “It’s not like Moore’s Law is going to hit a brick wall – it’s going to kind of sputter to an end” (Mims).
I think that this article speaks volumes about how far civilization has come in terms of technological advancement. The development of the central processing unit has inspired many to create processors that are remarkably fast even as the hardware shrinks in size. However, a new era of graphics and specialized computing is here, and I believe it is our future.
Overall, this law was never based on a specific scientific hypothesis; it was more of a prediction about the relative price per transistor, as well as the increasing number of components per chip. Personally, now that “specialized computing” software is incorporated into almost all technology, the price and component count of a CPU matter less because microprocessors have become about as small and fast as they can get. I think it’s silly that silicon-circuit designers (CPU designers) are trying to change the architecture to maintain the pace of Moore’s Law. When AI workloads switched from CPUs to graphics processors, their speed increased by up to 100 times (Mims). The circuit designers can keep changing the dimensions of silicon, but the demand for the new specialized computing is here.
Mims, Christopher. “How Chip Designers Are Breaking Moore’s Law.” The Wall Street Journal, https://www.wsj.com/articles/how-chip-designers-are-breaking-moores-law-1489924804