Thursday, March 23, 2017

Intel on the outside: The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel

Artificial intelligence, or “AI” for short, has been popping up in headlines over the past few years. Recent advances in AI have opened the path to possibilities in a wide variety of fields: government, finance and medicine are just a few of the areas that can expect to see this technology implemented in the near future.
This article discusses, in part, the implications of developments in AI and machine learning; however, there is a caveat associated with the application of these new technologies--Moore’s Law. Moore’s Law states that the number of transistors on a chip, and with it the processing power available to us, doubles roughly every two years. That trend was never going to hold forever, and some believe the end of it, unfortunately, has arrived. Conventional CPUs are starting to come up short, so to speak. AI, machine learning, cloud computing and similar up-and-coming technologies will require a massive amount of processing power to fully tap into what they have to offer us.
Nvidia, an American technology firm, manufactures graphics processing units (GPUs) that are most notably used for running video games. These GPUs, however, may offer a solution to our Moore’s Law problem. GPUs pack an impressive amount of parallel processing capability compared with conventional CPUs (central processing units): “Nvidia’s latest processors boast 3,584 cores; Intel’s server CPUs have a maximum of 28.” (The Economist) That’s right: Nvidia is taking on one of the tech giants of the world--Intel. Intel has been a market leader in the production of CPUs for quite some time now (since the 1970s), but innovations made in the video game industry with GPUs, of all places, may threaten its long-held prominent position in the tech world. Chances are that tech giants with more of a stake in software development, like Google or Apple, will begin looking towards GPUs as AI software and applications are developed.
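To make the parallelism point concrete, here is a minimal sketch in CUDA C++ (my own illustration, not taken from the article; the array size and launch configuration are arbitrary assumptions) of how a simple element-wise operation--the kind of arithmetic neural networks perform constantly--gets spread across thousands of GPU threads instead of being looped over on a handful of CPU cores.

#include <cuda_runtime.h>
#include <cstdio>

// Each GPU thread adds exactly one pair of elements; with thousands of cores,
// large chunks of the array are processed simultaneously.
__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;            // one million elements (illustrative size)
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);     // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vector_add<<<blocks, threadsPerBlock>>>(a, b, c, n);   // launch thousands of threads at once
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);      // expect 3.0

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}

Training a neural network boils down to enormous amounts of exactly this kind of repetitive arithmetic (mostly matrix multiplications), which is why a chip with thousands of simple cores can beat one with a few dozen sophisticated cores on these workloads.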
As a personal prediction, I believe that the development and advancement of AI technologies may continue to follow the model set by the video game industry. PC gaming requires a great deal of sophisticated hardware and software to run the latest titles at their highest settings. Companies compete with one another to push the limits of next-gen gaming by pushing the limits of current hardware. It wouldn’t be surprising to see AI become the norm in new video game simulations, or to see GPUs shrink in size and grow in processing power to accommodate devices with virtual/augmented reality capabilities.
The world of computation is only beginning to take its first steps in applying these technologies, and the implications for their use in information technology are essentially limitless. CPUs aren’t going anywhere anytime soon, but for now we’ll have to hold off the end of Moore’s Law with our trusty GPUs until the next “big thing” comes along (quantum computing?).

Economist Article:
http://www.economist.com/news/business/21717430-success-nvidia-and-its-new-computing-chip-signals-rapid-change-it-architecture
