Thematic Research: AI chips (2019)

Published: 02 May 2019 Code: GDTMT-TR-S213

Moore's Law is Broken. It's Time for a New Generation of Silicon

Over five decades, Moore's law has held sway in the chip industry. Since the 1970s the performance capabilities of microprocessors and other silicon chips have roughly doubled every two years. That era is now over.

Transistors are now so tightly packed onto chips that they are barely a couple of atoms apart, and the power generated within a square millimeter of silicon now produces more heat than can be efficiently dissipated. As a result, the entire chip industry is changing direction.

Abandoning the single-minded, "double-everything" approach, chip designs are becoming more sophisticated and complex. We are entering a new era of microprocessor design which will lead to new chip architectures, new classes of chips, new business models, new alliances, new materials, and radical decisions about who does what in terms of design and manufacture.

This transition away from traditional approaches will bring turmoil to incumbents and opportunities to newcomers. While established leaders like Intel, Qualcomm, and NXP have recognized the threat and are investing billions of dollars in R&D, they will encounter new competitors from among their biggest customers (Amazon, Google, IBM, and Facebook) and from well-funded start-ups like Cambricon.

Inside

  • Players
  • Technology briefing
  • Trends
  • Industry analysis
  • Value chain
  • Companies section
  • Sector scorecard
  • Glossary

In this report, we analyze the changes that are already underway, and the companies that are likely to dominate this rapidly-evolving market. We take a five-year view, and anticipate much turmoil breaking out in the $500bn chip sector within two to three years.

Winners

Below we identify the winners in each of the chip categories that will benefit from a boom in the use of artificial intelligence (AI):

  • AI platforms: Amazon, Baidu, Google, Microsoft.
  • Machine learning chips: Nvidia, Intel, Apple, Google, Cambricon, Xilinx.
  • Neuromorphic computing systems: IBM, Intel.
  • Microcontrollers: Texas Instruments (TI), NXP, STMicroelectronics.
  • Micro-electromechanical systems (MEMS): Broadcom, STMicroelectronics, NXP, Bosch.
  • Non-volatile memory: Samsung, SK Hynix, Intel, Micron.
  • Quantum computing: IBM, Google, Microsoft.

Players

The $500bn chip industry is going through a phase of dramatic disruption, as the traditional approach to making processors faster and more powerful can no longer deliver the performance improvements it used to. Rather than simply doubling the power of microprocessors, manufacturers have to find ways to make them smarter, so the emphasis in chip design has shifted from a race to place more transistors onto a square millimeter of silicon to a focus on building microprocessors as systems made up of multiple components, each of which is designed to perform a specialized task. The graphic below lays out which companies are likely to dominate in the key AI chip technologies.

Who are the Big Players in AI Chips?

| Category | Market leaders | Challengers |
|----------|----------------|-------------|
| AI platforms | ARM, Intel, Amazon, Baidu, Qualcomm | Google, Microsoft, Facebook, Tesla, Wave, Alibaba, IBM, SAP, GE, AMD, Oracle, HPE, Tencent, Cerebras |
| Machine learning chips | Nvidia, Intel, IBM, Apple, Horizon Robotics, Bitmain, Qualcomm | Google, Microsoft, ARM, Xilinx, Alibaba, Tenstorrent, Samsung, Baidu, Huawei, KnuEdge, Cambricon, Graphcore, Amazon, Groq, Facebook |
| Neuromorphic computing | IBM, HPE | Intel, KnuEdge, Graphcore, Sony |
| Microcontrollers | TI, NXP, Fujitsu | STMicro, Renesas, Bosch, Continental, Microchip, Infineon |
| MEMS | STMicro, NXP, TI, Bosch | Analog Devices, Knowles, Sony, Infineon, Velodyne, TDK, Omron, ASM, Quanergy |
| Non-volatile memory | Intel, Micron | SK Hynix, Samsung, Crossbar, Nantero, TSMC, Global Foundries |
| Quantum computing | IBM, Google, Microsoft | D-Wave, Intel, Alibaba, HP, 1QBit, Lockheed Martin |

AI Platforms

By 2025 the current set of internet giants will be even more powerful as they establish themselves as the gatekeepers to the world's richest data sets (notably in medicine, transport, and the industrial internet of things (IIoT)), armed with the most powerful proprietary AI systems which they will have largely designed and built themselves. Amazon, Google, Microsoft, Alibaba, and Baidu are already coming to market with cloud-based software as a service (SaaS) platforms that all-comers can tap into on a pay-as-you-use basis to rent AI resources and download software development kits (SDKs). The availability of SaaS-based AI platforms will spur the adoption of AI, as it makes it possible for even very small organizations to use AI without having to make an enormous investment in computing resources.

Machine Learning Chips

Over the past three years, there has been an explosion in the number of specialist AI chips, mainly accelerators for legacy systems, many of them developed by start-ups such as Cambricon. These start-ups threaten the old-guard graphics processing unit (GPU) and central processing unit (CPU) suppliers such as Nvidia, Intel, and AMD, as do internet giants like Google and Amazon, which are increasingly designing their own AI chips.

Neuromorphic Computers

By 2025 a new class of microprocessor, in the form of neuromorphic or neuro-synaptic chips, will be on the market to help handle the deluge of data and power deep learning applications that make use of neural networks. IBM and Intel are the leaders with the highest profile at present, but UK start-up Graphcore and Wave Computing in the US, among others, will be important challengers.

Microcontrollers and MEMS

As the internet of things (IoT) expands, the proportion of data processing that takes place at the edge of the network will increase. As connected devices collect more data, and perform a wider range of functions, they will take on an increasing share of the processing, rather than shipping large quantities of data over the network to a centralized application. The dominant players are TI, NXP, Broadcom, STMicroelectronics, Bosch, and Analog Devices.

Non-volatile Memory

Among the memory makers, the main challenge over the next few years will be to design much more non-volatile memory on-chip for machine learning at the edge, and into AI server chips at the data centers. SK Hynix and Samsung will increasingly be challenged by Intel and Micron's 3D XPoint hybrid technology, and carbon nanotube start-up Nantero.

Quantum Computing

Quantum computing represents a completely new approach to information processing which, though still in its early days, will have a profound impact on the way computers are designed. IBM, Google, and Microsoft lead the pack as the first players to build viable, albeit still relatively small-scale, quantum computers.

Technology Briefing

The Demise of Moore's Law

For over four decades the processor industry has followed Moore's law, which predicted that the density of transistors in microchips would double approximately every two years. Since the 1970s a race has been on to develop chips with ever higher densities of transistors, but Moore's law could not continue to apply forever. By 2010 it was clear that simply doubling the density of transistors on microchips would no longer make processors faster, for two reasons. The first is physical proximity: once transistors are separated by only a few atoms, they tend to interfere with one another. The second is power density: each transistor consumes a small amount of power as the processor functions, so as more and more transistors are crammed into a smaller and smaller space, the heat generated rises accordingly, making the processor increasingly difficult to cool.
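
The doubling rule itself is simple compound growth. As an illustrative sketch (the function name and figures below are ours, not from the report), the projected density after a given number of years can be computed as:

```python
def transistor_density(base_density: float, years: float,
                       doubling_period: float = 2.0) -> float:
    """Project transistor density under Moore's-law-style doubling:
    density grows by a factor of 2 every `doubling_period` years."""
    return base_density * 2 ** (years / doubling_period)

# Over 40 years of two-year doublings, density grows by 2**20,
# roughly a million-fold.
growth_factor = transistor_density(1.0, 40) / transistor_density(1.0, 0)
```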

The Era of Non-binary Computing

The chip industry has responded by focusing less on making processors faster (although this remains important) and more on making processors smarter.

This has led to the emergence of coprocessors, which can perform tasks that are offloaded to them by the central processor. Increasingly, manufacturers are merging these coprocessors with CPUs, such that a modern CPU is better described as a collection of subsystems, each specialized for specific tasks. More recently, research has begun to look beyond the binary computing model of processor design and is now exploring radically different approaches to computing like neuromorphic processors, and quantum computing.

The Concept of AI Computing

We are entering a world in which the ability to process large quantities of data, draw inferences from it, and act quickly in response to the insight it gives will determine the survival of many organizations.

Over the next five years, we will see real progress in developing computers that operate a lot more like the human brain than they do now. Rather than depending on a collection of binary transistors, neuromorphic computers allow for a rich combination of different states, and complex linkages between neurons, mimicking the way the human brain stores and processes information. Another emerging technology is quantum computing, which exploits the laws of quantum physics to define a processor architecture in which individual units of processing (qubits) can exist in a spectrum of states between 0 and 1.
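
The qubit's "spectrum of states" can be made concrete with a little amplitude arithmetic. A minimal sketch (illustrative only, using standard quantum-mechanics conventions rather than anything from the report): a qubit is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math

# An equal superposition: |alpha|^2 + |beta|^2 must equal 1.
alpha = complex(1 / math.sqrt(2), 0)   # amplitude for state 0
beta = complex(0, 1 / math.sqrt(2))    # amplitude for state 1

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1
assert math.isclose(p0 + p1, 1.0)  # the state is normalized
```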

Billions of dollars of research investment will flow into research projects looking at these new models and, while real results may be as much as a decade away, they will eventually transform the way we think about data, the software we write to process it, and the algorithms we apply to it.

Deep Learning Eats the World

Deep learning software makes use of neural networks, which enable it to train itself and make inferences on the basis of data rather than simply following a predefined set of rules. This technology has already arrived and it will appear in every industry and in almost every application from data centers to edge computing devices by 2025.
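
The distinction between learned inference and hand-written rules can be sketched in a few lines. Below is a minimal, illustrative forward pass through a tiny neural network; the weights are toy values we invented, whereas in a real deep learning system they would be learned from training data:

```python
import math

def forward(x, w1, b1, w2, b2):
    """One forward pass: inputs -> hidden ReLU layer -> sigmoid output."""
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    z = sum(w * h for w, h in zip(w2, hidden)) + b2
    return 1.0 / (1.0 + math.exp(-z))  # score in (0, 1)

# Toy weights stand in for values a training process would learn.
score = forward([1.0, 2.0],
                w1=[[0.5, -0.2], [0.3, 0.8]], b1=[0.0, 0.1],
                w2=[1.0, -1.0], b2=0.0)
```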

The Zettabyte Challenge

According to Cisco's Visual Networking Index, annual global internet protocol (IP) traffic will reach 4.8 zettabytes by 2022, up from 1.5 zettabytes in 2017. This surge in activity is being driven by significant growth in the number of connected devices that generate, and process data - from smart vehicles, to modern medical technology, to the IoT. Cisco estimates that the number of devices connected to IP networks will be more than three times the global population by 2022. This dramatic growth in data volumes means that not only do processors have to advance in order to be capable of processing all of the data that is being generated, but there also need to be improvements in storage systems, networks, and approaches to data security and governance.
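
Cisco's figures imply a steep compound growth rate. As a quick arithmetic check (our calculation, based on the 1.5 and 4.8 zettabyte figures quoted above):

```python
traffic_2017_zb = 1.5   # global annual IP traffic, 2017
traffic_2022_zb = 4.8   # Cisco's projection for 2022
years = 2022 - 2017

# Compound annual growth rate implied by the two data points
cagr = (traffic_2022_zb / traffic_2017_zb) ** (1 / years) - 1
print(f"Implied traffic growth: {cagr:.1%} per year")
```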

Crunch Time?

As connected sensors become increasingly pervasive, whether installed in cars and trucks, attached to goods and packages, or built into the infrastructure around us, the volume of data they generate will demand pervasive high-speed communications technology to carry it to hyperscale data centers, which in turn will have to find less energy-intensive ways to process the deluge. Without accompanying advances in data storage and transmission, many of these advanced processors will spend most of their time waiting for data to arrive.

Newer, faster networking technologies will certainly help, but even the much-vaunted and overhyped arrival of 5G will not prevent networks becoming saturated unless a better way is found to process, aggregate, and transmit the tidal wave of data flowing across the internet. Crunch time is approaching, and wholly new ways to process, analyze, and draw inferences from data are needed.

Across the world, global giants like IBM, Google, Intel, Qualcomm, Samsung, and HPE, start-ups such as Graphcore and Cambricon, and countless universities are showing signs of making crucial breakthroughs by the early 2020s. All of these organizations are investing deeply in neuromorphic and quantum computing research, as well as in ultra-light, highly conductive materials, most famously graphene, from which to make electronic circuits.

Learning from the Brain

The human brain is an exascale computing system that runs on less than 20 watts of power and fits in a volume of about two liters. No man-made computing platform in existence can deliver that computing power at the same size and energy cost. Despite its many flaws, the brain is a peerless adaptive inference, or Bayesian, engine. The goal of neuromorphic computing research is to develop computers that combine a digital 'left brain' with a neuromorphic 'right brain'. The former, which handles the largely repetitive, training part of deep learning, will still be based on the Von Neumann architecture (see Glossary section for definition) that has governed processor design for the past few decades; the latter, which handles the more creative, pattern-recognizing, inferencing part of the process, will be neuromorphic, that is, based on the neuro-synaptic operation of the brain. It will be clockless, event-driven, and massively parallel.
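
The "clockless, event-driven" behavior described above can be illustrated with the classic leaky integrate-and-fire neuron, a standard textbook model rather than anything specific to the report (the parameter values below are arbitrary): the neuron accumulates input, leaks charge over time, and emits a spike event only when its potential crosses a threshold.

```python
def lif_neuron(input_currents, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: return the time steps at which the
    neuron spikes. Output is event-driven -- only spike times are
    reported, not a value for every clock tick."""
    potential, spike_times = 0.0, []
    for t, current in enumerate(input_currents):
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:
            spike_times.append(t)  # emit a spike event
            potential = 0.0        # reset after firing
    return spike_times

spikes = lif_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.6, 0.6])
```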

Step Change Ahead

However fast 5G can shift data packets around the world, data also needs to move with near-zero latency within and between the digital devices themselves. At the same time, despite moves towards submersible data centers and liquid cooling, without a dramatic step change the global IT sector is still expected to need more than 25% of the world's electricity generation capacity by 2025.

The next era of processor design is setting out to deliver that step change.
