When it comes to cloud AI chips, two names are unavoidable: NVIDIA and Intel. On August 16, NVIDIA, one of the two leading companies in cloud AI chips, released its earnings for the second quarter of fiscal 2020. According to the report, for the quarter ended July 28 of this year, NVIDIA's total revenue was $2.579 billion, with the GPU business driving growth and contributing $2.1 billion in revenue. "We have achieved continuous growth across our platforms," said NVIDIA CEO Jensen Huang. "Real-time ray tracing is one of the most important innovations in graphics computing in a decade." He also said that laptops capable of handling complex games and conversational AI models capable of real-time chat are driving demand.
One of the key factors behind the rise of artificial intelligence is the improvement of cloud computing capabilities, which is mainly driven by the enhancement and upgrade of cloud AI chips. Cloud AI chips are computing chips that focus on AI workloads and are typically deployed in cloud or data center applications.
According to ABI Research's first report, "Cloud AI Chips: Market Outlook and Supplier Positioning," the market for AI chips used in cloud AI inference and training is expected to grow from $4.2 billion in 2019 to $10 billion in 2024.
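As a sanity check on these figures, the implied compound annual growth rate over the five-year span can be computed directly. A minimal sketch; the resulting rate is derived from the two endpoints above, not stated in the report:

```python
# Implied CAGR for the cloud AI chip market figures cited above:
# $4.2B in 2019 growing to $10B in 2024 (a 5-year span).
start_value = 4.2   # billions USD, 2019
end_value = 10.0    # billions USD, 2024
years = 2024 - 2019

# CAGR = (end / start)^(1/years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 18.9% per year
```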
Compared with terminal AI chips, cloud AI chips usually offer higher computing power, consume more power, occupy a larger physical footprint, and are therefore relatively more expensive.
So far, the cloud AI market has been dominated by NVIDIA's GPUs and Intel's CPUs, but both now face challenges from companies such as Cambricon, Graphcore, Habana Labs, and Qualcomm.
Today's cloud AI chip market can be roughly divided into three parts:
1. Public cloud, hosted by cloud service providers, such as: AWS, Microsoft Azure, Google, Alibaba Cloud, Baidu Cloud and Tencent Cloud.
2. Private cloud + hybrid cloud, mainly used in enterprise data centers, built from products by vendors such as VMware, Rackspace, NetApp, HPE, and Dell.
3. Telecom Cloud, a cloud infrastructure deployed by telecommunications companies for core networks, IT and edge computing workloads, is an emerging market.
The cloud inference AI chip market presents a different picture. No single player dominates it, partly because inference tasks vary in nature and differ across verticals. ABI Research expects ASICs to achieve strong growth in this segment from 2020 onward.
One example is Google's TPU, which is used primarily for cloud AI training and inference tasks and is seen as a powerful challenger to CPU and GPU technologies. As the report states, Google's success with the TPU provides a blueprint for other cloud service providers to develop their own AI-specific chips. According to ABI Research, cloud service providers will account for 15% to 18% of the market by 2024.
Terminal AI chip
As AI inference tasks shift to the edge, terminal AI chips become increasingly important.
A terminal AI chip is a computing chip focused on AI workloads, used mostly for AI inference tasks and typically deployed in terminal environments such as end devices, gateways, and on-premises servers. In some cases, terminal AI chips can also support AI training, especially the training of deep learning models.
Terminals are the smart devices that perform edge computing: mobile phones, security cameras, automobiles, smart home devices, and various IoT devices. They are numerous, and their requirements vary widely.
According to ABI Research's second report, "Terminal AI Chips: Technology Outlook and Use Cases," the terminal AI chip market is expected to grow to $71 billion by 2024, a compound annual growth rate of 31% between 2019 and 2024.
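Working backwards from the $71 billion 2024 figure and the 31% growth rate gives the implied size of the market at the start of the forecast period. An illustrative calculation; the resulting 2019 base is derived here, not quoted from the report:

```python
# Implied 2019 market size, given the 2024 figure and CAGR cited above.
end_value = 71.0  # billions USD, 2024
cagr = 0.31       # 31% compound annual growth rate
years = 2024 - 2019

# start = end / (1 + CAGR)^years
start_value = end_value / (1 + cagr) ** years
print(f"Implied 2019 base: ${start_value:.1f}B")  # roughly $18.4B
```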
This strong growth is driven mainly by the migration of AI inference tasks to the terminal, especially in smartphones, smart homes, autonomous driving, wearable devices, and robotics.
Terminal AI inference currently has three niche markets:
Robots. Because robots rely on many types of neural networks, they often require heterogeneous computing architectures: SLAM for navigation, conversational AI for human-machine interfaces, machine vision for object detection, and so on, using CPUs, GPUs, and ASICs to varying degrees. NVIDIA, Intel, and Qualcomm are particularly competitive in this area.
Intelligent industrial applications, including manufacturing, smart buildings, and the oil and gas sector. FPGA vendors perform well here, thanks to the programmability of FPGAs and their flexibility and adaptability when interfacing with legacy equipment.
The "far terminal": ultra-low-power AI chips embedded in sensors and other small end nodes on wide-area networks. Given the focus on ultra-low power consumption, this market is dominated by FPGA companies, RISC-V designs, and ASIC vendors.
A niche market strategy is one in which a company chooses a small product or service segment, concentrates on entering it and becoming its leader, expands from the local market to the national and then global market, and along the way builds barriers that gradually form a lasting competitive advantage.
In addition, the report examines the current development of open-source chips. Startups building on RISC-V have begun to develop AI-specific chips with highly parallel computing power. Because the whole industry can participate and contribute, open-source AI chips should better match market requirements and expectations, greatly reducing errors and costs in product maintenance and upgrades.
This article is from ABI Research and is reproduced as a reprint.