NVIDIA maintains tight grip on market for AI processors in cloud and data centers
Omdia's AI Processors for Cloud and Data Center Forecast Report has named NVIDIA the clear frontrunner in the market for artificial intelligence (AI) processors used in cloud and data centers.
According to the report, NVIDIA captured an 80.6% share of global revenue for cloud and data center AI processors - a total of $3.2 billion in 2020, up from $1.8 billion in 2019.
The report attributes NVIDIA's dominance to its 'supremacy in the market for GPU-derived chips' - the type of AI processor commonly deployed in servers, workstations, and expansion cards across cloud and data center equipment.
Jonathan Cassell, Omdia's principal analyst for advanced computing, says NVIDIA employs key strategies to maintain its growth.
"With their capability to accelerate deep-learning applications, GPU-based semiconductors became the first type of AI processor widely employed for AI acceleration. And as the leading supplier of GPU-derived chips, NVIDIA has established itself and bolstered its position as the AI processor market leader for the key cloud and data center market," explains Cassell.
NVIDIA's dominance is also fuelling intense competition as suppliers battle to claim their share of the $4 billion cloud and data center AI processor market. Omdia forecasts that total market revenue could reach $37.6 billion by 2026.
"Despite the onslaught of new competitors and new types of chips, NVIDIA's GPU-based devices have remained the default choice for cloud hyperscalers and on-premises data centers, partly because of their familiarity to users," says Cassell.
Cassell points to the NVIDIA Compute Unified Device Architecture (CUDA) Toolkit, which is used extensively in the AI software development community. That widespread adoption, in turn, boosts demand for NVIDIA's associated products, such as its GPU chips.
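To give a sense of why that developer familiarity matters, the minimal sketch below (not taken from the report) shows the kind of hand-written CUDA kernel - an element-wise vector addition - that the toolkit compiles for NVIDIA GPUs; deep-learning frameworks ultimately dispatch similar data-parallel operations to the same hardware.

```cuda
#include <cstdio>

// Illustrative kernel: element-wise vector addition, a simple example of the
// data-parallel work that GPUs accelerate in deep-learning workloads.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the sketch short; production code typically
    // manages host and device buffers explicitly.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);  // launch the kernel on the GPU
    cudaDeviceSynchronize();                  // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Because a large body of AI tooling is written against CUDA in this way, moving to a non-NVIDIA accelerator typically means porting or replacing such code, which is part of the lock-in effect Cassell describes.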
But competition will only intensify, particularly as buyers look towards non-NVIDIA GPU chips and other types of AI processors in the future.
According to Omdia's research, the other major players in the cloud and data center AI processor market include Xilinx, Google, Intel, and AMD.
Xilinx ranked second behind NVIDIA; its field-programmable gate array (FPGA) products are commonly used for AI inferencing in cloud and data center servers.
Google ranked third. Its Tensor Processing Unit (TPU), an AI application-specific integrated circuit (ASIC), is employed extensively in its own hyperscale cloud operations.
Intel ranked fourth. Its Habana proprietary-core AI application-specific standard products (ASSPs) and its FPGA products are designed for cloud and data center AI servers.
AMD ranked fifth, with its GPU-derived AI ASSPs for cloud and data center servers.