NVIDIA is, by all accounts, the de facto leader in AI chipsets today. Beyond manufacturing the chipsets themselves, NVIDIA offers extra value to its customers by creating derivative products and solutions built on them. This has helped the company not only understand customer pain points, but also become Wall Street's darling by creating additional value.
Tractica has been following the AI chipset market for some time. According to our estimates, the market for AI chipsets will be worth roughly $71 billion by 2025. Tractica’s Deep Learning Chipset report segments the market in multiple ways, including by architecture, power, and performance. Another way of segmenting the market is by enterprise versus edge and training versus inference.
Stretching Its Reach
As of 2019, NVIDIA is well on its way to selling into the majority of these market segments. Its data center business reached $2.9 billion in FY 2019 (ended January 2019) and continues to grow. NVIDIA's V100 is the standard for training in production today and is available to OEMs. AMD is the only other company shipping a competing part, but in Tractica's discussions with users, nobody has mentioned it as a viable option. NVIDIA has also developed a chipset for inference, the T4, which was adopted into 57 separate server designs by the world's leading OEMs within 60 days of its introduction, a record in itself.
The company is not far behind in the automotive market. Its automotive business generated $633 million in FY 2019. Although this is a small number compared with the data center market, not many automotive OEMs are shipping AI-based products just yet. When L4 and L5 cars start appearing on the market, this number will go up drastically. Mobileye, NVIDIA's main competitor here, is well ahead of the remaining players in terms of automotive AI products shipped. Still, NVIDIA's Xavier platform, which delivers 30 TOPS, is the most advanced platform on the market.
These three chipsets (the V100, T4, and Xavier) fall into the 250 W, 75 W, and 30 W power ranges, respectively, and offer mid to high performance. Together, they cover the majority of the AI chipset market and its segments, as shown below.
| Market | NVIDIA Product | Segment Characteristics |
|---|---|---|
| Enterprise Training | V100 | High power, high performance, training |
| Enterprise Inference | T4 | Mid power, high performance, inference |
| Edge – Mid Power | Xavier | Mid power, mid performance, inference |
| Edge – Low Power | Jetson Nano | Low power, low performance, inference |
However, NVIDIA has no intention of stopping there. With the introduction of the Jetson Nano, the company now has a product in the 5 W edge segment. Priced as low as $99, the Jetson Nano is targeted at hobbyists and university students learning AI, opening another potential market for NVIDIA products in the low power category.
This leaves only one category that NVIDIA is not pursuing: the sub-5 W segment targeted at mobile phones and tablets.
Making All the Right Moves
The real strength of NVIDIA's AI product line is that the software stack is uniform across all of its products. If an application runs on the Jetson Nano, it can run just as easily on Xavier or the T4 (only much faster). The brilliance of this strategy is that once a developer is trained on the Nano, they can build applications for cloud, on-premises, edge, or client devices. And once university students are trained, they can carry this knowledge to their employers and recommend GPUs for production products.
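The article does not include code, but the portability claim can be sketched with a short, hypothetical PyTorch snippet: the same application code runs unchanged on any CUDA-capable device, whether a Jetson Nano, Xavier, T4, or V100 (the model and input sizes here are purely illustrative assumptions, not a real workload).

```python
import torch
import torch.nn as nn

# Use whatever CUDA device is present: Jetson Nano, Xavier, T4, or V100.
# The application code is identical across devices; only throughput differs.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small illustrative classifier (not a real production model).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
).to(device)

# One batch of dummy 32x32 RGB images, allocated on the same device.
batch = torch.randn(8, 3, 32, 32, device=device)
with torch.no_grad():
    logits = model(batch)

print(logits.shape)  # one 10-class score vector per image in the batch
```

Because the `device` selection is the only hardware-dependent line, a developer who learns this pattern on a $99 Nano can deploy the identical code to a T4 in a data center.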
AI chipsets have generated a lot of excitement, and a wide range of companies have jumped into the market: semiconductor vendors, cloud companies, hyperscalers, and startups. As of 2019, of all the companies in the race, NVIDIA is the only one that has been able to generate the billions of dollars the market has promised. We are still at the beginning of the AI revolution and the race is wide open, but NVIDIA is making all the right moves to become the AI chipset king of the future.