Investors have been keeping a close eye on Nvidia’s data center business. Nvidia’s graphics processors (GPUs) have become a dominant force in deep learning, a popular type of machine learning, and most of that computing takes place in the data center. Each quarter, Nvidia’s data center unit, though still relatively small compared to its gaming business, has been doubling or tripling its revenue from the previous year.
But a big chunk of its data center business is actually hidden in its gaming unit’s financials, according to one financial analyst.
In a report out Thursday, Susquehanna Financial Group analyst Christopher Rolland estimated that as much as 10% of Nvidia’s reported gaming business is being used for data center infrastructure.
This is because many people developing artificial intelligence software are buying gaming GPUs for their data centers. Gaming GPUs are cheaper and can sometimes run at higher speeds than server-grade chips. The giant data center operators — like Google, Microsoft and Facebook — are buying more of the server GPUs.
If Susquehanna’s estimates are accurate, this matters because Wall Street values Nvidia’s data center business much higher than its gaming business — a multiple of 20 for data center versus only 6 for gaming. Data center customers are willing to pay more, and there is bigger growth potential as deep learning systems spread across a variety of products and services. Nvidia has reported growing average selling prices in its gaming unit, but that is likely the result of these data center buyers, Rolland said.
In Nvidia’s most recent financial quarter, which ended in August, Nvidia reported gaming revenue of $1.19 billion, up 52% year-over-year, and data center revenue of $416 million, up 175% from the previous year. If Rolland’s estimate of up to 10% is correct, more than $100 million of that gaming revenue would actually belong to the data center business for the quarter.
“This [data center] portion of gaming revenue may be worth as much as $11 billion, versus just ~$3.5 billion if it were stand-alone gaming,” the Susquehanna report said.
With this reevaluation, Susquehanna is raising its price target on the stock from $140 to $155.
Susquehanna used Adam Geitgey, an independent consultant who helps companies build AI systems, as a case study in this trend. Geitgey prefers Nvidia’s Titan gaming GPUs when building out AI infrastructure, mainly because of cost: a data center GPU from Nvidia’s Tesla line can cost four times as much as a gaming card.
Another example is New York-based AI startup Clarifai, which maintains its own data center in New Jersey for specialized training of the AI models behind its image recognition tools. Clarifai uses Nvidia gaming GPUs because that hardware can typically be overclocked, or run at clock speeds higher than specified, whereas Nvidia doesn’t let its server GPUs run at higher speeds because of overheating concerns, said Clarifai founder and CEO Matt Zeiler. Server GPUs tend to be packed more tightly together than cards meant for a consumer desktop computer.
There are some drawbacks to developing AI software on gaming cards, Rolland pointed out. Gaming GPUs can only handle a certain amount of memory before they max out, and some extra software development is required.
But all of this might change. Nvidia recently announced its latest GPU architecture, called Volta, which is currently only available in a server product. The Volta-based server GPU, called the Tesla V100, includes 640 specialized compute cores designed to better run the mathematical operations used in deep learning networks. Soon, to get the best performance in deep learning operations, developers will likely need to buy Nvidia’s server hardware — at least until Nvidia comes out with a Volta-based GPU for the consumer gaming market.
Nvidia GPUs have exploded in the world of AI in recent years. As a result, the Santa Clara-based chipmaker’s stock is up nearly 170% in the past 12 months.
“If you look back a few years ago, Nvidia was just a gaming company,” Zeiler told Forbes. “They’ve completely shifted to machine learning.”
More Info: www.forbes.com