December 30, 2025
News has emerged that Nvidia has restructured its cloud computing team, merging the several-hundred-strong DGX Cloud unit into its engineering and operations division.
The team's primary focus has shifted from selling cloud computing services to external enterprise clients to supporting the AI model development of Nvidia's own engineers on the company's chips.
This reorganization was accompanied by staffing changes. Alexis Black Bjorlin, the former head of the cloud computing department who reported directly to CEO Jensen Huang, has assumed a new position within the company. The team is now under the leadership of Senior Vice President Dwight Diercks, who also reports to Huang.
Nvidia's foray into cloud computing began in March 2023. At that year's developer conference, Huang himself introduced the DGX Cloud service, presenting a roadmap for direct engagement with AI developers.
Initially, this service had a dual strategic purpose: firstly, to diversify revenue streams beyond chip sales, and secondly, to forge direct relationships with AI developers.
At the time, Nvidia was already aware of potential threats. Major cloud providers like Google, Microsoft, and Amazon were developing their own AI chips, such as AWS's Trainium and Google's TPU.
The competitive edge of DGX Cloud rested on performance. Nvidia asserted that the chips offered through its service would outperform those available from traditional cloud providers such as Amazon Web Services (AWS) in both configuration and raw capability.
On the surface, the business seemed promising. Nvidia had highlighted that renowned companies such as ServiceNow and SAP were among its initial clients.
However, sources familiar with the department's operations revealed that the DGX team consistently faced challenges in attracting a sufficient client base.
A more significant hurdle was technical support. Since DGX Cloud services were hosted in data centers operated by various cloud providers like AWS, Nvidia encountered substantial difficulties in resolving customer issues.
Nvidia's strategic realignment reflects a careful balancing act within its commercial ecosystem and a clear reordering of priorities.
A key consideration is avoiding conflict with its largest customers. An insider noted that Huang had been hesitant to significantly expand the DGX Cloud business, primarily to prevent alienating core customers like Amazon Web Services (AWS) — major buyers of Nvidia's chips.
This approach has produced an intriguing contradiction in Nvidia's investment strategy: even while holding DGX Cloud back, the company has been financially backing emerging GPU cloud providers such as CoreWeave and Lambda through various means.
These invested companies compete directly with DGX Cloud. The arrangement amounts to a diversified bet that tolerates self-competition, ensuring that market demand for Nvidia's GPUs remains robust no matter which provider prevails.
A recent statement from an Nvidia spokesperson set the tone for this restructuring: "We will continue to invest in DGX Cloud to provide world-class infrastructure for cutting-edge R&D... Our goal has always been to develop DGX Cloud as a pilot project."
This statement clearly redefines DGX Cloud's role: from an external commercial service to internal R&D support and a testing ground for Nvidia's ecosystem.
This strategic shift does not signify a reduction in Nvidia's investment in cloud computing; rather, it represents a fundamental change in direction. The company remains one of the largest lessees of servers powered by its own chips.
Nvidia has publicly announced plans to invest up to $26 billion over the next few years in leasing such servers. This substantial computing capacity will no longer primarily serve external DGX Cloud customers but will instead support Nvidia's internal AI model R&D efforts.
Despite facing competition, Nvidia's dominance in the AI chip market remains unchallenged. Its strategic focus is firmly on securing the core of AI computing power: chip design and ecosystem development.
Data indicates that in the first three quarters of 2025 alone, Nvidia participated in 50 AI-related venture capital investments, exceeding the total for the entire year of 2024.
Its investments span the entire industrial chain, from infrastructure and model developers to specific applications. Notably, in the AI cloud infrastructure sector, investments in companies like CoreWeave and Lambda have secured a stable market demand for its GPU products.
At the model layer, Nvidia has not only made substantial investments in industry giants like OpenAI and xAI but has also extensively funded enterprise-level model companies such as Mistral AI and Cohere.
Nvidia has also shown a keen interest in cutting-edge fields that may require vast amounts of computing power. For instance, it participated in the financing of nuclear fusion energy company Commonwealth Fusion and invested in Nscale, which is constructing data centers for OpenAI's "Stargate" project.
These diverse investments ultimately converge on a common objective: creating a future ecosystem that continuously demands powerful AI computing capabilities, thereby ensuring sustained demand for its GPUs.
References:
https://finance.sina.com.cn/world/2025-12-28/doc-inhehpya9373581.shtml
https://news.futunn.com/post/66520545/no-longer-challenging-aws-head-on-nvidia-restructures-its-cloud?futusource=news_stock_stockpagebignews&ns_stock_id=202597&src=6&lang=zh-cn&level=1&data_ticket=1766116194545042
https://www.ithome.com/0/890/490.htm