Nvidia keeps Buy rating as Citi highlights key investor topics on stock
Citi has maintained a Buy rating and a $150 price target on Nvidia (NASDAQ:NVDA), emphasizing key investor topics on the stock. Analysts predict another strong +40% year-over-year increase in cloud data center capex next year but expect the shares to remain range-bound until CES in January. They highlight gross margins as a key question, forecasting a trough of about 72% in the January quarter before margins stabilize in the mid-70% range longer term. The analysts argue that Nvidia's GPUs are preferable for enterprises adopting multi-cloud strategies because of their flexibility, while ASICs are limited to specific applications. They also note that data center operators prioritize total cost of ownership and return on investment, areas where Nvidia excels.
Citi maintained a Buy rating on Nvidia (NASDAQ:NVDA) and a $150 price target on the shares, while highlighting key investor topics on the stock.
Analysts led by Atif Malik said that while they are bullish on another strong +40% year-over-year increase in cloud data center capex next year, they expect the stock to remain range-bound through CES in January, ahead of a Blackwell-driven inflection in year-over-year sales and gross margins in the April quarter.
The analysts believe AI adoption remains in the third or fourth inning, with enterprise AI demand, driven by AI agents, taking off next.
The near-term trajectory of gross margins is a key investor topic, according to the analysts. Citi models a trough in the low 70s, at about 72%, in the January quarter, with long-term gross margins stabilizing in the mid-70% range once Blackwell fully ramps.
Custom application-specific integrated circuits, or ASICs, versus graphics processing units, or GPUs: The analysts think the reasons to choose one over the other have not changed in the last few years. An ASIC remains fixed-function, as it typically runs one application. With AI applications evolving swiftly, model sizes growing, and more applications becoming AI-enabled, the GPU market will continue to grow.
The analysts added that, from a cloud service provider (CSP) perspective, running cloud AI on ASICs is not viable. From an enterprise standpoint, building applications on an ASIC ties them to a specific cloud vendor, which enterprises generally want to avoid since they often take a multi-cloud approach. If an enterprise pursuing a multi-cloud strategy picks a given cloud and runs its applications on an ASIC, it will have to rewrite those applications when it moves to another cloud.
With Nvidia GPUs, enterprises write their applications once, as the code is transferable across clouds. Moreover, Nvidia's large installed base is a strong pull for developers who aim for the widest possible adoption of their applications, according to the analysts.
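To make the portability argument concrete, here is a minimal, purely illustrative sketch (not from the Citi note): a PyTorch training step written against CUDA runs unchanged on any cloud instance that exposes an Nvidia GPU, whereas code built against a vendor-specific ASIC stack would typically need to be rewritten when switching clouds.

```python
# Illustrative sketch only: the same PyTorch code runs unchanged on any cloud
# whose instances expose Nvidia GPUs, because the framework targets CUDA
# rather than a cloud-specific accelerator SDK.
import torch
import torch.nn as nn

# Use the Nvidia GPU if one is present; otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One training step on synthetic data; nothing here is tied to a specific cloud.
x = torch.randn(32, 1024, device=device)
y = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"device={device}, loss={loss.item():.4f}")
```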
GPU competition: Malik and his team also said that while performance metrics are important, they think data center operators are focused on total cost of ownership (TCO) and return on investment (ROI), both of which are functions of throughput, where Nvidia leads. Because Nvidia's hardware runs a variety of applications, including AI, data center operators rely on the company for hardware that can handle multiple workloads rather than buying accelerators limited to narrower use cases.
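As a back-of-the-envelope illustration of why throughput drives TCO and ROI, the sketch below amortizes hardware and power costs over delivered work; all figures are hypothetical placeholders, not Citi's or Nvidia's, but they show how a more expensive accelerator with higher throughput can still deliver a lower cost per unit of output.

```python
# Hypothetical TCO comparison; every number below is an illustrative placeholder.

def cost_per_million_tokens(capex_usd, power_kw, utilization, tokens_per_sec,
                            lifetime_years=4, power_cost_per_kwh=0.10):
    """Amortized cost (capex + power) per million tokens of delivered throughput."""
    seconds = lifetime_years * 365 * 24 * 3600 * utilization
    total_tokens = tokens_per_sec * seconds
    power_cost = power_kw * (seconds / 3600) * power_cost_per_kwh
    return (capex_usd + power_cost) / (total_tokens / 1e6)

# A pricier system with higher throughput can still win on cost per token.
gpu_system = cost_per_million_tokens(capex_usd=300_000, power_kw=10.0,
                                     utilization=0.7, tokens_per_sec=50_000)
fixed_asic = cost_per_million_tokens(capex_usd=150_000, power_kw=6.0,
                                     utilization=0.7, tokens_per_sec=15_000)
print(f"High-throughput GPU system: ${gpu_system:.4f} per 1M tokens")
print(f"Lower-throughput fixed ASIC: ${fixed_asic:.4f} per 1M tokens")
```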
Nvidia's products have evolved from chips to cards to systems and, finally, to racks. This approach allows the company to keep driving innovation because the individual pieces can be further optimized, the analysts noted.
On the Blackwell sales mix, the analysts expect a shift toward the GB200 rack-scale format (rather than the B100's 8-GPU format), given its TCO and ROI benefits.