All eyes are on Nvidia as the AI juggernaut prepares to release what could be the most consequential tech earnings of 2024.
Bloomberg reported that the chip giant’s stock could move about 8% in either direction following the results, potentially shifting its market value by nearly $300 billion — an amount exceeding the total worth of most S&P 500 companies.
The heightened uncertainty centers on Nvidia’s new Blackwell chip line. While the company projects billions in fourth-quarter revenue from the product, production delays have complicated supply forecasts.
Compounding the uncertainty, customers may delay purchases of current-generation Hopper chips while awaiting Blackwell’s release.
“Many of Nvidia’s customers, particularly cloud service providers and major enterprises, are prioritizing AI chip investments despite broader cost-cutting measures,” Dhanvin Sriram, founder of AI tool PromptVibes, told PYMNTS. “AI workloads have become mission-critical, and Nvidia’s GPUs remain the gold standard for training large models. However, spending plans are becoming more strategic. Customers are looking for ways to maximize efficiency, such as deploying fewer but more powerful chips or exploring software solutions like Nvidia’s AI Enterprise suite to optimize their existing hardware.”
The stakes are high, given Nvidia’s nearly 200% stock surge in 2024, which made it the world’s most valuable company. The chipmaker has exceeded revenue estimates by an average of $1.8 billion over the past five quarters.
Bloomberg reported that customers, including Microsoft, Alphabet, Amazon and Meta, indicated increased capital spending plans for the year ahead, pointing to continued demand for Nvidia’s artificial intelligence accelerator chips.
“From what I’ve seen in the industry, many enterprise customers are becoming more strategic with their AI investments,” Sriram said. “Initially, there was a rush to adopt Nvidia’s GPUs as AI projects gained momentum, but now customers are focusing on maximizing ROI from existing deployments. This shift means slower growth in bulk orders for AI chips as companies reassess their spending to ensure scalability and efficiency. That said, Nvidia’s expansion into cloud partnerships and software ecosystems helps mitigate this trend, as it ties customers into a broader value chain that extends beyond just hardware.”
Originally a gaming chip maker, Nvidia has transformed into the backbone of the AI industry, with its graphics processors now powering everything from chatbots to autonomous vehicles. Founded in 1993 as a gaming hardware pioneer, the company has evolved into a commanding presence in artificial intelligence, where its specialized chips have become essential infrastructure. Nvidia’s CUDA platform and tensor core GPUs, such as the A100 and H100, dominate AI model training and inference, making the company a cornerstone of data center computing. Its software ecosystems, including TensorRT and Nvidia AI Enterprise, bolster its offerings in the healthcare, automotive and robotics industries.
Overall, many observers remain bullish on Nvidia and AI’s prospects. Nikhil Vadgama, director of University College London’s Centre for Blockchain Technologies and co-founder of Exponential Science, told PYMNTS that surging demand for artificial intelligence infrastructure is propelling tech stocks to record highs as businesses aggressively scale up their computing capabilities.
“As AI becomes ever more ubiquitous, the myriad legal and moral discussions around the technology will become more pressing issues for regulators,” he said. “Across the Magnificent Seven, we now have an AI sector with quarterly revenues approaching the GDP of some European nations at more than $450 billion. This is a monumental amount of financial firepower that must come with an appropriate level of transparency and responsibility.”