Gad Benram

What can shift NVIDIA's stock up or down?

Updated: Sep 19

NVIDIA’s stock has climbed 142% year-over-year (as of August 2024) and more than 2,400% over the past five years. Its price-to-earnings (P/E) ratio stands at 56.07, which, though high, seems reasonable given that the company currently faces no significant competition in manufacturing GPUs for data centers. Still, to understand why customers are willing to pay roughly three times the manufacturing cost of a GPU, and to do so at such an enormous scale, we need to examine the state of AI and ask whether this valuation is sustainable and can be justified by AI's future prospects, specifically AGI.
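As a quick refresher on what that multiple means, here is a minimal sketch of the arithmetic behind a P/E ratio. The share price and earnings figures below are hypothetical placeholders chosen to match the ~56x multiple, not NVIDIA's actual numbers.

```python
# Minimal sketch of price-to-earnings (P/E) arithmetic.
# The inputs below are hypothetical placeholders, not NVIDIA's real figures.

def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    """P/E = price per share / earnings per share."""
    return share_price / earnings_per_share

def implied_price(earnings_per_share: float, pe: float) -> float:
    """Price implied by a given earnings level and multiple."""
    return earnings_per_share * pe

price = 120.00        # hypothetical share price (USD)
eps = price / 56.07   # EPS consistent with the ~56x multiple cited above

print(f"P/E at current earnings: {pe_ratio(price, eps):.1f}x")

# If earnings come in 30% lower and the market keeps the same multiple,
# the implied price falls in step — which is why expectations matter so much.
print(f"Implied price on 30% lower earnings: {implied_price(eps * 0.7, 56.07):.2f}")
```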






These days, NVIDIA’s sales are driven mostly by its data center segment, which jumped 141% year-over-year in Q2 FY24. Since this segment is the source of the company’s biggest growth, let’s try to understand who is paying for these data centers and why.


The Power Behind the AI Revolution

NVIDIA’s GPUs have become the backbone of the AI revolution, particularly in training and hosting large-scale AI models. The demand for these GPUs is massive. For instance, Meta recently purchased 350,000 GPUs for a data center focused on AI training, with a similar number being sold to Microsoft in 2023. Google and Amazon have also invested heavily, buying around 50,000 units each.



These sales have significantly boosted NVIDIA’s revenues, particularly from its data center division. But what exactly are these companies doing with such an immense number of GPUs, and how do they plan to generate value from them? The answer may lie in two primary revenue streams: current AI applications and the ambitious goal of developing AGI.


Revenue Stream 1: Supporting Existing AI Models

A large share of the GPUs NVIDIA sells today goes toward running existing AI models. When users interact with AI-driven applications like ChatGPT or Google's Gemini, GPUs work behind the scenes to process these interactions. These models are also used to boost productivity across industries, from generating personalized content to improving data analytics.
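To make the "behind the scenes" part concrete, here is a minimal sketch of what an inference-style workload looks like on a GPU, using PyTorch and the openly available gpt2 checkpoint as a small stand-in, since the production models mentioned above are proprietary and vastly larger.

```python
# Minimal sketch of GPU-backed text generation, assuming PyTorch and the
# Hugging Face transformers library are installed. gpt2 is a small open
# stand-in for the much larger proprietary models discussed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

# Every user prompt becomes a batch of token IDs that the GPU pushes through
# billions of matrix multiplications to produce the next tokens.
inputs = tokenizer("Why do AI companies buy so many GPUs?", return_tensors="pt").to(device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=40)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

At production scale, the same pattern is replicated across thousands of accelerators to serve millions of concurrent users, which is one reason inference alone consumes a meaningful slice of these GPU purchases.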

However, despite the widespread investment in AI innovation, the profitability of these models is still in question. Many companies, especially those outside the tech giants, are struggling to move beyond the proof-of-concept phase, and so far the main winners are consulting businesses. Even AI leaders like OpenAI, which is projected to generate around $3.5 billion in annual revenue, are not yet turning a significant profit. This raises doubts about whether the current applications of AI can justify the massive investments in GPUs.


Revenue Stream 2: The Quest for the "God Computer"

The second, more speculative, revenue stream is the development of what some in the industry refer to as the "God computer"—a superintelligent AI that surpasses human cognitive abilities. This endeavor requires creating AI models with trillions of parameters, far beyond the capabilities of current models like GPT-4.


The Role of GPUs in Scaling AI

The drive to create increasingly powerful AI models requires four main ingredients: algorithms (the product of research), data, compute power, and energy. I’ll leave the other three aside and focus on compute power, which essentially translates into more GPUs to handle the vast amounts of data and the complex computations required. NVIDIA’s GPUs are currently the industry standard for this task, making the company indispensable to any organization aiming to push the boundaries of AI. But why do you need more GPUs to create a smarter AI?


Source: Google’s Pathways Language Model (PaLM), 2022

For instance, Google’s Pathways Language Model (PaLM), introduced in 2022, demonstrated that larger AI models (measured by their number of parameters) are not just more accurate but also capable of handling a broader range of tasks. As the size of these models grows, training them requires extensive computational resources, and GPUs are the key component of that compute infrastructure.


How many GPUs do you need to train a GPT model?

Let’s put this in perspective in terms of GPU count. When OpenAI released GPT-3 in 2020, it was supported by a data center with around 10,000 GPUs. This model, with 175 billion parameters, is often described as having the cognitive abilities of an intern. By scaling up to 600,000 GPUs, as companies like Meta are now doing, the aim is to develop AI models that could rival the intellect of a genius, or even approach the fabled "God computer."
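For a rough sense of where those GPU counts come from, here is a back-of-the-envelope sketch. It relies on the commonly cited approximation that training a dense transformer costs about 6 x parameters x tokens floating-point operations, together with an assumed (not official) sustained throughput per GPU, so the outputs are order-of-magnitude estimates only.

```python
# Back-of-the-envelope training-compute estimate. Assumptions (not official
# figures): a dense transformer costs roughly 6 * N * D FLOPs to train with
# N parameters on D tokens, and each GPU sustains ~4e14 FLOP/s in practice
# (a modern accelerator running well below its theoretical peak).

SUSTAINED_FLOPS_PER_GPU = 4e14   # assumed effective throughput per GPU
SECONDS_PER_DAY = 86_400

def training_flops(params: float, tokens: float) -> float:
    """Approximate total FLOPs to train a dense transformer."""
    return 6.0 * params * tokens

def gpu_days(total_flops: float) -> float:
    """GPU-days of work at the assumed sustained per-GPU throughput."""
    return total_flops / (SUSTAINED_FLOPS_PER_GPU * SECONDS_PER_DAY)

# GPT-3 scale: 175B parameters trained on ~300B tokens (published figures).
gpt3 = training_flops(175e9, 300e9)

# A hypothetical model 10x larger, trained on 10x more data: 100x the compute.
frontier = training_flops(1.75e12, 3e12)

print(f"GPT-3-scale run:        {gpt3:.2e} FLOPs, ~{gpu_days(gpt3):,.0f} GPU-days")
print(f"Hypothetical 1.75T run: {frontier:.2e} FLOPs, ~{gpu_days(frontier):,.0f} GPU-days")

# Squeezing roughly a million GPU-days into a few months of wall-clock time
# already calls for a cluster of tens of thousands of accelerators, before
# counting inference traffic, experiments, and failed runs.
```

The point of the exercise is the scaling, not the exact numbers: every 10x increase in both model size and training data multiplies the compute bill by roughly 100x, which is why clusters keep growing toward hundreds of thousands of GPUs.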


The High Stakes of AGI Development

Achieving AGI could revolutionize industries, automate complex tasks, and drive innovations that were previously thought impossible. This potential justifies the immense investments that companies are making in GPUs, with some spending between $10 billion and $100 billion annually on AI research and development. However, as mentioned, this progress also depends on breakthroughs in research, having sufficient energy resources, and access to high-quality data.


If AGI is achieved, the value generated could far surpass the costs, enabling these companies to replicate superhuman intelligence in software and fundamentally change the world as we know it. And trust us, none of them want to be caught off guard again, as some were when OpenAI released its more advanced models. However, this is a high-stakes gamble. Even securing enough electricity is not trivial, as former President Donald Trump has pointed out. The timeline for achieving AGI is uncertain: some speculate that the next level of AI (GPT-5, for example) could emerge as early as 2025, while others caution that it may take much longer, if it happens at all.


The Risk of an AI Bubble

There is a significant risk that the current AI frenzy could lead to a bubble. If AGI development stalls or fails to deliver on its promises, the immense investments in AI and GPUs could become unsustainable. This could lead to a sharp decline in GPU demand, a slowdown in AI-related investments, and a significant correction in NVIDIA’s stock price.

Moreover, competitors are not standing still. Companies like Google are developing their own AI hardware, such as Tensor Processing Units (TPUs), which could eventually challenge NVIDIA’s dominance in the GPU market. If these alternatives prove effective, they could erode NVIDIA's market share and put additional pressure on its stock valuation.


Conclusion: A Future in Flux

NVIDIA’s soaring stock price reflects the market’s high expectations for the future of AI, particularly the development of AGI. While the potential rewards are enormous, so are the risks. The success of NVIDIA's current valuation depends largely on whether AGI can be achieved and whether it will deliver the transformative impact that many are hoping for.

If AGI does come to fruition and fulfills its promise, NVIDIA could continue its extraordinary growth, justifying its high valuation. However, if progress stalls or fails to meet expectations, the current exuberance around NVIDIA’s stock could quickly turn into a painful correction. Investors are betting heavily on a future that is still very much in flux, and only time will tell if this gamble will pay off.

