Author: Ma Leilei
Source: CHANNELWU by Wu Xiaobo
Buffett once said, "Never invest in a business you cannot understand." Yet, as the "era of the Oracle of Omaha" draws to a close, Buffett made a decision that went against his own "family rules": he bought Google stock, and at a high premium of about 40 times free cash flow.
Yes, for the first time, Buffett bought an "AI concept stock"—not OpenAI, nor Nvidia. All investors are asking one question: Why Google?
Let’s go back to the end of 2022. At that time, ChatGPT burst onto the scene, and Google’s top management sounded the "red alert," convening meeting after meeting and even urgently recalling the company’s two co-founders. But back then, Google looked like a slow-moving, bureaucratic dinosaur.
It hastily launched the chatbot Bard, but made factual errors during its demo, causing the company’s stock price to plummet and its market value to evaporate by tens of billions of dollars in a single day. Then, it merged its internal AI teams and launched the multimodal Gemini 1.5.
But this product, seen as a trump card, only stirred up a few hours of buzz in tech circles before being completely overshadowed by OpenAI’s subsequent release of the video generation model Sora, quickly fading from attention.
Somewhat awkwardly, it was Google’s researchers who published the groundbreaking academic paper in 2017 that laid the solid theoretical foundation for this AI revolution.

The "Attention Is All You Need" paper
Proposed the Transformer model
Rivals mocked Google. OpenAI CEO Sam Altman disparaged Google’s taste, saying, "I can’t help but think about the aesthetic differences between OpenAI and Google."
Google’s former CEO Eric Schmidt was likewise unhappy with the company’s complacency: "Google has always believed that work-life balance... is more important than winning the competition."
This series of predicaments led people to doubt whether Google had fallen behind in the AI race.
But change finally came. In November, Google launched Gemini 3, which outperformed competitors, including OpenAI, on most benchmark metrics. More crucially, Gemini 3 was trained entirely on Google’s self-developed TPU chips, which are now positioned as low-cost alternatives to Nvidia GPUs and are officially being sold to external customers.
Google is showing its strength on two fronts: directly responding to OpenAI’s software front with the Gemini 3 series, and challenging Nvidia’s long-standing dominance on the hardware front with TPU chips.
One foot kicking at OpenAI, one fist aimed at Nvidia.
Altman felt the pressure as early as last month, stating in an internal letter that Google "may bring some temporary economic headwinds to our company." And this week, after hearing that major companies were buying TPU chips, Nvidia’s stock price plunged as much as 7% intraday, prompting the company to issue a letter to reassure the market.
Google CEO Sundar Pichai said in a recent podcast that Google employees could finally breathe a sigh of relief. "From an external perspective, we may have seemed quiet or lagging during that period, but in reality, we were solidifying all the foundational components and pushing forward with all our might on that basis."
The situation has now reversed. Pichai said, "We have now reached a turning point."
At this moment, it is exactly the third anniversary of ChatGPT’s release. Over these three years, AI has kicked off a grand feast of Silicon Valley capital and alliances; yet beneath the feast, the shadow of a bubble looms—has the industry reached a turning point?
Overtaking
On November 19, Google released its latest AI model, Gemini 3.
One test showed that in almost all categories covering expert knowledge, logical reasoning, mathematics, and image recognition, Gemini 3 scored significantly higher than the latest models from other companies, including OpenAI. Only in a single programming-ability test did it slightly underperform, ranking second.
The Wall Street Journal said, "It might as well be called America’s next-generation top model." Bloomberg said Google has finally woken up. Musk and Altman both praised it. Some netizens joked that this is Altman’s ideal GPT-5.
The CEO of cloud content management platform Box, after early testing of Gemini 3, said its performance improvement was so incredible that they once doubted their own evaluation methods. But repeated tests confirmed that the model won by double-digit margins in all internal assessments.
The CEO of Salesforce said he had used ChatGPT for three years, but Gemini 3 overturned his perception in just two hours: "Holy shit... there’s no going back. This is simply a qualitative leap—reasoning, speed, text, image, and video processing... all sharper and faster. It feels like the world has been turned upside down again."

Gemini 3
Why is Gemini 3’s performance so outstanding, and what has Google done?
The head of the Gemini project posted, "Simple: improved pre-training and post-training." Some analyses say the model’s pre-training still follows the logic of the Scaling Law—by optimizing pre-training (such as larger-scale data, more efficient training methods, more parameters, etc.), the model’s capabilities are enhanced.
The person most eager to know Gemini 3’s secrets is probably Altman.
Last month, before the release of Gemini 3, he issued a warning in an internal letter to OpenAI employees: "From every perspective, Google’s recent work is outstanding," especially in pre-training, where Google’s progress may "bring some temporary economic headwinds" to the company, and "the external atmosphere will be rather severe for a while."
Although in terms of user numbers, ChatGPT still holds a significant advantage over Gemini, the gap is narrowing.
Over these three years, ChatGPT’s user base has grown rapidly: 400 million weekly active users in February this year, soaring to 800 million by this month. Gemini, by contrast, reports monthly active users: 450 million in July, rising to 650 million this month.
With about 90% of the global web search market, Google naturally controls the core channels for promoting its AI models, allowing it to directly reach massive numbers of users.
OpenAI is currently valued at $500 billion, making it the world’s highest-valued startup. It is also one of the fastest-growing companies in history, with revenue surging from nearly zero in 2022 to an estimated $13 billion this year. However, it is also expected to burn through more than $100 billion in the coming years to achieve artificial general intelligence, and will need to spend tens of billions more renting servers. In other words, it still needs to seek financing.
Google has an undeniable advantage: a much deeper pocket.
Google’s latest quarterly report shows its revenue surpassed $100 billion for the first time, reaching $102.3 billion, up 16% year-on-year, with profits of $35 billion, up 33%. The company’s free cash flow is $73 billion, and AI-related capital expenditures will reach $90 billion this year.
For now, it also doesn’t need to worry about its search business being eroded by AI, as its search and advertising still show double-digit growth. Its cloud business is booming—even OpenAI rents its servers.
In addition to self-sustaining cash flow, Google also holds chips that OpenAI cannot match, such as massive ready-made data for training and optimizing models, and its own computing infrastructure.

On November 14, Google announced a $40 billion investment in new data centers
OpenAI is adept at making deals, signing computing power agreements worth over $1 trillion with various parties. So, as Google rapidly closes in with Gemini, investors are even more doubtful: can OpenAI’s grand growth vision really fill the deficit?
Cracks
A month ago, Nvidia’s market value surpassed $5 trillion, and the market’s passion for artificial intelligence pushed this "AI arms dealer" to new heights. But the TPU chips behind Google’s Gemini 3 have now opened a crack in Nvidia’s seemingly solid fortress.
The Economist, citing data from investment research firm Bernstein, reported that Nvidia’s GPUs account for more than two-thirds of the total cost of a typical AI server rack. In contrast, Google’s TPU chips cost only 10% to 50% of Nvidia chips with equivalent performance. These savings add up considerably. Investment bank Jefferies estimates that Google will produce about 3 million such chips next year, almost half of Nvidia’s output.
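Taken together, those two figures imply a rough bound on how much a TPU swap could save at the rack level. A back-of-envelope sketch, treating the two-thirds cost share and the 10%-50% price ratio as given (these are the article's cited estimates, not measured prices):

```python
# Back-of-envelope estimate: if GPUs are ~2/3 of a rack's total cost, and a
# TPU of equivalent performance costs 10%-50% as much as the GPU it replaces,
# the whole rack's cost falls to roughly 40%-67% of the original.
gpu_share = 2 / 3  # GPU share of rack cost (Bernstein, via The Economist)

for tpu_price_ratio in (0.10, 0.50):
    rack_cost = (1 - gpu_share) + gpu_share * tpu_price_ratio
    print(f"TPU at {tpu_price_ratio:.0%} of GPU price -> rack at {rack_cost:.0%} of original cost")
```

Even at the pessimistic end of the range, a third of the rack cost disappears, which is why the savings "add up considerably" at the scale of millions of chips.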
Last month, well-known AI startup Anthropic planned to adopt Google’s TPU chips on a large scale, with the deal reportedly worth tens of billions of dollars. On November 25, it was reported that tech giant Meta is also in talks to use TPU chips in its data centers by 2027, with the deal valued at several billion dollars.

Google CEO Sundar Pichai introduces TPU chips
Silicon Valley’s internet giants are all betting on chips, either developing their own or partnering with chip companies, but none have made as much progress as Google.
The history of TPU dates back more than a decade. At that time, Google began developing a dedicated accelerator chip for internal use to improve the efficiency of search, maps, and translation. Since 2018, it has been selling TPUs to cloud computing customers.
Since then, TPUs have also been used to support Google’s internal AI development. During the development of models like Gemini, the AI team and chip team interacted: the former provided practical needs and feedback, and the latter customized and optimized TPUs accordingly, in turn improving AI R&D efficiency.
Nvidia currently holds over 90% of the AI chip market. Its GPUs were originally designed for realistic game rendering, relying on thousands of computing cores to process tasks in parallel, a structure that also gives them a huge lead in AI workloads.
Google’s TPU, on the other hand, is an Application-Specific Integrated Circuit (ASIC)—a "specialist," designed for specific computing tasks. It sacrifices some flexibility and applicability for higher efficiency. Nvidia’s GPU is more of a "generalist," flexible and highly programmable, but at a higher cost.
However, at this stage, no company, including Google, can completely replace Nvidia. Although TPU chips have reached their seventh generation, Google is still a major customer of Nvidia. An obvious reason is that Google’s cloud business serves thousands of customers worldwide, and using GPU computing power ensures its appeal to customers.
Even companies buying TPUs have to embrace Nvidia. Shortly after Anthropic announced its partnership with Google’s TPU, it also announced a major deal with Nvidia.
The Wall Street Journal said, "Investors, analysts, and data center operators say Google’s TPU is one of the biggest threats to Nvidia’s dominance in the AI computing market, but to challenge Nvidia, Google must begin selling these chips more broadly to external customers."
Google’s AI chips have become one of the few alternatives to Nvidia’s chips, directly dragging down Nvidia’s stock price. Nvidia had to post to calm the market panic triggered by TPUs. It said it was "happy for Google’s success," but emphasized that Nvidia is already a generation ahead of the industry, and its hardware is more versatile than TPUs and other chips designed for specific tasks.
Nvidia also faces pressure from concerns about a market bubble, as investors fear that massive capital investment may never match profit prospects. Investor sentiment can flip at any moment, between worrying that Nvidia’s business will be taken away and worrying that AI chips will stop selling at all.
Famous American "short seller" Michael Burry said he has bet over $1 billion shorting Nvidia and other tech companies. He became famous for shorting the US housing market in 2008, a story later made into the acclaimed film "The Big Short." He said today’s AI craze is similar to the internet bubble of the early 21st century.

Michael Burry
Nvidia distributed a seven-page document to analysts, refuting criticisms from Burry and others. But this document did not quell the controversy.
Model
Google is enjoying a sweet period, with its stock price rising against the AI bubble. Buffett’s company bought its stock in the third quarter, Gemini 3 received positive feedback, and TPU chips have investors excited—all of which have pushed Google to new heights.
Over the past month, AI concept stocks like Nvidia and Microsoft have fallen more than 10%, while Google’s stock price has risen about 16%. Currently, with a market value of $3.86 trillion, it ranks third in the world, behind only Nvidia and Apple.
Analysts call Google’s AI model "vertical integration."
As a rare "full-stack self-builder" in the tech world, Google holds the entire chain in its own hands: deploying self-developed TPU chips on Google Cloud, training its own AI large models, and seamlessly embedding these models into core businesses like Search and YouTube. The advantage of this model is obvious: it does not rely on Nvidia and possesses efficient, low-cost computing power sovereignty.
The other model is the more common loose alliance. Giants each play their own roles: Nvidia supplies GPUs, OpenAI and Anthropic develop AI models, and cloud giants like Microsoft buy GPUs from chip manufacturers to host these AI labs’ models. In this network, there are no absolute allies or opponents: they collaborate when possible, and compete fiercely when necessary.
Players have formed a "circular structure," with funds circulating in a closed loop among a few tech giants.
Generally, the circular financing routine goes like this: Company A pays Company B a sum of money (such as investment, loan, or leasing), and Company B uses that money to buy A’s products or services. Without this "seed money," B might not be able to afford it at all.
One example: OpenAI spends $300 billion to buy computing power from Oracle, Oracle then spends billions to buy Nvidia chips to build data centers, and Nvidia invests up to $100 billion back into OpenAI—on the condition that OpenAI continues to use its chips. (OpenAI pays $300 billion to Oracle → Oracle uses the money to buy Nvidia chips → Nvidia invests its earnings back into OpenAI.)
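The loop above can be sketched as a toy cash-flow model. The $300 billion OpenAI-to-Oracle figure comes from the article; the $100 billion Oracle-to-Nvidia chip spend is an assumed placeholder (the article only says "billions"), used here to make the circularity visible:

```python
# Toy model of the circular financing loop (figures in $B; the Oracle->Nvidia
# amount is an assumed placeholder, not a reported figure).
flows = [
    ("OpenAI", "Oracle", 300),  # compute contracts
    ("Oracle", "Nvidia", 100),  # chip purchases (assumed)
    ("Nvidia", "OpenAI", 100),  # investment back into OpenAI
]

net = {}
for payer, payee, amount in flows:
    net[payer] = net.get(payer, 0) - amount
    net[payee] = net.get(payee, 0) + amount

print(net)  # → {'OpenAI': -200, 'Oracle': 200, 'Nvidia': 0}
```

The point of the sketch: the cash circulates, but the net exposures do not cancel. OpenAI still owes far more than it receives back, which is exactly why the "seed money" matters.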
Such cases have spawned a maze of capital-flow charts. Morgan Stanley analysts, in a report on October 8, used a diagram to depict the capital flows in Silicon Valley’s AI ecosystem, warning that the lack of transparency makes it difficult for investors to discern the real risks and returns.
The Wall Street Journal commented on the diagram, "The arrows connecting them are as tangled as a plate of spaghetti."

With the boost from capital, the outline of that giant is slowly taking shape, though no one yet knows its true form. Some are panicked, some are delighted.



