You can’t talk AI without including Nvidia… The company’s powerful and unique GPUs (Graphics Processing Units) have been used to train every headline-grabbing AI model, from ChatGPT to Gemini. But what if I told you there’s another chip quietly emerging from the shadows – and it has nothing to do with Nvidia?
It’s called…
A Tensor Processing Unit (TPU)!
A TPU is a custom-built chip that Google designed specifically to run AI models.
Unlike Nvidia’s GPUs, which were originally built for video game graphics and later repurposed for AI, a TPU’s sole purpose is to execute AI computations at blistering speed and maximum efficiency.
And while GPUs excel at training AI models, TPUs were made to dominate inference: the part where AI thinks, reasons, and responds to users in real time. That’s where AI models are heading, after all.
Historically, the AI race has been all about training. And that’s exactly why Nvidia’s GPUs became the leader. But once a model is trained, inference never stops: the model runs billions of times to serve billions of people.
What’s so great about TPUs?
Google’s TPUs are already being used internally to power Search, Translate, YouTube, Ads, Gemini, and even “Veo 3”, its latest AI video model.
These TPUs have a few key advantages:
• Super fast for inference: Compared to general-purpose GPUs, they can deliver lower latency and higher throughput for inference workloads.
• Power-efficient: Because TPUs are custom-built for the specific demands of AI, they avoid unnecessary overhead and can process more computations per watt.
• Cloud-integrated: Available directly through Google Cloud, allowing developers and enterprises to tap into the same hardware that powers the company’s flagship AI services – without needing to build their own data centres or rely on Nvidia GPUs.
Simply put – as AI shifts from the training phase to the inference phase, TPUs are primed to steal the spotlight.
Of course, that doesn’t mean Nvidia’s reign as AI chip leader is doomed. The company still dominates the AI training space. And its newer chips, like the Blackwell B200, are getting better at inference, too.
It does, however, mean hotter competition for Nvidia. But competition is good. Competition drives innovation. So, ultimately, it’s a big win for the AI gold rush.
Not a subscriber to Money Morning?
You can get free daily recommendations like these with the Money Morning e-letter. Just sign up here.
