Google Releases Cloud TPU v4 Pod Cluster With 9 Exaflops of Computing Power, Now in Public Preview


At its I/O developer conference today, Google announced a public preview of the full Cloud TPU v4 pod cluster on Google Cloud. Google introduced the fourth generation of its Tensor Processing Unit at last year's I/O conference. A TPU v4 pod consists of 4,096 such chips; each chip has a peak performance of 275 teraflops, and Google quotes a combined peak of 1.1 exaflops per pod.

Google now operates a full cluster of eight such pods in its Oklahoma data center, with an aggregate peak performance of 9 exaflops. Google says that, by cumulative computing power, this makes it the world's largest publicly available ML hub, and that it runs on 90% carbon-free energy.
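The headline figures are easy to sanity-check from the per-chip numbers quoted above:

```python
# Figures from the article: 4,096 chips per pod, 275 peak teraflops
# per chip, eight pods in the Oklahoma cluster.
CHIPS_PER_POD = 4096
TFLOPS_PER_CHIP = 275
PODS_IN_CLUSTER = 8

# 1 exaflop = 1,000,000 teraflops
pod_exaflops = CHIPS_PER_POD * TFLOPS_PER_CHIP / 1_000_000
cluster_exaflops = pod_exaflops * PODS_IN_CLUSTER

print(f"Per pod:  {pod_exaflops:.4f} exaflops")   # 1.1264, quoted as 1.1
print(f"Cluster: {cluster_exaflops:.4f} exaflops")  # 9.0112, quoted as 9
```

The per-pod product is 1.1264 exaflops and the eight-pod aggregate is 9.0112 exaflops, matching the rounded "1.1" and "9" figures in the announcement.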

These clusters are provided as ML (machine learning) supercomputers, which means they are very well suited to ML workloads such as NLP, recommendation models, and so on. These supercomputers are built with ML hardware such as GPUs (graphics processing units), CPUs, and memory. At 9 exaflops, Google believes it has the largest publicly available ML cluster.

At the 2021 I/O conference, Google CEO Sundar Pichai said the company would soon have "dozens of TPU v4 pods online in our data centers, many of which will run at or near 90% carbon-free energy. Our TPU v4 pods will be available to our cloud customers later this year." Clearly, this took longer than planned, but that is perhaps unsurprising given the global chip shortage, and these are custom chips, after all.

Ahead of today's release, Google gave researchers access to these clusters. The researchers reported being pleased with the performance and scalability of TPU v4, with its fast interconnect and optimized software stack, and liked the ability to set up their own interactive development environments with the new TPU VM architecture, as well as the flexibility to use their preferred frameworks, including JAX, PyTorch, and TensorFlow.

Google said users will be able to slice the new Cloud TPU v4 clusters and pods to fit their needs, from as few as four chips (the minimum for a TPU virtual machine) up to thousands of chips (though not too many thousands, since only so much capacity is available).
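As a rough illustration, provisioning the smallest slice mentioned above could look like the following `gcloud` invocation. The zone, runtime version, and the `v4-8` accelerator type (the four-chip minimum slice) are assumptions for this sketch, not details from the announcement:

```shell
# Sketch: create the smallest TPU v4 slice as a Cloud TPU VM.
# Zone and software version below are placeholders; check the
# Cloud TPU documentation for the values available to your project.
gcloud compute tpus tpu-vm create my-tpu-v4 \
  --zone=us-central2-b \
  --accelerator-type=v4-8 \
  --version=tpu-vm-tf-2.8.0
```

Larger slices would use a bigger accelerator type, scaling up toward the thousands of chips in a full pod.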

So far, these clusters are available only in Oklahoma. "We conducted an extensive analysis of various locations and determined that Oklahoma, with its exceptional carbon-free energy supply, is the best place to host this cluster. Our customers can access it from almost anywhere," a spokesperson explained.
