Google has introduced a new Arm-based central processing unit (CPU) named Axion, along with the latest version of its artificial intelligence chips, in a move set to enhance data center performance.
Why It Matters
Tensor Processing Units (TPUs) from Google are emerging as formidable alternatives to Nvidia's AI chips. However, unlike Nvidia's offerings, TPUs are available exclusively through Google Cloud Platform. With Axion, Google aims to give developers a high-performance CPU option that integrates seamlessly with their existing workloads.
\"We're making it easy for customers to bring their existing workloads to Arm,\" said Mark Lohmeyer, Google Cloud's vice president and general manager of compute and machine learning infrastructure. \"Axion is built on open foundations but customers using Arm anywhere can easily adopt Axion without re-architecting or re-writing their apps.\"
Performance and Innovation
The Axion CPU promises a 30% performance improvement over general-purpose Arm chips and a 50% boost over current-generation x86 chips from Intel and AMD. This marks a significant advance in processing power for cloud-based applications and services.
The new TPU v5p chip is designed to operate in large-scale pods of 8,960 chips, delivering twice the raw performance of its predecessor. To maintain optimal performance, Google employs liquid cooling in its data centers.
Looking Ahead
Google plans to integrate Axion into services like YouTube Ads on Google Cloud in the near future. The TPU v5p chip is now generally available to users via Google Cloud, marking a step forward in AI and cloud computing capabilities.
Reference(s):
cgtn.com