AMD Instinct MI200 ‘Aldebaran’ GPUs Are Shipping Out, AMD Is First To Utilize An MCM GPU Design


AMD’s next-generation Instinct ‘Aldebaran’ compute graphics cards, based on the CDNA 2 architecture, have begun shipping, as detailed in AMD’s Q2 earnings report presentation.

The Power Of Up To 256 Compute Units & 128GB Of HBM2E Is In The Hands Of Customers With The AMD Instinct MI200 ‘Aldebaran’ GPU

The AMD Instinct MI200 ‘Aldebaran’ is the successor to the Instinct MI100 ‘Arcturus’ and is already making its way into customers’ hands. At the bottom of the slide, AMD says:


Initial shipments of next-generation AMD Instinct accelerators featuring 2nd Gen CDNA architecture.

The AMD CDNA 2 architecture will power the next generation of AMD Instinct HPC accelerators. We know that one of those accelerators will be the MI200, which features the Aldebaran GPU. It is a very powerful chip and the first GPU to feature an MCM design. The Instinct MI200 competes against Intel’s 7nm Ponte Vecchio and NVIDIA’s refreshed Ampere parts. Intel and NVIDIA are also taking the MCM route for their next-generation HPC accelerators, but Ponte Vecchio does not look set to be available until 2022, and the same is true of NVIDIA’s next-gen HPC accelerator, as its own roadmap confirms.

Inside the AMD Instinct MI200 is an Aldebaran GPU featuring two dies, a primary and a secondary, each consisting of 8 shader engines for a total of 16 SEs. Each shader engine packs 16 CUs with full-rate FP64, packed FP32, and a 2nd-generation Matrix Engine for FP16 and BF16 operations. Each die is therefore composed of 128 compute units, or 8,192 stream processors, which adds up to a total of 256 compute units, or 16,384 stream processors, for the entire chip. The Aldebaran GPU is also powered by a new XGMI interconnect, and each chiplet features a VCN 2.6 engine and the main IO controller.
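The shader-resource totals above can be sanity-checked with a quick tally. This is a minimal sketch using the figures quoted in the article; the 64-stream-processors-per-CU ratio is an assumption based on the usual CDNA convention, not something stated in the slide.

```python
# Back-of-the-envelope tally of Aldebaran's shader resources.
DIES = 2
SHADER_ENGINES_PER_DIE = 8
CUS_PER_SHADER_ENGINE = 16
STREAM_PROCESSORS_PER_CU = 64  # assumed CDNA convention, not from the slide

cus_per_die = SHADER_ENGINES_PER_DIE * CUS_PER_SHADER_ENGINE  # 128 CUs per die
sps_per_die = cus_per_die * STREAM_PROCESSORS_PER_CU          # 8,192 SPs per die
total_cus = DIES * cus_per_die                                # 256 CUs total
total_sps = DIES * sps_per_die                                # 16,384 SPs total

print(f"{total_cus} CUs / {total_sps} stream processors")
```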

The block diagram of AMD’s CDNA 2 powered Aldebaran GPU which will power the Instinct MI200 HPC accelerator has been visualized. (Image Credits: Locuza)

As for DRAM, AMD has gone with an 8-channel design consisting of eight 1024-bit interfaces for an 8192-bit-wide bus. Each interface supports a stack of 2 GB HBM2e DRAM modules, giving up to 16 GB of HBM2e memory per stack; with eight stacks in total, the overall capacity comes to a whopping 128 GB. That’s 48 GB more than the A100, which houses 80 GB of HBM2e memory. The full visualization of the Aldebaran GPU on the Instinct MI200 is available here.
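The memory figures work out the same way. A short sketch under the article's numbers; the 8-Hi stack height (eight 2 GB dies per stack) is an assumption consistent with the stated 16 GB per-stack capacity.

```python
# Rough check of the MI200 memory configuration described above.
STACKS = 8
INTERFACE_WIDTH_BITS = 1024   # one 1024-bit interface per stack
DIES_PER_STACK = 8            # assumed 8-Hi HBM2e stacks
GB_PER_DIE = 2

bus_width = STACKS * INTERFACE_WIDTH_BITS    # 8192-bit bus
gb_per_stack = DIES_PER_STACK * GB_PER_DIE   # 16 GB per stack
total_gb = STACKS * gb_per_stack             # 128 GB total

a100_gb = 80
print(f"{bus_width}-bit bus, {total_gb} GB ({total_gb - a100_gb} GB more than A100)")
```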

With customers already receiving the AMD Instinct MI200 ‘Aldebaran’, AMD has cemented itself as the first to deliver graphics cards with a multi-chip module design, and it is expected to bring the same approach to consumer cards with RDNA 3 in 2022.




