AMD preparing Instinct MI300 refresh with HBM3E memory

Published: Feb 27th 2024, 15:03 GMT

AMD MI300 expected to get HBM3E update

Mark Papermaster (Chief Technology Officer and Executive Vice President) confirms that the AMD MI300 series was architected to support HBM3E.

Micron and Samsung have recently announced important updates to High Bandwidth Memory (HBM), specifically the HBM3E variant, targeting next-gen and refreshed current-gen data-center accelerators. NVIDIA’s H200 is already confirmed to incorporate HBM3E. It now appears that AMD plans to upgrade its Instinct series from HBM3 as well.

Many enterprise customers are exploring alternatives, considering factors like cost and availability. AMD has positioned its Instinct MI300 series as a formidable contender against NVIDIA’s Hopper and Ampere accelerators. The MI300 series has several advantages, such as larger memory capacity.

The MI300A from AMD is equipped with 228 CDNA3 Compute Units, accompanied by 24 x86 “Zen 4” cores. This SKU is outfitted with 128 GB of HBM3 memory. For those with compute-intensive requirements, the MI300X steps up to 304 CDNA3 Compute Units and 192 GB of HBM3 memory; however, this product lacks the Zen 4 cores. The MI300 lineup competes with the NVIDIA H200 and the H200 Super Chip, both based on the Hopper architecture and, in the case of the Super Chip, paired with Arm Neoverse V2 cores.

HPC Accelerators

| Accelerator            | Architecture        | GPU CU/SM | CPU Cores  | Memory                    |
|------------------------|---------------------|-----------|------------|---------------------------|
| AMD Instinct MI300A    | AMD CDNA 3          | 228       | 24 “Zen 4” | 128 GB HBM3               |
| AMD Instinct MI300X    | AMD CDNA 3          | 304       | –          | 192 GB HBM3               |
| NVIDIA H100            | NVIDIA Hopper       | 132       | –          | 64/80/96 GB HBM3          |
| NVIDIA H200            | NVIDIA Hopper       | ~132      | –          | 141 GB HBM3E              |
| NVIDIA H200 Super Chip | NVIDIA Grace Hopper | 132       | 72 Arm     | 96 GB HBM3 / 141 GB HBM3E |

AMD has now confirmed that the company will upgrade its Instinct products to HBM3E technology. HBM3E not only lowers power consumption, it also enables higher bandwidth and more capacity through taller stacks with 50% more layers (12-Hi versus 8-Hi). A 12-Hi stack can reach up to 36 GB per module, as Samsung shared today. Theoretically, eight such HBM3E modules would provide 288 GB of capacity.
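The capacity arithmetic above can be sketched in a few lines. This is a back-of-the-envelope check, assuming 24 Gb (3 GB) DRAM dies per layer, which matches Samsung’s 36 GB 12-Hi stack and the MI300X’s current 192 GB across eight 8-Hi HBM3 stacks:

```python
# Back-of-the-envelope HBM capacity math (assumption: 24 Gb = 3 GB per DRAM die).
GB_PER_DIE = 3

def stack_capacity_gb(layers: int) -> int:
    """Capacity of one HBM stack with the given layer count ('n-Hi')."""
    return layers * GB_PER_DIE

def package_capacity_gb(stacks: int, layers: int) -> int:
    """Total memory for an accelerator carrying `stacks` HBM stacks."""
    return stacks * stack_capacity_gb(layers)

print(package_capacity_gb(stacks=8, layers=8))   # 8-Hi HBM3  -> 192 GB (MI300X today)
print(package_capacity_gb(stacks=8, layers=12))  # 12-Hi HBM3E -> 288 GB (theoretical)
```

The jump from 8-Hi to 12-Hi is exactly the “50% more layers” figure, which is where the 192 GB → 288 GB headroom comes from.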

We are not standing still. We made adjustments to accelerate our roadmap with both memory configurations around the MI300 family, derivatives of MI300, the generation next. […] So, we have 8-Hi stacks. We architected for 12-Hi stacks. We are shipping with MI300 HBM3. We have architected for HBM3E.

Mark Papermaster (AMD CTO) via Seeking Alpha

With MI300 receiving an update to HBM3E this year, one should also look forward to MI400, which is expected to hit the server racks in 2025. Meanwhile, NVIDIA is now expected to tease its B100 Blackwell HPC accelerator at the Spring GTC 2024 keynote in March.

Source: Wccftech
