AMD Introduces Instinct MI300 APUs with Up to 24 Zen 4 Cores and 192GB of HBM3 Memory for AI


AMD introduced the Instinct MI300 as its new generation of accelerators designed for supercomputing and artificial intelligence applications. The new line of processors with onboard graphics and memory brings the highest performance the hardware giant can offer on a unified platform with the latest technologies.

Lisa Su, CEO of AMD, took the stage on Tuesday, June 13, to reveal two new accelerators: the Instinct MI300A and the Instinct MI300X. Both are built on the new CDNA 3 acceleration architecture, which promises a large leap in performance and efficiency over the Instinct MI200 accelerators introduced in 2021.

AMD Instinct MI300A

The Instinct MI300A is a platform equipped with 24 CPU cores based on the Zen 4 architecture and produced with 5-nanometer lithography. This accelerator stands out for the low latency of communication between processor and memory, since it supports up to 128 GB of HBM3 (high-bandwidth memory), the standard that succeeds HBM2e.

Memory is shared between the CPU and GPU, so different workloads can benefit from low data latency. According to the manufacturer, this unified architecture delivers eight times the AI performance and five times the efficiency of the AMD Instinct MI250X, one of the most widely used accelerators in supercomputers.

Dr. Lisa Su introduces the Instinct MI300A specs (Image: AMD/YouTube)

Like the brand’s other products, including its high-performance notebook processors, the Instinct MI300A features a chiplet-based design that optimizes costs and reduces system energy consumption. Another characteristic that draws attention is the accelerator’s high density: more than 146 billion transistors across all chiplets.

The availability date of the Instinct MI300A has not yet been defined by AMD, but the company claims that the accelerator is already in the sampling phase for customers, suggesting that the model may reach the market by early 2024.

AMD Instinct MI300X

The hardware’s modular design allowed AMD to develop the Instinct MI300X, a variant optimized for large language models and other artificial intelligence tasks. By replacing the three chiplets containing Zen 4 cores with additional CDNA 3 chiplets, the manufacturer has created an option based entirely on GPU cores.

To meet the highest demands of the technology industry — including scientific research, artificial intelligence model training and other applications — the Instinct MI300X is equipped with 192 GB of HBM3 memory, delivering a bandwidth of 5.2 TB/s. Here, we are talking about a platform with more than 153 billion transistors.

Instinct MI300X is optimized for large generative AI models (Image: AMD/YouTube)

Comparing this accelerator with the competition, the Instinct MI300X offers 2.4 times the HBM3 memory capacity of NVIDIA’s Hopper H100 GPU, plus 1.6 times the bandwidth of its rival.
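Those multipliers can be sanity-checked with simple arithmetic, assuming the H100 SXM’s published figures of 80 GB of HBM and roughly 3.35 TB/s of bandwidth (figures not stated in this article, taken from NVIDIA’s public specifications):

```python
# Sanity-check of AMD's comparison multipliers for the Instinct MI300X
# versus NVIDIA's H100 SXM (H100 figures are assumed, not from this article).
mi300x_memory_gb = 192       # stated in the article
mi300x_bandwidth_tbs = 5.2   # stated in the article
h100_memory_gb = 80          # assumed H100 SXM capacity
h100_bandwidth_tbs = 3.35    # assumed H100 SXM bandwidth

capacity_ratio = mi300x_memory_gb / h100_memory_gb
bandwidth_ratio = mi300x_bandwidth_tbs / h100_bandwidth_tbs

print(f"capacity ratio:  {capacity_ratio:.1f}x")   # matches the 2.4x claim
print(f"bandwidth ratio: {bandwidth_ratio:.1f}x")  # matches the 1.6x claim
```

Under those assumed H100 figures, 192 GB / 80 GB works out to exactly 2.4x, and 5.2 TB/s / 3.35 TB/s rounds to the 1.6x AMD quoted.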

The Instinct MI300X will be available on a platform combining eight MI300X units for a total of 1.5 TB of HBM3 memory. According to the company, high memory capacity is a key factor in ensuring high performance for artificial intelligence models with ever-increasing parameter counts.

Abraham
Expert tech and gaming writer, blending computer science expertise