Micron Unveils Massive 256GB DDR5-8800 MCRDIMM Memory Modules at Nvidia GTC 2024: Ideal Solution for Next-Gen AI Servers

Tom’s Hardware Reports Micron's Innovation, Offering Tall and Standard Versions to Meet Diverse Server Needs

At Nvidia GTC 2024, Micron revealed its massive 256GB DDR5-8800 MCRDIMM memory modules. Each module draws roughly 20 watts and uses a double-height design, making them ideal for next-gen AI servers such as those powered by Intel’s Xeon Scalable ‘Granite Rapids’ processors, which demand ample memory for AI training.

According to Tom’s Hardware, Micron showed the ‘Tall’ variant of the module at the conference but plans to offer standard-height MCRDIMMs for 1U servers as well.

[Image: The 256GB MCRDIMMs. Credit: Tom’s Hardware]

The 256GB MCRDIMMs, available in both Tall and Standard versions, use monolithic 32Gb DDR5 ICs. The Tall module spreads its 80 DRAM chips across both sides of the double-height PCB, while the Standard module uses 2Hi stacked packages and runs slightly hotter.
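
As a rough sanity check (a back-of-envelope sketch of my own, not figures published by Micron), the chip count follows from the die density: a monolithic 32Gb die stores 4GB, and DDR5 server modules conventionally carry ECC devices on top of the data devices in an 80:64 ratio.

```python
# Back-of-envelope die count for a 256GB module built from monolithic 32Gb DDR5 ICs.
# Assumption: a conventional DDR5 RDIMM-style bus of two 40-bit subchannels
# (32 data bits + 8 ECC bits each), i.e. an 80:64 device overhead for on-module ECC.

DIE_CAPACITY_GB = 32 / 8        # a 32Gb die stores 4GB
MODULE_CAPACITY_GB = 256        # usable (data) capacity of the module

data_dies = MODULE_CAPACITY_GB / DIE_CAPACITY_GB   # dies needed for data alone
total_dies = data_dies * 80 / 64                   # add the ECC devices

print(f"data dies:  {data_dies:.0f}")   # -> 64
print(f"total dies: {total_dies:.0f}")  # -> 80, matching the Tall module's chip count
```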

MCRDIMMs, short for Multiplexer Combined Ranks DIMMs, are dual-rank memory modules with a specialized buffer that lets both ranks operate concurrently.

According to Tom’s Hardware, this buffer allows simultaneous data retrieval from both ranks, effectively doubling performance. The buffer communicates with the host memory controller using the DDR5 protocol at speeds beyond standard specifications, reaching 8800 MT/s in this instance.
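
To put those figures in perspective, here is a small worked example (my own sketch, not numbers from Micron or Tom’s Hardware) of what multiplexing two ranks onto an 8800 MT/s host bus means for peak throughput, assuming a standard 64-bit data path per module with the ECC bits excluded.

```python
# Hypothetical peak-bandwidth math for a DDR5-8800 MCRDIMM.
# Assumption: a standard 64-bit data path to the host (ECC lines excluded),
# with the MCR buffer fetching from both ranks at once so each rank only
# needs to run at half the host-facing transfer rate.

DATA_BUS_BYTES = 64 // 8      # 64-bit data path = 8 bytes per transfer
HOST_RATE_MTS = 8800          # host-facing transfer rate in mega-transfers/s
RANKS = 2                     # both ranks are read concurrently via the buffer

per_rank_rate_mts = HOST_RATE_MTS / RANKS                        # ~4400 MT/s per rank
peak_bandwidth_gbs = HOST_RATE_MTS * 1e6 * DATA_BUS_BYTES / 1e9  # bytes/s -> GB/s

print(f"per-rank rate:  {per_rank_rate_mts:.0f} MT/s")    # -> 4400 MT/s
print(f"peak bandwidth: {peak_bandwidth_gbs:.1f} GB/s")   # -> 70.4 GB/s per module
```

A single rank running at a conventional 4400 MT/s would top out at about 35.2 GB/s on the same bus, which is where the ‘effectively doubling’ framing comes from.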

Sanjay Mehrotra, Micron’s CEO, said on an earnings call that sampling of the 256GB MCRDIMM modules has begun, and that they boost performance and increase the DRAM content per server.

Although Micron hasn’t disclosed pricing, each module is expected to surpass $10,000 in cost.
