Micron unveils the world's first 256 GB LPDRAM SOCAMM2 memory module, built for AI.


Micron launches a new type of memory for data centers

Micron has introduced the world's first 256 GB LPDRAM SOCAMM2 module, designed specifically for high-performance data centers and artificial-intelligence workloads.

What’s new?
| Parameter | Traditional memory | LPDRAM SOCAMM2 |
| --- | --- | --- |
| Power consumption | – | three times lower |
| Module size | – | compact, built from monolithic 32 Gb dies |
| Bandwidth | – | enhanced for AI inference |
| Latency | – | significantly reduced |
When running on an eight-channel processor with 2 TB of total memory, the time to first token for AI models improves by more than a factor of 2.5.
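As a rough intuition for why higher memory bandwidth shortens time to first token, the prefill phase of an LLM is often bound by how fast the model's weights can be streamed from memory. The sketch below is a back-of-envelope model only, not Micron's methodology; the model size and bandwidth figures are hypothetical.

```python
# Back-of-envelope sketch (illustrative only, not Micron's benchmark):
# prefill is often memory-bandwidth bound, so time to first token (TTFT)
# scales roughly with 1 / bandwidth when every weight is read once.

def ttft_seconds(model_size_gb: float, bandwidth_gbps: float) -> float:
    """Rough lower bound: stream every weight once per prefill pass."""
    return model_size_gb / bandwidth_gbps

# Hypothetical numbers chosen for illustration.
baseline = ttft_seconds(model_size_gb=140, bandwidth_gbps=200)
upgraded = ttft_seconds(model_size_gb=140, bandwidth_gbps=500)
print(f"speedup: {baseline / upgraded:.1f}x")  # → speedup: 2.5x
```

Under this simplification the speedup is just the bandwidth ratio, which is why bandwidth-focused modules target inference latency directly.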

Who needs it?
The module is aimed at:

- deploying large language models;
- any workloads where memory capacity, bandwidth, and latency are critical;
- scenarios where part of the KV cache can be offloaded from GPU memory to cheaper system memory without a loss in performance.

Form factor and collaboration
SOCAMM2 is a compact alternative to the RDIMM form factor with lower power consumption. The memory was developed jointly with Nvidia, underscoring its suitability for GPU-accelerated AI.

In short, Micron is offering data centers an efficient solution that cuts energy and space costs while improving AI workload performance.
