Korean media: NVIDIA halts SOCAMM1 and restarts the design; Micron's first-mover advantage takes a hit

According to a report from the Korean outlet ETNews, NVIDIA has suspended development of the first-generation SOCAMM (System-on-Chip Attached Memory Module) and shifted its R&D focus to SOCAMM2.

In the GB300 NVL72 specification, NVIDIA previously listed SOCAMM1 as supporting up to 18TB of LPDDR5X capacity and 14.3TB/s of bandwidth, positioning it as a cost-effective alternative for data-center computing. SOCAMM was originally conceived as modular LPDDR memory, intended to deliver HBM-like high bandwidth and low power consumption in AI servers, but technical setbacks forced a reset of the plan. According to ETNews, SOCAMM2's upgrades include raising the data rate from 8,533 MT/s to 9,600 MT/s, and it may also support LPDDR6.

Among memory manufacturers, Micron was the first to announce volume production of SOCAMM1, in March of this year, making it the first supplier to get the product into AI servers. By comparison, Samsung and SK Hynix only revealed at their earnings calls that they plan to begin mass production in the third quarter of 2025. If the suspension of SOCAMM1 is confirmed, the rivals will have an opportunity to close the gap with Micron.

The market previously estimated that SOCAMM shipments could reach 600,000 to 800,000 units in 2025, underscoring that NVIDIA originally intended to deploy the modules quickly. Even so, amid surging demand for AI memory and increasingly tight HBM supply, SOCAMM is still regarded as a key link in NVIDIA's silicon roadmap. The direct jump from SOCAMM1 to SOCAMM2 may indicate that NVIDIA is accelerating the upgrade to maintain its lead in the AI memory race.
