Samsung Unveils New HBM4E Memory at Nvidia Conference, Highlights Expanded Partnership

California: Samsung Electronics Co. unveiled its seventh-generation high bandwidth memory (HBM), known as HBM4E, during an annual technology conference hosted by Nvidia Corp. The U.S. tech giant, in turn, highlighted its expanding collaboration with the South Korean chipmaker beyond memory chips.

According to Yonhap News Agency, Samsung Electronics provided updates on the development of its HBM4E products during Nvidia GTC 2026, showcasing its capabilities as a total memory solution provider for Nvidia's Vera Rubin AI platform. The event commenced on Monday (U.S. time) and runs for four days in California. The unveiling marked the first time Samsung has shown a physical HBM4E chip, which is expected to support speeds of 16 gigabits per second per pin and a bandwidth of 4.0 terabytes per second.

HBM4E marks a clear step up from its predecessor, HBM4, which runs at 13 gigabits per second per pin with a bandwidth of 3.3 terabytes per second. During his keynote speech, Nvidia CEO Jensen Huang thanked Samsung Electronics for producing the Groq 3 language processing unit (LPU), which will enhance Nvidia's AI platform performance.
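The quoted bandwidth figures follow directly from the per-pin speed once an interface width is assumed. The article does not state a pin count; the sketch below assumes the 2,048-bit data interface of the HBM4 standard:

```python
# Per-stack bandwidth = per-pin speed x interface width.
# Assumption (not stated in the article): a 2,048-bit data
# interface, as specified for HBM4-class stacks.
INTERFACE_BITS = 2048

def stack_bandwidth_tbps(gbps_per_pin: float) -> float:
    """Bandwidth in terabytes per second (decimal TB)."""
    bits_per_second = gbps_per_pin * 1e9 * INTERFACE_BITS
    return bits_per_second / 8 / 1e12

print(stack_bandwidth_tbps(13))  # HBM4:  3.328 -> ~3.3 TB/s
print(stack_bandwidth_tbps(16))  # HBM4E: 4.096 -> ~4.1 TB/s
```

Under this assumption, 13 Gb/s per pin yields the article's 3.3 TB/s figure for HBM4, and 16 Gb/s per pin yields about 4.1 TB/s, which the article rounds to 4.0.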

Huang stated, "I want to thank Samsung, who manufactures the Groq 3 LPU chip for us, and they are cranking as hard as they can. I really appreciate you guys," confirming that Samsung Electronics' foundry division manufactures the chip. This remark indicates that Samsung Electronics and Nvidia have broadened their cooperation in the AI sector to include the foundry, or chip contract manufacturing, business.

Last month, Samsung Electronics began its first commercial shipments of sixth-generation HBM, or HBM4, designed for Nvidia's Vera Rubin platform; the chipmaker claims the product delivers the "ultimate performance" for AI computing. Samsung also introduced hybrid copper bonding (HCB) technology, which enables stacks of more than 16 layers while cutting thermal resistance by 20 percent compared with thermal compression bonding (TCB), underscoring its packaging capabilities for next-generation HBM.

"For innovation in the AI industry, a strong AI system such as the Vera Rubin platform is essential," Samsung stated, adding, "Samsung Electronics plans to continue supplying high-performance memory solutions supporting the Vera Rubin platform." The company aims to lead a paradigm shift in global AI infrastructure, backed by its ties with Nvidia.

During the event, Samsung Electronics set up an exhibition booth comprising three zones -- AI Factories, Local AI, and Physical AI -- showcasing the firm's next-generation chips that address demand from the AI industry.