SK hynix begins mass production of next-gen AI server memory module

This photo provided by SK hynix shows the company's 192GB SOCAMM2 next-generation AI server memory module. [YONHAP]

SK hynix said Monday it has begun mass production of a next-generation memory module designed for AI servers, as it seeks to strengthen its position in the AI infrastructure market.

 
The company said the 192GB SOCAMM2 module is based on its sixth-generation 10-nanometer-class LPDDR5X low-power DRAM technology. The module was designed particularly for use with Nvidia's Vera Rubin AI platform.
 

The module adapts mobile-oriented low-power memory for server environments and is designed to serve as a primary memory solution for next-generation AI servers.
 
According to SK hynix, the product delivers more than double the bandwidth and over 75 percent improved power efficiency compared with conventional RDIMMs, or registered dual in-line memory modules, making it suitable for high-performance AI operations.
 
The Korean chip giant said the new product is expected to help resolve memory bottlenecks in the training and inference of large language models with hundreds of billions of parameters, significantly improving overall system performance.
 
"By supplying the 192GB SOCAMM2, SK hynix has established a new standard for AI memory performance," said Kim Joo-sun, president and head of AI infrastructure at the company.

Yonhap