SK hynix Showcases Next Generation AI Memory Innovation at CES 2026
- SK hynix to operate customer exhibition booth to strengthen customer connections
- Unveils 16-layer 48GB HBM4 for the first time and showcases conventional and AI-focused products such as SOCAMM2 and LPDDR6
- Visualizes custom HBM structure in 'AI System Demo Zone' to present future technology
- Company to create new value based on differentiated memory solutions and close collaboration with customers
SEOUL, South Korea, Jan. 5, 2026 /PRNewswire/ -- SK hynix Inc. (or "the company", www.skhynix.com) announced today that it will open a customer exhibition booth at the Venetian Expo and showcase its next-generation AI memory solutions at CES 2026, held in Las Vegas from January 6 to 9 (local time).
The company said, "Under the theme 'Innovative AI, Sustainable tomorrow', we plan to showcase a wide range of next-generation memory solutions optimized for AI and will work closely with customers to create new value in the AI era."
SK hynix has previously operated both an SK Group joint exhibition booth and a customer exhibition booth at CES. This year, the company will focus on the customer exhibition booth, expanding touchpoints with key customers to discuss potential collaboration.
At the exhibition, the company will showcase its 16-layer 48GB HBM4, its next-generation HBM product, for the first time. The product succeeds the 12-layer 36GB HBM4, which demonstrated the industry's fastest speed of 11.7Gbps, and is being developed in line with customers' schedules.
The 12-layer 36GB HBM3E product, which will drive the market this year, will also be presented. In particular, the company will jointly exhibit, with a customer, GPU modules for AI servers that have adopted HBM3E, demonstrating the product's role within AI systems.
In addition to HBM, the company plans to showcase SOCAMM2, a low-power memory module specialized for AI servers, to demonstrate the competitiveness of its diverse product portfolio in response to the rapidly growing demand for AI servers.
Also, SK hynix will exhibit its lineup of conventional memory products optimized for AI, demonstrating its technological leadership across the market. The company will present its LPDDR6, optimized for on-device AI, offering significantly improved data processing speed and power efficiency compared to previous generations.
In NAND flash, the company will present its 321-layer 2Tb QLC product, optimized for ultra-high-capacity eSSDs, as demand surges with the rapid expansion of AI data centers. With best-in-industry integration, this product significantly improves power efficiency and performance compared to previous-generation QLC products, making it particularly advantageous in AI data center environments where lower power consumption is needed.
The company will also set up an 'AI System Demo Zone' where visitors can experience how the AI memory solutions it is preparing for the future interconnect to form an AI ecosystem.
In this zone, the company will present customized cHBM[1] optimized for specific AI chips or systems, the PIM[2]-based AiMX[3], CuD[4], which performs computation inside memory, CMM-Ax[5], which integrates computing capabilities into CXL[6] memory, and Data-aware CSD[7].
[1]Custom HBM (cHBM): A product that integrates some functions located in GPUs and ASICs into the HBM base die, reflecting customer requirements. As competition in the AI market shifts from raw performance to inference efficiency and optimization, HBM is also evolving from conventional products to customized solutions. This solution is expected to enhance the performance of GPUs and ASICs while reducing the power required to transfer data with HBM, improving overall system efficiency.
[2]Processing-In-Memory (PIM): A next-generation memory technology that integrates computational capabilities into memory, addressing data movement bottlenecks in AI and big data processing.
[3]Accelerator-in-Memory based Accelerator (AiMX): SK hynix's accelerator card prototype featuring a GDDR6-AiM chip, which is specialized for large language models (LLMs).
[4]Compute-using-DRAM (CuD): A next generation product that contributes to accelerating data processing by performing simple computations within the cell.
[5]CXL Memory Module-Accelerator xPU (CMM-Ax): A solution that adds computational functionality to CXL's advantage of expanding high-capacity memory, contributing to improving the performance and energy efficiency of next-generation server platforms.
[6]Compute Express Link (CXL): A next-generation interface that efficiently connects CPU, GPU, memory, and other components in high-performance computing systems to support massive, ultra-fast computation. Based on the PCIe interface, CXL allows fast data transfer and has pooling capability to efficiently utilize memory.
[7]Computational Storage Drive (CSD): A storage device that can process data on its own.
For cHBM (Custom HBM), given specific interest from customers, a large-scale mock-up has been prepared so visitors can see its innovative structure firsthand. As competition in the AI market shifts from raw performance to inference efficiency and cost optimization, the mock-up visualizes a new design approach that integrates into HBM some of the computation and control functions previously handled by the GPU or ASIC.
"As innovation triggered by AI accelerates further, customers' technical requirements are evolving rapidly," Justin Kim, President & Head of AI Infra at SK hynix, said. "We will meet customer needs with differentiated memory solutions. With close cooperation with customers, the company will create new value to contribute to the advancement of the AI ecosystem."
About SK hynix Inc.
SK hynix Inc., headquartered in Korea, is the world's top-tier semiconductor supplier offering Dynamic Random Access Memory chips ("DRAM") and flash memory chips ("NAND flash") for a wide range of distinguished customers globally. The Company's shares are traded on the Korea Exchange, and its Global Depositary Shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at www.skhynix.com and news.skhynix.com.
View original content:https://www.prnewswire.com/news-releases/sk-hynix-showcases-next-generation-ai-memory-innovation-at-ces-2026-302653161.html
SOURCE SK hynix Inc.