SK Hynix will demonstrate functional GDDR6-AiM at CES.
SK Hynix has announced that it will demonstrate its computing-capable GDDR6-AiM memory at CES next month. The GDDR6 accelerator-in-memory (AiM) technology is intended to speed up artificial intelligence and big-data processing by offloading basic computational functions to the memory chips themselves.
According to SK Hynix, its GDDR6-AiM chips can process data in memory at 16 Gbps, which makes certain computations up to 16 times faster than conventional approaches. The chips are aimed at machine learning, high-performance computing, and the storage and processing of massive datasets. Moving data from memory to a processor is time-consuming and energy-intensive, so it makes sense to perform the computation in memory itself.
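The data-movement argument can be illustrated with a toy model. This is a purely conceptual sketch, not SK Hynix's design: the element size and vector length are arbitrary assumptions, and it only counts bytes crossing the memory bus.

```python
# Toy model: bytes that must cross the memory bus for a dot product
# of two length-n vectors of 2-byte elements (hypothetical figures).

def host_side_traffic(n, elem_bytes=2):
    """Conventional approach: both operand vectors are streamed from
    DRAM to the CPU/GPU, which writes back a single scalar result."""
    return 2 * n * elem_bytes + elem_bytes

def in_memory_traffic(n, elem_bytes=2):
    """Processing-in-memory: the multiply-accumulate runs next to the
    memory banks, so only the scalar result crosses the bus."""
    return elem_bytes

n = 1_000_000
print(host_side_traffic(n))  # 4000002 bytes moved
print(in_memory_traffic(n))  # 2 bytes moved
```

The gap grows linearly with the data size, which is why in-memory compute is pitched at AI and big-data workloads that stream enormous operand sets through memory.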
According to the memory manufacturer, its GDDR6-AiM chips operate at 1.25V and cut power consumption by 80 percent compared with applications that shuttle data between memory and the CPU or GPU. The chips are designed to be drop-in compatible with existing GDDR6 memory controllers, so it should be possible to use them to boost the performance of existing graphics cards in AI, ML, big-data, and HPC workloads.
SK Hynix completed development of GDDR6-AiM in early 2022 but has shown actual applications only a handful of times since, so it will be especially intriguing to see what kind of device the company exhibits at the trade show.
SK Hynix is not the only memory maker experimenting with processing-in-memory (PIM) technology: Samsung has demonstrated its HBM2- and GDDR6-based PIM memory on multiple occasions over the past two years. So far, however, PIM has yet to gain traction, as the majority of users still prefer traditional CPUs, GPUs, and FPGAs.
In addition to GDDR6-AiM memory chips, SK Hynix plans to demonstrate its new HBM3 memory devices “with the best specifications for high-performance computing in the world.”