SK Hynix announced that it would demonstrate working GDDR6-AiM memory with computing capabilities at CES next month. The GDDR6 accelerator-in-memory technology is designed to speed up artificial intelligence and big data processing by bringing basic computational functions to memory chips.
SK Hynix's GDDR6-AiM chips can process data in memory at 16 Gbps, which makes certain computations up to 16 times faster than conventional approaches, according to the company. The chips are aimed at machine learning, high-performance computing, and big data processing and storage. These workloads do not always demand heavy compute performance per operation, but transferring data from memory to a processor takes time and consumes a great deal of power, so it makes sense to process the data where it resides.
The memory maker says that its GDDR6-AiM chips run at 1.25V and that using them reduces power consumption by 80% compared with applications that move data to a CPU or GPU. The chips are designed to be drop-in compatible with existing GDDR6 memory controllers, so it should be possible to use them even on existing graphics cards to boost performance in AI, ML, big data, and HPC workloads.
SK Hynix completed development of GDDR6-AiM in early 2022 but has so far demonstrated actual applications only a handful of times. It will therefore be particularly interesting to see what kind of device the company shows at the trade show.
SK Hynix is not the only memory maker experimenting with processing-in-memory (PIM) technology: Samsung has demonstrated HBM2 and GDDR6 memory with embedded processing on various occasions over the past two years or so. Even so, PIM has yet to gain broad adoption, as most users still prefer traditional CPUs, GPUs, and FPGAs.
In addition to GDDR6-AiM memory chips, SK Hynix plans to demonstrate its new HBM3 memory devices 'with the world's best specification for high-performance computing.'