
As Demand for Fast AI Tokens Grows, D-Matrix Develops Fast NIC

D-Matrix is responding to growing demand for low-latency AI token generation with new accelerator and NIC silicon. Its Corsair inference accelerator uses a compute-in-memory scheme to raise memory bandwidth, while a 3D-stacked DRAM technology targets greater memory capacity and efficiency. The company has also introduced Jetstream, a PCIe Gen5 NIC chip designed to sustain 400 Gbps at low latency, speeding communication between nodes in distributed inference systems. Together, these efforts aim to break the memory and communication bottlenecks facing evolving AI workloads.
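To put the 400 Gbps figure in context, a quick back-of-envelope sketch: that line rate works out to roughly 50 GB/s, so even modest tensors move between nodes in tens of microseconds of ideal wire time. The payload size below is a hypothetical example for illustration, not a D-Matrix figure, and the calculation ignores protocol overhead.

```python
# Back-of-envelope wire time over a 400 Gbps link (rate from the article).
# Payload sizes are hypothetical illustrations, not D-Matrix figures.

LINK_GBPS = 400                      # Jetstream line rate per the article
BYTES_PER_SEC = LINK_GBPS * 1e9 / 8  # = 50 GB/s

def transfer_us(payload_bytes: int) -> float:
    """Ideal (zero-overhead) transfer time in microseconds."""
    return payload_bytes / BYTES_PER_SEC * 1e6

# e.g. shipping a 1 MiB activation tensor between two accelerators
print(f"{transfer_us(1 << 20):.1f} us")  # ~21.0 us
```

Real-world latency also includes PCIe, NIC, and protocol overheads, which is why the article emphasizes low-latency design rather than raw bandwidth alone.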

