To deliver best-in-class economics for AI inference acceleration in the datacenter.
64% employee growth in 12 months
The explosion in the field of AI puts significant strain on traditional computers, and the silicon architectures currently in use are a bottleneck on the speed and effectiveness of emerging AI models. d-Matrix aims to tackle this challenge with its digital in-memory computing accelerator for AI. Using a combination of machine learning software tools and modular chiplets, it delivers a large increase in AI computing efficiency without a corresponding increase in energy use.
d-Matrix has major competitors in the race to deliver the next generation of scalable AI solutions, including the likes of Nvidia, which is developing GPU architecture that runs at ten times the efficiency of existing models. However, the flexibility and easy scalability of d-Matrix's solution help it stand out, and its novel approach has gained a high degree of support from investors. The company is currently expanding its operations in the US, India and Australia, and is well poised for success as demand for AI compute solutions continues to grow.
Freddie
Company Specialist at Welcome to the Jungle
Sep 2023: $110m (Series B)
Apr 2022: $44m (Series A)
This company has top investors
Sid Sheth
(CEO) Previously Director at NetLogic Microsystems, Director of Marketing at Broadcom Inc and SVP/GM at Inphi Corporation.
Sudeep Bhoja
(CTO) Previously CTO at Inphi Corporation, Technical Director at Broadcom and Chief Architect at Big Bear Networks.