Author: EIS
Release Date: Feb 21, 2025
Positron, a two-year-old Reno-based startup, has raised $23.5 million to pursue its FPGA-based approach to AI inference chips.
Backers included Valor Equity Partners, Atreides Management, Flume Ventures and Resilience Reserve.
Positron specialises in transformer model inference. “From the very beginning of the company, philosophically, we wanted to have these really small, iterative goals,” says founder and CTO Thomas Sohmers. “We fundamentally thought that transformers and this new generative AI craze were going to completely change computing. And so far, it’s paying off.”
Using Agilex FPGAs allowed the company to ship hardware 18 months after it was founded. Its Atlas V1 device uses eight Agilex 7 M-Series FPGAs, and the company claims a 2x improvement in performance per dollar and a 3x improvement in performance per watt compared to Nvidia chips.
Positron’s Atlas V2 device targets a 1.77x performance gain over Nvidia’s H100 and up to 3.3x more performance per dollar.
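For readers unfamiliar with how such multiples are derived, the sketch below shows the standard performance-per-dollar and performance-per-watt ratio calculation. All throughput, price, and power figures in it are hypothetical placeholders chosen only to illustrate the arithmetic; they are not Positron's or Nvidia's published numbers.

```python
# Illustrative only: how perf/$ and perf/W multiples are typically computed.
# Every number below is a hypothetical placeholder, not a vendor figure.

def perf_per_dollar(tokens_per_second: float, system_price_usd: float) -> float:
    return tokens_per_second / system_price_usd

def perf_per_watt(tokens_per_second: float, power_watts: float) -> float:
    return tokens_per_second / power_watts

# Hypothetical GPU baseline vs. hypothetical FPGA-based system.
baseline = {"tps": 10_000.0, "price_usd": 200_000.0, "power_w": 5_000.0}
candidate = {"tps": 10_000.0, "price_usd": 100_000.0, "power_w": 1_667.0}

dollar_gain = (perf_per_dollar(candidate["tps"], candidate["price_usd"])
               / perf_per_dollar(baseline["tps"], baseline["price_usd"]))
watt_gain = (perf_per_watt(candidate["tps"], candidate["power_w"])
             / perf_per_watt(baseline["tps"], baseline["power_w"]))

# With these placeholder numbers the ratios come out to roughly 2x and 3x.
print(f"perf/$ gain: {dollar_gain:.1f}x, perf/W gain: {watt_gain:.1f}x")
```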
Positron’s hardware is compatible with Nvidia’s software ecosystem, including the Hugging Face Transformers library.
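As a rough illustration of what compatibility with the Hugging Face Transformers library means in practice, the sketch below runs a standard text-generation workload through the library's stock API. The model checkpoint and prompt are placeholders, and nothing here is Positron-specific; it simply shows the kind of off-the-shelf transformer inference workload such hardware is meant to serve without code changes.

```python
# Generic Hugging Face Transformers inference sketch (placeholder model).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder checkpoint; any causal LM from the Hub works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "FPGA-based inference accelerators are"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding of a short continuation.
output_ids = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```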