UK Made: 3U VPX artificial intelligence accelerator

Author: EIS Release Date: Dec 9, 2020


Concurrent Technologies announced its first 3U VPX artificial intelligence accelerator board, based on an Intel Arria FPGA, earlier this year. It was developed in accordance with a proposed VITA 65.1 profile aligned with the SOSA Technical Standard, according to the company.

Called TR AEx/6sd-RCx, it is intended for inference-at-the-edge applications such as real-time object recognition and behaviour monitoring.
“Inferencing performance can be dramatically improved using TR AEx/6sd-RCx compared to a CPU-only solution based on Concurrent Technologies TR H4C/msd-RCx VPX server board,” said Concurrent. The company added that it can supply benchmarks showing throughput improvements of around 5x with the FPGA accelerator compared with a CPU-only solution, together with a five-fold reduction in latency.
The inferencing hardware is supported by Intel’s OpenVINO toolkit. As well as deep learning, OpenVINO supports OpenCV, an open-source library for computer vision applications, and OpenVX, an API for heterogeneous compute.

It supports AI frameworks including Caffe, TensorFlow and MXNet, along with neural network models such as AlexNet and ResNet.
“As AI technology has matured, interest is being picked up by the defence and energy exploration markets,” said Concurrent. “This AI accelerator engine enables a user to receive and process real-time actionable intelligence from vision, RF or other sensors, providing a solution to many problems faced at the edge.”
Concurrent Technologies is a public company quoted on the London Stock Exchange, headquartered in Colchester, UK.
It designs a range of high-performance Intel processor boards, switches, networking, storage and software products for use in embedded computing solutions.
All of its board products are made in Colchester.
It also has a US headquarters in Massachusetts, as well as additional design facilities in Bangalore, and sales and support in China.