BrainChip Execs Talk Advantages of Putting AI on Sensors at Edge

BrainChip Holdings

BrainChip Holdings Ltd. (ASX: BRN, OTCQX: BRCHF), a leading provider of ultra-low power, high-performance AI technology, will present the Expert Bar session “Can You Put AI at the Sensor? (Not the Edge of the Cloud!)” at the Embedded Vision Summit on May 27 at 11:30 a.m. PDT. The virtual presentation will be broadcast live and will also be available on-demand for attendees of the event.

The BrainChip team will help viewers better understand the requirements of sensors at the edge and how the challenges of traditional machine learning make it difficult to properly enable these devices. A solution that leverages advanced neuromorphic computing as the engine for intelligent AI at the edge can address critical problems such as privacy, security, latency, and low power requirements, while providing key features such as one-shot learning and computing on the device itself, without dependency on the cloud.

“Cloud use for AI might be effective in a data center setting but relying on it for the millions of edge sensors being deployed in emerging ‘smart’ markets is a recipe for disaster,” said Anil Mankar, Chief Development Officer at BrainChip. “How do those devices overcome latency in communicating with the cloud? Next-generation AI needs a solution that will provide resources to edge devices. We look forward to sharing with attendees of the Embedded Vision Summit how our Akida Neural Processing Unit has been developed to address these concerns and provide true device intelligence without the need for the cloud.”

BrainChip is delivering on next-generation demands with efficient, effective AI functionality. The company’s Akida neuromorphic processors are advanced neural networking processors that bring artificial intelligence to the edge in a way that existing technologies cannot. The solution is high-performance, small, and ultra-low power, and enables a wide array of edge capabilities. The Akida neural network system-on-chip (NSoC) and intellectual property can be used in applications including Smart Home, Smart Health, Smart City, and Smart Transportation. These applications include, but are not limited to, home automation and remote controls, industrial IoT, robotics, security cameras, sensors, unmanned aircraft, autonomous vehicles, medical instruments, object detection, sound detection, odor and taste detection, gesture control, and cybersecurity. The Akida NSoC is designed for use as a stand-alone embedded accelerator or as a co-processor, and includes interfaces for ADAS sensors, audio sensors, and other IoT sensors. Akida brings AI processing and learning capability to edge devices, enabling personalization of products without the need for retraining.

Since 2012, the Embedded Vision Summit has been the premier conference and expo devoted to practical, deployable computer vision and visual AI. The Summit is organized by the Edge AI and Vision Alliance, an industry partnership operated by BDTI. Additional information about the event is available at

For more such updates and perspectives around Digital Innovation, IoT, Data Infrastructure, AI & Cybersecurity, go to
