Sensors Converge panel sees more credible Edge AI apps emerging

A power panel of Edge AI practitioners from five competing sensor-related companies wrapped up Sensors Converge 2026 by pointing to significant growth in embodied AI applications moving into commercial production, citing examples from the SC26 event.

In 2026, “applications are more real about what is possible at these [limited] power budgets and those conversations were not there last year,” said Shreyas Derashri, chief commercial officer at Innatera. “What’s happening this year is we’re seeing apps and partners coming to us with credible apps. That’s definitely changed and things have come a long way.”

Abdullah Raouf, senior director of sales at Syntiant, said he saw a “lot of demos last year” at Sensors Converge 2025, but this year he heard about applications that are deployed and in production, with more people at the event from the education and business sectors. “There’s a lot more people here this year, so I think the interest level on embodied AI on just sensors in general and what the sensors are going to be able to do is getting higher and higher,” he said.

The panelists also described more partnerships between AI-focused companies and sensor makers to bring combined AI-and-sensor applications to market, as well as a focus on multimodal sensors used with AI.

Pete Bernard, CEO of EDGE AI FOUNDATION, moderated the final day’s panel on May 7 on the topic, “Embodied Intelligence: When Edge AI Gets Legs, Wings and Wheels.” Many people at the event used “embodied AI” interchangeably with the term “physical AI.” In addition to Raouf and Derashri, the other panelists on the Main Stage were Scott Smyser, vice president of sales and marketing at EMASS, a Nanoveu business; Steve Chen, senior sales manager at Himax; and Tharak Krishnan, director of technical product management at BrainChip.

In the multimodal area, Smyser said EMASS is using sensors and AI to monitor machine health in factory settings so a company can take action before a machine breaks down. “We have a reference design where we’ve got an IMU to measure vibrations, a barometer and temperature sensor for pressure and temperature changes and then also a microphone for noises,” he said. “So we can look at all those sensors to run some AI models and do inferences and say, ‘Hey, we picked up this anomaly, someone can check it out.’”

Chen said he also noticed “a lot more use cases” on the show floor at Sensors Converge. “There’s a lot more thinking into how to realize and embody intelligence,” he said. “People are thinking about how to deploy to the field more easily with battery-based products. I do see a lot of realized product this year, compared with last.”

The applications the panel described ranged from industrial to medtech to drones. Medtech was clearly a hot area. Krishnan described BrainChip’s work with Onsor Technologies to make AI-powered smart glasses worn by patients with epilepsy to predict seizures before they occur. The system, first announced in 2025, leverages neuromorphic computing and on-device machine learning to give users and caregivers real-time alerts.

Derashri at Innatera called out an application for fall detection in homes, particularly in elderly care. “We use radar to detect falls,” he said. “We process it through the spiking neural networks and make it really efficient. The key thing here is that…the place you need it the most to detect a person falling is actually your bathroom. Can you have a camera there? No. You’re going for a shower. You probably removed all your wearables that you’re wearing to detect a fall. You cannot rely on that. So what is the most secure, private thing you can install in your bathroom to detect a fall? …It’s quite a life-changing kind of a use case.”

Innatera is also working with partners who have developed artificial fingers and an artificial hand; running on the Pulsar chip, the system operates at sub-milliwatt power and with low latency because processing doesn’t have to loop through the cloud, Derashri said. “You can now get very reflex-based actions on the robotic arm,” he added.

EMASS is pursuing an application in drones with an always-on inference system that improves efficiency across IMU, GPS and imaging sensors. “We’ve done over 300 simulations and we can improve efficiency by 60%, and when I say efficiency, that means more flight time,” Smyser said. “That’s huge.” He predicted future drone uses including deliveries and survey work, among others.

Chen described an endoscopic capsule using a Himax AI chip, just 3mm x 5mm, with power low enough to capture eight hours of images inside a patient’s abdomen to detect medical problems. Another use would put the chip in a small camera with a 10-year battery life for wildlife preservation monitoring in the field, where it is not practical to change batteries frequently.

The panel ended with a high degree of confidence about the future of embodied AI, with panelists urging developers to learn as much as possible. Bernard pointed attendees to the EDGE AI FOUNDATION YouTube channel and other outlets for insights and education.

“Lean in and get more educated,” Bernard said. “I assure you, you are going to start learning a lot more of the value that these technologies will bring to your lives.” 

Edge AI panelists at SC26: Pete Bernard, EDGE AI FOUNDATION; Steve Chen, Himax; Abdullah Raouf, Syntiant; Scott Smyser, EMASS; Shreyas Derashri, Innatera; and Tharak Krishnan, BrainChip. The topic: “Embodied Intelligence: When Edge AI Gets Legs, Wings and Wheels.”
(Sensors Converge)
