Sensors

Q&A: Omar Abed, TDK InvenSense CTO, talks smart glasses

Omar Abed, CTO at InvenSense, a TDK Group company, spoke with Fierce about the state of smart glasses adoption as a preview to his keynote address on May 6 at Sensors Converge 2026 in Santa Clara, CA. He says low-power components that can run multiple sensing algorithms at once are helping bring smart glasses technology to a tipping point.

Fierce: Why talk about smart glasses at Sensors Converge, and why now?  

Abed: AI is the fastest-growing disruptive technology we have ever seen, and its adoption is proceeding at a record pace. Given this, we believe smart glasses are one of the most natural ways for humans to harness the power of AI while staying engaged with the world around us. Imagine a personal assistant who sees everything from your point of view. Or receiving live translations while traveling. Or walking down a city street with the map in view, without having to lift your phone. Smart glasses have a lot to offer.

Fierce: So why aren’t we all wearing smart glasses now?  

Abed: There’s a gap between the reality and the vision. Bulk, style, and power demands keep people from wanting to wear them all day. There are privacy concerns that need to be worked out. Then there’s price, the need for applications to be developed, and more. In my talk at Sensors Converge, I’ll discuss how engineers, OEMs, and technology providers can work together to help bridge these gaps.

Fierce: What are the biggest tech advances that will shape the future of smart glasses, especially around sensors?  

Abed: Sensors and edge AI can help close these gaps, delivering performance at low power and at tiny size and weight, while also preserving privacy. People want to reach the ‘disappear effect,’ where they can put smart glasses on and within five minutes forget they’re wearing them. They also want to ensure that their privacy, and the privacy of people around them, is preserved. I’ll talk about how sensors are harnessing low-power techniques and edge AI to accomplish all of this.

Fierce: Are these various tech advances really at a tipping point now, ready to make smart glasses ubiquitous personal devices? I ask because only five years ago they still felt far from reality.

Abed: I do think the tipping point is now, and the results of that will become evident over the next few years. Our latest IMUs for smart glasses are already at a power and size that allow sensing via bone conduction, optical image stabilization, head-orientation tracking, and gesture detection. Twelve months ago, our customers had to choose a limited number of algorithms, but our latest IMUs can run multiple algorithms at once while still operating at ultra-low power. Plus, we’re delivering on sensor fusion at the edge with PositionSense, a combined IMU+TMR custom sensing solution for always-on absolute orientation detection.
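To make the kind of algorithm Abed describes concrete: head-orientation tracking from an IMU typically fuses a gyroscope (accurate short-term, but drifting) with an accelerometer (noisy, but gravity-referenced). The sketch below is a generic, minimal complementary filter for pitch, purely illustrative; it is not InvenSense's implementation, and the function names and the blend factor are this author's assumptions.

```python
import math

def accel_to_pitch(ax, ay, az):
    """Estimate pitch (radians) from a near-static accelerometer
    reading, using the gravity vector as a tilt reference."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step of a complementary filter for head pitch.

    pitch       -- current pitch estimate (radians)
    gyro_rate   -- angular rate about the pitch axis (rad/s)
    accel_pitch -- pitch inferred from the accelerometer (radians)
    dt          -- sample interval (seconds)
    alpha       -- blend factor: trust the gyro short-term,
                   the accelerometer long-term (value assumed here)
    """
    gyro_estimate = pitch + gyro_rate * dt   # integrate the gyro
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch
```

The appeal for ultra-low-power parts is that each update is a handful of multiply-adds, cheap enough to run always-on alongside other algorithms; production sensor fusion (such as absolute orientation with a magnetic sensor) is considerably more involved.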

It has been a long road. In preparing for this keynote, I realized it was 20 years ago that I first gave a talk about low-power sensors. Today, sensing solutions are ready to deliver not only low-power sensors but low-power products, by driving system power down.

Fierce: What should engineers and OEMs consider as they integrate sensors for smart glasses?  

Abed: For OEMs to succeed in smart glasses, they need to get the flywheel turning, and that will require some initial investment to address the concerns above. At the heart of this is creating stylish, wearable glasses while also addressing privacy concerns. Partnering with sensor and SoC providers will be key to delivering real products that customers want to buy.

Engineers should also consider the evolution of edge AI, and how sensor pre-inference and edge inference will be needed to approach the size, weight, power, and privacy targets required for the smart glasses market to take off.

Fierce: What is your expectation for adoption? Will it be sudden and dramatic?

Abed: In the last year we have seen a marked increase in the number of smart glasses OEMs we have begun working with. At CES, we saw a dramatic increase in the number of businesses highlighting smart glasses on the show floor. That foreshadows many new product launches in the category over the coming 12 months.

Actual consumer adoption is still early, but by the turn of the decade, I expect we’ll see the privacy and power targets needed to drive exponential growth beyond early adopters.  

Fierce: Where can people learn more about sensors in smart glasses?
