Oct 23 2025
Engineering a leap: AI at wearable scale
Read time: 5 mins

Takeaways

  • Meta Ray-Ban Display and Meta Neural Band represent breakthroughs in wearable AI technology, enabling new ways to interact with digital content.
  • Cross-functional collaboration was key to overcoming technical and design challenges, resulting in intuitive and accessible products.
  • Scaling Meta AI to run locally on small devices required innovative engineering to balance performance with limited resources.
  • Wearables like Ray-Ban Meta, Oakley Meta Vanguard, and display glasses are redefining how people engage with AI in everyday life.


Ever since he was a kid, Emanuel Strauss wanted to be an inventor. He grew his childhood interest in computers into a dynamic career in science and technology, even studying particle physics at CERN in 2012 when the Higgs boson particle was discovered.

“When I was still working as a research scientist, the discovery of the Higgs boson was such a huge moment for our field. It inspired me to think about what I wanted to do next. Data science was really taking off during that time, so it was a natural transition into industry. Meta has been an amazing place for me to learn and build things. I’ve spent the better part of a decade here, working on technical projects in machine learning and artificial intelligence.”

Today, Emanuel is the director of engineering for Neural Interfaces. Together, he and his colleague Kenan Jia, product manager, AI glasses, are part of the team that built Meta Ray-Ban Display and Meta Neural Band — two breakthrough products recently announced at Connect 2025.

“Meta Ray-Ban Display represents the most advanced AI glasses we’ve ever created,” explains Kenan.

“They build on top of all the features people love about Ray-Ban Meta glasses, including capture, music sharing and Meta AI, but now with a full-color, high-resolution display. When used with the Meta Neural Band, they help people stay more present and engaged. Together, these products will fundamentally change how people interact with AI and the world around them.”

Building a new category of wearables was a long and challenging process, one that drew on the expertise of multiple cross-functional teams.



Productizing research concepts for all

Emanuel describes Meta Neural Band as a “zero-to-one product” that originated as a research project. His team knew the EMG wristband could work well for individuals. However, they had to prove the technology could scale to help people everywhere control devices with simple gestures like finger taps, thumb swipes and wrist rolls — no touchscreen required.

“One of the biggest challenges was convincing our partners, and ourselves, that this was feasible. We built prototypes, iterated and incrementally worked our way toward designing a wristband that anyone can use. There was no blueprint. There were no other products like it for us to reference. Everything was done from first principles, which created a truly unique experience and set of challenges.”

Kenan also experienced the magic that happens when cross-disciplinary experts work together. She describes how product design, engineering and horizontal research teams came together on Meta Ray-Ban Display to perfect advanced features like the 42 pixels per degree (ppd) display and 2% light leakage. For Kenan, the most critical thing is that everything came together in a seamless user experience.

“Meta Ray-Ban Display is our first generation of display glasses; they offer a new way for people to interact with AI and digital content by keeping their heads up, eyes forward and hands by their sides. That means my team and I had to think about every user experience from the ground up. Chief among them was how to make sure the UI is intuitive and easy to use. We coordinated across research, UX, feature and product design teams to create a sleek experience that can work for anyone.”


Bringing Meta AI to new use cases

Scaling Meta AI down to run locally on wearable devices was another area where Emanuel pushed AI technology in new and interesting directions.

“We’re not running big, beefy servers built to handle huge LLMs. Our technology runs on tiny devices with limited compute, memory resources and battery life. However, we still need it to perform at the levels people expect. There are so many technical challenges that come with shrinking this technology down — battery management, running things less frequently, fitting the most powerful AI we can into a couple of megabytes. Coming up with clever engineering solutions makes my job fun! Our team is full of people who spend their days thinking about how to make things work with less and create more space for other features to coexist.”
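One common approach to fitting a model into a few megabytes is weight quantization: storing parameters as 8-bit integers instead of 32-bit floats and rescaling them at inference time. The sketch below is a minimal, hypothetical illustration of symmetric post-training quantization on a single weight matrix; it is not Meta’s implementation, just a generic demonstration of the memory trade-off the quote describes.

```python
import numpy as np

# Hypothetical example: quantize a float32 weight matrix to int8,
# shrinking its memory footprint 4x at the cost of small rounding error.
rng = np.random.default_rng(0)
weights = rng.standard_normal((1024, 1024)).astype(np.float32)  # ~4 MB

# Symmetric per-tensor quantization: map [-max, max] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)           # ~1 MB

# At inference time, weights are dequantized on the fly.
restored = quantized.astype(np.float32) * scale
error = np.abs(weights - restored).max()  # bounded by scale / 2

print(f"float32 size: {weights.nbytes / 1e6:.2f} MB")
print(f"int8 size:    {quantized.nbytes / 1e6:.2f} MB")
```

Finer-grained schemes (per-channel scales, 4-bit weights) push the trade-off further, which is the kind of engineering-for-less the team describes.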

Kenan agrees, sharing how working on wearables combines her lifelong interest in consumer electronics with her desire to impact how people and technology intersect.

“In using these products for myself, I’ve realized that wearables are more than just technology backed by research and innovation. They also tie closely into everyday life. Whether it’s Ray-Ban Meta glasses, Oakley Meta Vanguard, or our new category of display glasses, wearables are redefining how people interact with AI and technology more broadly.”

Kenan and Emanuel believe their work scaling AI for wearables sets Meta apart from others in the industry.

“A lot of the current field is focused on building bigger, better models that run across larger numbers of GPUs and compute clusters, but we’re pushing AI’s boundaries in the opposite direction,” shares Emanuel. “We’re going smaller. We’re shrinking things down. We’re making more and more powerful AI work in very limited settings. That provides a whole new set of opportunities for the field. Suddenly, AI that previously could only interface through a computer can now run on a phone or AI glasses. Our work is bringing AI to people in places and settings that were never possible before.”

Stay connected.