Guest(s): Kenan Jia, Product Manager; Emanuel Strauss, Director of Engineering, Neural Interfaces
Behind the scenes of Meta Ray-Ban Display, Meta’s most advanced AI glasses yet. In a previous episode Pascal met the team behind the Meta Neural Band, the EMG wristband packaged with the Ray-Ban Display. Now he’s delving into the glasses themselves.
Product manager Kenan Jia and engineering director Emanuel Strauss, from Meta’s Wearables org, join Pascal Hartig to talk about the unique challenges of designing game-changing wearable technology, from novel display tech to emerging UI patterns for display glasses.
You’ll also learn what particle physics and hardware design have in common and how to celebrate even the incremental wins in a fast-moving culture.
Pascal: When we first teased the Meta Neural Band back in episode 77, we had to awkwardly dance around one core question, “when will people actually get to use this?” Fast forward just two months and suddenly, boom. Zuck is on stage at Meta Connect, unveiling the wristband as part of the groundbreaking Meta Ray-Ban Display. Today, we are pulling back the curtain on how these futuristic technologies came together, and I'm thrilled to be joined by two brilliant minds who have been at the heart of integrating cutting-edge EMG wristbands and next level display tech into a seamless polished product.
Kenan and Emanuel, welcome to the Meta Tech Podcast.
Emanuel: All right. Thanks for having us.
Kenan: Thanks for inviting us.
Pascal: Okay. I want to first talk a bit about you before we talk about this really exciting product that was recently launched, Kenan. Can we maybe start with you? How long have you been at Meta and what did you do before?
Kenan: Yeah, so I have been at Meta for seven years, which is crazy. I actually have only been working on Reality Labs products, because I love consumer electronics. I joined around 2018. That's actually when we were doing our first homegrown hardware. If you still remember, it's Portal. So I worked on the first smart displays launched under our brand, which was Facebook at the time.
And it's really interesting, because that was prior to the AI age, so still voice-assistant-driven speakers and smart displays. And it was a lot of learning, both for myself and for our org, to see how we build a new brand and a completely new device from the ground up. And then I moved on to work on wrist wearables, including EMG and our other product explorations.
And most recently I worked on the glasses team and Meta Ray-Ban Display, which we all know and love right now. And what I did before Meta, this is where Emanuel and I actually share an interesting story. He will share it later. I actually came from a science background. I studied physics at Yale, and during my school time I had a professor who worked at CERN, the particle collider in Europe.
And I did research with him, and I figured out later that Emanuel was also a physicist and also worked at CERN. I don't think our time overlapped there, but we both worked on particle physics, which was very interesting, because you have a lot of new research beyond the current theory, which is the Standard Model.
So you're doing a lot of data analysis to search for particles beyond what we know today. There's a running joke that you often spend five to ten years proving that something doesn't exist. So that's where I figured out that I love research, but I also love a faster turnaround, both for impact and for development, whether on the product or the engineering side.
So after school I actually spent a few years in consulting, where I worked on tech and consumer products, both market and user research as well as product development. And then the rest is history. I love the building side more, and I also love consumer electronics, so the rest has been at Meta.
Pascal: There must also be a joke somewhere in there about, you know, unlikely collisions that happen in that setup. And also, I still have the Portal standing behind me. I'm not sure if we will have video versions of this podcast out for this particular one, but it's somewhere blurry in the background, and it served me practically every day during the height of lockdown. I really love my Portal. Okay, Emanuel, let's pass it on to you. How long have you been here and what did you do before?
Emanuel: Seven years is a long time. I'm clocking in at 11 as of this December. So I joined in December of 2014. The first seven years here were spent in what we call integrity. So this was working to protect users on our platform. A particularly interesting set of challenges spanning 2016 to 2020, if people want to think back to what was going on in the world around that time and the discussions we were having online about the role of social media.
So just an absolutely fascinating place to be, right in the middle of what felt like really important discussions, making really important decisions about things that affected people in really important ways. And also a great place to build out interesting technology to do that.
I built out our core ML functions so that we could have really well-powered, very well-leveraged systems to protect people at scale. So that was just great. And then the last four years have been here in Reality Labs, working on Wearables. Specifically, I work on the EMG program, and that was a foray into consumer electronics.
A brand new experience for me, but one in which I think there's a lot of transfer of knowledge and skills. I'm sure we'll touch on more of that later.
Pascal: That's fascinating to me, and I don't want to derail the entire conversation, but that is a pretty big move, and I feel like it is so illustrative of the culture that we have, where if you're boxed into integrity, that doesn't mean you can't do something like build some fancy AI glasses. What was it like for you to take that plunge?
Emanuel: Nerve-wracking. But you know, you're absolutely right. Meta really encourages people to have the flexibility to move around. And I think what was great about spending seven years in integrity was that I got to go very deep. I had a lot of domain knowledge. I think I was very good at doing what I was doing because I had spent so much time in that domain.
But, you know, people need change. And what was fantastic is that I didn't have to leave the company to find that change. I was able to go and take a chance on something internally. And you can have very open conversations with people about this. I was very clear with my manager, even my management chain, at the time, saying, you know, I'm feeling a little long in the tooth here, it's getting a little tiring, and you need something fresh.
And they were very supportive of me exploring internally. I had open conversations with people here too, and boy, was it exciting technology and exciting problems. And they were very welcoming of someone who had no neuroscience background jumping into that type of project. So that was great.
Yeah.
Kenan: Or you're a hardcore consumer electronics person like me. Maybe when I get to 11 or 14 years, I would change as well.
Emanuel: I admire your consistency, although things have changed a lot within, uh, RL over your time here. Yeah.
Pascal: Very true. Okay. Let's try and get us back on track, even though this is absolutely fascinating and I feel like I could spend an entire hour talking about career transitions. But I want to talk a bit about your particular involvement in this project. Could you first talk about your specific role and responsibilities when working on Meta Ray-Ban Display? Maybe, Emanuel, we can continue with you first.
Emanuel: Sure. So I'm an engineer here. I'm an engineering lead for the neural wristband product, and I work on EMG more broadly in the various different ways that we're considering using it in the future. When I joined, this was really about taking a new technology and turning it into something that you could put on people's wrists.
So my job has basically been: how do you take applied science, how do you turn that into a product, how do you deliver that product to users, and then how do you integrate it with all of the connected components? So I'm here to drive the execution and to drive the technical strategy from an engineering perspective.
Pascal: And there sure are quite a few components involved in this, so I am sure we will get to that. But Kenan, now let's go over to you. What exactly is your role and what are your responsibilities in this?
Kenan: I am the device PM lead for Meta Ray-Ban Display. As the device PM, one of the cool things is you actually get to follow the development of a product from concept to launch, so our roles and responsibilities also evolve at different stages of development. Very early on, you would be working on product definition: positioning, target audience, and the key value props or feature areas, thinking about what the most important things are for this new device or even new category of experiences. So that's a lot of working with marketing and user research early on to form an early hypothesis. And then in the middle development phase, you actually have more collaboration with hardware teams.
So that's around hardware architecture, selecting the chipset or components for a new device, which sets the foundation for the ceiling of your audio, display, or even computing for this device. And then you would also work a lot with industrial design, mechanical, or electrical engineering.
So if you're purely on the software side of Meta, you might not work a lot with these teams, but they are truly critical for consumer electronics. So that would be working a lot on product design trade-offs on the hardware side, which will impact the software experiences later on. And then around the middle stage, we work a lot more with the software experience teams.
They include software engineering teams like Emanuel's and our different feature teams, as well as software product teams working on messaging, calling, and AI, all the other experiences on this device, to bring that together. And then for that last sprint to product launch, it's really more about getting the program together, getting through launch readiness, dogfooding, all the iterations, which I know we're going to spend a lot of time talking about.
And then working with go-to-market teams, right? Pricing, channel strategy, how do you demo, how do you do press briefings? So yeah, I would say it's both a deep and a broad experience. That's why I'm still loving it and still learning through the process.
Pascal: For sure, and just for the people who might not have seen all the demos and would like to have some sort of visual in their heads as we talk about this, could you quickly describe what Meta Ray-Ban Display is?
Kenan: Yeah, so I can take a shot, and Emanuel, feel free to jump in. Meta Ray-Ban Display is our first-gen display glasses. It's actually very intuitive: think of it as building on top of what people already love about Ray-Ban Meta glasses, so that's listening, capture, and AI. But now you have a full-color monocular display integrated into the lens, so you can actually enhance a lot of the experiences you already have.
So for example, if you're taking a picture, now you've got a viewfinder to see what you're actually capturing, and you can do digital zoom and share it directly with your friends and family. Same thing for music: you can see album art and playback controls, and you can also have more integrations later on for music experiences and entertainment.
And then AI. I think this is one of the bigger differentiators, where you can have visual responses with a lot richer context. And then with the display glasses, you can also enable new experiences only possible with a display. So navigation is a big one: if you're traveling, you can get pedestrian navigation, which is everyone's favorite.
And you can also do live speech captioning or translation, which we have for Ray-Ban Meta, but audio only. Now you can actually see the text. So you can imagine this is just a starting point, and we're building on top of it. Most importantly, you have the Meta Neural Band, which you absolutely need to control the glasses: given you have the visual display, you have a completely new UI.
We need a very intuitive and very easy-to-use input model to work with the glasses. So that's what Emanuel and our EMG teams work on, making sure that integrates very well into the overall product.
Emanuel: Yeah. And as Kenan teed it up there, the moment you have this super rich experience on your face, the limits of your ability to control it become very apparent with historical ways of doing things. And so the Neural Band is all about giving people really powerful, discreet control over the compute that they're wearing, that they're carrying with them.
You can imagine all these things you can do. Great, well, you're going to want to navigate to get to them, and once you're within them, you're going to want to control them in much more subtle ways. And when I say subtle, I mean discreet little ways, but also discreet as in around you: not flailing around on the street with your hands in front of you doing things, or speaking out loud.
People want to be able to do things in a way that feels a little bit more personal. And so the Neural Band is about translating the signal that comes from the peripheral nervous system, the neuromotor activations. When you flex your fingers, you've got electrical signals coming up your arm.
And it's about reconstructing those things in a really high-fidelity, very accurate, very reliable way, so that people can, very comfortably and with confidence, but with small, discreet gestures, make magic happen within the product. And it reflects, I think, one of the places where Meta has a different vision of how people should be able to interact with the next generation of computing platforms.
I think it's very unique in the arc that we're trying to produce there. I'm sure we'll touch on that more.
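To make the signal-to-gesture idea concrete, here is a deliberately simplified sketch of the classic surface-EMG processing recipe: rectify the raw signal, smooth it into an activation envelope, then threshold. All function names and parameters here are hypothetical illustrations; the actual Neural Band relies on trained machine-learning models over multi-channel data, not a fixed threshold like this.

```python
import numpy as np

def emg_envelope(signal, fs=2000, win_ms=50):
    """Rectify a raw EMG trace and smooth it into an activation envelope."""
    rectified = np.abs(signal - np.mean(signal))  # remove DC offset, rectify
    win = max(1, int(fs * win_ms / 1000))         # moving-average window in samples
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def detect_gesture(envelope, threshold=0.5):
    """A gesture 'fires' when the muscle-activation envelope crosses the threshold."""
    return bool(envelope.max() > threshold)
```

In a real decoder, the thresholding step would be replaced by a model trained across many users and sessions, which is where the "high fidelity" and reliability Emanuel describes come from.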
Pascal: Yeah. To me this is absolutely fascinating, because I don't think too many people were actively, at least loudly, thinking about different input modalities before this particular product came out. But it feels like, at least with the benefit of hindsight, a prerequisite for creating a product like this. Because as you said, we need some discreet input mechanism, and so far everything, especially when it comes to AI, is either typing, which is really hard unless you want to track your fingers and then poke around right in front of you where at least the cameras can see it, or it's voice. And voice doesn't do particularly well if you are on the underground, on the subway, on a bus surrounded by other people, and you want to have a private conversation. So this is such a crucial piece of building this entire product.
Kenan: Yeah, by the way, to add: we actually have voice, right? And we have capacitive touch on the temple arm, so those modalities still exist. But as you use the glasses, you will know EMG is totally a game changer for discreet input.
Emanuel: And each one has like the right use for the right time. But we think EMG is pretty great for a lot of stuff.
Pascal: Exactly. We even have voice input for our laptops and it serves a purpose for, for many things, and voice assistants are there for a reason too, but it is great having the choice, figuring out what feels right in a given context.
Emanuel: Yep. Yep. Meeting our users where they're at. Absolutely.
Pascal: So let's talk just kind of broad picture because there are so many different things in there.
When you think about this entire project as a whole, all the different engineering pieces that went into it, what is like one
unique challenge that you will always remember from this project?
Emanuel: Oh, so, where to pick, because there were many.
Kenan: Yeah, there are many.
Emanuel: You, you know, I think one that really stands out to me, um, because it speaks to the subtlety of what you're building and like the complexity of what you're building. It also speaks to how collaborative the building had to be, um, for the band to work well.
The electrodes on the band must maintain good contact with the skin, so things like the stiffness of the band can dramatically change the performance. Even things like how the closing mechanism, the closure, the clasp functions can change those things. And there was one particular problem we ran into where, in fact, one of the earlier closure mechanisms we'd chosen wasn't performing up to spec. And you had teams across materials and hardware, you had the electronics and signal validation teams, all having to come together to really rapidly come up with a solution, fix it, and deliver something that was comfortable, that looked good, and that met all of those specs.
And you know, you wouldn't think, oh yeah, the little closing thing, critically important. So that one stands out. Yeah.
Pascal: In our chat about the neural wristband, they also talked about some alternatives they were dealing with in the beginning, and one was gel-based: the idea that you would then have to somehow moisten the contacts of your wristband. It sounds fun for a research prototype, but probably not something you want to send out to the world with a gel subscription, I suppose.
Emanuel: No. But actually, you're picking at one of the things that makes this, I think, a real tour de force, and I put a lot of credit on the people who founded the company that eventually built this out and was acquired by Meta, the people who came up with this idea in the first place and who drove its execution for quite a long time.
In a lot of medical settings where EMG is used, sure, you apply a gel, use wet electrodes, no problem. Those are great; they maintain all the conductive properties you want. Fantastic. But that's not a good consumer product, as you humorously pointed out. And so much of the challenge of building this was having dry electrodes that were comfortable, that you could wear, that people would not have skin-sensitivity issues with, that people would not have comfort issues with, of course. So yeah, that's one of the things that's actually very impressive about the technology that was built out here: that you can have a consumer product with this.
Pascal: And to draw a probably imperfect analogy, but I've recently got a chest strap, um, for heart rate tracking for running, and I found it actually incredibly hard to get it to reliably work. The electrodes there actually need to have some sort of conductivity, and I only got this in the first place because it is really hard on the wrist to get accurate heart rate readings.
And now I'm thinking: you are tracking something that has so many more dimensions, so much more depth in what you can get out of it, and you do this all without any additional conductivity, and it should just work out of the box. So how you pulled this off just seems absolutely nuts to me.
Emanuel: The magic is if people don't have to think about it.
Pascal: Yeah. Okay. Kenan, have you been able to think of one standout challenge among the many that you faced whilst building this?
Kenan: Yeah, I think I can pick one, but I'm secretly sneaking in two related things: one for the hardware team, one for the software team. I think one of the biggest challenges is the display and UI, and they are related. Just as context: the display architecture we use for Meta Ray-Ban Display is highly custom.
The most important components are, number one, a custom light engine here in the hinge area that projects light into the waveguide. We use a geometric waveguide, a very custom solution, which is fully integrated into the lens so you can't see it.
And given this new display technology, it has a lot of challenges, both making it great for visual quality and for visual comfort. So one of the bigger questions is how do you set the right KPIs for visual quality? As one example, you might have issues with color uniformity or text readability, like how sharp the text is, and do you get some artifacts around the border?
Of course, as a product manager or user, you want it to be as high as possible, but the challenge is that this is closely related to your manufacturing process: yield rate, volume, and cost. So the biggest question is how do you set an acceptable range that is great but also balances all the other trade-offs?
So at that point, you work with all the hardware architecture teams to run user evaluations of what's acceptable, while balancing all the other considerations. And then it comes to: okay, what mitigations do you have from the software side to make that look great? Because once this is set, all the rest of the work is on the software side to optimize.
So I think that's the challenge around the display from the hardware side. The challenge for UI and display from the software side is a way bigger issue. It's not just about looking great, right? As Emanuel was saying, this is a completely new interaction model. You have a limited set of gestures that work great, but how do you design the UI and all the experiences to fit into a very easy-to-use and very consistent model?
So this is where I think our teams iterated the most, even through the last few months going into launch. Initially we had a very early UI; some of us internally still remember it was actually one long array of app icons, similar to the phone model, which is somewhat familiar but a lot harder to navigate through and lower in efficiency.
So we had to come up with evaluations, feedback, and also principles to make sure it meets the standards. Through the feedback cycle we figured out, hey, we actually need to really improve task efficiency, right? It can't take people forever to get from one thing to another.
We need to make sure AI is more toward the center, because this is where Meta and AI are going, and that needs to be more front and center in the experience and the UI. And of course you need to make sure it has high design polish. So we did a lot of iterations on the UI side with the software teams.
And again, knowing some of the challenges around visual comfort and visual artifacts, how do you actually solve those challenges you might have from the hardware side? So these are the hardest challenges, and it brings together input teams, UI teams, all the experience teams, and hardware teams to make sure it's all a very solid experience, everything people see and interact with.
I think we're very happy with the result, but it took a lot of effort, and that was one of the biggest challenges on the product.
Emanuel: Yeah, actually, if I could just hop on to that for one second. I think people often like to hear what was the thing that went wrong, or the one big thing you ran into. But what was just described there is the long, dedicated, incremental, iterative work of making hard things actually turn into a product and achieving those goals.
And I think you captured it beautifully, right? There's the whole collaborative element, but boy, you're whittling down on these things. There's not the one thing; it's the whole: can you deliver on that goal? Those are the engineering challenges and those are the product challenges.
You captured it beautifully.
Pascal: How did you build a culture that manages to celebrate these incremental wins? Because as you say, most of the time it's easiest to just celebrate the firefighting, the big-bang events where somebody jumps in and rescues everything. Internally we call those the big SEVs. But what about all the work that goes into preventing this, creating these polished experiences, and putting in that kind of extra effort to make things just particularly nice?
Emanuel: Hmm. So you're right, there's a natural inclination to see the latest fire, to swarm on that, and to celebrate the heroic firefighting work that gets done. And it's worth celebrating, because people frequently do really difficult things under tough time constraints. But a lot of this is about engineering rigor.
I think you described it as celebrating it, and you're absolutely right. There are milestones that you have to hit, and when you hit those, it feels great and people get a lot of pride out of it. But I think the bigger thing is really the engineering practice. It's understanding that every week is a new week, and you build a clock into the program and you look at your progress.
Every two weeks, or on whatever granularity makes sense for the specific thing being worked on, and bringing a lot of rigor and discipline to doing that, to coming in and asking: how are we doing? What's going well? What's not going well? And then going, oh, that thing that wasn't going well is now going well.
Well, I mean, it took two months to get there. Wow. Fantastic. Right. And you know, there's something about congratulating people and making people feel recognized. But fundamentally, it's about having those structures in place, knowing what the problems are, and then focusing on them and driving them down.
Um, and I think that when you create that environment, definitely what I've seen is, uh, the scientists and the engineers that are working on those problems, like they get a lot of gratification just out of that. You've created space for them to do the stuff that they love to do, um, because people like to feel good about solving a giant fire, right?
Like, but it's tiring. It's exhausting. It's draining. And I think when you create the space for people to do that work, they just feel better, you know? And yes, absolutely, you also go, hey, giant milestone, and we'd all get together and celebrate it, because those are great moments.
As well. So it's the combination of those two things: create the space, and really value progress over time.
Pascal: And I want to pick up on something that Kenan said earlier about the feedback cycle. Can we talk a bit more about this? Because most of us here who have either been on the podcast or are probably listening to it, are coming from the software space. And the feedback cycle is usually just ship something in production, see what happens.
Um, I'm simplifying here. There are various more rigorous methods of doing this, but this model is just fundamentally incompatible with hardware. So what did it look like for you?
Kenan: Yeah, I think there are a couple of forums and also processes and tools we use. As Emanuel mentioned, you need to have those checkpoints and milestones, because for a very complex program like Meta Ray-Ban Display and Meta Neural Band, it all has to come together. I would say a few of the most important things: number one is dogfooding. This is where we have internal users using our products pre-launch. Eat your own dog food, test a lot of it. And this is where I think what really differentiated our launch is that we baked in so much time for dogfooding, actually over a year, starting from the very early alpha-build software, which was somewhat unusable, but people really cranked through.
And then you get the early feedback, because you have to have people using it in a naturalistic setting. This is where they give UX feedback, product design feedback, and also bug feedback for your engineering processes. So dogfooding, we run it as a very important program, and we have weekly assignments for the dogfooders, so they complete questionnaires that help inform the product design, especially on the software side, early on.
And then you also get your UX KPIs from the dogfooding. And what works really well at Meta is we have these big open groups where people can post their feedback, and it matters, right? Because one user posts one thing, and another person might say, "hey, I experienced the same thing."
And then the owners for those areas will see it and have to solve it. So I think this really brings visibility and also accountability to the process. And I think the other very important thing is bug filing. Again, very closely related, but this is more for the development teams.
So engineering, product managers, and designers, we all use it even more frequently. You just file bugs anytime you see something, and you need a very clear dashboard tracking that. This is very critical as you go into launch mode. So we actually have KPIs, not just UX KPIs, but also for quality and stability.
And then you have a more frequent check-in cadence for how we are burning down the bugs: which ones are launch-blocking, which ones we're okay to fix post-launch. So I think that's bug filing. And for software, we have a monthly software release cadence, which maybe seems long compared to the pure software side.
But I think for a pre-release device, it's actually a lot of work for all the teams to get ready. So those, I think, dogfooding, more frequent releases, as well as bug filing and burn-down, those are the critical iteration cycles. And using all of that in a very open culture, sharing your feedback, and having strong ownership, those are the most critical things.
Emanuel: Yeah, Kenan, I think there's another really big thing, and I think you're actually underselling what you and others have delivered here, which is that there was a lot about having product vision and conviction in that vision. And then there's a lot that you change and learn from users as you go.
You know, I remember reviews with you and others very early in the product where we said: this is how it's going to have to be for our users, this is what we think is going to be important. And to the point of the question, certain things you're not going to A/B test your way out of.
There was a lot of belief in certain fundamentals, and then you do the best that you can with that as you learn from your users. But yeah, that product conviction I thought was really, really important, and it showed a lot of real foresight.
Kenan: Yeah, and I would say this is probably the biggest difference between a new, hardware-heavy device and purely software: you can't A/B test. There are actually decisions we call one-way doors. Once you pass through one, you can't go back, but you only have limited information at the time.
So it's about how you make an informed decision. To Emanuel's point, sometimes you try your best to collect all the data and do the evaluation, but there's also a lot of conviction and intuition, almost at a principle level. Those are very critical, and you get feedback from dogfooding and from external demos.
I forgot to mention that we did a lot of external demos from very early on, with VIPs and press. Internal and external demos all come together to help us iterate.
Emanuel: Mm-hmm. Yep.
Pascal: And just to double down on the open culture bit, because I think Meta is fairly unique in that respect when it comes to hardware products, in that we have such an open system. The dogfooding for these new hardware products isn't even limited to a particular org like Meta Reality Labs; anyone can just sign up for it and fill out a little questionnaire
saying how much time they're willing to dedicate. I have quite a few VR headsets behind me that are from dogfooding programs. I didn't quite manage to commit myself to dogfooding this one because I'm already doing quite a lot on that front. But getting the insights from all these different user groups, not just software engineers, is incredibly important, because we have a very specific lens for looking at UIs and problems.
I think most software engineers know, when they talk to their parents to explain a certain concept, how some of the abstractions they use don't quite resonate on the other side. I feel like it's quite similar when you, as a software engineer, use a certain product and give feedback on it. So yeah, looking at it from all angles is super important.
When it comes to these one-way doors, where you said you had to just kind of have the conviction that you made the right choice, can you think of certain trade-offs you were considering where you ended up on one particular side and had to make that call?
Kenan: I think you see a lot of one-way door decisions especially on the hardware design side, which has downstream implications for software experiences. So one example is battery selection, right?
Of course, all your software teams are gonna come over and say, bigger battery, the more the better. Same with memory, the more the better. But this is where you also have to balance, for example, the battery size. We increased the density of the battery format for Meta Ray-Ban Display, but there are still a lot of trade-offs around size and form factor,
which is absolutely critical, 'cause people want it to be as lightweight, as thin, and as comfortable as possible. So we did a lot of evaluation early on to try to model out the power consumption and the key use cases, right? For peak power, for maximum runtime usage.
And again, you try to make your best evaluation, then add some buffer. But once you select the battery size, which balances out with all the mechanical design, everything downstream needs to be software optimization. So I would say we still have a lot of learning on how much we can improve from the software side.
Then for the next generation, you have actual data from actual usage, and that informs your next round of one-way doors. But I think this is where it's hardest for our gen-one display product launch, 'cause you just do not have any real usage data. So kudos to our engineering team and all the design teams; it was a lot of work. And after that decision is made, it's all optimization from then on.
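To make the kind of power modeling Kenan describes concrete, here is a minimal sketch of a runtime estimate built from a weighted usage profile. All numbers, the profile categories, and the buffer fraction are hypothetical illustrations, not Meta's actual model or figures.

```python
# Toy power-budget model: estimate runtime from battery capacity and a
# weighted average current draw across hypothetical usage modes.

def estimated_runtime_hours(capacity_mah: float,
                            avg_draw_ma: float,
                            buffer_fraction: float = 0.1) -> float:
    """Runtime after reserving a safety buffer of battery capacity."""
    usable = capacity_mah * (1.0 - buffer_fraction)
    return usable / avg_draw_ma

# Hypothetical mixed-use profile: (average current in mA, share of wear time)
profile = [
    (400.0, 0.05),  # peak: display and camera active
    (150.0, 0.25),  # display on, light interaction
    (40.0, 0.70),   # standby, sensors only
]
avg_draw = sum(ma * share for ma, share in profile)  # weighted average draw

runtime = estimated_runtime_hours(capacity_mah=300.0, avg_draw_ma=avg_draw)
print(f"avg draw: {avg_draw:.1f} mA, estimated runtime: {runtime:.1f} h")
```

The one-way door dynamic shows up here: once `capacity_mah` is fixed by the industrial design, the only levers left are the per-mode draws, which is exactly the "everything downstream is software optimization" point.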
Emanuel: Yeah, so on the side of the wristband, we crossed a couple of important one-way doors in finalizing the actual industrial design. Fundamentally, for this to work, the sensors have to maintain contact with the skin. Again, this is the whole challenge between wet and dry electrodes and that kind of thing.
And at the same time, you want something that's beautiful and comfortable. The more those electrodes protrude from the band, the better the signal, but you're trading off on looks and, to a certain extent, comfort. And since this was a first-of-its-kind product,
there was a lot you had to guess about what is good enough, right?
You had to use knowledge from the neuroscience domain, but you're doing this in a consumer product, and so there were, very frankly, contentious discussions about what consumers would be comfortable with and what we needed. Fundamentally, we had to make a pick and run with it, and there was no, oh, let's see how the market responds
and then make a small modification. We can learn from that for our next generation, which will be more flexible and better in a lot of different ways, but we just had to make calls there. So I think that was an important one. Another set of important one-way doors that people often don't think about as one-way doors is how you spend your time,
'cause frequently you'll do something and you go, technically we can flip a switch and change this in the future. But that clock is running down to when you need to launch, and at some point the decision you've made or not made is just what you're going to go with. So there are a lot of implicit one-way doors that you run into about how you're going to implement the thing or how you're going to dedicate your resources, and those kind of happen every day.
Pascal: Great answers. One thing about the whole development cycle I want to understand a little better: you mentioned that you had to iterate really quickly as things came together, with different components landing at a fairly late stage, as you say, getting closer to the launch and having to make decisions.
How did you ensure that, as quickly as things came together in the end, you still achieved the right level of polish to get to a mass-market product?
Emanuel: So I think this is part of Meta's superpower. Time and time again, and we've both been here for a number of years, I've seen this company, in the software world and in the hardware world, go: wait a second, this is important, there's not a lot of time left, but we're going to rally around this and we're gonna make it happen.
And so part of that is a cultural thing, part of that is a process thing, but it's a willingness to get people to come and surge on things if they're really important, to just pour our hearts into it and make it happen. And sometimes that happens as a result of starting to learn how people are responding to the product in dogfooding, things like that.
And we go: some of those things that we thought were good, that maybe we had conviction on, were not as great as we need them to be. So interestingly enough, the willingness to go surge is sometimes in service of the quality. Right? If we realize that the way the interface looks, or the way people navigate through it, or the way they interact with different experiences just is not working and will not be delightful, then you go and you fix that. So I think that's a really big, important part of it.
Kenan: Yeah. And a lot of grinding in that process, to put it bluntly, because I think we mentioned that you celebrate the wins. To be honest, we really, really celebrated after launch when we saw all the reactions; we still celebrated the wins, right? But it's a lot of hard work for all the teams. And one thing, in addition to the willingness to surge, is that at an org level, we actually put quality and polish as one of the top goals.
'Cause if you look at the product development milestones, you can hit a lot of things, right? And you have a hard schedule; everyone wants to ship. But I think it's also about being very open and willing to call out: hey, this is not good enough, right? It comes from both the leadership level and the team level, right?
With user feedback, and also just with the gut feeling that, hey, this is still not there yet. So I think that's very critical. We're also seeing the good feedback now, like, hey, it really works, and now for future generations we have to continue this high bar for quality and polish. So not an easy process.
Emanuel: Yeah. It's interesting 'cause it really is these two prongs, right? There is the regimented part: you define your quality metrics and you drive them where they need to be. And you are filing issues, people are finding bugs, and you're driving those down, and there's a whole development process around that.
That's the crunchy, nitty-gritty, build-a-high-quality-product stuff. And then at the same time, you really gotta love this product. You have to want it to succeed. And from the top down, from our CEO down, people want it to be amazing. And when you realize that despite all these metrics, you're not where you want to be, you go and you just tackle it.
So these two things have to go hand in hand. Sometimes it's hard to balance, but you have to do both to make it work.
Kenan: Yeah, and I think we know it, right? We're all users, so if you are not willing to put this on and have a great experience, there's something wrong. So I think that's why the dogfooding culture comes back: we have to use it as end users.
Emanuel: Mm-hmm.
Pascal: As a UI engineer, I can't resist asking a little follow-up question about what you just said about polish when it comes to the UI elements. Were there any specific principles that you found worked particularly well? Because you talked about the iteration, and this is an interface that has really never existed before, especially with the different input modality and taking up a kind of small portion of the visual real estate you have in front of you. So were there certain things that worked particularly well for displaying information to users, and maybe things that did not work at all that you thought would?
Kenan: Yeah, I think we mentioned a couple of principles earlier. One is around UI navigation, and the others are around visual quality. On navigation, this is where I mentioned that very early on we had this long app array, right? Which is familiar in a good sense; it's intuitive. But what we figured out is people need a good mental model to reduce the cognitive load, right?
If it takes very long for them to complete a task, that's not a good user experience. So we had to think about, hey, how do we improve the design so that people can very easily get done whatever they wanna get done, whether it's replying to a message, going to take a picture and share it with their friends, or asking Meta AI.
So we actually went through this design process with very clear KPIs on task completion time, efficiency, and ease of use. And our UI design changed from this app array model to a three-panel model, where you have a very clear home, which has AI at the center and also your top notifications, like reminders. Then to the left you have your control panel, where you change your volume and brightness or get quick access. And to the right side is the app center, where you see the apps and experiences. So again, it's still a visual UI, you need some navigation, but I think it's a much clearer and easier mental model for people to internalize.
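The three-panel mental model Kenan describes can be sketched as a tiny navigation state machine. The panel names come from his description; the gesture names and their left/right semantics are assumptions for illustration, not Meta's actual implementation.

```python
# Three-panel model: control panel on the left, home in the center,
# app center on the right. Gestures move one panel at a time.

PANELS = ["control_panel", "home", "app_center"]

def navigate(current: str, gesture: str) -> str:
    """Return the panel reached from `current` by a single gesture."""
    i = PANELS.index(current)
    if gesture == "swipe_left" and i < len(PANELS) - 1:
        return PANELS[i + 1]   # reveal the panel to the right
    if gesture == "swipe_right" and i > 0:
        return PANELS[i - 1]   # reveal the panel to the left
    return current             # ignore gestures that would run off the edge

state = "home"
state = navigate(state, "swipe_left")  # from home to the app center
```

The point of the flat three-state layout is exactly the cognitive-load argument above: from home, every destination is one gesture away, so the model is easy to internalize.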
And this is where I think the ease of use and task efficiency really came in. Then, coming back to visual quality, that part is related to what we talked about on visual comfort, right? Can I use this comfortably for over an hour, or even 30 minutes or 10 minutes, for different types of content?
Can I see things clearly, right? Because with this display, you actually see the text or content overlaid on top of the real world. So we got a lot of feedback around, hey, is the text somewhat blurry? Is it harder to focus? And we have design principles around making sure people can wear it comfortably and see things clearly. This is where you have KPIs for some of the spec areas we talked about: text readability, sharpness, any visual artifacts. Early on you have hardware design or process changes to improve it, but later on it's gonna be in the UI.
Like, how do you change color? How do you apply some UI treatment to make sure people can use it comfortably for a long time? So we got a lot of feedback early on and tried our best to improve. And I think the feedback from actual users post-launch confirmed we made the right changes, and there's still a lot of improvement to go from here.
Pascal: I'm sure, because all of this is brand new. Nobody has done this before, so we will all probably learn a lot over the coming years about how to do this in the best possible way for our users. Okay, we are slowly getting close to time, but I wanted to ask if there's a feature that you're really looking forward to bringing to Meta Ray-Ban Display in the future.
Emanuel: Well, yes, this was teased at Connect, and we are actively working on the next update, containing handwriting, as part of our early access program. So right now, if you use the Neural Band and the display glasses, you can navigate up, down, left, and right; you can select and kind of go back.
And these are important discrete gestures, very rich already, but we really think the magic here is gonna come through what we call high-bandwidth communication. Very rich, intentional communication. And people are going to get to try this before too long, and I think it's gonna blow people's minds when they can just sit there.
No pencil, no digital anything, no, you know, camera tracking. People are just gonna be able to sit there and think, I want to communicate with my device, and I am going to tell it by writing with my fingers what I want it to know, and it's going to know that. And then I can do things with that.
And, I mean, I think it's just wild.
Pascal: It sounds like pure science fiction at this point.
Emanuel: It is. Yeah.
Pascal: I do wonder how long it's gonna take before this is just another thing that happens next to us on the bus.
Emanuel: Boy, I mean, I think success would be it becoming banal, right? If people just go, oh yeah, sure, no biggie. But I think when people first see it, it's gonna be a real wild moment.
Kenan: Yes, I can confirm that, because we did a lot of demos where we had people try handwriting, and they got it on the first run, right? They had never used it; they just wrote naturally with their fingers on any surface, and it worked. So I think it has to be a very big feature, and surprising in a good way.
And it's getting integrated into our features, right? Like messaging and, in the long term, different input surfaces. And yeah, there are a lot of things to be excited about down the road. We're gonna have more content integrated, and ideally in the long term we'll have a lot more developers coming to help develop for Meta Ray-Ban Display.
We opened up the device access toolkit for Ray-Ban Meta glasses, so that's something we hope to enable in the future. I think that's gonna be really exciting as people see these products in the market and then have their own creative ideas, maybe ones none of us have ever thought of, and bring those experiences to our glasses.
I think that will be very exciting in the future.
Emanuel: A hundred percent.
Pascal: Ah, there are so many more questions I would love to throw at you, but unfortunately we're running out of time. So at this point I can only thank you for making sci-fi a reality and so openly sharing about your development of Meta Ray-Ban Display. Thank you so much, Kenan and Emanuel, for joining me here on the Meta Tech Podcast.
Emanuel: Thanks for having us.
Kenan: Thank you.