After more than a decade at Meta, Fei W. continues to be excited by the opportunity to innovate with technology and be challenged by his work. From collaborating with teams across Reality Labs to pushing the boundaries of what’s possible in AI, Fei is constantly developing his skills and pursuing new interests.
“Meta encourages us to switch teams and take on new opportunities, so it’s almost like working at a new company each time,” he explains. “I’ve had the chance to work on everything from blockchain-based stablecoin payment systems to the machine learning infrastructure for Facebook’s recommendation engine, and now building the future of AI experiences for Meta wearables.”
Today, Fei works as a software engineering manager. His latest challenge? Cracking the code on AI-enabled live translation in the Ray-Ban Meta glasses.
There are many reasons why glasses are an ideal form factor for everyday AI. But in the process of building live translation, Fei and his team had to overcome several unforeseen technical challenges.
“Glasses are a common enough accessory that most people already feel comfortable wearing them,” says Fei. “Ray-Ban Meta glasses allow users to experience the everyday benefits of AI in a completely hands-free and seamless way.”
Of course, finding a way to incorporate this emerging technology into a lightweight, comfortable glasses frame was just one of the obstacles that Fei’s team faced when developing live translation.
“When you’re building something completely new, you’re bound to experience some development challenges along the way,” he recalls. “We did a lot of troubleshooting to ensure the glasses could respond quickly to users over a sustained period of time without overheating. This allows users to hold a steady conversation without experiencing lags in the AI model’s response time. We also explored how we could best leverage the glasses’ microphone array to identify the person speaking while filtering out ambient noise. This was especially critical since people will be wearing these glasses in crowded and noisy environments. The glasses had to be able to screen out extra noise and only translate for the user and the person they’re speaking to.”
To help solve these complex challenges, Fei collaborated with multiple teams across Meta, including speech AI, systems engineering and hardware engineering. In addition to ensuring the glasses could accurately translate human speech, the teams also prioritized data privacy by running the AI models locally on the glasses, so the wearer’s speech data could remain on the device. This had the added benefit of enabling live translation to work even when the glasses are offline.
“Our glasses can help bridge that communication barrier and enable people to interact with the world around them in a more natural way.”
“We took any opportunity we could to demo the product and test its capabilities in real time,” shares Fei. “Not only did we frequently consult with in-house experts from differing technical backgrounds, but we also invited teammates to interact with the glasses and worked with native-language speakers across the organization to test live translation. This was invaluable for identifying issues early on and quickly scaling performance for a broader user base. Meta also gave us access to a wide range of resources to support our work, including large data sets for training and grounding.”
Today, live translation is available globally in all markets where Ray-Ban Meta glasses are sold.
Now that live translation is rolling out to a global user base, Fei is excited to see the technology’s impact on users.
“Most people hear ‘live translation’ and think about using the glasses when traveling in foreign countries, but the opportunity is so much bigger than that,” Fei says. “Our work is expanding the way people connect on a broader scale. Think about families where multiple generations speak different languages. You don’t have to go through a translation app or hand a phone back and forth. You can actually have a conversation.”
And the innovation doesn’t stop there. Fei is also looking forward to exploring future AI use cases, leveraging all of the resources at his disposal.
“One thing that I’ve always loved about Meta is how engineering-led and mission-driven we are. We’re encouraged to pursue interesting work and iterate quickly to test our ideas.
“If you can show how your concept will make a meaningful difference for users, Meta will give you the resources and support you need to push the limits of that technology.
“Our team works with many different kinds of technology—including multimodal AI, large language models and mixed reality—so as an engineer, I can consult a wide range of expertise to scale my ideas and build AI-enabled experiences for millions of people around the world. That kind of impact has been really special for me to experience during my time here at Meta.”
Equal Employment Opportunity
Meta is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. You may view our Equal Employment Opportunity notice here.
Meta is committed to providing reasonable support (called accommodations) in our recruiting processes for candidates with disabilities, long-term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support. If you need assistance or an accommodation due to a disability, fill out the Accommodations request form.