The Central Integrity team aims to protect people and communities from harmful experiences on our platform. We accomplish this goal by building support experiences for the people who use our family of products, creating AI systems that detect malicious content, and establishing human review systems for the thousands of community operations agents who help care for our global community.
The Integrity Human Ops Platform team is building the tools and systems used by human reviewers and community operations agents across the globe to keep our community of billions of users safe. Our centralized perspective spans Facebook’s range of user-facing products and services, with the goal of allowing Facebook to implement trust and safety at scale.
Our Central Integrity Research team conducts rigorous, human-centric research geared toward safeguarding people and communities from harm and empowering them to identify harmful content and behavior when it arises. We employ both qualitative and quantitative methodologies in our research to inform strategic product execution and policy development, as well as to identify ways to minimize violations such as spam, hate, terrorist propaganda, and harassment. Our primary research ranges from formative to evaluative, using approaches such as field research, in-depth interviews, quantitative surveys, and statistical modeling.
For this position, we are looking for a Qualitative Researcher experienced in conducting interviews, contextual inquiry, task analysis, and other qualitative methods who can help us deeply understand who the customers of our content moderation tools are, how best to capture their journeys, what their needs and pain points might be, and how to ensure that content reviewers make the right decisions consistently. Ideal candidates will have a demonstrated ability to work closely with quantitative researchers and data scientists to generate comprehensive insights, as well as experience conducting rigorous, creative, impactful end-to-end primary research with the goal of informing broad strategy across tools and teams. They will work within a flat, fast-moving organization, collaborate with cross-functional partners, and focus on protecting the safety and well-being of the billions of people who use our platforms.
Qualitative Researcher, Integrity Responsibilities
Work closely with cross-functional partners (product management, data science, design, engineering) to identify and prioritize knowledge gaps in how content moderation tools are used to protect people from harm at scale, taking a customer-centric approach
Design and execute end-to-end custom primary research using a wide variety of qualitative methods, and interpret analysis through the lens of business impact, UX, HCI, human factors, and/or social science
Uphold a rigorous research standard while working on complex systems and problems
Establish usability frameworks and best practices to scale impact
Adapt a range of methods creatively to the research question at hand, and weave together qualitative and quantitative insights for a data-informed understanding
Deliver actionable insights that shape how product teams think about medium- and long-term product strategy
Communicate results and illustrate suggestions in compelling ways
Manage and prioritize research plans in ambiguous and fast-changing environments
Mentor and collaborate closely with other researchers
Minimum Qualifications
Master's or Ph.D. in Human Computer Interaction, Cognitive or Experimental Psychology, Anthropology, Human Factors, Sociology, Information Science, or a related field
Experience in applied product research with a human-centered perspective
Experience conducting usability studies, task analyses, heuristic evaluations, user journey mapping, experience maps, personas, or requirements analyses
Experience working with multi-method studies and working with quantitative researchers and/or data scientists
3+ years of experience conducting research in an organization or consulting environment, including working with key stakeholders to understand and clarify their research needs, and communicating analyses to technical and non-technical audiences
Preferred Qualifications
Experience conducting research on internal company tools and processes and with internal users
Experience with survey design, sampling, deployment and analysis
Experience with, and a desire to continue doing, research on integrity, safety, or well-being topics
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.
Meta is committed to providing reasonable support (called accommodations) in our recruiting processes for candidates with disabilities, long term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support. If you need support, please reach out to firstname.lastname@example.org
(Colorado only*) Estimated salary of $157,000/year + bonus + equity + benefits
*Note: Disclosure as required by sb19-085(8-5-20)