The Reality Labs team at Meta is looking for Lead Computer Vision Engineers to support our engineering teams as we build toward our goal of helping more people around the world come together and connect through world-class Augmented, Mixed and Virtual Reality and on-the-go hardware and software. With global departments dedicated to AR, MR, and AI research, computer vision, haptics, social interaction, and more, we are committed to driving the state of the art forward through relentless innovation. The potential of AR, MR and AI to change the world is immense, and we're just getting started.

Meta is building the Metaverse, the "spatial internet" where immersive virtual worlds will coexist with the real world. Augmented and Mixed Reality will transform the way people come together to interact, work and play. By developing new hardware and software products capable of understanding the real world and the user within their environment, we aim to make it possible for people to interact with content in their environment and share it with others.

Our Reality Labs division explores, develops and delivers cutting-edge technologies that serve as the foundations for the Metaverse and other future Reality Labs products, such as Oculus headsets, future AR glasses and our FB Family of Apps (Messenger, Instagram, WhatsApp). From Visual Localization, SLAM, 3D reconstruction, Context/Semantic Understanding, Mapping, Tracking and Sensor Fusion, our team is focused on taking new technologies from early concept to product level while iterating, prototyping and realizing the human value and new experiences they open up.
Computer Vision Engineer (Leadership) Responsibilities
Minimum Qualifications
Preferred Qualifications
About Meta
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today: beyond the constraints of screens, the limits of distance, and even the rules of physics.
Equal Employment Opportunity
Meta is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. You may view our Equal Employment Opportunity notice here.
Meta is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, fill out the Accommodations request form.