
Meta has finally released the long-awaited Passthrough Camera API for Quest, which gives developers direct access to the headset’s passthrough RGB cameras for better scene understanding—kickstarting the next generation of more immersive mixed reality experiences on Quest.

Until now, Quest’s passthrough cameras were largely locked down, limiting developers to Meta’s built-in passthrough features. At Connect in September, the company said it would eventually release a Passthrough Camera API for Quest, though it didn’t say when.

Now, in the Meta XR Core SDK v74, Meta has released the Passthrough Camera API as a Public Experimental API, providing access to the forward-facing RGB cameras of Quest 3 and Quest 3S.

Image courtesy Roberto Coviello

Passthrough camera access will not only help developers improve lighting and effects in their mixed reality apps, but also let them apply machine learning and computer vision to the camera feed for things like detailed object recognition, making mixed reality content less of a guessing game about what’s in a user’s environment.
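Lighting is one concrete use case: with direct access to RGB frames, an app can sample the camera feed and match its virtual lighting to the brightness of the real room. The sketch below is purely illustrative — the frame format and the `estimate_luminance` helper are assumptions for this example, not part of Meta’s SDK, which delivers camera frames through its own interfaces.

```python
# Hypothetical sketch: estimating ambient light from a passthrough RGB frame,
# so virtual objects can be lit to roughly match the real room.
# The frame layout here (rows of 8-bit RGB tuples) is an assumption,
# standing in for whatever buffer the Passthrough Camera API provides.

def estimate_luminance(frame):
    """Return the mean relative luminance (0..1) of an RGB frame."""
    total = 0.0
    count = 0
    for row in frame:
        for r, g, b in row:
            # Rec. 709 luma coefficients, normalized to 0..1
            total += (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0
            count += 1
    return total / count if count else 0.0

# A uniformly mid-grey 2x2 "frame" yields roughly 0.5 luminance.
grey_frame = [[(128, 128, 128)] * 2 for _ in range(2)]
print(round(estimate_luminance(grey_frame), 2))  # → 0.5
```

An app might feed a value like this into its render pipeline each frame — dimming virtual light sources in a dark room, for instance — which is exactly the kind of effect that was impossible without raw camera access.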

When it was announced last year, former Meta VP of VR/AR Mark Rabkin said the release of Quest’s Passthrough Camera API would enable “all kinds of cutting-edge MR experiences,” including things like tracked objects, AI applications, “fancy” overlays, and scene understanding.

This marks the first time the API has been publicly available, although Meta has previously released early builds with select partners, including Niantic Labs, Creature, and Resolution Games—which are presenting today at GDC 2025 in a Meta talk entitled ‘Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development’.

Granted, as an experimental feature, developers can’t publish apps built using the Passthrough Camera API just yet, although it’s likely Meta is again taking an iterative approach to the feature’s full release.

The v74 release also includes Microgestures for intuitive thumb-based gestures (e.g., thumb taps, swipes), an Immersive Debugger so developers can view and inspect Scene Hierarchy directly within the headset, and new building blocks, such as friends matchmaking and local matchmaking.
