OpenXR tagged news

Designed with a developer-first approach, the Snapdragon Spaces XR Developer Platform combines proven technology with an open ecosystem for headworn Augmented Reality (AR) devices, enabling immersive experiences that seamlessly blur the lines between our physical and digital realities and usher in a new frontier of spatial computing. Snapdragon Spaces is based on the Khronos OpenXR specification to enable application portability, and is the first headworn AR platform with an OpenXR-conformant runtime optimized for AR glasses tethered to smartphones.

Ultraleap launched its fifth-generation hand tracking platform, Gemini. The platform features:

    • Improved two-handed interaction
    • Faster initialization and hand detection
    • Improved robustness to challenging environmental conditions
    • Better adaptation to hand anatomy

Ultraleap’s OpenXR API implementation can be found at

During this tutorial, held at ISMAR 2021, four world-renowned leaders in standards development with a focus on Augmented Reality presented on open standards for AR interoperability. Proceedings from the session, including presentations and video, are now available. A website for the ISMAR 2021 tutorial, organized by Christine Perey with the support of the Khronos Group, ETSI, and NIST, on the topic of Interoperability and Standards for AR, is now live.

The annual Auggie Awards have been the most recognized AR & VR industry awards in the world since 2010. OpenXR has been nominated for transforming the way developers deliver true cross-platform AR/VR experiences. Now is your chance to vote for OpenXR in the Best Developer Tool category!

RedGamingTech’s Paul Eccleston sits down with Khronos President Neil Trevett to discuss the technologies Khronos has been working on with Vulkan, including ray tracing, as well as revolutionary new tech such as glTF, 3D Commerce, and OpenXR, which has potentially huge ramifications for virtual reality. They also cover the current roadmap for Khronos standards, and more…

The field of 3D computer graphics has grown from a niche technical curiosity in the mid-1970s to mass appeal and distribution via movies and games. We’ve seen applications grow from flying logos, to highly engaging real-time renderings in games, to synthetic humans and de-aged actors in movies that finally cross the “uncanny valley” to become nearly indistinguishable from reality. However, the creation of 3D assets - computer graphics objects and the worlds they inhabit - still requires highly skilled technicians and artists, presenting a bottleneck to more widespread applications, such as creating 3D graphics for websites and e-commerce. LiDAR has the potential to alleviate this bottleneck.

The Virtual Learning Factory Toolkit (VLFT) project is a pioneering program commissioned by the European Union using virtual and augmented reality to enhance engineering education programs across Europe. Five EU partners make up the VLFT Consortium, including Estonia’s Tallinn University of Technology, Hungary’s Institute for Computer Science and Control, Italy’s Politecnico di Milano, National Research Council of Italy, and Sweden’s Chalmers University of Technology. Using Khronos® standards, the VLFT consortium has created a suite of tools to gamify learning, strengthen information and communication technology (ICT) skills, and better prepare students for jobs in 21st century manufacturing.

Phasmophobia has outlined plans for future updates with new content, an overhaul of the progression system, and OpenXR support. Swapping from OpenVR to OpenXR will allow support for new headsets and controllers with easier integration.

War Thunder now uses the OpenXR standard, which provides better image quality, increasing picture sharpness compared to the previous implementation at the same supersampling settings. For example, a picture at 100% SS in SteamVR is comparable in quality to 140% SS in the previous implementation! Implementing OpenXR also allowed the developers to fix a number of bugs, such as the incorrect display from the gunner position and blurred object outlines.
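To give a sense of what a supersampling percentage means in pixels, here is a rough sketch, assuming (as SteamVR's resolution slider does) that the percentage scales the total pixel count of the render target, so each dimension scales by the square root of the factor. The base per-eye resolution used here is hypothetical and not taken from the article:

```python
import math

def render_target(base_w, base_h, ss_percent):
    # Assumption: SS percentage scales total pixel count, so each
    # dimension scales by sqrt(ss_percent / 100).
    scale = math.sqrt(ss_percent / 100.0)
    return round(base_w * scale), round(base_h * scale)

# Hypothetical base per-eye resolution of 2016x2240.
print(render_target(2016, 2240, 100))  # (2016, 2240)
print(render_target(2016, 2240, 140))  # (2385, 2650) - 1.4x the pixels
```

Under this assumption, rendering at 140% SS pushes roughly 40% more pixels per frame, which is why matching the old implementation's quality at only 100% SS is a meaningful performance win.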

In the release notes of the Microsoft Mixed Reality Toolkit 2.7, Microsoft noted additional support for OpenXR including:

  • Added support, both in-editor and at runtime, for the system-provided motion controller model on OpenXR.
  • Added support for WinMR gestures (select, hold, manipulation, navigation) on OpenXR.
  • Added support for controller haptics across legacy WMR, Windows XR Plugin, and OpenXR.
  • Added support for spatial mesh when using OpenXR on HoloLens 2.

Unreal Engine 5 is the next major evolution of Unreal Engine, redesigned and enhanced for the next generation of games, real-time visualizations, and immersive interactive experiences. It will empower game developers and creators across all industries to realize next-generation real-time 3D content and experiences with greater freedom, fidelity, and flexibility than ever before.

Unreal Engine 5 has a new VRTemplate using the OpenXR framework, the multi-company standard for VR development. The template is designed to be a starting point for all VR projects. It includes encapsulated logic for teleport locomotion and common input actions, such as grabbing and attaching items to your hand. The VR platforms currently supported by VRTemplate include:

  • Oculus Quest 1 and 2
  • Oculus Quest with Oculus Link
  • Oculus Rift S
  • Valve Index
  • HTC Vive
  • Windows Mixed Reality