AWS’ approach to spatial computing focuses on enabling customers to succeed, a focus that extends to the company’s industry partnerships. AWS already publicly supports a wide range of XR companies, large and small, including Campfire 3D, Meta, Magic Leap, NVIDIA, and many others. AWS’ Spatial Computing unit spans multiple divisions, including gaming, film, simulation, and geospatial, which makes sense when you consider that all of these mediums are inherently spatial; many, though not all, use a game engine to render their worlds. AWS has also heavily supported open standards such as OpenXR, and even turned its game engine, Lumberyard, into the Open 3D Engine in partnership with the Linux Foundation, creating an open-source game engine that it regularly updates.
Unity’s new XR hands package lets developers add hand tracking without using headset-specific SDKs. The package, currently in preview, supports hand tracking via Unity’s standard XR Subsystem and OpenXR. It works with other standard systems like the XR Interaction Toolkit. XR Hands already works with Quest and HoloLens, and Unity plans to add support for other OpenXR headsets with hand tracking.
Rendering in VR demands that hardware and applications maintain very high frame rates. A typical PCVR (PC-based VR) setup comprises a PC connected to a head-mounted display (HMD) and a pair of hand-held controllers, all of which must function in real time. The setup must contend with fluid controller and in-game movement, 6DoF animation, head movement, and two render passes (one per eye) at 90 to 120 FPS. Switch to a wireless HMD, and the communications channel (e.g., Wi-Fi or 5G) must also be up to the task of real-time data transfer.
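To make those numbers concrete, here is a back-of-the-envelope sketch of the frame budget and pixel throughput involved. The per-eye resolution is an illustrative assumption, not a figure from any vendor:

```python
# Back-of-the-envelope PCVR frame budget. The 2000x2000-per-eye
# resolution below is an illustrative assumption, not a vendor spec.

def frame_budget_ms(fps: float) -> float:
    """Time available to produce one frame, in milliseconds."""
    return 1000.0 / fps

def pixels_per_second(width: int, height: int, eyes: int, fps: float) -> float:
    """Raw pixel throughput for a stereo render at the given refresh rate."""
    return width * height * eyes * fps

for fps in (90, 120):
    budget = frame_budget_ms(fps)
    throughput = pixels_per_second(2000, 2000, eyes=2, fps=fps)
    print(f"{fps} FPS -> {budget:.2f} ms per frame, "
          f"{throughput / 1e9:.2f} Gpixel/s across both eyes")
```

At 90 FPS the whole pipeline, from input to displayed photons, has roughly 11 ms per frame; at 120 FPS, about 8.3 ms.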
Last year, Qualcomm collaborated with Guy Godin, creator of Virtual Desktop, to improve PCVR rendering performance. Qualcomm added Space Warp functionality to its Adreno Motion Engine, which runs on all headsets powered by the Snapdragon XR2 and its Qualcomm Adreno GPU. Space Warp synthesizes missing frames just in time on the HMD with no PC overhead, reducing PC-to-HMD bandwidth and load on the encoder. This doubles the available PC render time and the effective encoder bitrate for PCVR-to-HMD streaming.
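The claimed doubling follows from simple arithmetic: if the headset synthesizes every other frame, the PC only has to render and encode at half the display rate. A minimal sketch under that assumption (the 90 FPS figure is illustrative):

```python
# If the HMD synthesizes every other frame, the PC renders at half the
# display rate, doubling both the per-frame render budget and the bits
# available per transmitted frame. Numbers are illustrative.

def pc_budget_ms(display_fps: float, hmd_synthesizes_half: bool) -> float:
    """Per-frame budget on the PC side, in milliseconds."""
    render_fps = display_fps / 2 if hmd_synthesizes_half else display_fps
    return 1000.0 / render_fps

without_warp = pc_budget_ms(90, hmd_synthesizes_half=False)
with_warp = pc_budget_ms(90, hmd_synthesizes_half=True)
print(f"PC render budget: {without_warp:.1f} ms -> {with_warp:.1f} ms")
```

The same halving applies to the video stream: encoding 45 frames per second instead of 90 at a fixed link bandwidth doubles the bits available to each encoded frame.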
Virtual reality headsets are becoming smaller, faster, and more portable than ever, and increasingly wireless. As the hardware advances, we are also seeing a smooth transition on the software side to OpenXR.
Using Monado, Collabora’s open-source OpenXR runtime, intern Moses Turner completed a hand-tracking project. In this blog post, Moses guides us through the process, limitations, results, and next steps.
OpenXR is composed of the core feature set, multi-vendor extensions, and vendor-specific extensions. One such vendor-specific extension is VARJO_quad_views, available, as the name indicates, on Varjo HMDs. By default in OpenXR, an application generates two images, one for each eye, known as a stereo view configuration. But a unique capability of Varjo HMDs (except the Aero) is having two displays per eye, allowing human-eye resolution in the central area of the screen. The purpose of VARJO_quad_views is to let the application generate one view covering the entire field of view at the resolution of the context screen, and one inner view matching the resolution and field of view of the focus screen (padded to account for head movement and latency).
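In other words, under quad views the application submits four render targets per frame instead of two: a wide context view and a narrow, dense focus view for each eye. A small sketch comparing the total pixel cost per frame (all resolutions here are illustrative assumptions, not actual Varjo panel specifications):

```python
# Pixel cost per frame for stereo vs. quad view configurations.
# All resolutions are illustrative assumptions, not Varjo specs.

def total_pixels(views):
    """views: list of (width, height) render targets submitted per frame."""
    return sum(w * h for w, h in views)

# Stereo: one wide view per eye at a uniform (assumed) resolution.
stereo = [(2880, 2720)] * 2

# Quad views: a lower-density context view per eye, plus a small,
# dense focus view per eye covering only the centre of the display.
quad = [(1920, 1800)] * 2 + [(1400, 1400)] * 2

print(f"stereo: {total_pixels(stereo) / 1e6:.1f} Mpx/frame, "
      f"quad: {total_pixels(quad) / 1e6:.1f} Mpx/frame")
```

The split lets the runtime spend pixel density only where the focus display can show it, rather than rendering the whole field of view at the highest resolution.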
The VARJO_foveated_rendering extension augments VARJO_quad_views by adding eye tracking. The idea is that, for every rendered frame, the application is given an updated position for the inner view based on where the eyes are looking. And because we know where the eyes are looking, the inner view can be much smaller than the default quad-views inner view. The resolution of the outer view can also be reduced: there is no need to compute as many pixels as the context screen can display, because even though the context screen is lower resolution than the focus screen, it is still higher resolution than the eye can resolve in its periphery.
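The savings compound because both views shrink. A rough model of the per-eye pixel reduction, with all resolutions being illustrative assumptions rather than Varjo figures:

```python
# Rough model of the savings VARJO_foveated_rendering enables over
# plain quad views. With eye tracking, the inner (focus) view only
# needs to cover the gaze point, and the outer (context) view can drop
# resolution because peripheral vision cannot resolve it anyway.
# All resolutions are illustrative assumptions.

def per_eye_pixels(outer, inner):
    """Pixels rendered per eye for one outer and one inner view."""
    (ow, oh), (iw, ih) = outer, inner
    return ow * oh + iw * ih

quad_only = per_eye_pixels(outer=(1920, 1800), inner=(1400, 1400))
foveated = per_eye_pixels(outer=(1280, 1200), inner=(800, 800))

saving = 1 - foveated / quad_only
print(f"per-eye pixels: {quad_only} -> {foveated} ({saving:.0%} fewer)")
```

Under these assumed sizes, foveation cuts the per-eye pixel count by more than half, which is render time the application gets back every frame.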
HTC VIVE announced that developers can now join the OpenXR public beta program for the VIVE Focus 3. Developers can start new projects or port existing applications across multiple XR devices. OpenXR compatibility benefits consumers as well as developers, giving them more flexibility to enjoy their VR content on multiple devices, including high-performance headsets like the VIVE Focus 3.
The award-winning technology company announced it has received the prestigious financial grant from Epic, the company behind the popular Unreal Engine and Fortnite. The Epic MegaGrant was awarded for further development of its OpenXR runtime, adopting the industry standard. Having recently been honoured with an Innovation Award at CES last January, this is the second award the Dutch technology company has won in Q1 2022 for its Simulated Reality technology.
Godot recently published the Godot 4.0 alpha 4 build. In this release, Godot promoted OpenXR from a plug-in into the engine core for better integration with Vulkan and other engine components.
The Khronos Group has an open Request for Quote (RFQ) for OpenXR CTS improvements. The OpenXR Working Group has developed a conformance test suite for validating vendor runtimes against the OpenXR specification, helping ensure uniformity across AR and VR platforms. This project will extend the current conformance test suite with additional tests of the core specification, as well as tests for a number of extensions. The OpenXR Working Group may also seek some sample development to provide application developers with simple example code to bootstrap their own development efforts. The deadline for RFQ submissions has been extended to Friday, March 25th, 2022.
The Haptics Industry Forum (HIF) and The Khronos® Group have entered into a cooperative liaison agreement to foster synergy between the two organizations and to encourage the integration of advanced haptics functionality into the Khronos OpenXR™ open standard for portable augmented and virtual reality, enabling broad availability of haptics in the metaverse and beyond. The agreement lets HIF and Khronos collaborate on a shared goal: broad, cross-platform access to next-generation haptic feedback in XR applications, so that rich multisensory experiences reach beyond 3D visuals for the eyes and spatial audio for the ears to include expressive haptics for touch. Read the press release.
The recording of the panel Complementary Standards for an Open Metaverse, held at the RealTime Conference and featuring Khronos’ Neil Trevett, Martin Enthed, and Brent Scannell, is now available for viewing.
The Godot XR contributors released the latest version of the Godot OpenXR plugin. The release contains updates that give Godot XR developers access to the latest XR APIs and features.
The new product combines the Snapdragon Spaces XR Developer Platform and the ThinkReality A3 smart glasses. Snapdragon Spaces is based on the Khronos OpenXR specification to ensure portability, and is the first headworn AR platform optimized for AR glasses tethered to smartphones with an OpenXR-conformant runtime.