OpenXR tagged news

Valve is Transitioning To OpenXR

OpenXR was created with the goal of enabling engines and developers to target a single non-proprietary SDK, easing the friction in creating polished VR experiences. Valve has worked closely with VR hardware vendors, game engine developers, and graphics hardware providers to develop this new API, and believes it represents a big step forward in cross-vendor application support. Valve expects new SteamVR features to appear on the OpenXR side rather than as new OpenVR APIs. Find out what this change means for both developers and users.

AREA Webinar “Introduction to OpenXR 2020” now online

Christine Perey of PEREY Research & Consulting hosted a public webinar in early June 2020. Neil Trevett, President of the Khronos Group, and Brent Insko, Lead VR Architect at Intel and OpenXR Working Group Chair, presented the OpenXR architecture. They provided an overview of a typical OpenXR application lifecycle, including the order of function calls, creation of objects, session state changes, and the rendering loop. The webinar video and presentation slides are now available online.

Introduction to OpenXR Webinar June 2, 2020

Join the AREA Interoperability & Standards Webinar on June 2nd, 2020, to hear: Paul Davies of the Boeing Company will share how following the progress of OpenXR and other standards is helping companies future-proof their deployments and investments in AR technologies. Neil Trevett, President of the Khronos Group, and Brent Insko, Lead VR Architect at Intel and OpenXR Working Group Chair, will present the OpenXR architecture, and provide an overview of a typical OpenXR application lifecycle including the order of function calls, creation of objects, session state changes, and the rendering loop.

At Laval Virtual, Ryan Pavlik presented “Unifying Reality: Building Experiences with OpenXR”, a master class on OpenXR, the open standard API for building VR and AR experiences that work across devices, now and into the future. With OpenXR 1.0 officially released at SIGGRAPH 2019, the standard has grown through increased adoption, vendor and multi-vendor extensions for additional functionality, and the upcoming release of the conformance test suite for verifying runtimes. As the OpenXR specification editor, Ryan provided an in-depth look at how an OpenXR application is structured, how the “action”-based interaction system works, and more. The full recording and slides of his presentation are below. You can also see what his talk was like in VR here.

Collabora announces the 0.14 release of xrdesktop, the open source project which enables interaction with traditional desktop environments, such as GNOME and KDE, in VR. The most exciting improvement is that xrdesktop can now run on XR runtimes providing the OpenXR API, which enables running xrdesktop on a fully open source stack with Monado.

Laval Virtual asserts its expertise and its role as a facilitator in virtual reality with 6 conference cycles: VRtical, TransVRsal, ConVRgence, Virtual World, Art and Tech Talk. Two of this year's talks are Khronos-related:

  • Unifying Reality: Building Experiences with OpenXR - Ryan Pavlik (Collabora)
  • Building the Metaverse one open standard at a time – Khronos APIs and 3D asset formats for XR - Neil Trevett (President, The Khronos Group)

Learn more about this event and register.

Blender, a popular free open-source modeling and animation tool, is soon to get VR support via the OpenXR API. Initially, users will be able to step into their 3D scenes to see them up close and at scale, with new features expected in the future. The next version of Blender, version 2.83, planned for release in late May, will include a first wave of VR support, the project recently announced. VR support is being added via the OpenXR API, which will allow the software to interface with any headset supporting OpenXR (which has wide support in the VR industry, though is still in the early stages of rolling out to individual headsets).

Oculus releases OpenXR Mobile SDK

Oculus has released their OpenXR Mobile SDK v1.0.6. This is a prototype version of OpenXR support that is presented as a developer preview. The OpenXR Mobile SDK includes the resources necessary to use the OpenXR API for native development of VR apps for Oculus Quest. OpenXR offers an alternate development path that allows developers to create portable code that can be used on devices from multiple vendors.

LunarG releases OpenXR overlay extension open source implementation

LunarG announces the release of an open source implementation of the experimental XR_EXTX_overlay extension, introduced in OpenXR 1.0.8. An overlay composition layer can add a rich variety of content into other XR applications. Examples of overlay applications include desktop OS windows in-world, in-game heads-up displays (HUDs), virtual keyboards, and chat. Work continues on the implementation with the goal of running many applications at the same time. Support for all the functions in the OpenXR core specification is planned for a later release. Please take a look at the LunarG release and refer to this document for more information.

Ultraleap Joins Khronos to Help Bring Advanced Hand Tracking to the OpenXR Open Standard

Ultraleap has joined The Khronos Group and started participating in the OpenXR Working Group. Ultraleap is focused on facilitating highly engaging, natural interaction between people and technology. The fast-growing company employs over 150 people in the UK, US and Asia. It has become the first to offer the full vertical stack of software and hardware to enable immersive virtual touch controls for the automotive, advertising, industrial, immersive entertainment and enterprise sectors.

The founder and CTO of Third Dimension Technologies (TDT) recently gave a SMPTE webinar for members titled “Streaming Model for Field of Light Displays” (SMFoLD). The webinar focused not on the displays themselves, but on the technology needed to stream real-time field-of-light video with synchronized sound over more-or-less ordinary network connections. TDT is working on this problem along with the Oak Ridge National Laboratory (ORNL) in a project managed by the Air Force Research Laboratory (AFRL), titled “Open Standard for Display Agnostic 3D Streaming” (DA3DS). Rather than transmitting images, the DA3DS project transmits the OpenGL primitive graphics calls over the network, along with the data those calls need. Learn how the DA3DS project is using OpenGL and how OpenXR plays a part.
