Catching up after the Khronos 3D Commerce Working Group Webinar: “Render Everything Everywhere”
Online merchandising has evolved from text descriptions of products to images, and now to interactive 3D. This new medium will be impactful for e-commerce, especially when combined with augmented reality (AR) and virtual reality (VR) platform capabilities. These visual computing technologies enable customers to visualize products on their device and in their space, or immerse themselves in a virtual environment for increased product interaction and engagement. However, significant industry hurdles are holding back cross-platform deployment of 3D assets at industrial scale, hampering widespread adoption.
Standardization is a critical next step to make 3D models easily usable by retailers so they can bring 3D Commerce shopping experiences to consumers everywhere.
Recently, the Khronos 3D Commerce Working Group hosted a webinar to discuss its activities, including why industry alignment on the glTF file format (the “JPEG for 3D”) is crucial, and how standardization will bring new opportunities to any designer, retailer, manufacturer, or technology company developing 3D experiences.
The panel was led and moderated by Leonard Daly, President of Daly Realism, who was joined by industry experts including:
- Shrenik Sadalgi, Director of R&D, Wayfair and 3D Commerce Working Group Chair
- Norbert Nopper, Managing Director and co-founder, UX3D
- Eric Bennett, Senior Science Manager, Amazon Imaging Services
- Brent Scannell, Product Owner in the Entertainment Creation Products group at Autodesk 3ds Max
- Max Limper, CEO and co-founder of DGG
- Nathaniel Hunter, COO and co-founder, DreamView
- Thomas Lucchini, Senior Program Manager for Babylon.js at Microsoft
- Jon Wade, Product Manager at Shopify
At the end of the webinar, the audience submitted questions for panelists. As this dialogue benefits the whole community, we’re sharing the answers as a Q&A. You can watch the complete webinar recording, but this is not a verbatim transcription: The questions have been reordered for a logical flow, and additional data released since the webinar has been added. If you have questions of your own, comment below and we’ll be sure to get back to you!
Question: Is the future aimed at being able to take any 3D model, textured & lit in any renderer (e.g. Chaos Group - Vray), and be able to open that model in software that can read/display the 'new' format? What examples do you see as closest to “JPEG for 3D” today?
Shrenik (Wayfair & 3D Commerce Chair) explained that it is the aim of the Khronos 3D Commerce Working Group to create a universally understood representation of 3D virtual products so they can be displayed consistently across platforms and devices. Today, the most promising specification for universally distributing and displaying the product is glTF. We are working to ensure that DCC tools make it easy to publish 3D Commerce assets by adding publishing support for glTF and its extensions. It will also be beneficial to have as many renderers as possible support Physically Based Rendering (PBR) workflows to enable frictionless publishing workflows to glTF.
Question: Is it possible to download 3D products and environments with high quality and workable interactions with the 3D models?
Yes, this is possible with glTF and already done in practice. The key is to preserve the scene’s node structure that is needed to display, highlight, hide or move individual parts of a 3D model during runtime inside the glTF file. Also, it is key to understand that glTF is a pure file format; interactive behavior (such as event routes) is not part of glTF itself, but is implemented by the surrounding application which enables enormous flexibility. You can use glTF to store animations which you can then trigger individually inside your application.
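To make the separation concrete, here is a minimal sketch in Python of how a glTF document's node hierarchy and named animations might look, and how an application layer (not the file) decides what to trigger. The product names and animation names are invented for illustration, and buffers, accessors, and meshes are omitted for brevity, so this is not a complete asset.

```python
import json

# Illustrative, abridged glTF document as a Python dict. The node hierarchy
# lets an application address parts of the product individually; named
# animations can be triggered at runtime by the surrounding application,
# since glTF itself stores no interactive behavior.
gltf = {
    "asset": {"version": "2.0"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [
        {"name": "Sofa", "children": [1, 2]},
        {"name": "Sofa_Frame"},          # can be shown or hidden individually
        {"name": "Sofa_Cushion_Left"},   # can be highlighted or moved
    ],
    "animations": [
        {"name": "OpenDrawer", "channels": [], "samplers": []},
        {"name": "Recline", "channels": [], "samplers": []},
    ],
}

# The application, not the file, chooses when to play an animation,
# e.g. by mapping a UI button to an animation index:
by_name = {a["name"]: i for i, a in enumerate(gltf["animations"])}
print(json.dumps(by_name))
```

The point of the sketch is the division of labor: the file preserves structure and stored animations, while event handling and interaction logic live entirely in the viewer or application code.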
Question: The demos I’ve seen thus far rotate the camera around the product. Are there plans to specify rotation of the product, keeping the camera fixed, as one typically sees in a showroom or studio? Some manufacturers and retailers also strongly prefer to only rotate about the Y axis or other limitations, e.g. to prevent the product from being turned upside down.
Shrenik (Wayfair) mentioned that we’re trying to achieve per-pixel rendering consistency across viewers and platforms. We believe the presentation of the product (camera angles, freedom of rotation, annotations, etc.) is part of the retailer’s, the advertiser/social platform’s, or the brand’s marketing strategy, and should be completely under their control.
Question: When will the main platforms where customers are online (i.e., Instagram, Facebook, WeChat) start supporting complex, high-polygon, configurable glTF models?
Shrenik (Wayfair & 3D Commerce Chair) said many of these platforms already support 3D and are partnering with retailers to advertise virtual products. It is the goal of 3D Commerce to help streamline how to pervasively leverage 3D as a new medium.
Question: Is there any additional information on adding product options and colorways to a glTF model?
The Working Group is working to enable consolidation of product options into one model file, which will help streamline content distribution, and it will be interesting to learn how this feature functions with back-end data.
Jon (Shopify) pointed to the specific glTF extension, KHR_materials_variants, as the first step in merging that data together in a single asset. This extension was just ratified, and we just released a blog showing it in operation. It’s focused strictly on the materials within the model, and it doesn’t handle geometry variations. Geometry will be coming in the future.
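As a rough illustration of what the extension stores, here is a minimal sketch of the KHR_materials_variants data layout and a variant lookup, written as a Python dict rather than a full glTF file. The variant names, materials, and the lookup helper are invented for this example; consult the ratified extension specification for the authoritative schema.

```python
# Sketch of KHR_materials_variants: a root-level list of named variants,
# plus per-primitive mappings from variant indices to material indices.
doc = {
    "extensions": {
        "KHR_materials_variants": {
            "variants": [{"name": "Walnut"}, {"name": "Oak"}]
        }
    },
    "materials": [{"name": "WalnutMat"}, {"name": "OakMat"}],
    "meshes": [{
        "primitives": [{
            "material": 0,  # default material
            "extensions": {
                "KHR_materials_variants": {
                    "mappings": [
                        {"material": 0, "variants": [0]},
                        {"material": 1, "variants": [1]},
                    ]
                }
            }
        }]
    }],
}

def material_for_variant(doc, variant_name):
    """Return the material index a primitive should use for a named variant."""
    variants = doc["extensions"]["KHR_materials_variants"]["variants"]
    idx = next(i for i, v in enumerate(variants) if v["name"] == variant_name)
    prim = doc["meshes"][0]["primitives"][0]
    for mapping in prim["extensions"]["KHR_materials_variants"]["mappings"]:
        if idx in mapping["variants"]:
            return mapping["material"]
    return prim["material"]  # fall back to the default material

print(material_for_variant(doc, "Oak"))  # -> 1
```

Note that, as Jon says, the extension only switches materials; the geometry of each primitive is shared across all variants.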
Norbert (UX3D) referred to a new wave of glTF PBR materials including metallic or dielectric materials for effects such as sheen on cloth, or a clear coat for car paint. There is a lot of activity here, so also look at the 3D Commerce Asset Creation Guidelines where you’ll find a deep dive into how to do this on a technical level.
Brent (Autodesk) added that the root of the question is, how can we find out more about how to manage configurations and options through glTF and be aligned with 3D Commerce requirements? The Working Group would love to hear about the details and examples of your problems. The more context we have on real-world pain points, the more successful our solutions can be.
Question: Regarding the previously asked product variants question, everyday merchants will have a hard time with the idea of a PBR Base Asset. They will want to “own the model” and render photos with that model. This also simplifies conversion to USDZ, because USDZ’s spec might not line up with glTF’s. Any comments?
Max (DGG) mentioned that the PBR Base Asset doesn’t need to be in the glTF format. The only recommendation we make is that there should be a base asset from which the optimized variants for real-time publishing targets (be they encoded in glTF or USDZ) can be derived automatically, and merchants definitely own that model. We recommend keeping an eye on the materials (creating them in such a way that they can be easily converted to PBR real-time materials or, better, using such materials in the first place where possible), and automating the conversion to publishing targets instead of modeling the optimized variants manually. This may involve down-conversions where trade-offs need to be made; for example, you may need to convert tiled textures to atlased ones when aiming for a base color plus an additional AO map and exporting to USDZ, because that format does not support two UV channels (in contrast to glTF). But these steps should happen only on the “last mile” to the publishing targets, and not affect the base asset.
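The “last mile” texture trade-off Max describes can be sketched in a toy form: tiled UVs (values outside [0, 1]) must be wrapped and rescaled so that several textures can share one atlas and therefore one UV channel. This sketch only shows the UV remapping; a real pipeline would also re-bake the texture pixels into the atlas, and the cell layout here is an arbitrary assumption.

```python
def atlas_uv(u, v, tile_col, tile_row, cols=2, rows=2):
    """Wrap a tiling UV into [0, 1), then place it inside one atlas cell.

    cols/rows define a simple grid atlas; (tile_col, tile_row) selects
    which cell this part of the mesh was assigned to.
    """
    u_wrapped = u % 1.0
    v_wrapped = v % 1.0
    return ((tile_col + u_wrapped) / cols, (tile_row + v_wrapped) / rows)

# A UV of (2.25, 0.5) on a tiled texture, assigned to atlas cell (1, 0):
print(atlas_uv(2.25, 0.5, 1, 0))  # -> (0.625, 0.25)
```

Because this remapping is lossy and target-specific, it belongs in the automated export step for a given publishing format, never in the base asset itself.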
Jon (Shopify) added that PBR, variant merging, etc. are very technical concepts that might not map onto a merchant’s mental models and day-to-day workflows. Our standards are really pushing towards making sure that models look accurate when visualized and are performant, both for load times and a smooth end-user experience. However, specifications and recommendations related to storage and delivery don’t require implementers to map the technical details 1:1 to the user experience (UX). I’m confident platforms and tools can come up with ways to leverage the performance and accuracy we are outlining while presenting content management to merchants in a way that fits their mental model for merchandising. Ideally, a lot of these optimizations and validations happen through automated pipelines, without users needing to guide them or even know that they are taking place, while still enjoying the benefits.
Question: In addition to variants, what about accessories/detachable parts? How do you handle assets with removable/added parts?
Supporting mesh variants is coming soon! The glTF KHR_mesh_variants extension is in design and will provide a solution for this use-case.
Question: Are there any updates on PBR Next and compatibility over multiple maps and material systems?
The 3D Formats Working Group at Khronos just announced the release of a wave of new Physically Based Rendering (PBR) material extensions for glTF, including transmission, clear coat, and sheen — and there are more materials coming. This topic is of great interest to tool and technology vendors in 3D Commerce. We are all keeping a close eye on the evolution of PBR materials in the context of glTF to ensure that compatibility is maximized and well understood, so that our end users don’t need to deal with conversion artifacts downstream.
Question: What’s the challenge in dealing with assets with transmission and emissive effects?
Norbert (UX3D) outlined that there are two rendering techniques: ray tracing and rasterization. Using ray tracing, these material effects are part of the overall lighting calculation. For rasterization, the rendering engine has to be enhanced to provide a similar visual effect. The glTF PBR (Physically Based Rendering) material definitions are designed to enable visual consistency even when different rendering techniques are used.
Question: Can Khronos support the concept of 1 SKU = 1 glTF? From day-to-day experience with customers on the ground, hyper-optimized variant factoring can confuse most people.
Max (DGG) explained that the current recommendation is 1 SKU = 1 “Base Asset”. This asset, which does not necessarily have to be stored in glTF format, should be the most faithful, detailed 3D representation of the product, ideally already set up with real-time-ready materials where possible. Such detailed base assets may be used for CGI processes, for instance generating realistic product images for a catalog. From each base asset, the optimized publishing targets (e.g., a reduced GLB for Android and a reduced USDZ for iOS) can then be derived automatically, much like a video uploaded to YouTube: you don’t have to care about what is delivered, because optimized versions are created in the background. See our blog on the recently announced Real Time Assets Creation Guidelines or visit the GitHub repository.
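The one-base-asset, many-targets idea can be sketched as a tiny derivation pipeline. Everything here is a hypothetical placeholder (the target table, the triangle budgets, the `derive` step): a real pipeline would invoke actual mesh decimation and format-conversion tools, but the shape of the automation is the same.

```python
from dataclasses import dataclass

@dataclass
class Target:
    fmt: str            # publishing format, e.g. "glb" or "usdz"
    max_triangles: int  # triangle budget for the real-time variant

# Hypothetical per-platform publishing targets derived from one base asset.
TARGETS = {
    "android": Target("glb", 100_000),
    "ios": Target("usdz", 100_000),
}

def derive(base_asset: dict, platform: str) -> dict:
    """Derive one optimized publishing target from the detailed base asset."""
    t = TARGETS[platform]
    # Stand-in for real decimation: clamp triangle count to the budget.
    triangles = min(base_asset["triangles"], t.max_triangles)
    return {"sku": base_asset["sku"], "format": t.fmt, "triangles": triangles}

base = {"sku": "SOFA-123", "triangles": 2_500_000}  # detailed base asset
for platform in TARGETS:
    print(derive(base, platform))
```

The merchant-facing takeaway matches the YouTube analogy: the detailed base asset is the single source of truth, and every per-platform file is a disposable, automatically regenerated derivative.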
Question: How do you make a digital twin interoperable with 3D goods and services on a vendor’s platform or one’s own computing infrastructure, and still keep one’s interests confidential?
Shrenik (Wayfair & 3D Commerce Chair) explained that the real-time asset is usually published for specific use cases for consumption in real-time experiences (such as AR, VR, 3D-on-web, etc.). It’s usually derived from a high-quality asset that may contain proprietary information (such as CAD data, finer design details, etc.). Check out the Asset Creation Guidelines for sample workflows that help maintain a boundary between proprietary and public information. In order for 3D content to gain widespread adoption in retail and to help streamline the creation process, we need to have interoperable publishing standards so content can be exchanged between vendors and platforms in the industry. An analogy from the world of photography is to distribute JPEG files while keeping the PSD file private.
Question: Are you involving Rhino software?
The standards developed at 3D Commerce are vendor agnostic and applicable to multiple content creation tools. Companies, including tool vendors, are welcome to participate directly in development discussions as Khronos members, or join the 3D Commerce Advisory Forum.
Question: Are you considering immersive experiences (AR, VR, XR, MR) for the standardization of the file format?
Shrenik (Wayfair & 3D Commerce Chair) explained that while there are few callouts to those specific experience types in much of what we have released so far, there is significant overlap between creating content for 3D on the web, native apps, and AR/VR experiences. In all cases, you need 3D assets that are visually accurate and performant. Currently, our primary focus is on creating genuinely useful content creation guidelines for today’s primary delivery platforms, web and mobile, but we believe this work will have significant benefits for all kinds of 3D and XR applications and use cases.
Question: Any plans here for streamlining UX for new users trying out AR for the first time? In particular, for a cross-platform web AR experience that doesn’t require downloading any apps, and that minimizes the permissions the user must grant (“Access to camera?”) for every new commercial vendor?
Shrenik (Wayfair & 3D Commerce Chair) said that cross-platform web AR standards (WebXR) already exist and have good support across browsers. Major OEMs and platforms also support custom HTML tags for inserting 3D/AR content into web pages, making it very easy for content creators to create and distribute 3D content and for customers to experience it. The web is an accessible platform for distributing AR content, and many retailers already have web-based AR experiences which do not require a user to download any apps. Users still need to grant camera permissions in order for any app or website to access the camera, which is an important and necessary step since that’s core to managing their privacy. But it may be a good idea for platforms to standardize AR experience onboarding so that users understand AR capabilities and exactly how to interact with AR content. This will make AR experiences more accessible, less overwhelming, and easier to use.
Question: How can we standardize web viewers? We have tried different viewers and all differ in the final visual output
Max (DGG) responded that the upcoming 3D Commerce Viewer Certification Program will have a standardized testing framework to encourage consistency between viewers, and there are already efforts to compare differences in rendering between viewers. Khronos is also working on a glTF Sample Viewer to help establish an agreed baseline. Ultimately, the output depends not only on the asset, but also on the lighting environment, HDR settings, and post-processing settings, much as photos of a real product will usually differ across the studios taking them. Nevertheless, on the technical level, the goal should be that similar lighting environments always produce similar results, and the above-mentioned projects are helping toward that goal.
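The per-pixel comparison idea behind this kind of certification testing can be sketched very simply: render the same asset under the same lighting in two viewers, then verify that pixel differences stay within a tolerance. The 2x2 “renders” and the tolerance value below are made up for brevity; real test suites compare full-resolution captures of reference assets.

```python
def max_pixel_delta(img_a, img_b):
    """Largest per-channel difference between two same-sized RGB images,
    represented as nested lists of (r, g, b) tuples."""
    return max(
        abs(ca - cb)
        for row_a, row_b in zip(img_a, img_b)
        for px_a, px_b in zip(row_a, row_b)
        for ca, cb in zip(px_a, px_b)
    )

# Tiny stand-in "renders" of the same product from two different viewers:
render_a = [[(200, 180, 150), (90, 80, 70)], [(30, 30, 30), (255, 255, 255)]]
render_b = [[(198, 181, 150), (92, 80, 69)], [(30, 31, 30), (255, 255, 254)]]

TOLERANCE = 4  # allowable per-channel delta; a real program would define this
delta = max_pixel_delta(render_a, render_b)
print(delta, "PASS" if delta <= TOLERANCE else "FAIL")  # -> 2 PASS
```

In practice, certification metrics are more sophisticated than a raw maximum delta (perceptual metrics, tone-mapping awareness), but the pass/fail-against-a-shared-baseline structure is the same.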
Question: Instead of trying to standardize every aspect of a 3D file how about allowing XR on the web cross platform? That way we could just use the same 3D viewer (Google's model-viewer for example) and the model will always look the same
Jon (Shopify) agreed this would be ideal in some ways, but there will always be multiple viewers in practice: proper XR support requires all the necessary components to be part of the browser, and Safari, with its huge user base, doesn’t yet support all the requirements to make AR via WebXR possible. The reality is that there should still be multiple choices for renderers (three.js, Babylon.js, model-viewer, etc.) for healthy competition, and so that users can choose among them depending on their needs. That’s why the 3D Commerce Viewer Certification Program will be key, as we need some level of visual parity between all these viewers.
Question: Does the viewer certification process also include event tracking pipelines that will play into how analytic data can be gathered consistently?
Jon (Shopify) explained that the Certification Program is currently focused on enabling better visual (pixel) consistency across viewers for 3D models of commercial products.
Question: How far do you think WebGL can go without browsers crashing, i.e. could we run full on smooth, very large experiences in the browsers? Think of a powerful 3D game, that level of power.
Jon (Shopify) thinks that getting 90%+ of the way there is certainly possible. There are gaps right now, but there is promise in technologies like WebAssembly and the upcoming WebGPU specification that narrow the delta. I think a more fundamental problem is the size of the content: modern video games ship gigabytes of content to reach the fidelity you see, but the expectation on the web is that experiences begin within hundreds of milliseconds. Even with better APIs, content loading speed will be an issue.
Thomas (Microsoft) added that while there are some limitations, such as no multithreading and, as Jon mentioned, the download of assets, it is already possible to create games today as long as the constraints are taken into consideration at design time. There are already games built on WebGL: Minecraft Classic and the upcoming Age of Ascent.
Question: Can we expect that WebGPU could be an alternative way to showcase 3D environments with objects and shoppers interactions in real time?
Question: How can we scale the 3D modeling process?
While there are different approaches, such as 3D scanning or AI-supported modeling from photos, we are focusing on streamlining “classical” 3D modeling processes by providing Asset Creation Guidelines that help modelers produce better 3D models for e-commerce with consistent results.
Question: Is there a place for photogrammetry (like via RealityCapture) to get plugged in here, as the start for geometry creation?
Eric (Amazon) affirmed that photogrammetry definitely has value as a starting place for model creation, but it also brings its own challenges. Depending on the surface characteristics of the physical object, photogrammetry can yield a wide range of geometry quality, from very accurate to extremely noisy. Cleaning up this geometry can be challenging for artists, as it doesn’t include the well-behaved, easily editable surfaces of a hand-crafted model, and UV maps are often less well structured than artist-created UVs. Regardless, geometry complexity typically needs to be reduced by orders of magnitude to meet the needs of most experiences, particularly real-time ones. From a materials perspective, most photogrammetry packages do not create the complex materials the Asset Creation Guidelines require, but we hope that as capture technologies improve, that gap will decrease over time.
Question: How can we get the Apparel apps to save to glTF when they keep waiting on the customers to drive demand? What can we do to get the demand there so that apps will follow?
Shrenik (Wayfair & 3D Commerce Chair) said this is a common problem with any emerging technology. But early adopters are establishing 3D as a meaningful merchandising medium for retail with a clear ROI and we believe that customers will begin to expect this for any retail experience online. This momentum will spread across all verticals including apparel. Standardization efforts will further enable widespread adoption by lowering the barrier to content creation, increasing content availability across multiple platforms, and establishing clearer ROIs on 3D for Commerce.
Brent (Autodesk) added that there are a few different paths towards widespread adoption for specifications such as glTF. One is to make the specification vastly superior to existing alternatives, so that integrating applications can actually deliver a competitive advantage to their end users; I think this is especially interesting in the context of PBR Next with glTF. Another is to actively seed adoption through member activities: having the Working Group itself invest in integrations with existing applications when there is appreciable benefit for the group. Adoption is always top of mind for the Working Groups. If there are outreach opportunities that we are missing out on, please let us know.
Question: Are you working with Material Exchange for materials?
We’re not directly working with Material Exchange for materials at this time, but they are certainly doing some interesting work with respect to digitization of physical material products at a massive scale. As they are a company, not a standards organization, Khronos would warmly welcome their participation as Khronos Members so we can work together to more effectively solve industry problems.
Question: What about screen standards or screen calibrations? Some issues are about content but other issues are about hardware
Shrenik (Wayfair & 3D Commerce Chair) noted that the 3D Commerce Certification subgroup is taking into consideration color tone mapping as a part of the viewer certification process. Our current focus is on achieving consistency in software rendering on web and native platforms. Screen standards and screen calibrations and other hardware complexities are the focus of some other standards groups at Khronos that focus on the interface between software and silicon. We are looking to cooperate with other Working Groups and standards organizations focusing on their portion of the stack, to achieve end-to-end consistency in rendering virtual products for 3D Commerce use cases.
Question: Is there any mechanism to deal with monitor color calibration on various devices having various settings/exposures/hues etc. at the end customer?
Shrenik (Wayfair) noted that standards at each software layer, and at the interface between the software and hardware layers, play a major role in achieving standardization across platforms and devices. We’re dependent on these open standards (e.g., WebGL, OpenGL) for achieving true visual consistency across devices. Monitor settings and user settings (such as extra-warm night modes) are beyond the current scope of 3D Commerce, but we hope to help motivate and coordinate an industry solution over time.
Question: How do you address the haptics and acoustics of products that have been reasons for returning them? For example, an operating blender is too loud or a sofa cushion is too soft.
These are real-world issues that lead to returns and are worthy of their own deep exploration. However, our Working Group is currently focused on improving aspects of 3D asset creation, management and visual rendering for e-commerce and isn’t looking into those issues right now. But we do see standardization opportunities in non-visual aspects of merchandising products for e-Commerce in the future!
Question: Where are the women on your panel? Women drive 70-80% of all consumer purchasing decisions. Many of them are designers
The work being done in the 3D Commerce group has created a huge demand for Khronos 3D Commerce members to speak at many events. To help balance workloads, while working inside members’ own time constraints, we rotate through members. To hear from the women 3D Commerce members, please check out some of our other archived talks and presentations listed on the 3D Commerce page such as AWE 2020.
Question: What features of the standards development are in contention that must be resolved and what is the process for picking winners to resolve them?
Shrenik (Wayfair & 3D Commerce Chair) said that Khronos follows a well-proven democratic process to build industry consensus within its Working Group while welcoming broader community feedback. Everything is designed under the Khronos IP Framework that protects both Khronos members and companies that use Khronos standards.
Question: How can we sign up for the Khronos Membership or the Advisory Forum?
Any More Questions?
You can learn more about the 3D Commerce Working Group efforts at https://www.khronos.org/3dcommerce. If you would like to become directly involved in the 3D Commerce Working Group, any company is welcome to join Khronos to have a voice in the evolution of any of its standardization activities.