The Immersive Web Emulation Runtime (IWER) is a TypeScript-based tool that enables WebXR applications to run in modern browsers without native WebXR support. By emulating the WebXR Device API, IWER lets developers test and run WebXR projects across a wide range of devices, ensuring compatibility and streamlining development.
https://meta-quest.github.io/immersive-web-emulation-runtime/
Slides: https://docs.google.com/presentation/d/1jwq7Bi6oDQISt-ERG_PZkm9yqitM4VwVTgmO3sgPSPE/edit?usp=sharing
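As a concrete illustration of what emulating the WebXR Device API looks like in practice, here is a minimal sketch. The XRDevice/installRuntime entry points and the metaQuest3 device profile follow IWER's published README, but treat the exact names as assumptions to verify against the current package rather than as authoritative documentation.

    // Sketch only: inject the emulated WebXR runtime before application code
    // touches navigator.xr. Entry points follow IWER's README; verify against
    // the current package before relying on them.
    import { XRDevice, metaQuest3 } from 'iwer';

    const xrDevice = new XRDevice(metaQuest3); // emulate a specific headset profile
    xrDevice.installRuntime();                 // patch navigator.xr with the emulator

    // From here, standard WebXR Device API calls are answered by the emulator:
    const supported = await navigator.xr?.isSessionSupported('immersive-vr');
    console.log('immersive-vr supported:', supported);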
Dene Grigar, Frode Hegland, Felix, Fabien Bénétou, Ayaskant Panigrahi, Jim Strahorn, Peter Dimitrios, Peter Wasilko, Rob Swigart, Mark Anderson, Brandel Zachernuk
AI: Summary
This meeting featured Felix, a software engineer from Meta’s Reality Labs division, presenting the Immersive Web Emulation Runtime, a development tool for WebXR applications. The presentation was moderated by Fabien Bénétou. Felix explained how this runtime addresses the critical pain point of WebXR development – the disruptive cycle of coding on desktop, then putting on a headset to test, only to find black screens or crashes with no debugging feedback. The runtime provides a modular system that separates the emulation engine, control interface, and browser extension, allowing developers to test WebXR applications in desktop browsers with full controller and hand-tracking emulation. Key features include spatial controls, input simulation, natural interaction emulation, system integration, and advanced mixed reality capabilities. The tool also supports automated testing and cross-platform compatibility, and can capture real room environments for AR testing. The discussion touched on the broader implications for XR development workflows, the challenges of designing spatial interactions, and the importance of proper debugging tools for this emerging medium.
AI: Speaker Summary
Dene Grigar participated as co-PI of the Future of Text project in XR, based in Vancouver, Washington, and director of the Electronic Literature Lab at Washington State University. She expressed enthusiasm for VR fitness applications like Supernatural and Beat Saber. She emphasized the importance of having multiple people in the group who don’t speak officially for their companies, noting that when Brandel speaks, he doesn’t speak for Apple.
Frode Hegland served as meeting facilitator and co-PI of the project, reflecting his obsessive interest in text and spatial computing. He brought philosophical context by referencing books about perception and reality, connecting them to XR’s potential to reinvent representation systems. He asked practical questions about implementation and was particularly interested in how end users and designers (not just developers) could access and test XR experiences. He drew parallels between Felix’s work and the group’s academic workflow visualization project, seeing potential for creating “interaction languages” in XR. He was especially intrigued by the concept of graceful degradation from XR to legacy platforms and by the challenge of determining where interactions should be spatially located.
Felix presented as a Meta Reality Labs software engineer speaking in a personal capacity, with an extensive background in WebXR development and HCI studies at Duke University. He explained that his motivation for XR development stems from the realization that virtual worlds aren’t constrained by physical limitations. He detailed the technical evolution from the original WebXR polyfill through Mozilla’s emulator to Meta’s current immersive web emulation runtime. He emphasized the modular architecture allowing framework integration, demonstrated various use cases including his own first-person shooter game, and discussed the challenges of hand-tracking complexity. He had to leave five minutes early but expressed interest in returning, potentially bringing colleagues such as one working on Blender’s UI in spatial-computing contexts.
Fabien Bénétou served as moderator and provided crucial context for non-developers about WebXR development pain points. He vividly described the physical and intellectual frustration of the development cycle – putting on headsets, encountering black screens, removing headsets (messing up hair and earphones), making code changes, and repeating the process. He emphasized how this workflow can drive developers away from XR development entirely. He demonstrated automated testing examples and discussed the philosophical challenge of representing XR experiences to people without headsets, acknowledging that videos and emulations can never fully convey the spatial experience but serve as important stepping stones.
Ayaskant Panigrahi contributed technical insights about WebXR development, having previously met Felix at a WebXR hackathon in Seattle. He works with various companies on shared experiences and sees WebXR’s cross-platform compatibility as particularly valuable. He asked technical questions about micro-gestures and their integration with the emulation runtime, and provided context about MediaPipe for head and hand tracking, noting the mapping challenges between MediaPipe’s 21-point hand output and WebXR’s 25-point standard. He appreciated not needing the extension when using the React Three XR framework.
Jim Strahorn briefly introduced himself as an architect-trained design thinker from Menlo Park, emphasizing his belief that important information should be visually self-evident, though he didn’t elaborate extensively on this philosophy during the presentation.
Peter Dimitrios identified as coming from Ward Cunningham’s Federated Wiki project and having extensive IBM background with AI work. He described himself as a “lurker” who had been quietly attending meetings while exploring VR applications for text. He suggested using generative AI to create test variations from captured headset events, potentially developing into macros for live usage.
Peter Wasilko actively engaged with technical questions about the emulation system, particularly interested in 3Dconnexion SpaceMouse support and the possibility of cursor-based controller binding. He inquired about saving named position presets and obtaining textual event stream representations with timestamps and state deltas, suggesting applications in pattern recognition through parser development. He appreciated the modular architecture and requested access to the slide deck.
Rob Swigart briefly introduced himself simply as a writer interested in text, though Dene provided additional context about his pioneering work developing the 1986 game Portal and extensive contributions to digital literature. He expressed interest in applying these XR effects to storytelling in the future.
Mark Anderson arrived late due to traffic and asked clarifying questions about the distinction between gestures and micro-gestures, seeking to understand the technical differences rather than expressing concerns.
Brandel Zachernuk contributed insights about continuity between experiences across different platforms, drawing parallels to responsive web design evolution. He discussed the goal of tailored experiences rather than identical ones across platforms, emphasizing understanding what happens on each platform. He inquired about webcam-based head and hand tracking integration for greater continuity and suggested pushing development on both specification and implementation sides to enable broader hand tracking adoption.
AI: Topics Discussed
What was discussed regarding WebXR?
The discussion extensively covered WebXR development challenges, particularly the disruptive development workflow where programmers must constantly switch between desktop coding and headset testing. Felix explained the evolution from WebVR era tools through Mozilla’s WebXR emulator to Meta’s current immersive web emulation runtime. Key technical aspects included the modular architecture separating runtime, control interface, and browser extension components. Framework integration was highlighted, particularly React Three XR’s automatic injection during localhost development. The conversation covered spatial controls, input simulation for gamepad features, system integration challenges like exit XR handling and visibility states, and cross-platform compatibility features. Advanced mixed reality capabilities were demonstrated, including plane detection, mesh detection, and hit testing using captured real-world environments.
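To ground the mixed-reality capabilities mentioned above, the following sketch uses the standard WebXR Hit Test API, which an emulated runtime has to answer just as a headset browser would. It assumes WebXR TypeScript definitions are present, and is illustrative rather than code shown in the meeting.

    // Request an immersive AR session with hit testing (standard WebXR Device API).
    const session = await navigator.xr!.requestSession('immersive-ar', {
      requiredFeatures: ['hit-test'],
    });
    const viewerSpace = await session.requestReferenceSpace('viewer');
    const localSpace = await session.requestReferenceSpace('local');
    const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });

    session.requestAnimationFrame(function onFrame(_time, frame) {
      // Each frame, ask where a ray from the viewer intersects detected geometry.
      const results = frame.getHitTestResults(hitTestSource);
      if (results.length > 0) {
        const pose = results[0].getPose(localSpace); // pose for placing content
        void pose; // sketch: real code would position an object here
      }
      session.requestAnimationFrame(onFrame);
    });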
What was discussed regarding gestures?
Hand tracking and gesture recognition formed a significant part of the discussion. Felix acknowledged the complexity of human hands, noting that while the runtime supports full joint control, the control interface focuses primarily on pinch gestures as the most commonly used interaction. Fabien emphasized the “mind-boggling” complexity of hand movement combinations and the challenges of describing hand motions verbally. The conversation covered micro-gestures, which Felix explained are currently Meta-device specific and handled at the system level rather than being part of WebXR specifications. Ayaskant mentioned MediaPipe’s 21-point hand tracking output versus WebXR’s 25-point standard, highlighting mapping challenges. Brandel suggested exploring webcam-based hand tracking for broader accessibility.
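As an illustration of why the control interface focuses on pinch, here is a rough pinch detector built on the standard WebXR Hand Input API (the 25-joint model referenced above); the distance threshold is an arbitrary assumption to tune per application.

    // Rough pinch detection over the WebXR Hand Input API's 25-joint hand model.
    function isPinching(
      frame: XRFrame,
      inputSource: XRInputSource,
      refSpace: XRReferenceSpace,
    ): boolean {
      const hand = inputSource.hand;
      if (!hand) return false; // not a hand-tracked input source
      const thumb = hand.get('thumb-tip');
      const index = hand.get('index-finger-tip');
      if (!thumb || !index) return false;
      const thumbPose = frame.getJointPose?.(thumb, refSpace);
      const indexPose = frame.getJointPose?.(index, refSpace);
      if (!thumbPose || !indexPose) return false; // joints not tracked this frame
      const dx = thumbPose.transform.position.x - indexPose.transform.position.x;
      const dy = thumbPose.transform.position.y - indexPose.transform.position.y;
      const dz = thumbPose.transform.position.z - indexPose.transform.position.z;
      // Treat fingertips within ~1.5 cm as a pinch; the threshold is an assumption.
      return Math.hypot(dx, dy, dz) < 0.015;
    }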
Were other topics discussed?
Several broader topics emerged: automated testing workflows using tools like Puppeteer for CI/CD pipelines; the philosophical challenges of representing XR experiences to people without headsets; spatial interaction design principles, including where controls should be located in 3D space; the evolution from traditional game design to XR interfaces; accessibility considerations in XR development; the tension between development convenience and authentic user-experience testing; and cross-platform compatibility strategies for graceful degradation from XR to traditional displays.
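For the automated-testing point above, a minimal CI smoke test might look like the sketch below; the dev-server URL is a placeholder, and the check only verifies that a (native or emulated) WebXR API is reachable, not a full session replay.

    // Minimal CI smoke test: drive headless Chrome with Puppeteer and verify
    // the (native or emulated) WebXR API is reachable. URL is a placeholder.
    import puppeteer from 'puppeteer';

    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto('http://localhost:8080'); // hypothetical dev-server address

    const xrAvailable = await page.evaluate(() => 'xr' in navigator);
    if (!xrAvailable) {
      throw new Error('WebXR (native or emulated) not available');
    }

    await browser.close();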
Were there any interesting anecdotes?
Dene shared her enthusiasm for VR fitness, particularly Supernatural kickboxing and Beat Saber with Daft Punk music. Fabien humorously described the physical frustrations of XR development, including messed-up hair and tangled earphones. Felix mentioned that some people tell him he needs to “get a life” because he lives and breathes WebXR development. Frode drew parallels between tradespeople learning the proper tools for each job (like different screwdriver types) and knowledge professionals typically settling for basic tools like Google Docs and Microsoft Word. The group discussed the irony of Glitch shutting down its hosting service, affecting demo availability.
Were any concepts defined?
Several technical concepts were explained: Fabien provided detailed context about WebXR development workflow pain points and the physical/intellectual frustration cycle. Felix defined the immersive web emulation runtime as a modular system separating runtime, control interface, and browser extension. He explained visibility states in WebXR, particularly the “visible-blurred” state when system UI appears while maintaining the XR session. Peter Wasilko described the 3Dconnexion SpaceMouse as a 6-degree-of-freedom input device. Micro-gestures were defined as small gestures not requiring wrist movement, currently limited to finger-only interactions. The concept of “action recording and playback” was explained as capturing complete WebXR sessions with timestamps for later automated replay.
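Since visibility states came up repeatedly, here is a small sketch of how an application reacts to them through the standard XRSession API; the comments describe typical responses rather than anything prescribed in the meeting.

    // React to WebXR visibility changes, e.g. pause gameplay while system UI
    // holds the session in 'visible-blurred'.
    function watchVisibility(session: XRSession): void {
      session.addEventListener('visibilitychange', () => {
        switch (session.visibilityState) {
          case 'visible':
            // Full rendering and input: resume normal operation.
            break;
          case 'visible-blurred':
            // System UI is up; frames still render but input is withheld.
            break;
          case 'hidden':
            // Session not presented; stop non-essential work.
            break;
        }
      });
    }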
AI: Concepts Introduced
Immersive Web Emulation Runtime – Felix defined this as a modular system that separates the emulation engine, control interface, and browser extension into distinct components, allowing developers to test WebXR applications on desktop browsers without constantly putting on headsets.
Visibility States in WebXR – Felix explained the “visible-blurred” state that occurs when users press the Meta button, bringing up system UI while maintaining the XR session but removing access to input sources.
Action Recording and Playback – Felix described this as a feature that captures complete WebXR sessions with timestamps, allowing developers to record interactions in headset and replay them automatically for testing (a generic sketch of the idea follows this list).
Micro-gestures – Defined by Felix and Fabien as small hand gestures that don’t require wrist movement, currently limited to finger-only interactions and handled at the Meta system level.
WebXR Development Workflow Pain Points – Fabien extensively described the disruptive cycle of coding on desktop, testing in headset, encountering failures, and the physical/intellectual frustration this creates.
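The sketch below illustrates the record-and-replay idea in generic terms; it is a hypothetical data shape, not IWER's actual capture format or API, which the meeting did not detail.

    // Hypothetical sketch of action recording and playback; not IWER's API.
    interface InputSnapshot {
      time: number;                          // ms since recording started
      headPose: Float32Array;                // flattened 4x4 transform matrix
      joints: Record<string, Float32Array>;  // joint name -> pose data
    }

    const recording: InputSnapshot[] = [];

    function record(snapshot: InputSnapshot): void {
      recording.push(snapshot); // capture one timestamped frame of input state
    }

    // Replay by feeding stored snapshots back on their original schedule
    // instead of reading live devices.
    function replay(apply: (s: InputSnapshot) => void): void {
      for (const s of recording) {
        setTimeout(() => apply(s), s.time);
      }
    }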
AI: People Mentioned
Brandon Jones from Google (mentioned by Felix as the original WebXR polyfill builder), Ward Cunningham (mentioned by Peter Dimitrios in context of the Federated Wiki project), Brandel Zachernuk formerly at Apple (mentioned by Frode), Eric Vick from Meta (mentioned by Fabien and Ayaskant regarding the micro-gestures demo), Scott Kim (mentioned by Peter Wasilko regarding visible system state design), someone working on Blender UI in spatial computing (mentioned by Felix as a potential future guest)
AI: Product or Company Names Mentioned
Meta/Meta Reality Labs (Felix’s employer), Apple (Brandel’s former employer), Mozilla Mixed Reality (original WebXR emulator extension), Google (Brandon Jones’ employer), Duke University (Felix’s alma mater), Washington State University (Dene’s institution), Oculus Rift CV1 (mentioned by Felix), Supernatural (VR fitness app praised by Dene), Beat Saber (VR game mentioned by Dene), EA/Electronic Arts (published Rob’s Portal game), Amiga (platform for Rob’s game), Project Flowerbed (Meta’s WebXR showcase project), WebXR polyfill (the original compatibility layer), React Three XR (framework with integrated emulation), Three.js (JavaScript library), Babylon.js (3D engine), A-Frame (WebXR framework), Puppeteer (browser automation tool), Glitch (former hosting platform that shut down), MediaPipe (Google’s tracking solution), 3Dconnexion SpaceMouse (input device), Blender (3D software), TestFlight (Apple’s beta testing platform), Tinderbox (software community Peter Wasilko mentioned), ACM Hypertext (academic community), Chrome (browser for extension), WebVR (predecessor to WebXR), CI/CD (continuous integration/deployment), Daft Punk (music artist), Portal (1986 game by Rob Swigart), StackBlitz (mentioned as a Glitch alternative), GitHub (code repository platform), Sony VR (mentioned by Frode), Battlefield (game series mentioned by Frode)
AI: Other
This meeting demonstrated the collaborative nature of the Future of Text community, with participants building on each other’s expertise across technical, academic, and creative domains. The discussion revealed the maturity of WebXR development tooling and the ongoing challenges in spatial interface design. Felix’s willingness to return and potentially bring colleagues suggests growing industry engagement with the community’s research goals. The integration of philosophical perspectives (reality perception) with practical development tools showed the interdisciplinary approach characteristic of this research group. The meeting also highlighted the global nature of the community, with participants from North America and Europe, and the casual yet substantive format that encourages deep technical discussions.
Chat Log URLs
https://futuretextlab.info/2025-schedule/
https://testflight.apple.com/join/gqjWYvhC
https://video.benetou.fr/w/f92jFp74Gy87VDy6zW8G7i
https://fabien.benetou.fr/ReadingNotes/TheCaseAgainstReality
https://worrydream.com/refs/Kim_1988_-_Viewpoint,_Toward_a_Computer_for_Visual_Thinkers.pdf
https://pptr.dev
https://en.wikipedia.org/wiki/CI/CD
https://futuretextlab.info/2025/07/05/ayaskant-panigrahi-july-21st/
https://github.com/meta-quest/immersive-web-emulation-runtime/issues/23
https://3dconnexion.com/us/spacemouse/
https://mediapipe.dev
https://companion.benetou.fr/index.html?username=q2_step_volumetric_frames&emulatexr=true
Chat Log Summary
The chat log reveals active engagement throughout the presentation, with participants sharing relevant links and technical resources. Early in the meeting, Frode shared the project schedule and a TestFlight link for macOS testing, with some confusion about mobile compatibility. Fabien proactively shared a video example of Felix’s work being used for scripted testing and links to related reading materials. Peter Wasilko contributed academic references about visual thinking systems and asked technical questions about SpaceMouse support and slide deck availability. Participants shared contextual links during the presentation, including Project Flowerbed details, Puppeteer documentation, and CI/CD explanations. Technical discussions continued in chat with links to WebXR input documentation and hand gesture libraries. The chat showed the collaborative nature of the group, with participants helping each other understand concepts and sharing resources. Multiple participants expressed regret about Glitch’s shutdown affecting demo availability, with Fabien noting he had over 300 projects hosted there. The chat concluded with appreciation for the presentation and plans for follow-up discussions.