Demo of the system by Fabien Bénétou, prototyper for the Alfred P. Sloan-supported ‘The Future of Text in XR’ project, given to Lubos Omelina, postdoctoral researcher at VUB.

AI Summary
Based on the conversation, Lubos tried on a VR headset to test a prototype interface for managing documents and code. Here’s what he experienced:
1. Initial Exploration:
– He saw a black rectangle with text and a yellow shelf
– He could interact with orange and purple cylindrical objects
– He could manipulate text snippets using pinch gestures (right hand to move, left hand to execute)
– He learned to toggle instructions by tapping his virtual wrist
– He could interact with a yellow shelf by pinching its middle to adjust its position
2. Interface Elements:
– There was a black console board on the left for error messages
– White wireframe panels around him could potentially hold PDF documents
– Code snippets or text could be manipulated in the space
– The system had the ability to take screenshots
3. His Main Feedback:
– While the original intent was to test bibliography management, Lubos became more excited about the potential for coding applications
– He specifically saw value in using it for Python/Jupyter notebook-style programming where:
– Code cells could be arranged non-linearly in space
– Users could have more flexible control over execution sequence
– Different components (training, validation, database loading) could be spatially organized
– He was less convinced about using it for paper writing, noting he typically uses Overleaf
– He encountered some minor technical issues (momentary freezes and some stray black elements appearing)
The experience seemed to spark ideas for Lubos about potential applications in programming, even though that wasn’t the original intended use case of the prototype.
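To make that notebook-style idea concrete, here is a minimal sketch, in Python and not part of the prototype itself: named cells for loading a database, training, and validation are kept as independent snippets, and the execution sequence is chosen explicitly by the user rather than fixed by top-to-bottom order, roughly the kind of control a spatial arrangement could offer. The cell names and contents are purely illustrative placeholders.

    # Illustrative sketch only, not the prototype's actual code: notebook-style
    # "cells" identified by role, executed in whatever order the user picks.
    cells = {
        "load_database": "data = list(range(10))",   # stand-in for loading a dataset
        "training":      "model = sum(data)",        # stand-in for a training step
        "validation":    "print('validation score:', model / len(data))",
    }

    namespace = {}  # shared state, like a notebook kernel

    # The sequence is explicit and user-controlled, not dictated by cell position.
    for name in ["load_database", "training", "validation"]:
        exec(cells[name], namespace)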
This work is part-funded by the Alfred P. Sloan Foundation.