Headset-Computer Transmission

First, it needs to be clear that working in any environment only becomes ideal when we can take the information with us to other environments. This is the only way we can take advantage of different environments with different focuses and hence different affordances. This is true for working now, when there is no single entity that can provide all the interactions needed, and it is even more important for the future, to truly allow a competitive ecosystem of interactions to thrive. We cannot afford a monopoly on extending thought. This goes for all knowledge objects (anything from a PDF, a JSON stream or a personal LLM to a full environment with all its objects and velocities).

To achieve this, we are starting with basic WebSockets between a traditional computer and a headset. We will then also pursue other means of allowing the user’s data to be accessed in the XR environment. The WebSocket layer only receives and transmits Visual-Meta compatible JSON data, so it should be possible to replace it with other transport systems.
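
As a minimal sketch of this transport, the computer side could run a small WebSocket server that sends the Library as Visual-Meta wrapped in JSON when a headset connects, and applies any changes the headset sends back. The port, message shape and helper functions below are illustrative assumptions, not a settled protocol.

    // Computer-side transport sketch in TypeScript, using the Node 'ws' package.
    import { WebSocketServer, WebSocket } from 'ws';

    // Hypothetical helpers, stubbed here; a real implementation would read and
    // write the user's Library on disk.
    function loadLibraryAsVisualMetaJson(): object {
      return { visualMeta: { title: 'Library' }, documents: [] };
    }
    function applyChangeToLibrary(change: unknown): void {
      console.log('change from headset:', change);
    }

    const wss = new WebSocketServer({ port: 8080 });

    wss.on('connection', (socket: WebSocket) => {
      // On connect, push the whole Library as Visual-Meta wrapped in JSON.
      socket.send(JSON.stringify(loadLibraryAsVisualMetaJson()));

      // Receive any changes the headset transmits back (annotations, spatial
      // placements, environment state) and persist them.
      socket.on('message', (raw) => {
        applyChangeToLibrary(JSON.parse(raw.toString()));
      });
    });

Because only JSON crosses this boundary, swapping WebSockets for another transport would mean replacing this layer alone.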

Principles

  • Needs to support a pre-defined Library (the ACM DL in our case) as well as the user’s own Library on their computer.
  • Needs to make it possible to specify the code/program in the headset and the data separately, so that any of the code bases we test can be used with any Library (see the sketch after this list).
  • Should be simple enough for a regular academic/student to set up.
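
One way to honour the second principle is to keep the pointer to the headset code base and the pointer to the Library as two independent settings, so either can be swapped without touching the other. The field names and URLs below are illustrative assumptions.

    // Illustrative only: the code to run in the headset and the Library to
    // load are specified independently, so any code base can be paired with
    // any Library.
    interface SessionConfig {
      codeBase: string; // URL of the WebXR code to load (a developer's build)
      library: string;  // WebSocket URL of the Library data source
    }

    const session: SessionConfig = {
      codeBase: 'https://example.org/experiments/reader/',
      library: 'ws://localhost:8080',
    };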

Headset

WebXR Software

Receives data through the WebSocket and transmits back any changes to individual documents (such as annotations or spatial placements) and the environment.
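
A headset-side sketch, using the browser WebSocket API available in WebXR contexts; the server URL and message shapes are assumptions for illustration.

    // Connect to the computer and receive the Library data.
    const socket = new WebSocket('ws://localhost:8080');

    socket.onmessage = (event: MessageEvent) => {
      // Parse the Visual-Meta wrapped JSON and build or update the scene.
      const library = JSON.parse(event.data as string);
      console.log('received Library with', library.documents?.length, 'documents');
    };

    // Transmit a change back to the computer, e.g. a new spatial placement.
    function sendPlacement(documentId: string, position: [number, number, number]) {
      socket.send(JSON.stringify({ kind: 'placement', documentId, position }));
    }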

Transmission

WebSocket

What is transferred is JSON with Visual-Meta included, so that the parsing on the computer and in the headset can be the same.
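
Because both ends exchange the same Visual-Meta wrapped JSON, the parsing module can be shared verbatim between the computer and the headset. A sketch, with an assumed envelope shape:

    // Shared parsing sketch: the same module runs on the computer and in the
    // headset. The envelope fields are assumptions, not a settled schema.
    interface VisualMetaEnvelope {
      visualMeta: Record<string, unknown>; // the Visual-Meta block itself
      payload: unknown;                    // document, Library or environment data
    }

    function parseEnvelope(text: string): VisualMetaEnvelope {
      const data = JSON.parse(text);
      if (typeof data !== 'object' || data === null || !('visualMeta' in data)) {
        throw new Error('not a Visual-Meta envelope');
      }
      return data as VisualMetaEnvelope;
    }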

Computer

‘Computer’ refers to the user’s desktop/laptop computer or online cloud storage, with the following elements:

Library Environment

This includes information about the Library itself, such as which documents are hidden, which are pinned/favourites, and the layout.

The WebXR Library environment is also stored here as it is received.

This will be transferred as full Visual-Meta wrapped in JSON, to allow for the same parsing in the headset as on the user’s computer. Whether it will be done using a dummy PDF document called ‘Library’ or as pure JSON is to be determined.
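
An illustrative shape for the Library environment record described above; every field name here is an assumption about what such a record might carry, not a defined format.

    interface LibraryEnvironment {
      hidden: string[];  // ids of documents hidden from view
      pinned: string[];  // ids of pinned/favourite documents
      layout: {          // per-document placement within the Library view
        [documentId: string]: { position: [number, number, number] };
      };
    }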

All Documents in Library

A list of all the documents in the Library, including their individual metadata: Citation Metadata, Highlights, Entity Metadata and XR Metadata.

This will be transferred as full Visual-Meta wrapped in JSON to allow for the same parsing in the headset as on the user’s desktop/laptop.
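
A sketch of one entry in that list, with the four metadata categories named above; the field shapes are illustrative assumptions.

    interface LibraryDocumentEntry {
      citation: { author: string; title: string; date: string }; // Citation Metadata
      highlights: string[];                 // highlighted passages
      entities: Record<string, string[]>;   // Entity Metadata (AI-extracted)
      xr: { position: [number, number, number] } | null; // XR Metadata, if placed
    }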

Individual Document

The individual documents can be plain PDFs or augmented PDFs (with added metadata in the form of Visual-Meta) and, in the future, other document formats.

Being augmented with metadata can mean:

  • Core: Citation Metadata (including author, title and date)
  • Extended: Headings, Glossary, References, Endnotes (such as exported from Author)
  • Temporary: AI Analysis/Entity Extraction, XR spatial information and enhanced images (metadata attached to images)
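
The three tiers could be modelled as optional layers over the core citation record, so a plain PDF with only Core metadata remains valid; all field names below are illustrative.

    interface AugmentedDocumentMetadata {
      core: { author: string; title: string; date: string };
      extended?: {
        headings: string[];
        glossary: Record<string, string>;
        references: string[];
        endnotes: string[];
      };
      temporary?: {
        entities: string[];                     // AI analysis / entity extraction
        xr: { position: [number, number, number] };
        enhancedImages: Record<string, string>; // metadata attached to images
      };
    }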

Code

Code from all developers in the community, made accessible for testing.

The code will be uploaded to the headset on every use, along with the data.

It needs to be easy for programmers to add code and simple for the user to choose a code base.
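
A per-session load could look like the sketch below: fetch the chosen code base freshly on every use, then hand it the live data connection. The dynamic import approach and the start() entry point are assumptions for illustration.

    // Load the selected environment code on every use, never cached as 'the'
    // app, then connect it to the chosen Library data source.
    async function startSession(codeBaseUrl: string, libraryUrl: string) {
      const environment = await import(codeBaseUrl);
      const socket = new WebSocket(libraryUrl);
      environment.start(socket); // hypothetical entry point each code base exports
    }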

One notion of interaction, for example, could be to have all the environments available to the user on their right arm (left arm in Fabien’s copy).

Shared Code Services

This is where programmers will be able to upload tools for re-use in different environments.
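
A sketch of what such a service could look like from the environment side; the endpoint and record shape are assumptions, not an existing API.

    // Developers upload a tool once; any environment can list and import it.
    interface SharedTool {
      name: string;      // e.g. 'timeline', 'citation-graph'
      moduleUrl: string; // where the tool's code can be imported from
      author: string;
    }

    async function listSharedTools(serviceUrl: string): Promise<SharedTool[]> {
      const response = await fetch(serviceUrl + '/tools');
      return response.json();
    }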

Future

In the future, this method will also need to be able to provide data beyond document data, such as meeting records.