Components of Environment. The environment will be composed of:
- Visual background color image (user setting to toggle).
- 3D background models (user setting to toggle).
- Workspace, where the user does the actual work, with a foreground and a background. The foreground is for active work and the background is for supplemental information or material not currently being worked on.
Initial Environment. For initial use, before any customization, the environment is imagined as a rectangular desk overlaid on the user’s physical desk (with a brief guided alignment process on first use to position it correctly), a basic flat grey background, a basic floor and various 3D models for a sense of space, all of which need to be easy to toggle off when desired.
Main Environment for Exploration. The main environment we will focus on creating will be a knowledge ‘map’ for exploration, where the user can access annotations from their readings and construct a new spatial Volume of annotations and new text for inclusion in The Future of Text 6. Other Workspaces etc. will be designed to contribute to such an environment.
Knowledge Object. A Knowledge Object is the basic unit of work in a Map type environment, whether in exploratory or explanatory spaces. The system should support such objects as simple text strings, text with metadata, as well as images, 3D models, PDF documents and video, all of which can be placed in a Workspace in a manner that can be exported as a Volume to another user, who can then access this multimedia, multidimensional set of objects.
Volume (space/folder/document). A Volume can be thought of as a saved framing of a set of knowledge objects, a single object, or no objects at all, only a ‘viewSpec’ of what the Volume can later contain.
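To make the relationship between Knowledge Objects and Volumes concrete, here is a minimal TypeScript sketch of the data model described above. The type and field names are illustrative assumptions, not the project’s actual schema.

```typescript
// Illustrative sketch only; names and fields are assumptions.
type KnowledgeObjectKind =
  | "text"       // simple text string
  | "richText"   // text with metadata
  | "image"
  | "model3d"
  | "pdf"
  | "video";

interface KnowledgeObject {
  id: string;
  kind: KnowledgeObjectKind;
  content: string;                       // text body, or a URI for media
  metadata?: Record<string, string>;
  // Placement within a Workspace, so the object can travel with its framing.
  transform?: { position: [number, number, number]; scale: number };
  layer?: "foreground" | "background";
}

// A Volume is a saved framing of zero or more Knowledge Objects,
// plus a 'viewSpec' describing how they should later be presented.
interface Volume {
  id: string;
  title: string;
  objects: KnowledgeObject[];            // may be empty: a viewSpec-only Volume
  viewSpec: { layout: "map" | "outline" | "timeline"; background?: string };
}
```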
Personal Library. The user is presented with their Library, which initially appears as The Future of Text Vol. 1–5 in the form of bound 3D books, alongside a document with a plus sign indicating that they can create a new ‘volume’/document, and the word ‘Import’ for bringing in external documents, including text they may have written elsewhere. The Library should provide a very different experience from the Workspaces and needs a specific interaction to enter (and leave). The Library is simply defined as the user’s stored information accessible from the headset, wherever it is actually stored.
The ‘WorkSpace’. The Workspace refers to spaces/views/workstations, such as Mapping (of various types, including to think and to communicate), Writing from an Annotated Bibliography, Writing Space, Outline, Chart, Timeline, Geographic and so on. These WorkSpaces will need to be separate yet possible to combine, at least insofar as allowing a user to bring further information in. WorkSpaces should also support saved states, so that the user can have several different Maps, Writing Spaces and so on.
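A similarly hedged sketch of how a WorkSpace with saved states might be modelled. The sketch is generic over V, which stands for the Volume type from the previous sketch; the workspace kinds and function names are assumptions only.

```typescript
// Hypothetical sketch: a Workspace holds a current framing plus saved states.
type WorkspaceKind =
  | "map" | "annotatedBibliography" | "writing"
  | "outline" | "chart" | "timeline" | "geographic";

interface Workspace<V> {
  kind: WorkspaceKind;
  current: V;                                          // what is framed right now
  savedStates: { name: string; volume: V; savedAt: number }[];
}

// Save the current layout under a name, so the user can keep several
// different Maps/Writing Spaces and return to them later.
function saveState<V>(ws: Workspace<V>, name: string): void {
  ws.savedStates.push({
    name,
    volume: structuredClone(ws.current),
    savedAt: Date.now(),
  });
}
```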
Interactions
This is an overview of the interactions the user will need interfaces to be able to perform.
- Desk-Free/Desk: toggling between the two styles.
- Settings: changing environment settings such as colors and background.
- Library: accessing the Library, where the user can store their research information as well as their own manuscripts.
- WorkSpace: toggling through the different workspaces.
- WorkSpace interactions (a small code sketch of these toggles follows the list):
- Interacting with a knowledge object to toggle foreground/background (to bring it forward to work on, or push it back to remain in view).
- Interacting with a knowledge object to toggle vertical/desktop (to lay it onto the desktop in Desk style, or stand it up vertically).
- Interacting with a knowledge object directly (to move, scale, hide/show etc.)
- Interacting within a knowledge object (such as writing text in a node/document).
- Interacting with knowledge objects by type (group selection).
- Interacting with a knowledge object externally (such as annotating/tagging it).
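A minimal TypeScript sketch of the per-object toggles listed above (foreground/background, vertical/desktop, group selection by type). All names are illustrative assumptions rather than the actual implementation.

```typescript
// Sketch only; field and function names are hypothetical.
type Layer = "foreground" | "background";
type Orientation = "vertical" | "desktop";

interface PlacedObject {
  id: string;
  kind: "text" | "image" | "model3d" | "pdf" | "video";
  layer: Layer;
  orientation: Orientation;
  hidden: boolean;
}

// Bring an object forward to work on, or push it back so it remains in view.
function toggleLayer(obj: PlacedObject): void {
  obj.layer = obj.layer === "foreground" ? "background" : "foreground";
}

// Lay an object flat onto the desk (Desk style) or stand it up vertically.
function toggleOrientation(obj: PlacedObject): void {
  obj.orientation = obj.orientation === "vertical" ? "desktop" : "vertical";
}

// Group selection by type, e.g. every PDF currently in the Workspace.
function selectByKind(objects: PlacedObject[], kind: PlacedObject["kind"]): PlacedObject[] {
  return objects.filter(o => o.kind === kind);
}
```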
Interfaces
This is an overview of the currently imagined interfaces the user can avail themselves of to perform the interactions listed above.
- The Wrist Menu is what Fabien has already implemented. It is a ‘ball’ on the user’s wrist in VR which can be tapped to execute specific commands or be expanded to show a range of options. These options could potentially also be torn off to float in space or be attached as a HUD.
- Non-Primary Hand Menu is based on what we had last year in Reading in XR, where the user looks at their non-primary hand and options are presented on the fingers and potentially in the palm of the hand. (For a right-handed person the non-primary hand menu will be on the left hand, with the right hand’s fingers used for pointing/interaction.)
- Primary Hand Menu is on the user’s active hand, which will usually be used to point to the non-primary hand. This menu could be made harder to spawn (less likely to appear accidentally), perhaps by requiring the non-primary hand to point at the open primary hand’s palm for the menu to appear (a sketch of such a guard follows this list).
- Floating Menu, which can either be spawned by itself or torn off from a Hand or Wrist Menu.
- HUD, which can either be spawned by itself or torn off from a Hand or Wrist Menu.
- Embodied gestures, such as tapping the head to indicate ‘head back to Library’ or the chest to store something, etc.
- Gestures are in a way native to XR and provide a great sense of immersion and interaction, but there is a real danger of the system misinterpreting gestures, including executing them when not intended, as we learnt last year. This is one of the reasons we use the hand menus, since it is unlikely a user will touch, or point to, a finger on their other hand accidentally. We can also look into using both arms together, such as both doing a lifting motion to load a Library.
- Table can provide additional affordances, as described below, while also constraining the fluidity of the environment.
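One way to make menu spawning deliberate, as suggested for the Primary Hand Menu above, is to require the non-primary fingertip to dwell near the open primary palm before the menu appears. The sketch below is speculative; the HandPose shape, thresholds and function names are assumptions about whatever hand-tracking layer is used.

```typescript
// Speculative guard against accidental menu spawning; all values are assumptions.
interface HandPose {
  palmPosition: [number, number, number];
  indexTipPosition: [number, number, number];
  palmOpen: boolean;
}

const SPAWN_DISTANCE_M = 0.05; // fingertip must be within 5 cm of the palm
const DWELL_MS = 400;          // and stay there for 400 ms before spawning

let dwellStart: number | null = null;

function distance(a: [number, number, number], b: [number, number, number]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Call once per frame with the latest tracked poses; returns true when the
// Primary Hand Menu should be shown.
function shouldSpawnPrimaryHandMenu(primary: HandPose, nonPrimary: HandPose, now: number): boolean {
  const pointingAtPalm =
    primary.palmOpen &&
    distance(nonPrimary.indexTipPosition, primary.palmPosition) < SPAWN_DISTANCE_M;

  if (!pointingAtPalm) {
    dwellStart = null;
    return false;
  }
  if (dwellStart === null) dwellStart = now;
  return now - dwellStart >= DWELL_MS;
}
```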
The approach here is basically that the ‘Desk’ style has more controls/ways of interacting than the ‘Desk-Free’ style, and that all the interfaces are used for both styles, with the Desk style simply adding the Desk-specific interfaces.
Desk & Desk-Free Interactions & Interfaces
Toggle Desk and No-Desk Styles
• Wrist Menu.
Library Access (including Saving/Export)
• Primary Hand Menu (the user’s active hand) to show (initially) lists of stored documents/data.
• Or, assign one finger (maybe the little finger) or the palm to always bring the user to the Library.
• Or, a two-handed palms-up gesture, moving upwards from below, to ‘lift’ the Library into view.
Environment (Background, Look & Feel, Left/Right handedness etc.)
• Wrist Menu.
Toggle WorkSpaces
• A gesture, similar to the Desk gesture for toggling Workspaces, but without the table.
WorkSpace Interactions (using Map as an example):
• Direct manipulation to select and move individual items. Ideally also drag-select to select more than one object.
• Hand menu, with finger assignments as listed below (a small mapping sketch follows the list):
1 (thumb)
2 (index) Hide/Show
3 (long)
4 (ring)
5 (little) Layout
– (palm)
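A minimal sketch of how these finger assignments could be represented, with the unassigned fingers left open; the command identifiers are illustrative only.

```typescript
// Sketch of the per-finger command mapping listed above; assignments are illustrative.
type Finger = "thumb" | "index" | "long" | "ring" | "little" | "palm";

// Only the assignments named above are filled in; the rest stay open for now.
const handMenuCommands: Partial<Record<Finger, string>> = {
  index: "hide/show",
  little: "layout",
};

// Returns the command to execute when a finger is tapped, or undefined if
// that finger has no assignment yet.
function commandForFinger(finger: Finger): string | undefined {
  return handMenuCommands[finger];
}
```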
Toggling between Desk and No-Desk would probably require the user to move away from their physical desk, with the information that was displayed following them for that distance and then locking at that distance (a speculative sketch of this follows).
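A speculative sketch of this ‘follow, then lock’ behaviour: while the user walks away from the desk the content keeps its offset relative to them, and once they have stepped away and come to rest it locks in place. The thresholds and stop-detection heuristic are assumptions.

```typescript
// Speculative follow-then-lock behaviour; thresholds are assumptions.
type Vec3 = [number, number, number];

interface FollowState {
  locked: boolean;
  offset: Vec3;            // content position relative to the user while following
  lockedPosition?: Vec3;
}

function add(a: Vec3, b: Vec3): Vec3 {
  return [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
}

function dist(a: Vec3, b: Vec3): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

const MIN_DISTANCE_FROM_DESK_M = 1.0; // the user must have stepped away from the desk
const REST_SPEED_M_PER_S = 0.05;      // considered 'at rest' below this speed

// Call once per frame; returns where the displayed information should be.
function contentPosition(
  state: FollowState,
  userPosition: Vec3,
  deskPosition: Vec3,
  userSpeed: number
): Vec3 {
  if (state.locked && state.lockedPosition) return state.lockedPosition;

  const awayFromDesk = dist(userPosition, deskPosition) > MIN_DISTANCE_FROM_DESK_M;
  if (awayFromDesk && userSpeed < REST_SPEED_M_PER_S) {
    // The user has stepped away and come to rest: lock onto that distance.
    state.locked = true;
    state.lockedPosition = add(userPosition, state.offset);
    return state.lockedPosition;
  }
  // Still following: keep the same offset relative to the user.
  return add(userPosition, state.offset);
}
```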
Desk Interactions & Interfaces
The same interactions afforded in the Desk-Free style will remain available, and the following will also be made available:
Toggle Workspaces
• Swipe gesture. It should also be possible to see saved layouts within each Workspace, which could perhaps be accessed through a similar swipe up.
WorkSpace Interactions
- Visual ‘tabs’. Tabs at the top (away from the user) of the desk, as buttons which can be toggled on and off.
- Text on Table. Put text onto the table for the user to be able to lean forward to read and annotate using a finger or ‘pen’, against a real surface.
- Slider. If controls can make use of a slider, this lets the user slide against the table for potentially finer-grained control.
- Drawing Surface. When using a document on the table the user can more easily draw in 2D.
- ‘Pull out’. Extract documents or other information from the table.
- ‘Pull out’. From sides of table for extra virtual space for specific information.
- ‘Pull Out’. From back of table to access saved Workspace views.
- Cuttings. Cuttings area for snippets to have ‘to hand’ and not far away.
- Slider. A part of the desk, such as maybe the right part (for right-handed users, the left for left-handed), as a slider to scroll up and down in a ‘focused’ document (sketched in code after this list).
- Slider. A slider to move background elements forwards and backwards (maybe on the opposite side of the desk from the scroll slider?).
- Rulers etc. Affordances for rulers and similar on a flat surface when information is on the desk.
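An illustrative sketch of the desk-edge scroll slider mentioned above: a strip of the physical desk is mapped to a scroll position in the focused document. The strip bounds, coordinate convention and pixel units are assumptions about the tracking and rendering layers.

```typescript
// Illustrative only; the strip geometry and units are assumptions.
interface SliderStrip {
  nearEdge: number; // desk-surface coordinate closest to the user (metres)
  farEdge: number;  // coordinate furthest from the user (metres)
}

// Convert a fingertip position along the strip into a normalized 0..1 value.
function sliderValue(strip: SliderStrip, fingertip: number): number {
  const t = (fingertip - strip.nearEdge) / (strip.farEdge - strip.nearEdge);
  return Math.min(1, Math.max(0, t));
}

// Map the slider to a scroll offset in the focused document; sliding against
// the real surface should allow finer-grained control than a mid-air gesture.
function scrollOffset(
  strip: SliderStrip,
  fingertip: number,
  documentHeightPx: number,
  viewportHeightPx: number
): number {
  const maxScroll = Math.max(0, documentHeightPx - viewportHeightPx);
  return sliderValue(strip, fingertip) * maxScroll;
}
```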