5 February 2024

Frode Hegland: Good morning.

Dene Grigar: Good morning. How are you?

Frode Hegland: Yeah. I’m good. It was a long flight, but I’ve been resting, not trying to sleep too much because of the time zone and jet lag, but feeling a bit weary, so. Yeah. I’m okay.

Dene Grigar: It’s hard to go east. It’s easier to go west. So you didn’t have as bad a jet lag coming over here. But it’s always hard to go to England for me.

Frode Hegland: I’ve heard many friends say that, but when I go from here to Singapore or Asia, or from here to America, it doesn’t matter; mornings are always easier somehow. So it makes no sense. But then again, I’m good at sleeping, and I had a good night last night, so I cannot complain.

Dene Grigar: Can I share the screen for a second? I’ll show you what I have down.

Frode Hegland: Yeah, absolutely.

Dene Grigar: Hang on a second. I got to set this up.

Frode Hegland: Okay, so you’re now on your bigger computer.

Dene Grigar: I am.

Frode Hegland: I can tell for two reasons: you’re not looking down, and your face is very evenly lit. So that’s the good screen side.

Rob Swigart: Okay.

Dene Grigar: All right. I’m going to quit and reopen. I’ll be right back.

Frode Hegland: Okay.

Dene Grigar: Part of the reason the lighting is different is that the light is coming straight at me from the window instead of from the side. When I’m over here on my laptop, you’re only seeing this side, with the light just coming in here. But now I’m facing the light, and that’s different.

Frode Hegland: Yeah, that makes sense.

Dene Grigar: All right, let me show you the screen.

Rob Swigart: Share.

Dene Grigar: All right, here we go. Success is mine. All right. I want to go here to. Yeah. Can you see that?

Frode Hegland: You’re back again.

Dene Grigar: Okay. So, onboarding update, programming. As you saw, Adam can’t come on Wednesdays most of the time. So he may be there, he may not. It depends on Scouts.

Frode Hegland: Hang on. I thought it was Monday. He couldn’t come on Monday.

Dene Grigar: Yeah. That’s right. Okay. And then also, Allan wrote me; I wrote him and said, you know, let’s chat, and he wrote back and said okay, but I don’t know when we’re going to chat. Anyway, I’ve got him down for Wednesday. And for the next steps I just kept what we had, because we need to continue working on naming conventions, gesture inventory, design principles, definitions and the wish list.

Mark Anderson: Yeah, yeah, yeah.

Dene Grigar: Hang on a second. Have a good day. I’ll see you later. I’m not going to campus. Okay?

Rob Swigart: Okay.

Frode Hegland: Okay. Yeah, we’ll do those. I talked to Adam for quite a long time in the taxi on the way from the airport yesterday, and a couple of relevant things came up. One is that we feel we need another day to work on design, different kinds of design things. So that’s kind of interesting; we’ll see if there’s an appetite among people for that. Also, Dene, it’s good that you’re here on Mondays, but you don’t need to be here every Monday, considering today is a very general day. So, you know, feel free; you’re not bound by Monday. Wednesday, of course, is our killer day, right?

Frode Hegland: I’m saying that in the context of adding a day. So that day would be.

Mark Anderson: Yeah, more design.

Frode Hegland: Design-led. Because on Wednesday we have this time here to show and go through what they’ve actually done. But one thing that I realized on the flight is that there are other design aspects we haven’t gone into yet that we need to discuss. Hello? Rob, are you here? Rob?

Rob Swigart: Rob.

Dene Grigar: Morning. Also, I got notice that your headset is coming in today. Greg’s out, so I’ll get it tomorrow and ship it to you. Your inserts.

Frode Hegland: Yeah, thank you for that. Let’s stop sharing for a second; I’m just going to upload a document to Basecamp, first of all. Yeah, thanks for that. But it may not be just the insert; there’s the extra thing that’s needed, you know, the little framing piece. So please wait until both of them are there.

Rob Swigart: Finally got hooked up.

Frode Hegland: All right, you got yours?

Rob Swigart: No. It’s coming tomorrow.

Frode Hegland: Very good. I need to upload a document for you guys. Dene, where should I put it? It’s kind of a report about last week. Does that go in a special folder?

Dene Grigar: Yeah. Let me go back into our home.

Rob Swigart: I can go.

Frode Hegland: Into my journal, actually.

Dene Grigar: Go to docs and files.

Rob Swigart: Yeah, it’ll.

Frode Hegland: Go into my journal. Yeah. Is it? That’s pretty much what it is. Okay. That’s fine.

Dene Grigar: Yeah. That’s fine.

Rob Swigart: Good.

Dene Grigar: While you’re doing that: Rob, I want to let you know that you can download the game from the website and try it when you get your Vision Pro.

Rob Swigart: Okay. People are asking.

Dene Grigar: It’s been online for a long time. I mean, I wasn’t sure if you were looking at it or not.

Rob Swigart: I have to find it.

Dene Grigar: I’ll get it to you now. Okay, I’ll drop it in our Slack channel.

Rob Swigart: That would be great. Then I’ll put it on my website, too. Now I’ve just lost the thing I was going to put on. Here we go. I want to put this in the chat and show you something, which I think is quite remarkable.

Dene Grigar: And here, in our chat, I’m going to drop this in. That’s the link to the website, and the website has a Download the Game link, remember? Yeah, okay. Let me share the screen real fast with Rob.

Mark Anderson: Yeah.

Dene Grigar: So this is the website. Download the game is here.

Rob Swigart: Oh, okay.

Dene Grigar: So, itch: we put it on itch, but itch would not let us offer the entire game for download because it’s bigger than the one-gig limit.

Rob Swigart: Oh, so.

Dene Grigar: We had to put it on a server, and it’s sitting on the ELO server. And you can download it from there.

Rob Swigart: Okay. But maybe she can do it with her PlayStation, I don’t know.

Dene Grigar: I don’t know, I don’t think it’s going to work on PlayStation. It’s not set up for that. But it’s definitely set up for headsets.

Rob Swigart: Okay, well, she’s got the VR headset, so.

Dene Grigar: Okay, well, we looked at it last week when Frode was here, and Frode liked it a lot. He thought it was good, so maybe you will too.

Frode Hegland: Oh, you haven’t seen it yet, Rob?

Mark Anderson: No.

Dene Grigar: He has. He hasn’t seen it.

Rob Swigart: I’ve seen it, but I haven’t, you know, been able to experience it.

Frode Hegland: Yeah. No, it was very impressive. Very inspirational actually. Okay. Hey, Mark.

Dene Grigar: Hey, Mark, the way we’re handling text is what I kept harping on earlier, because we really did kind of solve a problem with handling text in a VR environment, I thought. And Frode agreed.

Rob Swigart: I just wanted to share this image because I had lunch with an artist I work with, who did a little book of a poem of mine, which I can show, but it’s complicated. This is a piece that I think is hers. But what struck me: this is a quote from an E.T.A. Hoffmann story, Princess Brambilla. And look at the image. Weird.

Michael Bonfert: Hey, everyone!

Dene Grigar: I love that. What is the title of the book?

Rob Swigart: The story is Princess Brambilla, I think. It’s 1820. But she’s wearing a headset. I just find that remarkable. I guess that really is a headset, though. But the quote is, you know: how do you know that your dream isn’t the real reality, and that your daily life is just a kind of aberration of your lost senses?

Frode Hegland: But that’s your friend’s illustration, right?

Rob Swigart: Yeah, I think so. It’s either hers or somebody else’s. A friend of hers. But I think it’s hers.

Frode Hegland: Okay. Yeah. No, I just thought when you say 1800s or something, it got a bit. Yeah.

Rob Swigart: No, initially I thought: that’s weird. But she doesn’t know about VR, so I’m not sure what she had in mind there. I’m going to ask her. But what she does with text is stuff like that, sinuously sliding down. It’s the past of text.

Mark Anderson: Yeah.

Frode Hegland: Part of the future. So.

Dene Grigar: Morning, Michael. Good to see you again. Hello, Allan. Yeah, it’s been a while. Good to see you again.

Michael Bonfert: I would be here more often, but I now have dates that collide with this one.

Frode Hegland: Sorry. One second everyone.

Michael Bonfert: And we have some new faces.

Dene Grigar: Yeah. We’re growing the community, trying to.

Michael Bonfert: Andrew Aitken.

Rob Swigart: Hi. Maybe some of us are.

Ken Perlin: Maybe some of us are AIs.

Michael Bonfert: Well, you never know. Maybe I am still trying to figure that out.

Rob Swigart: That’s right.

Ken Perlin: Realizing you’re an AI, that’d be a great plot for a movie.

Michael Bonfert: I bet there is a Black Mirror episode on that.

Ken Perlin: Blade Runner, The Matrix. There you go. Well, I think Deckard in Blade Runner was an AI.

Dene Grigar: He was. He was an android?

Mark Anderson: Yeah.

Frode Hegland: Was an.

Dene Grigar: Android?

Mark Anderson: Yeah.

Frode Hegland: Yeah. They didn’t have an iOS back then.

Frode Hegland: So. Yeah. Hello, most people. So, I just came back from spending a week with Dene and Andrew in the Pacific Northwest. I’ve learned how to say it now: it’s called the Pacific Northwest; more locally, Vancouver, Washington. It was an absolute treat to sit in the lab and just go through these things. Quite brilliant. The week before that I spent in Silicon Valley, celebrating the 40th anniversary of the Macintosh, which was kind of incredible. Over the next day and a half, Rob and I together spent a lot of time with both Bruce Horn, who wrote the Finder, and Bill Atkinson, who wrote MacPaint and HyperCard. And that was a very weird experience. Bill Atkinson is now more about shared consciousness and direct communication and all these things. He feels that this kind of stuff is a bit primitive. So yeah, that was that.

Rob Swigart: I would say proto. He’s ordered his headset, though. He’s picking it up this week. Yeah.

Frode Hegland: Oh, that’ll be interesting. Keep me updated. Yeah. So, I really connected with their shared background, so that was very, very good. Sorry, my chair. I do really want to hear how he gets on with that.

Mark Anderson: I.

Frode Hegland: I uploaded a document to Basecamp, but, really briefly: there is another topic that I think we need to discuss. Not now, immediately, right off the bat, but I just want to bring it up to the team and see where you feel it fits. It may be a Wednesday topic, maybe a different day; it might be part of today. So here’s the thing I imagined. Say I write some sort of a trip report, a little bit of ‘these are the goals as I see them’, that kind of stuff. Then I imagine Andrew doing some work and referring to some of that. How does that connect? If we do it on the web, there’s one way. But how do you do a link that kind of looks into the future, right? So I did a tiny invention which is illustrative of what I think we need to discuss. Hang on, I’m going to step back half an inch here. Spending time last week working properly in a headset, it seems that one of the key attributes is, as I think we agree, the incredible vastness of space that we have. So far, we’ve been working a lot on the mechanics of being in the headset: you know, how do you hold the PDF, how do you flip it, how do you show text and so on. Right. But when you have this massive vastness, one of the key things we need to look at is how we can get text.

Frode Hegland: In a document and between documents, to connect. Does that kind of make sense, that we need to also look at that? Right, I see a few shaking heads. So, the invention, which again is not something we should necessarily do, but is illustrative: imagine that I write blah blah blah, and then in hard brackets I write ‘single document interactions’. That’s what I write in my document, and I export it to PDF, because that’s what we have to do for a while. Then Andrew, reading through this, writes a document about experiments he’s done with single document interactions. So what he does when he exports his document is add ‘single document interactions’ to the metadata in the export dialog, because we have Visual-Meta. That means that for all of us, as long as we have both of these documents in the same folder, we can open the original document, and this ‘single document interactions’ text is now something you could click on, and it’ll show all the other documents that have that as their tag. These are probably things that have been done before; it’s basically a search that happens if you do that. It’s a convention. And again, I’m not saying we should jump into doing something like this, but I do think we need to put some time aside to figure out not just a dumb one-way link, but better ways that we can have things in our space connect.
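A minimal sketch, in Python, of the convention Frode describes. Everything here is hypothetical: it assumes each exported PDF sits beside a small sidecar file (name.meta.json) listing the tags added in the export dialog, standing in for tags that would really be read from the PDF’s Visual-Meta block.

    import json
    from pathlib import Path

    def documents_tagged(folder: str, tag: str) -> list:
        """Return the PDFs in `folder` carrying `tag`, newest first."""
        matches = []
        for meta in Path(folder).expanduser().glob("*.meta.json"):
            # Hypothetical sidecar format: {"tags": ["single document interactions"]}
            tags = json.loads(meta.read_text()).get("tags", [])
            pdf = meta.with_name(meta.name.removesuffix(".meta.json") + ".pdf")
            if pdf.exists() and tag.casefold() in (t.casefold() for t in tags):
                matches.append(pdf)
        # Newest on top, matching the behaviour described above.
        return sorted(matches, key=lambda p: p.stat().st_mtime, reverse=True)

    # Clicking the bracketed tag in a reader would then amount to:
    # documents_tagged("~/FutureTextDocs", "single document interactions")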

Dene Grigar: Mark.

Frode Hegland: Mark. I thought you would have your hand up. Which is why.

Mark Anderson: Well, I had a thought, but I didn’t want to jump right in, because I did ponder on this. What I’m hearing is basically a tagging system. Essentially, we’re adding metadata. You’re not proposing another form of linkage; you’re adding a mark that allows you to run a query against that metadata. If I understood the gist of what you’re doing.

Frode Hegland: Sorry, I have to be excused. I have an issue here I have to deal with. One minute.

Mark Anderson: Okay.

Dene Grigar: Mark, while we’re waiting for him to get back: thank you for the article, David Kolb’s article from Hypertext, on sprawling places.

Mark Anderson: Yeah, yeah.

Dene Grigar: That’s a wonderful piece.

Mark Anderson: Yeah. And I’m sort of annoyed that I hadn’t spotted it when I was looking, you know. So it’s another thing that’s missing from the proceedings. Well, it isn’t anymore, but I don’t know whether I’ll be able to get the library to add it back. It’s kind of weird: he obviously did a two-page PDF that went into the conference, but it says quite clearly there’s an essay. So thankfully it’s there.

Dene Grigar: Well, I looked at my archive, because David has given me all of his archives for The NEXT, right. So I have his papers, and all of his work has been digitized. I went through and I did have that copy, so I already had it from him. But it’s good to have a second copy coming from someone else, to have this kind of provenance. So thank you for that. David is a wonderful hypertext developer, artist and writer, who was also a good theorist at the time. Yeah. He’s also coming to visit in a couple of weeks, Mark. He’s coming to see me.

Mark Anderson: Oh, right. Yeah. Well, I saw him; he was at the Tinderbox meetup at the weekend. Cathy Marshall gave us a very good rundown on some of her research as well. Small circles. But I see Fabien has his hand up.

Dene Grigar: Fabien. Good morning. You’re muted.

Michael Bonfert: Hello? Hello. Bonjour.

Brandel Zachernuk: Whatever.

Fabien Bentou: Bonsoir. Yeah, it’s a little.

Frode Hegland: I’m so sorry, my mother couldn’t find her keys. I really apologize; I had to do a little bit of ‘why don’t you come here’ kind of stuff.

Brandel Zachernuk: I’m talking.

Dene Grigar: Fabien’s starting to.

Frode Hegland: Talk. Fabien, please start again. I am here to listen. Sorry.

Fabien Bentou: It was to distract everybody while you were away, but I think it might be interesting for you too. So, of course, I have more gadgets this time. Some of you might be aware of these: it’s Nreal, which is relatively old actually, or Xreal now, glasses that you don’t see through. Why do I mention them? I got them because I need to review the landscape a bit, and also because there are consoles like the Steam Deck that you can play on their own. It’s 3D, but like a console, traditional 3D: you’re not in six degrees of freedom. But if you plug the glasses in, instead of going on the train or whatever and playing the game on your nine-inch screen, I think, or maybe a tiny bit more, but it’s small, you can play on the glasses, and then you have a huge screen in front of you. And I think for, let’s say, this use case it is interesting, in order to contrast proper six degrees of freedom, where one grabs the document, moves the document, moves with the document, with this kind of display, because it’s so much cheaper. To be honest, I don’t really like it. But I think in terms of showcasing the advantage, because, Frode, you started with this: being able to move in space and having a set of documents, knowledge, organized in space. Then I think it’s also interesting to contrast with both what we had so far, namely posters on the wall or Post-it notes on the desk, etc.,

Fabien Bentou: but also this kind of thing, which is kind of in between. It’s not like a normal screen, you can still move with it, but it’s still flat on a sphere, let’s say, even if you can curve it in the end. So I think it might be interesting, as one part of such a project, to contrast proper six-DOF knowledge work, documents and knowledge work outside of the screens, with this kind of interface. Because in a lot of situations; I was looking again at somebody reviewing the Vision Pro, I think it was Allan who linked it on Twitter earlier, and they marveled at the user experience, the UI of spatial window management. I did not try the headset, but as far as I know, for example on the Quest 3, 2, 1, the windows are actually aligned. I’d be curious, and maybe you can already reply: when you display the UI with the different windows, they are not in 3D; they’re all on the same plane, even if curved. So it’s not proper depth. Okay. Then again, it’s interesting to be able to contrast this versus something like that which is not, and then find use cases that properly leverage six degrees of freedom, including for the UI of the window manager.

Frode Hegland: Yeah, we definitely don’t want to get tunnel vision and only look at whatever current items are on the market. Now, there is something that I find really, really annoying and that you should all be aware of. By the way, I’ll show them next to each other. So here’s the difference: this is the Quest Pro, and this is obviously the other one. This one seems quite massive, but mostly at the back. But one issue that I personally have had, wearing glasses: I got contact lenses to correct for, you know, normal distance vision. So when I put these on, they don’t work; it’s all blurry. The contact lenses really need to be optimized for reading distance, because the optical distance is about a meter and a half to two meters away. That was a bit shit. So the Zeiss lenses that I’ve ordered may be completely wrong. And there was no description of this on Apple’s website, of which prescription you should use if you’re an adult like us. So I’m going to be experimenting with that and giving you feedback.

Ken Perlin: I just want to mention: I bought the Xreal quite a while ago, and I started using it, and at first I was really impressed with it, but then I just found myself not using it anymore. I found that it was just more comfortable for me to look at my computer screen. I could see certain circumstances where I’d want to use it, like if I were lying in bed and wanting to read, but that almost never comes up for me. Or if I wanted to be on an airplane and have nobody see what I was looking at, but that also doesn’t come up for me. In general, if I have the choice, I’d rather look at my MacBook screen if I’m just looking at flat text. So I have it, I tried it, I keep looking at it thinking I should use this, but I end up not using it, for whatever reason.

Frode Hegland: The small.

Michael Bonfert: That’s interesting that you point this out, because there’s a keynote speaker who, at least in Germany, is quite popular, Philip Rosenthal, who also worked on the XR terminology framework. These days he goes up on stage in front of the VR communities he’s presenting to and asks who thinks this will be the next big thing, the breakthrough, that we’ll all use in a couple of years, blah, blah, blah. And everybody’s like, whoa! And then he goes: okay, who is actually using it? Seriously, how much time are you spending in it? Who in the audience is using it more than a day a week? Who uses it for this and that task? And it drops so drastically, who still raises their hand. It’s shocking that even within the community we haven’t adopted that many use cases where we’ve integrated it into our everyday life. And, well, guilty here.

Frode Hegland: Yeah, a good point. Allan.

Allan Laidlaw: Yeah. Springing off that is something that I’d like to talk about, and actually do a session on, or, you know, facilitate some collaboration around. We don’t have to do it right now, but I’ve put some thought into what I think is a good exercise. It’s always been good for me, and maybe others here have done it. It’s a pre-mortem, which basically says: okay, let’s look ahead to a certain time, whether it’s November after ACM or sometime later, and what we’re trying to do fails. Right? And really imagine the varieties of ways that it can fail. This is not meant to be an intimidating thing. It’s actually a really good way to figure out, when we’re caught in our own cycles of enthusiasm or urgency, if we can step back and go: you know, here’s an experience that went totally differently from how I wanted it to go.

Rob Swigart: What?

Allan Laidlaw: What caused that? What could have been different? So I would love to do a session on it. I think it’s a broader-topic kind of thing, because it isn’t specifically tied to the people that are making whatever. But I find that it can really help us figure out ahead of time what gaps we may have in our perspectives, in our processes. And this would be the time to do that. So, just putting that out there. The other thing I wanted to bring up is the broader future of text outside of the Sloan project, or perhaps hydrated by the Sloan project, but the larger community. There are various topics that are outside of VR, one of them being, say, Fabien’s offline and federated kind of living. That’s really important to me too. And so I’d love to, at some point, talk about what the tendrils of the community beyond XR are that we think others would be interested to be involved in, because I think that’s worth talking about.

Frode Hegland: Absolutely. Maybe today, but if not, definitely. Sorry, my door was moving. It could also be a very, very important and good Wednesday session, which can then later on help lead us into user story mapping. So I’m very, very excited to see that question. It’s a typical Allan reverse question that makes us think. So yeah, that’s really, really cool. And as far as the tendrils are concerned, we’ll probably talk about that today; that’s very much for the whole community. But over to you again.

Ken Perlin: Yeah, I’m just responding to what Michael said, which I thought was a really good question: what do we actually use? I use this every day. I spend a significant amount of time in my Quest 3. But except for my morning exercises, I never use it in VR; I always use it in video-passthrough extended reality. And in fact, I went online and got a little 3D-printed thing. I don’t know if you can see this, but if you notice, I replaced the gasket with something that looks quite different. And the significance is that it’s open on the sides, just like the Quest Pro. That way, because I’m using video passthrough anyway, I’m in my own room, I see everything; you know, I see my hand here, and I bring it around and it’s still here. So if you’re going to be using it for XR, it’s much, much better to just change it slightly and make it open. I mean, I just swear by this thing. There are open-source printable files. I bought the first one because I wanted to test it out. I could probably, let me post to the chat here, I could probably find the link. But I can’t vouch for any of the open-source printable files, because I haven’t tried 3D printing it yet; but I can give you the link to the guy who sells it for 30 bucks. And then if you want to also try printing it out, go for it. But this is what I know right now. All right.

Frode Hegland: Yeah, I do agree. I also own the Pro; the side bits, I don’t have them on, as I also like a little bit of the environment. And I haven’t spent enough time in the Vision to really feel it out. But today I took off the light shield, because, you know, this thing comes off, and then I tried it with glasses, kind of holding it carefully away from my face. And I could see the quality of the display was absolutely phenomenal; really, really, really good. So I can imagine working with it like you’re talking about, Ken. But on the Vision you get it in a different way. Peter. Peter, Peter.

Peter Wasilko: Okay, I really liked your comments before about wanting to have a link to material that you’d be citing in the future, that didn’t exist yet. That also brings back to mind a recurring problem I’ve always had in my scholarship: instances where I’m aware of a source, but it was so far in the past that I no longer have a full, proper citation for it. So if we’re in a strict mode, I have to pretend that that work doesn’t exist, and I have to avoid making any references to it or building on it directly, because I can’t properly cite where it came from, which is incredibly annoying. It would be much nicer if we could start to build in a notion of having futures in citations, which could be resolved at whatever point in time, hopefully in the near future, when we’re able to pin down that citation; but also, at the same time, have something documented in the record as a stand-in that other people could hook onto, so it wouldn’t necessarily be me who fills in the citation. There are a few cases where, you know, I know there was a book I read about ten years ago that talked about such and such; I’d fill out as much detail as I can possibly remember and put that in there, as opposed to pretending that that work didn’t exist. Because I see a lot of things sort of drop out of being part of the formal literature, because there’s information that never got documented in an adequate form.

Peter Wasilko: Also, kind of like Dene’s comments about your experience when you were doing the live interview and not being able to get the author to pin down exactly what it was, because he didn’t want possible drug use to get entered into the formal record somewhere. So that bit of information winds up getting walled off. So we have this world of formal, strictly defined literature with rigorous citations, where we pretend that we know everything with absolute certitude. And then there’s the real world of practicing scholarship, like Cathy Marshall’s talk at the Tinderbox meetup about how information, the ephemera, can get lost, and you can have all the secondary metadata sort of hovering around, none of which is rigorous enough to make it into the formal publication literature, but which is absolutely critical for the ongoing search, and for the process of building and narrowing things down over time and erecting our knowledge scaffolding. So I think that could be an area where our group could contribute: just sort of put it out there that, okay, we understand there’s a ‘use strict’ mode where we follow the traditional conventions, but that there are also areas where that level of formality in citation isn’t necessarily the most desirable thing.

Frode Hegland: That’s really, really nice to hear, Peter. You’re obviously partly bringing it back to what I was talking about at the beginning. It’s so annoying when Zoom says ‘it looks like you’re talking’. I know I’m talking; you don’t need to do anything for me. I’ll lower the hand just to make you happy. Okay. So, writing. One of the great benefits of writing, of course, is that you’re literally writing it down, and that almost means carving it in stone, right? So being able to write with different levels of certainty, different levels of address, and different qualities of addressability is important. One of my friends, and I think you know about this person, is in the military, does intelligence, did a pretty huge thing that I’m not going to put on the record here. I asked him a little bit about this, and he said they have to code whether intelligence is reliable or not, because it can end up having severe repercussions, and they have their own way of doing it. I’m not saying we should have the same, but you know: this is only so old, this is from a human, this is from a system. All of that has to be encoded in some way.

Frode Hegland: They put it up the chain. I’m not sure if you remember, not that anyone here is that old, but in World War Two, the beautiful maps that they had in the control centers in England to, you know, get rid of those other guys: the planes were color-coded, so if a plane on the map was indicated with a certain color, you knew the information was only so fresh, because every period of time, 15 minutes, whatever it was, they would move the markers and change the color. So you instantly had that idea. This really feeds into the whole notion of what writing is, what it should be. When you’re writing a fully academic document, of course it has to be polished and have everything correct. Of course it does; that’s a very specific thing. But it really does address, Peter, the notion of what you do with this massive space, both in terms of layout, interaction, space, time and so on. And, Mark, I’m so sorry I had to leave earlier; my mother couldn’t find her keys, and it’s a big thing. Anyway, she’s coming here, so I can concentrate on you guys. Mark, please.
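Purely as an illustration of the kind of encoding Frode is gesturing at here (a source and reliability grade plus an age-based colour, like the plotting-room maps), one possible shape for such a record is sketched below. The grades, colours and time windows are invented for the example, not a proposal.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    @dataclass
    class Assertion:
        text: str
        source: str        # e.g. "human" or "system"
        reliability: str   # e.g. "A" (reliable) down to "F" (cannot be judged)
        recorded: datetime

        def freshness(self) -> str:
            """Colour by age, like plotted aircraft being periodically recoloured."""
            age = datetime.now(timezone.utc) - self.recorded
            if age < timedelta(minutes=15):
                return "green"   # just reported
            if age < timedelta(hours=1):
                return "amber"   # getting stale
            return "red"         # treat with caution

    note = Assertion("Contact reported in sector 4", "human", "B",
                     datetime.now(timezone.utc) - timedelta(minutes=20))
    print(note.freshness())  # "amber"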

Mark Anderson: I’m sorry; there we go. Yeah, I was just thinking, listening to Peter’s point, that it sounded to me a bit like, once you’re in a digitally native document, creating the space for, effectively, a self-healing citation. What you’re doing is putting the necessary citation into the document, which can then be healed, in a sense completed, at a later date. In other words, this means that the document, when read at a later date, actually points to the right thing. So it starts by pointing essentially at possibly nothing; I mean, you know, you wish it pointed at something, but you don’t know what it is, or you’re not able to point to it in a meaningful sense. But the thing pointed at is subsequently found, or fixed, or improved by somebody else. That’s really interesting. This makes me think of stuff I’m doing with Gabo and the team at Myntra at the moment, tinkering around with what you do with documents in a federated space. And there’s a lot of stuff about going back to link bases, funny old thing. You know, a problem at the moment is that we hardwire all the references into the document when we create our citations. Well, actually, if that link base were outside the document, then you wouldn’t have the problem that you’re referring to now. So maybe something we knew before the web arrived actually still has value. Fabien.
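A sketch of how Peter’s unresolved citations and Mark’s externalised, self-healing link base might fit together. All names and fields here are illustrative assumptions, not an existing system: the document embeds only a stable key, while the partial recollection lives outside the document and can be completed later, by anyone.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CitationStub:
        key: str                         # stable identifier embedded in the document
        remembered: str                  # free-text partial recollection (Peter's case)
        resolved: Optional[dict] = None  # filled in when the source is pinned down

    class LinkBase:
        """Mark's point: citations live outside the document, so they can heal."""
        def __init__(self) -> None:
            self._stubs = {}

        def cite(self, key: str, remembered: str) -> str:
            # The document embeds only the key; everything else stays out here.
            self._stubs.setdefault(key, CitationStub(key, remembered))
            return key

        def resolve(self, key: str, full_reference: dict) -> None:
            # Anyone, not just the original author, can complete the citation.
            self._stubs[key].resolved = full_reference

        def lookup(self, key: str) -> CitationStub:
            return self._stubs[key]

    base = LinkBase()
    base.cite("book-c2014", "A book read about ten years ago on such and such")
    # Later, possibly by another reader:
    base.resolve("book-c2014", {"title": "(found later)", "year": 2013})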

Fabien Bentou: Yeah, it’s funny, because it’s something I was chatting with Leon about earlier today. So, my background is software engineering, but I’m doing more and more electronics, because I want to, like, augment my headset. I think we’re having more and more gadgets around us while understanding less and less about them, and in the same desire of managing my agency, not to be, like, a slave of such gadgets, I’m trying to learn how they work. All this to say that I’m learning a bit, practically speaking, how to make circuits, how to put electronics on them. And it sounds so far away from, let’s say, publishing research work within software. But in the end, what I think I notice now is that you have packages there too: you have a set of different components from different hardware providers, let’s say, and some of them rot; let’s say they are not supported anymore. You have a new version of, let’s say, a resistor or an LED or whatever; it is nearly the same, but not really. And then you must somehow preserve either provenance or some way to get the difference from version 1 to 1.1.3.2 or whatnot. And it feels, in a way, so far away from writing text, but in the end it’s still manipulating symbols: symbols with dependencies, symbols with links between each other. So I think a lot of those things that apply to text apply even to, like, state-of-the-art hardware design, because in the end it’s still tools manipulating such symbols, with the same kind of challenges.

Frode Hegland: Yeah, we do need to look at the wider issue of text. One thing that I discussed, and I’m looking at Allan’s comment there, one thing I discussed with Adam yesterday: we probably need to separate discussions of the more mechanical, the interface, from the ‘what are we connecting, and how, and why’ discussions. At least some of the how and why needs to be in the user story mapping. Allan, are you cool with doing the diagramming on Wednesday in that session? Because I think that is really, really core, but I think it could be very specific to what we’re doing for the project. Is that cool?

Allan Laidlaw: Yeah, I can, unless, you know, today runs dry. Obviously there’s a lot to catch up on, but I do think it’s a topic that could benefit the larger group. And it’s also a little bit of a show-and-tell on my part, even though it’s a blank canvas.

Frode Hegland: Okay. Yeah. Let’s go a little further today and then see how it goes. Just for context though, on the connective topic, and probably relevant to your next point, I see your hand is up: when I was sitting on the flight, with my laptop, writing away in Author and having a good time talking about the last week, all of this stuff, I started writing about the desired outcomes, and then about single document interactions, this stuff we’ve gone through many times, and it just seemed like something was really missing. And I thought back to Dene and Andrew and I sitting, you know, sharing the headset, passing it back and forth. We did some basic tech tests. First we were standing in a field with a huge wall of text. Then we did all these different things, and we ended up finding that, for the three of us at least, we prefer gray background, black text, huge wall, no floor. The no floor was really fascinating. And these are the mechanical things we need to test, because when we looked at other VR software that has an infinite floor, it feels like a huge cold expanse.

Frode Hegland: It’s awful. But when you have no background, you just have your text. And, no, because we didn’t want to deal at this point with issues of what’s behind it: this is pure thinking space for text. It doesn’t actually feel empty; it feels okay. So we need to do a ton more tests like this. And for those of you who want to try out exactly what we’re up to: on the Future Text Lab website you can click through, and we are continuously updating our experiences there for comments; we decided to keep them up live. Oh, Brandel is joining us. There’s been a power outage in California, but he’s coming through; well, not very well at the moment, but he’s coming through. Yeah. So I’m just going to ask a question on this topic. What is the appetite, for you guys, for this group to have a session or sessions on discussing how things should connect better? Brandel, you look so much better when you’re in the headset. Is there appetite, is there interest, in that aspect of it?

Brandel Zachernuk: Yes.

Frode Hegland: Dene is giving a big yes.

Dene Grigar: Mark and Allan said yes too.

Frode Hegland: Okay, great. Hi, Brandel. We’re talking about lots of things, and Fabien is next. Mr. Fabien, where are you?

Fabien Bentou: Yeah. Your question was how to connect better?

Dene Grigar: What’s next?

Frode Hegland: The question was about our topic of discussion. Okay: this thing we call the hyperlink is not a link; it is a one-way pointer, because a link implies a connection, right? But you can link to something that doesn’t exist. That’s why I find the term a little bit annoying. So I’m talking along the lines of: in this space that we are now working in, how can we get things to better connect? If you want something to connect intelligently in this sense, you need addressability, and you need a little bit of context. So the basic use case, to use Andrew and me as an example: I write a document saying we need to do this, that and the other. Some of the titles I write in hard brackets, and again, this is a small invention just to be illustrative: I write the words ‘single document interaction’. Someone in our community has then written on that topic, and they want others to be aware of it, so they export their document with this as, Mark is completely correct, a tag. As long as they’re in the same directory, the system will know that. If, let’s say, Brandel reads the original document, this thing is something he can click on, and it’ll open up, because we’re thinking XR space, on the side of his document any other document that has that tag; by date, the newest one will be on top, right? This is a phenomenally simple idea, and again, just to make sure nobody thinks I’m saying we should implement it, but we should start thinking like this.

Brandel Zachernuk: Can you.

Allan Laidlaw: Sketch it out for us at some point? Or maybe you already have.

Frode Hegland: Well, I have, in the document I shared with you guys, but just imagine. Okay, I’ll make a really clear example. I have written a trip report, and I decided in there to write this one term, ‘single document test’. I just write that in hard brackets, because underline means link. So this just means I expect someone else to write about this. And then Dene writes about the same experience we had last week, and when she saves her document, she tags it ‘single document interaction’. It goes into the Visual-Meta at the back. So the Visual-Meta at the back says that this document, amongst other things, refers to the topic slash tag of ‘single document interactions’.

Allan Laidlaw: I don’t think anybody has a problem with that. I mean, it sounds great; as connective tissue, it sounds great. I have a larger, you know, background question of where this fits into the whole bigger picture of things. But as a technique, it seems fine.

Frode Hegland: Well, as a technique I’m sure it’s got lots of issues if we were to test something like this, and I have a feeling that Mark Anderson has a lot of experience in this field. Sorry, pinging things; hang on, I’ve got to go into focus mode now, because you guys don’t deserve that sound. I apologize, right. But the whole idea is, and I’m so glad that Peter pulled it in another direction: writing a thing and reading a thing is fine, and we still need to improve it. But with this almost infinite canvas, a really important issue becomes how we look at what’s between things. And the citation and the link ain’t enough. So the big question is: soon we’re going to have a lot of documents, even in our own community. Some will be on Slack, some will be on Basecamp, some text will be in email, there will be PDFs, all this stuff, right? Can’t we, as the Future of Text community, come up with a better way of saying: I have a thought, and the way I choose this thought to radiate into the community is this, I’m giving it these different aspects? So this is probably a bigger, and to be clear and honest, Allan, an organizational question, because you do it really well with these screens, and you’re relatively neutral on this right now. But I would really love to have a discussion on getting past hyperlinks. I mean, for crying out loud, it’s the most basic technology. It’s useful, but it has its issues, as we all know. Yeah, sorry, waffling on there. Allan, you’ve got a double hand. That means you have to speak.

Allan Laidlaw: Excellent. Great. It’s a totally different topic, but it is worth putting down as something to revisit, to look at what hyperlinks can be. I just think that’s very interesting, and it’s actually come up a couple of times. The different topic is this: I would love to set up one-on-one meetings with a few of you. Some of you I haven’t talked with very much, and I’d like to get a sense, in an open-ended conversation, of your feelings about XR and VR and how you use them. Ken, I’d love to talk to you about how you use it, how you’ve tried to use it, what didn’t feel right. I don’t know how much you’ve tried to use it for reading or not, but I’d love to talk with you about that. Michael, I’d love to talk with you about what you’re working on, whether it has anything to do with XR or VR at the moment. I know you were talking about embodied, you’re doing embodied research, so I’d love to talk with you about that and your feelings about where XR is and can fit into it. But also just some questions about how you interact with knowledge, and acquire knowledge, and sort it, and stuff like that, on a day-to-day basis. So I’d love to talk with any of you that are willing, and probably just record it and do simple one-on-one sessions. Or I don’t have to record it; that’s totally fine too. Okay.

Frode Hegland: It would be good if it’s in the community, so we can refer to it, of course. Any accidental secrets need to be noted and edited out, and if it’s recorded in Zoom, that’s really annoying and difficult to do. Yeah.

Allan Laidlaw: Recorded, but not necessarily made public.

Brandel Zachernuk: Yeah, right. Yeah.

Frode Hegland: Yeah, there are a lot of head nods and comments in the notes on that. And I think Brandel has something on this too.

Brandel Zachernuk: Well, you were talking about, so I’m responding to, the notion of identification. Something that I enjoyed and was amused by when I first built a wiki for my own purposes, a genealogy wiki for my family, was the bracket notation for the implicit creation of destinations. I’m not sure; I haven’t used Notion enough to know whether they have any other innovations beyond that, but I think that’s actually a pretty powerful place where, you know, you were talking about creating a reference also creating a destination, and having some mode by which they sort of implicitly collect in the same location. It’s not an intrinsic attribute of hyperlinks per se, but it is of Wikipedia, where if you use the bracket links, category links, things like that, then they simply do collect in the same place, and it becomes an issue of nomenclature to make sure that they end up in the same place. Having the ability to understand the tag distance between those things, a computable sort of tag distance based on the likelihood that what you said the tag was is close to what Dene said, is not a far leap from pretty pragmatically available capabilities today. So I think those are useful. I do kind of feel like we lost Google as an ally for organizing our information of late, like the last three to five years; their sort of assumed ability, or willingness, to actually collect things and help you organize and understand where things fit together seems to have been impaired by some other motivations and incentives.

Brandel Zachernuk: But yeah, I think reviving that notion, and Notion is, you know, the name of a company, but the notion, the idea of being able to construct connective spaces through the presence of tags, and of soft tags. You know, we’ve also talked about the way the ability to right-click and search for things creates soft hyperlinks, in a way: you can search through things, so that if you have some kind of notable piece of text, then simply it existing is halfway toward being a hyperlink. Yeah. Having the ability to alter the presentation of a page in a way that is less than an author, but more than somebody simply viewing it, is maybe an interesting start for being able to organize that information. And I haven’t spent enough time in, what’s it called, the annotation tools; I know there’s one that was significant, and they inherited all of the YouTube annotations stuff; Dan Whaley got it. Anyway, yeah, I’ve gone on long enough, but I think the square-bracket notation, and the ability to construct common spaces simply by agreeing on what the tag is, is an interesting start, and somewhere that it would be worthwhile to pull on for being able to construct these common locations.
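A sketch of the computable ‘tag distance’ Brandel mentions: treating two tags as the same destination when their normalised text is close enough. The normalisation and threshold here are illustrative guesses, using only the Python standard library.

    from difflib import SequenceMatcher

    def tag_similarity(a: str, b: str) -> float:
        # Normalise case and whitespace before comparing.
        norm = lambda s: " ".join(s.casefold().split())
        return SequenceMatcher(None, norm(a), norm(b)).ratio()

    def same_destination(a: str, b: str, threshold: float = 0.85) -> bool:
        """Soft-match two slightly different tags as one implicit location."""
        return tag_similarity(a, b) >= threshold

    print(same_destination("Single Document Interactions",
                           "single-document interaction"))  # True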

Frode Hegland: Yeah. Thanks, Brandel.

Dene Grigar: Hi. I want to follow up on some things that were said earlier. When Rob Swigart was visiting my lab, two or three visits ago, and he’s been here numerous times in the last year and a half, one of the discussions we had was the fact that there is still a lot of work to do on the notion of hyperlinks. Do you remember that, Rob? We talked about how, despite the fact that we’ve categorized them and there have been articles written about what they do, and I think of Jeff Parker’s wonderful controlled vocabulary of all the different kinds of ways that hyperlinks work, all of those things, there’s still something missing. And I think this is what you’re talking about, Frode. One aspect of this has always kind of bothered me, working in a MOO: when you’re sitting in your chair and you’ve got this virtual environment on your screen and you enter into it, you teleport in to the landing space in your MOO, something happens between the time you’re sitting in your chair and when you enter into this text-based environment, right? And that’s called intermediality. Like, how do we close the gap? Artists have been trying to address this problem of preparing for that jump, that leap, into this new virtual space. We see that in films like The Matrix; you know, they’ve tried to envision what that might be like. But the connection between the two: the link is one.

Dene Grigar: The space is another. Bringing those together, and what that space in between feels like, looks like, is experienced as, has not been fully realized. And I think there are some real opportunities there to think about that. Another thing I want to mention is about the empty space that Frode was discussing. We started off with this beautiful scene, this gorgeous, you know, pasture land with trees in the background, very calming. The text sits in front of it; you’re working. But as somebody who actually just wants to work on text, the background image was distracting; it was taking my eyes off what I wanted to do. So we kept subtracting from the space until I finally said: I don’t want any floors. I want no floor, I want no boundaries, I want complete space. And then when Frode came to my house, he was kind of walking around and he saw my desk. I’m translating the Iliad, right, and I have this desk that’s quite small, and everything’s piled up on top of it, and things are holding pages open. And he said, oh, you’ve got one desk for one task. And my response was: yes, and it’s not big enough, right? I don’t want something to fall off my desk because I’ve got to put it over here. I want everything in front of me. I want it 360. I want to look here and put something here.

Dene Grigar: So what Andrew had built let me take something from the text and put it here, and put it there, and arrange my text however I pleased, and rearrange it once I was through looking at it from this perspective. That was a really useful and important way for me to be able to work as a scholar. Gray text, you know, easy to read; and I thank you, Peter, for that great suggestion of the Google font. That was helpful. But that kind of stuff: I don’t want any frills, I don’t need any frills. I just need to be able to work in a space, and that’s what I can’t do right now. I’m sitting here with all these desks and computers around me, but there’s still not enough room. I mean, I’m still out of space to put my coffee cup somewhere that’s not going to fall on the floor. In virtual space, this is what we can get, right? This is what we net in this space. So I encourage us to think about those two things: the notion of space not being defined except by how I want to define it, and secondly, the fact that we have to fill that space between, when I sit in my chair, put on my headset and enter that space: what happens in that moment to prepare me for that VR experience? And I’ll stop there.

Frode Hegland: Yeah, very good points. Let’s put a pin in that; that’s a huge discussion as part of our system. Yeah. Ken, please go ahead.

Allan Laidlaw: Oh, muted.

Frode Hegland: I was going to say I really, I really can, that’s all. I really.

Allan Laidlaw: Like.

Ken Perlin: What? I really like what Brandel said, and I wanted to build on it. I think it’s actually a really important point, since Google started showing up. I believe the first use in popular culture of the verb phrase ‘Google it’ was in Buffy the Vampire Slayer, and since then we’ve never looked back. This idea that you don’t need an explicit link, because we all know that there is this thing out there: I’m talking to you, and in the course of our conversation I mention something, and it’s a phrase you can remember, or it’s anything you can highlight, and you just Google it, and all of those references are right there. So there’s a sort of implicit bracket, to use your phrase, around everything. And I hope that we’re going to be able to keep that in mind and build on it as we start having the future of text everywhere: that everything is implicitly bracketed because of cultural context, things like Google search, etc., and that we don’t forget that we have that superpower already to build on.
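A toy illustration of that ‘implicit bracket’: any selectable phrase is already halfway to a link, because it can be turned into a query. Only the standard public search-URL form is assumed here.

    from urllib.parse import quote_plus

    def implicit_link(phrase: str) -> str:
        """Treat any selectable text as an implicit, unauthored hyperlink."""
        return "https://www.google.com/search?q=" + quote_plus(phrase)

    print(implicit_link("Princess Brambilla E.T.A. Hoffmann"))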

Frode Hegland: Yeah, that’s really, really good. Well, yeah: days of discussion, and we’ll keep discussing. And Mark, what have you got?

Mark Anderson: Well, first of all, prompted by what Ken said: I find that really interesting, and it speaks to the work I’m doing with reuse of research data at the moment, and the way links sort of get in the way. The important thing is that something should be reachable. There is this practice space to talk about, but the type-a-word-in-a-box search that we’ve had for twenty-odd years has conditioned us into thinking that, actually, no, there is a piece of text that I click that takes me there. There’s a sort of wrong understanding of the notion of the linkage that isn’t even real. Because if you’re looking for, sort of, cheap local ramen: lots of hits. If you’re looking for something that’s really interesting, that you’re trying to research: not much, it turns out, because the system is optimized for the things it knows it’s going to be asked about. But I was just trying to think; I think I’ve now had a complete brain fart, because I was going to say something about what Dene was referring to, and it’s completely been pushed out of mind by the equally interesting thought that came from what Ken was saying. So I’ll shut up there and hand back to Frode.

Frode Hegland: Yeah, I just wanted to say that the first time I was lucky enough to see Vint Cerf was actually over breakfast; it was all the time we had, and we sat there talking about all these connective issues, and he pointed out something that I thought was really worth following up on. You know, it’s not just a link from A to B that’s important in the world. We also have relationships and proximity. You may have two knowledge things that are here, but another one is a bit further away, you know, like in a graph. How do you communicate that? How do you communicate that something is next to you? So the relationships and proximities in the space of knowledge: it isn’t just an ‘it’s over there’ thing; that’s overdone. And I do think you’re right, Ken, that ‘just Google it’ has destroyed a lot of our ideas of what a knowledge shape is. And to tie it in beautifully with what Dene was saying: what is beautiful about the potential of a headset is the shape of knowledge, right? It’s not confined to a rectangle anymore. So how can we get into the space? This is something we discussed a lot last week: what is the, quote, home screen? What does it look like initially when we’re in this environment? And we have some ideas that you’ve seen. Yeah, Allan, I see your double hand; one second, you’re next, I see your digits. So for us, with what we’re working on, the idea of the sinews and skeletons and the body of knowledge, because these are all different things, is so core to how we’re going to express knowledge in this space. So, yes, Allan.

Allan Laidlaw: Yeah, sorry, I’ve got to raise my hand, because Zoom will tell me to lower my hand if I don’t raise my other hand. A couple of things on that, going back to what Ken said. I don’t know that he was even saying that Google ruined our knowledge landscape per se, but maybe I’m wrong. It reminds me, though.

Ken Perlin: I wasn’t saying it was ruining anything. I was actually praising it.

Allan Laidlaw: Yeah, yeah. So an interesting segue there: I don’t know if you all are fans or followers of the Arc browser, but they had a product demo, or rather, they announced their whole big new vision. And it’s a browser that browses for you, right? And I’m going to go on record to say: this is really good marketing. But when I looked at the demo, it’s like, okay, you’re asking for something, and it’s going and finding links and then setting those up inside a folder for you, so now you can jump to those pages and get your answers. And that’s helpful, but that’s not a groundbreaking idea like Google was. And I was thinking about that: why does that feel so different? So I want to put this forward as a topic of discussion as well. I think a lot of times when we design solutions, software solutions, there’s this third rail called content. Maybe it’s because we all grew up with the whole freedom-of-speech thing or something, but it’s like you can’t touch other people’s content. So if it’s a long article about, like, how to cook a steak, and there’s a whole personal story about somebody’s grandmother, then that’s their content; that’s their right to say that stuff. So all of that’s got to go in, usually with the ads. And so we have this almost,

Allan Laidlaw: we keep content away from the technology. We look at solutions that are just ‘I’m going to pull that page for you’ rather than ‘I’m going to find what it is you’re actually looking for’. And I believe this was just a condition of all Silicon Valley tech until ChatGPT came out, which seemed to go 180 degrees in the opposite direction: a generative AI having no care at all for the sovereignty of text, or the sovereignty of content rather, someone’s content. And it’s really refreshing. I’m not going to say whether it’s a net good or a net bad; I don’t want to get into ethics. But I bring it up because,

Leon van Kammen: It’s.

Allan Laidlaw: A lot of us grew up at a time when the rules seemed to be pretty static. You link to an article that someone else made, and that's how you read about that topic. And everything was sort of in its nice place. But it's clear that, with other generations after us, and just the nature of the landscape, things are changing.

Allan Laidlaw: What the boundaries are. I'll stop there, because I want to open it up. Yeah.

Frode Hegland: I didn't say Google was bad in this context. I was saying it is a one-step thing when there are so many shapes to consider. So sorry for the misunderstanding. Brandel.

Brandel Zachernuk: So I would go a little further and say, not that Google is bad, but that the presence of Google as this implicit sort of connector has meant that people have become less aware of the idea of structure, and the idea of implicit structure and relationships between knowledge. And that was sort of a case in point in the initial power struggle, I guess, between tagging and folders. At the very beginning of Gmail, they were like, we're not going to do folders, you just do tags. And then people were confused and frustrated, and so they said, coming soon: folders. They're just like tags, actually. And so there has been that kind of tension in the structuring of information. One of the things that I would love to do with space and spatial computing is to provide slightly more indication of the structure of information and the relationships between things. So, you know, we've talked about these bracketing or sort of implicit near links. And there are structures even behind links that you don't get to see. I think that Ted Nelson's notion of visible links, always present and things like that, is overkill, insofar as it obliges too much real estate for the presentation of stuff that may not be relevant enough to justify that kind of space. But some sense of what's where and how much is in that place, rather than merely "there is a hyperlink" or "there is a recognizable piece of text".

Brandel Zachernuk: I think that would be really interesting. And to get to a point even further down the track, where the implicit structure of something behind a claim or behind a piece of text is legibly expected to follow a certain form, would be a really, really interesting thing for the purposes of information literacy. And I don't know the answer to that. I don't know what that looks like, because you still don't have a massive amount of real estate. But just the idea that you have some signal, some indication of what is behind a link before you click on it, that it can kind of conjure these implicit locations in a way that ideally isn't distracting: I think that would be a really interesting goal to search for, so that people can tell when things are obviously empty and when other things may have a little bit more behind them.

Frode Hegland: Yeah. Fabian, please. Thank you, thank you, thank you.

Fabien Bentou: I'll post a link in the chat, but there is an old paper, ten years ago more or less, on personal information management, and why navigation is the preferred retrieval method. The idea, roughly, is that search is hit and miss: if you hit, it's instantaneous, so it's great. Yeah, that's the one. That's the good retrieval. But if you miss, you don't have any extra information; you can't backtrack, basically, because you don't have new information. Whereas in navigation, you're on the path to the information. Let's say you're in the wrong directory, or you're in the wrong physical room: you can recall the path itself. And it goes back to what Brandel mentioned about the structure of information. If you don't bother to keep in mind the structure of information, because you know you probably will find it, then if it's a hit, it's a good bet; if it's a miss, that's it, you don't have anything. So I really like to keep that paper in mind, especially when I think about VR or XR actually, because as you mentioned at the very beginning of this conversation, one of the things most of us are hoping to do is piggyback on how brilliant our mind is with navigation.

Fabien Bentou: Like even people who say, oh, I get lost easily. Yeah, you do, but most of the time you don't. So you're still pretty good at it. You might still get lost in a super complex new museum, but overall, our mind is just brilliant at managing spatial information. So yeah, I think it's a good paper. And I think overall, and that's just my intuition, the effort we put into building and mapping the information structure of a complex topic onto a structure, and eventually a spatial one, again piggybacking on the kind of memory palace view, is really efficient. But it's still an effort that has to be made. That's also why, just like Brandel mentioned, I mean, I've been managing my own wiki for more than a decade now. I don't remember how long, but a while. And it's a pain. But in my opinion, it's a worthwhile pain. You need some way to have your information space tailored to the way you view things. And if you completely delegate that to an actor that shapes it their way, I don't know, I'd be scared. I really don't think it's a good bet.

Frode Hegland: Okay. So as you can see from the comments here, this is partly what we want to do: provide a navigable space of knowledge, also called spatial hypertext. But the thing is, when you have a plane to work on, and sorry, this is obviously preaching to the choir, so forgive my language here, but if you have a rectangle to work on and you put a bit of knowledge here and another bit of knowledge here, the system knows where that is. When you have a huge space of knowledge, it can do exactly the same thing. But how do you deal with other people's knowledge in such an environment? Right? That's why it's so important that we provide the context for the system, which is what we're talking about today, so we can refer to different things. And it does seem that for a lot of the stuff we're actually working on, a community matters as a boundary, but for many things you're going to have the entire internet of knowledge to play with, which is a good thing. So that's why, if you have tagging or connections or whatever within a community, it makes sense. It's a useful bubble that, of course, you can break through. Leon, please continue.

Leon van Kammen: That's a good point. One of the things that was not really clear to me (can you hear me, by the way? Okay) is the distinction between searching or seeing links between things you already sort of know or have already read in previous sessions, versus links to all kinds of material you haven't read yet. And why I think that is an important distinction is that when I go to a library, or to Frode's house, who has many books, and there's one room with many, many books, it is a very normal thing that the person will not show you all the books and start telling you about all the books, because it's too much. And in the same way, you go to a library maybe with a goal. So what I'm trying to say is that there is, I think, a big difference between trying to make available some kind of huge search, and perhaps just some semantic cache of things the person already read, so they are maybe able to find again something they read about this or that. If we're just going back to the simple reading-in-VR application or demo, which needs to be convincing at some point, I would be curious what the average person (that sounds very, very oversimplified) would like out of a reading experience. Does the person want the feeling of having some kind of access to a superhighway of connected documents, or would that person be more interested in seeing their own little library there, with some kind of super access back and forth between all the documents they already know? So those are just some questions which came into my mind.

Frode Hegland: Some good questions.

Brandel Zachernuk: I just read.

Frode Hegland: Alan, I...

Brandel Zachernuk: Just.

Frode Hegland: So sorry I was jumping.

Brandel Zachernuk: The queue, just to say I just remembered and realized that everybody's notes are accessible via icloud.com. Not that there's API access for people outside of Apple, but one, I might be able to get at it (not for yours, for mine), and two, you can hack anything that's being delivered via a web page so that it can be delivered to another web page, if you have enough control over everything. So that could be a really interesting place to play, because I have a lot of stuff in Notes. Anyway, sorry I jumped the queue.

Frode Hegland: You are forgiven. Alan, I think you were next, right?

Allan Laidlaw: Yeah, perhaps mine was an errant hand up, but listen, great stuff. And I'd love to actually throw that on a map, so it doesn't just, you know, go the way of ideas in Zoom. Because there's a lot more that goes with that, right? That could wind up being fodder for experiments; I know there was a Wikipedia-VR kind of project, for instance. And the great part about throwing this on a map is that we can kind of see the stack of hypotheses these are based on. If the premise is that it's more valuable to me to see the subject, the foreground, in contrast with world knowledge, right, so world knowledge is in the background, and now I can interact with what I'm looking at and see how it, whether it's a PDF or whatever it is, is connecting to aspects of Wikipedia as the background, some sort of parallax. Or if it's a different premise, a different set of hypotheses, that it's more valuable to me to see my own stuff in different ways, right, then we can do experiments with book stacks: I have stacks of books that go together even though they're not categorically similar; in my mind they are stitched together like, you know, crochet. So I think it'd be great to try and get a backlog of some of those. On a different note, the things that I like and look for, even though they don't seem to have anything to do with knowledge representation, are those moments of genuine surprise.

Allan Laidlaw: I saw one on Twitter today where someone said they had a shot of a skeleton in XR, in their room, and they were like, I had no idea there were two pelvic bones. They thought there was just a pelvis, but apparently there are two separate bones, or something along those lines. And it's the kind of information that just wouldn't occur to you if you're looking at an image of a skeleton. Right? I always thought it was just one bone; I don't know anything about that. And for me, that goes into the attraction log. That is something that seems unique to XR, where it's like: I just learned something because I got to see it in a slightly different way. The information was in front of me the whole time, but I didn't understand it until I got to walk around it a little bit. I would love to collect more instances like that, even if they don't directly deal with what I think is the much more difficult topic of how to represent knowledge. Representing a skeleton is a much easier thing to do, it would seem, in VR or XR than, you know, "what's my knowledge base", right? But those little morsels are what I would consider a win, right? To get someone to go, oh, I never saw it that way.

Frode Hegland: Absolutely. I’m just trying to find an image here from our session last week of our testing. Yeah. This is wonderful. Mark, please.

Speaker12: But just dropping another recollection.

Mark Anderson: Sort of a comment on that: I totally agree with Brandel's point about too many links. It just becomes visual noise after a while. And indeed, one of the things in polishing up the hyperbolic view in Tinderbox, which effectively is a network of all the linked, but only the linked, notes in a Tinderbox document, so in, as it were, a database, is that you scale it back to only about 7 or 8 hops, because beyond that it's just too much to see and understand. But of course, in a more malleable space like an XR space, what that says to me is that's one of the controls you want to have. You know, one of your controls, maybe on your wrist or something, is just to be able to dial that in and out. And that doesn't stop you somehow creating a link that, in a sense, routes beyond your visible horizon, if you need something to show. What's interesting in that to me is it starts to make the engaged author really think about the interrelation of links between things, because you have to start conceptualizing.

Mark Anderson: You sort of have to start thinking about what you can see and what you can't, and why you want to see it and why you don't want to see it. And I think that's potentially a really, really rich thing. And I also just note that Fabien covered much more eloquently what I meant about search, because with a lot of search at the moment, you either get an instant win or you get nothing. You get a lot of noise, but you really get nothing. And with a lot of search, certainly in an academic context, what I really wanted was something that sort of drifted me towards where I was going. I so often feel like that child outside at the grabber machine: I can see the toy inside, I just can't get hold of it. Mr. Google or Bing or whatever will not give me the thing that I know is there; it will not tell me where it is. Anyway.

Frode Hegland: Dene asked me here in the chat if I could post some pictures, which is what I'm doing. It's just that the pictures and video go onto my phone, so I'm transferring them now. But I just wanted to tell you that, in addition to that, if you click on the link that I put up, you can see it live, either on your laptop or in VR if you want to try it. So you're going to have access to both. I'll get the video up in a second.

Dene Grigar: It's not loading on my computer. It says webXR is not available for me, so maybe someone else can see it.

Frode Hegland: No, no. If it's on your computer, that is not webXR. Of course, you can then see it just as a flat environment.

Dene Grigar: No, that's what I'm saying. I mean, I can't see anything, but thanks.

Allan Laidlaw: Yeah, I had that problem. I'm familiar with that, using Chrome.

Frode Hegland: It should know. And actually.

Allan Laidlaw: If we're talking about...

Andrew Thompson: The live test site doesn't load the text in until you have the headset on, so it won't show anything on the main page. And that's just because of how I coded it. I thought it would be cool to see them all, like, pop in when it loads. Yeah, there was no point in me doing that.
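
What Andrew describes, deferring content creation until the immersive session actually starts, is a common webXR pattern, and explains why a flat browser tab shows an empty page. A minimal sketch, assuming three.js (not the project's actual code); loadTextPanels is a made-up helper:

```typescript
// Minimal sketch of deferring content creation until an immersive
// session starts, assuming three.js. Not the project's actual code.
import * as THREE from "three";

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;

const scene = new THREE.Scene();

// loadTextPanels() is a hypothetical helper that builds the text meshes.
function loadTextPanels(target: THREE.Scene): void {
  const geometry = new THREE.PlaneGeometry(0.8, 1.2);
  const material = new THREE.MeshBasicMaterial({ color: 0xffffff });
  target.add(new THREE.Mesh(geometry, material));
}

// Nothing is added to the scene until the headset session begins,
// which is why a flat browser page shows an empty space.
renderer.xr.addEventListener("sessionstart", () => {
  loadTextPanels(scene);
});
```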

Dene Grigar: Okay, as I was saying, it’s not accessible on my computer. I’m using Safari.

Frode Hegland: No, that's not Safari. No, no.

Dene Grigar: Sorry.

Frode Hegland: Yeah, now we understand why. So, while Andrew explains, I'm uploading a little video of it, and I'm also going to put up a little screenshot, but it's literally text in space. But please, guys, use our website to see what in the world is going on here. Please use our website for testing. It'll be continuously updated, so when there's anything new, you should feel free to put it in your headset. Fabien.

Fabien Bentou: Yeah. To touch back a bit, and it's a boring implementation detail, but one of the things you mentioned a bit earlier, at least my understanding of it, was a shared workspace, where the modifications you do in the headset reflect what you have on your desktop, and if we collaborate on it and you modify it, I see the modification. I shared on Slack, a couple of days ago, maybe a week ago, a tool called rclone. Some of you might be familiar with it. It's basically for copying and pasting files across a network; I'm giving the short version of it. And I discovered it because I have a specific cloud provider that was not supported. It now works. But what I completely discovered, what was very new to me: I thought you would just copy a file from, let's say, your cloud, whatever file manager, well, let's say Google Docs for example, or Google Drive, to something completely unrelated. But it can also keep them synchronized. So I think that's the kind of feature we would want, either on the file system itself or with sidecar files, like with a database or whatever.

Fabien Bentou: But what was really surprising for me is that there is a server version of it. So you start rclone somewhere on a server, and then we copy or sync the files. Honestly, if we imagine a couple of dozen research PDFs, small ones, it's not like terabytes of data, so it should sync in seconds at most, with all the networking done. So being able to have that back end to ensure that the files are synced across the different cloud providers, without developing a heavy interface for logging in on the different systems, exchanging tokens, et cetera: I think that could help quite a bit to bootstrap a shared workspace across different cloud drive or file providers. I'll put the link in the chat, of course: rclone.org. Open source, customizable. You can also merge different file systems. It can do really quite a lot. And the daemon mode means it can be managed on the back end, on a server, wherever we need.
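
For readers who want the concrete shape of this: rclone's real sync subcommand does the mirroring Fabien describes. A minimal sketch of a back-end wrapper, assuming Node; the remote names gdrive: and dropbox: are placeholders for whatever was configured beforehand with rclone config:

```typescript
// Minimal sketch of the server-side sync Fabien describes: shelling out
// to rclone to mirror a folder of research PDFs between two cloud remotes.
// "gdrive:" and "dropbox:" are placeholder remote names; real ones come
// from running `rclone config` first.
import { spawn } from "node:child_process";

function syncPdfs(source: string, dest: string): Promise<void> {
  return new Promise((resolve, reject) => {
    // `rclone sync` makes dest identical to source; --progress logs status.
    const rclone = spawn("rclone", ["sync", source, dest, "--progress"], {
      stdio: "inherit",
    });
    rclone.on("close", (code) =>
      code === 0 ? resolve() : reject(new Error(`rclone exited with ${code}`))
    );
  });
}

// A few dozen small PDFs should sync in seconds at most.
syncPdfs("gdrive:papers", "dropbox:papers").catch(console.error);
```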

Frode Hegland: Thank you. I'm sure there might be comments on that, but just going back to Dene's important message here: what we posted is not easily experienced. We really hope that you will test these things in headsets, and we'll make sure that everyone has headsets through whatever means necessary. I'm still trying to order yours; there's a problem with the fact that you and I are in different countries, but it is an ongoing battle being fought. You kind of need that extra one. So yes, those are still a picture and a video. Then, Alan, we are also talking about things like that. But Peter, you're still looking into how to be headsetted, right?

Peter Wasilko: Yes, I've got to figure out which one will be best, given my bifocals and prescription. I want to make sure that if I have to get inserts, they wind up being the right inserts. So I'm trying to find some articles, to see if I can find some actual bifocal users who've tried the different headsets, and what their reactions and experiences were.

Frode Hegland: Yeah, that's a good point. So, Brandel, I got contact lenses, and I got them at normal distance, and I put on the Vision, and it's awful. Because apparently, when I took out the lenses, just to try it with glasses: if I try it with my reading glasses, it's beautiful and crisp. So the kind of prescription that should be used is not normal 20/20 walking-down-the-street; it is a reading prescription. So I'm going to have to get different contacts. This is something you may want to talk to the Vision people about, for when people order. I'm probably going to get the wrong inserts now, and the inserts aren't the most expensive things in the world, and I'm happy: this is called testing. But in the case of Peter, representing another customer, it can be really, really annoying.

Brandel Zachernuk: Right, yeah. I don't have personal experience with bifocals yet. So I would hope that the instructions Apple is able to give to, you know, sort of accredited or associated optometrists are to seek the appropriate prescription for the appropriate virtual image distance of the device.

Frode Hegland: There were no questions and no issues. When I had my vision tested at Stanford, they tested my prescription as in what I need for normal, and when I uploaded my prescription, that's all; there was no comment on what it means. But it does seem now that this focal distance, which is, what, 1.4 m away or something like that, is something that needs to become public knowledge. Because putting on the Vision Pro is absolute heaven. It is none of this faffy Quest nonsense, as you know, right? But when you have that issue of wearing glasses, which a lot of us do, it becomes a "what the heck" as well.
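
An editorial aside on the arithmetic: if the virtual image distance really is about 1.4 m, as Frode guesses (his recollection, not a confirmed specification), basic optics gives the vergence demand, which is why an intermediate or reading prescription suits the headset better than a distance one:

```latex
% Vergence demand P, in dioptres, for a virtual image at distance d (metres):
P = \frac{1}{d} = \frac{1}{1.4\,\mathrm{m}} \approx 0.7\ \mathrm{D}
```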

Rob Swigart: Yeah, yeah.

Brandel Zachernuk: I wanted to jump in to talk about the process of making something maximally or appropriately accessible in a non-virtual-reality environment as well. It's something that I think is interesting. My actual job is internal evangelism for the spatial web, which is a lot of fun, and I've been talking to a lot of folks who are doing things like data visualization and analysis, as well as people playing with things like graphical techniques and splatting, just for evaluation purposes. And one of the things that is really important for them is to have a continuum of access, where you can see it on a phone, it's better on a Mac, and it's best in a headset. And I feel like my own personal ad hoc conventions of using things like the boilerplate are a good start, but it's going to be interesting to see how people move between those. Because one of the most useful things that came out of the idea of responsive web design practice was that it wasn't a separate application. Sometimes there's an overreliance on exactly two form factors: desktop, you know, computer screen size, and laptop screen size. And right now, the formulation and presentation of what a VR website is, per se, is just on Quest and on Vision Pro. It's literally a computer screen: it's presented as though it's, I can't remember the user agent on Quest, but it is iPad on Vision Pro today, or it's Mac, which is iPad.

Brandel Zachernuk: That's because people don't want to ship information in the user agent. But the idea of a responsive and appropriately scaling web experience, one that has the ability to be legible on all of those platforms and then to maximally leverage the most ideal one, is going to be an important one. Especially because, and I haven't had one myself, but I understand that a number of Android phones now support web AR or webXR. And there may be some decisions where that doesn't provide the modalities that are necessary for the kinds of stuff we're doing. But it's an interesting thing to ask: is webXR per se all you need, or do you need to be able to have hands and head? What do you need, and what are the requirements that you assign to a given experience, versus something that's intrinsic to what webXR is per se? You know, is it all web devices, or only ones that have these things? So I'd be very interested in how people feel different experiences have the ability to flex in that way, what the conventions are, and, to whatever extent it's actually possible, to codify them. That would be amazing. I don't have a lot of hope for that, but we should be aiming to build things that work everywhere and work best where they work.
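
The "works everywhere, best in a headset" continuum Brandel describes usually starts with feature detection. A minimal sketch using the standard navigator.xr.isSessionSupported check; the two app hooks are hypothetical, and the WebXR type is declared inline since it is not in the default TypeScript DOM lib:

```typescript
// Sketch of progressive enhancement for a webXR page: detect support
// and fall back to a flat, legible 2D view otherwise.
interface MinimalXRSystem {
  isSessionSupported(mode: "immersive-vr" | "immersive-ar"): Promise<boolean>;
}

// Hypothetical app hooks, not a real library API.
declare function enterImmersiveMode(): void;
declare function showFlatFallback(): void;

async function chooseExperience(): Promise<void> {
  // navigator.xr is absent on browsers without WebXR.
  const xr = (navigator as Navigator & { xr?: MinimalXRSystem }).xr;
  if (xr && (await xr.isSessionSupported("immersive-vr"))) {
    enterImmersiveMode(); // headsets: offer an "Enter VR" button
  } else {
    showFlatFallback(); // phones and laptops still get a readable 2D page
  }
}

chooseExperience().catch(console.error);
```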

Frode Hegland: Yeah, that sounds like a lofty and reasonable and important goal for sure. Peter.

Peter Wasilko: Yeah, I was disappointed that Apple didn't have a beefier section on its website addressing glasses users with the Vision Pro. And also, it seems like the marketing department at Apple has a very strong ageist bent. You only see these beautiful young people with perfect vision, and it'd be really nice if they did some ads with middle-aged people who have glasses and showed that they can make use of the technology too. And there should be a section, you know, "for your optometrist", with a little thing you print out and take to your optometrist, and it will tell him what settings he needs to put in his prescription simulation device in order to replicate the correct focal distance of having the headset on, so that you can get a prescription that works perfectly for it. Especially since you're getting dedicated inserts for the headset, it would only make sense to have a dedicated headset prescription, as opposed to the current trade-off for a bifocal user between your distance and your near prescription. Unless, of course, you need to simulate the distance and near differences within the headset because of how the headsets work; I'm not sure how that tech works.

Frode Hegland: But Peter, we do know one thing. And Brandel, if you can get this information from someone at Apple, please do; it's so unnecessary. But Peter, what I did find out from my testing is that trifocal doesn't make any difference. What you need is to have the prescription that you use for reading.

Peter Wasilko: Okay, that’s good to hear.

Frode Hegland: Not close reading, but, you know, about a meter and a half, and the fact that that's still... Yeah. No. Exactly. Dene.

Peter Wasilko: It'd be nice if Apple gave us the exact distance too, so you could go to the optometrist and make sure that you have an optimal prescription for it, as opposed to a next-best guess. Because I don't know whether the best distance for holding a physical book would be the best reading distance for it to be, you know, completely optimized. If that makes any sense.

Frode Hegland: Absolutely, absolutely. Fabian.

Fabien Bentou: Yeah, I pushed a couple of links in the chat following Brandel's comment. And actually, the fact that Dene couldn't open the page, and that Andrew didn't assume that somebody without a headset would go there, is a great example of it. One of the beauties of webXR is that it works everywhere. But it's not magical: it technically works everywhere, but the design, which is not a trivial matter, has to take that into account. So there are a couple of links that show that, and the idea is not to find the lowest common denominator, because of course, if your immersive page is not immersive anymore in order to work everywhere, what's the point? It has to work everywhere and make the best use of the medium that you have. Which, again, is hard, as we're still figuring out what XR can do. But I really think it's worthwhile. And I had a personal experience like this: I'm traveling, I have my headset, I can use it in the plane, and I tinker with it. And then I go out and I take a cab, and it was not convenient; I don't necessarily trust the space and everything. So I want my experience to translate to my mobile phone. I hadn't thought about that use case specifically before. So let's always put the effort into responsive design and accessibility, because basically they go hand in hand. Honestly, it is challenging, but in my opinion, it's very much a worthwhile challenge.

Frode Hegland: It's super important, because Apple owns the ecosystem for the Vision, meaning that if you use an Apple app, you can't just take it off and use it in something else. So we've got to work on that transition as well, Fabien, absolutely. So, Ken, please. You're unmuted.

Ken Perlin: I was muted. I'm always muted. It's a good thing that doesn't happen in real life, or no one would ever hear me. I just went to the web page on my Quest 3, and the testing page, the one with the round circle, didn't do anything, didn't go in. And on the other one, I got a blank, lovely light gray room with nothing in it. So that's the experience on this device, presumably. Maybe it's

Frode Hegland: Andrew, is that.

Andrew Thompson: A Quest 3, you said? I haven't tested it on Quest 3; I assume it should be similar. The gray space, that was testing the gesture interactions. Did you ever see hands?

Ken Perlin: I will look right now. I'm in there, so I'll go look. Yeah.

Andrew Thompson : If you do see hands, pinch your left fingers together and you should get a PDF to sort of tinker with gesture work.

Ken Perlin: Okay. So I’m going to go enter VR. So I’m in.

Brandel Zachernuk: This is the Thompson 20.

Andrew Thompson: That's the one we were testing live in person. So it's kind of just text in space at the moment, assuming the server hasn't gone down.

Ken Perlin: It doesn't believe I've put down my controller; it still shows me... Oh, there you go. Now it finally figured it out. Okay, I can see black spheres.

Andrew Thompson : Yeah, it should just be tracking the joints.

Ken Perlin: And I see, that's very nice tracking. That's great. But that's of course Meta. And then when I point my finger, I get the lasers. So that's good. And I'm looking at a nice text window, and yeah, I'm manipulating it in this weird, awkward way. Yes. So that works. Yep, it's working great. Now that I know what I'm looking at, it's great.

Frode Hegland: It's interesting, Ken, that you say "awkward way", because the discussion we started towards the end of last week is that this is not a computer game, so we don't aim to give the user full, complete interaction. We aim to give them useful interactions.

Ken Perlin: Well, the reason I said awkward is because it's treating the text page as a six-degree-of-freedom object for me to grab when I pinch. But in reality, I don't want it to be a six-degree-of-freedom object. I want it to be constrained to be vertical, or snap, or things like that. So it's awkward because it has implemented, correctly and wonderfully, the lowest-level interface, but there's still design work yet to be done above that. So this is not a criticism. It's all working.
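
A minimal sketch of the constraint Ken is asking for: the panel follows the pinch position, but instead of being a free six-degree-of-freedom body it stays vertical and only yaws to face the reader. Assumes three.js; the function and its inputs are illustrative, not the project's code:

```typescript
// Follow the pinch, but keep the text panel upright: position tracks
// the hand (with a snapped height band), while pitch and roll stay zero
// and only yaw turns the panel toward the reader's head.
import * as THREE from "three";

function followPinchConstrained(
  panel: THREE.Object3D,
  pinch: THREE.Vector3, // pinch position from hand tracking
  head: THREE.Vector3 // viewer head position
): void {
  // Position: track the hand, but clamp to a comfortable height band.
  panel.position.set(
    pinch.x,
    THREE.MathUtils.clamp(pinch.y, 1.0, 1.8), // metres above the floor
    pinch.z
  );

  // Orientation: yaw toward the reader only; the panel stays vertical.
  const yaw = Math.atan2(head.x - panel.position.x, head.z - panel.position.z);
  panel.rotation.set(0, yaw, 0);
}
```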

Frode Hegland: Oh no, absolutely, I didn't take it as a criticism. It's just the context of it, because some of these interactions come from some earlier game work that Andrew has done. So now we're looking at exactly what you said, Ken, so we're on the same wavelength: how to constrain it in a useful way so it doesn't become too much. Yes, Fabien, that's the right URL. Leon. But also, actually, just to explain a brief thing: I don't think you were here last time we went through this, so it's worth mentioning to all of you the current interaction, because we want any random academic to be able to go into this. You click on the link and you get the sphere. And the point of the sphere is that it'll say two things (currently it doesn't actually support that). It'll say "tap on me to enter VR"; this is when you're viewing it in a headset, of course. And then also, "I am going to go to your wrist; tap on me again for controls". So you have the headset, you're seeing the sphere, you touch it, and it wobbles a bit as the room becomes your entire space.

Frode Hegland: And then the sphere gets small, and it goes to where your watch would be: either on the left hand by default, or both hands, or controllable by the user; that's to be discussed. So now the user knows that, whatever they do with their hands, because that's going to be learning and different gestures, if they tap on this sphere they get this control panel. So we're going for a very, very simple "here are lots of options for what to do" on a global control. We had the flat idea, but then Andrew suggested we make it a prism, so it has three sides. You have your normal side, and let's say there's a slider: you press, and the thing moves down, so you now have a huge slider. So it's all very comfortable. And on the third side you have maybe user settings. We're trying to make the entry points super simple. We can do a lot of testing ourselves for what is useful based on what the users will actually need, and then we can get into more sophisticated stuff later. Leon, thank you for your patience.

Leon van Kammen: Yes. So, one thing: I just wanted to try the VR headset, but it turns out I need to charge it. I just came back from a conference, FOSDEM, with a lot of open source updates and activity. The guy from Mozilla who is maintaining pdf.js was there as well, and I wanted to watch his talk, but it was so crowded that there was no way I could enter the room. I have a link here; I think they will put up the video of his talk soon, so there might be something interesting in there. And one thing I noticed: there was a lot of talk about the Apple Vision Pro. I checked some videos of some reviews, and kind of combined that with all the feedback I got from other people, and I started to realize how hard it is for a company like Apple or Meta to steer people away from, you know, comparing it with the real world. A lot of this feedback is, I don't know, you get feedback from people who are not really interested in VR, or who have never really tried to work in VR. So you get this very shallow feedback. And I was thinking maybe this should be a warning for us as well: the moment we say "reading in XR", we might want to be careful how we phrase our things. I was even thinking maybe we should perhaps broadcast something like "a different way of reading", because I'm a bit afraid that we'll get a lot of the wrong responses from people who prefer books anyway and who just want a quick peek, and then they say it's not as good as books, or blah blah blah. So yeah, I just started to realize how hard it is to do something new and not get the easy naysayer feedback thrown at you.

Frode Hegland: Yeah, it's definitely a different kind of text. And with the concern of going into talking about knowledge, which so many forums do, I think the discussion today of spatial hypertext and looking at connections and stuff puts it in a different way. Putting on our headsets is probably not going to be the ideal thing for people who prefer just to read paper books, for a very long time. Anyone else have a comment on this before we move to Fabien?

Brandel Zachernuk: Yeah, I would say that one of the things that I've always been struck by, and I've been pursuing word processing and the idea of writing in VR since at least 2014, is that even the most quantitative person will just wax mystical about these unknowable but inevitably superior characteristics of reading in meatspace: the things that they do with books, the smells, all of these things. It's just like, yes, maybe. But if we can enumerate them, we can characterize them, and we can talk about whether they're possible to convey or pull through into a digitally mediated medium or not. If we have the ability to give voice to them, then we can actually talk about what those attributes are. And some of them are things like the fact that a book, bent, has the ability to show you how far through it you are; or that a book has, on average, a more memorable set of landmarks that you have the ability to call on, and random access or semi-random access by being able to turn through it. But once you actually enumerate those things, once you have the ability to recognize those things, then you can make incidental and immediate tests of ways of backfilling those capabilities, and then include the sort of monumental improvements that things like Control-F can give you in the context of more intrinsically digitally mediated material. So one of the things that I would say is important is to actually enumerate the things that are good about books and then say: okay, these are our answers to them. Some of them are imperfect, but here is what they are. Rather than saying reading is equivalent, reading is exactly like this and that. Because, as anybody who has enjoyed a Kindle knows, it's still not the same as a book, but it's lighter, and that's often better. So yeah, it's tricky. I will read what Dene said as well, because she's got a lot of experience with this too.

Frode Hegland: Fabien, is your comment on this or something else? Because I have a quick one. Dene told us to read a book which discusses different types of reading. Thank you, Dene. Mine's downstairs. I think this is a really, really important point, because it's like saying to some... not that one. No, no, I'm thinking about this one.

Dene Grigar: I wanted to point out this one. If you look at it, it's been thrown across the room so many times it's dented all over. But he is exactly the kind of person we're going to come up against. And this book came out in the early 90s, I want to say, let's see, 94 or something like that. So this argument has been a long... Yeah, 94. What a memory. A long argument. So we're...

Allan Laidlaw: That's like The Gutenberg Parenthesis that just came out, because I dipped into that book, and it seems like a similar kind of cynicism. Interesting.

Dene Grigar: Well, what's interesting, and I don't want to derail this, but I think it is interesting to note some of these arguments. His major argument is the tactility of a book, the smell of the pages, you know, all of these things. But what we argued back at the time was: yes, but we have sound now, we have visual images. I mean, this is all walls of text, right? There are other modalities that come into play. And hearing... the argument is, and John Barber, my husband, is an expert in sound, that hearing is the only... Yeah, that's the book. Never mind. That's it. Perfect. Thanks.

Frode Hegland: Now you're being hypnotized in here. It's a clever book. I started reading it on the flight back. But this is a really fascinating point right now, because the whole pros-and-cons thing is a bit useless. It's a bit like saying, are you into sports? Yes, I'm into sport. And then you just talk about football. It's like, hang on, I'm a figure skater, right? The type of reading in XR and the type of reading on paper are so different. The affordances are, to us, so obviously different. I mean, reading a runestone in Scandinavia is a completely different thing. So I don't think we should pick a fight with paper people, of course, but maybe we need to redefine how we talk about it. Just like what was mentioned: maybe, going into the headset, we try not to even call it reading. Maybe we just talk about knowledge shaping or something. We change our vocabulary. Because one of the things we have to do for this Sloan grant is to hold up the PDF in XR and read it; of course we do. Job done. What do we do next? One thing that I've asked Andrew to do, a bit surreptitiously on the side, is to take the Fabien wall, but only stretched out. So, for instance, I gave him a JPEG of 12 pages of a document, and the only affordance we're going to have is you can go back and forth, like this. That's it. Just to get that testing out of the way. Because we are not talking about reading; we are talking about interacting with our knowledge that happens to be primarily in textual form. But of course, we shouldn't shy away from other things. Sorry, Fabien, for that huge detour, but it was so important, wasn't it? Anyway, over to you.

Fabien Bentou: So I'll take a little detour too. I ended up loading my headset, because I was curious what people were talking about. And I think, in terms of being inclusive and responsive, we're not setting a great example here. I wanted to be able to mirror the headset, to show the other people here who don't have a headset what they're missing. Because in terms of having the conversation, even though it's not like having a first-hand experience, still, even for something you're so passionate about, which is to document this whole journey, we need to see it. When we say, oh, we see the gray space or the black text, and then when you pull the text, et cetera: however I describe it, it doesn't do it justice. I know it was impromptu, and I don't want to be critical about it, but in terms of preserving and having traces of all those attempts and discussions, it would be good next time to have somebody who does tinker do the mirroring, or a short, even five-second, video. I wanted to do it, and of course I have a last-minute bug because I changed a version of whatever, so I can't do it and I can't fix it. But yeah, please, next time let's try to have it. I don't want it to be too formal, but some kind of visuals of what we're doing. Otherwise, half of us have headsets, and the others don't and have no clue what's happening.

Andrew Thompson: Actually, Fabien, we have video for all of this, with the exception of the test site we were doing in person, because that one wasn't planned to be shown off yet. I'll link it to you, but it's all in Basecamp. It was shown off last Wednesday.

Fabien Bentou: Sure, but did we... did anybody... maybe I dozed off, but did somebody show them?

Andrew Thompson : Know this is kind of a resurgence of Wednesday’s conversation. Which not everybody’s at the Wednesday meetings. So we should have relinked it. You’re absolutely correct. But I’ll grab that right now.

Frode Hegland: Yeah, that is important. But also we really try to... Oh, sorry, you were interrupted. What were you saying? Sorry, the connection today has been quite jerky, and we've had all kinds of issues.

Dene Grigar: No, I was talking about the point I was trying to make about Birkerts's book, and the one you were holding up, How We Think. I just dropped it into our Slack channel and dropped the link in here. So I will stop. Thank you.

Frode Hegland: Okay. Yeah, sorry about that; that was obviously not my intention. I'm having problems logging into Slack for that. Hang on. Vint just asked a question about the symposium.

Brandel Zachernuk: While on the subject of the conventions of responsive design, I will also say: my point is that we do not have them yet. There aren't any clear table stakes in terms of what the base expectation is for how to transition an experience from the windowed, to the phone, to the Mac, to the computer. And so that's one of the things that we get to do together here: to figure out what we expect that to be like. And I agree, a video would be neat. What might be even neater is to record some actual action in it, so that you get to see what a virtual reality participant sees, not in a video but in the actual experience. That'd be fun to work on. Fabien, if you have any experience with those things, then maybe we can jam on a way to make those things, not idiot-proof, but a lot easier to convey into a system, so that people get the ability to see that in the same place, in the same way.

Frode Hegland: Yeah. I mean, it is really annoying, the reason Peter doesn't have a headset, and I'm glad that I was the guinea pig for that. We all do need headsets as soon as possible, because, to say the obvious, watching a video is a very, very poor substitute. And also, we are targeting only webXR in this community; we have the Quest and the Vision both as our targeted devices. So we should all have access to that. A video just really doesn't do it justice. And I know, Brandel, you agree with me, but, you know, we're trying to find a middle way. Peter.

Peter Wasilko: Yeah. Since we don't need all six degrees of freedom when we're positioning documents in VR, and it would be nice to just be moving them around within a constrained space, that also leaves several degrees of freedom, which our controllers can sense, available for other uses. For instance, we might be positioning a document in an XY plane, but also allow rotation, pitch, or elevation of our hands to control other factors like text magnification, or semantic zoom to bring in deeper levels of additional content, or enable us to click in extra glosses and secondary material. So there's a lot that can be done there. Also, I think we should take a look back at the old Canon Cat and the LEAP-key interface. We might be able to do something analogous to that when we're dealing with the vast information space of, you know, dozens of virtual documents floating around. So you might reach onto your Mac keyboard and use LEAP keys to jump around and highlight different documents, in order to decide where we want to focus our attention next.
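
A minimal sketch of Peter's idea: with documents pinned to a plane, a spare degree of freedom such as wrist pitch can drive magnification instead of movement. getWristPitch is a hypothetical hand-tracking accessor, and the ranges are arbitrary:

```typescript
// Map a spare degree of freedom (wrist pitch) onto text magnification,
// while document position stays constrained to the XY plane elsewhere.
// getWristPitch() is a hypothetical accessor returning pitch in radians.
declare function getWristPitch(): number;

const MIN_SCALE = 0.5;
const MAX_SCALE = 3.0;

function magnificationFromPitch(): number {
  // Map a comfortable wrist range (-45 to +45 degrees) onto the zoom range.
  const pitch = getWristPitch();
  const t = (pitch + Math.PI / 4) / (Math.PI / 2); // 0..1 across the range
  const clamped = Math.min(1, Math.max(0, t));
  return MIN_SCALE + clamped * (MAX_SCALE - MIN_SCALE);
}
```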

Frode Hegland: Sorry, Andrew, I was just replying to your comment there. Yeah. Thank you, Peter. Leon.

Leon van Kammen: Yeah, I just wanted to take a small step back and simply thank Andrew for keeping up with us, or sort of going along with us, taking in all this input and doing all this work you've done so far. I'm super happy that you're doing this. So thank you.

Frode Hegland: Yeah. Excellent. Ken.

Ken Perlin: I just wanted to say that I definitely appreciate the energy of wanting to create a continuum of experiences; that's relevant, and I think it's been addressed recently too, the conversation between this device and this device and one of these devices. But also, having spent lots and lots of time in recent months inside my Quest 3 in video passthrough, I'm starting to believe that there are limitations on how much we can move forward and still continue to think they're the same thing. Partly because, once you go into the video passthrough experience, the old-fashioned book starts becoming even more relevant, in a way that it is not when you're looking at screens. You can look at your book, and you kind of want to say: I would like to do real-time video analysis and machine learning on this diagram in this book, and be reading the book while I'm interacting with the 21st-century content. And for the entire paradigm of "I'm in my physical room, and there are books and there are objects and there's a globe and there's all this stuff", all of the interface modalities that we have yet to develop are not even appropriate for anything that you would do on a screen, including on your phone. So I think there should be at least part of our energy that embraces that 100%: this is reality, and reality just has this, you know, Harry Potter-like quality about it, but it's reality.

Allan Laidlaw: I’d love to jump in and add something to that.

Frode Hegland: I can't believe you added Harry Potter there, Ken.

Ken Perlin: In my mind, that's basically what we're doing. We're all going to Hogwarts here, hopefully without Voldemort.

Allan Laidlaw: That's excellent. I love that direction you're talking about, Ken, in particular. And one of the questions I have about the project, file it under "concern" maybe, is: I know it's very explicit that the project is about reading PDFs in VR. If we find that that is not...

Allan Laidlaw: ...comfortable, or it only works in a certain environment or for a certain amount of time, is that spiritually similar to, say, reading a book in passthrough and getting metadata, you know, emanating off of it, or other ranges of experiments? Or is this really, truly "we have to pound through any hesitation": a PDF must be read in VR?

Frode Hegland: No, no. Just to say vocally what I'm saying in the text chat: our job is to augment academic reading, and PDF is the default academic stuff. This is why I wrote the document that I put in Basecamp right before this, so no one's had a chance to see it. We need to really separate our efforts. We need to have a pleasant interaction with holding a PDF, no question; we need that. But we shouldn't spend forever refining interaction with a piece of dead material. Absolutely not. So this is why I really appreciate today's discussion, where we have been redefining a little bit what reading means. Right? It certainly shouldn't just be reading current stuff. I think we can fulfill our obligations to the basics, to Sloan, very soon. What Andrew is doing and Adam is doing is fantastic. You know, we haven't actually promised them that much. So for us, the real question is: what does knowledge look like when primarily encoded as text in a massive, massive space? That's what we're looking at. So, Alan, absolutely no need to worry, and I'm very grateful that you brought it up. And, Brandel, thank you for showing the world there. Now, I just want to highlight that that particular test we did is random text, and there are too many interactions you can do. It was purely to get a, quote unquote, desktop. Because what stressed me out very much in the weeks before was: what the heck does this initial thing look like? Like Dene was asking earlier, it's kind of an in-between state. So now we need to look at how you show a bunch of documents, and what we should prioritize showing in there: we could have a view on just glossaries, we could have a view on people. And these are becoming really, really interesting aspects. But until we had something to stand on, which turned out to be no floor and text in space, I was freaking out. Now I'm no longer freaking out. Brandel, with your big smile, I want to hear what you have to say.

Brandel Zachernuk: Well, I'm glad that you're no longer freaking out. I wasn't aware, but I guess I should have taken the hint.

Brandel Zachernuk: I would echo what Ken was saying about the sort of inevitability that some of the things we should be aiming to make, sort of, necessary for VR are unreachable in other places. It was something that I was interested in thinking about when I was talking to some of the folks in the data science and data visualization teams. They were like, how can we make this work everywhere for everybody? And maybe you don't. But maybe there's some stuff that you do for thinking, which you then have the ability to package up for showing in a different way. So that if the thinking must be done in VR, then for the showing, you can construct a distillation that is appropriate: still better in VR, but good enough for the thing to be seen, to be shown and stepped through, and have those things demonstrated. In terms of the table stakes and the portability, I think it's at least more important in the immediate term to be able to say: here's what the experience looks like, enough that you can understand what you will get from being in virtual reality.

Brandel Zachernuk: You know, that here's where the things are. Because, yeah, the other thing, from being a multi-decade veteran slash victim of navigating 3D space in digital content creation tools, is that 3D navigation is just so hard for some people to do in a 2D world. And it's just so anticlimactically trivial within a 3D stereoscopic world to be able to see that these cubes are not the same; that this box here and this box here only look the same because one's closer than the other. And some people literally never get it, and it can be sad for them, but then they can get a really great job doing earthmoving instead of computer-aided design, like I did. So, yeah, there are ways that the correspondence can be mapped, there are some places where it can't be, and it's going to be an interesting process of discovery: what we can backfill, and where we have to say, no, you just need to put one on, man.

Frode Hegland: Absolutely, Brandel. By the way, I cannot open the App Store because I have a European account.

Speaker14: Oh.

Frode Hegland: I can't open anything. Nothing. Right. Which is okay, because it's primarily to look at Andrew's work, so that's totally fine for now. But if you can, please open Reader up. I don't need a full review from an Apple human; I just need to know if it basically works.

Speaker14: Oh, yeah. No. It works.

Brandel Zachernuk: Sorry I missed your call. I was just like yesterday. So. No, it works.

Speaker14: Okay.

Frode Hegland: And thank you, Mark. Just one quick second. Andrew, where are you? Top right, okay. So, the experiment that I wrote about: we now do need to find a better way of mapping this. We'll do as you say: we'll find a way in Basecamp to say, you know, "we agree you should do this", and then you guys comment. We absolutely need something like that. Anyway, I sent you a JPEG that is 12 pages wide. The idea is just that, in a test, you load this up, the whole thing, and the only interaction the user can do is go back and forth. And of course you can't go too far; it stops at the first page and the last page. That's it. Just a really basic test to see if that is a nice way of reading. Because don't forget, this project is partly research. So if it turns out it's rubbish, we write that up.
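
A minimal sketch of that test's only interaction, assuming the strip is one wide image and swipes move it a page at a time, clamped at the first and last page; all names are illustrative:

```typescript
// The 12-page-strip test: the only interaction is swiping back and
// forth, clamped so you cannot scroll past the first or last page.
class PageStrip {
  private pageIndex = 0;

  constructor(
    private readonly pageCount: number, // e.g. 12 for the test JPEG
    private readonly pageWidth: number // width of one page, in metres
  ) {}

  swipe(direction: -1 | 1): void {
    // Clamp so the strip stops at the first and last page.
    this.pageIndex = Math.min(
      this.pageCount - 1,
      Math.max(0, this.pageIndex + direction)
    );
  }

  // Horizontal offset to apply to the strip mesh each frame.
  get offsetX(): number {
    return -this.pageIndex * this.pageWidth;
  }
}
```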

Andrew Thompson: And I have a suggestion. If we want to deal with the 360 space, what if we have it in front, but when you swipe it, instead of it just moving left to right, it sort of pivots around you, as the sketch after this exchange shows? It's the same sort of thing, in front of you, left to right, but you could technically, like, spin it behind you if you want. It might just be interesting to play with.

Frode Hegland: Let’s test both. Okay. Two thumbs up.
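
And a sketch of Andrew's variant: rather than translating, each swipe pivots the pages around the reader on a cylinder, so the strip can technically wrap behind you. Again assuming three.js, with illustrative names and numbers:

```typescript
// Place pages on a cylinder around the reader; a swipe changes the
// accumulated offset, pivoting the whole strip around the viewer.
import * as THREE from "three";

function layoutOnCylinder(
  pages: THREE.Object3D[],
  radius: number, // e.g. 1.5 metres from the reader
  anglePerPage: number, // radians between page centres
  swipeOffset: number // accumulated swipe, measured in pages
): void {
  pages.forEach((page, i) => {
    const theta = (i - swipeOffset) * anglePerPage;
    // Position on the cylinder; the reader's head sits at the origin.
    page.position.set(radius * Math.sin(theta), 1.5, -radius * Math.cos(theta));
    // Rotate each page to face inward, toward the reader.
    page.rotation.set(0, -theta, 0);
  });
}
```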

Andrew Thompson: And then, for a place to keep track of this, I was suggesting using the to-dos on Basecamp. Generally to-dos are like "this is a required thing", but we could make it like "here's a suggestion for Andrew". It will ping me if you use my name, and then I can just checkmark them once we've tested them. It'll be pretty simple to keep track of; everyone can see it as well and contribute.

Frode Hegland: We need a list of suggestions, and we need a list of to-dos. To-dos: that's what's agreed on by the community, do it, baby. And the other one... You want to...

Andrew Thompson : Keep them separate? Okay.

Frode Hegland: Yeah, I think so. Thanks. Okay. Mark. Sorry.

Mark Anderson: Yeah, very quickly, just an exhortation, say, in terms of mentioning what's next. And I mention this thinking of people: please, please don't let us just count the countables. I've just seen it too many times. It seems sensible to do, because it's the thing that's easiest to get hold of. But after years of trying to make useful stuff out of other people's metadata, one consistent error I think happens is this: the things you can easily get hold of are the things that you really already understand, and they will, I think, offer you less of the new medium. That's my gut feeling. It's true in the 2D space, so I don't see why it would be different in the 3D space. So it's just a caution, for us to be more adventurous in what we try to categorize.

Frode Hegland: This is very, very important, Mark, and I plan to organize a date to come down to see you and Dave and Les, partly for the PhD nonsense, but also to spend a couple of hours with you going through exactly this: what is spatial hypertext in XR? And to record that session, because you guys have an enormous knowledge of the history and will tell us all kinds of things. Because yes, it is pointless just to do the mechanical work if we don't look beyond it at what we are really trying to do. And what we're really trying to do, obviously, is help people think, and in this context, think about how things connect. So yeah, Mark, thank you for shouting that from the rooftop. Leon, please; we have a few minutes left.

Leon van Kammen: Okay. Very quickly. Keep in mind, I did some tests with XR layers, which basically improve the readability and resolution of things. I noticed that it’s perhaps not always the easiest to have Z ordering of objects, in other words, having objects in front of each other; it works a bit in a different way. So I was thinking that if this becomes an issue, like, hey, why do we have all these artifacts, then it might be an idea to take inspiration from Reader, which is basically a very simple experience. It starts very simple. And we could do it the same way: we basically fade out all the other objects when somebody really wants to read, and that’s when the XR layer gets activated, so there’s no complex ordering of objects, and we avoid all kinds of rendering issues. So this could be a sort of quick, low-hanging fruit to hide the fact that it’s pretty complex to have a lot of XR layers and traditional 3D objects going on in the same scene. So that’s just a small note.
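
A rough sketch of that fade-out-then-read switch, assuming a WebXR session with the layers feature enabled and WebXR type definitions available (e.g. @types/webxr). The layer creation follows the WebXR Layers API; the scene-object shape, resolution, and timing are assumptions:

```typescript
// Entering reading mode: fade ordinary 3D objects, then add one quad
// layer for the page so there is no z-ordering conflict to resolve.

function enterReadingMode(
  session: XRSession,
  gl: WebGL2RenderingContext,
  space: XRReferenceSpace,
  projection: XRProjectionLayer,
  sceneObjects: { material: { transparent: boolean; opacity: number } }[]
): void {
  // 1. Fade the surrounding scene out (in practice, animate over ~300 ms).
  for (const obj of sceneObjects) {
    obj.material.transparent = true;
    obj.material.opacity = 0;
  }

  // 2. Only now activate a compositor-rendered quad for the crisp page.
  const binding = new XRWebGLBinding(session, gl);
  const pageLayer = binding.createQuadLayer({
    space,
    viewPixelWidth: 2048,  // generous resolution for text (assumed)
    viewPixelHeight: 2048,
  });
  session.updateRenderState({ layers: [projection, pageLayer] });
}
```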

Frode Hegland: No no no no, that’s not a small note. That’s really, really important. And just two things on that. Number one, because it’s you, it just made me think of Visual-Meta and URLs: we should be able to store a lot of it in the URL. But on this issue: you all know the engineering diagrams that are exploded views, right? I think that’s a really provocative way for us to look at it. And I think this relates somewhat to what you were saying, Leon: you can read a flat document, even in XR, and then, through an interaction, you choose to explode it to see a multi-dimensional view. If you have an engineering drawing on a specific page, of course you can just have a 3D model appear, and that’s nice. But if we can work enough on defining what the different aspects of the document are, we should be able to take almost any document and explode it. And then maybe even explode it further into the wider community, but not necessarily have to, because sometimes a flat document is fine.
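
One way to picture the exploding, as a toy sketch: tag each logical part of a document with a kind, give each kind its own offset direction, and interpolate from flat to exploded. The part kinds and offsets here are invented for illustration:

```typescript
// Toy "exploded view" of a flat document: each kind of part moves out
// along its own axis as t goes from 0 (flat page) to 1 (fully exploded).

type Kind = "body" | "figure" | "citation" | "footnote";

interface Vec3 { x: number; y: number; z: number; }
interface DocPart { kind: Kind; base: Vec3; }

const OFFSET: Record<Kind, Vec3> = {
  body:     { x: 0,   y: 0,    z: 0    }, // the text itself stays put
  figure:   { x: 0,   y: 0,    z: 0.3  }, // figures float toward the reader
  citation: { x: 0.4, y: 0,    z: 0.1  }, // citations fan out to the side
  footnote: { x: 0,   y: -0.3, z: 0.05 }, // footnotes drop below the page
};

function explodedPosition(part: DocPart, t: number): Vec3 {
  const o = OFFSET[part.kind];
  return {
    x: part.base.x + o.x * t,
    y: part.base.y + o.y * t,
    z: part.base.z + o.z * t,
  };
}
```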

Michael Bonfert: If we go that.

Allan Laidlaw: Oh, sorry. Go ahead.

Ken Perlin: So, just to respond directly to what Leon said: the way we are doing it in our lab, with the WebXR system we’re building, is that if you have a bunch of text and images and various things, we’re presenting them, for the most part, while you’re walking around in search mode, as textures on a 2D canvas. And as you get close to something, if you do have a high-quality PNG, etc., we then fade to a layer, so that you walk up to it and, oh, I’m in reading mode now and I really want high quality. So that’s an engineering-level solution that we’re using now, because it seems to be the best we can get with what we have.
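
A small sketch of that distance-driven crossfade, with made-up thresholds: cheap canvas textures while you roam, blending to the high-quality layer as you approach:

```typescript
// 0 = search mode (canvas texture only), 1 = reading mode (layer only).

const READ_DISTANCE = 0.8; // metres: fully in reading mode (assumed)
const ROAM_DISTANCE = 2.0; // metres: fully in search mode (assumed)

function layerBlend(distanceMetres: number): number {
  const t =
    (ROAM_DISTANCE - distanceMetres) / (ROAM_DISTANCE - READ_DISTANCE);
  return Math.min(Math.max(t, 0), 1); // clamp to [0, 1]
}

// Each frame, fade one representation down as the other comes up:
//   panel.canvasOpacity = 1 - layerBlend(d);
//   panel.layerOpacity  =     layerBlend(d);
```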

Frode Hegland: Yeah, I think that’s a very elegant solution. Allan, your hand went up and down. Where are you?

Speaker14: Got

Allan Laidlaw: Yeah. The exploded diagram. My approach is: first off, I think that we’re still really missing an alignment kind of meeting, of, like, here are all the various ways that we could go about the next two years, right? There are so many options, and being able to see the value and the bets that are involved in each approach, I think, would be incredibly helpful before we dive into one thing. When I hear “the exploded diagram,” you know, that is super interesting and fun-sounding to me. But rather than “let anyone try out their text and explode it,” to me it feels more like the Apple dinosaur demo, where it’s kind of like: hey, we’ve got a text in mind and we’re going to turn it into a museum for you, right? We’re going to show you all the things that are possible that you wouldn’t ever think of with books. And I know that is a different kind of experiment, and it’s certainly not a practical one, but I think there’s value in discussing that option as well, or eliminating it.

Frode Hegland: This is the intention for the Wednesday session that you are heading: to do exactly this, to check our alignment, to check what is useful. So I very much look forward to doing that on Wednesday. Absolutely. Just a little note on the exploded diagrams: it’s not necessarily a pre-designed thing, although that could be cool. I’m just saying, if the system knows enough about the contents, the user should be able to define how it explodes. Boom. Ken, your hand is still up. Please go ahead. So. Okay, Brandel, your hand is up.

Speaker14: I’ve. No, no, I.

Brandel Zachernuk: My phone is too close to me, which is why I have a funny fisheye view of my face.

Brandel Zachernuk: WebXR layers are an interesting trade-off in that, as you’ve discovered, Leon, they don’t have z-depth occlusion, because they’re not using the same buffers. To that end, it’s one of those things that will probably only be with us in the nearer future. And, you know, it could be many years, but it’s not a forever thing: it exists while performance is challenging enough that people are willing to make these major concessions on things like which object is in front of what. And I’m not sure where I stand on those things. I mean, one of the challenges with WebXR per se is that it’s very difficult to wipe away all of the privacy concerns that come with it. But, you know, I think it’s worth our using it in order to figure out what the spatial web beyond it looks like. So there will be a number of different solutions and cul-de-sacs of technology that we need to explore in order to understand what the best shape of all of these things is. Because literally what layers is, is saying: actually, don’t make WebXR render this at all; make the lower-level compositor that’s running in the Quest’s operating system do it, so that all of the downsampling and all of the challenges that come with rendering those WebGL pixels can be entirely circumvented.

Brandel Zachernuk: That’s too much inside baseball. But what it means is that you’re starting to create a sort of hybrid renderer, where a number of things are responsible for each piece of it, and it gets very messy very quickly if you don’t have a single unifying way of reconciling all of the different signals about what should be on top of what. So yeah, it’s a useful technology, as Ken has pointed out in terms of the way his lab is using it, but it does require that you have a particular point of view about when and how you use it. It’s not a catch-all for just making stuff better. So that’s something to bear in mind. And if you care to know the details, those are some of them. And if not, my apologies.
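
Concretely: because quad layers bypass the WebGL depth buffer, "what is on top" is decided purely by array order in the render state, with later entries composited over earlier ones. A sketch, assuming layer type definitions are available and the layers already exist:

```typescript
declare const session: XRSession;
declare const projectionLayer: XRProjectionLayer;
declare const posterLayer: XRQuadLayer;
declare const textLayer: XRQuadLayer;

session.updateRenderState({
  layers: [
    projectionLayer, // the WebGL scene: composited first, furthest back
    posterLayer,     // a quad drawn over the scene...
    textLayer,       // ...and the reading surface, always on top
  ],
});
```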

Frode Hegland: These technical discussions are so important because they are literally about the scaffolding of our new world, and our world is not fully open. But I think we do need to balance them a little bit more with discussions of what we should do. Allan, on your point of what we’re doing, I can tell you something now so we don’t waste time on Wednesday. These are from the documents, and also what I think we’ve agreed. We need a basic interaction, an interaction with the PDF. And currently you can do too much: you can throw it across the room, and that’s just not really useful. So you need to be able to take it from your corpus, read it, turn the pages, maybe look at many pages, maybe look at them as a rectangle. That’s basically it. That just has to be done to fulfill that requirement. And then there is the question of viewing a library. And “a library”, as Mark pointed out: I have a very specific idea of what I mean by that, but of course it’s a very loose term. So in the library it could be just your own documents, and you can see their relationships, either just a timeline by when documents were released, or defined terms inside them, and all kinds of things. So that’s going to be interesting experiments. Now, one thing we need to remember is that there are three parts to this Sloan document. One of them is actually the visual metaphor; it’s an entire third. Of course, we’re not going to be too tied to it, but it means that the notion of making a modern, hugely interactive document is really within our purview.

Frode Hegland: So I think we can use the excuse of the visual metaphor to put a hell of a lot of stuff in there that, when viewed in a rich environment, really explodes. This notion of an exploded diagram is just a cute way of looking at that. But it also means a discussion of how these entities relate. Because in Visual-Meta we already have document names, which academics don’t usually use, and I’m using them in my own software so that if I click on a citation, it immediately opens the original document if I have it. We can really hide stuff in there to reinvent the world. And that is what is so important about your thinking and your way of questioning, Allan. Now that we know that we have extreme freedom within this tiny set of constraints, I really hope on Wednesday we can have a very good discussion and do user testing around not necessarily what an academic wants, because that’s probably a promotion, I’m not joking, but what an academic needs, right? An academic needs to understand how things connect better. And that’s our world, right? So almost anything we want, we can shoehorn into that, whether it’s spatial hypertext or whatever. But your question is really clear: we need to have some more community understanding of where to focus. Yes. So let’s figure that out together.
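
A toy sketch of that citation-following idea: if each document carries its own canonical name in embedded metadata, a click on a citation can be resolved against a local corpus. The field names here are invented for illustration, not the actual Visual-Meta format:

```typescript
// Resolve a clicked citation to a local copy of the cited document.

interface DocMeta {
  documentName: string; // canonical name embedded in the document itself
  path: string;         // where the local copy lives
}

const corpus = new Map<string, DocMeta>(); // documentName -> local metadata

declare function openDocument(path: string): void; // host app provides this

function onCitationClick(citedName: string): void {
  const hit = corpus.get(citedName);
  if (hit) {
    openDocument(hit.path); // jump straight to the source
  } else {
    console.log(`No local copy of "${citedName}"`);
  }
}
```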

Michael Bonfert: Okay. I hope.

Allan Laidlaw: I hope I’m hearing the same thing that you said. Yeah.

Frode Hegland: I’m just saying that you are actually the key to this, because you introduced the whole user story mapping. And, you know, some of the early stuff was really, really simple, and you had to get the damn stuff onto the headset, so some of it we didn’t need to think about, we just needed to do it. Now we’re at the stage of deciding what are the things we’re going to research, and that’s why we have to spend these sessions, and we have to spend them correctly, and they have to be reported correctly, because at the end of the year we need to tell Sloan: listen, we built this thing, these are the things we didn’t build, these are the things we should have built, and these are the things we built that were absolute rubbish, but it’s important that the world is fully aware of them. Right now we don’t know any of these things.

Allan Laidlaw: Well, Ken, I’d love to talk with you, and other members obviously, about the...

Allan Laidlaw: The best bets you have. You know, like, I think that there’s we’ve got some ideas and I’m really interested in in. Kicking the tires on the ideas or or finding confounders and getting our you know, you’ve had a lot of experience in this, and I want to, as quickly as possible, jump ahead in the right way to think about it, because I know I’m not thinking about it the right way, I so I. If that makes sense. I’m thinking about making typical products that are on 2D screens.

Rob Swigart: So yeah, I think.

Ken Perlin: For me, I think part of the challenge has been expanding our ambition space and thinking about things to do that we wouldn’t normally do, because we don’t have multi-person WebXR experience yet. I keep thinking back to something I learned: in 2006, when Steve Jobs started telling his advisors about the iPhone, most of them said, well, what would anyone do with it? And, you know, it seemed like a reasonable response in 2006. So I think we’re in that space now, where it’s all going to seem obvious in a few years. It’s like, oh, well, of course, everybody knows this is how people interact now that we can do this. But we don’t have that experience yet, so for us the limits of our imagination seem to be part of the problem. But this is a nice group to start trying to set challenges and say: what if we tried to do this? And then see if we can do it, in my view, yeah.

Speaker14: Excellent. That’s good, I appreciate that.

Ken Perlin: But yeah, I’m happy to talk more one-on-one. We’ll just schedule it. Sure. Absolutely. Anytime. Happy to help.

Speaker14: Excellent. Thank you.

Allan Laidlaw: Yeah. Oh, we’re in New York, so. I mean, I could buy a coffee.

Ken Perlin: Yes. And I graciously accept.

Frode Hegland: Have you seen the price of coffee these days, Allan?

Ken Perlin: This is why I’m accepting so quickly.

Speaker14: Right, exactly. Yeah.

Frode Hegland: Hey, would you like some gold or coffee? Just pick one. That’s wonderful.

Ken Perlin: It goes to my headset fund, the money I save when he buys the coffee. That’s right.

Speaker14: I’m going.

Frode Hegland: We should wrap up. And I’m very glad you guys made that connection there. But I’m going to ask you all one thing, and that is: please, please, please fantasize and write it down. Okay? This is a perfect time for that. We have a good crew, but you, Fabinho, right now can’t commit because of other commitments. If you can write a paragraph or a sentence or a page, or draw a picture, or do whatever the heck you want, of how you want this to be. Yes, exactly. Like this one.

Allan Laidlaw: Yeah, yeah. The idea is that you’re making schemes, but you don’t want to write everything out. So you have all your notes on your arm, and you just, like, push them up and then drop them. Anyway.

Speaker14: Yes.

Frode Hegland: Perfect. And that really fits. It really fits with the things we’re doing.

Ken Perlin: Hold that up again. I want to see that again. Yeah.

Speaker14: Okay.

Allan Laidlaw: I used to be a cartoonist in my former life, so I like this: the person’s face is down at the bottom, in kind of an Escher perspective, and he’s making plans in the sky, but with his other hand he’s grabbing the notes that are just hanging out on his arm and sort of tossing them up there.

Speaker14: Cool.

Frode Hegland: Anyway, yeah. No, that is really cool. But on a procedural thing: yeah, I need you to write this stuff down. And also, for those of you who are Mac users, and those of you who don’t mind, please write it in Author, because it just makes it so much easier for me, at the end of the day, to take what we together decide goes in the book, because then you get it exactly how you want. Plus, you get to know the software. But even if you write it in Basecamp, in a Word document, even if you write it on a blog, whatever, we’ve got to get this richness down. And just look at how different we think in this community. We have almost nothing in common mentally. It’s fantastic. You know, we’re all white males right now, but still crazy wide, and we’re going to go wider. So on that note: those of you who can make it, same time, same place on Wednesday, I’ll see you there. And we may try to find another day to do more design talk. Any final comments from anyone else? Keep an eye on our YouTube channel: I am uploading the videos of us testing in the lab. It was filmed black and white, high quality, la la la. Most of it’s really boring, but I think it’s a really nice little reference, just sitting there changing colors and stuff. It’s kind of fun. Not required viewing. All right, have a good evening, have a good Tuesday, and see you on Wednesday. Bye.
