17 April 2024

Frode Hegland: Hello. Long time no see.

Dene Grigar: Yeah. Hello. Okay.

Frode Hegland: Oh, I’m adding one item to the announcements. That’s just a little update on a camera thing that’s relevant to our community: 360, 8K.

Dene Grigar: What does that mean?

Frode Hegland: That means that yesterday this company called Insta360, which makes cameras that film 360 videos, released an 8K camera. Because you’re filming two sides, it’s not 8K for any one thing you see, but the tests show that it’s much more usable than what we’ve been doing before. So it’s going to be interesting to experiment with, especially now that YouTube is making a proper viewer for the Vision.

Dene Grigar: That’s good. It’s really good. Yeah.

Frode Hegland: So let’s see who’s joining us today.

Dene Grigar: Andrew. Of course. There’s our Andrew. Hello. How are you?

Frode Hegland: Hello, Andrew. So, Leon will not be able to make it. He texted that he can’t come, but he says, “Please tell Andrew that I like the new modifications, including the improved motion smoothing.” So that’s good.

Andrew Thompson: That is good to know. Nice. Yeah, I’m pretty happy with this week’s update. But it’s not very flashy, it’s just kind of back-end stuff and changes to what we already had. But I think it’s good progress.

Frode Hegland: Yeah, that’s exactly what we need right now. Well, we’ll see who else is coming in.

Dene Grigar: Out of town. Mark’s on vacation, right?

Frode Hegland: Yeah, on proper vacation. No laptop allowed. Vacation.

Dene Grigar: You didn’t do much vacation. You weren’t with your family much at all. You were working the whole time, right, Frode?

Frode Hegland: No. It was mostly Edgar being ill. That kind of meant we couldn’t do the things we were planning to do, so yeah, we didn’t get to do much vacationing. We went to a place called teamLab Tokyo today, which is kind of an immersive artistic experience. They’re getting quite popular. You walk into a room and there are things dangling down and there are mirrors. So that was kind of cool. And we saw a large, life-size Gundam, you know, the mech-warrior thing. I was very excited by that. I think that’s kind of inspirational, because when we put on the headset, I like to imagine it’s not just our eyes; it’s like our whole bodies become bionic. So I actually bought a little plastic model myself. I’ll put it together in London.

Dene Grigar: That’s great. I’m reading The Three-Body Problem right now. You know, I had watched the TV series that was based on the book. Now I’m reading the books, and it’s interesting because it was written a while back, right? It’s about ten or twelve years old, and a lot of it takes place in a VR game. And it’s just fantastic, it’s really so well done. And when you and I were playing around with the spatial personas the other day, it almost reminded me of the way in which this person envisioned VR space. It was really quite amazing. It’s by a Chinese writer.

Frode Hegland: You’re watching the series as well, or only reading the book?

Dene Grigar: I had only watched the series. Now I’m reading the book; I want to read the book.

Frode Hegland: Yeah. No. That’s cool. So yeah, I’ll put the agenda in here again. Mr. Fabien is coming, so I’m pasting it one more time to make sure Fabien gets the agenda. It is kind of cool what we’re doing right here today: three continents. You know, East Coast and West Coast America, Europe, and East Asia. That’s just crazy that it works.

Dene Grigar: We were saying that back in the MOO days. We were all playing Scrabble at three in the morning, you know, with people in Japan and Europe. It was just so freaky, especially back in the early 90s. Right? Yeah.

Frode Hegland: Even just a basic computer game, when you know you’re playing with people all over the place, it’s really quite a thing.

Dene Grigar: It was funny. When I got to my first job, I was in an English department, and nobody liked computers except me. And I was so lonely there. But I found that if I stayed involved with my virtual friends, I had so many colleagues. Without the net back then, I don’t think I could have survived the loneliness I would have felt. It was those folks that I saw at night or early morning or afternoon, playing in these spaces, that kept me going. This serves, you know, a part of that too. So.

Frode Hegland: Social is important. Absolutely. Right. I’m not sure who else will be joining us, so I think we can have a look at the agenda. You all have the link. So just briefly on visionOS developments: we had an update this week, and it’s worth mentioning a couple of things. One thing is that in Author it’s too easy to accidentally select text. So in the context menu, same as we have on iOS, the first item we had to put there now is deselect, which is very odd. Not something we thought we’d need. Other than that there are a few practical issues, but it’s going really well. And yeah, nothing else really to report today. But if you have a Vision, and you have Author or Reader, please do test and give feedback, obviously. So Dene and I did a special FaceTime thing this week. Dene, do you want to comment on it?

Dene Grigar: Well, yeah, it’s really quite interesting, because it’s almost a little freaky, right? You’ve got these heads that are not clearly presented; they’re very virtual. You’re not completely there. And it is your persona that you took of yourself: you hold up the headset, take a picture of yourself, and move your head around and all that stuff to kind of get a good perspective. But it looks pretty good, right? It looks pretty good. And then you see your hands moving, and you can walk around and look at an object together. Frode was recording us, so I couldn’t see him in the video because he’s the one taping. But if somebody was with us, you would see two people communicating with each other in the space, pointing at the document, talking about the document. And, you know, it’s funny, because I was in the middle of finishing up the second section of the case study, and one of the things I asked for was to be able to work together with students in kind of larger tutorial groups instead of just one on one. And we could do that with this. And I’m trying to figure out a way now to get my hands on enough headsets, Andrew, for the fall semester, so that when I teach the spatial computing class, we can actually have these kinds of five-person events. We’re only going to have 18 people in the class, so it could work out pretty well. But anyway. So yeah, that was quite interesting.

Frode Hegland: Yeah, absolutely. And, Dene, you said something I found really insightful: it doesn’t go into the uncanny valley, it kind of ignores it, because you have a sense of presence, but it doesn’t feel freaky in that sense. So that was really worthwhile.

Dene Grigar: Well, can I say something even freakier? So it’s interesting, because I grew up in a house that had been a funeral home, right by two cemeteries, in the middle of Texas. So I had all this land behind us, but next to me were these two cemeteries, and I lived in the old funeral home that had been servicing them. And my mother imagined herself as kind of a medium. The things she would describe to me that she would see were these kind of disembodied heads and hands moving around the rooms and stuff, and she would communicate with these things. And I know it sounds weird, but she actually had these experiences in that house. And so when I saw us, Frode, I immediately thought of my mother and the way she explained the spirits that lived in our home. She had the house exorcised by a priest, and that didn’t seem to help much. I had my own experiences, but not that one.

Frode Hegland: Okay.

Dene Grigar: Just saying. Yeah, but when I saw you, I was like, oh my Lord God, my mother would just have a heart attack if she saw this. Her dreams have come true.

Frode Hegland: That is quite spooky.

Fabien Benetou: So just briefly on this. When I was, I would say, a kid or a teenager, at some point the light was shining in my bedroom just a certain way. And there is this faces-in-places kind of phenomenon: when you see two dots and a line, most people can’t help but see a face. And I was not religious or a believer, but this was spooky. And basically, I noticed over the years that the more I watched, even though it’s fiction, or like documentaries about hunting ghosts and whatnot, the more stuff like this I saw. And at some point, I don’t know why, but I completely stopped watching horror movies or documentaries about people looking in haunted houses. And since that moment, it’s funny, because when I look back, there is a very direct correlation between how much of that content I was watching and how many things I was seeing. And yeah, if you’re not prompted for it, I guess, if you’re not somehow expecting to recognize such patterns, then you just don’t. It goes right past you; you don’t see them.

Frode Hegland: Yeah, that’s a good point. Talking of patterns: so that little thing there is live. This is the Insta360 X4, which came out yesterday, so I managed to find it in a Japanese store. The previous Future of Text we had was actually only recorded in 360. And it’s really interesting: a 360 camera needs to have a sense of presence. Sorry, I just have to plug it in while I’m talking. If you put it in a funny location, like the wrong height or an odd place, it feels very strange to see it. I did an experiment with an interview once where I put the camera between me and the subject. It was actually Alan Kay, and it’s a cool effect, but when you watch it, you feel that your own head is stuck there between these two people. It’s very uncomfortable. So it’s an interesting experience. And now that YouTube are making a native app for the Vision Pro, I look forward to seeing what this higher 8K quality will be in there. So it’s in a way the opposite of what we’re doing in the group, because it’s purely about the environment and it’s purely passive. But it may be interesting to some extent anyway. Something to experiment with. And the other thing from today here in Japan: I don’t know if any of you know Gundam. It’s a Japanese anime thing, you know, huge exoskeletons. We went to see one of them; they have two here, near life size. Absolutely freaking massive. Edgar wanted a toy, and I bought one too, because when we go into XR, I want us not to feel that it’s just here; I want us to feel completely empowered. So the stuff that Fabien has been posting in Slack, you know, like the Humane interface projected onto your hand and all these things, the whole thing is kind of nicely provocative. Any other announcements today from anyone?

Dene Grigar: I’m going to be in Montreal. I leave next Tuesday at 6:00 in the morning and arrive that evening at 5:30. There’s an event, a whole conference, in honor of Bertrand Gervais, who was the director of a very famous lab in the Quebec area of Canada. We’ve been partners for a long time, a lot of projects between my lab and his. And he’s retiring after ten years of being a Canada Research Chair. So there’s a big kind of celebration to say goodbye to him, a symposium in his honor. I’m invited to give a talk there on Wednesday morning about The NEXT, and I’ll be talking about the VR integration that we’re going to be doing, Andrew, which I’m excited about. But anyway. So yeah, I’ll be gone from Tuesday morning all the way till Saturday, so I won’t be here next Wednesday.

Frode Hegland: Okay. Good to know. Sounds great. Rob, do you...

Dene Grigar: Know Bertrand? Rob, have you ever met Bertrand Gervais? Oh, he’s amazing. Yeah.

Speaker5: Where is he?

Dene Grigar: He’s in Montreal.

Speaker5: Yeah. Montreal University? No, he’s.

Dene Grigar: He’s at the University of Quebec in Montreal. Okay. So, Fabien, I have to get my French back. I’ll be speaking French all next week; it’s been a while. When I come back, maybe I’ll be able to talk to you.

Fabien Benetou: And actually, there you have to speak better French than even in France, because they are pretty strict about not letting any English word, like “weekend,” sneak in, and all these kinds of things. So really, good luck.

Dene Grigar: Yeah, yeah. I like these folks; I’ve known them a long time. They’re very, very nice. I can speak it okay today, but listening to them and responding when they speak quickly is going to be hard. Hopefully I’ll get it back; usually by the fourth day I’m okay.

Frode Hegland: Yeah, that’s cool. Any other announcements from anybody else? Okay then. Dene, any further discussion on case studies?

Dene Grigar: Yes. I actually want to share my recap. I’m going to drop it into the chat here, the Zoom chat. So I wrote that section on what I want, and then I had a recap that I’d like to share with you all. Let me pull this out so I can see the chat better. There we go.

Frode Hegland: That reminds me, I’m just posting a link to something adjacent to the 360, which is just very high-quality normal video that I’ve been filming, for anybody who wants to see. It’s just in the chat; I forgot to click enter. Yeah. Please put the link in.

Dene Grigar: I need to put it in the Slack channel because it’s too long for the Zoom chat. So if you go into your Slack, I’m going to drop it into our Future Text Lab space. And I’ll share my screen with you.

Frode Hegland: Okay. Yeah. Good idea.

Dene Grigar: Okay, so what you’re looking at here: that whole section I boiled down to this list. One computing device, in any location I want to work. A 360-degree environment; the ability to see in front of and around me. The ability to put objects wherever I want. Multitasking functionality. A spotlight option to highlight one document or item at will in order to focus on it, but still leave the other items ready for later. So there are maybe three or four things in front of me, but this one is the one that’s spotlighted in some way, and the other ones are dimmed so that they’re not in the way. A seamless reading and writing experience that allows for marking up text, annotation; this is stuff we’ve already been talking about. The ability to save a scene, so that all my documents and other items are located in the same place they were left once I log out. So I’m working, I stop, I come back, and it’s still there. It’s a kind of general save motion, so I don’t have to put everything back again. Space sharing with other people; of course, that’s already happening with the personas. Control over the aesthetic design of the environment. I mean, Frode, when we were playing in the personas the other day, when I was playing back the video, the sound in the environment was such that I couldn’t even hear you, right? And I couldn’t concentrate on the document because the environment was so

Dene Grigar: Pervasive, right? It was so overwhelming. But some people might like that. So, having some ability to decorate the space as you will, within reason; we don’t want to make it completely decorative. An option for leaving some items open to visitors but locking down others. So let’s imagine that I have some private documents I’m working on, or a draft of something I don’t want anyone to see until it’s ready, but I invite you into my space; some things can be locked and some things aren’t. The ability to choose ways to interact: I can use my finger, use a stylus, use my keyboard, whatever I want. And then finally, a note mode that makes it possible for others to see comments and respond to them. This is part of the whole discussion that I have in that second section, which I drew up the draft of in the Slack channel a couple of days ago. And I’m still working on the next section, which is about what is currently available and what’s left to be developed. So this is where I am with my case study currently.
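[Editor’s sketch] A few of the wishes in this list, saving a scene so items stay where they were left, spotlighting one item, and locking some items away from visitors, could be modeled as a small serialized layout record. This is an illustrative sketch only: the function and field names below are invented for this example and are not taken from the lab’s actual prototypes.

```javascript
// Illustrative sketch only: names here (saveScene, restoreScene, spotlighted,
// locked) are invented, not part of the group's real code.

// Serialize a workspace so a later session can restore every item where it
// was left, including which item is spotlighted and which are locked.
function saveScene(items) {
  return JSON.stringify({
    savedAt: new Date().toISOString(),
    items: items.map(({ id, position, spotlighted = false, locked = false }) => ({
      id,
      position,    // e.g. { x, y, z } relative to the user's space
      spotlighted, // the one focused item; others would render dimmed
      locked,      // hidden from visitors when the space is shared
    })),
  });
}

// Rebuild the item list from a saved scene string.
function restoreScene(json) {
  return JSON.parse(json).items;
}
```

The point of the sketch is just that a “general save motion” can be a single serialization of item ids plus placement and visibility flags, which is also roughly the shape a shared JSON format between apps could take.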

Frode Hegland: My first comment, of course, is: guys, Dene is our prime academic use-case person, but you’re all also use-case persons. I think this list here, Dene, is absolute perfection, and I think I should put it on our website as bullet points and hopefully invite people to discuss prioritizing, and what they want to add that is relevant or not relevant. If it’s not relevant, we list it somewhere else. Yes, Fabien.

Fabien Benetou: So, on the content of the list: to me, it’s exciting and frustrating, in the sense that I would argue most of those needs I’ve addressed at least once in one of my prototypes. Not necessarily, I mean, far from perfectly, but I’ve tried some of those. So in a way that’s exciting for me; I wasn’t too crazy, I guess, in a bad way. What’s frustrating, though, is first, most of them are, as you all know, poorly done, in the sense that they are not efficient or reliable. I can show that they work, that it’s feasible, basically, but maybe, for example, you can only manage five documents, and not in high resolution. So that’s one aspect. And the other aspect is that they’re all very much individual experiments, and I cannot just bring them together one by one; one will break the next, 99% sure. And I can already say that for most of them, having them individually feels good; it helps you feel you’re on the right path. But it’s indeed when you start to combine them that it becomes powerful. And some of them I’ve done, but super limited. For example, multitasking: the most recent thing I’ve done was being able to control the web page from the browser. So it’s kind of multitasking, but you only have the browser, you don’t really have apps, and those apps are from the browser itself. And I would argue that for some of those, to do it reliably across all platforms, you need a ton of resources. So for those, I would say, target a couple of your own devices, or some of the platforms you own, just as a way to kick-start the process. But all platforms? Usually it’s just too many resources needed.

Frode Hegland: Right. Okay. I have comments on that too. The first one is that the fact that you built these things is phenomenal, because of the importance of experiencing these things. You know, in my journey with XR, so many things have changed that I haven’t noted them all down. But of course, having an integrated environment is key. Now that I’m having a little bit of success with Author, you know, being featured in the App Store and all that wonderful stuff, I’m spending more time on the little details, like what is in the context menu. But that’s in a fully mature environment where I can actually do that. Here, we’re building this entirely from scratch. And as we’ll talk about a little bit later today, a reminder of what we talked about on Monday, a little bit on how we can move stuff around, because one app won’t be enough for everything, obviously. So the question is, to reduce user friction, how much should we do in our world, and how much should we leave for someone else to do, rather than try to do everything? Yeah.

Dene Grigar: Yeah, I mean, I agree with you. And I think, if you read the draft of my section where this comes from, I’m so infected, I want to say the word infected, by the MOO experience, because we were able to do so much independently. I was a wizard in my own MOO, and I could give powers to each of you to do a certain number of things. You could build your own space, or dig a cave, as we would call it. You would have the ability to make your own objects. You could build your own fireplace and have your animated fire going. All these things were done with ASCII art back in the day, right? And you could leave things there, lock things, lock up your room, leave your objects out if your room was unlocked so people could come and play with them, visit people in their own spaces, and interact through text. So that was my world from about 92 to 2006, when I moved here and gave up my MOO, because I couldn’t get the university to open up the ports for the server. That’s a long time to be playing in that space. And so when I get into this space...

Dene Grigar: I kind of want that. I mean, that was 30 years ago. We were doing these things 30 years ago in these textual spaces on the net. It seems to me that there’s the possibility we can do it today, 30 years later, with this more advanced technology. It’s just a matter of figuring out the way to make that work. Bless you. Yes, the Gophers. Oh, God, Peter, I lived in this. That’s how I did my dissertation research: through Gopher, and Archie, Jughead, and Veronica, as they were called back then. So I think it’s not rocket science to envision that if this worked in 92, 93, we can make it work in 2025, when this is due. That’s my premise. So the things I’m writing about are things I could do 30 years ago. I defended my dissertation in a MOO in 1995, and it was the first and only one that’s ever been done that way. I had 50 people from around the world watching me defend my dissertation. And I still have the ASCII art for the auditorium that was built for the event by Jan Holmevik from Norway, by the way, Frode, and Cynthia Haynes from the United States.

Frode Hegland: Oh, Jan Holmevik. Yeah, it’s in one of our books. Thank you for the reminder.

Dene Grigar: He’s lovely.

Frode Hegland: So here’s a crucial question, guys. Dene has written this brilliant list that is very, very useful, and I want to have it on our website as well as in Slack. These are the kinds of things we want, and they need to be positively visible. Should I put it on a “Dene’s wishes” page, or should we put it in, what’s the structure of this? Because hopefully the rest of the community will add to it over time, so there should be some kind of coherence to it.

Dene Grigar: Well, I’d like to say I’m the author of this first round. So don’t put it on a separate page, just put it in our set and say this is from Dene’s case study, it’s an excerpt from Dene’s case study. Now respond to this: what else do you want?

Speaker5: Right?

Dene Grigar: But let me just finish. I’d like to think of it as a way to get people thinking: here’s what she’s saying, now what else is there? It’s a way to trigger a conversation. But I should get credit for writing that, because it’s part of my case study.

Frode Hegland: Yeah, absolutely. So what I’m looking at, if you all go to our website, hang on a second. Okay, so the first link on our website is research questions and use cases, right? So it’ll be in there somewhere. And here we have, excuse me, three things: the research questions, which are separate and done; use cases, which is Dene’s use cases plus a basic thing; and then it goes to interactions to support the use cases. Maybe that’s where it should go, actually.

Dene Grigar: Yeah. So let me ask you this. I’m looking at our website, and there’s so much on here now that I can’t figure out where everything is. So let me show you what I’m looking at, and maybe you can help me out with the logic of this. Let me go back to where I started. Okay. This is The Future of Text, right? Let me go. No, no. Wait, wait. This is your Future of Text site, right?

Dene Grigar: Here’s all of this. I go to XR, right? That’s us.

Frode Hegland: Not really. That’s kind of an announcement. Future Text Lab is us; that’s our meetings.

Dene Grigar: Was that on this website?

Frode Hegland: No, no, it’s not on this website. Where I put Andrew’s code and everything is on the Future Text Lab site. If you click on Lab there, in the menu on the left, and then click on futuretextlab.info, that’s where I put all the live stuff for us.

Dene Grigar: I have that as a separate page, but there’s no easy way in. I couldn’t figure out how to get from the main page to this page. Okay, but...

Frode Hegland: This is a completely different website.

Dene Grigar: Yeah, I got that. Okay. Yeah. So use cases will be where it goes. You’ll have the whole case study to put on here soon. I should have it finished this weekend.

Frode Hegland: Yeah, but if you. Okay, okay. I’m sorry.

Speaker5: Go on.

Frode Hegland: No, no. Don’t worry, I’ll share. I’ll share. You can see it, right?

Speaker5: Yeah.

Frode Hegland: So the logic of this: Future Text Lab is on the top. One: research questions and use cases. The research questions are these three questions from Dene, which are just perfect. Simple. That’s what we refer back to. So I’ll go back a page. Under use cases, we have these so far, and I will bullet-point them here and link to the full article as well. And then below that, I’m trying to break it down into actual interactions. So, while in the library, we should be able to open a document, change the listing; these are mostly obvious. But this is where we should put the things for Andrew to make magically happen, right? Annotation and note making came out of last week a lot. So I think, if you look at the top here, it links to all the different things; document interactions and library interactions are both there. So maybe we should have something like environment interactions too. But anyway, Fabien, what’s up?

Fabien Benetou: So I’m not going to reply on how to do this, but on what I had in mind to use it for. When I said I’ve done most of those experiments one way or another, maybe I’m overconfident, maybe not. And one way for me to check, of course, is to put next to each line one experiment, or a couple of experiments. And I think it would be interesting to have, for example, Dene try them, to see: okay, this is not at all what I had in mind; or, I’ve tried this, which is better. So basically, the list of specs or needs or requirements, and next to it my experiments, some of what Andrew has done, some of what Brandel has done, some things that are totally unrelated even to XR but could be useful. So basically, I imagine this kind of grid, and ideally, at the end: oh, this one solution ticks three of those needs, or all of those needs; or, oh, I’m making a new prototype that addresses five of those needs, or something like this. So, one way to look at it, yes, but ideally to be able to comment both with solutions, and then Dene and others could reply in terms of the shortcomings of such solutions.

Frode Hegland: I’m just going to write that down. Okay, I’ll work on making a structure for that for us and post it on Slack, so we can build something. More on that today, Dene, or should we talk a bit about the symposium and planning and the book?

Dene Grigar: Symposium would be great.

Dene Grigar: Okay. So I met with the folks from the Murdock on Friday. It was interesting, because they’ve already given us permission to use the space, and I thought the talk was going to be about the use of the space, but it wasn’t. I met with the CEO of Murdock. And Murdock sits on $2 billion worth of money they give out to nonprofits. They give out money to, you know, food banks and the symphony and all these different things, the university, my university. So they give out millions and millions of dollars and they don’t run out of money, right? They just have lots and lots of money. And so I met with the CEO, and I couldn’t understand why I was meeting with her. Come to find out, she had worked at Microsoft as an executive and was in tech, really heavy in tech, loves programming, super into tech and computers. She just wanted to find out about the project. So I met with her for an hour and talked to her about what we were doing: the goals, going through the advisory committee, you know, all the things that we’re trying to achieve. We talked about the Apple Vision Pro; she wants to get one. We talked about, you know, the ins and outs of it, and I told her to go ahead and get it and to upgrade it to the most recent version. And it was a wonderful conversation. Finally I said, do I need to sign a contract for the space? And she said, oh, well, you know, just contact Mike, he’ll take care of it. So anyway, I’ve got the address, and we’re ready to put that on the invitation. So I’ll start working on getting an invitation produced, so we can send it out to the people we really want to be here. That’s the update on that meeting. I thought it was really quite interesting.

Frode Hegland: Yeah, absolutely. Now, one thing we need to do is to decide when, because soon we’re going to send out the invitations, and who to invite. So I’m just opening a list here of the old stuff. Michael Benedikt we should obviously invite; Peter talked about him for the book club. He seems to be active in the community still, even though he wrote the book a long time ago. So I guess he’s a good one. So if you have any...

Speaker5: Further.

Dene Grigar: Cyberspace. That was the book club reading for Monday, Michael Benedikt.

Frode Hegland: Yeah, exactly.

Dene Grigar: So Peter Hein was a person I recommended; he’s written some articles. I also think Douglas Rushkoff.

Speaker5: But you might know him.

Frode Hegland: Yeah, I know Rushkoff. I’ve known him for quite a while. Yeah, he’s an interesting fellow. He kind of goes in and out of this. I can invite him.

Dene Grigar: He’s been here before. I’ve had him here.

Frode Hegland: Oh, excellent. Very good.

Dene Grigar: Yes, he knows he knows this university.

Frode Hegland: But, guys, even though we’re just in the middle of our session: before next week, all of you, please send me one suggestion for someone to invite, and one interaction you want to have in XR. Okay. Excellent. Good. Let’s pretend that you’ll remember. Anyway, so that’s all quite good. One second.

Frode Hegland: Yeah. And in terms of, you know, the look: I wanted to do a Twitter announcement, just to say, hey, what do you guys think, who should we invite? So I played around with some graphics, you know, generative AI, like a space person on an ocean, because cyberspace. And Fabien pointed out it’s a little bit AI-generated in style, and Dene said it’s not very academic. I agree with you both, but we’ll do something like that soon. We’ll figure it out. I’m working on some other things when I have time. I’ll be back in the UK on Friday night, so then everything will be a bit more normal, and I’ll mock up a few things to see what you think. But I do think that our theme this year should be going beyond current understanding. Meaning that a lot of the stuff we’re working on now is great in the lab, and it’s completely relevant to put in. But when we solicit for the book and symposium, we should really push people, if they’re interested, and say: okay, you think you know this, write your assumptions, but what do you think could lie a bit further than that? Be a bit more provocative. So that’s the thinking around that. It shouldn’t just be extended cognition, because we did that last year; this builds on it. And then just briefly, before we go into Andrew’s work, the real reason for today: on Monday we talked a little bit about what I jokingly, instead of PDF, called SDF, or Spatial Document Format, and we’ll keep that conversation going. But as was made very clear on Monday by all of us, we’re not trying to invent a new document format. We’re trying to invent an approach of using standards that exist, such as, what is it, Randall was talking about, Universal Scene Description. USD, yeah, USD.

Speaker5: Yes.

Fabien Benetou: USD.

Frode Hegland: Yeah. How to plug that in, how to use that with the JSON that Andrew and Fabien are working on. And Fabien and I are looking into the proposal of making your work a little bit more real in this, but we have to wait for some other feedback, so we are not ignoring it. We’re very excited by all of that. But complications, right? So we’ll talk more about that later. But as you all know, at the end of the day, if a document has metadata, it should be at the back; it should be Visual-Meta. It’s just too dangerous to lose things, and it needs to be super, super open. That’s one of the benefits of Visual-Meta: anybody can just read it with their own eyes, and that’s it. Unless anyone has any point on that, I think we can move straight into Andrew’s world.

Dene Grigar: We love Andrew’s world.

Frode Hegland: So the link is, as usual, on future text.info. So if you have that in your headset already, it should be a reload of the page and a click away.

Andrew Thompson: I’m trying to get some better lighting over here, because it’s a wonderful day outside, which means all my lighting is backlit and I’m just a silhouette. So this week I got to add some really important stuff that had been sort of falling behind. There’s been lots of little stuff, but the two main things: first, the pointer system now works totally differently. It was just too frustrating having it always shake, and I needed a new system that doesn’t fall victim to the nuances of the hand tracking. I originally looked into getting our pointer system to match the headset’s system, so, tapping into that. But Dene and I had a really good conversation about it, and we realized that you want the software to be consistent. If you have it on the Vision working with the eye and hand tracking, and then on the Quest working totally differently, people won’t be able to go between headsets. Generally, software, even if it’s multi-platform, acts the same on every platform, at least as far as it can. So I moved away from that a bit and tried designing our own system, based on the best of both. We know the eye tracking on the Vision is amazing, but it’s not that great for really fine selection, and since we have text, that would be a big problem. So, testing it out: it now doesn’t care about the rotation of your hand, so you can have your hand in any orientation. When you first start the point gesture, it chooses the direction you’re facing as the intended direction, and then moving your hand around moves the cursor on a slightly curved plane in the direction you were first looking. It’s a lot smoother, and it responds almost instantly, as opposed to the elastic smoothing we had before.
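[Editor’s note: a rough sketch of the pointer scheme Andrew describes here, capturing the facing direction when the point gesture starts and then driving the cursor purely from hand displacement on a plane in that direction. All names and constants are hypothetical illustrations, not the project’s actual code.]

```javascript
// Hypothetical sketch of a gaze-anchored hand pointer.
// On gesture start, freeze the head's forward direction; afterwards,
// only positional hand displacement moves the cursor, so jitter in
// hand rotation never reaches it.

const PLANE_DISTANCE = 1.2; // metres in front of the user (assumed)
const HAND_GAIN = 2.5;      // amplify small hand motions (assumed)

function startPointer(headPosition, headForward, handPosition) {
  return {
    origin: { ...headPosition },
    forward: { ...headForward },   // frozen at gesture start
    handStart: { ...handPosition },
  };
}

function updatePointer(state, handPosition) {
  // Hand displacement since the gesture began.
  const dx = (handPosition.x - state.handStart.x) * HAND_GAIN;
  const dy = (handPosition.y - state.handStart.y) * HAND_GAIN;
  // Cursor lives on a plane PLANE_DISTANCE along the frozen forward
  // direction; hand rotation is ignored entirely.
  return {
    x: state.origin.x + state.forward.x * PLANE_DISTANCE + dx,
    y: state.origin.y + state.forward.y * PLANE_DISTANCE + dy,
    z: state.origin.z + state.forward.z * PLANE_DISTANCE,
  };
}
```

Because the forward vector is locked in when the gesture begins, the cursor responds instantly to hand movement without needing heavy smoothing.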

Speaker5: All right.

Frode Hegland: Okay, so then the.

Andrew Thompson: Yeah. What was it?

Frode Hegland: No, I was just going to say: I just waved my hand about, and nothing happens, as it should be. I make the gun gesture, and I get it. Yeah, that is night and day. And also, you made the little line come in immediately and then disappear.

Andrew Thompson: I think that may have been coincidence, but I can do that. It just comes in if you haven’t pointed at anything for a while.

Frode Hegland: Yeah. No. That’s it. That’s what’s cool about it. Yeah, this, the actual selection, is night and day. This is what I’ve been quietly waiting for, because I know you’ve had a lot of priorities. I’m glad you did this. This is amazing. By the way, being in a hotel, sitting in front of a mirror because there’s no other place to sit, means that the Vision Pro thinks there’s another human in front of me. So I’m ghosting and I see myself. It’s not the best experience.

Andrew Thompson: The second thing you guys won’t notice right away, but I finally got around to optimizing the document stacking system. So you can now stack many, many documents without lagging the whole thing to kingdom come. Which is very nice.

Frode Hegland: Yeah. I mean, no, it’s at the point where I can give you proper feedback. And hang on, I’m going to record a bit in here, the way it looks, with the layering. So I just remembered to do the long press here for the library, which I’d kind of forgotten about. So I’m now opening a new one. So now I have three layers this way. That’s interesting. Can you tell me the logic of that, please?

Andrew Thompson: Yeah. So there are kind of two things to it. I haven’t given up on the snap distances; those are coming at some point once the more important stuff drops off. But I was playing with randomizing the offset just slightly, relative to the plane, to give it more visual depth. And I liked the effect, so I kept it. It was very easy to implement, so if people are like, that’s awful, take it away, I can. But as a nice little bonus it also helps avoid some z-fighting issues with the text. Even though a user would not want to do this, they can stack two different texts on top of each other, and then of course you’re trying to point at both at once. Because they’re at the exact same z index, the headset really doesn’t know which one you want, so it flickers between the two really fast. Of course that’s kind of user error, because why would you stack them on top of each other? But this also avoids that issue.
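[Editor’s note: the jittered-offset trick Andrew mentions is a common way to dodge z-fighting between coplanar panels. A minimal sketch, with hypothetical names and an assumed offset range:]

```javascript
// Hypothetical sketch: give each stacked document a tiny random
// depth offset so no two panels are exactly coplanar.

const MAX_JITTER = 0.005; // metres; assumed range, small enough to be subtle

function placeDocument(baseZ, rng = Math.random) {
  // Offset pushes the panel slightly toward the viewer only,
  // so documents never sink behind their intended plane.
  return baseZ + rng() * MAX_JITTER;
}

// Stacking several documents at the same nominal depth now yields
// distinct z values, so the renderer and the raycaster both have
// an unambiguous front-most panel.
const depths = [0, 0, 0].map(z => placeDocument(z));
```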

Frode Hegland: So the.

Speaker5: Way it is, can.

Dene Grigar: I ask you can I ask a question I can’t I’m on the I’m actually on the screen with the sphere and it’s telling me that webXR is not available.

Andrew Thompson: It sounds like you had it working before, right?

Speaker5: Yeah, yeah.

Andrew Thompson: Did you turn off webXR as an experimental mode? Not at all.

Speaker5: I have none of.

Frode Hegland: Yeah, updates may have done that on your behalf. Okay. You may need to go and do that again.

Speaker5: Okay.

Frode Hegland: For some reason, it stopped accepting input. So, Dene, one of the things you talked about last week was being able to take text out of a document and put it somewhere. That’s an interesting thing. I can now do that in Reader for Mac: you select text, hit spacebar, and it loads that text up in a separate window. I’m trying to do that also in Vision. However, in the environment here, the language you used in your list was really nice: you said a focus mode. So, Andrew, what do you think about that? A way where, if I could indicate one item and do an action, it would then come to a predefined, comfortable reading distance, maybe with a background, so if there’s something behind it, it’s not transparent. I do what I want, and then I do an action for it to go back again. Like a float mode, or focus mode, and back. What do you think?

Andrew Thompson: Yeah, I like that quite a bit. To use our terminology, that would be a snap distance. And I like the idea of adding a background to it. I’ve been treating our current distance as snap distance one. But what if we turn that into the middle snap distance, where you can comfortably see it but you’re not trying to focus on it? Snap distance one is the one that’s up close to you, just for focusing, and then something like snap distance three is further off and faded, so it’s not very distracting, but you can tell there’s something there in case you need to grab it.
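[Editor’s note: the three-tier arrangement Andrew sketches here (focus, reading, background) could be modelled as a simple lookup. The distances and opacities below are assumed placeholder values, not the project’s actual numbers:]

```javascript
// Hypothetical snap-distance tiers for document panels.
// Distances (metres) and opacities are placeholder values.

const SNAP_TIERS = {
  focus:      { distance: 0.6, opacity: 1.0 }, // up close, for reading
  reading:    { distance: 1.2, opacity: 1.0 }, // comfortable middle view
  background: { distance: 2.5, opacity: 0.4 }, // parked, faded
};

function snapDocument(doc, tierName) {
  const tier = SNAP_TIERS[tierName];
  if (!tier) throw new Error(`unknown snap tier: ${tierName}`);
  // Return a new document record placed at the tier's depth.
  return { ...doc, distance: tier.distance, opacity: tier.opacity };
}
```

The faded background tier matches Andrew’s point that parked documents should be visible but not distracting.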

Frode Hegland: Yeah, Dene, that makes sense to me. How does that sound to you?

Speaker5: Sounds great.

Andrew Thompson: That might be good to work on this week. I could focus on that.

Speaker5: Thank God, because.

Frode Hegland: Now we’re at the point where I feel we have a play space, let’s call it that, for fun, rather than a workspace. And we can now try to make it useful.

Speaker5: Yeah.

Frode Hegland: But, by the way, while we’re talking about distances. Hang on, before I misspeak here, let me just check. Yeah. So, Fabien, and Dene, and Rob perhaps: I sent a video to you in normal messages. It was shot not on the headset but on the phone, though it was shot for the headset. It’s really, really interesting how the stereoscopic visual effect works. I think that’s becoming more and more relevant now that we have spacing in XR. So when I send you the messages, please look at them when you have the headset on, and just have a look. It would be nice if you can see it as well, Andrew, when you’re with me and the headset is there.

Dene Grigar: I’m enabling my webXR to just give me a second. I’m catching back up.

Frode Hegland: To talk about that while Dene is doing the settings: during this trip, I’ve taken a surprisingly large amount of spatial video of everything, and looking back on it, the fact that there is that real-world depth effect, it may not make it as artistic and all of that, but it makes it so much more there. It really is bizarre. You know, they have that in the launch video for the Vision: an adult getting down on his knees to film the family with the headset, which looks absolutely ridiculous. But you can do it with a phone too, and the effect is absolutely insane. So anyway, it would be good if we have more chat on that.

Dene Grigar: So what’s it like walking around Japan with the headset on with your family? Do you get a lot of attention for that?

Frode Hegland: I don’t walk around with it. I very rarely film using the headset; I usually film with the phone. But I have been at the odd cafe, and also on the flights. No real attention. I think the whole headset thing has become quite normalized. People may not know that this is the Apple one; some people do.

Dene Grigar: Are there other people doing it then? So you’re not the only one on the flight with the headset on?

Frode Hegland: I have not seen anyone else.

Dene Grigar: Yeah. So I mean, I guess my question is: does it elicit attention when you do that? Are people asking you questions, or are they curious, asking why you’re doing that? Do they ask what it is?

Speaker5: Well, it’s.

Frode Hegland: It’s very odd. I mean, China, and Beijing specifically, is extremely technologically advanced. It makes Europe look like living in some kind of backwater. And the speed of the internet in the hotel here is absolutely insane: our video from Monday uploaded in two to three minutes rather than two hours. And that kind of stuff: I was in Starbucks just now, getting my little coffee to stay awake for you guys. I had just bought this Insta360 camera that I mentioned, and I was putting the pieces together on the table, waiting for my drink, and the young girl serving behind the counter was very interested in it, so I had to explain what it was. That was something that surprised me, that a random person would notice. A lot of these technologies are much more normalized than they were relatively recently. It’s quite interesting. And one more thing. When I talk to old friends whom I know intimately, and who are very quick to criticize me, as they should be, they say: oh, the headset, people shouldn’t be isolated in their headset all day, and all of that. So of course I say the thing we say, which is: of course not, you should use different devices and you should be out of it, but when you need a thinking space, you should be in here. It really brings home to me how important our work is. I see our work, above everything, as evangelizing the fact that you can work in this, and we need to make it really, really effective so it doesn’t just become a playpen.

Dene Grigar: I want to say that I heard the same criticism about reading books as a child. I even had a boyfriend grab a book out of my hand. I was reading Emma by Jane Austen in the car when we were on a trip together, and he grabbed it out of my hand and threw it out the window, and he said: you’re not paying attention to me. And he was driving, right? But I’ve had my parents, I mean, I’ve had lots of people say: oh, you’re too engrossed in your book, you’re a bookworm, you’re too isolated. In fact, that was a huge criticism about reading privately back in the 1800s and 1900s. And so we’re just circling back to that same idea. But I do think it’s important that we can be together. So I can read alone, but I can also have a book club. Now I have the headset for myself, but I can also invite you in. That’s important.

Frode Hegland: Oh, yeah. No, absolutely, fully agree. And to kind of go back to what Andrew has shown us today: I feel, apart from when it didn’t work, which I think was probably a Vision interface thing, we have an environment that is worth criticizing now, not on the detail but on the big thing, which is amazing. It’s so cool having it in a window over there.

Andrew Thompson: If I could ask a quick clarifying question: when you said it stopped working, did it just lose finger tracking and everything else was still there, or did the whole thing freeze and shut down?

Frode Hegland: Not the whole thing, it just didn’t do interactions.

Andrew Thompson: You still had the hands, but the pinch wasn’t working?

Frode Hegland: That’s what. That’s what I’m trying to say.

Andrew Thompson: Oh interesting. Okay. I have had that occasionally on my end and I’ll just restart my headset and it gets better. But if you’re having it on the vision it might be something else, so I’ll keep that in mind next time I encounter it.

Frode Hegland: We probably want to really focus on the... okay, so I’m trying to close a window that’s down here; it has a close dialog. So I do the pointing gesture. Okay, let me reload the page and go back in. Hello, world. Where are you? You guys are still here, right?

Dene Grigar: Yes. We’re watching you.

Frode Hegland: That’s exciting. I am watching you, with a little bit of myself mixed in, because the headset thinks my reflection is a person. But no data. Let me reload again.

Speaker5: Oh, here we go.

Frode Hegland: Okay, so on the left here we have... oh, yeah. So the thing about the pointer sticking to the first-seen hand, I’m not sure if that’s a good idea. And also, another thing, Andrew, that I think will be useful now, is the little bar we have. I don’t think it should be a bar anymore, because it’s too often in a table or the wrong place when I’m at a desk, and right now I can barely reach it. So maybe what we should do is, when you touch it, it extends from the hand that touched it, so you can literally move it anywhere you want. Does that make sense?

Andrew Thompson: Yeah, I see what you’re saying. It’s just kind of like, have it stuck to the hand. Yeah. You can do that. It’ll be a little bit shaky because once again, we’re dealing with hand tracking, but that’s okay.

Frode Hegland: Maybe you can do a different kind of smoothing.

Andrew Thompson: With a different smoothing, it wouldn’t match the hand, so it would look weird. But we can experiment with it, totally. Also, if you’re ever dealing with it at a desk, I know this isn’t intuitive, but if you look further up and open it, it’ll be closer to you. It tries to position it at the same distance every time, so if you’re looking down, it’ll go further into the ground, and that’s of course not great.

Frode Hegland: But how about this temporary fix... yeah, no, that’s what I’ve had to do. But that makes me think: how about the spawn distance be based on the height of your head, not where you’re looking? So it’ll be at the height of your head, but a little lower down, and a certain amount in front of you, like half an arm’s length. Something to experiment with.

Andrew Thompson: Yeah, totally. I think it was the way it was because we wanted, like, motion gestures. So it’s like the same position every time. So based on, like, where you’re looking, you can always find it. But I see what you’re saying.

Frode Hegland: Yeah, just have it almost like a HUD kind of distance, you know, rather than too far down.

Andrew Thompson: Yeah. And so far, we really don’t have much info in there. It’s useful to have because, you know, we’ve got some things like debug mode and the slider distance, but it doesn’t have as much as we originally thought it would. So I don’t know if I’ll get to it this week, but I’ll try to have it attached to the hand separately.

Frode Hegland: Okay. Yeah, either that, or just have it spawn at a minimum distance down. How about that?

Andrew Thompson: Okay, I’ll do that first then because that’s easier to implement. Okay.
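[Editor’s note: Frode’s suggested fallback, spawning relative to head height rather than gaze pitch, is easy to prototype. The offsets below are assumptions for illustration:]

```javascript
// Hypothetical spawn placement: anchor the toolbar to head height,
// slightly lower and about half an arm's length forward, ignoring
// gaze pitch so looking down no longer pushes it into the floor.

const DROP_BELOW_HEAD = 0.25; // metres below eye level (assumed)
const FORWARD_OFFSET  = 0.35; // roughly half an arm's length (assumed)

function spawnPosition(headPosition, headForward) {
  // Flatten the forward direction to the horizontal plane,
  // discarding its vertical component entirely.
  const len = Math.hypot(headForward.x, headForward.z) || 1;
  return {
    x: headPosition.x + (headForward.x / len) * FORWARD_OFFSET,
    y: headPosition.y - DROP_BELOW_HEAD,
    z: headPosition.z + (headForward.z / len) * FORWARD_OFFSET,
  };
}
```

Because only head height and horizontal heading are used, the bar spawns at the same comfortable spot whether the user is looking up or down.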

Frode Hegland: So, guys, what do you think of having the spawn be in the library rather than in the documents? Because right now you have to long press the big sphere to go to the library. So maybe we start in the library.

Andrew Thompson: We do start in the library.

Speaker5: You do? When I press.

Frode Hegland: When I long press the dots, then it changes.

Speaker5: What is that? Yes.

Andrew Thompson: So the library is where you see the different documents. So you can select one.

Frode Hegland: Okay. Hang on. Maybe because we’re dealing with reference sections, it all looks like a library in a way, you know. So let me just be in it and okay.

Andrew Thompson: Yeah that’s fair. The library is the darker space.

Frode Hegland: Don’t say that in front of Peter. The library is the light space.

Speaker5: Okay. Yes.

Andrew Thompson: It has a darker gray background.

Frode Hegland: Okay, enter VR allow.

Andrew Thompson: Yeah, because we don’t have a default document anymore. So you get to pick one from the library.

Frode Hegland: So, in my experience now, when I just enter with a reload, what I see on the left is... actually, you know what, I’m going to try to share a screen with you guys. That seemed to be very useful last time, because then we can all discuss what we’re seeing together. Okay, that wasn’t very useful: I shared the screen, but then I couldn’t see you guys. Let me put it on here. Here we go, sharing the screen. Now you can see this, and this, and this. That was very 1980s, wasn’t it? Right now you can’t see me, right?

Andrew Thompson: Okay. Yes, we can see. We’re, like, reacting with thumbs up, but you can’t see us.

Frode Hegland: I cannot, right. So let me enter the VR world. Hang on, first of all, I’m going to go in and out again, because that doesn’t seem right. So, refresh, reload, okay. Enter VR, allow, allow. Here we are. So I’m in the light gray. And this is what I see.

Speaker5: What is this?

Andrew Thompson: This is the darker gray. This is the library.

Frode Hegland: Okay, but when I do this now... okay. That is the library, right? Okay, so there’s nothing here unless I put something, which makes complete sense. Yeah. By the way, your animation is really lovely. So now I’m going to open Narrative and Hypertext. Tap on it.

Speaker5: Okay.

Frode Hegland: Kind of lost tracking. Now, there it is: Narrative and Hypertext. Right? So here I have this, and it’s a bit far. I’m going to move it there. Okay. Guys, what do you think of this: when you first open a document, it starts where the focus space would be? That makes sense, right? Rather than far away.

Andrew Thompson: For the focus space, you’d only want one at a time, right? Yeah. So you’d start the new one in the focus space, but then if you choose another one, it swaps them out and pushes the old document into the reading distance, or, you know, whatever the current snap distance is.

Speaker5: Yeah, that’s what I’m.

Dene Grigar: Can I make a suggestion? I’m thinking about how I work on the desktop, right? I open up three documents and put them on my desktop, and they’re all the same; nothing’s highlighted. I click on one to work with it; the other two go back and dim, right? So I don’t want to click on something and have it become a spotlight. I want to pull something up on the desktop, or in the scene, or whatever we’re going to call it, and have them all laid out the way I want, so I can see they’re all there. Then I evoke the one I want to work with and make that the focus.

Frode Hegland: Hang on, let me do that while you’re talking. Okay. So we’re in here. We have stuff in the room, so I know. But I think hand tracking got lost. Okay, it’s back. So now I’m just going to try this one. I like it, so, click. So here, it’s opened up here. So that makes sense. I think this should open up... Dene, I’m not sure if I’m contradicting you. I’m a little confused. So if you think I didn’t listen to you, I did. But I’m just very confused.

Andrew Thompson: She just stepped away. I think you guys are talking about the same thing, though.

Frode Hegland: Okay, so I think what would make sense is: whatever I’ve chosen to read now is what I’ve chosen to read. Meaning it really should be as this one is here, beautiful, but with a rectangular background, so I can actually read it properly. And then if I go here, excuse me, and open another one...

Speaker5: Come on.

Frode Hegland: That should now be in this space, and the previous one should step back one.

Andrew Thompson: Yeah, I like that. That’s a good idea. I’ll try to add that with the focus distance.
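[Editor’s note: the behaviour Frode and Andrew converge on, where the newly opened document takes the focus slot and earlier ones step back one distance, amounts to a small stack. A sketch with hypothetical names and assumed spacing:]

```javascript
// Hypothetical focus-slot stack: index 0 is the focus distance,
// higher indices are progressively further snap distances.

function openDocument(stack, doc) {
  // The new document claims the focus slot; everything else steps back.
  return [doc, ...stack];
}

function distanceForSlot(slot) {
  // Assumed spacing: focus at 0.6 m, each step back adds 0.5 m,
  // giving the staggered layers Frode saw earlier.
  return 0.6 + slot * 0.5;
}

let stack = [];
stack = openDocument(stack, "Narrative and Hypertext");
stack = openDocument(stack, "Bernstein 2010");
// "Bernstein 2010" is now in focus; the first paper stepped back one.
```

Opening all three documents without moving anything would stagger them further and further away, exactly as described in the discussion that follows.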

Frode Hegland: Another thing: in this library, I don’t think tapping here should open it, because it’s too hard to hold here and then read the abstract on the right. I think it would be nice if you have to go over here and tap on an underlined heading or title or something. What do you guys think?

Andrew Thompson: It’s kind of hard to move the cursor over there, though, without accidentally highlighting another document. That might just make it more frustrating. But what would tapping it do instead? Just nothing, or...

Frode Hegland: Okay, let’s put it this way. Oh, this is a great one. Okay, well, first of all, this is a bit too wide, the text. “We live in a time of” da da da da. That’s a bit too wide. But okay, so when I open this, I think we’re at a stage now where, instead of just showing the references, we should open up at least the abstract.

Andrew Thompson: So the abstract is what’s being previewed. Yeah. I was planning on working on actual document reading soon, but then we’ve also got the focus distance. Not to say I can’t do both, but I can’t necessarily do both in the same week.

Frode Hegland: Dene, what’s your preference: focus distance or document reading? The reason that came up, I think you stepped away for a quick second, is that when tapping on an article, I think it would be really nice if what opens in the reading space is the abstract, not the references, because the abstract in the library is a bit unwieldy. I don’t know. What do you think should happen when you tap in the library?

Dene Grigar: So imagine that I bring up my library, which is the list of references. Right. Call it the inventory, not references. The inventory of my library comes up. I decide I want to read the article by Mark Bernstein.

Frode Hegland: Hang on a second. I need to follow you. Are we talking about the library area now?

Dene Grigar: I’m imagining I’m sitting down at my desk. Let’s start from the very beginning. I’m sitting down and I’m going to work with my documents. I pull up my inventory, in alpha order by author, of my library contents. My inventory. I look at them from A to Z, and I decide I want to read Bernstein. I tap on it and pull it out. I also want to read the other Bernstein; pull that out. Right, and I’m pulling out the document. Not the abstract, but the whole document. I don’t want to just look at the abstract when I’m working in my library. If I’m trying to figure out what something is, then I’ll highlight the abstract to read it. Do I really want to read that? Then I’ll read it in the document, and put the document back.

Frode Hegland: So, in the paradigm of how it is now, the way you would do it, to make sure we’re all talking about the same thing, is: you go through your list of documents in the library and you tap on one; it opens and the library closes. So we will have to decide what is presented. I agree with you: you open the whole document, but we need to decide on an initial view. For the sake of discussion, let’s say it is the abstract, and then you do something to see everything, just for simplicity. Then you go to the library and open a second one. That means the first document you opened, unless you moved it to the side, gets moved back, away from you. So if you open all three without doing anything, they just get staggered further and further away. Then you can choose where to put them, and you can leave them as abstracts, or go to the references, or read the full text for each of them. And when you read the full text...

Dene Grigar: What I would prefer is not to have the library go away. I’d want to move it out of the way. So let’s imagine the library shows up: here’s a list of all the documents in my library. I tap on one and bring it off to the side. It’s the whole document. I take the library contents, the list, and move it over here. Move the document in front of me. I start to read the abstract and I go: you know what, I think I probably ought to read that other article that Bernstein wrote. I turn over here, tap that document, pull it next to the one I’m reading, and they’re side by side.

Speaker5: Well, that’s a difference. That’s a then and then.

Dene Grigar: And then I’m looking at them and going I’m going to focus on this one first. So I tap on focus. This one becomes the one that’s, you know, brighter. This one dims and I can toggle back and forth between them. Or I can leave them both side by side, totally visible.

Frode Hegland: So that’s a slightly different paradigm, because we have a library paradigm now. Of course, I would argue for keeping the current paradigm, because the space, I think, can get very cluttered very quickly. I do agree that it’s really important to be able to open a document super quickly and put it where you want. However, one of the interesting things I think we need to work on is how to hide information as well as show it. And I could imagine that, as we work with the library, we can have really amazing views in there. I could imagine that when you’re in the library, pretty much your entire field of view is used; let’s say it’s a citation tree or something. So when you open a document, I do believe it’s really useful for the library to be hidden. Now, at some point we may very well find a way to keep it usefully visible, either as a small thing or in the background somehow. But either way, whatever we decide on that, once you are in the reading view, being able to choose how to highlight, how to put things next to each other, and how to read them is absolutely going to have to be one of our near-term priorities. So, for now, let’s not focus on what we disagree on, but on what we really agree on, which is having multiple documents and reading them. So, Dene, about what you would like: have you had a chance? I know you’ve tried Author, I think, but have you also tried Reader in Vision?

Speaker5: Not yet.

Frode Hegland: Okay, so in Reader, one of the functions is that you can see one page, two pages, or all of them. It has a maximum width, but, you know. And it’s actually really easy to scroll back and forth. So let’s say Andrew has all the facilities available to have reading in XR: you should be able to open your document. Well, actually, here’s a very important question, Dene. Would you like it to be based on PDF, page by page, or would you like it to be based on the HTML and have it scrolling? That’s an important thing to decide on first.

Dene Grigar: I thought we decided we’re going to do PDFs right now.

Frode Hegland: Well, we’re adhering to the academics having a PDF library, absolutely no question. But we’re also making the assumption that we have perfect metadata, because perfect metadata can be made available. So I am perfectly happy to go with PDFs. But the fact is, Andrew already has the abstract, which is really nice, and it’s shown as text, not PDF. That’s why I’m wondering, for you, in an ideal setting, not necessarily what we can do now: how would you prefer to read it?

Dene Grigar: When I say me, I’m imagining my colleagues; I’m representing a whole lot of people. I’m unique in that I would probably have web pages. My colleagues across the university would not have web pages. So if I want to reach a larger audience right now with this project, I would say I’d want it to be PDFs at the moment.

Frode Hegland: Okay. That okay. That’s reasonable.

Dene Grigar: I mean, I would prefer, you know, non-PDFs, because I freaking hate them so much. But I’m also representing a whole class of workers. Knowledge workers.

Speaker5: Know.

Frode Hegland: Absolutely. So we are going to use HTML in some places, or plain text. Randall, please, if you have a chance, go to Future Text Lab, find the latest demo, and have a look. Andrew’s done some really great work that we’re going through right now. What we’re discussing at this particular point is: when you open a document from your library, from your collection, what you should initially see. Right now it takes you to the reference section, because that’s what we’ve had to work on. However, when you’re in the library, you can also see the abstract. So, Dene, while you’re in the library and you go through your documents, as now, the abstract is on the side. That is useful, right, for the people you’re representing?

Dene Grigar: Well, I mean, I want to be able to pick the abstract, but I want the article out. I want to go to the library and pull out the entire document.

Frode Hegland: Yeah, yeah. All I mean is: right now, when you’re in the library and you point up and down, the abstract shows. That in itself is useful for these people, right?

Speaker5: Yeah, yeah.

Frode Hegland: So, Andrew, what I would like you to do for the abstract is make it about half width, because it’s actually really long now, and you have to keep your hand pointed there while you read. So if you could do that, it would be really good.

Andrew Thompson: That’s a good point.

Frode Hegland: So when we then tap to open into the reading view... okay. Taking it as a given, forgetting technological constraints, that the human constraint is we have to show PDF: so, Dene, if we’re showing a PDF, would you like to see a single page, a two-page spread, a many-page spread? And would you like there to be additional information around the document, such as in one of our designs, where you have the PDF in the center, one or two pages, but the references as plain text on the side, so they’re easily accessible? What would you like the initial view to be?

Dene Grigar: So I imagine opening up the PDF. I tap the list, pull up my document, and it sits there, and then I decide what I want to do with it. I want to open up two or three pages, read the abstract, close it back, open it up, go straight to the references. Right. So I want control over what I look at, just like I do in real life.

Andrew Thompson: So if I could jump in really quick, I've got a clarifying thing I want to settle between you guys. Since we are using PDFs, but we're essentially reading the HTML version of them, we can have the page breaks wherever we want. They're not at a set point. So, Dene, when you say that your colleagues will want to read PDFs, what I hear is they want to see constrained pages. Is that the same thing? Because I don't think they care what format it is. It just needs to look like a piece of paper to them, and then they just scroll forever.
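
[Editor's note: since the page breaks in an HTML rendering are arbitrary rather than fixed, the split Andrew describes could be sketched as a simple greedy pagination over measured block heights. This is an illustrative sketch, not the project's actual code; the types and function names are assumptions.]

```typescript
// Greedy pagination: group content blocks into visual "pages" so an
// HTML export can still look like constrained sheets of paper.
// In a real renderer, block heights would come from layout measurement.

interface Block { id: string; height: number; }

function paginate(blocks: Block[], pageHeight: number): Block[][] {
  const pages: Block[][] = [];
  let current: Block[] = [];
  let used = 0;
  for (const block of blocks) {
    // Start a new page when the next block would overflow the current one.
    if (used + block.height > pageHeight && current.length > 0) {
      pages.push(current);
      current = [];
      used = 0;
    }
    current.push(block);
    used += block.height;
  }
  if (current.length > 0) pages.push(current);
  return pages;
}
```

Because the breaks are computed rather than baked into the file, the same content can be re-flowed into one page, a two-page spread, or free-floating text on demand.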

Dene Grigar: Yeah, I imagine they do. Okay.

Andrew Thompson: Yeah. I mean, the cool thing that.

Dene Grigar: Can I say this? Oh, yeah. I like to think at some point we can get rid of constrained spaces. I don't want constrained spaces myself. I do want page numbers. I want things to be numbered, or I want things to be findable in some way, which is what page numbers do. But I'm imagining my colleagues are going to have PDFs they're going to want to access. So they come up and they look like paper, because we need those damn breadcrumbs.

Speaker5: Yeah, yeah. So if.

Dene Grigar: Breadcrumbing.

Andrew Thompson: If we are talking about just putting in a constrained, like, visual piece of paper, that's totally fine. That's on track with what we're doing. If we're talking about going back and reading specifically a dot PDF and having it render the same, that was something that we left a long time ago. So if we're trying to go back to that, we can, but it would be stepping back quite a ways. I just want to clarify between those two.

Dene Grigar: Peter's going to want to talk, but let me just answer that question, then I'll be quiet. I'm imagining, let's think, Andrew, let's say Sue Peabody, whom you know, in History. We've talked her into going into the virtual reality world, and she's got a headset and she's going to read. Now, she's got her library of her documents. They look like they're PDFs, but they don't have to look like PDFs when we render them in the world. They just need to have, like, a page constraint for her, because she's not going to know what to do with text floating in the world. It'll freak her out.

Andrew Thompson: So we want dot PDFs again?

Dene Grigar: Well, what are you currently using?

Andrew Thompson: We’re using dot HTML exports of PDFs.

Dene Grigar: Okay. That’s fine. As long as it looks like a page, that’s fine. She won’t know any difference.

Andrew Thompson: Right. So that's what I'm saying. Are we talking like we just care about the visual end result, or do we need to be able to import a PDF?

Dene Grigar: As long as she doesn’t have to do any conversion herself. She thinks she’s looking at the PDF that’s on her desktop.

Andrew Thompson: I think we might be talking past each other. Yeah. We can chat some other time about it. There's definitely a programming nuance to this that I need to kind of narrow down.

Frode Hegland: So, also before Peter, just on this very important point: this is also the difference between the reality we have to deliver now and the ideal academia we want to develop. So at some point we want to do both, there's no question about that. Once we've done the basics, we want to reinvent, you know, scholarly communication. There's no question. So what is in the Sloan pitch that was approved was the user's own PDFs in XR. That's really, really crucial. So we have to render them at some point. However, and you've got to make sure I'm not talking rubbish here, in the library, of course, there is no PDF, it's just a list of documents and texts. But it will conceptually soon be their own, so it's still their documents. And in the library we will have fantastic interactions developing, you know, seeing citations and all of that stuff. So that fulfills the criteria. When you go into the reading mode, ultimately we want to have what Dene talked about last week. I'm very, very excited about that too. You have what looks like a PDF, but can instantly be just plain text, HTML style, depending on what you as the user want. You can tear bits off, put them wherever you want, they will know where they came from, and you can bring back where they came from on demand and so on. You know, like a free Ted Nelson gone mad kind of environment, with post-its everywhere, so to speak.

Speaker5: For what we’re talking about now.

Frode Hegland: The most important thing we need to do, I believe, is the theater of one of these people, that history person, putting on the headset and seeing documents that at least look like theirs, in the sense, you know, they haven't actually synced yet, but they understand it's academic documents. And when they open a single document, they find it a bit useful. So I suggest we do a hybrid solution where we show the abstract to begin with, because that's in Dene's focus area. The next thing we should do for next week should be the focus area, because if we have a useful focus area for reading the abstract and an instant interaction to see the rest of the document, that in itself is huge. Right? Because let's imagine our discussions might go down the route that XR is really for the library, not for reading yet. So we spend a lot of time in the library doing all these citation things. They still want to be able to see the documents. So, Dene, should we tell Andrew that for next week, you're in the library, you click on a document, it opens, you get the abstract and title, with a rectangle behind it, so if there's anything behind it, it doesn't matter, you can read it. And we put an underline under the title or an icon, right now it doesn't matter which, that will in the near future mean load the PDF of this. Is that a good next step?

Dene Grigar: Well, I guess that sounds fine. I guess what I was saying is I wanted the whole document opened up so I could, and then I could access the abstract and the references. So. But if you want, if you think it’s better to start with the abstract, that’s fine.

Frode Hegland: Okay, well, how about this then? Open up the PDF, show the first page, and let the user just tap through it. The worst interaction possible, but for the sake of argument, so they can have it and move it about. And if that works well, then we can look at having multiple pages and tearing out. So okay, I see you're pulling back in a very good way. Let's absolutely do that.

Dene Grigar: And I do want to respond to you, Andrew. So I'm imagining, going back to Sue, whom I like a lot and respect: her documents will be PDFs, they won't be HTML. But I'm imagining there's going to be some way that we can provide her a migration tool, like render your PDFs for VR.

Andrew Thompson: Right? That would be the hope. And I believe that's supposed to happen through the Reader software, at some point. Or at least that was the idea.

Speaker5: Okay.

Frode Hegland: The way that I see that particular and important issue is that our initial user group is actually ACM Hypertext members. So yes, we want other academic colleagues for sure. But the number one user group is ACM. They can provide us with HTML of any modern document. So we do have it, and it is the same document. But yeah.

Andrew Thompson: And we’re matching their style right now. So when I search for specific tags I’m using their tagging.

Speaker5: Oops.

Dene Grigar: That makes sense. That makes sense.

Speaker5: Because.

Frode Hegland: We obviously have to strike the balance here of reality versus ideal. And if we can really demonstrate to such colleagues that not having the normal PDF, but the same data, is actually useful to them, you know, that's a win. Oh, also really importantly, Dene, you mentioned page numbers. One of Doug Engelbart's key things was high-resolution addressability. He had in his earlier documents what he called purple numbers, because they were literally purple. They would be at every paragraph. I told him that you could make each one of those a link to an invisible anchor at the beginning of the paragraph, which means that when you come across a purple number, if you click to copy that link, it becomes a link to itself. So in our world, we absolutely want to have addressability at least to paragraph level, if not finer. So page numbers are a really good transition from paper, so to speak. But high-resolution addressing is absolutely crucial. Yes.
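
[Editor's note: the purple-numbers idea described here can be sketched in a few lines: every paragraph gets an invisible anchor plus a visible number that links to that anchor, so copying the number's link yields a paragraph-level address. The id scheme and markup below are illustrative assumptions, not Engelbart's or the project's actual implementation.]

```typescript
// Engelbart-style "purple numbers": give every paragraph a stable anchor
// and a visible self-link, so any paragraph can be addressed directly.

function addPurpleNumbers(paragraphs: string[], baseUrl: string): string[] {
  return paragraphs.map((text, i) => {
    const id = `p${i + 1}`; // hypothetical id scheme: p1, p2, ...
    // The paragraph itself carries the anchor; the purple number links
    // back to it, so copying that link copies a paragraph address.
    return `<p id="${id}">${text} ` +
      `<a class="purple-number" href="${baseUrl}#${id}">${i + 1}</a></p>`;
  });
}
```

The same scheme extends to finer granularity (sentences, quotes) by generating more anchors, which is the "at least to paragraph level, if not finer" point above.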

Speaker5: Now, Peter had his hand.

Dene Grigar: Up for a very long time. Thank you, Peter. Yes.

Frode Hegland: You can take your hand down. You must be losing blood.

Peter Wasilko: Yes. Much better. Starting to get a little numb there. So what I'm thinking, as we're going through this discussion, is that a lot of what I'm working with most recently doesn't consist of PDF papers. I've been working a lot with books that aren't going to get visual meta in them. Then I started thinking that what I'd really like to have is naked visual meta. Just imagine a file where the visual meta is what matters, and I'm creating the visual meta of one or more separate references that are books, not necessarily papers. And what I want to be capturing in the visual meta is the knowledge graph of how those sources are connected to each other. For instance, I might want to have an entry for the concept of utilidors, which were the underground corridors at Disney World. And then I want to add an assertion that in the book Walt's Apprentice, Dick Nunis asserts that they were his idea. Now, that's not going to be a part of the book proper, but it should be able to live as a block of standalone visual meta. And then we can have software that could grab multiple visual meta files. That's why I suggested in the chat a while back that it would be really nice if there was something along the lines of the Font/DA Mover to be able to transfer assertions in visual meta from one visual meta document to another. So visual meta, I think, should be elevated more. And instead of being so obsessed with the PDF artifact that will hopefully eventually come with visual meta attached, we should try to put visual meta a little bit more front and center in what we're doing.

Frode Hegland: Well, thank you for that, Peter. I owe you a beer and a glass of wine and a whiskey. One thing I've looked at before was to have a, sorry to use that horrible word, but a PDF represent a book, and that was just for use in Reader, where the book doesn't exist in our collection. It may be a paper book, but we get the metadata from online, Google Books or whatever, and put it into a PDF to pretend, so we can at least do the put-it-in-a-graph kind of thing. All of that is really important for discussion, but not for now, because our primary data is the ACM papers. So that's more of a Monday discussion. But I'm definitely passionately for doing something like that.

Peter Wasilko: Yeah, I wanted to get out there before it slipped my mind, and I might not come back to it for another month or two. So I wanted to get on the record now.

Speaker5: Yeah. That’s good.

Frode Hegland: Talking of records, Brandel, just really briefly, I've texted you and Dene and Rob and Fabien a few videos from here. They are stereo videos that look really good on the headset. So I hope you have a chance to see them. The one of the lizard was in a pet store. Have you seen it, Rob?

Speaker5: But I have seen it. But it doesn’t look.

Peter Wasilko: Like 3D.

Speaker5: In the.

Peter Wasilko: Vision Pro.

Speaker5: Really? I’m missing something. You should be able to.

Frode Hegland: Just open it as a message and just play it there.

Speaker5: There's a message. Clicked on the video, played the video, it gets large, and it's not in stereo. Oh, is there another.

Brandel Zachernuk: Button on the video that you can tap on? Sometimes there's a button in the top right-hand corner that looks like the panorama thing. I'll jump in now. I've been away from my device, so I haven't had a chance to look at it, but I can look at it now.

Speaker5: Yeah. That’s that’s.

Frode Hegland: Really weird. Hang on.

Speaker5: Pray for us. Just Okay. I’m looking at the message.

Peter Wasilko: There’s a.

Speaker5: Video. There's a play button, and it plays in that little window. Oh, okay. I can download it. No, no, that's the problem, I can't stop it. Right. So, Rob.

Frode Hegland: Okay, if you play it in the message stream, it'll play non-stereo. So you have to tap, kind of, not on the triangle but on the side. And there it should definitely be stereo.

Speaker8: On the side.

Frode Hegland: Not on the play triangle, right? If you touch that, it just plays inline.

Speaker5: I can’t touch it.

Speaker8: I can just click it.

Frode Hegland: Yeah. But okay, so you’re looking at the video of the lizard, right?

Speaker8: No, I’m looking at the train station. I don’t see where the lizard is. Okay, I saw the lizard. I don’t know what happened to it, but.

Frode Hegland: Okay, but you’re in the message.

Speaker5: I got it, okay.

Speaker8: Now I see the lizard. All right.

Frode Hegland: So you see there's a circle with a triangle in it on its face, right? Yeah. Don't look at that and pinch. Look at the lizard image, but not on that. That should open it up spatially.

Speaker8: Okay.

Speaker5: Isn’t that weird? That’s.

Speaker8: That’s not intuitive.

Speaker5: No.

Frode Hegland: Brandel, tell the guys to fix it.

Speaker5: We have a thing. Sorry. That was a.

Speaker8: Bit. Yeah. That’s bad.

Brandel Zachernuk: If only there were other ways.

Frode Hegland: Sorry, a bit of an aside there.

Speaker8: That even has a pause button. Okay. Well. Yeah, that’s that’s much better. All right, I get it. I would not have guessed that.

Frode Hegland: No, I wouldn't have guessed that either. It was accidental. I've managed to do it before, and now suddenly it didn't work. So click everywhere, right? Yeah. I apologize for the detour, but Brandel was here, so of course, you know, we do a little bit of Vision stuff. Okay. So, in order to deal with the question at hand of what to see when you open it up, actually, let me share a screen with you guys. This might help.

Frode Hegland: Hang on. We can go back to the slides we've been discussing before, for our designs. Just open them. The problem is I use Keynote for too much. Yeah, because we had things like this. You can all see, right? We had things like this where you open up a document that looks like a PDF, whether it is PDF or HTML is for the sake of discussion a bit irrelevant, but where specific aspects of it, like a table of contents and notes, appear connected to it. Because one of the things we want to be able to do is write annotations. Right?

Speaker5: So, Dene.

Frode Hegland: Are you thinking about this kind of a look, or are you thinking more? This kind of a look?

Dene Grigar: I don't want it to look like a PDF on the side. I want it to be able to spread out and be a document, or spread out so I can see text.

Frode Hegland: Yeah. Because we had, where is it now, we had all these different layout options. You know, there was one, and then we had one with many, just because that's how it works in Reader on iOS now, Vision and iOS. It's actually kind of interesting, even on iOS, to have all the pages open so you move through them smoothly. Yeah. So you see the one at the bottom here, right?

Frode Hegland: Excuse me while I move. Is this kind of the thing you mean?

Dene Grigar: Yeah.

Frode Hegland: Okay. Andrew, what do you think?

Dene Grigar: Is that possible? Andrew?

Andrew Thompson: I have no idea. Probably. I don’t think the HTML export has page numbers. I could arbitrarily choose where the pages end, I suppose.

Speaker5: Yeah, this this.

Frode Hegland: Would definitely be for Sorry. Not definitely. It would probably be for. They Sorry. I’m just going to connect to share here. But yeah, this will probably be PDF is what I’m trying to say. Sometimes it takes a bit of time to mirror. Okay. You can see this. The screen as before, right? So Let me just go somewhere more pleasant.

Dene Grigar: Are you trying to share something with us? Because we’re looking at the slideshow still.

Speaker5: Oh. You are. Yeah.

Dene Grigar: I see that you're looking like you're trying to get inside something, but we're looking at the slideshow.

Frode Hegland: That's really weird. I was sharing my screen here. Let me try that again.

Brandel Zachernuk: So we see, yeah, we see your whole screen now, with a number of windows on it.

Speaker5: Do you see? There we go. Okay. Got it.

Frode Hegland: Okay, good. So I'm just going to have to sit sideways; the mirror makes it think like I'm there, so that's not a good thing. And let's leave this tiny little hotel room. Right. So in Reader now, the way this works is, let's open this one, for instance. So if I just tap, I have this. And now it's set to be one page, you know, so we can do something like this. So that's all quite beautiful, but it is very one-page. Now, if I go, well, it's a two-page spread.

Speaker5: Like this.

Frode Hegland: We go through it this way. And then finally I'm going to make the window really big. This is the biggest I'm allowed to make it. But now we have this. This is closer to what we're talking about, right, Dene?

Frode Hegland: And then we need to work on selecting text. I find that difficult to get done in text area.

Speaker8: Now.

Frode Hegland: Of course. So this is what we have with Author. So Author is obviously text, which would be the equivalent of HTML for our current discussion. And I can make the window massive. But there's an odd little bug I need to... Right. So, say there is text, and then we have the outline on the right. This is what I was thinking about for our

Speaker5: Our view, where.

Frode Hegland: We can have things sticking out of the document to quickly jump around it.

Andrew Thompson: So one thing that might be worth thinking about, because this is really nice, but you currently have this; this currently works. It might be a little silly to make this same kind of thing again, just in webXR. Obviously it would still be a bit different, but since this is like a use case for what you're envisioning, I wonder if it would be worth experimenting with something else for the webXR version? I don't know, I'm fine either way. I know for now just opening up a bunch of pages is within scope either way. But I just thought it was worth bringing up.

Dene Grigar: You know, that's a great point, Andrew, and I'm wondering if... Because Reader is considered proprietary software, right?

Speaker5: No, which is fine.

Frode Hegland: It can be open sourced, but it’s based on visionOS, which is the issue. So the software itself is not proprietary.

Speaker5: Sorry.

Dene Grigar: Author. Author costs money.

Speaker5: Currently.

Dene Grigar: Author costs money. And so we are looking at nothing costing anything for the Sloan project. That doesn't mean this is not fantastic, you know. So we have this, this works, this can work in this space. It's fantastic. It's exactly what I'm imagining. So yay. But I'm wondering if maybe we can get rid of the breadcrumbs, Andrew. Is that what you're suggesting? Go straight into text floating in space?

Speaker5: Oh, hang.

Frode Hegland: On a second. On this really important topic, I think we should really focus on XR doing what XR does best. And of course we're learning what that is. Also, it is in the Sloan Foundation spec to integrate with Reader and Author as example external applications. And now Fabien is getting ready to integrate with his stuff, which is very good. And then Adam will be integrating with his stuff, which is very good. So one of the ways I see it is, in my proprietary software you're reading something like this and it's too much. So on top here there's a button; you click that, and then it goes into Andrew's work to see it connected in a library. So it's a different thing, same data, but it serves a different purpose. This is optimized for incredible reading, because the Vision gives us that. But the XR is optimized for seeing the connections. The thing that is exciting about what you talked about last Wednesday, this pulling-things-out thing, which I think is really, really important, may be the key thing of reading like this in webXR, the fact that you can do that. I haven't been able to figure out selection well enough in Vision yet, so annoying. Anyway, I don't think it's black and white. I think different environments need to talk to each other, and we need to make clear what it is. That's it, really. I'm just going to try a different selection on this.

Speaker5: There we go.

Frode Hegland: Let's see. That's obviously really important, that you can select the text. In the scanned box you can't do that. So, Dene, obviously what you're saying is completely, 100% the point. I'm wondering, though, maybe for a bit we should focus on the library in XR, because that's unique and that's not going to be here in any way, and see how far we can take that, so we can have views where you select the document and you can see who cites it and all of that stuff. I don't know. Okay. No, I'm just arguing with you and myself here. I'm just trying to figure out what this space is. I don't have any strong opinions.

Dene Grigar: Let’s turn to Andrew. Andrew. In the direction you’re going, what would be the next logical step for you to produce?

Andrew Thompson: I mean, the things that.

Dene Grigar: We’re talking about.

Andrew Thompson: Yeah, the close-up view distance is the next thing I'm focusing on, but viewing the PDF is definitely right after that. And I can make that in whatever form you guys decide. I was simply bringing up the idea that perhaps we have a little bit more creative freedom in the direction that we want to think about taking it, since we now see the current suggestion working inside the Vision with free software. I'm fine with developing something similar. I was just throwing it out there that we have a lot of sort of experimental freedom in this project, as opposed to Reader: it needs to be polished, it needs to do what it's designed to do. Here we can just, like, add a feature, and if we don't like it, take it away. We have that freedom, you know, time permitting. I had no real point, just opening up a different type of discussion, since we might be overlooking stuff.

Frode Hegland: It is a very important discussion though. I mean.

Frode Hegland: Dene, I actually don't know if I charge for Reader right now, but Reader, that is Reader for Vision, iOS and Mac, I'm completely willing to open source the whole thing. Author is a bit more complicated in that sense. I know it's still on a proprietary platform, you know, it's not built on open code. But the code itself is one thing. But I do think that

Speaker5: What.

Frode Hegland: Annoys me, what argues against my point, is Dene standing in a room, pulling out a piece of text and putting it somewhere else in XR space. That's going to be really difficult, I believe, for Reader to do in visionOS. That's going to be relatively possible in webXR. So that's such a strong use case for what you're saying, that we should actually have the proper PDFs, or PDF-looking things, in webXR.

Dene Grigar: I'm just also thinking, I'm trying to think of the more global experience. So imagine I'm not just going to be reading, right? The main point I made in the case study that you read is that reading is not separate from writing, for me, for academics, right? You're reading something and you're scribbling on the book. You're underlining things, you're making notes on the computer. I mean, reading is not just reading. I have never been able to pick up an academic article and just read it without marking the daylights out of it. And the same goes for even novels; I've underlined most of the novels in my library. So my big question then is: do Reader and Author go hand in hand? They're meant to be kind of counterparts to each other. Is that true? So if we adopt Reader, which I think is a very superior product, and it works well with the Vision Pro and it's open, it's free, then I want to do stuff beyond just reading. I want to actually do the writing. Can we do the writing on top of Reader, without Author, so that people don't have to buy Author? And can we put Reader on GitHub, or at least the webXR stuff on GitHub? Because we promised that. I went back and read the Sloan Foundation stuff yesterday, because I was having to fill out paperwork for the university, and, you know, we have the GitHub site and we made that open, right? So that's now open. So what can be open on the GitHub site? Is it just the webXR? Is Reader open on the GitHub? I don't know, and probably not Author, because you're making money off of it. I think you're doing really well with Author right now, right? Daily?

Speaker5: Yes.

Frode Hegland: I used to make something like maybe 20 to $40 a day, and now it's between 500 and 700 a day.

Speaker5: Yeah, there’s a.

Dene Grigar: Lot of money to give up.

Frode Hegland: It's a lot of money. And that's money that's all going to go back into coding for it. However, it's not going to last forever. I'm getting this because of being featured in the App Store. So it's nice for the moment. But in terms of workflow, for my own environment of Reader and Author: just like on the Mac, you can copy from a Reader document and paste in Author, and it'll paste as a citation. I'm working on having that in Vision Pro, so you can be reading a section in a PDF, there's an option to either copy or copy as quote, and then you can go across to Author and paste, and it should do the same thing. That is, of course, clipboard visual meta. So that should work in and out of XR; that should work everywhere. And I think that's an important component. You should be able to be in the webXR environment, copy something, and go to any software and paste it as a quote.
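
[Editor's note: the copy-as-quote workflow described here amounts to carrying citation metadata alongside the copied text, so the paste target can render a formatted quote. The sketch below illustrates the idea only; the field names and formatting are assumptions, not the actual Visual-Meta clipboard format.]

```typescript
// A clipboard payload that carries citation metadata with the copied
// text, so pasting can produce a quote with an attached citation
// instead of bare text. Field names here are illustrative.

interface CitedClipping {
  quote: string;
  title: string;
  author: string;
  year: number;
}

// Render the clipping as a quotation with an inline citation,
// roughly in the author-date style described in the discussion.
function formatAsQuote(c: CitedClipping): string {
  return `"${c.quote}" (${c.author}, ${c.year}, ${c.title})`;
}
```

A paste target that understands the payload can insert the structured citation; one that doesn't can fall back to the formatted string, which is what lets this "work everywhere", in and out of XR.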

Dene Grigar: Here's a question: would it be possible to do what a lot of app developers do, and that's make a stripped-down version that's free, that seduces people to buy the paid version? And so the stripped, let me finish, the stripped-down version would be the one we use in this, that's like free. Then they like it so much and they want to use it on their desktop, so they buy the version that is the $20 version. I think it's 20, right?

Speaker5: Yes we know.

Frode Hegland: Yes, it's $20, and yes, it's possible to do that. And yes, I have done that. There is a Reader, excuse me, an Author Basic, which has everything except export as PDF. You know, that's what people quite literally pay for. All of those things I'm very happy to entertain discussions over. I'm not a business person. You know, I'm getting a little bit of income now to pay for coding. This is not going to be an empire. But I do think it is also very useful to have external things we integrate with, including Fabien's, what Adam will be building, and Author and Reader externally, because that's how we make an ecosystem, obviously.

Dene Grigar: I also want to go on record saying that I have no trouble with you making money off your software. And I think that the more money you make, the more you can throw into your programming and do the things that you're doing. So I'm not anti-capitalist in that perspective. So, you know, that's great. I'm really glad for you, and you deserve it. All right, for the project: the dumbed-down version that's free would be ideal, because then we can meet the standards of our promise to the Sloan Foundation and cut some time out of our production by using these documents. But I do think we're going to want to make sure that they talk together in this environment, so that what I'm doing in my reading we can bring into the authoring at the same time, if that's possible.

Speaker5: Yeah. No, that’s.

Frode Hegland: Really, really important. I mean, a third of Sloan is metadata, and that is exactly what you're talking about. Now, there is the kind of metadata that's at the back of the document. There's the kind of metadata that's transferred with JSON. There's also the kind of metadata that is in a clipboard when you copy and paste. So that is absolutely crucial; there's no question about that. And one of the things that I think will be really, really important, and Brandel, this is something we talked about on Monday, all of us, and mentioned briefly earlier today: I'm joking that instead of having a PDF, we should have SDF. And that's not a new file format; it is packaging current standards, USD and so on, so that one of the basic things you can do is, at the back of a PDF, in an appendix, have 3D spatial information about the document. And that's why it's good to have external software, because Reader and Author are made by two different teams. If they were made by one team or one person, they would inadvertently cheat. This is a way to guarantee that the metadata works. So that's the same approach that I'm looking at here: Andrew is doing input and output, and someone else is doing input and output, and in the dialog we figure out something that works. So what we do with Reader, I think, is relatively inconsequential. It's not a big piece of software. We can probably do exactly what you're saying, Dene. I think what is more exciting is the notion of being in Reader or equivalent, pressing a button, and that stuff goes into the webXR world. All the metadata is there and we can see the connective stuff.

Speaker5: Okay. Okay.

Frode Hegland: Because right now.

Dene Grigar: That sounds good. And I need to tell you that Andrew and I have got to go to the other lab meeting now at ten, so we're going to take off. Andrew, do you have clear instructions about what you're doing next?

Andrew Thompson: Yeah, I'm pretty clear on what I'm doing next. I'm less clear on what I'm doing after that, but that'll be a discussion for next week. And I've got a general direction, so we're fine.

Frode Hegland: Okay. Well then, briefly, in closing, to make sure we are on the same wavelength on that: next week, when something is opened, it'll be in a focus area, and if something else was in the focus area, it's moved back a bit. That's the number one thing to do for next week.

Andrew Thompson: Yeah, that's what I'm working on, the focus area bit. It might just be either the citation block that already exists, or just like the abstract, something simple, just to sort of prove it works.

Speaker5: Yes.

Frode Hegland: And I think for now, because we have to figure this out. What is opened is the abstract.

Speaker5: Just because because most.

Speaker9: Of Grammarly free.

Frode Hegland: Just because we have it. And then the references below it or to the right, whatever suits you, whatever makes sense. Yeah. And then we will deal with how to make it PDF-ish very, very soon. Peter. Right.

Peter Wasilko: Yeah. I just want to note that in the sidebar I had suggested before that we invite Curtis Hickman to participate in the symposium. He is a magician by training who worked as a VR developer at The Void. And his idea of hyperreality is mixing physical experiences with VR. But a lot of his work will also work purely in XR and VR, by leveraging psychological insights into how the human brain works. So he'd be a really great person if we could get him. And also, I left the link to David L. Small's dissertation from his work at MIT on rethinking the book; that'd be a really good selection for the book club.

Speaker5: Yeah.

Frode Hegland: That's perfect.

Peter Wasilko: He'd be a great person to invite.

Frode Hegland: Yeah. My brain is gone; it's 2:00 in the morning. Can you please email me those two, so I have them as a proper backup? They're absolutely brilliant. Put them on the list.

Peter Wasilko: Okay, sure thing.

Frode Hegland: Thank you. Brandel, while you are here, that last bit of discussion was interesting, because, you know, we all see where we're going. Hang on a second. Oh, what's happening to my connection?

Chat log:

00:02:32 From Frode Hegland : https://public.3.basecamp.com/p/GW8VuBtmQkoA6uiPeDpddJbH
00:03:43 From Peter Wasilko : Brunching off cam.
00:03:55 From Frode Hegland : Hi Peter
00:04:14 From Frode Hegland : https://public.3.basecamp.com/p/GW8VuBtmQkoA6uiPeDpddJbH
00:04:32 From Frode Hegland : https://public.3.basecamp.com/p/GW8VuBtmQkoA6uiPeDpddJbH
00:11:52 From Peter Wasilko : Are we all LLMs?
00:16:40 From Frode Hegland : Standard very high quality (technically) video: https://fleetingmoment.org
00:24:16 From Peter Wasilko : @dig
00:25:15 From Peter Wasilko : I like the Gopher Slates
00:36:31 From Peter Wasilko : I want ambient display affordances, to calmly reflect things like unread inbox email count, or slack channel activity level.
00:36:52 From Peter Wasilko : Curtis Hickman would be amazing.
00:37:49 From Peter Wasilko : As a magician / VR developer he has all sorts of psychology inspired insights.
00:38:21 From Dene Grigar : Michael Benedikt
Peter Heim
Douglas Rushkoff
Ben Camarano
Eric Preisz
Nathan Stahlman
Dave Barcos
Caitlin Fisher
Mez Breeze
Jordan Giboney
Lucas Haley
Toby Roberts
00:40:10 From Peter Wasilko : We could use something like the old Font/DA Mover for transferring VM between documents!
00:44:05 From Peter Wasilko : If the user has a keyboard handy, we could use a rapid text based selection paradigm: https://github.com/ggandor/leap.nvim?tab=readme-ov-file#getting-started
00:53:16 From Fabien Benetou : brb
00:56:30 From Peter Wasilko : Yup
01:01:21 From Peter Wasilko : Does Apple have a Library environment or are they all outdoor scenes?
01:02:07 From Fabien Benetou : we must go deeper
01:19:27 From Fabien Benetou : (I’ll be going in 10min)
01:19:32 From Dene Grigar : ok
01:27:44 From Peter Wasilko : https://communitywiki.org/wiki/PurpleNumbers
01:32:07 From Dene Grigar : brb
01:32:39 From Peter Wasilko : https://eekim.com/software/purple/purple.html
01:33:03 From Fabien Benetou : gotta run, take care all
01:33:18 From Fabien Benetou : a human waved back, success 😀
01:33:18 From Fabien Benetou : bye
01:42:04 From Peter Wasilko : Maybe a TreeMap arrangement of the content hierarchy.
01:43:28 From Peter Wasilko : Maybe we can invite David Small, he did 3-D visualizations of large documents during his studies at the MIT Media Lab.
01:44:48 From Peter Wasilko : AH here it is: https://acg.media.mit.edu/projects/thesis/DSThesis.pdf
01:45:42 From Peter Wasilko : It would be a great book club topic too!
01:56:29 From Dene Grigar : bye folks
