3 June 2024

Dene Grigar: Good morning. How are you?

Frode Hegland: Hello. Hang on.

Dene Grigar: But I ran here just to finish. I worked really hard on some other documents for the event too, and printed out a bunch of things.

Frode Hegland: Sorry, my computer was acting up. I just have to check a few things. Good. Yeah. Good morning guys.

Dene Grigar: Have you recovered from all your fun?

Frode Hegland: I did have a tough hangover this morning. That does not happen very often, but it was worth it.

Dene Grigar: So everybody had a good time and got home safely. When did everybody leave? Was it Saturday night? Sunday morning, early?

Frode Hegland: Yeah, they all left this morning. Adam's flight is taking off quite soon. Fabian's train left an hour ago. Leon's flight left about 2.5 hours ago.

Dene Grigar: They stayed all the way through Sunday? Wow. Okay. You had them a long time.

Frode Hegland: Yeah, it was very good. It was just, yeah, a real treat. And one thing we talked quite a bit about was the next one. So we'll plan ahead, make sure the timing fits and that you can come over too, Dene. And of course, Andrew, you're extremely welcome to as well.

Dene Grigar: Well. Thank you. Let me let the cat in because he’s screaming. Hang on.

Andrew Thompson: Okay, I appreciate it. Yeah. Not sure I’ll be able to fly out to London any time soon, but the invitation is still nice.

Frode Hegland: Yeah, it's not super simple, but there will be September in Europe, of course. And then there's Mark.

Dene Grigar: Mark. Good morning dear.

Frode Hegland: Mark came up for Friday, so that was nice. We had some time here, went into town for Adam's Singapore birthday dinner, which was lovely, and came back here. Mark was supposed to leave earlier, but he stayed very late, so that was super nice. That was a treat.

Mark Anderson: Well, it was fine. It was only a short drive down. So.

Frode Hegland: Short-ish. So I don't think there will be that many on the call today; they're traveling, most of them.

Mark Anderson: I had a thought after our conversation the other day, when we were talking about extraction of terms, and a recognition that actually there's an awful lot you've done that we haven't collectively made much use of. I looked at the discussion summaries. The thing I got stuck on: what I was thinking to look at was the list of extracted terms, and I was trying to find them not mixed in with everything else, but literally just "this is what we think we've extracted." Because then I could go and do some of the testing we talked about, or rather the manual review, the human review, that we talked about. So I don't know if that's all smushed into the overall narrative. And that's not a sort of oblique complaint; I'm just figuring out, am I missing something simple? Or am I going to have to do a bit more work to find the actual list of terms?

Frode Hegland: On that particular note, I will share my screen. And then there will be a further report from the weekend; we had lots of suggestions and ideas that will be good to discuss. You can see my screen, right?

Speaker1: Yeah.

Frode Hegland: So this is the master document, so to speak, in Author. It's the same stuff. But if we jump to the end, here is the prompt. The prompt has a long intro for the system, and then the specific requests, chunked. So for every meeting we have, I get one result. All I do is assign some headings, bold some text, and if I see an error that's egregious, I delete it. So here's the specific bit for you, Mark; I'm just going to go to that again: "What named entities were mentioned, in what context were they mentioned, and by whom."

Mark Anderson: And then where's the link? Where does that link to, to find the things that were extracted?

Frode Hegland: I'm not sure that's the way it works.

Mark Anderson: Sorry again, that sounded wrong; it wasn't an oblique complaint. I'm just thinking: what I need to do to help is to find that information in the body of the whole, and extract it out of all the other stuff it doesn't apply to, so I can just look at the list of extracted terms. But we can find a rubric for this as we go along.

Frode Hegland: Well, it's easy to answer your question if you look at the headings. Yeah, we have names mentioned, so they are on a page; it's part of the report. So if you wanted to take a meeting and check it against the text transcript and/or the video, you can do that. I'm not sure you need to extract them out, because this is what it looks like.

Mark Anderson: Okay, so the bits in bold are the extracted terms. Yeah. So are we only extracting names? We're not extracting any other terms, technologies we're using, or anything else like that? Again, I ask without knowledge of what it should be.

Frode Hegland: So what we have is: list of participants, general summary, summary per speaker. Then we have topics, which gets closer.

Mark Anderson: Sorry, that's me interrupting, but you're describing something that's different to what I'm asking about. I'm asking specifically about extracted terms, and only extracted terms.

Frode Hegland: And when I show WebXR here, the reason I'm showing you is that if we talk about the term WebXR, it will be listed under WebXR; it won't just be listed as "we discussed what banks are." So that's the way it is now. I'm extremely happy to change this, customize it, add or delete or whatever, but that is all we're doing now.

Mark Anderson: Understood. A suggestion, and forgive me, because I don't know how easy or difficult it is to do. I quite see why you've done it that way, and it makes sense in a narrative sense. But I don't know if the process is able to keep a running list of all the entities it's extracted and literally give me that as a list, because that's really what I'm interested in looking at: what it found. Part of the problem is that when you read it in context, you've got to work out which bit you're looking for and not be fooled by the context you're given. One of the things we talked about at the weekend was that it's really impressive that all this stuff is being produced, but one of the things we need to do, and it's a chore, is to actually test it in a sense, do the human thing: okay, yes, that's definitely a term, or that's a misapplication or misunderstanding. And again, I don't say this with any expectation as to what it will or should be; I'm just conscious that we're not doing that, and as part of our experimentation we probably ought to.

Frode Hegland: Yes, I know Mark. I completely and fully understand that. But for you to go to a meeting and look at a page in a PDF, how is that different than having it in a separate data dump, so to speak?

Mark Anderson: Because I've got to go through and read the thing, and copy and paste all the terms out into a list so I can look at the list. At the moment all the terms are smushed into explanatory text.

Frode Hegland: Which list are you talking about?

Mark Anderson: The extracted terms. The AI is going through and extracting things like names, essentially entities: things it thinks are addressable, namable objects, which turn up in a variety of contexts. Which I understand, and that's great. So this isn't saying that the...

Frode Hegland: For verifying, you're talking about going through one meeting and seeing if the list of names given by the AI is correct. Right?

Mark Anderson: Yeah. This is where I get mixed up between names, which I think you mean as names of people, and extracted terms, which are essentially, I guess, nouns, descriptors that it's finding. I'm more interested in the descriptors overall than in the list of names.

Frode Hegland: It's not doing that; it is only doing names. Obviously, all these things operate on the prompt. The prompt that I have is an introduction and specific requests, and one of them is names. It doesn't use the terms entities, descriptors, anything like that; so far it's just been names. If we want to experiment with something else, I think that's a good idea, such as: were there any company names or product names? Absolutely no problem adding that, but it has not been done so far.

Mark Anderson: Okay, no, that's fine. It's a much bigger job than I thought, but yeah, we can have a go at that. And sorry, I say that not as a pejorative; it's just that there's a lot more data-munging work to do to get to the stuff we want to look at, because I've got to extract it out of the text. But that's fine. Now I know what I'm doing.

Frode Hegland: I don't really understand what you mean by "extracted out of the text." For what? How would you like it ultimately presented to you?

Mark Anderson: Okay, what I'm trying to get at is seeing the key points that the AI is or isn't extracting cleanly and correctly from the source material, because that's part of the test. It's a non-judgmental test, but it's trying to say: right, it's found these things. For this, I'm less interested in what it thought about them and how it described them; it's just the list of things. I'll have to go and look at the text and think about what that loose term "things" means, and that's fine. But thanks for that summary, because it's helped a lot, actually. It points me in the right direction.

Frode Hegland: Okay, I understand what you said, but how would you like it presented so you can look at these things?

Mark Anderson: Well, to help you, I don't want to dump more on your shoulders; you're doing a lot of work already. So, guided by what you said, I'll go away and look at this and then work out what, if anything, might be useful to add. Rather than just give you another thing to put into it without having thought about it, I'd rather go and spend some time and just look at one of the reports. I just now understand what it is I'm looking for, so that's great. You don't need to do anything for now.

Frode Hegland: No, no, that makes a lot of sense. And the prompt is in the report itself, on the last page. Okay. Got it. Right. So, back to the weekend update, as they say. Nice coffee was had, lots of fun. In terms of work, one of the things we ended up looking at was gestures, and that's a dimension I have mostly ignored. So on Wednesday I think it would be useful if we split the time: ten minutes for announcements and any other things, then Andrew's updates, and then we talk about gestures. This is something Adam looks happy to head up: building a separate system, initially just for testing gestures, instead of only having the gestures we have now. One of the examples Adam talked about is that if you have one hand touching the other, it's much less likely there will be a mistake in gesture recognition. We can then use the different fingers, extend them, do all kinds of things. So that's something we should probably look at more seriously.

Mark Anderson: I love that Adam was referencing his mime training as part of the way of being explicit. He was explaining how you simulate the sense of touch when there's nothing to touch, and when he broke it down, it's sort of gestural. That's why I think the observations he came up with are really interesting; they fed into what Frode just described.

Frode Hegland: Yeah. How does that sound, Dene and Andrew, as a thing to look at? Right. So, we called you a few times, Dene, to see if we could have the meeting, but obviously you were at a conference as well. How did that conference go?

Dene Grigar: Actually, no. I think you called me on Sunday morning at 8:00, and John and I had our phones turned off, and we were enjoying Sunday morning together.

Speaker1: Yeah, no we didn’t.

Dene Grigar: I didn’t see you on Saturday. It was Sunday morning.

Speaker1: No, no, Saturday.

Frode Hegland: It wasn't Saturday; we didn't call. Saturday wasn't practical because we were out. First to meet this, well, what's his name? I'll think of his name. The interface designer. We went to the Tate Modern, and then we jetted off for dinner; I had a Chinese dinner, which was nice. And then it just became quite late. But yeah, we called quite a few times on Sunday.

Speaker1: But yeah, it was Sunday.

Dene Grigar: Morning, 8:00 my time. John and I were still sleeping, and then we got up and had brunch. I had my phone turned off, and I noticed that you had contacted me while I was still sleeping. I wasn't going to get up early on Sunday morning, so I had it all turned off.

Speaker1: No, no, no.

Frode Hegland: Of course not. I mean, I did call you a few other times, and I did text you asking if you had time to call. But, you know, that's absolutely fine.

Dene Grigar: I see it here at 8:49 a.m. That's the first text I had from you. And then FaceTime, let's see: FaceTime twice yesterday, and the text at about the same time. And my cat's now biting me, so give me a minute. Come on, let's go outside. Out, out. Goodbye. Yeah. So the text and the FaceTimes came at about the same time. I didn't get out of bed till about 8:00 anyway; I was lazy.

Frode Hegland: Yeah, no, that's okay. But if you'd wanted to, of course, you could also have called in this direction. But the whole point of having the different headsets on at the same time is not to be in the same location, so I'm sure we can organize something like that later.

Dene Grigar: Yeah, we talked about Saturday and I was available Saturday. I was working all day Saturday, but I didn’t work on Sunday.

Speaker1: I tried to take the day off.

Frode Hegland: But please don't be annoyed.

Speaker1: Okay.

Frode Hegland: You could have called too, right? You seem quite annoyed that we didn't call, or called at the wrong time, on Sunday.

Dene Grigar: I'm not annoyed about anything. You seemed annoyed that I wasn't answering the phone on Sunday morning at 8:49 my time. I was busy; I was having breakfast.

Speaker1: Yeah, okay.

Dene Grigar: Anyway, I'm glad you had a good time. This is wonderful, and you had a lot of good success, so congratulations on that. And yeah, what else do we need to talk about today?

Mark Anderson: Hello, Peter. I've got a couple of practical things that might actually be useful to Peter as well, just observations from when I met up. One is that I have now seen inside, actually put on, an Apple Vision Pro when I was up with Frode, which was really interesting. Fabian showed me something he's done, and it was really interesting to see; I could sense the difference slightly between the two headsets. It was a very brief thing, but interesting nonetheless, so you know I've now seen the kit. Another little milestone. One other thing to throw back in: I was talking with Adam about the visualization work he's done, which at the moment is essentially taking the timelines he did about two years back for the Future Text group and doing them in XR. The really interesting observation he made, which doesn't surprise me, is that having done it, he was wondering what it was actually useful for. And the thing we realized is that rather than using it as you would in 2D, where you'd open it in a browser window and interact with it, given that the input side of things is not as smooth as it will become over time, let's put it like that, what it would be really useful for in XR is an adjunct view.

Mark Anderson: So if you were doing one of our other use cases, you could have, in a sense, a panel off at the side. I don't know the practicalities of the interaction, but our insight from what we've done so far is that something like that timeline could be really useful, because it could be giving you indications. In other words, it's not the sort of view you'd necessarily want to pull up really close and read. The other thing we talked a fair amount about was the use of negative space. Rather than highlighting things or drawing lines to them, what you might actually want to do is deselect, or essentially temporarily hide, part of the display. Having decided which stuff you have no immediate interest in, you remove it from your sight; you don't have to process it anymore, and at the back of your brain it's there if you need it. Instead of giving yourself more cognitive input by putting colors or spaces or lines on it, you just gently elide the stuff you don't need. I thought that was quite interesting, and we'll obviously look into it further, but it does seem to me to make sense after the fact. Thanks.

Speaker1: Yeah.

Frode Hegland: Absolutely. Do you have anything more, Mark, on your impressions of your first time in the Vision Pro?

Mark Anderson: No, other than that the display seemed very nice and crisp. I didn't have enough time to really evaluate it or form a professional view of it as opposed to the Quest 3, which is my point of reference. The other thing I've done is take my Quest 2 up with me; I've given it to Frode so it can be passed on to somebody else. It's in good nick and can have a second home, so we can get somebody else using some of the kit as well. I think my takeaway from the Apple Vision Pro is that I wasn't really surprised in any unpleasant way. Essentially it just worked, which may sound rather trite, but actually I think it's a good thing. It was rather nice that I tried it and it wasn't in any way surprising. We had to do one or two tweaks just to get things running, but no, it's good. Thanks.

Dene Grigar: Well, I find, you know, from my own playing so much with the Quest and then also with the Pro: the difference to me is that the Quest environment seems like a faux environment. It's not meant to be real, which is fine; that's what virtual reality is all about, it's not real. Whereas the Apple Vision Pro is like a real environment: I'm really using my computer, I'm really using the space. From that perspective it's more like augmented reality as opposed to virtual reality. So instead of VR it's AR, or mixed reality, which mixes a bit of VR and AR. And if we go into our production with that mentality, rather than thinking of it as VR, I think our results are going to be better. We're not building a faux environment; we're not building a holodeck at all. We're building a desktop that is augmented in space. I think that philosophical shift is an important one, Mark, and I think you can see it now that you've seen both environments.

Mark Anderson: Yeah. And it's interesting, then, in the light of my observation about the sort of thing Adam was doing: essentially you can produce these views, reports, whatever, and have them as separate things. They have all the malleability of being constructed in VR or whatever, but what they don't have to be is whole-screen. And indeed, thinking of some of the ideas Leon brought up, you might literally fly through one to expand into another, which would still put you in an AR space, but looking at a slightly different context for some of the content. It's quite intriguing; I'm finding it quite interesting to think through this and what it offers. One challenge, of course, is that we're trying to do it with information that was never authored for this, which is no criticism of our past selves. But I am seeing that to make the most of this, not for everything, but for the things we really want to stand up and get a lot out of:

Mark Anderson: We're going to need to develop some new tools that are designed for writing in a way where you actually work with both the metadata and the narrative. At the moment, with our tools, you broadly write text onto a white rectangle on screen, or an endless scrolling screen, and you're essentially writing the textual narrative of what you wish to describe. The richness of some of the things we want to do means it's really useful if we can capture metadata around that. Case in point: you've got a document, and you've got some bits and pieces associated with a particular part of the work, or maybe you have braided together two or three parallel narratives. When you read it as a human, you put that all back together: okay, right, I'm on track one of the narrative. For somebody searching, being able to hint at that in a way that is tractable to the software, so it can do the clever re-representation or remediation in the AR space, is something I think will become really useful.

Dene Grigar: Mark, I was going to say the two important things I was leading to, and I think you caught on to one of them. First and foremost, the question is: why do we need an augmented space in this virtual environment? What is the gain? That's one of the things we're supposed to be exploring here: what is it that we gain with this? The second one is the thing we've realized: anything we're using here is not really meant for here. So the second thing we're exploring is: what is it we need here? Microsoft Word is not the answer. A PDF is not the answer. Andrew and I have been grappling with this in our lab. When you make a book, you have a physical object, and you want to save it in a space where people can get to it; because you have one copy of it, it goes on a shelf in a physical brick-and-mortar building. So: brick and mortar for the physical, which is our museum and library.

Dene Grigar: The objects we're dealing with are ephemeral, right? They're digital objects. They're virtual; they have virtual experiences. They're experimental, participatory, interactive, but they have no physicality to them except for the electrons and the electricity needed to drive them. What does the space look like for that? Well, it's virtual. So: brick and mortar for the physical stuff, and a virtual space for the non-material, non-physical stuff. We developed that out of the lab; that's our project. Now we're looking at what kind of tools we need for this virtual space to make it really work, because it's not working with paper, right? We're trying really hard to put a square thing in a round hole and beat it in really hard, and the answer, after six months of doing this, is that the tools aren't there and we're trying to come up with them. That's the point of what we're doing. I guess my final point is that we've validated the need for this project.

Speaker1: Right.

Dene Grigar: We surmised this was the problem; six months later, we can validate it. Now we're developing answers to the question: what is it that we need? And I think that's an important thing to put into the six-month Sloan report we're now writing.

Speaker1: That makes sense. Yeah.

Mark Anderson: Quick question for you. As I sort through the word "remediation," I'm feeling somewhat uneasy, because I think back to Bolter and Grusin's book on this, and I know they were broadly talking about a work being put into another medium. It seems to me that if you take something that isn't a physical artifact and just represent it in facsimile within AR or VR, then with the sort of tools you're positing that we don't have, you're in effect doing a dynamic remediation. That doesn't have to mean anything, but I think it's important to address, because it makes you think about what you've got to do. A challenge now is that I might want to have, I don't know, this book, it doesn't matter, a book. I might want it in a digital sense without destroying the version that is this narrative. In other words, I might want something as close to the literalness of this, apart from its physical aspect, in a digital sense. But I might also want to have it completely remediated, in a form where I can do all sorts of clever and analytical things. So I think it's very interesting, funnily enough, to the digital humanities, probably more so than the scientists. But yeah.

Speaker1: Can I say something?

Dene Grigar: This morning I was reading the New York Times online, of course, and they're doing really amazing work with storytelling in the new electronic newspaper format. I don't know if you saw the report on the Wuhan lab and the Covid-19 pandemic, but the guest essay was about the origins of the pandemic and that virus. It wasn't like an article; it was not an article. It was this incredible multimedia experience, with interactive graphs and figures inside the text, like what they were doing in 2012 with the avalanche story, but they've gotten even better at it. The New York Times is doing the best work of any journal I'm reading, and that includes even The Guardian, which I adore, in how to tell a story in this new environment we're working in. And it works on the phone, iPad, desktop, the whole shebang. It's just an amazing thing. So, you know, we're still at the infancy of trying to articulate what it means to read and write online.

Dene Grigar: You seem bothered by something. Are you okay? You’re so quiet. I’m not used to it.

Frode Hegland: I'm scratching something here; I'm scratching my head. But I'm also on the New York Times website, just trying to find this. I would love to see it.

Dene Grigar: I put it on Twitter.

Speaker1: Oh, you did? Okay.

Dene Grigar: It should be on Twitter. I posted it this morning because I was so impressed with it.

Speaker1: Yeah, well, that should be fine.

Dene Grigar: Let me pull it up. Hang on.

Frode Hegland: Sorry, I guess sitting like this makes it look like I have a big mental thing, but I just have a little thing on my head. Thank you for checking; I appreciate that. Yeah, let's see.

Frode Hegland: I can't find you.

Speaker1: I posted this. Oh, no.

Frode Hegland: No, there you are, I found you. You see, I just forgot the @. Yeah.

Speaker1: Good.

Mark Anderson: It's interesting that you describe that, because another angle of this is the way these stories are moving on from the infographic, which was a big thing of the last fifteen-odd years; moving from the static to the dynamic. But I think it's more subtle than that, or the possibility is. It's not just taking what might have been a basically static piece of very well produced graphic design and making parts of it more fluid; it's really going the other way. It's taking something that would otherwise have been a written narrative and expanding it, enhancing it in that multimedia way with all sorts of dynamic things. I think that's tremendously exciting, certainly for that sort of factual storytelling and sensemaking.

Dene Grigar: Well, I think also it's articulating in a way that is 21st century. I'm going to try to put it into our Slack channel, at least if I can post it there for everybody. I pulled it up here: Slack, go here, share with destinations. Yeah, I'm going to put it in the cafe. Do you guys have access to the cafe?

Speaker1: No, I think.

Frode Hegland: Put it in main, I think; that's okay.

Dene Grigar: Well, I'm not sure if it'll let me. See if I can get it there. Future Texts... I can get it to Administration, but not Future Text.

Speaker1: Okay.

Dene Grigar: I said to Fabian, I may have to send it to each one of you separately.

Speaker1: Oh yeah.

Frode Hegland: I can do that. Let me just post it in the chat text.

Speaker1: Well.

Frode Hegland: Yeah, it should be clickable. I'll try another way, just in case there's some issue with that.

Speaker1: I think.

Dene Grigar: Andrew, you were in the class I taught on digital storytelling, in which we did a whole module on that. Yeah?

Andrew Thompson: The avalanche. Sorry. Yeah. We did. I know the one you’re talking about.

Dene Grigar: Three stories. The avalanche; one about the deaths of the people hiking in Russia; and there was another. There were three of them that we looked at.

Andrew Thompson: Yeah, the other one was, I think, about the forest fires in the rainforest. I think that's right.

Dene Grigar: That one, yeah. And then you guys had to make one yourselves, and one of the really cool stories was about the tunnels underneath Portland, Oregon, which were used for the slave trade and white slave trade. I don't think it was your group that did it; I think it was maybe...

Andrew Thompson: No, that was that was a different group. But I remember it was cool because we got to actually tour the tunnels for their research.

Dene Grigar: Yeah. We went on a field trip and toured the tunnels. Peter, do you have access to it? Are you seeing it?

Speaker1: Hey. Good.

Dene Grigar: Well, anyway, I just find it fascinating that we're looking for ways to tell these stories in a more useful way. But the third thing, Mark, that I think is important to notice: the most important feature of the Apple Vision Pro headset is the fact that we can come together in a single space to share documents. Right now, that's the draw, and until we get past that as the only draw, I don't think there's much being done with it.

Mark Anderson: Yeah, it's interesting you say that. Sorry, I'm afraid I... sorry.

Frode Hegland: No, no, I was just going back. So please reflect on that, Mark.

Mark Anderson: Yeah. I was just thinking about something you were saying: there's a sort of drumbeat about collaboration. I was just doing a peer review, and I realized this person was going on about collaboration when what they actually meant was co-presence. They're not the same thing at all. And I think that's really the thing with you mentioning the Vision Pro: it's this ability to be effectively present in the same space. We probably don't help ourselves: a lot of the time we talk about collaboration when we don't need to, because collaboration actually needs a lot more than just the ability to share a common view of something, to see the same artifact. That ability is remarkably useful, and not just because it means we can do it from opposite sides of the planet, but because it's the ability to take something which has to be confected, produced in a virtual sense, and place it together before people, because you can't just project it into a room, not without an enormous laser auditorium or something. So I find that very interesting. Thank you.

Frode Hegland: Yeah, I agree on that. So, there are so many interesting issues; we went in all kinds of circles over the weekend, a lot of it entirely irrelevant to this, which is good. I mean, irrelevant to Sloan, but relevant to the bigger discussion. Some of it we recorded, but in 360, and that's 50 GB per half hour, so it takes a while to upload; nothing's worked yet, and I'm going to try to do it in town. So, one of the things I was thinking about, in terms of the practicalities of our project and its management: I hadn't really thought much about gestures. And Dene, I'm going to spend some good time tomorrow on the high-resolution paper, just as an important thing to mention to you, where I will also be mentioning high-resolution gestures and the importance of that. There were some things I learned there that are interesting, that we can look at together. But in terms of our project, and this is completely up for discussion, it seems to me that we should look at three things. We should look at gestures, of course, inside our main Andrew work; keep doing that as we do every Wednesday. But we should also look at this thing we previously called the map.

Frode Hegland: The space is generally just "the space" now, right? So we set a time just for that. And then there's one thing we haven't really done, and this is my fault because of how I've been kind of pushy: let's look at reading a single document, in the context of what we're talking about. Of course we've done it over the months, but we haven't really done it in a "we're now building" manner. Andrew has built all these really nice, powerful views, and I think it's now time for us to really reflect on them, look at them, comment on them. I could imagine that when we get to at least Poland, we will have an interaction that really shows the use of space and a really interactive document. By the way, for clarity, when I say one document, I mean one document; you can have many on the side, but the focus is on the document rather than the relationships, just for that. And then, separately, gestures. I know Andrew has a link for the current code. I could imagine, for instance, we have a URL like futuretextlab.info/view.

Speaker1: Or whatever.

Frode Hegland: Just a suggestion for a name that goes right to Andrew's current code. Because once we get to September, for the people who sit down, we want to have a name for it. I'm sure we'll think of a name at some point, but at least we have Future Text Lab now, right? So they can just remember the URL, futuretextlab.info/view or similar. And we could also have a link that is futuretextlab.info/grip. I think the term "grip" is nice, because we are talking about getting to grips with our information; it's a very figurative notion. So in that view, we take some of the nice code Andrew has done with the basic document, or whatever is appropriate. And then, we talked a little over the last few days... so, a few weeks ago: Edgar really likes Gundam, you know, he put on the suit. So there were some playful ideas, like: what about a glove? What if we use the analogy of wearing a glove, with changing gloves being the same as changing interactions? So now that I'm reading, I have one glove; but oh, I don't really like that one, so I download a different kind of glove, developed by someone else, with different gestures. That way we can really test gestures completely free of worrying about integrated code, because when we get to demo this, we'll have a few headsets. Again, this is all just very loose thinking; I'm a very changeable person in terms of what I prioritize. So I'm wondering how you feel about having a kind of separate view with some interaction, and one that is mostly an interaction demo, specifically gestures. How do you feel about that approach, splitting it?

Dene Grigar: Well, a couple of things come to mind. We experimented with two different gloves in the Data Entry Portal project: we were going to have one glove for when the people, the travelers, were landing onto the earth, and then another glove for the actual movement through the game.

Speaker1: Interesting. Okay. I’ll stop.

Dene Grigar: Okay. Well, anyway, we decided. Okay.

Frode Hegland: I just said "interesting." Please let me be supportive: I did a thumbs-up and said that's really interesting, like we're in the same room. Why do you say you have to stop? I'm sorry. Okay, I'll mute myself. Please go on; that was really interesting. I've never used a glove.

Dene Grigar: So we decided not to do it because two. I’m sorry. Go ahead.

Frode Hegland: I was just... you want me to speak? There's a bit of a delay today; I think my network's a bit slow. I'm sorry, I'm going to mute for a bit, and let's see if we can manage to do this without intercontinental delays.

Dene Grigar: Okay, I think that's the problem, so don't get offended. I can't understand when you're talking; it's coming in at the same time I'm speaking. So: we had two different gloves, one for each of levels one and two. And then at level three we found that it was too difficult for the player, the user, to know when to pick up the gloves. Also, it made the game very, very complex when it should have been easy; it needs to be intuitive, right? So we tossed that idea out. Andrew can speak to this more, because he was developing it with the students, but it didn't make sense. The other thing that comes to mind is that Andrew is already on target with certain gestures. He's been building gestures into the system, and he's got several of them in the works. If we want to make changes to them, or augment them with other ones, the question then arises: how do we overlay those on what's already there? Are we building two separate projects now, one with the folks in Europe building gestures from their experience with miming and that kind of thing, and the ones we're building in the lab that Andrew's been working on? That means two different sets. Is that going to dilute our efforts? Or are we going to adapt what's being done by the folks in Europe, let them take over this, and reorient Andrew's activities so he can move forward on other aspects of the project, so we're not developing two different projects? Andrew has his hand up; I think he wants to say something about it.

Andrew Thompson: My personal suggestion is: have as many experiments as you want, as long as it's different people per experiment. I of course can help on others. As was clarified in the chat, this would probably be Adam working on gestures, and I'm totally on board for that. Adam's very passionate about this sort of thing; I think he would do great with it. It only starts to be an issue if I'm split between two projects, and it sounds like that's not the case. Adam's welcome to use my base code. I don't think he'll want to, because Adam's very experimental and knows how to get his stuff done, and my code is built on layers of plans changing, so it's kind of rough to work with. Honestly, I had no clear goal for anything through this; it's just been "hey, Andy, work on this, work on this." That's fine, I can make things work, but the code is gross. So Adam, as an actual developer, is not going to have fun working in my environment. I would say give him the freedom to make his own space to experiment with gestures. But that would just be my two cents on that.

Frode Hegland: Those are two cents well spent. That is also how I see it, and here is how I'm thinking of unifying it, now that I hear what you're saying. I hate it when my gym coach says "fun," but anyway, I'm going to use the F word, fun: maybe the gloves we wear in your environment, Andrew, get some sort of basic design, so they look a little nonstandard, maybe even a collar or something. And in what Adam's doing, he uses that same skin, just so that for the user it is a bit coherent. But we make it clear that Andrew's world is what we're working on; that's the interaction. And we also have a kind of gesture lab, where we're experimenting with the dream of developing a system whereby you can be in Andrew's world, or any WebXR system, and if the developer has the right APIs, they can literally plug in these different kinds of gloves, or whatever we might call them. So we keep it completely separate for September, linked only stylistically. Because, you know, it's taken me six months to even fully appreciate gestures; I'm extremely slow on this part of things. How does that feel? Yeah, Dene first and then Peter.
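A minimal sketch of the "glove" idea just described, in WebXR-flavored JavaScript: a glove is simply a named, swappable mapping from recognized gestures to actions, so switching gloves switches the interaction vocabulary without touching application code. All names here (GestureRouter, the glove objects, the ctx methods) are hypothetical illustrations, not anything from the actual Future Text Lab codebase.

```javascript
// Hypothetical "glove": a named map from gesture names to actions.
// Swapping gloves changes what gestures mean, not the application itself.
const app = {
  selectWord:      () => console.log("select word"),
  showDocumentMap: () => console.log("show document map"),
  turnPage:        (dir) => console.log("turn page", dir),
  grabNode:        () => console.log("grab node"),
  openRadialMenu:  () => console.log("open radial menu"),
};

const readingGlove = {
  name: "reading",
  bindings: {
    "pinch":       (ctx) => ctx.selectWord(),
    "palm-up":     (ctx) => ctx.showDocumentMap(),
    "swipe-right": (ctx) => ctx.turnPage(1),
  },
};

const spatialGlove = {
  name: "spatial",
  bindings: {
    "pinch":   (ctx) => ctx.grabNode(),
    "palm-up": (ctx) => ctx.openRadialMenu(),
  },
};

class GestureRouter {
  constructor(glove) { this.glove = glove; }
  wear(glove) { this.glove = glove; }          // "changing gloves"
  onGesture(name, ctx) {
    const action = this.glove.bindings[name];  // unbound gestures are ignored
    if (action) action(ctx);
  }
}

// The same pinch means different things depending on the glove worn.
const router = new GestureRouter(readingGlove);
router.onGesture("pinch", app);  // selects a word while reading
router.wear(spatialGlove);
router.onGesture("pinch", app);  // now grabs a node in the space
```

Routing through one object like this is what would let Andrew's world, Adam's gesture lab, or any other WebXR system accept a downloaded glove, provided it exposes the same bindings shape.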

Dene Grigar: When you were visiting back in January, I began mapping out the gestures for the Quest and for the Apple Vision Pro, so we could get a sense of what overlaps there were and what opportunities might exist that no one has thought of. I'd also like to argue that the gestures currently being used are not as intuitive as they could be, and perhaps that's something else Adam is thinking about: what a more intuitive system of gestures would be. And I'll end by saying that when Steve Gibson and I were working in the motion-tracking lab together back in the early 2000s, we were using gestures in that space, the actual movements of our hands and legs being tracked by the infrared tracking device, and we were trying to map those onto some sort of stable, interactive design principles. And then the technology went away. But at the time, we were trying to think about what it means to put your hands up, down, out, in. Is there something about this? And we were holding these trackers in our hands: what does it mean to go in and out? What do you expect this to do? And when you're playing music, you expect the music to be almost like a keyboard, right? We could mimic a keyboard in space. A lot of things like that. There were opportunities there that never came to fruition because the technology died, became obsolete. But it is a good opportunity for us to build a whole new gestural language, as we call it, as part of our side quest. And this is grant territory, right? There's a lot of money in this.

Frode Hegland: Exactly. And, sorry, Peter, I'm just skipping in. Pardon my European way of speaking, but that's fucking brilliant. I'm so glad you have that particular experience, coming from that perspective. The dream here is to research and experiment with components, and make these components interact when there's a time for that. And the last thing before I hand over to Peter and Mark: what Fabian is doing is very, very related to this, because he ideally wants you to be in a room where there is something there, and you do a gesture and suddenly you can see the code behind it, where it is, so you can then edit the code. So depending on your level of familiarity with the code, you can do everything or a little bit. Of course, you as the user should be able to plug in different kinds of skins, so to speak, for how you do that kind of interaction. So we would have three separate tracks in this sense: Andrew's main work, the gesture lab, and Fabian's thing, with right now almost no interaction between them, except that if Andrew wants some of the gesture code, of course he can take it across. There are a lot of papers there, and there is a lot of product there, I think. Yeah. Peter, sorry about the interruption.

Peter: Okay. My one problem with gloves is that you're limited to, literally, two hands.

Frode Hegland: It's not a data glove, you know that, right?

Peter: Oh, okay. I thought you were talking about a data glove for a minute there. Okay, good.

Dene Grigar: Colored hands. Okay, color-coded hands. Purple.

Peter: Does anyone remember the old classic Mac game Scarab of Ra? Okay, I'm dropping an emulator link in the sidebar. It was a pseudo-3D dungeon crawl, but one of the things it had was the idea of wands and rings: embodiments of procedures that could be run within the game environment. To keep it interesting, it would randomly assign what the different things would do, so you wouldn't necessarily know what a given spell or ring or wand would do until you tried it. But the idea was that each one was an embodiment of some piece of game functionality. The neat thing about rings is that we have ten digits, so you could have multiple affordances associated with the ring concept; you could slip them onto your fingers and have several operating on the same input at the same time. You could think of each ring as a representation of a gestural affordance, and if you had more than one on, they could operate concurrently on the input.

Peter: So that would let us get a little bit out of being trapped in a modal world where you can only be doing one sort of thing at a time. Thinking about wands got me thinking about the Forth programming language, and that it would be really interesting to do some sort of embodiment of a Forth-like language where each wand represents a separate data stack. Imagine you tap a couple of objects with your wand, and each object's ID gets loaded onto the metaphorical wand; then you tap the wand on a crystal ball or something representing a procedure, and all of that data is pulled off the wand's stack, with the result left on the wand to move around. That gives you a little metaphorical way of moving data around, with multiple clipboards in an environment, without being back in the old cut-and-paste-onto-a-single-clipboard paradigm. For my own work, I wanted to give you a little update. I've been looking at parsing the HTML representations of...
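A small sketch of the wand-as-data-stack idea just floated, under a Forth-like discipline: tapping objects pushes their IDs onto the wand's stack, and tapping a procedure consumes the stack and leaves the result behind. The Wand class and the "link" procedure are invented for illustration.

```javascript
// Hypothetical wand: a Forth-like data stack you carry around the room.
class Wand {
  constructor(name) {
    this.name = name;
    this.stack = [];                 // object IDs picked up by tapping
  }
  tap(objectId) {
    this.stack.push(objectId);       // tapping an object loads its ID
  }
  applyTo(procedure) {
    const args = this.stack.splice(0);                 // consume the whole stack
    const result = procedure(args);
    if (result !== undefined) this.stack.push(result); // result stays on the wand
    return result;
  }
}

// Tap two documents, then tap the wand on a "crystal ball" that links them.
const wand = new Wand("left-hand wand");
wand.tap("doc-42");
wand.tap("doc-97");
const link = (ids) => ({ type: "link", between: ids });
console.log(wand.applyTo(link)); // { type: "link", between: ["doc-42", "doc-97"] }
```

Because the result lands back on the stack, a second procedure tap can chain on the first, which is what makes this a multiple-clipboard scheme rather than a single system clipboard.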

Frode Hegland: Peter, just really briefly on what you were saying: we hope to be able to do something with a wand, but there isn't really a trackable wand yet. I love the idea, though. Okay. And also, Mr. Anderson came with his Quest 2, so can I send it to you, please?

Peter: I'll have to think about that; there are some issues with it and all. I'll give it some thought offline. Don't worry about it now.

Frode Hegland: It can at least go somewhere. But, Peter, can you please go to the Apple Store? Because the actuality of hand tracking now is very different from the imagined potential; you really need to try how it actually is. But anyway, please update us on the work. Thank you.

Peter: Well, first I took all of the HTML files representing the conference papers in the set and put them into a single YAML file. So that's basically one text file associating the name of each document with the document's content. Then I started working on parsers for it. Initially I was starting with an island grammar that just extracted a few things, but then I decided I might as well bite the bullet and do a full parse. So now I'm in the process of slowly capturing more and more tags and representing them in an abstract syntax tree. Each node in the tree has the grammar production name, the content covered by that production, and a location object containing start and end offsets within the original text document, so you're able to see where each production bound to the original source text. That would let us create standoff markup representations of the content that's currently in the HTML format. As far as the grammar productions go, the basic approach is that you have one rule representing the start of each tag and another rule representing the end of the tag. To get at the contents of a tag, you do an indefinite look-ahead for where the end of the tag appears: you read off characters until you hit a character that would begin another recognized tag.

Peter: Then, for unrecognized tags, I have the notion of a run. A run is basically any stretch of characters that doesn't constitute the beginning of a parsable recognized tag, or the start of an end tag for a currently open tag. This lets me capture the full hierarchical structure of the data: anything I don't care about simply gets sequestered away as a run, basically any tags the system isn't consciously aware of at the moment. It just appears as a block of text with the appropriate offsets, and any recognized tags, as I define productions to support them, get captured with their content and all of their offsets. That'll give us a nice machine-workable representation of those HTML files, so we can drill down and pull out all of the citation footnotes and all of the author names. Because they have span elements, we can separate the surname from the given name for each author: it won't just be a single string, but we'll actually know where the last-name/first-name distinctions are. So that's what I've been working on this week. Eventually, when it's done, I'll give Andrew the data file that results from parsing all of those documents, so he'll have access to all that rich data to play with.
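A minimal sketch of the run/tag approach Peter describes, in JavaScript: recognized tags become AST nodes with a production name, children, and start/end offsets into the source, while everything else is swept into "run" nodes. The recognized-tag list and node shapes here are assumptions for illustration; Peter's actual grammar is richer.

```javascript
// Sketch of a run/tag parser with standoff-friendly offsets.
const RECOGNIZED = ["p", "span", "h1"]; // tags the parser is "aware of"

function parse(html, pos = 0, closeTag = null) {
  const nodes = [];
  while (pos < html.length) {
    if (closeTag && html.startsWith(`</${closeTag}>`, pos)) break;
    const tag = RECOGNIZED.find((t) => html.startsWith(`<${t}`, pos));
    if (tag) {
      const contentStart = html.indexOf(">", pos) + 1;  // skip past attributes
      const inner = parse(html, contentStart, tag);     // recurse into the tag
      const end = inner.pos + tag.length + 3;           // step past "</tag>"
      nodes.push({ production: tag, children: inner.nodes,
                   location: { start: pos, end } });
      pos = end;
    } else {
      // A "run": consume characters until something recognizable could begin,
      // either a known start tag or the end tag of the currently open tag.
      const start = pos;
      do { pos++; } while (
        pos < html.length &&
        !RECOGNIZED.some((t) => html.startsWith(`<${t}`, pos)) &&
        !(closeTag && html.startsWith(`</${closeTag}>`, pos))
      );
      nodes.push({ production: "run", content: html.slice(start, pos),
                   location: { start, end: pos } });
    }
  }
  return { nodes, pos };
}

// Author names keep their surname/given-name split, because the spans survive
// as separate productions with their own offsets into the source.
console.log(JSON.stringify(
  parse("<p>By <span>Grigar</span>, Dene</p>").nodes, null, 1));
```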

Frode Hegland: Thank you, Peter, and please have a look at the chat; you will see a message from Andrew. So, if you want to, please re-email me your address and I will make magic happen. That would be really, really good. And by the way, Andrew, a lot of us had a look at your beautiful current demo in all the different headsets, and it really looks much more amazing on the Quests. So there is some kind of rendering thing, but that's to be discussed on Wednesday; just a heads-up.

Andrew Thompson: Yeah, that's definitely frustrating, but it's something we expected. I mean, Brandel said that WebXR runs at half resolution on the Vision, so we're just seeing the result of that. Yeah.

Frode Hegland: Not a problem. But, Mark, you've been very patient.

Mark Anderson: Right, I should lower my hand. I'm going to loop back to the thing of gestures and ask a question that reflects my lack of knowledge on this. I'm trying to get my head around gestures, because part of it, presumably, is the ability of the system to recognize the gesture: that becomes an element of code that's tracking something going on and saying, okay, I've got a gesture. And then we also talk about gestures in terms of our intentionality: I might do this with my hand and have an intention as to what that means. Somewhere those two connect. I ask with the intent of, in a sense, making more sensible requests of what we do; in other words, not asking for things that are hard to do because they combine the building blocks in the wrong way. Does that make sense? I don't know. Andrew, do you have a view on that?

Andrew Thompson: Less so at the moment, honestly. It's a good concept, though.

Frode Hegland: Sorry. The way it was discussed by Fabian and Adam, who's been working on this for a bit, is that they really think we should split this in two: us regular folk express what we would like, and they will then look at what's possible, both technically and in terms of issues such as recognizability by the system, as you mentioned, Mark, and the problem of triggering things by mistake. Like right now, your hands are doing quite a lot of stuff, and that's exactly the kind of clash Andrew's talked about. So, Brandel, brief update: some of the guys were here this weekend, and we're hoping to have another gathering in a month; we'll have to decide on a date so the Americans who can fly over can do so at a comfortable time. And we're looking at simplifying our design discussions around what Andrew is doing into the wide space, where you have lots of nodes, and into reading a document; we haven't spent a lot of time on the document, something I accept responsibility for. But then also to have a separate track, headed primarily by Adam, that is all about gestures: researching what other people have done, including one of their contacts who actually works at Apple, who did some studies where someone would repeat a gesture and it would then be encoded to do many different things. So people can experience one version of our work that is very visual, which of course will have gestures, and then we have a kind of gesture lab where gestures are the focus. Over.

Dene Grigar: So I see a paper. I think it'd be important to map out the gestures as Adam produces them, and make a chart, as we started before, that has the Quest gestures and the Apple Vision Pro gestures, then look at the gestures he's developing and the way he's developing his out of the concepts that already exist. And then use that for a grant proposal.

Speaker1: Yeah. As a side quest.

Frode Hegland: Absolutely agree. And within that: in WebXR there aren't that many system gestures, which is great in our case. We've also had random discussions, including the idea that when you're doing a gesture, you should have a button or whatever to rate that gesture, so we can start testing quite quickly: if people really like doing this or that, we can get a better idea of what works. You know, just starting on this gesture thing is so exciting, because we want some basics, you've been told two or three things, go into the environment, do your work, but then over time you can become much more expressive, so that suddenly doing it like this means something, in a gradual manner, which is obviously more difficult. But if we develop the testing environment to do exactly as you say, Dene, and this could be very grantable, I absolutely agree, it can become a collection of gestures. And we are going to follow the Andrew style of every update: we store the old one so we can always go back, so we will always have a record of the earlier ones, and the code will always be available. We'll make it so that a developer who wants to use this can see how each gesture maps onto our suggested effects, and decide whether they want to use it. Andrew, please, please.

Andrew Thompson: Yeah. So you mentioned being able to make a gesture and save it. You may not have heard of it, but there's Handy.js, a library for WebXR designed to interpret a whole bunch of hand gestures. As a user, you can make a gesture and save it, and then it becomes a recognized gesture; pretty much exactly what you're talking about. You could totally implement it in one of these experiments. I did test it early on in this project, and I didn't like it for my use because it recognized too many gestures: since the tracking is a little imprecise anyway, it was triggering a whole bunch of false positives. But the baseline is there, and if you just want to quickly test gestures, it's a great way to go. Just wanted to throw that out there; it might be worth looking at. Yeah, that's it.
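For flavor, a sketch of the pose-matching idea underlying a library like Handy.js, not its actual API: record a pose as wrist-relative joint positions, then compare live frames against saved poses with a distance threshold. The threshold is exactly where Andrew's false-positive problem lives. Joint names follow the WebXR Hand Input module; everything else here is hypothetical.

```javascript
// Hypothetical save-and-recognize gesture matching over hand-joint positions.
const JOINTS = ["thumb-tip", "index-finger-tip", "middle-finger-tip",
                "ring-finger-tip", "pinky-finger-tip"];

// Store each joint relative to the wrist, so the gesture is recognized
// wherever the hand happens to be in the room.
function capturePose(jointPositions /* Map<name, {x, y, z}> */) {
  const wrist = jointPositions.get("wrist");
  const pose = {};
  for (const name of JOINTS) {
    const p = jointPositions.get(name);
    pose[name] = { x: p.x - wrist.x, y: p.y - wrist.y, z: p.z - wrist.z };
  }
  return pose;
}

// Mean per-joint distance between a live pose and a saved one.
function poseDistance(live, saved) {
  let total = 0;
  for (const name of JOINTS) {
    const a = live[name], b = saved[name];
    total += Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  }
  return total / JOINTS.length;
}

// Too loose a threshold and jittery tracking matches everything (Andrew's
// false positives); too tight and deliberate gestures get missed.
function recognize(live, savedPoses, threshold = 0.02 /* meters, assumed */) {
  let best = null;
  for (const [name, pose] of Object.entries(savedPoses)) {
    const d = poseDistance(live, pose);
    if (d < threshold && (!best || d < best.distance)) best = { name, distance: d };
  }
  return best; // null means "no gesture", usually the right default
}
```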

Frode Hegland: That's really, really good, Andrew. Thank you. Can you please also put it in the chat with a little bit of information around it, so it's easy for Adam to find when he watches this. Yeah.

Speaker1: Thank you. I mean.

Brandel Zachernuk: The point is around semantic distance: the idea that if you have one thing that you do, you want to make sure that you have cleared enough distance around it to another thing. It's why you don't have a novel with a character called Mike and another one called Michael and another one called Mikey as the only three characters, unless you're some kind of asshole. The same thing applies in the context of gesture space: you want to ask, semantically, what are all of the legible gestures, and make sure you map that to what's physically reachable between people and other things. So, for better and worse, it also means that you need to be dynamically making assessments of those distances based on what else you add in there. Because if you have this and you have that, then that's cool. But if you also add something that is barely distinct from one of them, then that's kind of awful.
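
A rough sketch of that dynamic assessment, assuming gestures are reduced to flattened joint-coordinate vectors; the margin value is an illustrative assumption that real testing would have to tune:

type GestureVec = number[]; // flattened joint coordinates

function gestureDistance(a: GestureVec, b: GestureVec): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Before admitting a new gesture, check it against every existing one
// and reject it if it lands too close ("Mike" next to "Mikey").
function canAddGesture(
  candidate: GestureVec,
  existing: Map<string, GestureVec>,
  margin = 0.05
): { ok: boolean; conflict?: string } {
  for (const [name, vec] of existing) {
    if (gestureDistance(candidate, vec) < margin) {
      return { ok: false, conflict: name };
    }
  }
  return { ok: true };
}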

Speaker1: But think, yeah.

Frode Hegland: That makes perfect sense. Dene, you put some zip file in. Yeah.

Speaker1: That’s the gestures.

Dene Grigar: When I developed Curlew with Greg Philbrook, we had to lay out gestures for the Kinect game system, because we shifted from the older system, which no longer worked, to Kinect. That meant we had no more 360-degree experience; it was all in front of a Kinect-based system. The multimedia poem was about controlling the weather and controlling the system, so I had to develop a whole system of gestures that worked with Kinect so that I could evoke the multimedia: a lot of sound and video files and things like that that would project all around the room.

Speaker1: But that’s.

Dene Grigar: That's the design of those gestures, and there were many of them: moving left and right, palm in and out, crouching. So I designed them all. And the piece is also available as a kind of book experience, an iPad experience, so all of that's documented in there. But this is easier to see anyway.

Frode Hegland: Oh, that's really, really cool. So that was for a festival. So, to put it in academic terms, we can even cite that, can't we?

Dene Grigar: Yeah, it's citable. It's in a catalog. It took place in 2014 in Naples, at the palace there. Balilo. Marzuki.

Speaker1: Yeah. And then we publish this.

Dene Grigar: We actually gave a paper about it, and in Norway, Greg and I went together and gave a performance of it and carried all that gear with us.

Brandel Zachernuk: It reminds me of David Kirsh's work with the dance troupe. I haven't read his papers on it, and I'm not sure if you've seen any of them, but it seems like there would be a lexicon of marking that came out of it as well: the things that people do, mostly for themselves and to some extent for each other, to mark out what is actually semantically present, in much less exhaustive terms, for dance. And it's human-to-human computer interaction and communication, but fundamentally so is word processing; it just has a much longer step in between, that intermediating layer. But we're all doing this.

Speaker1: This, this.

Dene Grigar: This piece was also meant to be left as an installation, so that anybody could perform it when I wasn't there to perform it. So there was a little tutorial at the beginning to show people what they could do to move the piece along. The tutorial included these different gestures, and what was nice about the event was that I could sit back and watch people perform it. The gestures were so intuitive that all they had to do was look at that one short tutorial and they were able to manipulate it and move through the piece pretty easily.

Frode Hegland: Yeah, absolutely.

Dene Grigar: And this came out of the work that Steve Gibson and I had been doing for years. So I was trained by him. So I can’t take full credit for it.

Frode Hegland: So, Dene, when is good for you to have the next meeting in London?

Dene Grigar: Well, let me look at the schedule. I've got the ELO conference mid-July. And when I get back from this, I'm leaving on Sunday morning, this coming Sunday, with Andrew, James, Lesperance and John, and we're going to Victoria for that exhibition for a week.

Frode Hegland: Oh, I thought it was this weekend; I misunderstood.

Dene Grigar: No, no, I was home this weekend. I was with John on Sunday morning, our last quiet Sunday morning before we go to Victoria. Okay. We're leaving at the crack of dawn on Sunday to get to Victoria by 3:00 and not miss the ferry. But anyway, I get back from that, and then I'm going to turn my attention to that report for the Sloan Foundation, which is going to be due mid-June, right? And I'll look at the calendar to see what a good time would be. I'll probably still stay at the Bloomsbury Hotel, which is my stomping ground, and I'll just take a trip.

Frode Hegland: But yeah, please email me. I'll put it on a Google thing, and then those who are interested can put down the dates around that. Well, actually, why don't you just look it up right now. So it's the.

Speaker1: ELO, right? Yeah, but I.

Dene Grigar: Need to talk to John, too. I need to look at finances.

Speaker1: Because I’m on I.

Frode Hegland: Where is that, first of all? In London?

Dene Grigar: The ELO is online this year, which is nice.

Frode Hegland: Oh, okay. But you're not staying at the Bloomsbury.

Speaker1: No.

Dene Grigar: I'll stay over in Bloomsbury if I come, because I want to stay in my own space.

Speaker1: Sure.

Frode Hegland: But okay. I mean, that's obviously for you to decide, but we do have a guest room that you would have, and the guys would sleep in the loft, with all due respect, guys. There's plenty of space, so it's entirely up to how you feel, and no way are we trying to Shanghai you, a term that was used several times. Okay. No, sorry. I just thought it was here. But yeah, there's no rush. But the consensus in the group this weekend was that we should meet again soon with more of you, and also that we really want to keep Mondays more like they used to be. So the book club idea, having a theme, is absolutely something we should do. I think we can start that from Monday week, if that's okay, and maybe we say half an hour at the end, or 15 minutes or whatever, for any clerical matters to do with our Sloan work. But other than that, we keep Sloan to Wednesdays. Everybody seemed to like that except me; I want to talk Sloan 24 hours a day, so I'll have to constrain myself for once.

Dene Grigar: And Mark, I'll look up the catalog information for the OLE festival, and there's also a video and all kinds of stuff. So it's well documented. Yeah.

Speaker1: Sure, sure.

Dene Grigar: Thanks to Lelo and his team. That was an amazing, amazing, amazing thing. So let me ask you a question, Frode, since you mentioned the book club: is there any particular kind of reading you want to start off with? Because I haven't selected anything yet. Does anybody have any druthers? Brandel, is there anything you'd like to study together? Read together?

Brandel Zachernuk: I'm a huge fan of "What do Prototypes Prototype?" by a couple of folks at Apple in '97. It's a pretty good primer on these things; it's like ten pages. Folks may have seen that I made a virtual reality version of it, to be able to walk around it as a museum. It's a pretty foundational thing for me. I also really like Buxton's Sketching User Experiences. I haven't returned to it for a while, but his 2007 textbook has some really, really good chapters on sketching per se, which I think is really important when you're inventing a medium. So those are two votes from me.

Dene Grigar: Okay. Buxton. Buxton.

Brandel Zachernuk: Yeah, Bill Buxton. He works at Microsoft, well, he may be retired now, but he was at Autodesk, or at Alias|Wavefront, designing Maya. He comes from a music background and has really interesting motivations and views. Okay.

Dene Grigar: All right. Anybody else? Peter, anything that you are interested in? It's not a one-time thing, so we can put together a list, a wish list.

Speaker6: Nothing’s coming to mind.

Dene Grigar: Mark.

Mark Anderson: As I sit here this second, I'm just copying a link to, I think, Brandel's video version of "What do Prototypes Prototype?", the one he mentioned. Oops, wrong button. No, I'm reading a history of East Germany at the moment, which probably isn't very pertinent; fascinating book, actually. But until I get my head out of the remains of this paper, which the beginning of your comments today spoke to very much, you know, writing tools for the new environment, I have a bit of a block over writing. I take heart from the fact that, in looking at some of the problems we have to resolve, there is so much within the back literature of the hypertext world, not just the hypertext conference but broadly in multimedia and hypertext, that's just been lying out there in the open, unused for so long, if we step away from pretending to write paper on screen. So that's interesting.

Dene Grigar: Okay, I'll look around and pull some things together. I won't be able to be here next Monday because I'll be in Victoria. Let me explain the schedule. We're arriving Sunday at around 3:00, and we can't set up till Monday morning at eight, and the exhibition opens at noon. So the four of us are going to be scurrying over to the library, to the room we're putting our exhibition in, at 8:00 in the morning and setting up. Thank God the people we're working with over there know what the hell they're doing. And, Mark, they're supplying us the old Macintoshes, so I don't have to carry those; I'm just carrying the desktops with me, the ones running 10.10, the iMacs. So it's a lot less than what I shipped to Hypertext back in September. It won't be as pretty a space, without a nice garden, but it's a beautiful library and a beautiful space with windows, and the university is already publicizing the exhibition, so we've got lots of publicity out. And it's at the same time as DHSI; there are about 500 people at that event. I've already talked to faculty teaching during DHSI, and they'll bring their students, their participants, to the exhibition. So it's going to be good. But that means that Monday will be tied up setting up that exhibition, and we'll be missing next Monday. I suspect we will be able to meet on Wednesday, because we'll be all set up and we don't open till ten. So that's the plan right now.

Mark Anderson: Your discussion of not having to fly around with old systems just prompted a thought, a side note. Talking with Claus and Marius doing HT '24 committee stuff, I raised the point with them, thinking about the summer school and the conference, about having at least a few old systems available in the general sense. I think Marius was going to contact you and see if there's anything. I know it's not the same as running old works on new kit, but they'd be delighted to be able to do that. Just thinking about things that people might want to see, especially the students, who probably won't have been exposed to a lot of this before. So if there's anything we can think of, Marius is going to try and get things set up so they're available for people to see.

Dene Grigar: I gave him a Macintosh. I literally gave him one of mine.

Mark Anderson: I recall you did.

Dene Grigar: And he should be able to carry that one into the conference. I also sent one to London, and two went to Germany, Regensburg and Hof, and there's one in Toulouse, France. So I've shipped out five of them; I've been sharing the wealth. And Mark, I'm happy to give you one, too.

Speaker1: Frode.

Frode Hegland: Yeah. First of all, in terms of what to read: if I already knew, then it would be boring for me, so I'm very, very happy to be entirely in that mode when we're in those sessions. I have absolutely nothing to contribute, but I'm greatly looking forward to it. I do like themed sessions, so we don't go all over the place, even though of course we'll have some flexibility. Also, a really brief update on my software. I've had an issue, and I tried to have it done by the time people came over on Friday, but the developers didn't manage to do it because they were confused. When writing in Author on Vision, the margins are quite wide, which in a sense I wouldn't have thought of as an issue. But in practice, if I am in AR mode, I really want to have a clue of what's going on around me, so I want the information area to take up only its own space, no clutter with a margin or anything. But then it turned out that even when I'm at home in my library, in a fully synthetic environment, I still want that. I still don't like margins; margins are very much a normal-computer thing. It's something I would never have guessed.

Frode Hegland: So they managed to do that. For those of you who have Author for Vision on TestFlight, you can now try it, and it should be available very, very soon, because we're not trying to do that many more things. The next thing we're trying to do is things like having the first item in the context menu be undo, because it's very easy to make a mistake while writing, and there is no undo in Vision Pro. I cannot say it to Siri; there is no button for undo. So this is again exciting for a developer to realize. What the heck. Oh, final thing, and this is recorded: I got Leon today to go into Reader in Vision, and he found that even when he's reading a story, he prefers the view that you first talked about, Dene, where you have many pages open. Remember, Reader in Vision doesn't curve, so it's just rectangular. He had something like three or four pages open at the same time, but you can easily slide; you don't have to do the page thing. So as a related experiment, for a super simple, barely interactive thing, it's very nice for beginning to test how it feels for people.
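
Since visionOS provides no system-wide undo to lean on here, the app has to keep its own history. A minimal sketch of the kind of app-level undo stack such a context-menu item would drive (TypeScript; hypothetical names, not Author's actual code):

class UndoStack<T> {
  private past: T[] = [];
  private future: T[] = [];

  // Call before applying an edit, passing the state being replaced.
  record(state: T): void {
    this.past.push(state);
    this.future = []; // a fresh edit invalidates the redo chain
  }

  // Returns the previous state, or undefined if there is nothing to undo.
  undo(current: T): T | undefined {
    const previous = this.past.pop();
    if (previous !== undefined) this.future.push(current);
    return previous;
  }

  redo(current: T): T | undefined {
    const next = this.future.pop();
    if (next !== undefined) this.past.push(current);
    return next;
  }
}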

Dene Grigar: I just put two things in here. Margins exist for print publication needs. There's a wonderful book; I used to teach a course called Language, Texts and Technology, and it's about the history of communication, writing, oral communication through time, print, electronic, and lots of book history stuff. Right. When indexes were introduced, when spine writing was introduced, these conventions of print. Steinberg's book is fantastic; it lays out every step of the way. We had 500 years to develop the conventions of print. We've had, what, 40 years for the conventions of electronic text? So it's tiny; we're just starting to think about these things. But we don't need margins, and we don't need pages in a virtual environment; all of that goes away, because all I need are words on the screen. I just need words in front of me. But we're stuck with pages, because that's the convention of the tools that we currently have. So.

Frode Hegland: My perspective is a bit adjacent to that: when I'm reading on paper, I massively enjoy a margin, because it keeps out the external environment of the table or whatever; it gives a space for the text to breathe. But then I'm a pretentious type designer, so maybe I have a need for that. And of course, when I'm reading something for work, it's really nice to be able to write in the margins, though in a digital environment that could obviously be a layered thing. Yeah. We'll be finishing properly today, Peter. And when it comes to pages, I agree that the pages created by the publishing process should be entirely throwawayable and reusable. But I really, really think that, at least for people like me, having a rectangle, a defined shape to read, a unit of knowledge, is really, really powerful. And when I asked Bill Atkinson, I can't believe I can say this, it's so cool, but I texted him asking why he made HyperCard card-based rather than scrolling, he said it's because he wanted people to have bite-sized chunks of information. I really like that. Dene, please.

Dene Grigar: Yeah. I think what really influenced the way I'm thinking about this project is that working in the multimedia performance space, in a room that gave me a 12-by-12-by-12 space with six-wall projection possibilities and a 360-degree experience with the technology, I realized I didn't need a space like this anymore; I could communicate with a lot of media. That's why even my definition of text is so broad. We worked with video, animation, 3D models; we worked with sound, music, voice, gesture, smoke machines, robotic lights. And then we had projections, and all of that becomes text, all becomes kind of collapsed, in an environment that's reading everything as digital, right? Digital objects. We were using a Macintosh for all the multimedia production, using Modul8 and other software, and then using the PC to run the game system, and those two talked to each other. And on top of that, Frode, my lab spoke to the lab in Victoria, BC, where Steve was, so that my gestures in my lab were controlling his media in his lab, and vice versa. And we could see each other through iSight, back in those days, the iSight technology, and later the camera, so that we could perform at a distance with each other at the same time. The lag was so imperceptible that it didn't seem like there was any lag at all.

Dene Grigar: And so that experience changed the way I think about what I want to do in this space, and I can't go back. It's kind of like when I learned to code HTML: I quit being able to lay out print pieces for brochures anymore. It made no sense. The limitations were so exacting; the colors were limited to a four-color separation system, as opposed to the 256 colors of those early days, and now so many more. And once your brain has changed, you cannot change it back, and I think that's a good thing, right? So I keep pushing this idea that we want to go beyond what we're currently used to, because that's part of what the grant allows us to do. But I'm also aware that we don't have the right tools to do it. So how do we go from point A to point B in two years, when print has had 500, right? I mean, that's the spiel, and that's exciting; to me, that's the exciting problem to solve. It's a solvable problem. It's fun to think about, and I do lie in bed and think about how to get there, how to break it apart. Yes, Steinberg, Five Hundred Years of Printing. Thank you, Mark. Fantastic book. And there's another one called The Book that I also use; it's all about the book and it's very detailed, less theory and more history.

Speaker1: Anyway.

Frode Hegland: One of the things I find exciting about new media is absolutely how they can expand our experience. But I don't think new media kill older media, and I don't think they should. So we should absolutely have richly interactive new forms too. Is that the cover of the book?

Speaker1: Okay, I want to get the book.

Frode Hegland: Yeah, yeah, he's a friend of mine, Keith Houston. He's absolutely brilliant, and a very good guy. But, I mean.

Dene Grigar: Even the cover of it is kind of cool; it shows the layout, the structure of the text, and where the page came from. A really terrific book, more for graduate students than undergrads, but it helped me put together lectures and stuff. Thank you, Mark.

Speaker1: Yeah, Randall’s got.

Dene Grigar: One in his hand, too. Randall, what are you holding?

Speaker1: I do it’s.

Brandel Zachernuk: It’s a little obtuse. And unfortunately it’s it was self-published, I believe it’s called shift happens. And it’s I’ve got a couple of copies it doesn’t have. It’s. The cover is just printed in gloss over the matte, so. But it’s it’s really beautiful. History of typewriters and word processors and that’s the first typewriter prototype was a single key attached to a morse code key. And then Remington got to build the first production typewriter off of the back of off the back of their sewing machine business. And so the first commercial typewriter looks mostly like like a, like a sewing machine rather than anything else. And it took, you know, ten years to put the turn the page around to be the right way so that people could actually look at it while they were typing. And there are all kinds of arguments that there goes. No. Is that I mean, that that shows you the kind of colors of the and industrial design of the. So yeah, it’s and and and and a form factors, you know, like these the profusion of random designs and ideas about how you might sort of strike ink to paper is just incredibly important for understanding the silliness that necessarily needs to start an industry and a process. And it’s it’s really exciting to realize that we must make those mistakes and that. It. And collectively we’ll we’ll settle down eventually. But Yeah. Unfortunately, it’s I think it’s a completely sold out, and I doubt that he will print another version unless he I mean, he could try to throw it in or something, but yeah, I don’t think he has any aspirations for it. So I have two copies. But but I don’t know if anybody fancies my sending it around to anybody.

Speaker1: Yes.

Dene Grigar: I apologize; as you know, the ragtop on my convertible was torn and I had to order the replacement, $4,000, and it arrived today, and they want to do it next week while I'm in Victoria. So I have to call them back. But yeah, I'm finally getting that fixed. Sorry about that.

Speaker1: I get your.

Mark Anderson: Point when you talk about the book. And I should take a look at Steinberg. Keith. His book I came to the Future text a few years back. Loved his stuff. Also his shady characters, the one he wrote about the origin of the Octothorpe and things like that is also very readable too. But it occurs to me, one of the things I’m thinking at the moment is paper I’m doing is you’re absolutely right. We’ve had 500 years to get to all sorts of usual nostrums in print, and I suppose the question I find myself asking is which of those are actually useful in an immediate sense, which, you know, if you had to run out of the. But which one would you take out of the burning building? You know, would would you, would you stay for footnotes? Would you do you need pagination? Which of these things are obvious and comfortable to us because we have a cultural knowledge of them? Whereas which are actually structurally still have an importance in a new medium. And I don’t see that. I don’t see that as a bit of both. I don’t see as a sort of a kind of a binary or a pejorative judgment. But I as I think about it, it’s really, really hard to try and imagine not having these things and then sort of coming back and having to reinvent them because they all seem so obvious. So I, you know, I don’t know I don’t know the answer to that, but I think it’s a very interesting one for us to start looking at, because I think it’s very pertinent to the problem that’s in front of us.

Frode Hegland: Yeah. I mean, it's kind of funny, just to be a little bit provocative, something I'm trying to do less because I'm very good at doing it very badly. But in a way it's kind of funny to be, at the same time, fetishizing beautiful print and saying we should get rid of the printed page in XR. Of course we should have entirely new experiences in XR too. But in many cases, again provocatively, though I think you'll all agree, there are many things from print we can take into XR. So I think it's really important that we don't end up polarizing these two positions, but do both. I mean, the act of reading a traditional book, beautifully done in XR. So then, yeah, actually, why don't you speak? It shouldn't be only my voice.

Dene Grigar: Yeah. I think the answer to your question, Mark, is that anything that is required for collection and distribution across physical media can be ignored. Margins were so important, right, because you have to be able to put things behind things, so the bindability of text was important. And then, Frode, to speak to your comment about wanting to write in the margins, to make annotations: you don't need margins for that, because you've got ample space. That's been my argument all along: I want to write all over this stuff. I want to be able to have all the space I want; margins are so limiting to me, there's not enough space in them. Think also about footnotes. Right now I'm finishing a historical book on Winthrop Bell, a Canadian who figured out back in 1919 that the Nazis were going to kill Jews and was warning the Allies, and nobody was listening to him. An interesting, interesting thing, but all the notes are in the back of the book, because readers supposedly don't want to see those notes; they have to flip to the end of the book. That's not necessary in this environment. So there are a lot of things, I think, related to the physicality of pages and books that we need to start moving away from, and this is what Apple is doing with the Apple Vision Pro, right? But how can we push forward the textual experience, as opposed to the usability of the system, which is what they're doing? And Frode, just to speak to your software: I think you're on to something with the way you're thinking through these things, but I would want to just make the page disappear in that space and have just the words there. Right. Okay.

Frode Hegland: Thank you. I don't feel the same way, because words floating in space get messy, in my experience and from my perspective. I should be able to tear things off, like you talked about; I completely agree, and I'm experimenting with how to do that, no question. But, you know, 5,500 years of refining the graphic iconography of writing: I don't think throwing that away is useful. There are a lot of babies that can be thrown out with the bathwater, though of course we shall experiment with it. Anyway, this is a nuanced discussion; the only thing I reacted to was when you said anything that is required for collection and distribution across physical media can be ignored. Of course, physical media conventions can be ignored to different degrees, yes, and I look forward to further dialog on what that will be. Mark and Brandel.

Mark Anderson: Yeah, just to reinforce the point: I don't think these are binary choices. Part of the reason to be bold about thinking about what one might first leave aside, it's not so much about throwing the baby out with the bathwater; it's just saying, let's start thinking. I do find it incredibly difficult, because when I'm trying to do something very flexible, I keep finding myself using paper-constraint notions that are terribly useful and easily accessible to me, but then pull me off in the wrong direction. Text on paper, and the whole design and enjoyment of that; they're different things. In some cases I may want to see, in a sense, a facsimile of the book experience in a virtual space, because I can't get to the original or can't see it where I am. But in other cases I don't want that. It doesn't mean it's got to be read in a monospaced font or something; it doesn't have to be an awful experience, but I might actually want to be doing something different. And I take something from the experience of first using a Kindle: I so railed against it, because I didn't know where I was in the book, and there was just a sort of percentage thing. But now I just pump the font size up, take my reading glasses off, and read. But those are novels; they normally don't have illustrations, they don't have any of that, and I'm just trying to interact with the text. So in that case I've lost nothing from the book, apart from maybe a nice cover. So I think we can be quite open-ended, and I'll stop there.

Speaker1: Yeah. No.

Frode Hegland: Okay, just really briefly. Brandel, I'm so sorry. This is something that I've really asked the community about quite a lot: we call it an exploded book, a pop-up book, all of these things. Even if we sketch out ideas on paper of how a book can be, this is a really great time for that, because one of the things we'll look into now is how to read a document, and it doesn't have to be based only on ACM giving us a flat thing; it can also be based on our idealized form. So when I'm discussing this back and forth, please understand I'm coming from a perspective of great support, and I would love to see more perspectives on this. Yeah. Sorry about that.

Brandel Zachernuk: No, not at all. I wanted to say two slightly opposing things. One of them is about what skeuomorphism actually is. It's not, you know, shaded corners or buttons; it's to do with borrowing the functional language of a medium or a substrate that something no longer exists upon. In that context, skeuomorphism definitely was shaded buttons, at a time when we didn't need the shading because buttons weren't recessed or debossed out of a physical monitor screen, and the top-left corner of windows and stuff; that's all a callback to that. But there are other ways that things can be skeuomorphic as well, in that if you look at an iOS screen now, it maybe has more in common with a printed page than with spatial, dimensional stuff. So actually not shading stuff on visionOS is arguably skeuomorphic, because there is meaning in shading; there is meaning in depth and shadows and all of those kinds of things, and so a rejection of that is itself skeuomorphism. That's one of those things that designers are going to learn the hard way, because they are knee-jerk in many regards: they look at the fashions and try to figure out what things mean based on the vibes, rather than necessarily thinking about what kind of information exists for people.

Brandel Zachernuk: So that means that thinking about what actual dimensions are real and what jobs they do for us is important, rather than merely what appears to feel modern; otherwise we're going to be stuck with that same electric blue color for the next 40 years as well as the last. But the other thing I wanted to talk about was margins, and to come to their defense a little bit. Because if you look at Apple.com, it has just inordinate quantities of white space, or tasteful pale gray space, or sometimes black space, around all of its type. One of the things that does is give the impression of simplicity; it gives people room to process one thing at a time on its own. And space does that. You can remove a single style sheet on most Apple.com pages and it will completely disable the styles, and it'll look like Wikipedia. Thankfully, in most cases that also means it will still read properly: the HTML is not so munged that the actual semantic elements are out of sequence in a way that loses its semantic relevance as a document.

Brandel Zachernuk: But it's remarkable, the difference between the two, because it just looks like another page. The apparent simplicity at some points is illusory, and sometimes the actual whitespace really contributes to it. So there are a lot of things that might be there because we needed to stitch the damn pages together in the first place, but they have come to do very important mechanical things for us in terms of our processing. So, back to whether a page exists or not: we don't necessarily need to have stuff on pages, but what we do need is a vastly reduced visual contrast behind type, as opposed to on type, in order to make sure that we have the differential capacity to recognize glyphs and the shapes that they have. Whenever you see type rendered either in real life, in meatspace, actually written on acrylic or perspex, or in a digital context where type is just floating in space with nothing backing it, you do find that it's remarkably challenging to process, and you need to get your head around it in order to figure out which things are at which perceptual depth.

Brandel Zachernuk: So it's an interesting discovery: what are the constraints that have been doing hidden work for us, versus things that are only there for the benefit of the mechanical production of reading? And one of the things this typewriter book has been really interesting in telling me about is that we probably didn't have a lot of experience with monospacing until typewriters, because there wasn't really a significant need for it. So a lot of the conceits that are baked into Courier and other fonts were maybe completely alien at the time, and we've only clawed back the idea of variable-width fonts in the last 40 years or so; obviously newsprint had them for a long time. Just this idea that sometimes the conceits of mechanical efficacy and expedience have pervaded our understanding of something so deeply that we don't realize how much we've had to bend over backwards to facilitate them is interesting, but it also prompts the question: what else did we get out of monospacing, or these other mechanisms, that we're obliged to rediscover the need for once we have the ability to throw them out?
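
A minimal sketch of that backing idea, in three.js terms (whether the project uses three.js directly is an assumption here, and the sizes, colour, and opacity are illustrative):

import * as THREE from "three";

// Give floating type a dimmed backing panel so glyphs read against a
// controlled background rather than whatever the room provides.
function makeBackedTextPanel(width = 0.6, height = 0.4): THREE.Group {
  const group = new THREE.Group();

  // Semi-opaque dark backing, slightly larger than the text area,
  // placed a few millimetres behind it to give a clear perceptual depth.
  const backing = new THREE.Mesh(
    new THREE.PlaneGeometry(width * 1.1, height * 1.1),
    new THREE.MeshBasicMaterial({ color: 0x111111, transparent: true, opacity: 0.85 })
  );
  backing.position.z = -0.005;
  group.add(backing);

  // The type itself (e.g. a canvas texture on a plane) would be added
  // here at z = 0; omitted to keep the sketch focused on the backing.
  return group;
}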

Frode Hegland: Sorry, I was muted.

Speaker1: Yeah, yeah.

Frode Hegland: On the margins, of course, as I mentioned towards the beginning of the call, in Author I have removed the need for margins, just to make it clear that on that side of it I completely agree, because it should be digital. Sorry, Zoom is being annoying. You should also, of course, be able to have massive margins should you want to: if you're working on a specific piece of text where you want a lot of margin, you should be able to just resize it. So I think what we're talking about is end-user choice, flexibility, and a rich working environment.

Frode Hegland: You know, that’s really, really important. Yes.

Dene Grigar: If we think about it, what's beautiful about the book is its ability to document, right? That's the point of it. And we all can read the same thing: Mark can share a link to a book, we all can read it, and it doesn't change. It's finite, and so the information is documented. This is the beauty of the book. So if we consider that, what we're building in this environment is not so much bounded space; we're going to have a kind of invisible boundary. I can have as much as I want in the space, but when it comes time to share it, all of that becomes documentable, to make up a word. It's documentable, and it can be reproduced by a printer or whatever; it can be saved, from that perspective. But what this means is that when I'm working in the space, I am freed from the limits of the eight-and-a-half-by-eleven page, which I find limiting as hell.

Dene Grigar: So I'm not suggesting we get rid of the book as a practice. But the physicality of the book, in a space where that physicality doesn't make any sense, we can let go of; the functionality should be held onto. And this goes back to the history of scrolls versus books. Scrolls were the first books; Homer was not written as a book, it was on scrolls after it was written down. You can see images of what scrolls looked like in the cubbies where libraries held them, categorized in a specific way, but they also took up awkward space, and the book was much more convenient than the scroll. So there were a lot of conveniences, but we didn't get rid of the practice, the functionality, of the scroll as it was used to relay and share information; we fixed the functionality problem in this new environment called print. And that's what I'm talking about here.

Frode Hegland: Yeah, absolutely. I mean, at one of the early Future of Text events I had the chief people in charge of cuneiform and Egyptian writing from the British Museum, and it was really fascinating to have them kind of fight, jokingly, about where the first writing was. So yeah, that was fun; there's a lot of discussion there. But a little bit on Mark's comment earlier: why does it have to be a book? This is something we've been discussing a lot in terms of our language recently. I think we're very lucky because we are dealing with academics, so we can use an even worse term that has more meaning, and that is of course "paper", even though it isn't made of paper. But then, the word "book" comes from the beech tree anyway, so that in itself came from something entirely different. So it is really important that we can define what we mean by a unit of knowledge, and it's something I really think we need to keep talking about: is the unit of knowledge the paper, the proceedings, the book, the paragraph, the sentence, the glossary, the dictionary? Is it in between? Is it what connects them? These are really, really exciting questions. Over.

Dene Grigar: Frode, we're going to have to leave in just a few minutes, Andrew and I, so maybe Mark, and then we'll take off.

Speaker1: Mark.

Mark Anderson: Okay, I was just going to say, an interesting aspect here, this whole thing about books and documents, is it's funny how rapidly we've applied the word "document" to, well, is it one file? Is it lots of files? Is it the input file? Is it the output file? And funnily enough, we tend to actually talk about the output file. An interesting aspect of the flexibility Dene has been referring to is: why can't I have a document, a thing, that has an output form that stays immutable, insofar as a PDF can be considered immutable, but where I can still have that same document in a perfectly reflowable form, because I'm not trying to interact with the copy of record? That needs to still be there. So there's an interesting question about what the document is, what the book is. Obviously with a book, when I make it, it has a physical form and there's no argument about it; of course, that's until it gets republished and looks completely different, at which point even that begins to fall away.

Mark Anderson: So I think there's some flexibility still to be had. I certainly think we need to understand the relationship between the stored media, in the digital sense, and what eventually gives us the form of the document we interact with and what it means. I'm beginning to see it as a collection of things: almost a wrapper around a number of instances, a number of outputs, visualizations, call them what you will, of that content, and you need all of them, though not for every purpose. And that's the other problem: it's not a thing that runs across everything. If you're going to write a simple kids' story, you probably don't need 15 different versions of it; okay, you can have pop-up things, but you haven't got to have the depth of something compared to, say, a monograph on Victorian butterfly collections, where there are going to be lots of anecdotal pieces all seamed in together. Yes.
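
Mark's wrapper could be sketched as a data shape, purely illustratively (TypeScript; all names are invented for the example):

interface Rendition {
  kind: "pdf" | "reflowable" | "audio" | "xr-scene"; // possible output forms
  immutable: boolean; // true for the copy of record, e.g. the PDF
  uri: string;
}

interface DocumentWrapper {
  id: string;    // one stable identity across all renditions
  title: string;
  renditions: Rendition[]; // the instances, outputs, visualizations
}

// The copy of record stays immutable while other renditions can reflow.
const monograph: DocumentWrapper = {
  id: "doc-001",
  title: "A Monograph on Victorian Butterfly Collections",
  renditions: [
    { kind: "pdf", immutable: true, uri: "papers/monograph.pdf" },
    { kind: "reflowable", immutable: false, uri: "papers/monograph.html" },
  ],
};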

Frode Hegland: Which is, of course, the entire goal of Visual-Meta. So I fully agree: it should be in a solid medium, but you should be able to extract it and do whatever the heck you want with it.

Mark Anderson: Except, in a digital medium there is no solidity.

Frode Hegland: Yes, and that's what PDF is. But you can then make it liquid on lifting it out. It should be stored in a solid way, so you don't have to worry about link rot and all those things.
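
For illustration of the solid-but-liftable idea: Visual-Meta's general approach is to append human-readable, BibTeX-style metadata to the end of the otherwise frozen document, roughly along these lines (an approximation of the idea, not the exact specification; the entry values are invented):

@{visual-meta-start}
@article{hegland2024example,
  author = {Hegland, Frode},
  title = {An Example Paper},
  year = {2024}
}
@{visual-meta-end}

A parser that recognises the wrapper can lift the document into a liquid, reflowable form, while a plain PDF reader simply shows it as ordinary text on the last page.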

Speaker1: Yeah.

Mark Anderson: You do have the problem of not wanting to read it, though. In implementing it properly, you have the difficulty that you don't want to lose it; but if you did go and, say, print it out, or read it in the space, you're not going to read the Visual-Meta itself. People will not thank you if it comes along with the document. I mean, this is a classic digital problem.

Frode Hegland: We've gone through this many times, and you don't have to print all the pages of a document. I agree on that, and all I was trying.

Mark Anderson: Well, except you can't; you don't have the choice. You're right, but the reality is most people only have the choice to press a print button, and it all comes out. So unless the technology.

Frode Hegland: These are the last minutes of Dene and Andrew's time, and we've discussed this for literally six years, right? So we agree on this completely: you should be able to have stable storage and completely flexible reading. We agree on this 100%, and there are issues surrounding it; you can just reread my thesis on this point. Awesome. We're on the same page, literally, even though we don't want pages. Dene and Andrew, anything quick before Wednesday, or anything long? It's your time; we're good for another ten minutes.

Dene Grigar: Andrew.

Andrew Thompson: I'm aiming to have a demo if I can get the map stuff working. It's got more hurdles than I expected, but that's normal with programming. The unfortunate thing is, if I don't get it by this Wednesday, we're probably not going to have one for two weeks, because I'll be gone all of next week. Hopefully I can attend the Wednesday meeting; it depends on how much it overlaps. When I'm there, I've got to prioritize the exhibition, so I might not get a lot of work done on this project for a week. Just expect things to slow down a bit on my end.

Dene Grigar: Yeah, I don't think you'll have time for any production, but on Wednesday morning I suspect we can get up and make the meeting until at least 9:00. We'll have breakfast, because our condo has a kitchen, we'll have breakfast there with coffee, and then we'll walk over to the exhibition. We'll leave about 9:30 in the morning to walk over to the campus from the campus housing; it's about a 15-minute walk, a lovely walk.

Andrew Thompson: So we'd get, like, half the meeting in, then.

Dene Grigar: So I would like them to see you first in that meeting. That way, when we need to leave, folks can talk about other things. But I do suspect we'll be there for at least the first hour.

Frode Hegland: Yeah, no, that's perfect. And Andrew, absolutely, I have no concerns about what you just said. I think we're really now in a design phase, to look a bit more at what the map, to use the previous language, should be, and also at the document in front of us. Because you've done so much implementation work, the entire community really just needs to talk about that. There isn't anything that's rush-rush for you to do, so thank you for saying that, but it makes absolutely no difference to the success of the project.

Andrew Thompson: The current version still has the map as a separate space. I know we're moving away from that; we want it all integrated in the same area. It doesn't do that yet, but that'll be a next step.

Speaker1: It’s all.

Frode Hegland: Testing. Doesn’t matter at all if we can just play around with stuff. Fantastic. So very good. So thank you guys. And I’ve been talking. One of the things I’ve been talking to the people this weekend is about Washington, about coming there for the future of Texas, as many of us as possible. We hope to also spend a few days in the Bay area, so we’re hoping to make it a grand adventure, but we should all see each other in about a month if possible here. So thank you all and see you at Wednesday at least part of the time. Bye for now.

Speaker1: Bye.

Chat log:

16:02:40 From read.ai meeting notes : Frode added read.ai meeting notes to the meeting.

Read provides AI generated meeting summaries to make meetings more effective and efficient. View our Privacy Policy at https://www.read.ai/pp

Type “read stop” to disable, or “opt out” to delete meeting data.

16:34:51 From Frode Hegland : https://www.nytimes.com/interactive/2024/06/03/opinion/covid-lab-leak.html?smid=nytcore-ios-share&referringSource=articleShare&u2g=c&pvid=393C046E-A842-4212-A267-597CE3B1398A

16:38:27 From Mark Anderson : I'm in now to NYT, I had to do the make-an-account bit.

16:43:20 From Peter Wasilko : Wands or Rings are possibilities too!

16:43:22 From Andrew Thompson : Can I clarify, do you want me developing two different projects in conjunction, or do you see someone else doing this one?

16:43:23 From Frode Hegland : I’m sorry

16:44:03 From Frode Hegland : Adam mostly gestures I think.

16:44:09 From Andrew Thompson : Reacted to “Adam mostly gestures…” with 👍

16:44:25 From Andrew Thompson : Adam seems excited about gestures, I think he would do a great job experimenting with this

16:44:31 From Frode Hegland : Reacted to “Adam seems excited a…” with 👍

16:45:54 From Mark Anderson : So, I think we are (a few minutes back) positing shortcut gestures, or particular gestures addressing shortcuts. Both as means to access what in a web browser would be a list of clickable links. My thought is where the link between the two occurs. IOW, do I manage the shortcuts in or outside the VR?

16:52:04 From Peter Wasilko : https://archive.org/details/ScarabOfRaMacintosh

16:53:36 From Dene Grigar : Forth

16:54:04 From Dene Grigar : Amnesia was built out of a language built on Forth

16:54:35 From Dene Grigar : wand like a stylus

16:55:09 From Andrew Thompson : Peter, the Quest no longer requires a Facebook account like in the past. Enough people complained about that requirement so they removed it.

16:58:00 From Frode Hegland : Reacted to “Peter, the Quest no …” with 🔥

17:01:48 From Andrew Thompson : Mark, you bring up some good points with gestures, but it’s kind of hard to speak on at the moment since it’s just the idea of ‘gesture’ rather than a specific example. I can code a bunch of gesture recognition just fine, but how they actually feel to interact with is unknown until we try it.

17:01:54 From Peter Wasilko : Here are the core driver productions for my parser work: Start = m:Content+ .* { return { production: 'start', content: m, location: location() } }

Content = m:(RecognizedContent / Run) { return m }

Run = m:$(!EndTag !RecognizedContent .)+ { return { production: 'run', content: m, location: location() } }

17:04:07 From Peter Wasilko : Actually, the .* at the end of the Start rule is superfluous and can be pruned out.
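
Applying that note, the pruned Start rule would presumably read as follows (same PEG-style syntax as the paste above; an inferred edit, not Peter's own follow-up):

Start = m:Content+ { return { production: 'start', content: m, location: location() } }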

17:04:38 From Mark Anderson : Replying to “Mark, you bring up s…”

Quite understand. I’m just exploring the best ways to communicate from the lay perspective to the (code) implementation PoV. IOW, more productive discussion! 🙂

17:05:04 From Andrew Thompson : https://handy.stewartsmith.io/

17:05:27 From Peter Wasilko : I pasted the old version of the code.

17:05:39 From Andrew Thompson : Replying to “https://handy.stewar…”

Handy.js, a library for webxr hand gesture recognition. Can save custom gestures and recognize them later.

17:05:54 From Dene Grigar : Here are the gestures we developed for the multimedia performance piece “Curlew”

17:06:17 From Dene Grigar : It is the work that was commissioned by the OLE.01 Festival in Naples in 2014

17:11:46 From Mark Anderson : OLE.01 in Electronic Literature Knowledge Base

: https://elmcip.net/knowledgebase?page=38%2C0%2C227&order=title&sort=desc

17:14:12 From Dene Grigar : Buxton

17:15:02 From Mark Anderson : Brandel’s “What do prototypes prototype?” https://www.youtube.com/watch?v=tMr33zFj3YA

17:15:10 From Brandel Zachernuk : What do Prototypes Prototype? 

Stephanie Houde and Charles Hill 

https://hci.stanford.edu/courses/cs247/2012/readings/WhatDoPrototypesPrototype.pdf

17:17:43 From Brandel Zachernuk : And this is Buxton: https://www.amazon.com/Sketching-User-Experiences-Interactive-Technologies/dp/0123740371 (I can’t find a good non-amazon link unfortunately)

17:18:14 From Dene Grigar : Replying to “What do Prototypes P…”

thanks

17:19:12 From Brandel Zachernuk : Replying to “What do Prototypes P…”

Ooh I never looked just in that path, but there are probably some other neat readings in here: https://hci.stanford.edu/courses/cs247/2012/readings/

17:19:32 From Peter Wasilko : I still have my 512K E and custom luggage for it!

17:20:53 From Dene Grigar : Replying to “I still have my 512K…”

hang on to it.

17:21:12 From Dene Grigar : Margins exist for print publication needs

17:21:49 From Dene Grigar : I’d like to erase pages

17:21:56 From Peter Wasilko : Replying to “I still have my 512K…”

It still boots!  But the external drive accessory won’t eject properly, so I consider it to be the world’s smallest hard disk drive.

17:21:58 From Frode Hegland : Width visionOS

17:23:08 From Mark Anderson : @Dene Grigar  which book (by Steinberg)?

17:23:08 From Peter Wasilko : I am going to have to drop off at the bottom of the hour for routine transport duty.

17:23:15 From Dene Grigar : 🙂

17:24:32 From Frode Hegland : “I was thinking of people sharing different domains of knowledge, and wanted to keep the interface bite-sized and easier to comprehend.

Sorry to be slow to respond. I am sailing in Bora Bora and mostly offline.“ Bill Atkinson text

17:24:45 From Mark Anderson : Reacted to ““I was thinking of p…” with 😀

17:25:57 From Peter Wasilko : https://en.wikipedia.org/wiki/SK8_(programming_language)

17:27:07 From Mark Anderson : Steinberg, “Five hundred Years of Printing”: https://www.amazon.co.uk/Five-Hundred-Years-Printing-Steinberg/dp/0486814459

17:28:08 From Mark Anderson : Keith Houston “The Book” love this one: https://www.amazon.co.uk/Book-Cover-Cover-Exploration-Powerful/dp/0393244792

17:29:12 From Mark Anderson : Definitely get the hardback book of “The Book”

17:29:57 From Peter Wasilko : https://shifthappens.site

17:30:08 From Mark Anderson : Shift Happens: https://shifthappens.site

17:30:17 From Mark Anderson : Replying to “https://shifthappens…”

Oops!

17:32:21 From Peter Wasilko : https://shadycharacters.co.uk/books/shady-characters-the-book/       See you on Wednesday!

17:33:50 From Dene Grigar : I think anything that is required for collection and distribution across physical media can be ignored

17:39:32 From Mark Anderson : Provocation: why does it have to be a ‘book’ <g>?

17:39:49 From Frode Hegland : It has to have a boundary, call it anything you like 🙂

17:41:06 From Mark Anderson : Reacted to “It has to have a bou…” with 👍

17:42:39 From Mark Anderson : <<< This. Yes this is why I want re-flowable text over source design for AR reading.

17:44:06 From Dene Grigar : My concern is that those I am teaching  aren’t reading books today as much as they did in the past. They are reading a lot of text but not books

17:44:32 From Mark Anderson : Think of the recent PC-era issue over different ‘types’ of quote characters.

17:45:31 From Frode Hegland : Boundaries also maybe massive margins, when called for. I have removed necessity for margins in Author of course

17:51:13 From Dene Grigar : I like the word “wrapper”

17:51:28 From Brandel Zachernuk : I am really excited by Jon Bois’s documentary videos and the information spaces they entail: https://www.youtube.com/watch?v=NqqaW1LrMTY

17:51:47 From Dene Grigar : Replying to “I am really excited …”

thanks
