Transcript: 7 Jan 2022

Video: https://youtu.be/TbS-GqULI7M

Chat log: https://futuretextlab.info/2022/01/11/7th-of-jan-2022/

Peter Wasilko: [00:00:03] Good morning.

Frode Hegland: [00:00:05] Hello, Peter.

Peter Wasilko: [00:00:07] Well, everything's snowed in over here.

Frode Hegland: [00:00:09] Oh, OK. Yeah, it's just cold here. Not super cold, but yeah, New York has incredible temperature swings, I remember. Well, not as bad as when the campus would ice over and you'd get black ice on the way to the law building; that was just brutal up at Syracuse. Did you ever hit that patch when you were going over by the Geology building? Yes, I think so. Yes. Everything would look fine, and suddenly your feet would just be out from under you. I seem to remember that, and I also remember going to the computer cluster in my shirt in the evening, parked the car outside and worked all night, came out in the morning and the car was snowed under. I will never forget that; it was a serious temperature change, it really [00:01:00] wasn't cold when I went in. Yep, I hear you. Hi, Mark, we're reminiscing about the weather in Syracuse.

Mark Anderson: [00:01:08] Yes, my father was working in New York when I was between about seven and 11, so I remember New England winters. That was the first time I'd encountered real snow, more than just about this much.

Frode Hegland: [00:01:25] Yeah, we don’t get that much here, do we?

Mark Anderson: [00:01:30] I mean, not in the South. It's unusual. I mean, the last really big time was about nineteen sixty something, which I sort of vaguely remember. We made igloos, you know, there was enough snow for that, but normally it's gone in a few days down south. A Mediterranean climate?

Frode Hegland: [00:01:51] Yeah. And so I'm trying to get my paper in today, and of course, today something happened to my server, so emails are not happening at all. Which [00:02:00] is very curious. They are working on it. And, you know, document issues, and you know how it is: the more important the document, the more goes wrong.

Mark Anderson: [00:02:11] Are you sending the thesis in?

Frode Hegland: [00:02:15] I sent it in earlier today, I think, for them to have the final view.

Alan Laidlaw: [00:02:21] Oh, I see, right?

Frode Hegland: [00:02:21] Yeah, to see if there's anything glaring, and then I have to send it in Sunday. That's the latest.

Mark Anderson: [00:02:28] So maybe there is a sort of, I think it's called Drops; there basically is a thing that's a bit like that, in fact. And I think PDF Tracker, I'm not sure whether you put it in there as well. But there are things, if you can find the secret office somewhere on the Southampton system,

Frode Hegland: [00:02:45] Oh, it's just PDF Tracker. That's what they want.

Mark Anderson: [00:02:48] All right. Okay, so that has an upload facility in it. So.

Frode Hegland: [00:02:53] Yeah, exactly. That's what I'm trying to do. Oh, OK. But there are other issues, such as this nonsense about having all [00:03:00] these pages before the table of contents. Right. If I do that in Acrobat, that means the page numbers will be off.

Mark Anderson: [00:03:10] Uh, you know, well, I'm sure there's a long-term play with that, but I'm sure you used to be able to set custom page numbering. It's just, you know,

Frode Hegland: [00:03:19] I'm not going to have 200 custom pages.

Mark Anderson: [00:03:23] No, no, no, no. I can't remember, but in a different form of life I do remember knowing about these things, and I'm sure... I mean, you probably can't do it in most other PDF editors. But you know, it's another bit of sort of secret sauce, I think.

Frode Hegland: [00:03:40] Yeah, and it's still a pain in the arse.

Mark Anderson: [00:03:43] I agree.

Frode Hegland: [00:03:45] I didn't use LaTeX for it. No, I find it abhorrent, the fact that it needs to exist. I think it's almost as bad as that PDF needs to exist. I mean, it took [00:04:00] only about two years to wrap my head around it. Yeah, yeah, yeah. No comment.

Mark Anderson: [00:04:07] No, I mean, the thing that finally drove me to it: I think I got to my nine-month report and realized that I hadn't actually mastered LaTeX enough to be able to do it, and at the eleventh hour I had to put it all into Word. And that just reminded me that, you know, friends don't make friends use Word. So from then on it was LaTeX. But, you know, other tools are available, as you're proving. One interesting thing that's sort of pertinent in a future-of-text sort of way is the fact that we haven't really, another thing that we haven't sort of formalized, is the fact that, OK, you have a paginated document. I mean, you know, in an idealized world you wouldn't worry about it at all, because somehow there'd be a sort of formatting layer that would just say, take this stuff, lay it out the following way. But whilst we're in the world of PDF and paginated documents, I'm sort of surprised there isn't a bit of formalism to say: here to here, front [00:05:00] matter; here to here, whatever; here, you know, Visual-Meta at the end, do not print by default, for the sake of argument; and so on and so on. I mean, because in a sense, to me, that need already arrived some years back, and we still don't have it in a way that is understood, you know, because it needs to be there so that PDF tools or PDF readers can understand it, and PDF print mechanisms can understand it, and we get to a nicer place.

Frode Hegland: [00:05:38] Yeah.

Mark Anderson: [00:05:40] The curse of the print age,

Frode Hegland: [00:05:43] I don't think... I'm not sure if that's a curse, but

Mark Anderson: [00:05:48] No, it's an inconvenience when

Frode Hegland: [00:05:50] I think actually the curse is the fetish for cosmetics that academia has. All professional groups have some kind of a hazing [00:06:00] ritual in language. And I think there's just so much of that here: it has to have these pages, but they have to be before the table of contents. Most of this is not needed. For fuck's sake, you know, can we not drop this?

Mark Anderson: [00:06:13] I know, I agree, it's a pain. But I think it's slightly unfair to just write it off. I mean, something that one personally, at this point, finds inconvenient doesn't mean it's not a convenience to anybody else. And I don't think it's there as a gatekeeping exercise. These things all add up? No, it's not. No, no, I think that's overreach. It is definitely the case that some, I mean, the fact that this place wants this font and another wants a different font, that kind of thing, well, that's a choice you could argue more about. But, you know, front papers have a meaning. Say, for instance, when your thesis is eventually up on the server, the first page of the server PDF will eventually be some kind of a copyright or access statement, which the system [00:07:00] requires be there. It has no bearing on your document. And in the end, when I submitted mine, I actually just made a separate PDF to insert in front before uploading, because I was beyond caring about page numbering at that point.

Frode Hegland: [00:07:14] But exactly, that breaks page numbering. But yeah, it also, Mark...

Mark Anderson: [00:07:21] The point is, that's a fault of the page numbering, not the process.

Frode Hegland: [00:07:25] OK. It doesn't have any semantic meaning, though, does it, to the system? What's on the...

Mark Anderson: [00:07:31] Well, it should do. If the document is properly self-describing, it should indicate that the first page is a, whatever, a copyright or re-use assertion. So no, absolutely, it should be: if it needs to be in the document because the organization that published it says it needs to be there, then it should be represented in the schema, so that something reading the document, for instance, knows, OK, I don't need to worry about that, or, I need to go and look up in some system that tells me whether [00:08:00] I'm allowed to process this document. So, no, I think it is all there. What's broken is that, again, in the sort of lazy way that we transitioned to sort of faux paper, we didn't really think through all these bits. And yes, you can use something like LaTeX and you can do, you know, incredibly insane things. But to me, that's the tail wagging the dog. As I went through that process, my feeling all the while was: this is wrong. This isn't how it should work. We should have grasped it by the horns earlier and said, OK, even if it's not there for a long time, this is a structural part of how dealing with print, in a loose sense, works. And therefore we need a mechanism, a schema, that actually allows for this, so that systems can just take that up. And then it can be almost like a presentation layer, and we get away from stupid stuff like hard-coding pages into, you know, where they don't need to be. So [00:09:00] when I was...

Peter Wasilko: [00:09:03] When I was working with LaTeX, I started just doing some stuff, but then I started thinking about how I wanted my computer-science-for-lawyers textbook to look. And that led me to start digging through all of the annotation-related packages I could find among the different LaTeX packages available. For a while I was trying a couple of different document templates, and eventually I realized that I had to come up with my own document class myself. Then I decided that I wanted to do something really crazy for my page design. I wanted to have two footnote series: one for the actual raw citation data and another for bibliographic comments related to the citations. Then I wanted to hyperlink them together so that I could automatically jump from one to the other, and [00:10:00] from inside the main text I could jump down to either the bibliographic comment or the raw citation. Then I started reading about some of the margin note systems, and I decided, well, that wasn't quite good enough for me, because I wanted several broad classes of margin notes, so that you could just glance at the margin and know, without bothering to do any reading, what the note was like. So for that I wanted one color related to culture and language, another color related to history, and another, broadly defined, related to anything in math and science that was highly technical. So culture and language were going to be green color-coded notes, history and management topics were going to be color-coded with a red tint, and blue was going to be for math and science. And then I [00:11:00] decided that, well, that wasn't quite good enough. So I had the broad color codes, and then I subdivided them: pure language notes for language usage, and also cultural reference notes in a different color, so that you'd be able to see what the pop-culture context was for a term being used.

Alan Laidlaw: [00:11:19] And then you went to the printer and found it was all coming out in black and white.

Peter Wasilko: [00:11:23] Well, then what I decided was that I wanted to be able to arbitrarily nest these, so that I could be in a historical note, and then inside the context of a historical note I wanted to provide a reference, and then the reference itself, describing the bibliographic element, might have a math note embedded in it, which could itself have a historical note embedded. So suddenly I realized I was having these categorized, color-coded margin notes, plus I needed to be able to arbitrarily nest them. To do that in LaTeX, there's a distinction between the [00:12:00] mark indicating that you're pointing off to some marginalia versus the actual content. So you had to sort of stack these things, so I had to build like a little hidden stack. And that was the point when I realized I needed to write a preprocessor for LaTeX or I'd be going insane, because I wanted to, just as an author, be dealing with: OK, I'm in a math note, now I embed a historical note inside of it, now I embed a bibliographic citation in the historical note, and have it come out right. So what really is happening is that in each one of those contexts you're getting the mark put in first, followed by the content, which has to be unwound in reverse order at the top level at the end of the day.
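A minimal sketch of the stack-based preprocessing Peter describes, assuming a toy @kind{...} input syntax; the \...mark and \...text macro names are hypothetical placeholders, not from any real LaTeX package. Each nested note emits its in-text mark at the point of reference, while its content is deferred and flushed at the top level in the unwound order:

```typescript
// Toy preprocessor: turns nested note markup like
//   "A @math{M1 @hist{H1} M2} B"
// into LaTeX-style mark/content pairs. Marks go inline where the note
// is referenced; note bodies are deferred and flushed once the outermost
// note closes, mirroring the hidden stack Peter had to build by hand.

type Frame = { macro: string; label: string; body: string[] };

function preprocess(src: string): string {
  const out: string[] = [];        // top-level output
  const deferred: string[] = [];   // finished note bodies awaiting flush
  const stack: Frame[] = [];       // currently open notes, innermost last
  let id = 0;

  // Text goes into the innermost open note, or straight to the output.
  const sink = () => (stack.length ? stack[stack.length - 1].body : out);
  const opener = /@(\w+)\{/y;      // sticky regex matching "@kind{" at i

  for (let i = 0; i < src.length; ) {
    opener.lastIndex = i;
    const m = opener.exec(src);
    if (m) {
      const label = `note${++id}`;
      sink().push(`\\${m[1]}mark{${label}}`); // mark at point of reference
      stack.push({ macro: m[1], label, body: [] });
      i = opener.lastIndex;
    } else if (src[i] === "}" && stack.length) {
      const f = stack.pop()!;
      // Content is deferred, innermost first: the reverse-order unwind.
      deferred.unshift(`\\${f.macro}text{${f.label}}{${f.body.join("")}}`);
      if (stack.length === 0) { out.push(...deferred); deferred.length = 0; }
      i += 1;
    } else {
      sink().push(src[i]);
      i += 1;
    }
  }
  return out.join("");
}
```

For example, preprocess("A @math{M1 @hist{H1} M2} B") emits the math mark inline, then the math body (itself containing the history mark), then the history body: exactly the mark-first, content-later ordering that forces an author without tooling to hand-maintain a stack.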

Frode Hegland: [00:12:42] I think you are highlighting the issues very well. OK, so just to slightly cut you off: one of the things we need to do as a community is to decide on the kind of user we want to support, [00:13:00] right? And the link that I just put in the chat: stepping back, I was told by my advisers (oh, Alan's got a haircut) that I was not allowed to put the full transcripts of the interviews in, because of privacy and nonsense, which I think is crazy, because everybody explicitly signed up for it. But fine, I do need to conform to get this thing. So I instead made a new PDF, put the transcripts in there, and it's 200 pages long of just people talking about Visual-Meta. Next person. Yeah, it's interesting. So I'm going to go through and edit some of it; stuff that isn't really relevant I'll get rid of. But I think this could, maybe with a little bit of an introduction, be issue 0.0 of our journal. Just to really get us going. And I had a long discussion with Chris Gutteridge today about addressing and so on, and the whole thing about how you're supposed [00:14:00] to be able to click on a citation in a document and, if you have the local copy, it should open.

Frode Hegland: [00:14:05] And we went all over the place, and at the end I said, Chris, I've got to go. And then he gave me the solution in 10 seconds; he says it's got nothing to do with any URLs or anything like that. So. I don't have the transcript from our Monday conversation yet. But what is going on with everyone else? Alan, you were looking into a few interesting things this week too, right?

Alan Laidlaw: Uh, yes. I don't have anything to report at the moment, except that what Adam showed at the end of the last call, which is fantastic, that's the kind of limitation that [00:15:00] Twilio Video would have at the moment. So I'm still looking into the hard limits, but essentially, I mean, Zoom is going to be easier for this crowd, for sure, right? It's just going to take more time to develop. I mean, that was going to be the case anyway, to get the interface right, and then having to get new audience members and speakers familiar with the new platform gives me hesitation.

Frode Hegland: Well, I think that's fair enough. And the reason, when Peter was going into the issues he had with LaTeX, just so I don't ignore that: one of the things that came out of the expert interviews is that there is a learning curve to the stuff we're talking about.

Frode Hegland: [00:15:56] But guess what? That's a good thing. Because you need [00:16:00] a new literacy when you have new tools, which is kind of obvious when you say it, but we have to accept it. One thing that I've noticed with the Oculus is that the onboarding experience is absolutely terrible. It's so bad. I mean, it's exciting for us as developers that it's bad, but in terms of mass adoption... First of all, even the printed piece of paper that says how you put on the battery pack is actually missing steps. When you first put it on, the cool thing they have that actually does give a bit of an introduction is a download you have to find by yourself. And also, the quit button, where you hold down the Oculus button, often just doesn't do anything, right? And if we look at our community here, we have to decide, you know, are we supporting LaTeX users or just point-and-click users? Or, you know, I'm not saying we should have the discussion now, but we all have different ceilings for our technical capability and interests. So we [00:17:00] should just keep it in mind. That's all.

Alan Laidlaw: [00:17:04] You know, this is also another interesting area, which is sort of the divide between knowledge of a subject and competence in it. I mean, in the sense that I definitely understand the broad strokes of a lot more technology than I would ever want anyone to put me in charge of to actually achieve anything. So I'm, you know, conscious of what I don't know in that sense, but it becomes an interesting sort of liminal space between the completely unskilled and the very skilled.

Frode Hegland: [00:17:36] I wouldn't say that, just to be really nitpicky, because it's an interesting aspect of the particular work I'm doing now. I don't think it's a linear thing; not that you do, Mark. But just to kind of jump on it, so to speak: we have to decide on what kinds of literacies to support, in a way, because there are many different ones. Mm-hmm. Sure. Sure.

Mark Anderson: [00:18:00] And [00:18:00] one other thing, just because it was playing out as Brandel and Adam arrived: I think it's pertinent to what we're doing, prompted by the annoyance of being asked to put some extra stuff in the front of a document and then having all the page numbering go out. And I just reflect on the fact that it's amazing we still don't really have a robust system that works across the piece for describing how page numbers, as we culturally think of them, mesh to the parts of the document. And maybe that's another thing that Visual-Meta at some point can address, you know: where, in a sense, is the body of this document? What is stuff that's necessary to be in the document but, for instance, may never need to be printed, or may never need to be presented in a normal reading view? You [00:19:00] know, why on Earth, when I'm reading a document, do I have to see headers and footers on a digital screen? They helped me, as an affordance in the book, to dive into the right place, say Chapter eight. But if I'm working on a digital device, I've got any number of better ways to do that.

Frode Hegland: [00:19:16] Talking of working on a digital device: I don't know if you're all aware, but Adam has gotten an Oculus Quest. I haven't met Adam in VR yet because his kid stole the headset, which is fair enough. I was really shocked a few days ago. Keith Martin, whom some of you have met, may join us again today. He and I met in the normal Horizon meeting room. And there were two things shocking about it. One, there isn't that much you can do, and it couldn't be that hard to make there be more. However, the sense of presence was amazing. It was absolutely incredible. You know, I have my laptop, my keyboard, my screen; I can use the [00:20:00] screen as well in VR. It's not retina, but it's good. You're sitting opposite me, I move my head and the sound comes from him, you know, a lot of things like that. So guys, we really need to be ready for this and do something absolutely amazing. We have only a few outliers here who are not Oculus-ready. So we'll see: I'll charge mine up and dive back in and see what it's about. I hope my older Oculus will not make me look like I'm in black and white. That would be cool, and pixelated, right? Like Minecraft? Yeah, yeah.

Brandel Zachernuk: [00:20:41] At some point, Horizon will drop support for the Quest 1. I only have one. But yeah, there's also VRChat, which is another platform. I believe JanusVR is still working. Mozilla Hubs is an actual web-based one. I haven't seen how they actually work, [00:21:00] the level of fidelity that you can experience in them. But I know that you get performance capability, because I just wrote something that allowed me to capture a performance in Quest, with hands and head, and then export that to sketch out. So that's really exciting, because it means that we can get the whole body, or what's trackable so far.
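A minimal sketch of the kind of performance capture mentioned here, using the standard WebXR API; it assumes an active immersive session and a reference space set up elsewhere, and simply records head and controller/hand poses every frame. The Snapshot shape is just an illustrative recording format:

```typescript
// Record viewer (head) and grip (hand/controller) poses each XR frame.
// XRFrame.getViewerPose and XRFrame.getPose are standard WebXR calls.

type Snapshot = { t: number; head?: number[]; grips: number[][] };
const recording: Snapshot[] = [];

function onXRFrame(time: number, frame: XRFrame, refSpace: XRReferenceSpace) {
  const snap: Snapshot = { t: time, grips: [] };

  const viewer = frame.getViewerPose(refSpace);
  if (viewer) snap.head = Array.from(viewer.transform.matrix); // 4x4 head pose

  for (const source of frame.session.inputSources) {
    if (!source.gripSpace) continue;
    const pose = frame.getPose(source.gripSpace, refSpace);
    if (pose) snap.grips.push(Array.from(pose.transform.matrix)); // per hand
  }

  recording.push(snap);
  frame.session.requestAnimationFrame((t, f) => onXRFrame(t, f, refSpace));
}
```

Replaying the recorded matrices onto proxy objects is then enough to re-perform the captured head and hand motion in a sketch.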

Frode Hegland: [00:21:23] And so... Sorry, go ahead, continue.

Brandel Zachernuk: [00:21:28] No, it's just really great, because it means that it allows you to introspect and understand the complexity and nuance of what a person is and does in the process of interacting with the computer, in a way that you simply can't get with a traditional interface, which is something I'm going to go off on a big thread about very soon.

Frode Hegland: [00:21:54] One thing that became really obvious over the last few days, though, is what Apple is going to do, and Brandel, none of this has [00:22:00] been communicated from you. But I can tell you a few things that seem very obvious, and I think most of it is to do with the onboarding. You know, it shouldn't go into a screen with a couple of apps in front of you; it really has to be that you put it on in a useful space. And I think once Apple does that, and of course they will have a pretty good SDK, as they always do with new stuff, once people put on the thing, there's a way to get to where you need to go, because right now it's a complete mess. Then, you know, we have our little space, you grab that, you go into it. Wow, fine. It's going to blow up. So we've got to look forward to that SDK. Yeah. Sorry, Peter, you have a hand up.

Peter Wasilko: Yeah. I was wondering, has anyone tried to do, like, a memory palace application for Oculus?

Frode Hegland: I would love to. Sorry, I'm trying to find an Edgar thing. Can you explain why you would actually need that? That sounds interesting. Because isn't the room [00:23:00] itself a memory palace, so to speak?

Peter Wasilko: Yeah, but the nice thing is, in a VR environment you could take an extra copy of something and have it stored in more than one place, so that if I had more than one independent dimension along which I was organising things, I could have, you know, everything related to my undergraduate days there.

Peter Wasilko: [00:23:20] But I could also pull out a copy of it and move it to a different space that's related to stuff on some technical topic, and leave a copy of it there. So instead of a single location, I could find it in any of multiple organizational locations.

Frode Hegland: OK, OK. So before, Alan, you have a hand up, but just really briefly, because this is all very, very fresh for me: the basic Horizon room is so close to that. One of the things they have is a big meeting wall. And one thing they have not done yet is allow you to designate a real-world wall in your room. You can tell it what's a desk, but not a wall, so you should be able to walk up and touch the wall. But [00:24:00] to be able to do what you're saying now, Peter, I think, with that onboarding experience, so that you can move to different rooms that represent different things to you: it's incredibly compelling. Alan, sorry.

Alan Laidlaw: No problem. To rewind back to a few minutes ago, I'd like to make a triangle between three topics that were brought up, because I think there's some interesting space there. One is what Mark was talking about with the PDFs, and if I understood correctly, you know, why couldn't they just automatically be more like e-books in certain contexts, right? Where you get the, yeah, percentage you've read, you get this.

Alan Laidlaw: [00:24:43] This, an artifact removed. The other is VR. And then [00:25:00] the third is this idea of literacy, what we expect of the audience. And I think there's an interesting space between the three, because starting with VR and PDF: on one hand you've got a format that's now ubiquitous, has been around forever, and hardly changes, right? And with VR you've got this field where people have been excited and let down in waves for decades about its potential. Its time might be now, right? It might finally be now, but it didn't get there through any kind of linear path. And so that leads to the third point, the sticky, tricky problem of literacy. It isn't straightforward what should be expected of an audience member, or, say, [00:26:00] their constant degree of attention, or how they interact with whatever, because in one sense the formats are changing all the time. I haven't completely cohesively put the three points together, but they were bouncing off in my head a little bit, so I wanted to introduce it, because I think it's really interesting that we bounce back and forth between PDF and VR as themes. And they're almost like mirror images, you know? If you look at the broad scope of their technological advance, they couldn't be more opposite. And yet we're talking about them.

Alan Laidlaw: [00:26:53] And how they have their own literacies. I [00:27:00] just wanted to, I guess, put a note on those three points, because I think there's something thematic there.

Frode Hegland: There really is, and it's funny, I just posted that video of Edgar, well, right before he started talking, and he kind of demonstrates everything there. Please have a look with sound when you can. The reason I'm showing it is partly because it's so dark, but also because, at only his second time in VR, he's able to grasp objects and manipulate them. The literacy is just there for that part of it. And at a level that was really surprising: at one point he bangs into a door, the physical door, because the object is behind it. And he didn't freak out, like, what's going on, real world, not real world? None of that. He just accepted it. Then he mumbles, oh, it's in the door, right? So some of these things are almost literacy-free; we're just born with it. So I'm so glad you mentioned it, Alan, because the higher levels [00:28:00] of when you actually build something in this kind of space, it isn't the same as a physical sculpture, and it's not the same as something on the screen. We really need to dive in to figure it out. And then put it back in the PDF. Adam, what do you think so far? How much have you been in the space?

Adam Wern: [00:28:25] A few hours, a few Oculus charges, in. It's further along than I thought. I dipped my toes in it maybe twenty years ago, when I tried it the first time, and five years ago I tried it again. But now, having a standalone headset, I was so surprised that the standalone headset, basically a [00:29:00] mobile phone on your head, is so good and so immersive. I usually get a bit seasick or carsick in VR, but this time I wasn't, not that much. So everything about the frame rate and the fidelity and the resolution is almost there, I think, on the Quest 2; a few more pixels and I will be very happy. But what really surprised me most is the embodiment part: bringing your hands in there. When you're used to a screen, actually grabbing things by hand and moving them physically was way better than I thought. We are very close to something that is very good. And I was also surprised that you could do web pages, [00:30:00] or WebXR experiences, where you just click on a link and suddenly you're in a virtual space with lots of interaction and so many fabulous things. So I think that right now imagination is the part that's lacking: we haven't really figured out what to do with the technology, because many of the parts of the technology are there, but the interaction is not there yet. I see glimpses of it; in almost every web page I see something that I find very interesting, but it's not put together in a fluid way, in a more human-centric way, yet, I think.

Peter Wasilko: [00:30:47] I agree with Adam. I think the exciting thing will be when we start building abstract information spaces, as opposed to just walking around a physical room: ones where you can start [00:31:00] selecting the dimensions that you're looking at and moving inside of things to unfold other dimensions. Those are the ideas that I found in the Cyberspace: First Steps book that were so compelling, and I keep wanting to see something along those lines actually realized; it sounds like the technology is almost at the point where we can do it. And then I think about the interfaces in Johnny Mnemonic and Minority Report, where you're grabbing things and spinning them around. Some of that seemed a little bit frivolous, but I'm not sure what we could do if we had real data running behind it and, again, had control over selecting dimensions and switching the context that way. And that sort of relates back to what I was trying to accomplish in LaTeX: being able to model that transition between nested related contexts, to be able to dive in and then weave back out. Certainly I wouldn't want to inflict the raw LaTeX code I was writing on anyone, but I think we should be at the level where you can simply say [00:32:00] "math note", and then "insert historical note", with inserts at arbitrary levels of nesting.

Peter Wasilko: [00:32:07] And that would be high-level enough that somebody who's writing could deal with it easily, especially if you had, like, a little palette of different note types. And if you're in a note, your choices could be, you know, return to the previous context or digress into a new sub-context, with a set of categorized sub-contexts that you could jump into, to weave back. It also brings me back to an idea that I've been arguing for in Tinderbox and at the hypertext meeting for a while, which would be to have a mirror mode, where you could flip the context, so that when you're looking at nested things, you could sort of look back out from the inside and see what things you'd reached them from. That's sort of like reversing the context stack of how you got to where you are, to be able to walk back out and then reverse direction to go down another thread, without having to go all the way back to the start and [00:33:00] re-navigate from the top down again.

Frode Hegland: Mark, the system has chosen you.

Mark Anderson: [00:33:10] Um, yeah, just quickly, something that came up listening to Peter speak just now is the interesting reflection on new forms of literacy, because, you know, that's not the way we're used to writing. And when you were talking about the work you were doing earlier, in the LaTeX stuff, what I heard you doing was basically teasing apart a hypertext, almost. So there's an interesting thing there. Well, one thought I had, and what made me put my hand up, is the sense that we're getting towards having, I don't know if there's a term for it yet, but what I guess you'd call blended reality. So I might be sitting at my desk, whether it's with an eye tracker or with something strapped to my face, but what I'm actually looking at is a mixture of the virtual and the real. So I might [00:34:00] be looking at my monitor, but I might, over here, you know, in the middle of space, have a virtual artifact, which might be something else. I suppose I find myself thinking that partly because it seems a bit more comfortable than being stuck completely in a complete artifact space.

Peter Wasilko: [00:34:23] To jump in, the term of art for that is augmented reality.

Frode Hegland: [00:34:26] And everyone has agreed that that is the target: to get something called pass-through augmented reality, or AR. Both Meta and Apple have gone very publicly on the record saying they don't believe in VR, they believe in AR, because it's so much less exclusive, in the sense of being able to include the people around you. So I will...

Mark Anderson: [00:34:50] Yeah, no, no. It's good to hear that, because I wasn't sure; the reason I said "blended" was I wasn't sure. Maybe just because some of the early AR amounted [00:35:00] to basically just putting labels on things.

Frode Hegland: [00:35:02] And I think this is why it's so useful to try it properly, because this is what the latest update on the Oculus actually does. It doesn't do it all the time, because the video quality, black and white, isn't great. But the really shocking thing: you set what they call a Guardian, the space where it's safe to walk. But if you now move close to that boundary, about to break it, instead of showing you kind of a wall, it shows the actual video. Yeah, so it's already blended in that sense, and it really has a positive impact, for exactly the same reason you're saying, right? Yeah.

Mark Anderson: [00:35:38] Okay. And one very last thing before I forget, and that is: thank you to whoever suggested this, Make It So. I can't remember who it was.

Frode Hegland: [00:35:46] Alan is to blame.

Mark Anderson: [00:35:49] Well, great. Thanks. I'm really enjoying it. I can't shut up now. Good. Good.

Frode Hegland: [00:35:55] Can you hold the cover up again for a second?

Mark Anderson: [00:35:57] Sorry. But [00:36:00] if you can see it, it's basically pictures. It's going the other way: it's going, in a sense, from the imagined picture back to, well, what does it mean? In your terms, anyway, Alan?

Alan Laidlaw: [00:36:17] Yeah, I'll go quick, but it's another sort of bounce-around of themes. First off, something that I think really applies to us: I don't know if anyone saw that John Carmack, who helped create Oculus, was sort of anti-Metaverse, for the most wonderful of reasons, and it certainly applies to us. It's the term "architecture astronauts": the Metaverse is a honeypot for architecture astronauts, who talk in abstract terms, this is how this should work and blah blah blah, with no sense of the protocol [00:37:00] layers and actual physical constraints. I feel that acutely, but there are some twists to it, right? I would identify as an architecture astronaut myself, right? And I would say that Adam is more grounded, and Peter's more grounded, right? But there are pluses and minuses to that. What he suggests is: this is why, when he works on a product, he goes for a particular problem to solve, because that way he can measure its value. Inherent in that is an inability to solve all of the problems at once with some sort of conceptual Darwinian evolution, you know? So, an example: Mark's example of, hey, why can't PDF be [00:38:00] improved digitally, would be a good example of a step forward in the space. Another one would be, well, I think if you go into Horizon, if you go on an Oculus, what you see is a procession of these little steps of improvement, where there are still lots of broken places, but they don't have the ability to fix it all at once. They are making improvements, like the Guardian boundary, bits at a time, you know? And that also frustrates literacy, by the way, which I'll get into in a moment. OK, so that's one box. Second box: about the transcriptions, and the nature of recording these videos, there are specific [00:39:00] opportunities and product enhancements there. Like, for instance, we tend to repeat ourselves a lot over the course of these meetings; even though it's in different contexts, we'll say roughly the same thing. Like, I've mentioned Memory Palace, and I love it, I love where Peter's going with that idea, and I have a lot to say about it, but I also know that I've said those things before. So an incredible feature would be: hey, how many times has he said this thing before? Not because repetition is bad inherently, but because you would suddenly have a window into: well, Frode was mentioning something about his son here, but it reminded Alan, and he said the same thing that he said two weeks ago. But in his mind it was triggered by something different. So it's a different avenue, even though it led the person to the same thing. That would be the kind of, I feel, non-architecture-astronaut [00:40:00] enhancement where you could say, here is an actual feature that we could try out and see if it's useful. Um. The final thing is Peter's objects and hyper-objects, which I love. But where that links to literacy is that we're also seeing new generations of users who don't think of objects and documents in the same way that we do. We've talked in the past about how kids just use search, and they don't know where their files are anymore, because they don't need to. And I think in the same way we might find that finding ideas is more time-stamped than we realize, than our generation realizes.
And so there already are all these heuristics built around it, even though they're not physical word searches; it's easier to find things because there's so much, either video [00:41:00] attached or the person I'm talking with attached. I'm leaving all that just for the video, so I can go back and address it later. Done now. Well, I've written so many notes, but I don't...

Adam Wern: [00:41:14] I mean, it's your turn, Frode, I talked before.

Frode Hegland: [00:41:16] No, no, no, no, please.

Adam Wern: [00:41:20] So one thing that is obviously clear to me now, when I walk around in 3D, and also in the browser, in 3D on a flat screen, is that text has never really had the opportunity to be in 3D. Of course it's in 3D in real life, but it's always on a substrate, whether it's stone or paper or cloth or leather or clay or something. For the first time, we can hang actual symbols in mid-air, floating around. And it brings new challenges, because text [00:42:00] so obviously has a front side to it. So what happens when you go off-axis? A little bit, and you can still read it. But when you go around to the back side, what do you see on the back side of text? Or should text always face you? Programming today, I learned a new term in the 3D community: they call it billboarding, when objects turn towards you wherever you are, when the text or something is always turned towards you. And that brings other things: suddenly the room is a bit more alive, and less memorable in a way, because things move around. If you have lots of text that is always turning towards you, new things happen. And I think, also, we have the idea that if you have labels for [00:43:00] different parts of the room, they can turn towards you, but they can also decrease in size, increase in size, or in opacity, or appear and disappear: the kind of contextual information where, if you stand in one place, you get more labels and other labels fade away. There are so many unexplored things here. But also, what would truly 3D text be? Symbols that can be viewed from more sides? And how do you construct something resembling sentences, or lines of thought, or networks of thought, that can be seen in different dimensions, conveying more information while still being compact and framed in a way? Lots of ideas, and I'd love to explore those with you, maybe in prototypes as well, to just work around and show the things [00:44:00] I've thought about, and I'm sure many others have done that as well.
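A minimal three.js sketch of the billboarding and contextual fading Adam describes; the scene, camera and render loop are assumed to exist elsewhere, and the fade distance is an illustrative choice:

```typescript
import * as THREE from "three";

// Text drawn to a canvas and shown as a THREE.Sprite billboards
// automatically: a sprite always faces the camera.
function makeLabel(text: string): THREE.Sprite {
  const canvas = document.createElement("canvas");
  canvas.width = 512;
  canvas.height = 128;
  const ctx = canvas.getContext("2d")!;
  ctx.font = "64px sans-serif";
  ctx.fillStyle = "white";
  ctx.fillText(text, 10, 80);
  const material = new THREE.SpriteMaterial({
    map: new THREE.CanvasTexture(canvas),
    transparent: true,
  });
  return new THREE.Sprite(material);
}

// A full 3D mesh (e.g. extruded text) can billboard by copying the
// camera's orientation each frame, so it turns towards you wherever you are.
function billboard(mesh: THREE.Object3D, camera: THREE.Camera): void {
  mesh.quaternion.copy(camera.quaternion);
}

// Contextual labels: fade with distance, so standing near one part of the
// room brings its labels up while the others fall away.
function fadeByDistance(label: THREE.Sprite, camera: THREE.Camera, maxDist = 10): void {
  const d = label.position.distanceTo(camera.position);
  label.material.opacity = Math.max(0, 1 - d / maxDist);
}
```

Calling billboard and fadeByDistance for each label inside the render loop gives the always-facing, distance-sensitive text Adam is describing.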

Frode Hegland: [00:44:06] Yeah, that sounds exactly like what we need to do. So many thoughts. I mean, I literally feel 20 years younger because of this. It feels like the early days of QuickTime or something; the opportunity is immense, but in many more dimensions than just 3D. So first of all, I think Apple is doing some very interesting things, like Universal Control, you know, control from your Mac onto your iPad or whatever. That's going to be absolutely huge in VR, because one of the biggest problems I have going in and out of the Oculus space is bringing my Mac with me. I can't do it; it's a faff and it's not always working. So that's really exciting. And that kind of, I think, sets us up to think that it'll probably be close to Mac versus Windows. In some ways, there won't be a metaverse; it will be owned by different companies, because the infrastructure for the hardware and the basics is so intense. [00:45:00] I mean, don't forget, Oculus, sorry, Meta, with all their money, just gave up last week making their own OS. Or rather, that's when the news came out. That's a really big thing. Right, so in the same way that you have Microsoft Word on Mac and Windows, I think we're going to see history repeat itself to a large degree. No question about that, and I'm going to paste my notes in before I forget. But I think the other thing is, over time, this idea that AR is better than VR will be shown to be wrong. And I think the devices should support AR, no question about that, you know, being able to see through, no problem with that. But there are two reasons I think it won't be the primary use. One: you still can't focus on the background, so your eyes will still always be focused on infinity. It won't have that kind of real depth difference; that can't be done, because you still have one piece of glass that you're focused on. Oh, OK, Brandel is going [00:46:00] to say something; put your hand up, please, make a big point about that, I'd love to hear it. But also: I'm sitting in a room right now that has an OK kind of dining table and so on. If I am in AR with you guys, first of all, I have to decide where you're going to sit, because our desks are different. But if we go fully virtual, we can have one conference table, and we can do the memory palace thing: Peter's work is over there, Adam's got something happening over there. That can be our shared space. You can't do that in AR, because the space isn't big enough. The shared AR space is basically all of our spaces divided by each other, not multiplied. So what I mean

Mark Anderson: [00:46:41] Is that a pragmatic reaction or a necessity? I still see that more as pragmatism: we can't do this now, so let's... And in that sense, it's good.

Frode Hegland: [00:46:51] No, no, no. It's a logical thing, because think about it: just imagine, kind of close your eyes, and you sit us around a conference table, and we're all in the exact [00:47:00] same visual space. That space is huge, and we all know that this screen is here, that screen is there. But if we're going to be in your living room, Peter's living room, Adam's living room and all of that, basically you have to decide, for each person, how each one of you will have the screen. If someone points over there, it's not going to be the same thing in the other person's room. So yes, AR will be great for a lot of things. But when you go into a meeting, like a Zoom meeting so to speak, or FaceTime I guess it'll be called, then I think it will be fully VR anyway. Brandel, did you want to comment on the focusing thing? Because that's really phenomenally interesting.

Brandel Zachernuk: [00:47:39] Yes. So the problem that you're describing is something that's well known in the industry; it's called the vergence-accommodation conflict. We have two ways that we see things and understand their depth. One is based on our stereo vergence: that's how cross-eyed we have to go in order to [00:48:00] be able to observe things. And in reality, that's always hard-coupled with an automatic changing of our focal distance, which is so ingrained that it's very, very difficult to break, and we actually don't want to break it. And so that means that's why it's very difficult to look at things up close in something like the Quest: its focal distance is actually about here, and it's not very good at representing things any closer than arm's length. The solution is to have something like, so the Magic Leap One has two focal planes: it's got one at infinity, which is good down to about a metre and a half, and then it's got one at about 35 centimeters. So it has the ability to do those things, statically. I think Oculus has a prototype series, Half Dome, that is a varifocal display, in the sense that it has the ability, using something called pancake lenses, [00:49:00] to create a variable focus distance. And people at the University of Pennsylvania have worked on creating a varifocal surface, on the basis of a multiple-focal-length apparatus within the thing. That means you can have a single image that is simultaneously projecting multiple focal distances. So the current state of the art within Half Dome is that you have the ability to change the focal depth of the plane; the University of Pennsylvania work has the ability to create a varifocal surface, so that you have the front of the image at one focal depth and the back of the image at a different focal depth, and you're able to change that almost on a per-pixel basis. So there's a lot that can be done there. And on augmented reality versus virtual reality, there is a meaningful difference, to [00:50:00] your point about needing a large enough space; right now, the place that I've devoted to this in this space is not large enough. Virtual reality right now is controlling all of the pixels. When you have good enough augmented reality, it would be possible to carve out negative space within it, to put additional virtual spaces in, such that you would essentially make room for people to extend surfaces and walls out to the distance that would be required. The only challenge there is based on the interaction with the real space: what are you going to get into trouble with as a consequence of claiming this space? To solve those issues, you should be able to work against a wall, but then look behind you and see the real world, so that you don't get into trouble. There was a really interesting video a few years ago, at I think it was West, called Reality Filters, I think, I'll take a look for it, [00:51:00] where somebody was looking at the ability to identify the real-world features within a space and put thematically relevant artifacts on them, so that if you have a pillar and you're in a Dungeons and Dragons game, then it might turn into an impassable pit. I'm less interested in the fantastical, because I'm just an unimaginative, boring person, and I'm more interested in word processing and Excel.
But yeah, it's very useful to see the way in which people can kind of seamlessly make sure that people are able to navigate within spaces like that. And then there are also mechanisms, in an admittedly more purely virtual reality environment, of redirection and things like that, like redirected traversal and walking, and saccadic redirection, that allow [00:52:00] you to direct people around. So once you have untethered VR, you can have people actually walk around in the space. Rather than being in a room, you can go to a soccer pitch; you have very low frequency, but you can do your Excel out there. And that sort of adds another level of appeal, just walking around in a field. But the problem then is people bumping into each other. You're actually relatively insensitive, no offense, to whether you walked through 70 degrees or 90 degrees when you try to turn an angle of 70 to 90 degrees. So if you have the computer redirect your view, so that it's consistently steering you away from things, then you have the ability to create a virtual space that is vastly larger, by doubling back and forth, for example, along a single strip, which allows you to make an apparent zigzag through a larger space. So [00:53:00] as long as you have a system that's able to reconcile between the virtual and the real. This is something The Void did, the company, which I think unfortunately has gone under: they created large circular hallways that were large enough to provide the virtual illusion of going down a single linear one. So there are a lot of different strategies. One big issue is that the kind of hardware that people have had to have hitherto has been prohibitively expensive. It seems like the Oculus Quest and various other devices that are coming out now, with the price compression, mean that a whole plethora, a profusion, of new research within this should come to the fore, so long as everybody realizes that it's research that needs to be done.
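A minimal sketch of the rotation-gain redirection Brandel is referring to: the virtual scene turns slightly more or less than the user's real head turn, steering them away from physical boundaries. The gain values are illustrative, not taken from any specific study:

```typescript
// Redirected walking via rotation gain: virtual yaw = real yaw * gain.
// Gains modestly above or below 1 tend to go unnoticed, which lets the
// system quietly steer a walking user inside a small physical room.

interface YawState {
  realYaw: number;    // accumulated physical rotation (radians)
  virtualYaw: number; // rotation applied to the virtual scene (radians)
}

function applyRotationGain(s: YawState, deltaRealYaw: number, gain: number): YawState {
  return {
    realYaw: s.realYaw + deltaRealYaw,
    virtualYaw: s.virtualYaw + deltaRealYaw * gain,
  };
}

// Steering policy: amplify turns that rotate the user away from the
// nearest physical wall, damp turns toward it (illustrative 0.8-1.2 band).
function chooseGain(deltaRealYaw: number, yawTowardWall: number): number {
  const turningAway = Math.sign(deltaRealYaw) !== Math.sign(yawTowardWall);
  return turningAway ? 1.2 : 0.8;
}
```

Applied every frame, a user zigzagging along a single physical strip can be made to experience a much longer straight virtual path, which is the doubling-back effect described above.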

Frode Hegland: [00:53:52] That sounds amazing. Let's see, Alan has a request for you in the chat. But still, even though you can have negative and positive spaces, [00:54:00] as in drilling a hole in a wall, if we are going to be able to refer to each other and stuff like this, you need to share a space. You know, just look at our rooms: we're all sitting in very different rooms, and sitting in very different ways. So don't you think that, let's say just the six of us wanted to meet up, it would make more sense to be in a fully VR space?

Brandel Zachernuk: [00:54:23] Possibly. I mean, I think it really depends on our intent and how much we want to devote to it: how much our meeting together has to do with each other and how much it has to do with what we're doing. And those are really basic, fundamental questions that nobody necessarily has a good reason to believe have been answered satisfactorily yet. It's really fascinating going back to Larry Tesler's videos describing the Apple human interface, and his time at PARC, and just talking about the dizzying free-for-all that the concept of the graphical user interface was within the late 70s and early 80s, and [00:55:00] the fact that the scroll bar wasn't called the scroll bar; the scroll bar was called the elevator. You know, Doug called the cursor the bug. All of these things were in such flux and such tumult in terms of what things could mean. And not only are we there now, it's worse, because of how many fewer cues we have to go on as to what the whole thing is. So it's an exciting time. But one of the things I think is important is that people haven't realized how little is nailed down, how much room there is to experiment, and how much of an obligation there is to do that, on the basis of what we know we want to be able to do within the platform. Oh, that's

Frode Hegland: [00:55:39] What makes it so exciting, Brandel. And just a little nod to Alan's note: there are so many things. Once you have an avatar, like Keith and me, we had avatars roughly looking like ourselves, I know it's him. But the fact that the audio comes from where he's sitting, so if I move my head it's still in the same place, you know, spatial audio and all that stuff, is [00:56:00] so huge. But I think we'll definitely have simple things like, you know, you tap a little button and that means you're in privacy mode, where your avatar kind of moves around and looks like you're listening, but you're actually doing something entirely different. You know, that kind of stuff we will absolutely have, and we will need to have. And if you want to flatten yourself into a 2D picture on the wall, for someone who joins not from VR, you know, that can be done too. And yes, of course, Alan, if you choose to have a setting like little mice on your table, that should be entirely possible too. But I just think that for much of this, a shared kind of dining table will be really important. But of course, it'll be much more interesting, exciting and dynamic how you break away from that, for either a group experience or individual different preferences. Over to Peter. Peter, yes?

Peter Wasilko: What's the current situation for people who have eyeglasses and prescriptions? Will we have to have [00:57:00] a headset that can fit over our glasses, or will we be able to type our eyeglass prescription into the system preferences and have it virtually accommodated, so we wouldn't need physical glasses on anymore and the system could simulate them?

Alan Laidlaw: [00:57:16] I love that idea, though. That's awesome. Basically, it's what they do for your ears, because you go through a process where it attenuates how you hear, if I remember correctly. So yeah, I don't see why you couldn't do that for your eyes. Of course, that creates a huge problem when you pass your headset over to someone else and they have to go through that process.

Frode Hegland: Actually, it's not. So these glasses have three levels of focus, and they go dark outside. I used to have reading glasses; I don't have that anymore. So when I look at Peter's name now, it's in focus. Now it's blurry, because that's Apple Watch distance. And now it's definitely blurred, that's driving distance. But the point of mentioning it and cutting in is, [00:58:00] when I got these glasses, they did the kind of sitting down, scanning laser and all of that stuff, and then afterwards I stood in front of a box that looked at me and decided where all the different points are. So it's not science fiction to imagine a VR headset doing all those readings, and doing it live: you hand it to the next person, it does it live for that person too. Brandel, I'm sure you'll have a deeper take on that.

Peter Wasilko: Also, what's the state of the art on omnidirectional treadmills, so we could actually be physically walking without going anywhere?

Brandel Zachernuk: [00:58:32] I can answer all of those things. So, yeah, the state of the art within the sort of focus technology is that the current generation, or I imagine the current generation, of Quest also has the ability to add extra sort of base plates and spacers in here, so that you have physical room for glasses. As other alternatives, people have built prescription drop-ins for things like the HTC [00:59:00] Vive and other devices. But yeah, at some point it would be possible to create something that has the ability to dial in prescriptions on the fly. There are actual dial-in, fluid-lens-based spectacles that people distribute in developing countries, where you actually put more or less fluid into a bag. And it's very interesting, because it's adjustable. It's not phenomenal, but it's enough to actually allow people to get custom prescriptions without having to grind actual glass or plastic, which is very exciting. But yes, you're absolutely right that at some point it would be possible to have something that can look with enough fidelity at one's own eye in order to figure out how to correct those things. And one of the benefits of having eye tracking and analysis [01:00:00] and assessment at that level is that it also means you have a much better basis on which to understand what to display and what not to display. Because right now we all have presumably very high-resolution, if not high-dimension, displays in front of us. But the reality is that we only have the ability to fixate on a very small angular range, about the size of the Moon, in front of us. And so if we had the ability to track eyes in real time, we would be able to figure out that we don't need to have high definition over there; we only need it here. And so while there is some non-trivial expense and complexity in creating a kit that has that support, it also provides manifest benefits for the efficiency of the operation, as well as situational awareness of what the user is doing and the ability to infer some aspect of intent and attention. So, yeah, very, very exciting. There's a lot of work to be done over the next 20 [01:01:00] years, much of it within VR, to make these things happen. But at some point it will, and it's going to be really awesome.
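A minimal sketch of the foveated-rendering logic Brandel outlines: with real-time eye tracking, full detail is kept only near the gaze direction and degraded with angular eccentricity. The band thresholds are illustrative:

```typescript
// Pick a shading rate from the angular distance between the gaze direction
// and the direction to a scene point: full resolution in the fovea,
// progressively coarser in the periphery.

type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (a: Vec3) => Math.sqrt(dot(a, a));

// Angle in degrees between the gaze ray and the direction to a point.
function eccentricityDeg(gazeDir: Vec3, toPoint: Vec3): number {
  const c = dot(gazeDir, toPoint) / (len(gazeDir) * len(toPoint));
  return (Math.acos(Math.min(1, Math.max(-1, c))) * 180) / Math.PI;
}

// 1 = shade every pixel, 2 = one sample per 2x2 block, and so on.
function shadingRate(eccentricity: number): number {
  if (eccentricity < 2) return 1;   // foveal region: full detail
  if (eccentricity < 10) return 2;  // near periphery
  return 4;                         // far periphery
}
```

The payoff Brandel points to is exactly this: the display only needs full resolution within a degree or two around the gaze point, so tracked eyes let the renderer spend its budget where it is actually seen.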

Frode Hegland: [01:01:09] All right, you've convinced me; I'm charging it up. It is so ridiculously important. And, not to be self-promoting here, we do need something like Visual-Meta VR; I call it "vember" at the end of my thesis. And the reason for that: I don't care what's actually in it, and I certainly don't need it to be called Visual-Meta. But something like VRML back in the day was interesting, and there are all these web things; all that is good. But I think that, for good and bad, what Meta is doing is taking over the world. I think Tim Cook paid Zuckerberg a couple of billion to do this, because what he's doing is taking the sting off the fact that Apple will own us much faster and much deeper, because they have things like Universal [01:02:00] Control, AirPlay and all of that. So that means, going in and out of virtual space, once you buy an Apple device, if you have the other Apple devices, there will be no transition whatsoever. That will be bigger than the iPhone.

Alan Laidlaw: Yeah, I think that is... The real central, perhaps a central, theme or mission is: what does archiving, or what does decomposition, mean in a VR setting? You know, I don't know that it's even PDF anymore. Or, you know, there's always PDF, right? OK, no problem. But it's a totally different question of, you know, what are the artifacts? What do decomposition and archiving mean in this space? Which is something I don't know that anybody has addressed; I don't know of anything about that.

Frode Hegland: [01:02:54] Archiving will mean something else, because the dream that I keep having is that I open my [01:03:00] laptop and I put on my — I'm just going to be honest — "Apple Brandel" glasses. And I'm calling them that not just because of you, Brandel, but because for me all the niggles with the Oculus go away, because it's a fantasy device. So I put my Apple thing on, and I still have my laptop, like we talked about a few times before. But now I do a gesture, maybe like this, and everything on the screen goes into the room. You know, that's my interaction; who cares what it actually is? We do fantastic things: we arrange items and ideas over there, we connect them, do all those amazing things. But it has to be possible to fold it back in. I think that's what we're talking about, right? Because if I now decide that I'm not going to go into the Apple world, I'm going to go into the Meta or Microsoft or whatever world, most of that stuff, just like with HTML, has to be liftable. And we all know — you've all read the story of Safari, right? Well, Brandel, what's the name of the guy who developed it and wrote that book — trying to remember — Ken Kocienda? I [01:04:00] read that in one sitting; it was raw sugar, right? Can you share the link? Oh yeah, Brandel, please. Yeah, sure.

Brandel Zachernuk: [01:04:10] If you haven't seen his work, do take a look — I'll dig out links now. Ken did some beautiful Worldwide Developer Conference videos as well, on sort of development practice. And his oral histories with the Computer History Museum go into a lot more detail too. He's at Humane now, and it's very tempting to jump ship and try to figure out what they're doing there, because they have an incredible pedigree of people working on what they claim to be the next generation of computing. And I find it hard to believe that it wouldn't be some sort of variation of immersive computing in the way that we're talking about it. But yeah, I'll grab all [01:05:00] this stuff.

Peter Wasilko: [01:05:01] They have a website.

Alan Laidlaw: [01:05:05] Another bit for the road is the biography of the guy — the book is called "Almost Perfect", and it's the story of what went wrong with WordPerfect. Brandel, you might like it; it's that intense. People say it is a walking disaster. Ok. This guy: Pete Peterson. Speaking [01:06:00] of WordPerfect, it reminds me of the Canon Cat. We need to organize a library of these things. I'm starting on it, and I can welcome you guys to the space if you like. I'm doing it at the moment in Craft, which is suboptimal for some things, but it's at least straightforward. It's all Mac, though, unfortunately. But I can switch platforms. Really, the link stuff is always just wonderful. You have a link to Craft? I do, I can invite you to the space. Give me just a moment. I was trying to find Ken Kocienda's book; I [01:07:00] can't find it.

Mark Anderson: [01:07:06] I've put a link in the side chat — I've just done it on Amazon, as we've got some pictures of the books. In most cases, anyway, except for one book, which seems to show as an HP printer cartridge for reasons known only to Amazon. But there's a list of the books that I have to hand. They're all paper; I don't do digital for serious reading, even for novels.

Frode Hegland: [01:07:30] Neither do I. But don't tell the group that. You know, this is another really lovely discussion. Sorry, go ahead, Brandel.

Brandel Zachernuk: [01:07:42] I would really love to go deep on what it is about digital reading for serious purposes that isn't good enough, because I think everybody saw all of the books that I was making, all of the different digital VR books. Those are thematic and conceptual [01:08:00] explorations for me of what it is that books are that digital reading isn't. I'd really love to hear what everybody else needs from the tangible, and what we might have to do to the experience of digital reading in order to bridge those gaps.

Frode Hegland: [01:08:20] I did a study on that in a real-world Starbucks about 10 years ago, meaning I looked at people and I wrote things down. And the surprising thing I found was that people who read books or paper or magazines or newspapers can move the medium around easily. Because when you read on an iPad, let's say — which logically is the same thing as a book, except for the light — you can't move it easily. You need a stupid stand, which is crazy. So for serious work stuff, I read on my Mac better than anywhere else. But when you're reading [01:09:00] more for enjoyment, the fact that you can move the medium is such a huge thing. Over.

Adam Wern: [01:09:07] And Brandel, one of my first WebXR experiences was trying your big book — Alice in Wonderland, I think — your prototype of a page-turning thing. And actually, the first thing that happened to me was that I got stuck inside the book, which was a surreal experience. For some reason I was placed inside the book and couldn't escape, and I couldn't even get out of it from the Oculus side, so I was stuck; I had to power off the whole thing. I was like Alice in Wonderland, because I couldn't get out — The Neverending Story.

Alan Laidlaw: [01:09:49] Real quick about that — I'll have to find what I was writing about this, but I go back and forth: I read physical [01:10:00] books, and I read books primarily on the Kindle, and there are actual, I believe, subtle pleasures to reading digitally. For instance, I typically will read 30 percent of the way through a book and stop; it's just my ADD nature, right? I don't even have to try. But I found that with the Kindle — because it is flat, and that flatness is typically a disadvantage — I can actually finish books, because I don't realize where I am in them. And I've read much longer books on the Kindle than I have in physical form. So there is some value to it; it [01:11:00] isn't just a lesser form of the print. It would be worth talking about some more. I think my hand is just languishing there. Yeah, Peter, please.

Peter Wasilko: [01:11:16] Ok. One of the things I'd always want to see would be applications of AR in a library context. Imagine if, instead of just looking at the level of dust on the physical spines of books, you could actually see the books that were being actively used lit up and glowing relative to the other books in the stacks. And then another thought: what if I could select four or five different books and ask the system, tell me, who else has been looking at these? So that I could reach out and, initially anonymously, contact them and say, ah, I see that you have been comparing books on object-oriented programming and interactive fiction, and [01:12:00] you're also into South Vietnamese modernist architecture — how would you like to get together for a cup of coffee? And be able to make the linkages that way. So many invaluable contacts come from the sheer serendipity of meeting people at conferences, and we could really facilitate that if there was a way to temporarily reach out through the library. I mean, imagine somebody else is browsing the same section of stacks. I might be lucky enough, if I'm in the library at the same time, to see Frode looking at the same books that I'm looking at. But if we're 18 hours apart, our chances of meeting in reality would be very low. Whereas in an augmented reality, where an afterglow of his presence in the stacks was available for me to see, we'd be able to make contact in ways that aren't possible now. Good point. Mark.

Mark Anderson: [01:12:59] Oh, yeah. So [01:13:00] I was going to answer, in a literal sense, Brandel's question. Certainly for me, the primary reason I don't use e-books — I mean, in a sense I'd like to have both, but I can't afford both, and so I have to choose one, which is the physical one — is basically that anything that has reproductions in it is basically useless on a digital screen. Also, you can ask who buys code books these days, but actually, being able to, say, copy-paste some useful actual code would be useful. I remember some of them gently teasing me in the lab during my PhD work. They said, what are you doing? I said, well, I'm making all my graphs in R so I can make nice SVG diagrams. So what's the point? I just copy and paste a picture out of Excel. I said yes, but you can't scale that on a digital screen, which is the format in which we're presenting this, so your [01:14:00] thing is going to look crap at certain sizes. It looked to them like I was mad, and the pushback was, this is not my problem — but from the reader's perspective, it absolutely is the problem. I guess there's also a problem with large pictures: if you've got a heavily illustrated book, large pictures obviously are going to bulk things out. But I think that's one thing that at the moment is probably just a limitation of the technology rather than the medium as such.

Frode Hegland: [01:14:34] Just quickly, Mark, on that point, before Brandel comes in — and hi, Keith. Don't forget, now with macOS and iOS you can actually select text in a picture as text, so it's interesting how that has slightly bypassed the problem. Brandel, what have you got?

Brandel Zachernuk: [01:14:51] Actually, that reminds me that I was also going to mention the text thing: when you take a photo with your phone on iOS, [01:15:00] you can actually retrieve the text from it — I've taken photos of quotes that way — and I think that would be really interesting to do. One of the things I did this week — most of my time was spent building that performance-capture thing that I mentioned and tweeted about — but I also succeeded in getting Tesseract running in a web page, such that I'm able to read the text names of everybody in the participant list in Zoom. So the next step will be to make sure that I'm doing that off a live feed of a Zoom call. And then I'll be able to run some image analysis over the column of pixels that represents the microphone icon, so I'd be able to get time codes of when everybody is speaking. So that's something to look forward to; I promise I'll make more headway on that soon. Uh, what was I going to say about everything? So, [01:16:00] James Bridle has a really good post from about 10 years ago about some of the physical functions that a book has as a tangible artifact. He talks about the fact that it's kind of an advertisement for itself, and as trite as that sounds, it's a critical component of physical artifacts that is simply missing digitally. And so James Bridle has made these book cubes, and my wife made a book box, which is a really interesting way of creating, out of empty paper, something that can at least stand in to advertise that this is a book that I've read, with certain characteristics and attributes. The Book Cube is what Bridle did; I like my wife's book box that she made. The cool thing about that is that you're able to have the dimensionality of it represented a little bit better, so that you can [01:17:00] print out a sheet of paper and create a 3D object, a box that is proportionate to the size of the book that you read and want to kind of advertise. Another thing, way back, that my wife did at university: her thesis was on the way the living room changed its shape as a consequence of the introduction of the television. Prior to the television, people used to have all of the seating arranged around the expectation of conversation, and then it became focused on the focal plane of the television instead. And so — to your earlier point about the kitchen table — I would imagine we should expect to see entire spaces reimagined and reconfigured; the [01:18:00] social order of domestic spaces, as well as commercial ones, will be refigured as a consequence. It's going to be really massive. One of the things that makes the Mother of All Demos really interesting is Doug's hat tip to the furniture company at the end, who helped to construct the objects that he was actually interacting with in order to make everything work. And I think that the scope of change we should expect in the interior spaces we live in will be at least as large as that, if not orders of magnitude larger. So, those are those things. I actually can't remember what I was originally trying to talk about, so my apologies.
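For the curious, a minimal sketch of the kind of pipeline Brandel describes, assuming Tesseract.js (v5-style API) and a captured frame of the Zoom participant list; the mic-detection heuristic, pixel coordinates and colour threshold are guesses for illustration, not anything from his actual code.

    import { createWorker } from 'tesseract.js';

    // Draw the current video frame into a canvas so it can be analysed.
    function grabFrame(video: HTMLVideoElement): HTMLCanvasElement {
      const canvas = document.createElement('canvas');
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext('2d')!.drawImage(video, 0, 0);
      return canvas;
    }

    // OCR the participant names out of the frame, one per line.
    async function readParticipantNames(frame: HTMLCanvasElement): Promise<string[]> {
      const worker = await createWorker('eng');
      const { data } = await worker.recognize(frame);
      await worker.terminate();
      return data.text.split('\n').filter(line => line.trim().length > 0);
    }

    // Crude "is this row's mic icon lit?" test over the column of pixels
    // holding the microphone indicator (x offset and threshold are guesses).
    function micActive(frame: HTMLCanvasElement, rowY: number, micX = 8): boolean {
      const [r, g, b] = frame.getContext('2d')!.getImageData(micX, rowY, 1, 1).data;
      return g > 150 && g > r && g > b; // e.g. a green "speaking" highlight
    }

Sampling micActive per participant row on every frame, and recording the timestamps at which it flips, would yield the speaking time codes he mentions.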

Frode Hegland: [01:18:42] I do have something really important that I think I just noticed, before maybe we go off the record to talk a bit more about Alan's next hour. And that is the thing I said earlier: I think we're going to see in VR something similar to the OS wars and browser wars we've had before. [01:19:00] So the question becomes, for us, as people at least to a degree interested in interoperable spaces: we don't want to build anything necessarily for only one of the big parent companies. And we all know the story of how Microsoft Office went to the Mac and how that was an issue, whatever. So the question now becomes: if all of us are on separate devices, let's say ten years in the future, in an actual virtual room — let's say it's a full virtual room, for simplicity's sake — and let's say we're looking at a model that Adam has built. We, as kind of pirates — the friendly kind, but that's another story — have to decide: what do we expect these hardware companies to allow us to own? Do we own the room, or only Adam's shape on the table? Or do we own a description of the room, so we can all render it in our own ways, like we have with emoji today? What is [01:20:00] the space that we can create that isn't owned by the OS? I'm not saying we should answer it now, but at some point we should probably address it, right? Because otherwise we are just building Meta's buildings, or whoever's. And the browser will, of course, be a part of it. But will we be able to literally step into a browser, like in a cartoon? Maybe, maybe not.
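To make the "description of the room" idea concrete: one could imagine a small, vendor-neutral document that each platform renders in its own way, much as emoji code points render differently per platform. A speculative sketch follows — every field name and URL here is invented for illustration, not any existing format.

    // Hypothetical vendor-neutral scene description: the group "owns" this
    // document, and each OS or headset decides how to render it.
    interface SharedObject {
      name: string;
      meshUrl: string;                            // e.g. a glTF asset hosted anywhere
      position: [number, number, number];         // metres, room-relative
      rotation: [number, number, number, number]; // quaternion
      owner: string;                              // who may edit or remove it
    }

    interface SharedRoom {
      id: string;                 // stable identifier, not tied to any vendor
      objects: SharedObject[];    // e.g. Adam's model on the table
      participants: string[];     // portable identities, not platform accounts
    }

    const room: SharedRoom = {
      id: 'future-text-lab/friday-room',
      objects: [{
        name: "Adam's model",
        meshUrl: 'https://example.org/models/adams-shape.gltf',
        position: [0, 0.8, -1],
        rotation: [0, 0, 0, 1],
        owner: 'adam',
      }],
      participants: ['frode', 'adam', 'brandel', 'mark', 'peter', 'alan'],
    };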

Adam Wern: [01:20:28] But isn't that what we can do at the moment? Have you tried the WebXR experiences? Because that is what I've explored most here — I haven't looked at the apps so much. I would go into the web rooms, things that are owned by you or a group. So it's a web page, but a room that you step into, an interactive room. And I think that is the coolest thing of all, and one of the most democratic [01:21:00] as well. It's a real pirate room, if you want.
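The entry point to the kind of "web page you step into" Adam mentions is the WebXR Device API. A minimal bootstrap looks roughly like the following sketch (it must be triggered from a user gesture such as an "Enter room" button, error handling is omitted, and the render loop is only stubbed):

    // Minimal WebXR bootstrap: turn a web page into a room you can step into.
    async function enterRoom(gl: WebGLRenderingContext): Promise<void> {
      if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
        throw new Error('Immersive VR is not available in this browser/device');
      }
      const session = await navigator.xr.requestSession('immersive-vr');
      await gl.makeXRCompatible();
      session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
      const refSpace = await session.requestReferenceSpace('local-floor');

      session.requestAnimationFrame(function onFrame(_time, frame) {
        const pose = frame.getViewerPose(refSpace);
        // ... draw the room once per view in pose.views ...
        session.requestAnimationFrame(onFrame);
      });
    }

Because this is just a web page, the room is hosted and owned by whoever serves it — which is what makes it "democratic" in the sense Adam describes.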

Frode Hegland: [01:21:06] I really like that. Does that mean that what we should probably try, as a group, is to think about how to reinvent the web in VR? Is that the angle that's interesting here — is that what that means? Um, I just want to throw a wrench into the thing for a moment, as I tend to do. Maybe thinking about this in terms of the optimal experience, or ideal experiences or whatever, is wrong — perhaps, but not necessarily, wrong. To put a different framing on it, let's think about mathematical notation and how that came about. It's certainly not ideal, right? It's [01:22:00] certainly counterintuitive and hard for people to get into, but it solved incredible problems over the centuries: a new Greek letter to stand in for a state that was so much harder to express, or logarithms, and how wonderful that was. So rather than asking, hey, what's a better way for us to talk and exchange ideas, ask: how can we take this massively cumbersome problem and turn it into a log? That's kind of how I'd like to think about this space. Maybe that is the same thing we're talking about. A big learning curve, but interesting. I got that link, ok. Anyway, it's something to keep in mind, and it's really good to know that there is a solid WebXR thing happening. What do you guys think — I'm wondering if maybe we should talk in private [01:23:00] with Alan a little bit about what he posted in the chat about his next meeting? Are we all cool with that for a little bit of closing time? It's not a big deal either way; I think it would be useful for all of us, because we then kind of get to think through that particular issue. Ok, no one said no.

Brandel Zachernuk: [01:23:21] Happy to chat and share what I can about it. I mean, you work for Twilio, but big tech — like, bigger tech — is even bigger, so, yeah, I can tell you what I can tell you about all of that stuff.

Frode Hegland: [01:23:35] Ok, I'll stop recording now, and I'll try to edit the transcript a little bit to remove some of that before posting it. Ok?
