Transcript: 7 March 2022

Video: https://youtu.be/4LJvyxlF4co

Chat Log: https://futuretextlab.info/2022/03/07/chat-7-march-2022/

Frode Hegland: Hello. Decided not to waste any video recording myself drinking coffee. So, oh, Peter, good to see you. Well, how are you?

Peter Wasilko: I’ll sort of be a fly on the wall today. I’m still recovering from the ordeal. Yeah. How are you doing? I have to help mom with a lot of stuff now.

Frode Hegland: Of course. How are you doing in general?

Peter Wasilko: Oh, I’ve been playing a lot of what ifs, you know, imagining all the other scenarios where things might have worked out better, but. It is what it is.

Frode Hegland: With all due respect, that is a fool’s errand. Having gone through losing my best friend, my father, my mother-in-law, and now another very close friend is suffering from late stage cancer. I’m telling you it’s a fool’s errand, not out of disrespect, but out of going through the same myself. As you said, it is what it is.

Mark Anderson: But it’s hard. Been there too.

Frode Hegland: I think it’s one of the last taboos of our culture. Of course, losing a child is unbearably awful to even think about, but losing parents, which we all will, is so truly awful, and it’s not really part of what we’re allowed to look at. The only long discussion I’ve really had on this, on losing my father, was soon after, when I was in California at a big conference related to Doug, but it was something else the next few days. And I just had one of those attacks, I don’t know if you’ve had many of them yet, Peter, where you can just hardly breathe, and the only person I could talk to was actually Tim Berners-Lee. Turns out he had lost his mother not long before, so that was very bizarre, to be commiserating with someone who even my father knew of. Wish I could have told him, but anyway. When I lost my father, Peter, one of my friends sent a strange reply to the news. He said, now your father will never know what you accomplish, which was odd. That was from Bjrn Surf. So I asked him when he lost his father. Fourteen. So his father never knew hardly anything of his life. Anyway, let’s hope there’s more people in the room so we can get back to politics. And if you don’t know that reference, I’m deeply ashamed of you. That’s a joke, by the way. Are you getting any opportunity to relax at all, Peter? Are there any walks you can go on, and things?

Peter Wasilko: Well, we weren’t getting much sleep near the end. Every time he stirred, we had to go and check and make sure he didn’t lose balance whenever he moved anywhere. So I was running on like three hours’ sleep a night. But now we’re sleeping normally again, well, as normally as you can sleep after something like this, but we’re getting a lot more hours. But I feel, you know, really drained, caregiver burnout. Now they’re finally starting to take the mask mandates off, so we can actually go out in public without one of those wrapped around, blocking our breath.

Frode Hegland: That helps. You should be careful, though, if you’re in a crowded space. The new Omicron variant, the BA.2 or whatever it’s called, is pretty nasty. We don’t wear masks all the time, but if we’re somewhere crowded, we still do. Just don’t get sick on top of this, Peter.

Peter Wasilko: Well. After the hospital told us that he’d passed away, and we got home about an hour or so later, we got a robocall from the hospital: good news, General Soko does not have COVID.

Frode Hegland: Oh, my God.

Peter Wasilko: Right. Hmm. And now, the only good thing about that is that it means that mom and I have to be free from COVID too, because if we had it, we would have given it to him and he would have tested positive.

Frode Hegland: You would think that, but I’ve actually had quite a few friends where only one person in a relationship got it, even if they sleep in the same bed. I don’t know how that happens, but yeah, quite a few times. Very, very odd.

Peter Wasilko: Though you’d think mom would have been sure to have given it to him if she had it.

Frode Hegland: Yeah, I would have thought so, but a few times either it doesn’t happen at all, or they get symptoms after the other one’s symptoms have already gone away; the incubation period can be odd. My trainer, her husband-to-be got it last week, and she was fine, no problem at all. And their baby is about one. And then, boom, quite a few days later, they got it too, which meant that they didn’t come for Hamilton this weekend. But the good news is they’re coming this coming Saturday, so there is that to look forward to. I mean, in this country, our leadership is so stupid that, when it comes to mask laws, you know, first of all, the government doesn’t listen themselves. So I think it goes both ways: now that there is absolutely no mandate or law at all on anything, we have to take it upon ourselves to be a bit careful. Well, you know, that’s, yeah.

Peter Wasilko: We’re really looking forward to spring. The first of the snow crocuses are blooming; dad missed them by literally a day.

Frode Hegland: There’s none here, no crocuses.

Peter Wasilko: They haven’t been here long; they’re just starting to come up.

Mark Anderson: Yes, spring is on its way. On my recent brief break down in Cornwall in the west of England, which is always ahead of the rest of the country in the east, the primroses were quite amazing. And the most amazing thing of all is when you see a sort of signature collection of magnolia trees. That’s quite something, when you see really, really big, tall trees just absolutely covered in flowers. It’s quite surreal, but it lifts the spirits.

Peter Wasilko: It’ll be nice for her to get the garden going.

Frode Hegland: Here in London, it’s going to be one degree tomorrow morning, at night. One. Hope it doesn’t go to freezing, or the plants will just go to shreds.

Peter Wasilko: Hello, Brandel. Hi, Brandel.

Brandel Zachernuk: Morning, Peter.

Peter Wasilko: Good morning. Trying to get back on the horse.

Frode Hegland: Yes. So now that we have a quorum, or whatever, have you guys had a chance to look at the document I sent? Do you have any comments? I only sent it a few minutes before, so I expect not.

Peter Wasilko: And I didn’t check e-mail yet.

Brandel Zachernuk: I saw that it existed. I have not got to it; I only just got up.

Frode Hegland: I just want to do

Mark Anderson: A thing here. Impressive who with us at all?

Peter Wasilko: Oh, and in another bizarre incident: had it been one day later, the nine one one service on our 3G cell phone would have been turned off. I wouldn’t have been able to dial in for help; it would have taken twice as long to get to a telephone. And then I would have been blaming myself because I didn’t have a phone that was working on the day that I needed the phone.

Frode Hegland: Yeah, those coincidences are amazing. Bruce Horn, some of you may know from our community, and he was also the guy who wrote the first Finder on the first Mac. He was cycling a couple of years ago and just collapsed on the bike. And if there hadn’t been a couple of people there who saw him, who called the right number to get the helicopter to the right place, there was no chance in hell he would have made it. So that was a positive series of events, just amazing. Now he’s just normal and healthy, but if any single link of that chain had failed, he would not have made it. Awful. Yeah. So the reason I sent this document is obviously it would be great if we could make a major demo that we all agreed on. But I think we should at least outline a demo of things we think should be doable in such an environment, and then also write down, line by line, why it cannot be done yet. So these are the issues that would have to be sorted out, and hopefully that can help us get some advisers and sponsors.

Mark Anderson: Yeah, it makes sense. The plan at the moment, now that I’m back into things, is to use Noda and take some of the data set we played around with last year and start putting small amounts in, just to test the import and export and see how far we get. Well, hopefully it all goes in, I just don’t know, but I’m sanguine enough that I’ll take it a bit at a time rather than just dump everything in. Good idea, because I think it’s an interesting dogfooding exercise. And then we can say, well, even if all the data goes in, can I do anything meaningful with it? I think one of the things I found myself missing at the moment is having something that I vaguely know and understand to play with, where I know where the ambiguity is, and whether I’m seeing everything I thought was there or not. To put it a better way: I shall be interested to see if I can find the things I know are there in a new environment, in the way that I sort of intuit I might. And I’m sure I’ll be surprised in good and bad ways.

Brandel Zachernuk: Yeah. One of the things that Frode mentioned recently about Noda, that I agree with, is that I don’t know about the value of the great big geometric shapes and colors that the nodes require. It’s possible to create a semantic sort of framework for oneself that gives those things meaning, but from my perspective, I feel like they just take up much more real estate than is necessary, given the value of the labels. Unless we’re looking at the

Mark Anderson: Import, I mean, effectively all the things that you give a particular type, if you’re bulk importing them, will have the same customization. I think the trick is how you define what a type is; this is where I need to dive into detail. I have a suspicion that if you customize one thing of a type, it acts as a sort of generalized prototype, so all the other things of that type, whichever one you touch, will all be the same, which is kind of neat. So in a simplistic sense, I thought with the data I’m going to use, I could see the difference between, say, keynotes and short papers and long papers, which is a really trivial categorization, but one that at least shouldn’t be visually too confusing and too noisy. But I think you’re right, Brandel. And in this, I feel sorry for the people making the app, because the problem is it’s really difficult to start with anything other than a fairly small feature set and scope of things to do, because people have got to learn the app. And if you suddenly throw them three or four thousand objects, that probably is a bit of overload. So I guess it’s, you know, small steps.
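
Mark’s hunch, that customizing one item of a type restyles every item of that type, maps naturally onto prototype delegation. A minimal sketch in plain JavaScript; the property names are illustrative, not Noda’s actual data model:

```javascript
// Each imported item delegates to a shared per-type prototype,
// so styling the type restyles every item of that type at once.
function makeType(defaults) {
  return { style: { ...defaults } };
}

function makeItem(type, label) {
  // Object.create gives the item the type's style by delegation.
  const item = Object.create(type);
  item.label = label;
  return item;
}

const shortPaper = makeType({ shape: "card", color: "grey" });
const a = makeItem(shortPaper, "Paper A");
const b = makeItem(shortPaper, "Paper B");

// "Customize one thing of a type": change the shared prototype...
shortPaper.style.color = "blue";
// ...and every item of that type now reflects it.
```

Whether Noda actually shares a prototype or copies styles per item is exactly the detail Mark says he needs to dive into.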

Yeah, yeah. Cool.

Frode Hegland: Yeah, I think that’ll be one of the really important aspects. I completely agree, obviously, Brandel, that having funny colors and shapes isn’t necessarily that useful, but also, one of the articles in 1.2 of the journal is on this, which is nice to have as part of our dialogue, to kind of remind us of its use. My kind of dream is that one day we will be able to go into this space and put up an index card, and an index card can hold anything, but that is highlighted in the demo. There are so many decisions, and we’re nowhere near answering many of them, you know? One of the things Mark and I talked about this morning in our other meeting was: how do you hide things? That’ll be a big one. And how do you save layouts, and so on? So the document is shared with you guys both, obviously, to have a look and to comment on the demo. If you want to rewrite it and write your own demo, of course you can do that; we can put it together. But also, whether you want to be listed, and if so, how. I’m very wary of putting people down in the community if they haven’t explicitly said so. So you’ll see there’s a huge big bit about me; that’s a bit not the way it should be.

Mark Anderson: Actually, apropos the thing you mentioned, Frode, I’d be interested to get Brandel’s take on it. When we were talking about this aspect of hiding things, one of the things it made me realize, in terms of some of the things I tinker about with, is this: sometimes a sort of default assumption is that if you say, well, I want to remove some information from a plot, graph, call it what you will, the system says, oh right, what you want is the whole thing redrawn without that information. So rather than just removing the presence, the weight, of some things, all the bits and pieces move around, and now you’ve lost your frame of context of where things are. And I think this is especially true when you’re looking for missing things. It’s almost easy if things are joined by lines or a visual relationship like that; it’s easy to say yes, A is connected to B. But when you’re looking for Y, you know, why isn’t a thing I expect to be here joined in some way? Why is it missing, or why is it not connected? It’s quite useful to be able to dial in and out different connections, and have a sort of view on that.

Brandel Zachernuk: There’s a really popular prevailing representation, if you’ve heard of React, the JavaScript system created by Facebook. It’s really taken the world by storm; SwiftUI is also a very similar sort of representation, insofar as it’s built around what they call diffing: taking the minimal difference between the previous representation and the next representation. I haven’t written much, or any, React, actually, and one of the things that I’m curious about is how it copes with attributes whose values continuously change, because it seems to be primarily stateful. My sense is that 3-D stuff, or at least 3-D stuff in my wildest, most optimistic imaginings, is based on dynamic constraints and relationships that aren’t so much, is it in this state or that state, but, what are the things that are governing the display at this moment? And to that end, I actually don’t know how amenable it is to the concept of statefulness, as it were. So I’m going to be interested in how that changes. People have made react-three-fiber for three.js, which is the library for creating 3-D things that I use on the web. But there’s a decent amount of essentially recapitulation of the existing stateful, fully governed style of application for people to get through before they, I guess, recognize the utility of constraint-based control of the display. I will say that it is cheaper than you would worry to just turn a bunch of things on and off, especially if they’ve already been constructed. But, yeah, I just know that the problems that you’re anticipating are ones that I think about a lot in the context of display paradigms.
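
Brandel’s contrast between diffed state and continuously governed values can be made concrete. A toy sketch in plain JavaScript, not React’s actual reconciler (which diffs element trees, not flat objects):

```javascript
// A toy version of the diffing idea: given the previous and next
// description of the display, emit only the minimal set of changes.
function diff(prev, next) {
  const changes = [];
  for (const key of new Set([...Object.keys(prev), ...Object.keys(next)])) {
    if (!(key in next)) changes.push({ op: "remove", key });
    else if (prev[key] !== next[key]) changes.push({ op: "set", key, value: next[key] });
  }
  return changes;
}

const frame1 = { label: "Node A", x: 0, visible: true };
const frame2 = { label: "Node A", x: 5, visible: true };
const changes = diff(frame1, frame2); // only x changed

// The worry about continuously changing attributes: if x is driven by a
// constraint evaluated every frame (x = f(t)), the diff is never empty,
// so the diffing machinery buys little over just applying the constraint.
```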

Mark Anderson: I see Peter has his hand up, but just briefly, to riff off that: it strikes me it’s almost as much a human conceptual problem as a design one, in the sense that, oh well, if I’m not showing something, then it shouldn’t be included in the construction of the relationships, which is to misunderstand what one’s doing, because really you’re playing with the visuals there. It’s just a view spec on the underlying data; the fact that some of it isn’t actually visualized at a certain time is a deliberate choice to make the picture clearer. Not that I want the whole thing thrown up in the air and redrawn in a version I don’t understand. But I see Peter has his hand up.
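
Mark’s point, that hiding is a view spec over unchanged data rather than a rebuild, can be sketched in a few lines of JavaScript (the node fields are invented for illustration):

```javascript
// Hiding as a view spec: the underlying data and its layout are
// untouched; the view just declines to draw some of it, so nothing
// else moves and the frame of context is preserved.
const graph = [
  { id: "A", kind: "paper",   x: 10, y: 20 },
  { id: "B", kind: "keynote", x: 40, y: 20 },
  { id: "C", kind: "paper",   x: 70, y: 20 },
];

// A view spec here is just a predicate over nodes.
function applyViewSpec(nodes, isVisible) {
  return nodes.filter(isVisible);
}

// Hide keynotes: the remaining nodes keep their positions exactly,
// unlike a re-layout that would shuffle everything around.
const visible = applyViewSpec(graph, n => n.kind !== "keynote");
```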

Peter Wasilko: Yeah, I just wanted to mention that this is sort of why I prefer Imba over React. In Imba, it will regenerate the entire scene graph for the web page on the fly, but it memoizes everything that it’s already processed. So as long as the parameters into a subcomponent haven’t changed, it’ll just grab the previously computed version. It doesn’t try to spend time figuring out what the diff is between the current view and the next view, and therefore, at least it claims in its benchmarks, it’s dramatically faster than React when we’re dealing with a dynamically changing environment. So I think it’s the sweet spot and the right tool for what we’re doing; that’s why I’ve been putting all of my development effort into that language. I left a link to it in the sidebar. But very cool: it can have things that will auto-render, and it’s adding a new system that will basically put observers on properties. So if a property on an element changes, that can automatically trigger a function that will go and modify the state on the fly. Basically, it pushes everything that you shouldn’t have to think about off into the background, so that you can just focus on the constraints that you’re putting in place for what you’re building.
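
Peter’s description of Imba’s approach can be illustrated generically. This is a hand-rolled memoized renderer in plain JavaScript, not Imba’s actual implementation (which memoizes real DOM nodes per component):

```javascript
// Memoized rendering in the spirit Peter describes: recompute a
// subcomponent only when its inputs change; otherwise reuse the
// cached result, with no prev-vs-next diffing step at all.
function memoRenderer(render) {
  let lastKey, lastResult, runs = 0;
  function wrapped(props) {
    const key = JSON.stringify(props); // inputs as the cache key
    if (key !== lastKey) {
      lastKey = key;
      lastResult = render(props);      // inputs changed: re-run
      runs++;
    }
    return lastResult;                 // otherwise: cache hit
  }
  wrapped.runs = () => runs;
  return wrapped;
}

const renderCard = memoRenderer(({ title }) => `<div>${title}</div>`);
renderCard({ title: "Hello" });  // renders
renderCard({ title: "Hello" });  // cache hit, render not re-run
renderCard({ title: "World" });  // inputs changed, renders again
```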

Frode Hegland: In purely visual terms, though, this is going to be a huge issue. As I was working on that demo text, to make it real, one of the things is you would probably not want to take every single piece of data and metadata freshly into a new space every time. So what you may very well do is define one wall as a timeline; that’s always where you will have a timeline. And maybe you have a bookshelf wall, where you can then say, when I open this particular document, or room, or whatever the language we’ll use is, then I want you to only show books that this document has cited, for instance. You know, memory palaces are based on being static, so that we can remember where things are. So the idea of being able to hang things on specific things I think is becoming quite useful, and I think that helps us deal with our memory issue and ownership issue of where things are, so that we have kind of like spawn points, to an extent. But then, after a while of having more than one person in a room especially, and you put this here and put that there, there’s got to be a really nice way to say, OK, save that view somewhere, except for this bit, and then you can come back to it, just like dealing with tabs and other things on a normal computer, but in a much, much, much richer way. So, yeah, that’s the kind of thing I hope we can build and experiment with. The demo experimenter’s name is Joe, obviously based on Doug’s 1962 paper, but female; we do need more females in the group, so we do have to do a bit of outreach there. She’s trying to make a report of the work that we as a community have done over the year. So that’s proper dogfooding, right?

Brandel Zachernuk: I haven’t read the document yet, but I intend to. What I’m doing today is just putting the intermediation between the parsing of a file and the populating of the visual display of it, such that it can happen on multiple clients, using that same socket connection. So, sort of the same way that makes slow mirror has the ability to get hands on hit things, and various sneaky paste, which is not my choice of words, is able to broadcast those between devices, doing the same thing with the parsing of the Author file, such that whatever we have currently in place, we have the ability to shuttle over a network. And then, hopefully, that should mean that it will work: if you have a connection open on a Quest, and then you drop the document on another client, it’ll be able to be populated by way of the messages being passed over the network.
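
The relay Brandel describes, one client dropping a document and other clients being populated over a socket connection, reduces to a broadcast pattern. A sketch with plain objects standing in for WebSocket connections, so no server is needed:

```javascript
// The relay idea: when one client sends the parsed document, forward
// it to every other connected client. The client/relay shapes here
// are invented stand-ins for real WebSocket connections.
function makeRelay() {
  const clients = new Set();
  return {
    connect(client) { clients.add(client); },
    broadcast(sender, message) {
      for (const c of clients) {
        if (c !== sender) c.receive(message); // skip the originator
      }
    },
  };
}

function makeClient(name) {
  return { name, inbox: [], receive(msg) { this.inbox.push(msg); } };
}

const relay = makeRelay();
const desktop = makeClient("desktop");
const quest = makeClient("quest");
relay.connect(desktop);
relay.connect(quest);

// Dropping a document on the desktop populates the headset's view.
relay.broadcast(desktop, { type: "document", body: "...parsed file..." });
```

With real sockets, `receive` would be the message handler and `broadcast` would live on the server; pushing changes back, as Brandel notes, is the same pattern in reverse.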

Frode Hegland: Well, please continue; as I was going to say, that sounds amazing.

Brandel Zachernuk: Yeah, hopefully. And then, yeah, that means that we’ll be able to see things in the VR environment and then start thinking about any sort of relevant onward actions. One thing that doesn’t necessarily do is push changes back. But once we decide what’s relevant and what sort of actions to take on that body of data, then we would have the ability to push those back.

Frode Hegland: Very interesting indeed. Even the simple thing of data matching will be very interesting. So, I just dropped the speaker, that was so clever. If we have a timeline wall, and we have a mural, how can we get stuff in and out of there easily? That will be interesting over time, because, you know, like a map on the table: movies can do it really easily, but how do we change the layers? I’m not trying to change the subject. I’m just saying all these things: if we have a placeholder thing and we can move it, how do we add to it? So I’m going to pick up my speaker.

Mark Anderson: Are you talking in terms of direct placement? Because I’m just noodling through the thing. Okay, so I happen to have a virtual map in front of me, and I have an information object that exists outside that space that I want to put into the map. In the sense that if I had a map on the wall and I wanted to stick a pin in it, I’d go to the box of pins and get one out. So, I guess my intuition is that the data wouldn’t just appear out of nowhere. I would basically reach into the data source. Perhaps at that point it might literally be a physical action of picking up something, effectively from an inbox, so to speak, and then putting it onto the map. And then it would be interesting to see how much cleverness there is: oh, all right, this is a data object and it’s a geographical object, and I know what to do with that, because I’m this display space, I’m a map, and therefore, if I have geographical information, I will put it in a particular place in that display.

Frode Hegland: I have a feeling it will maybe be split between adding data and interacting with it, because Fabian said something really important recently. He said making these things in VR, being able to update them in real time at the speed of thought, as we think, is really, really important. So yes, we should be able to put a pin down on a map, for sure; that’s important, and maybe draw lines and routes. But when it comes to, let’s say, adding weather data, which is not an unreasonable thing to do, you’ll probably, at least for a while, go out of VR into a normal computer environment and find a way to do it there. But you know, these things have been possible for decades now to do in 2-D environments, and they haven’t often been done. So I think we should provide the tool sets for how the layers communicate with each other.

Mark Anderson: Sorry, I talked over you, I apologize. I was just thinking, in terms of what you were saying about weather information, the first question is whether the weather data is available. Because to a certain extent, the current way we tend to do this mapping is that it becomes effectively a visualization layer that’s laid across a map. So would it need to work differently in a 3D space? Because if the data is available, it doesn’t have to be taken in and out of the space; the data, in a sense, is available already. It hasn’t got to go from a place to a place.

Frode Hegland: Well, the problem is that even today, let’s say when I was trying to plan my drive across Poland, and I got help with all the places Tesla puts its charger stations, there were things I may have wanted to see, because it’s a long drive, you know, the weather data on top: are there going to be areas that are going to be so cold that I’m worried about my charging, for instance? That’s a reasonable thing to ask, and that data is absolutely available, but I haven’t seen any maps where you can really add layer on layer. Similar with timelines: I think time formats tend to change depending on where you are. So let’s say, for the demo here, I’ve written that Joe has a timeline that, when she is working on this project, is one year, meaning it’s from the beginning of this project, because that’s all that this project cares about, but she can still zoom in and out of it. So in that year, depending on how we do it and in whatever dimension, we should be able to add layers of our own dialogue, plus, horribly enough, the news or Twitter, you know, what’s the context of what was going on in the world? Etc.

Frode Hegland: But to be able to even get that in. I mean, as you pointed out, Brandel, with spreadsheets: a lot of people know how to do spreadsheets, but they don’t know how to program. I’m not even great with spreadsheets; I can do the basics. But maybe what we need to do is consider a way of letting Excel data, through CSV or whatever it might be, go in and out of this environment, so that people who can be bothered can maybe even provide that for other people. Because what scares me is we have two jobs, I think: one, raise the imagination potential; two, try to help with infrastructures. And the first one can get out of hand if we’re good enough, and there’s a lot that’s really hard to deliver on. But what you’re doing with the Author stuff is very good, because there are plenty of issues with it, but for us to interact with some real data in a real space, even if it’s simply a 3D graph, I think will help us think about these things. Just like Immersed, which is, in the way we’re talking, very primitive, but it’s real, and it also helped us a little bit.
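
The CSV bridge Frode suggests could start very small. A deliberately naive parser as a sketch (no quoting or escaping; a real pipeline would use a proper CSV library):

```javascript
// A minimal CSV-to-objects bridge of the kind described, so that
// spreadsheet users could feed data into the space. Assumes simple,
// unquoted CSV with a header row.
function parseCsv(text) {
  const [header, ...rows] = text.trim().split("\n").map(line => line.split(","));
  return rows.map(cells =>
    Object.fromEntries(header.map((name, i) => [name.trim(), cells[i]?.trim()])));
}

// Hypothetical weather-along-the-route data exported from a spreadsheet.
const records = parseCsv("city,tempC\nWarsaw,-2\nKrakow,1");
```

Going the other way, serializing the scene’s objects back to CSV, would let people round-trip between the environment and tools they already know.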

Mark Anderson: I can see a sort of divide, but there’s probably a crossing point that doesn’t need to be hard-edged. Going back to this idea of weather data or something: there’s a difference between, well, if I have made my own very specialized, localised weather data or something, which in today’s terms means that I have a file of my own, probably on my computer, then putting that in becomes a file-transfer thing. Whereas something at the level of, you know, show me all the Tesla charging places in Poland on a map with the weather overlaid, is something I would envisage, in this day and age, coming from some sort of an API. So there’s a different sort of interaction. And I suppose what I mean is there’s this fuzzy boundary between stuff where I’m doing the equivalent of physically placing it into this enhanced space, and stuff attached to the web, the internet, the whole.

Frode Hegland: Yeah. But the problem is, and I’m just highlighting concerns here, we still have a lot to do in 2-D, so what about 3-D? You know, we are going to help where this kind of stuff is possible now, but it’s not really usable. I cannot make a map in 2D showing weather, Tesla stations and so on; it’s not really available to me, because I don’t have a place to place it. Google Maps: I don’t know how I could take weather data onto Google Maps, for instance. It’s kind of crazy. Of course, Google Maps and Apple Maps have to optimize for most users, but you should be able to add a lot of data, you know, pollution levels, crime statistics, whatever it might be, and it can’t be done. So it’s not a VR issue at all. It’s just that VR can really, really show it in such a richer way.

Mark Anderson: One thing that just occurs to me with weather, of course, is that when you draw weather maps, yes, you may have point observations, but they don’t lend themselves to it. You know, if I draw a map of Poland, I can draw where the roads are, because the roads don’t move; weather is effectively constantly on the change, and so we do approximations of it. But then again, there’s no reason that you couldn’t draw things like weather fronts. I mean, at some point a weather front translates to a line of some kind,

Brandel Zachernuk: But

Frode Hegland: Even weather radar, you know?

Mark Anderson: Yeah, but, well, OK. Yes, and that would be effectively like an opacity layer, if you were to draw the weather radar over the top. I suspect part of it is that, you know, especially someone like you or I would know how to do it. I mean, someone like me, given this big glob of data, my next problem is, what do I do with it?

Frode Hegland: But that’s entirely my point, that these things have been missed in 2D spaces. And if we’re going to look at having different kinds, by the way, a question for everyone: let’s say we are in a VR room now, full-on VR or not, it doesn’t matter, and one of our walls is just a timeline, and one part of the table is just a map. What do we call those things? Whiteboards, placeholders, connections, portals? I think it’ll be interesting to have a name for them.

Brandel Zachernuk: Well, one of the things that you mentioned about it before was that they’re essentially framing devices. So if you have something that has a temporal component, or a geospatial one, those can be framing devices, so they could be frames, in the sense that they have some implicit structuring function over data. So you could create frames that are geospatial, and you can have frames that are interpersonal. You can have things that constitute data presentations, and then you can have things that structure data presentation, and then you could intermingle between those by alternating the influence of them, potentially. I don’t know. Who is the person who has presented at a Future of Text symposium before, where there are weighted criteria, and if you move elements around, it modifies the parameters such that it creates a weighting that results in that node being presented in that position? Sorry about that.
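
The weighted-criteria display Brandel recalls can be stated simply: each criterion contributes an anchor position and a weight, and a node sits at the weighted average. A sketch with invented criterion names:

```javascript
// Position a node as the weighted average of criterion anchors.
// Dragging the node to a new spot would mean solving the inverse
// problem: finding weights that produce that position.
function weightedPosition(criteria) {
  const total = criteria.reduce((s, c) => s + c.weight, 0);
  const x = criteria.reduce((s, c) => s + c.x * c.weight, 0) / total;
  const y = criteria.reduce((s, c) => s + c.y * c.weight, 0) / total;
  return { x, y };
}

// Hypothetical criteria: "relevance" pulls three times harder than
// "recency", so the node lands three quarters of the way across.
const pos = weightedPosition([
  { name: "recency",   x: 0,   y: 0, weight: 1 },
  { name: "relevance", x: 100, y: 0, weight: 3 },
]);
```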

Frode Hegland: No, but it sounds very game like and therefore very cool and useful.

Mark Anderson: It wasn’t the people doing discourse mapping, by any chance?

Brandel Zachernuk: Oh yeah, potentially, yeah. It also sounds a

Peter Wasilko: Little bit like some of the ideas presented in Cyberspace: First Steps.

Brandel Zachernuk: The 1991, ’92 one?

Peter Wasilko: Uh, the Benedikt anthology.

Mark Anderson: It’s interesting how far back some of this goes. One of the things I actually did this morning: I ran a Google Ngram on “view specs”, only to find that it doesn’t turn up at all.

Brandel Zachernuk: Oh no. Yeah. It’s a word of that era. It’s one of those words where you can tell the provenance of somebody’s ideas from their dogged determination to retain the references to particular things. Same with “schlumpy”. Yes.

Mark Anderson: But the sad thing is, I haven’t found another term in use that just describes this notion of, I suppose, better visualizations: renderings of a common dataset, not just one but as many as you like, different renderings doing all sorts of different things, but all sitting atop common data. It’s mimicked in certain ways, because you could say, if I take a Word document and I show the page settings versus the WYSIWYG layout, that’s a different view spec. But I don’t think that’s where that comes from. The intent is that I just have the information underneath, but I want to see it as a timeline rather than as a geographical thing.

Brandel Zachernuk: In some sense, people do use the word, simply the word ‘view’. So in the software architecture frame of reference, people talk about model-view-controller, model-view-viewmodel. And in those contexts, a view is a read-only representation of whatever happens to be present, whatever happens to be processed, within a data model. And then you have the intermediation of a controller that is able to reach in and make modifications. But the view itself doesn’t. So there’s that one. Um. There was another one that I was thinking of. Yeah, so that has that, but it’s not necessarily... And sometimes people talk about view layers within the web. Obviously, there’s the idea of responsive web design, which is the idea that a web page flexes and changes as a result of its environment. But that’s over at most two continuous parameters, and typically only one, the width of the browser. But one of the things that some friends were talking about recently is that, for example, CSS, the system of styling that we use, is actually a much more robust abstract system, such that you can use it, and my friend did use it, as a mechanism for doing token parsing on an abstract syntax tree, to be able to intervene on shaders, shader programs, in a system he built in order to be able to introspect them.
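
A minimal sketch of the model-view-controller separation described above, with made-up class names: the point is only that the view renders the model but never mutates it, while the controller is the one component allowed to change it.

```python
class Model:
    """Holds the data; knows nothing about how it is displayed."""
    def __init__(self):
        self.items = []

class View:
    """Read-only: renders whatever the model currently holds."""
    def render(self, model):
        return ", ".join(model.items)

class Controller:
    """The only component that reaches in and modifies the model."""
    def __init__(self, model):
        self.model = model

    def add(self, item):
        self.model.items.append(item)

model = Model()
view = View()
controller = Controller(model)
controller.add("alpha")
controller.add("beta")
print(view.render(model))  # alpha, beta
```

The same model could back several views at once, which is roughly the "many renderings atop common data" idea Mark describes.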

Brandel Zachernuk: So a shader is a fairly weird and arcane system for programming, where rather than having iteration and loops, you write all pixels simultaneously, which means that it’s very fast if you have the right architecture, which is graphics processors. But the fact that you don’t have iteration in it means that it’s tremendously difficult to introspect. So what he did is took a system that allows you to intervene on the shader program and then inspect the contribution of individual numbers and values on the representation within it. And the neat thing is how he did it: he used CSS to identify which piece we wanted to look at and highlight that instead. And so it occurs to me that, to that end, CSS is actually a tremendously powerful concept, if only it can be broken out of its box of only being used, at best, to give somebody a dark theme or a light theme, and how wide is that browser, like that. So it might be an interesting sort of... and actually, I wrote about it a few years ago, it might be an interesting system for being able to identify constraints and the way in which you might want to have this thing or that thing be controlled.
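
As a toy illustration of the shader-programming model mentioned here (not real shader code, just a Python analogy): a fragment shader is in effect a pure function evaluated independently at every pixel, which is why GPUs can run all pixels in parallel, and also why stepping through it to debug is so awkward.

```python
def shade(x, y, width, height):
    """Compute one pixel's grayscale value from its coordinates alone.

    No loops, no reads of neighbouring pixels: each invocation is
    independent, so a GPU could run every pixel at once.
    """
    return (x / (width - 1) + y / (height - 1)) / 2

# The "GPU" here is just a comprehension applying shade to every pixel.
width, height = 4, 4
image = [[shade(x, y, width, height) for x in range(width)]
         for y in range(height)]
```

Because `shade` depends only on its own coordinates, there is no loop state to inspect mid-run, which is the introspection problem the CSS-selector trick was addressing.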

Frode Hegland: Hmm. It’s interesting to hear you talk about that because, obviously, you know, I’m old enough that when you said graphics, I thought you were going to say graphics cards. We used to have them as cards, as you all know, I’m sure. But anyway,

Brandel Zachernuk: They still are cards. They’re just most of the computer.

Frode Hegland: Yeah. Good point. Yeah, there’s going to be amazing things. And I think, OK, we’re missing quite a few people in the group today, which is OK. They’ve all been emailed the documents. But if by Friday we have some comments, then there’s a few people I want to send this document to. First of all, Tim Berners-Lee, because Vint has good access, and I think he would find what we’re doing really interesting. It overlaps with his highlighting of ownership in VR, though he thinks more of ownership of personal data; of course, they’re very related issues. So once we have a couple of good advisors, both young people and pioneers, then we can look a bit into funding. And once we have a little bit of funding, we can do a bit more outreach, cover some of our costs, but also hire people to do the programming that isn’t fun.

Brandel Zachernuk: And yeah, I think the fact that Vint is actually a bit of a skeptic about VR is also a tremendously useful resource, to really dig into figuring out what can be done to bring him around, because I think it’s a really valuable kind of provocation: what is it that VR does? My sense is, like, space; space is very valuable for being able to present options and flexibilities of representation. But that’s still a fairly diffuse answer.

Frode Hegland: Well, yeah, but that’s diffuse; that is where we are. And you know, when we talk about not knowing, we’ve all looked at it enough to know that there’s a huge amount we just don’t know. But I think, based on the work you did with the Liquid file, to be able to open up Liquid in an Oculus in a really easy way, through maybe one web link or something, I think I’ll just have to send him an Oculus and say: plug in, go to this address in the browser. This is an example. You know, it’s almost like it opens up and you have the glossary map here, you have the references maybe to the side, and then a graphic for the document, just something simple to indicate that. And so: OK, you can’t actually do anything now; please, can we have a million dollars so we can make it possible to do something with it? Yeah. Because it was just shocking how immersive this stuff is at the same time as being primitive. And Peter, as we have discussed by email, when you’re ready for your Oculus, we’ll make it happen. As long as you don’t walk around, you don’t get nauseous. At least that’s my experience and Mark’s experience. Yeah.

Mark Anderson: No, I think the most interesting thing was seeing a picture someone had taken of someone wearing it, doing this, which... I’ve noticed a good wave of fiddling with this interesting sort of temporary technical thing of the focal positioning. But it does, when you

Brandel Zachernuk: Stand in an odd way.

Mark Anderson: Peter’s got his hand up.

Peter Wasilko: Yes. Since my 3G cell phone is now officially dead, if anyone wants to drop me an email with advice on what I should replace it with, it’d be appreciated.

Frode Hegland: But what brand phone was it?

Peter Wasilko: Oh, it was just a Tracfone, old-style flip phone. I liked it because it reminded me of the Star Trek communicator from the 60s.

Frode Hegland: Emily just bought a new iPhone a couple of months ago, and we agreed that you should buy the cheapest and smallest, because at this point you don’t really need anything else. You know, we have big displays when we need them. Why... you know, I’m carrying around the 11 here. I got it through an insurance upgrade. It’s an amazing device. It’s just not necessary. I don’t know. What do you guys think?

Brandel Zachernuk: Uh, I haven’t bought a phone since 2011, because I get them... but the thing that I’m on right now is the 10S, the iPhone 10S. And yeah, I think that smartphones are very valuable, in that they are ready to hand for useful information, but they are in many places more captivating than they are useful. But I wouldn’t go without it. So I think you should get a smartphone. If you are already bought into aspects of the Apple ecosystem, then Apple makes it very, very good. I also actually get much more out of an Apple Watch than I would have expected. That’s kind of surprising, isn’t it? Yeah. I mean, it was my job to sell them, and make the website to sell them, for many years. And they were like, you have to have one to be able to understand it. So that was nice of them.

Frode Hegland: I think that, yeah, that’s good.

Peter Wasilko: What’s that lidar thing that some of the Apple phones have in them? And is it good for anything?

Brandel Zachernuk: Yeah, it’s exceptionally good for object capture. So, on the back of... did you say 11, or 11 Pro?

Frode Hegland: Oh, just the 11, small. Just that.

Brandel Zachernuk: Right. So that one doesn’t have the lidar, but the Pro models, the iPhone 12 and 13, have lidar on the back, and that means that you have a much higher fidelity capacity for object tracking on the back. The TrueDepth sensor on the front has a pretty good capacity for doing tracking, but it’s primarily oriented toward facial tracking, at that distance. It’s a shrunk-down Microsoft Kinect, effectively, very cut down, obviously, but as such it has structured-light sensors tailored toward specific distances and scales. So the Pro model is really, really interesting insofar as it’s got great cameras for photogrammetry, and, you know, you can call it stereo photogrammetry or whatever else, which allows you to do RGB-based, color-image-based construction of things. And the currently fairly clearly beta software that Apple released last year, Object Capture, is phenomenal for being able to create objects from that. So that’s the sort of thing that you can do at up to a quarter-acre scale; you can capture something the size of an entire house or a room.

Frode Hegland: You have to blur the backgrounds behind you guys.

Brandel Zachernuk: Yeah, yeah. So actually, what Apple calls portrait mode is partly from that, partly just the disparity mapping. So having three really good quality cameras that it has very clear control over, and knowledge of, means that it can do disparity mapping for depth estimation and things like that in a phenomenally accurate way. So yeah, I really think the iPhone is a great platform, particularly if you want to write stuff or do any sensing. And the Pro models, and the additional sensing capabilities they have, are limited only by the things that people can think of to do with them. And unfortunately, we’re at a point in technological history where there’s more technology to be had than people know what to do with. So it’s helpful if you can program, if you can write stuff that can control that stuff. But otherwise, there’s a long set of functionality that exists in most things, and it will remain relevant for many, many years to come in terms of what people will be able to envision and build with later and other technology. So yeah, if I may sell an iPhone to you, then they’re great.
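
The disparity mapping Brandel describes rests on the standard pinhole-stereo relation, depth = focal length × baseline / disparity; the camera numbers below are made up purely for illustration.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo relation: depth = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two lenses, in metres
    disparity_px -- horizontal shift of a feature between the two images
    """
    return focal_px * baseline_m / disparity_px

# A feature that shifts 10 px between two lenses 1.4 cm apart,
# at a 700 px focal length (hypothetical values):
depth = depth_from_disparity(700, 0.014, 10)  # about 0.98 m
```

The closer an object is, the larger the disparity, which is why a tight, well-calibrated camera cluster gives such accurate near-field depth.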

Peter Wasilko: Like, could it create a model of something to get inside Blender? So you actually scan something in 3D and then get it inside Blender, so that you could do something with the model?

Brandel Zachernuk: Yes, by all means, by all means. So, you know, Apple is working... so Object Capture is this application, like I said, released last year. It’s such that you can expect it will be integrated in ways that are much more seamless for people to use. It was made by an R&D division, and it looks like it. I mean, it looks like Apple in the innovation, but it’s not like Notes or Mail yet in terms of that kind of level of finish. But it allows you to export USD or USDZ, which now have first-class interoperability inside Blender, so that you can open those and animate them, export them or whatever else. I believe, I haven’t done it myself, I don’t have a lidar iPhone yet, that it also does some level of construction of the physically based rendering channels, the PBR channels, that you can then use to estimate things like roughness and metalness, rather than just the pure albedo. Which is a funny word, but it means the sort of reflectance terms, what things look like, so that you get some rough sense of whether an object was actually shiny and things like that. And that’s phenomenally important, because it means that the objects you create are just so much more flexible and versatile for being put in different lighting environments and still giving a sense of the materiality of those things. So, yeah, really, really neat. And there’s a lot more to come from that group in terms of the ease of use of that, as well as the sort of domain of application. One of the things I hope they can do at some point is reduce the polygon count of those things in a semantic way and improve the topological flow of the vertices on it. But that’s a little arcane for this discussion.
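
The PBR channels mentioned here can be sketched as a simple record (the field names are hypothetical; real USD materials carry more): albedo is the base colour, while roughness and metalness describe how the surface reflects light, which is what lets a scanned object be relit convincingly in a new environment.

```python
from dataclasses import dataclass

@dataclass
class PBRMaterial:
    albedo: tuple      # base colour (r, g, b), each channel 0..1
    roughness: float   # 0 = mirror-smooth, 1 = fully diffuse
    metalness: float   # 0 = dielectric (ceramic, plastic), 1 = metal

# Two illustrative materials: same idea, very different relighting behaviour.
glazed_mug = PBRMaterial(albedo=(0.8, 0.2, 0.2), roughness=0.1, metalness=0.0)
brushed_steel = PBRMaterial(albedo=(0.7, 0.7, 0.7), roughness=0.5, metalness=1.0)
```

With albedo alone, both objects would look equally flat under new lighting; the extra channels are what preserve the "shininess" Brandel describes.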

Peter Wasilko: Do you think the current generation of hardware will keep getting better as they improve the software going forward? In other words, do you think it’d be something where, in a year, I’d want to trade in the current generation for something new, or would it be good for a while?

Frode Hegland: Only if it works better in concert with the AR/VR headset. Do you agree with me, Brandel, if they’ve made some special provision?

Brandel Zachernuk: I would say it depends. Like, a lot of people want a new phone every year. But you won’t be left holding the bag in terms of the relevance of the device. Like I said, I’m still keeping up with my iPhone 10S, which is 2018, I think. Yeah,

Mark Anderson: I’m on a 10 here and not feeling left behind.

Brandel Zachernuk: Right, right. So, to that end, there are features that have changed. So the 10S no longer does ray-traced shadows in AR Quick Look things, for example, whereas on the 11 and above the ray tracing of those things has a slightly more robust algorithm for depth-of-field estimation. But those things are provided relatively seamlessly. So, you know, you can be relatively guaranteed that the longer you wait, the better the phone will be that is available. And you can pretty confidently say with Apple stuff that there’s not ever going to be a moment where, if you buy something that is new now, it will be irrelevant in six months. You can no longer use the iPhone 6, essentially, but that was the 2014 vintage, so that’s just eight years old. But you can use the 2017 vintage that Mark has, the 2018 one that I have. So, yeah,

Frode Hegland: Just like with desktop publishing back in the day, and basic word processing back in the day, there was a point where a computer could do it, so the next year’s model wouldn’t actually improve any of that work. I think we’re currently at that point with smartphones for current use. There isn’t really... I mean, yes, the new one’s got a bit better camera or something like that. But what I think is really interesting is to properly look at all these different devices. Like Brandel was saying, he’s surprised at the utility of his Apple Watch. Me too. And Emily, my wife, she absolutely loves hers, and she desperately didn’t want it. I had to give it to her as a present for Christmas. She was really annoyed that I wasted all that money. And now, you know, one of the key features of it is clicking to hear a ping from your phone, because she can’t remember where she left her phone. But, you know, just getting messages... and Siri works phenomenally well on it. But another interesting thing is, I stupidly bought the small iPad Pro last year, hoping we could put Author on there.

Frode Hegland: Couldn’t do that. Just couldn’t afford it. So we have that; we don’t use it as much. But the iPad Mini, because that’s more portable, becomes of higher utility. Because, you know, there are so many devices now, they kind of overlap in use, so we kind of have to decide for ourselves where to use them. But I can’t imagine that this time next year these lidar and camera things won’t somehow be used with the AR or VR headsets. You know, it should be trivial to scan your own room at high quality, to begin with. And, you know, I expect the Apple AirPods, which are phenomenal, which have spatial audio, which is so important in VR, will probably be used too. The Apple Watch, I don’t know how it will be used, but it’ll probably be used. I think that’s one of the key things Apple will do: come up with a headset that works well, but if you use it with these other additional expensive Apple devices, it’ll give you more.

Mark Anderson: Yes, because you mentioned it, I was trying to work out the newer AirPods’ spatial thing, and I think, so too with the phone, that needs something newer. So one thing that gets confusing now, given that phones are essentially good enough, is that there are several different things rolling along. There’s how much slower your phone gets over time, as the newer OS adds more powerful things onto it, because it’s predicated on a slightly faster chip. And there’s whether, with your usage, you fall into the group that tends to wear through the battery in very short order. Well, that’s not a problem I personally have, but I know some people do. So really, the point at which I think you end up upgrading is: is it worth putting a new battery in or not? Or is it actually time to get a new phone, or you ding the screen or something? Looking back, that’s basically how it has come about. It’s not been, OMG, must go and stand in a queue outside the store.

Frode Hegland: It’s really interesting to have this discussion in our group right now, considering our VR focus, because in a VR space you can have any darn phone you want. And I think that’s really, really important, because in a VR space you will need something that, mentally at least, is tangible to interact with, at least for a period of time. Maybe at some point we’ll be floating in colors, which could be hugely valuable over time as our bodies and brains adjust to it. But that’s why we’re now talking about these frames or, as Peter suggested, I just saw in the chat, monitors. You know, there will be things in VR space, and we need to handle them and interact with them, and we can fantasize as much as we want. But until we can test a few, it’ll be just pure speculation.

Mark Anderson: I think it has to be, because, I mean, it’s a chicken-and-egg problem, really. Not until you’ve built it do you discover it was either much easier than you thought or, more often than not, much harder than you thought. Exactly then the affordances begin to suggest themselves.

Frode Hegland: Yeah, exactly. That’s why we have to get money, so that Brandel can do the interesting stuff. But there is so much behind the scenes, not only data transfer and storage, but, you know, how things relate to each other. There’s so much that needs to be done. And if we have something viable and useful and interesting, maybe we can even patent some of it to make sure it stays open. So if some of the big guys try to do the same things, you know, we have this now, for posterity or whatever. Because, you know, when I close my eyes and try to be this Joe character, it’s like: OK, I have that there, and I have that there. It’s so limited. So, yeah, thinking in terms of devices. So Peter, the question is: which phone device would you like to have in your VR world?

Peter Wasilko: And on that note, I should be bowing out, I have to help mom with a lot of paperwork now.

Frode Hegland: OK. We are thinking of you. It’s tough, but we’re thinking about you, Peter.

Peter Wasilko: And I don’t know what’s going to happen Friday, but if possible, I’ll try to come in, at least for a little bit. So we’ll see how it goes.

Frode Hegland: It’ll be nice to see you, but don’t feel any obligation. And if you want to meet separately, you know, we’re available, obviously.

Peter Wasilko: Ok, thanks a lot, guys.

Frode Hegland: See you. Thank you.

Peter Wasilko: Yeah, I really appreciate all your warm thoughts. Means a lot. Good. But for now, I...

Frode Hegland: I don’t know if you knew, Brandel, but Peter recently lost his father, so that’s what that was about.

Brandel Zachernuk: Yeah, that’s very hard. Yeah, it is,

Mark Anderson: And it’s difficult. It’s sort of one of those things where something happened that, in the grand scheme of things, really was probably inconsequential, but it just feels like, oh, if only I’d done X.

Brandel Zachernuk: Yes, no, I have been involved with people sort of going: what if those things had been like this? And what if those things had happened like that? They weren’t. And it is difficult. But yeah,

Frode Hegland: I wonder, in terms of culture and emotion, at what age, when you lose your parents, you’re no longer an orphan, you’re just an adult. It’s all very bizarre. Yeah.

Mark Anderson: Well, it’s funny you say that. I mean, I think back... certainly, I suppose, I lost my mother much younger, so that’s a different thing. But, you know, when my father died, in adulthood, there was a sense, OK, not because he was the last parent, but the sense of: OK, so now we pass through a door you don’t come back through. Now there is no one to lean on. In a sense, you are near the top of your little evolutionary tree. And that’s a sort of... I don’t think people talk about it much, but it was definitely an emotion or a sensation that I recall at the time.

Frode Hegland: Yeah, I’m scared shitless. I’ve lost my father. My mother is now in her late eighties. She’s doing very well, but at some point that’ll happen. So it’s tough. But I have to say, with the work we’re doing... you know, sitting here in Europe, Mark and I, and one of my close friends is set to go to Poland to pick up his parents, and watching this quite close by, I just think what we’re doing is just much more important. You know, we have a magic accelerator... I don’t know what the fancy word is. Magic potion, almost. Elixir. Yeah, elixir, thank you. And human beings, our greatest power is imagination. It really is. And we are about to waste it. And Brandel, that’s why your deep dismissal of the current state of affairs, and questioning, is so vital to this. And also the whole lab thing we’re doing. You know, my background is as an artist, Chelsea School of Art, human-computer interaction, blah blah blah. I want to make a Montblanc pen. I want to make another version of Author. I want to create tools. That’s what I want. But I know what we need is to support each other, and the dialogue that can help other people make better tools.

Brandel Zachernuk: So, yeah, well, that’s the thing that I think is important to key into with Vint’s objections: what are the basic propositions, and how do you communicate them clearly? Because, yeah, it can be easy to mistake the sort of fundamental shifts that virtual reality supports as just being dazzled by the novelty of it. Rather, it is dazzled by the novelty; it’s just that it doesn’t go away. It’s that once you realize that, then it can become mundane, and traditional computing can become not just mundane but often disappointing. I don’t know.

Mark Anderson: It may also be because, to a certain extent... I mean, I know there’s some work in medicine and in things like architecture, but by and large, probably to the woman or man in the street, it’s entertainment, games, which perhaps makes it seem a bit more frothy than it really is. But, you know, I look at the things that have just come along in my adult lifetime. Who’d have thought we’d all be walking around carrying mobile phones? I mean, it’s just insanity, isn’t it? No one’s going to do that. And that just passed. And living through the first 10 years of the web, when people would say: Well, what’s that? So you use a telephone to talk to people via a computer? Why don’t you pick up a phone? You know, it was so alien. I mean, I suppose that’s why I keep finding myself being drawn back to this notion of views and view specs, or whatever one calls them. This is a problem: I haven’t found a term for it from the human side rather than the software side.

Mark Anderson: I’m sorry, I keep mentioning Tinderbox, but by happy accident I find myself sitting in a community where there’s a tool that for a long time has offered that. And I still find myself explaining to people... well, you know, ‘I only use this view and I want to do that.’ And I say: well, why don’t you just do it? It’s the same data. And it takes an awful... I mean, this is the thing I keep reflecting on: I’m constantly relearning the fact that people find it really hard to consider that the same thing can be two different things, or more, at once. But that’s something that AR and VR are going to unlock, are going to bring even more to the fore. So, you know, from the human side of it, there’s an element of what we’re doing that clearly needs some addressing, because, to a certain extent, even if we get all the technology right, we’ve got to get the human meat bags up to speed as well.

Brandel Zachernuk: Yeah, I was just watching the other night a documentary series called The Machine That Changed the World. I think it was broadcast as The Dream Machine in the UK, which maybe means it’s related to the book The Dream Machines. It’s about computer history, and my wife sort of confusedly said: don’t you know everything about computer history? Like, was there literally anything to learn in it? Because I have read a lot about it. But one of the things that was really interesting, first, was, I don’t know if you’ve heard of it, the Lyons company in the UK was the first commercial seller of computers, because they recognized that if a tea and cake company would need one, then obviously everybody else would as well. But another really interesting thing was that Lyons, and the UNIVAC sellers, first Eckert and Mauchly and then the folks who bought them, Remington Rand, found everybody really, really frustrated and disappointed with having computers, because it turned out that programming computers was so hard. This was before Fortran and COBOL, so there was no programming language to do it at all. There was not even a sort of assembly-language level of representation that people were able to manipulate. But to me, that was really interesting because, yeah, it sort of underscored that the models of thinking... however expensive the machines were, a quarter million in 1951 dollars at that time,

Brandel Zachernuk: And by far the more expensive thing was the fact that nobody knew how to think with them. I feel like that’s an incredibly relevant component of computing history: that the mental models that were unlocked by different programming environments and runtimes had the ability to increase, by many orders of magnitude, the number of people who could be involved with them. And that, I think, culminated with spreadsheets. I think that’s the place where the most people, at this point, have the most flexible capacity for intervening on computer information. I’m not sure; maybe Word has people being able to manipulate stuff. But for the most part, when people are using phones, they aren’t changing things. They aren’t changing things as they relate to a large corporation in any meaningful sense. They’re sort of providing their own Instagram feed and hinting, but not in a mode that means they’re aware of their own data. So I would say that the high watermark, while it’s not the most recent mark, is spreadsheets, in terms of people being able to see and change stuff. And I think that, yeah, if we can recognize what it is that people actually think, what it is that people need to do with stuff, then we can take that even further. Yeah. Still interested... can you tell me a little bit more about what Vint says?

Frode Hegland: Yeah, he basically says there’s so much information management stuff we can do on normal screens that hasn’t really been done yet. And I completely agree with him on that; obviously, we all agree. But my thinking is, I’m 53 years old, I think, something like that. And, you know, I’ve done tiny little things in my life. But even though I’m no Doug, of course, I share one emotion with him, and that is: he told me he climbed up a mountain, saw the future, and ran back down to try to explain to people what was happening. And he felt that he had two big hands... for some reason it was that his hands were too big. He couldn’t really make people understand. I feel similar to that. And, you know, at a certain point you have to stop doing something that doesn’t work. So that’s why I think the metaverse, or whatever we’re going to call it, has an incredible opportunity for the general public to think: this is a new thing, we can do new things. So, yes, if someone gave me 10 million, I would spend it on making Author better, of course I would, and marketing that. And we can do better spreadsheets, we can do better graphs on a computer.

Frode Hegland: Of course we can. So I completely agree with him, except, as you said earlier, the Apple Watch is a tiny thing, but it’s a new dimension, and it actually helps us, right? And I am convinced that being able to deal with things in a space will be much more powerful. But the more I work on writing our demo and thinking about it... fake 3D like Second Life is really messy. A 3D world is messy. So we’re going to have to spend real time thinking about how to organize your virtual office. You know, that’s why we’re talking about these frames and things like that. So when it comes to Vint, I think that, yeah, once we can open something easily and innocuously for him, I’m just going to have to bite the bullet and send him an Oculus. You know, he’s supported me in the past. Once, I asked for help to finish Author and he sent me $10,000, you know, in a day, because he saw that that was the last thing needed. So for me to send him a, whatever, $100 Oculus, I don’t think is outrageous. But I don’t want the first thing he sees to be, you know, Horizon Workrooms or something like that.

Brandel Zachernuk: Yeah, yeah. One thing, if you’re not familiar with it... this is something that I come back to a lot: the homunculus model. Do you know what it represents, these pictures?

Frode Hegland: Yeah, yeah. Very familiar.

Brandel Zachernuk: And that tells us that what we understand with our brains, what our brains are literally built and structured to prioritize, is what happens from and to our hands. And so it means that while it’s definitely the case that there’s a lot to be done in terms of data manipulation, view specs and those kinds of things, it should still be done with our hands, even if it’s not in 3D. So iPads are very, very interesting. But we have such a woeful dearth of real insight about what intrinsically multitouch and multi-point manipulation might look like for reasonable data manipulation, and I think that is a terrible stumbling block in terms of our appreciation of the ways we can know about things. But I think, as I’ve said before, Vint is one of the archetypes of people who have learned to make those concessions. And so, to that end, it may actually not be much easier for him, because of the way his brain now works after at least 50 years of imagining the way in which one intervenes on things within a digital space. But the fact of the matter is that he’s had 50 years to learn that, and other people don’t. But what everybody does have is a brain that’s structured, in priority, more like the somatosensory and motor homunculi. And so if we think about what hands are and what bodies are in space as we deal with information, with writing, with text, with data, then it will be better for more people.

Frode Hegland: Oh, absolutely, I agree with you. He’s been a programmer; I think you would agree with that. And he’s learned to live with that. My father, who was not very digital, did very, very well in business. His mindset was: the world is thus, I want to succeed. My mindset is: the world is thus, and that’s annoying. You know, obviously he became a success by living within it. So for him to learn new paradigms... like the first time I tried to explain a mouse and a cursor to both him and my mother: huh? You didn’t tell me that thing moved that! You know, and because of that one thing happening once, it really slowed down their interest in learning as well, because it just seemed like a weird paradigm. But on the positive side, the Mac multi-touch trackpad is amazing. Hmm. You know, it’s just a tiny bit, but, oh my gosh, to be able to do this and that... it’s incredible.

Mark Anderson: Yeah, thinking on the homunculus model: when we think about touch and the motor skills in the hand, we tend to think about texture and pressure and physical activities. The annoying thing to my mind is that just at the time when we really could do it, we stopped thinking about things like knowledge management, knowledge spaces; when the web came along, all that stuff was put out to pasture, which is a real shame. So one of the questions that comes to me from you bringing up the model idea is: what aspects of this amazing manipulative control best lend themselves to the malleability of this abstract thing that is information and knowledge? It’s not that we’re going to build up our personal data in the form of a swingball set. So there is that question as to which parts of the skill are most pertinent to this new environment. And my gut feeling is it won’t be the ones that you immediately go for, simply because the things we normally use our hands for, sporting activities, physical contact, the manipulative things we do every day, the coffee cup you pick up without thinking about it, those are the things you do with less immediate thought. And yet we’re ignoring the incredibly powerful things that we don’t have in our 2D space, because we’re in a sense saying it’s kind of good enough at the moment.

Brandel Zachernuk: Mm hmm. It’s. Writing a note, sorry. Yeah. I think I’ve said leaky abstractions before here, but it’s something that I’m on a bit of a tear about, using it as an introduction to this: when we’re presented with the world, with a tangible space, there’s always more information to be had from a more prolonged and closer investigation of a thing. So when you look at your coffee cup, you can see aspects of its function, you can see surface details, defects, hallmarks, the literal imprinting on the bottom that says which mold it was made in. And your manipulation of that coffee cup isn’t dependent on your awareness of, or ability to read, those things, but they all exist nevertheless. So Apple AirPods Pro, I don’t know that they’re a thundering success, but they have the regulatory information for the AirPods printed somewhere on the inside. You don’t need to know about that, and that’s why they did it, because they wanted to sort of skirt the rules that way.

Brandel Zachernuk: But information displays of data, at this point, expect you to be cognizant of, or require you to be cognizant of, all of the details in them right now. And I deeply resent that. It’s an incredibly opinionated view of information designers and presenters to say: you need to know about all of this, and all of this is what you need to know about. No, I need more, thank you. And so having representations that lend themselves to, like you say, glanceability, but also prolonged inspectability, the idea that there’s more and less at the same time, is something that I think is pivotal for recognizing the way that current-generation paradigms sell those things short. It’s not that the interfaces are intrinsically good enough; the interfaces are only good enough for a very terrible series of concessions as to what it is we’re doing with the information that we’re interacting with.

Frode Hegland: Absolutely. I want to get to a point where I tell my headgear in the morning to, quote unquote, read the news. And one of the things I’ll have is a map next to me that will instantly be my neighborhood, with traffic information, maybe weather. It’ll also have, you know, if my local supermarket has a special offer and I actually want to know about that, that might be indicated; if there are any police reports near my son’s school; as many stacks as I want. I should be able to make it messy, as you said. I should be able to visually inspect it at a glance to see if there is anything useful, anything pertaining to me, and then wipe it away: I’m now in work mode, I don’t actually need them. Or later on I do, and there’s something else. Every item should carry as much information as possible. If I’m reading, yeah, we so strongly agree. The texture of paper means something when you read; in some cases that needs to come back. There are so many levels of this, but if you can’t address it, you can’t have it. How we move it around, how we store it, how we interact with it: it’s kind of amazing how all of that has to be done from scratch now. You know, I made this Globe thing. Did I tell you about it? No, I don’t think so.

Frode Hegland: Ok. Hang on. It was called Live Globe, and I got to use a framework by a geo guy who stopped supporting it after a while; I think he got annoyed at me because I asked him to do things. The idea was really simple. We took a map of the world, a geographic map, no country borders. It also had the ocean depth, but I decided that wasn’t as relevant, so I painted all the ocean in a blue layer and lowered the opacity, so you could still see where it’s deeper, but not too much. And as you remember, you would tap on a place to see what it was, a country, or you zoom and you get the city outline. But also, if you tapped on a name, it would flip around and you would have a graph of all the standard statistics, such as surface area, population if it’s a city, or whatever, and then you could compare with another of the same kind. So if you had two oceans, you could do that, and lines would go back so you would learn what the categories are. They would all be horizontal. And if you compared two places, you would get these two shapes, so you could simply say: hey, the education levels are high, but they have pollution, or whatever it might be. There are so many things that can be done with that, spider graphs and all kinds of things, which are completely abstract, of course, but it opens up onto more realistic spaces.
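
The flip-around comparison graph described here could be sketched roughly like this. The categories and statistics are made-up illustrative values, not Live Globe’s actual data or code; each place becomes a polygon on shared spokes, and overlaying two polygons gives the two comparable shapes.

```python
import math

# Hypothetical category list and statistics -- illustrative values only,
# not Live Globe's real data.
CATEGORIES = ["population", "surface_area", "education", "pollution"]

def radar_points(stats, maxima):
    """Place each statistic on a spoke of a spider graph, normalized
    against the largest value seen for that category, and return the
    (x, y) vertices of the resulting polygon."""
    points = []
    n = len(CATEGORIES)
    for i, cat in enumerate(CATEGORIES):
        r = stats[cat] / maxima[cat]      # 0..1 along the spoke
        angle = 2 * math.pi * i / n       # spokes evenly spaced around the circle
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Comparing two places means overlaying their polygons on the same spokes.
place_a = {"population": 0.7, "surface_area": 0.45, "education": 0.9, "pollution": 0.2}
place_b = {"population": 10.7, "surface_area": 2.7, "education": 0.6, "pollution": 0.7}
maxima = {c: max(place_a[c], place_b[c]) for c in CATEGORIES}
shape_a = radar_points(place_a, maxima)
shape_b = radar_points(place_b, maxima)
```

Normalizing against the pairwise maximum is one possible design choice; it keeps both shapes inside the same unit circle so the comparison reads at a glance.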

Brandel Zachernuk: That’s really cool.

Frode Hegland: You know, we should be able to easily... I do strongly believe in the power of a document. As background: as an artist, I remember when I was a teenager and my art teacher got annoyed at me, because there was an exhibition and I had an image on a table, and then I did this, and it was interesting, and she liked it. But she said: at a certain point, you have to stop. Not just in time: there has to be a frame. If it isn’t framed, it isn’t a piece of work. Right? I just thought that was fascinating. And I just lost my train of thought entirely there, but anyway. Yeah.

Brandel Zachernuk: A frame, a pedestal, a plinth: all of those things that dictate the boundary of what is the work and what isn’t the work are interesting issues and questions. One thing, just thinking about what 3D does: I saw somebody recently quote Fred Brooks saying that the desktop metaphor is really more the airplane-seat metaphor, in terms of the ability one would have to manage a series of documents while sitting in economy class next to two people. Just in terms of the amount of space one actually has to manipulate those things. And frankly, it’s barely ameliorated by larger display sizes, because they’re still shoulder width; they’re not such that you have the ability to really move those things around to any substantial degree. And that, I think, was the reason for the Microsoft Surface. But it’s not something that’s been connected to real work, because Microsoft Research is essentially almost a basic research wing rather than an applied one. The full-size Surface table, not the Microsoft Surface tablets, but the thing the size of an air hockey table or a small pool table, is really what they were thinking about and envisioning. But the economics are such that people get too squeamish seriously considering devoting such a large display surface and working area to a single individual. But it would be interesting, if you think about the smallest IKEA desk tables: if that whole surface were a display, what is it that one would actually do on it, as opposed to plopping a computer onto it, which is barely shoulder width?

Frode Hegland: I think it’s interesting, but I think a lot of the space would be wasted just by your hands. So one thing I wondered about is: why can’t software easily detect two iPads that touch each other? So you could have one bigger screen if you’re working with someone else. There’s the Universal Control thing, though that only extends the cursor across your own devices. It would be really nice. I originally thought about a tabletop game that could be played across them.

Brandel Zachernuk: Yeah. Well, hopefully they would. I would hope so as well. I don’t know if iPads have the ultra-wideband chip, but the ultra-wideband chip that’s in the latest-generation phones and AirTags is able to give a really high-fidelity representation of the relative orientation and position of all of the devices with relation to one another. Which means that if you have multiple iPads, you would actually be able to, not just if they were exactly parallel, but even if they were askew a little bit, or actually at different angles in terms of the facing surface, resolve those in pretty short order, such that you could create a synthetic surface. Yeah, I think that point about wasting is a really apt one, because most of the innovations in computing have come when people have had the audacity to be as wasteful as they could imagine of a current resource. And the people who were threatened with expulsion for writing a word-processing application were a case in point: that was secretarial work, not deserving of the austere and reverent regard we should have for what computers are all about.
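
A sketch of that synthetic-surface idea: once a UWB-style system reports each tablet’s pose relative to a shared origin, mapping a local touch point into the shared frame is a plain 2D rigid-body transform. The function and pose format here are invented for illustration; real UWB frameworks report poses in their own structures.

```python
import math

def to_shared_frame(point, device_pose):
    """Map a point from a device's local screen coordinates into a shared
    coordinate frame, given the device's pose as (tx, ty, theta):
    translation in metres plus rotation in radians, the kind of relative
    pose a UWB ranging system could supply.  Standard 2D rigid-body
    transform: rotate, then translate."""
    x, y = point
    tx, ty, theta = device_pose
    sx = x * math.cos(theta) - y * math.sin(theta) + tx
    sy = x * math.sin(theta) + y * math.cos(theta) + ty
    return (sx, sy)

# Tablet B sits 30 cm to the right of tablet A (the shared origin) and is
# rotated a quarter turn; a touch 10 cm along B's x-axis resolves to a
# single point on the synthetic combined surface.
pose_b = (0.30, 0.0, math.pi / 2)
shared = to_shared_frame((0.10, 0.0), pose_b)
```

The same transform composed with a 3D rotation would cover the askew, non-coplanar case mentioned above.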

Mark Anderson: You know, being told at Brown: this is a computer for serious work, not for doing philosophy.

Brandel Zachernuk: Right, exactly. And so, getting into the mood of really wasting pixels: I don’t think we’ve ever wasted pixels to the extent that we should. We’ve wasted cycles. We’ve wasted bytes. We’ve certainly wasted watts. But we haven’t really managed to get quite as profligate with our pixels as we ought to, I think.

Frode Hegland: I think that’s a very good point.

Mark Anderson: And it’s interesting, also because of the point that you’ve brought up before: in a sense, you don’t need massive fidelity everywhere all the time. And thus far, we’ve got our however-many-K screen, and even the thing in the farthest corner, in your peripheral vision, is still being calculated with the same fidelity as the thing right in your focal point. And there’s a lesson in that for us, I think, in how we become more usefully wasteful of our pixels as well.

Brandel Zachernuk: Oh, that’s fascinating. I hadn’t thought about that. So people call that concept foveated rendering. Oculus currently does it; they call it fixed foveated rendering, if you’re familiar with it. If you look into the periphery of most of your applications, it will be at a much lower fidelity, not just because of the lens that you’re observing it through, but because of the foveated rendering, with the expectation, all other things being equal, that you’ll be looking through the center of your vision, that that’s the best place to put the detail. So that means you can get roughly twice the fidelity or more in the center of it. But with eye trackers on a monitor, there’s no reason you wouldn’t be able to do that in two dimensions as well. So if you’re looking at your great big thirty-two-inch screen, or twenty-seven-inch screen, or whatever they are: if you have something that has a devoted resource for actual eye tracking, you could foveate a 2D desktop as well, in order to make sure that you’re not squandering resources on something that’s forty-five degrees away from your central vision. That would be fascinating.
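
A toy version of that fixed-foveation idea: fidelity as a function of angular distance from the gaze point. The thresholds and falloff here are arbitrary illustrative choices, not Oculus’s actual parameters.

```python
def render_scale(angle_deg, gaze_deg, inner=10.0, outer=45.0, floor=0.25):
    """Resolution multiplier for a region of the display: full fidelity
    within `inner` degrees of the gaze direction, then a linear ramp
    down to `floor` at and beyond `outer` degrees.  A fixed-function
    caricature of foveated rendering."""
    d = abs(angle_deg - gaze_deg)
    if d <= inner:
        return 1.0               # foveal region: native resolution
    if d >= outer:
        return floor             # far periphery: a fraction is plenty
    t = (d - inner) / (outer - inner)
    return 1.0 - t * (1.0 - floor)
```

With a live eye tracker, `gaze_deg` updates per frame; the fixed variant simply pins it to the screen or lens center.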

Frode Hegland: By the way, question for you guys: if Apple releases a large display tomorrow, are you guys tempted?

Brandel Zachernuk: I don’t buy things.

Frode Hegland: Yeah. Well, what, you just get it delivered when you need it.

Brandel Zachernuk: I mostly don’t. I mostly just use things that are old until they fall over. New Zealand is the most remote country in the world, so we have a pretty good reputation for just using things, driving them into the ground. And so I tend to do that with most of my devices. I am on my second display, but that’s because I had to upgrade my Mac from the thing that I got in 2014. My company-issued 2019 machine didn’t have the ports; the old display was Thunderbolt 1, and the new machine doesn’t have that anymore. So I tend to make do until the last possible moment. And if there’s a specific thing that means my current set of needs is suddenly unmet, then that will cause me to upgrade. But otherwise, no.

Mark Anderson: But I’d put this to Frode, for obvious reasons, given who’s in the room. I’m wondering: in a sense, the main use I see for big screens at the moment is for stuff like video and visually creative work, where you want immense power and detail and color control. And yet everything we’ve just been talking about isn’t in that area at all. What we really want is probably a very large foveated display. Effectively, if you think of things like spatial hypertext maps, you want an unlimited display that just shows the bit you’re looking at, with the system it lives in having a knowledge of which bit of this limitless space you’re looking at, and knowing what information needs to go there. Which makes it a very different paradigm from where we are with our current TV-style screens.
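
A minimal sketch of that “show only the bit you’re looking at” idea: a viewport query over a limitless spatial-hypertext map. The item structure and field names are my own invention for illustration.

```python
def visible_items(items, center, half_w, half_h):
    """Of a limitless spatial-hypertext map, materialize only the items
    whose positions fall inside the current viewport rectangle."""
    cx, cy = center
    return [it for it in items
            if abs(it["x"] - cx) <= half_w and abs(it["y"] - cy) <= half_h]

# A tiny map: only what is near the viewpoint needs to be rendered at all.
item_map = [
    {"name": "note-a", "x": 0, "y": 0},
    {"name": "note-b", "x": 120, "y": 4},   # far off to the side
    {"name": "note-c", "x": 3, "y": -2},
]
in_view = visible_items(item_map, (0, 0), 10, 10)
```

A real system would use a spatial index rather than a linear scan, but the paradigm is the same: the display is a window, not the space.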

Frode Hegland: It is, but, to state the obvious as well: personally, when I write, I prefer a 13-inch screen, because it means my eyeballs move, not my head. When I edit, of course, it’s a different situation. But also, to state the obvious, there is such a huge difference between a large screen, as I used to work with for my images, and a VR environment, because the ability to just place things everywhere multiplies it by a billion. It’s such a completely different thing. I would not want to spend too much effort on a large screen, whether horizontal or vertical, because even though VR lacks resolution, the increase in where you can put things is so big.

Mark Anderson: Well, the irony of the desktop metaphor of computing is that I no longer really have a screen-space problem; I have a desk-space problem, in front of my screens, for the contingent bits of information, or, you know, even if it’s just a coffee cup. Literally, things that are pertinent to what I’m doing on the screen. That’s something I didn’t think about when I was wondering whether to go for a slightly bigger category of monitor, to get those extra pixels on the screen. Actually, it’s now literally the physical space in front of my monitors that I’m looking to upgrade.

Brandel Zachernuk: Yeah. So to that end, you were talking about peripheral, foveated displays: Hrvoje Benko, and the fact that he’s at Facebook now, makes me much more interested in what could possibly be happening there, because of his work. If you look at his short videos, he’s the person who’s managed to make the most constructive use of projected display systems. He was responsible, in part, for IllumiRoom, I believe. LightSpace is another one like that; I don’t know that I have seen that one, it’s 2010, so there’s a good chance, I have basically watched all of this, but I’m not sure. But he’s done very, very interesting stuff. Yeah, he’s got another one from 2012 here: make every surface a touch screen. That was presented around 2011. So, yeah, those are the kinds of things that I think are very interesting to investigate, and it’s investigations of that flavor that are, I think, the most relevant in virtual reality.

Mark Anderson: And I think they’re also good in that they offer, what’s the phrase people like, a provocation to some of our assumptions. Because in a sense, I sometimes feel, not so much in this forum, but I hear people saying: yeah, well, the main advantage of something like VR is I’m going to have a 100-foot 5K screen. And I’m thinking: no. You could, but that’s not what it really has to offer us. So things that break us away from basically a bigger version of what we have in front of our eyes at the moment, I think, are a useful way of thinking.

Brandel Zachernuk: Yeah. And to at least get to the point where we get to see the practical consequences of what that bigger screen actually is.

Frode Hegland: I just wrote a note there about the thing I said earlier: memory palaces only work because there is a there there. And I think it’ll be interesting for us to look at those things, because that’s maybe one of the things that companies will populate really quickly: their own “there”. But yeah, I clicked on your link, thank you for that.

Brandel Zachernuk: Yeah. Oh, as to some of those: it’s not one specific piece of work, but obviously IllumiRoom was explicitly within the context of entertainment. But Hrvoje Benko, at, sorry, it’s Facebook now, Meta, Reality Labs Research, whatever you want to call it, he’s done the most interesting work about taking pixel grids beyond a small display, either by having a shoulder-mounted projector-camera, which means that you can have a thing here, or by having a very large region that you can then treat as functional as you arrange those things. And so, yeah, I would be very interested in a foveated display such that you have this, and then you keep your monitor, and maybe 13 inches is OK once you have that. But it occurs to me that if you think about the value of those peripheral pixels as being the same as the central ones, then it’s always going to be too expensive to do that. One of the things that I’ve been musing on recently is when people will start installing pixel-grid drywall, board or whatever they call it in the UK, that actually has some level of display capability as a first-class component of it. Because I’m still at the point in my life where I’m stuck in rental hell, which means that I live my life, and all of my peers live their lives, in rental beige. But one benefit of that is that it’s an ideal projection surface, and it would be neat for people to recognize the utility that can come from actually constructing capabilities targeting that. But I haven’t done the work myself.

Mark Anderson: A step towards a personal holodeck. Though I must say, if science fiction has taught me anything, they’re a health and safety nightmare.

Brandel Zachernuk: Oh yes, yes. I feel like there would have been a number of inquests. Oh yeah. I think just wallpaper would be an interesting start. Dynamic wallpaper? Yes. Where is it? In combination with that handheld projector, I bought this chrome ball a few months ago, so that I can project onto it and then have a large reflection up on the ceiling, which takes a relatively limited two-to-one throw from this projector and is able to cast, albeit dimly, this is not a very bright projector, an entire ceiling’s worth of stuff. The interesting thing about that is that even with a reflection surface like this, it’s pretty understandable, pretty knowable, what the resulting image up on the ceiling is. But even if it weren’t, even if you were to just project onto some crinkled-up tinfoil, aluminum foil, you’d have the ability to recalibrate by looking at it, to figure out the consequent reflected image, and then, through whatever convolution matrix, figure out how to create any given image for whatever display capability you have on different regions of the environment as a consequence. And the other thing about that is that those become purely additive: you throw more tinfoil and more projectors into the room, and you have the ability to just have those work in concert with one another, provided you have proper temporal synchronization. And then, yeah, you get to instrument space; you get to create a fairly impromptu CAVE with a minimum of componentry. Then, if you have the ability to do programmatic ones, something I’ve been musing about, I don’t know if it made an appearance in here, but shortly after Valentine’s Day, the local supermarket gave away all of its mylar balloons because they weren’t going to sell them.
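
The recalibration idea, observe what the surface actually reflects and pre-compensate the projector output, can be caricatured per patch like this. A real system would need the full convolution matrix mentioned above, since crinkled foil scatters light between patches; this sketch deliberately ignores that coupling, and the numbers are illustrative.

```python
def compensation_gains(observed, target=1.0, max_gain=4.0):
    """Given the camera-observed brightness of each projected calibration
    patch (1.0 = what an ideal flat surface would return), compute the
    gain to apply to each patch of the projector image so the reflected
    result comes out uniform.  Clamped, because a projector has finite
    output: patches darker than target/max_gain simply can't be fixed."""
    gains = []
    for brightness in observed:
        gain = target / brightness if brightness > 0 else max_gain
        gains.append(min(gain, max_gain))
    return gains

# A clean patch, a half-lost patch, and one scattering almost everything away.
gains = compensation_gains([1.0, 0.5, 0.1])
```

Multiple projectors stay additive under this scheme: each contributes its compensated share, provided they are temporally synchronized as noted above.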

Brandel Zachernuk: And so I made a neutral-buoyancy sort of thing, where you weight it exactly so that it just floats. I don’t know how many people do that, but I’ve done it to every single mylar balloon I’ve ever had, so I’m confused as to why it’s not more of a mainstay, making sure they give you the neutral-buoyancy weighting at the moment you buy it. But anyway, one of the things that would be very interesting with that is that it’s a pretty good reflective surface, but it’s also programmable, in the sense that, with the right mechanism, you can inflate or deflate a diaphragm and change the arching of those surfaces. And so you’d be able to actually change the lens without optics: if you have a parabolic or concave mirror whose internal pressure you’re able to change, you’d be able to direct those things relatively cheaply. So that would be a lot of fun, to then figure out how to have that intervene on an entire space. This is something of a far cry from the future of text, but I think it’s relevant in terms of the ability to manipulate spaces and have them encode things.

Mark Anderson: Is that sort of display mechanism, with real focusing and whatever, being worked on anywhere else?

Brandel Zachernuk: So in terms of having a high-fidelity central vision and a low-fidelity periphery, I’m not aware of any particular work. And Benko’s work has gone somewhat dark as a consequence of moving inside Facebook, which I think is tilted more towards explicitly commercial stuff than Microsoft Research, which was part of a commercial organization but nevertheless understood to be a research wing. So I’m not sure what he’s done. I don’t know anything about anything that Apple does in that regard. Because, for the most part, Apple’s never even really released a projector, let alone anything else like that. So the displays are as large as the displays are. I guess, what was the largest Cinema Display, twenty-seven? No, there was a bigger one.

Frode Hegland: I think so.

Brandel Zachernuk: Right. So, yeah, I think the biggest was 30, and the Apple Pro Display XDR is 32. Anyway, I think that’s the largest display Apple has ever released.

Mark Anderson: I quite like abstracting the notion back into a generated display space. In other words, you’re thinking less about it in terms of the screen. Even if you just say: here are some data, and you’re making a view of it that’s bigger than I can look at in one piece. And going back to a very early mention of how you make things fade into the background: maybe that’s one approach, because as you look at it, you see stuff, and part of that’s deliberate, because you don’t want to give undue presence to the things that are way outside your point of interest. Sometimes you might; there’s no one size fits all. But it just came to mind as a really interesting notion, in terms of the way you approach the display, or the creation of information that may be put into a display, let alone how the display itself tries to display it, if you see what I mean.

Brandel Zachernuk: Yeah, yeah. Well, the reality is that we see so little that, in terms of the fidelity of what that display actually has, if you know that your attention, your intentional attention, so to speak, is constrained and fixated on a specific conceptual or spatial region, then you have the ability to do a lot more there, and the ability to do a lot less with the stuff around it.

Mark Anderson: I was going to say: which allows you to be really profligate in other areas, so there’s a hidden upside in it.

Brandel Zachernuk: Yeah, I was thinking architecturally: if you have a lot of really dumb terminal display devices in a space, and they’re connecting to a comparatively scarce resource of a rendering server, then if you know somebody is literally not looking behind them, you just turn those off; you’re not sending them any fidelity. And so if you have the ability to intermediate between the graphics processing and the display devices, the projectors back there are just projecting four pixels’ worth, in order to get the ambient cast of the red sheen of the thing. Yeah, that’s interesting, because right now the concept of foveation is still much more tightly constrained, to the level of fidelity of the bottom-left corner of your screen, versus the idea that it might be a pixel that is actually completely outside anybody’s direct capacity for sensation at this point. So that’s really interesting. And, back to what I was saying: stepping way, way back from what CSS is right now to what CSS technically and functionally is was an interesting and fairly revelatory moment for me, and foveation, per se, is a concept that one can also step back from and see much more of as a consequence. That’s fun. That’s really fun.

Frode Hegland: Well, on a completely different tack, something that I’m wondering about, which will obviously be great: when you showed that chrome ball. I bought a six-centimeter iron ball, because I saw a statistic that if you blow it up to be the size of the Earth, then each iron atom is that size. So it’s something to use to try to teach with. I look forward to many demo-type rooms in VR where you go in to experience that, and the Solar System, and all of that good stuff. So far, any good Solar System stuff for the Oculus?
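
That statistic is easy to sanity-check. A quick back-of-the-envelope calculation, where the atomic diameter is an approximate figure I’m supplying, not something from the conversation:

```python
# Scale a 6 cm ball up to the Earth's diameter and see how big an iron
# atom becomes at that magnification.
BALL_DIAMETER_M = 0.06
EARTH_DIAMETER_M = 1.2742e7          # mean diameter of the Earth, metres
IRON_ATOM_DIAMETER_M = 2.52e-10      # ~2 x iron's covalent radius of ~126 pm

scale = EARTH_DIAMETER_M / BALL_DIAMETER_M       # roughly 2.1e8
scaled_atom_m = IRON_ATOM_DIAMETER_M * scale     # comes out around 5 cm
```

So each atom scales to roughly five centimetres, which indeed lands close to the size of the original ball.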

Brandel Zachernuk: No, it’s surprising, isn’t it? Because it strikes one as one of the most self-evident things to do with virtual reality: to be able to manipulate scales like that. My daughter really loved a film called Cosmic Zoom, which is very closely related to Powers of Ten, and she used to watch it all the time. It was a phenomenal exploration of what was then known to be the minimum and maximum scale. What’s funny is that we can tack on another order of magnitude or so on either end of it now. But yeah, it’s really interesting as an exploration in two dimensions on a screen, and it would be very apt, and a really fun thing to do, to take that into 3D and just put the rest of it on either side.

Frode Hegland: Yeah. At the beginning of my adventure at Southampton, I wanted to make a sculpture, a one-metre-wide sculpture, 3D printed in clear resin, where every tiny air bubble is a known galaxy, with us in the center. The data sets are available, but the ability to print it that big isn’t; it would be too cloudy, and all kinds of things. I just thought it would be great to have a sculpture of what we know of the universe. Update it when we know more next year, that’s fine, but that’s what we know now. But to do something like that in VR, scrolling, zooming all the way out to Laniakea or something like that, but always having the Earth in the center. Actually, I met Brian Cox once. He was doing a presentation, and he was talking about how humans are no longer at the center of the universe and all of that stuff. And then later on, someone talked about how there is no center of the universe, because the Big Bang expanded everywhere at once. So I asked him: doesn’t that mean that we are actually at the center of the universe? And he conceded: all right, at least from our own perspective. Anyway, those things could be just amazing.

Mark Anderson: Just imagining that, you know, VR space having this scaled-out model. It’s a bit like the Oregon Trail: right, now I want you to walk to Mercury.

Frode Hegland: A friend of mine in Norway, I think you’ve met him, actually; we’ve both been very interested in space and all this stuff. He pointed out the time it takes for a photon of light, at the speed of light, to travel from the sun to, I think it was Neptune. Oh, by the way, Emily and I saw the movie Ad Astra the other day, the one with, what’s his face? I can’t remember. Anyway, it’s a bit ludicrous, there’s some physics stuff that just can’t be done, but it’s got a seventies-movie visual feel to it. It’s a slow-moving, nice story about travelling around the Solar System. If you haven’t seen it already, I highly recommend it for the atmosphere, not the logic.

Brandel Zachernuk: I think I saw it on the plane on our way to New Zealand, just before the pandemic, and yeah, it did look pretty. A lot of languishing in great, glorious spatial environments.

Frode Hegland: Yeah, well, no, it’s good for that. But, right, so, yeah, we’re wrapping up. It’s nice when the conversations go off on all kinds of different tangents. I hope to send the first version of the intentions of the lab quite soon, to start getting some really good advisers on board, so if you guys have any feedback this week, that would be really, really good.

Brandel Zachernuk: Sure. Yeah, no, I’ll read the document. And I think that that process of getting the multicast capability up and running, for a Quest to be able to consume a document that’s being opened by another machine, should be relatively straightforward, a pretty painless kind of augmentation of the current system. And that would be neat, because it will let other participants get into it. I mean, the write-back of it is a little bit more complicated. Not difficult, but complicated insofar as it requires the ability to find the place to write to. But yeah, it should be neat, but just technical.

Frode Hegland: So I was just going to say, I emailed the two of you the Liquid version of the documents. So if you want to comment on it directly, you can do that. Yeah, thank you, Brandel, for thinking about this and doing it, it’s quite amazing.

Brandel Zachernuk: I know, it’s fun. It’s an interesting sort of application in the sense of being applied to something, because, as you say, Mark, there’s a lot of people who sort of say, imagine just having a really big screen. And probably they don’t even imagine just having a really big screen, because then, now what? Having the benefit of not having the ability to imagine at all means that I have to build it instead.

Frode Hegland: Well, I mean, I would use the big screen to watch videos that I film, especially of my son. You know, my closest friend, who was a photographer, died a few years ago. You know, him and I, we couldn’t watch a movie without, oh my god, there’s clipping in the sky, who cares about the scene, right? It’s just, we all, you know, like obsessing over typography, I guess, or whatever it might be.

Brandel Zachernuk: Yeah, I’m very excited. So one of the things that I’ve been trying to do is figure out things that I can get my daughter to do more constructively on a computer. She doesn’t use it unconstructively, because she doesn’t use it at all, for the most part. I got her to play with Blender and put a Santa hat on a monkey head, because monkey heads are primitives there, and so you just stack a cone on a sphere, and to build a candy cane. But I don’t know how much she remembers of all of the steps for it. But getting into Photoshop and drawing is interesting. But I learned to paint using Paintbrush in Windows 3. And at that time, we had, you know, a 320-pixel-wide monitor; it was a black-and-white monochrome thing. And so it was, by definition, sort of low-fidelity pixel art, and as such, a lot easier to get something satisfying. When you have a 4K monitor and a 32-bit-color Wacom Intuos Pro tablet, then all of those possibilities are much less constrained. And as a consequence, at some level, it’s actually a little bit harder to get started. So I got her to, go ahead.

Mark Anderson: Sorry, carry on.

Brandel Zachernuk: And so I got into it, and I decided, OK, maybe we will do some pixel art. So I showed her some Susan Kare, you know, the original icon designer for the Macintosh. And then we zoomed in all the way. What does Photoshop report the zoom level as, like sixteen thousand percent? No, twelve thousand eight hundred percent is what they call the highest zoom level on the screen now. And it occurred to me that the way I think about. I love it.

Frode Hegland: Do you know what that? Oh, that’s awesome. Do you know what that symbol is? Yes. Being Norwegian, I kind of have to. It means place of interest on a map, right?

Brandel Zachernuk: But do you know what it is?

Frode Hegland: Yeah. Castle from above.

Brandel Zachernuk: Yeah, yeah. You know that my office is that shape. Really? Yeah. So, actually, no, my office is now officially in Apple Park, so it’s a circle. But the building at Central and Wolfe, which was designed by Foster and Partners but then dropped because Steve Jobs didn’t like it, it was far enough along in the planning phase that Foster’s sold it to somebody else, and so then they built it. It’s a common symbol. They cut one of the corners, so it’s three-sided. So we call it the fidget spinner, but it’s pretty funny, and the floor plan is actually the command symbol. Well, there are three lobes of the command symbol, and then the fourth is the cafeteria.

Frode Hegland: That’s interesting. And talking of buildings, you know, some of this early-stage stuff we’re working on with VR and so on, to me it feels a lot like we are the pirates of Brandel five. Hmm. You know, and I remember Steven Levy’s book on the Mac started with, basically, you know, at first there was the light. Now, I could imagine, you know, a while from now, when obviously a lot of people are going to be, ah, we’re by no means the first, but I could imagine putting the headset on Vint and other people like that, and they see a completely new light and the opportunities that can actually be in this thing. It’ll be so exciting, so exciting.

Brandel Zachernuk: That’s the hope. But yeah, so with the zoom and the pixel art, one of the things that occurred to me is that I’ve always imagined myself just being slammed right up against an enormous pixel grid, just that it exists out to infinity in all directions, effectively, when you’re at a thing. So when you said using a large display to watch movies, I was like, I don’t think you’d watch a movie like that. But yeah, just the sort of sheer scale of those things, I think, is pretty exciting. One of the first shaders I did was to try to explain Retina, when we still thought that was a relevant concept, to show the actual pixel grid on a Retina phone versus other phones. Having the ability to make that sort of dynamically responsive, or have a sort of non-uniform kind of magnifying look, to be able to kind of show it. But yeah,

Frode Hegland: I remember that on the website you’re talking about.

Brandel Zachernuk: Yeah, we never shipped my thing, because people had apprehensions about the 3D graphics sort of requirements at that stage for doing it. But it’s an interesting sort of representation, to be able to make people aware of and familiar with the sort of different consequences of the display. Yeah, I just love that sort of, it’s very Interstellar, just looking at it as though it’s 200 metres wide.

Frode Hegland: Let’s end on Interstellar, because that stuff sets off a lot of interesting thoughts, hopefully to inspire next time. Yes, I’ve got to go make pasta for the family. Very demanding cooking task indeed. See you all on Friday, and thank you for today, guys.
