16:24:29 A lot of people are holding onto the way things are. And there’s nothing wrong with that.
16:24:36 For some people, writing on paper is still the best. But, as an example,
16:24:45 I kind of think at times that what we call UI
16:24:50 now, or conventional UI patterns, dropdown menus,
16:24:55 things like that,
16:24:57 are currently still critical as a way to get the information that we need.
16:25:03 But it may be that pretty soon they’re almost like the automata from the eighteenth century.
16:25:08 Little mechanical tinkerings, you know, fun gadgets that are still cool to have on a site or an app. But they’re no longer really necessary, because you have this other way to get the information.
16:25:20 I think there’s still a lot missing from the simple input-output that we see in prompt engineering or in ChatGPT, and I’d love to see things that are closer to creating a composition.
16:25:35 You could imagine making a knowledge graph, or even a drawing of ideas,
16:25:42 where things are below other things, or kind of looped inside, and then telling
16:25:47 GPT: turn this into an essay. Not just as a single prompt, but something that has cadence, something that hits a point 3 minutes in.
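The idea above can be sketched in code. This is a minimal, hypothetical illustration (the node names and structure are invented, not from the discussion) of turning a small nested “drawing of ideas” into a structured essay prompt, rather than a single flat instruction:

```python
def compose_prompt(node, depth=0, lines=None):
    """Walk a nested idea graph and emit an indented outline the model can follow."""
    if lines is None:
        lines = []
    indent = "  " * depth
    lines.append(f"{indent}- {node['idea']}")
    for child in node.get("below", []):   # "things below other things"
        compose_prompt(child, depth + 1, lines)
    return lines

ideas = {
    "idea": "Conventional UI may become ornamental, like 18th-century automata",
    "below": [
        {"idea": "Dropdown menus as mechanical gadgets"},
        {"idea": "New ways to get information replace them",
         "below": [{"idea": "Composition, not single prompts"}]},
    ],
}

outline = "\n".join(compose_prompt(ideas))
prompt = ("Turn this outline into an essay with cadence; "
          "hit the second point about three minutes in:\n" + outline)
print(prompt)
```

The point of the sketch is only that the prompt carries the composition’s shape, not that this is how any existing tool works.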
16:25:57 That’s my launch. Feel.
16:26:03 Other than that’s my launch. Feel at the end of the transcript.
16:26:08 That was a pretty good transcription. We need a way to capture things like what you say.
16:26:13 We’d better rush to look at the apps here, but I think that was really wonderful. The terminology is important. And can you do a 5 min on this on Thursday?
16:26:24 Okay. So you’ll be on first, because we do need to question that. In my own little way I faced this, going through the corrections and all that stuff.
16:26:34 One issue was the word metadata, which is just a useless term. You know, my examiner said that some of the data that Visual-Meta uses isn’t what we normally would call metadata.
16:26:49 My reply is, who cares about data? And that was a very useful conversation.
16:26:54 So, you know, similarly, we do need to look at the language we have around AI.
16:27:00 I think the birds analogy, the flying analogy, is very, very apt.
16:27:05 Yeah, so thanks for that.
16:27:11 Yes, please. Go ahead.
16:27:11 Yes, I was just gonna mention that
16:27:16 I’m gonna submit very shortly a one-pager perspective, and the way I’m really thinking about it is that it’s important for us to start thinking about the digital environments that we’re creating, and making them safe, prioritizing safety and collaboration. So I’m
16:27:41 looking at academia through that lens as well.
16:27:46 So, how do we create a safe digital environment for academia,
16:27:53 one that enables AI to be used in really positive ways,
16:27:59 but also creates safety? So that’s what I’m thinking about.
16:28:06 I will send you something on that later today, and if there is space to do a 5 min talk around creating a safe space for academia, that would be great as well.
16:28:19 Yeah, no, that’s cool. So I look forward to having that presentation on Thursday as well. Fabian, you wanted to talk about the link?
16:28:32 Sure, it’s a little bit off topic, but happy to do that.
16:28:37 So I had an argument on Reddit yesterday about the Quest Pro versus another headset. The Quest Pro, beside all the Meta stuff, the bad thing about it, let’s say, is that it’s all standalone.
16:28:55 So it’s very convenient. But if you need, like, a huge graph or huge text, or something like a big 3D model, it’s not good enough, basically. But if you have a desktop, of course, you can use the power of the desktop. The problem
16:29:12 was, at least I thought, that you needed an API from Facebook which would only work with Windows.
16:29:20 But there is a tool called ALVR,
16:29:24 for Air Light VR, which allows you, if you have fast Wi-Fi 5 or Wi-Fi 6, to do the rendering on the desktop and then see the result in the headset. So if you have, for example, big VR games or 3D models
16:29:43 or whatnot, but you still want to use this, because the optics are pretty good, and it feels super convenient.
16:29:48 You don’t need like base stations. You don’t need whatnot.
16:29:52 You can still use this. So it’s a pretty good tool.
16:29:55 But yeah, I was positively shocked that it works on Linux.
16:29:59 So I can still use an open source system, and it also was painless:
16:30:04 it took like 2 min, not compiling libraries and getting into dependency hell.
16:30:12 So now I’m able to still have very good optics and convenience, but with a much more powerful GPU. So that’s what the link was about. I think it’s honestly an impressive compromise, because you have your
16:30:28 desktop performance, and you still have your standalone, so that if you just, like, go for a coffee or take the plane or whatnot, you just take the standalone headset and you can still have a low-resolution version.
16:30:42 So I think it’s a good solution.
16:30:45 Not supported on macOS, unfortunately, but that’s kind of normal.
16:30:49 I think it’s tricky for the GPU.
16:30:51 That’s the problem.
16:30:54 This is really cool. It makes me wonder, though, about the rumor for Apple’s Reality headset (stupid name).
16:31:02 It’ll have a battery pack that attaches to your belt.
16:31:08 I have a feeling that must be an optional accessory, like on the Quest Pro.
16:31:15 It only has about an hour of battery, and then, you know, you can power it through a cable, because, you know, with the M2
16:31:22 and M3 and whatever chips coming out, I’m sure they will probably also allow for offloading of processing to the devices, to the Macs, or something.
16:31:31 So I know I just just with Little.
16:31:33 This this should, but they would.
16:31:38 You remember the mouse with the charging port underneath. So I’m very much in wait-and-see mode.
16:31:45 A colleague also came into my office, I think on Friday, and made a joke, like: Paul,
16:31:51 did you see the Apple headset is out? I was like, what?
16:31:54 And I was assuming it was a meme or rumor, but I still checked, and I still saw a bunch of different rumors. Again, wait-and-see mode. As long as we keep on learning before and after, I’m fine with it. I
16:32:12 can show something also. Briefly, I’ll share my screen, and please let me know when you see it.
16:32:25 Do you see something? Okay. So, I don’t know if you’re familiar.
16:32:35 So that’s my office.
16:32:37 And this you should be familiar with. It looks a bit like a 360 photo.
16:32:45 A very blurry one, but still. But what should be strange to most of you is that, well, first, the colour changes a bit when I turn around, so that’s a hint.
16:32:57 But then I can translate, move a little bit to the right,
16:33:01 a little bit to the left,
16:33:04 and I can zoom in and out, up to a certain point.
16:33:09 So I can move inside the 3D space as if it was a 3D model.
16:33:17 Is this a LiDAR scan?
16:33:19 No, it’s done with my 360 camera. But with a normal camera, with a normal video, it would be just the same. I can show you the…
16:33:33 Is it one spherical image, or have you taken more than one?
16:33:37 So it’s, it’s actually a video.
16:33:40 And if I look behind the scenes, it cuts it into 8 perspectives per frame.
16:33:49 And it roughly inverts, or tries to guess, the path I took with the 360 camera.
16:33:56 So I started here and then walked around, and I kind of looked there, and the program, without any sensor,
16:34:05 so no depth camera, no IMU, just from the images, basically inverts the position and then reconstructs it as a 3D…
16:34:15 not really a 3D model, but a point cloud, in that case.
16:34:20 Which the.
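As a rough sketch (this is not Fabian’s actual tool, just an illustration of the first step such a pipeline performs), cutting an equirectangular 360° frame into several pinhole-camera perspective views can be done by sampling ray directions per output pixel:

```python
import numpy as np

def perspective_view(equirect, yaw_deg, fov_deg=90, size=64):
    """Sample one pinhole-camera view from an equirectangular image (nearest-neighbor)."""
    h, w = equirect.shape[:2]
    f = (size / 2) / np.tan(np.radians(fov_deg) / 2)   # focal length in pixels
    xs, ys = np.meshgrid(np.arange(size) - size / 2,
                         np.arange(size) - size / 2)
    # Ray directions in camera space, rotated by yaw around the up axis.
    lon = np.arctan2(xs, f) + np.radians(yaw_deg)
    lat = np.arctan2(-ys, np.hypot(xs, f))
    # Map longitude/latitude back to equirectangular pixel coordinates.
    u = ((lon / (2 * np.pi) + 0.5) % 1.0 * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return equirect[v, u]

# Eight views, 45 degrees apart, per frame -- as the "behind the scenes"
# output described above suggests.
frame = np.random.randint(0, 255, (256, 512, 3), dtype=np.uint8)
views = [perspective_view(frame, yaw) for yaw in range(0, 360, 45)]
print(len(views), views[0].shape)
```

The later steps (guessing the camera path from the views, then reconstructing a point cloud) are what structure-from-motion and NeRF-style tools handle; they are far too involved to sketch here.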
16:34:24 Okay, yeah, this stuff is just really, really interesting, when you see technologies competing, like LiDAR competing with this, and also with photogrammetry,
16:34:38 and all these different types. It’s fun and useful. The one thing that I’m really surprised with, the whole commercial Quest thing… by the way, can you share
16:34:51 your screen again? I just want to see our video cameras, because I just wanna check the one I’m using.
16:34:58 If you don’t mind.
16:35:05 So can you go back to us, as in come back to… yeah.
16:35:11 See, on my screen my video looks a lot better, and here it looks about the same.
16:35:17 So I guess compression is what happens, right?
16:35:20 Okay, thanks. Just wondering about that.
16:35:23 Oh, and sorry, just also to finish on the NeRF thing, as you might have guessed from my constant whining:
16:35:35 it’s all open source. So the entire toolchain, everything, can be inspected, modified, automated, so that, for example, you go from a video, or an image, or a series of images, or a point of view; you send it to your server or your desktop, whatever
16:35:51 You want to get the 3D model, and then an experiment I did last week.
16:35:55 was doing this, but instead of a room it was a book, and then you can grab the book and manipulate it. Not
16:36:03 high resolution, because, honestly, I wanted, like, quick and dirty.
16:36:06 Actually, I wanted quick and clean, but that wasn’t possible.
16:36:09 So I got quick and dirty, done Wednesday night, to prove the point.
16:36:11 But you can imagine an entire workflow where everything is automated, without going through the cloud, without any provider: you take a normal video of your object, roughly the right way, and a couple of minutes later, in the headset,
16:36:28 you get the 3D model of it. And then you can start to interact.
16:36:31 You can share that model with your friends or colleagues,
16:36:35 all in a matter of minutes.
16:36:38 It’s kind of funny how these buzz phases go through, and everything else becomes really boring.
16:36:45 It’s like fashion. Now VR is dead in the media.
16:36:49 It’s all about AI. But clearly they are going to be developing in parallel.
16:36:54 And clearly it’ll be very, very exciting when they both mature a little bit. And equally, it’s clear the kind of things you’re talking about now is stuff we should probably get into
16:37:08 Volume 4 of the book. You know, we’ve spent a year as a community,
16:37:14 and, you know, this community obviously is not the only community everyone’s part of,
16:37:19 But it’s so important to write these things down.
16:37:21 I really do think that, you know, Apple will be commercially successful; there’s no question about it.
16:37:26 It’ll take them a few years, they’ll mess up like they always do, and then they’ll find their feet, and they’ll own the interactions.
16:37:33 A lot of the stuff you just talked about, it has to be, you know…
16:37:36 I don’t know if you saw on the screen here when I was sharing my screen.
16:37:41 So, Mark Anderson, he’s not here today because he’s writing a bit of a history of hypertext for the Hypertext Conference, and he’s pointing out how much of the history’s missing, and it’s so important to be able to go and read the early
16:37:56 papers, quite simply for inspiration, I think. So we are the olden times for people in the future.
16:38:08 You know, our ignorance and our naivete is gold dust for people in the future who’ve grown up with headsets and grown up with advanced AI;
16:38:18 they’re gonna just be, you know, buying it and using it.
16:38:23 So. Yeah, sorry. At least once a week. I have to say that.
16:38:27 A little remark also about people in the future.
16:38:30 It was interesting; it wasn’t an article,
16:38:33 it was just like a short blog post. I’ll look up the link on Twitter.
16:38:39 But it’s from a researcher in privacy and AI from Princeton,
16:38:42 a professor, I think, who used ChatGPT for his 3-year-old kid, his daughter, I think.
16:38:51 And so, first, it’s an interesting article, it’s an interesting process.
16:38:54 But what to me was especially interesting is that this person’s job is artificial intelligence: understanding how it works, what are the limits,
16:39:06 what are the limits for society, all this. He still did
16:39:12 share his experiment with his own kid. What’s interesting is that if there is one thing that people can be, I don’t know, not hurt, but mindful about, it is getting advice on how to raise other people’s kids. And if you put this on social media, you can imagine the amount
16:39:36 of gentle suggestions from others, like, basically, you should raise your kid like this,
16:39:39 you should raise your kid like that. Yet he still shared it.
16:39:43 And what was the most interesting part for me was that he said he was surprised, because ChatGPT was basically mindful of his kid’s
16:39:56 age. He did say, oh, my kid is a three-year-old,
16:40:00 can you please explain what happens when you turn the light off?
16:40:07 And to me that was the interesting part: that a researcher, an expert in the field, a professor, was surprised that the bot, ChatGPT, the large language model, did exactly what he wanted.
16:40:22 He asked the model to address the person as a kid, and the model did. And I think that’s quite interesting, because it shows that even the expert, when it’s, I guess, an emotional moment, not like writing about something in the abstract
16:40:40 but living the thing, still gets fooled, because in the end he knows it’s a bunch of algorithms, inputs, outputs, a bunch of words.
16:40:49 There is no meaning; it’s just invented, with the reader and himself giving meaning to it. And he was still surprised.
16:40:59 So I think that was fascinating.
16:41:03 Huh, yeah, interesting. Somebody said, you know, don’t ask AI to think,
16:41:10 ask it to do. Which obviously is a bit of an easy thing to say,
16:41:16 but, you know, it’s really fascinating when you say it in the context of that.
16:41:31 Okay. I guess my turn now.
16:41:33 Oh, yes, please. Sorry. Peter.
16:41:36 Yeah, I’m thinking about all of the AI systems that they’re currently pushing.
16:41:46 I really see them more as a search tool than anything else.
16:41:51 What you’re doing is you’re taking a look at tens of thousands of documents, predicting next-word probabilities for them, and building a model by clustering things in a higher-dimensional space.
16:42:02 But it’s so heavily dependent upon what the original training data was.
16:42:07 Now, the Mega Corps love this because they have terabytes of training data and the CPU resources to be able to grind away at it, to build up these models.
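The core mechanic Peter describes can be shown with a toy model (a deliberately tiny sketch, nothing like a real LLM): estimate next-word probabilities purely from whatever training text you happen to have. The “model” knows nothing but co-occurrence in its corpus.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, which words follow it in the training text."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

def next_word_probs(counts, word):
    """Turn the raw follow-counts for one word into probabilities."""
    c = counts[word.lower()]
    total = sum(c.values())
    return {w: n / total for w, n in c.items()} if total else {}

corpus = [
    "the model predicts the next word",
    "the model clusters words in a space",
]
model = train_bigram(corpus)
print(next_word_probs(model, "the"))  # probabilities depend only on the corpus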
16:42:18 But there really is no modeling of the underlying substance of what these models are generating.
16:42:28 I had it spit out non-existent academic citations for me.
16:42:33 It cited Mark Bernstein as having written a paper that he never, ever wrote, on a subject that was a plausible subject for him to have written on.
16:42:43 But it was basically asserting that he drew conclusions that were totally off the wall from his entire body of writing, because the language it was generating was plausible, based upon having seen dozens of ACM papers.
16:42:57 So the quality of the underlying knowledge coming out of it is abysmal, and I dread the thought of students flooding their professors with essays that they’ve used this technology to generate, because the sheer amount of time to go and fact-check them would make grading them
16:43:15 utterly impossible if the only model is, show me your result, and I’ll try to figure out whether it was a good paper or not.
16:43:22 What they are really good for is more creative-writing kinds of applications.
16:43:28 I found that the AI has a real affinity for H.P.
16:43:31 Lovecraft, and spits out absolutely marvelous cosmic horror
16:43:37 text, as long as it doesn’t go too far and start cycling back on itself.
16:43:41 But if you stay within its token window, how much context it can keep in its own memory, it can produce very plausible descriptions of rooms, environments, up to a few paragraphs,
16:43:54 again, before it starts to run off the rails. So as a creative-writing seed tool, it’s really potentially effective.
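The “token window” constraint mentioned above can be sketched in a few lines (window size and text are arbitrary, for illustration only): the model only sees the most recent N tokens, so anything pushed out of the window stops influencing what comes next, which is one reason long passages drift.

```python
def clip_to_window(tokens, window=8):
    """Keep only the most recent `window` tokens as model context."""
    return tokens[-window:]

story = ("the house on the hill held a nameless geometry "
         "whose angles were subtly wrong").split()
context = clip_to_window(story, window=8)
print(context)  # the opening tokens have already fallen out of the window
```

Real models count subword tokens rather than whole words, but the truncation idea is the same.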
16:44:02 But anything that’s dependent upon fact, you run into trouble.
16:44:07 It can handle… yeah.
16:44:07 Hang on, Peter, yeah, ’cause you’re kind of going back to the beginning point.
16:44:13 I think you may want to paraphrase that.
16:44:15 And when you’re talking about facts, the issue with the Mark Bernstein papers: if you ask it to do something academic, it’s gonna produce something that looks academic.
16:44:26 Yes, it will produce fake citations, because that’s what it expects you to want.
16:44:31 So that is why it’s so important, like Alan was talking about earlier, to know what kind of flight you’re getting on, you know.
16:44:41 First of all, to use AI to write a paper,
16:44:44 I think it’s the most absurdly stupid idea in the world,
16:44:48 apart from creative writing, like you just said. You know, then it’s kind of fun.
16:44:52 But it’s different if you’re trying more academic or human-to-human communication rather than something creative.
16:44:58 And this this is not me disagreeing with you.
16:45:03 Yeah, what? Part?
16:45:01 This is just, you know, trying to frame things. We have to find more intelligent ways to use it.
16:45:09 Yeah, so that’s where I think the answer from the faculty perspective would be, don’t tell the students that they can’t use a tool because they’re gonna use the tool and lie to you. I think the way to do it is we want you to use the tool.
16:45:22 but explain how you’ve integrated it into your process, and show how you backstop any assertions that the tool is spitting out to you, to make sure that they’re really grounded in substance, and they’re not just plausible generalizations based upon what the system’s seen. Another
16:45:39 troubling issue is just general overall bias that’s present in material that’s available on the web, versus all material and all potential perspectives on an issue.
16:45:51 It would be very easy to get into pockets of political bias, where the system might not see possible alternatives, because the general consensus and milieu on the Internet is looking in one area of the search space, and there could be a lot of other possibilities.
16:46:11 But if they’re not something that’s been talked about in a large number of sources that the model has been trained on,
16:46:16 the model is going to be unlikely to get to them. So let’s say you put in a really hot temperature and told it to be really creative; but then it’s going to be throwing out so much purely fictional stuff that it’ll be hard to separate the wheat from the chaff. So that’s an issue.
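What “turning up the temperature” does can be shown concretely (a minimal sketch with made-up scores, not any particular model’s logits): dividing the logits by a temperature before the softmax flattens the distribution, so low-probability continuations get sampled far more often.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Softmax over temperature-scaled scores (numerically stabilised)."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 2.0, 0.5]          # scores for three candidate next words
cold = softmax_with_temperature(logits, 0.5)
hot = softmax_with_temperature(logits, 5.0)
print(cold[0], hot[0])  # the favourite dominates when cold, much less when hot
```

At high temperature the tail of unlikely (often fictional) continuations carries real probability mass, which is the wheat-from-chaff problem described above.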
16:46:30 I know some people are trying to train them to bias them toward alternate pockets of bias.
16:46:38 Since the general Internet community is probably somewhat more liberal, possibly, than the general population, there’ll be a certain shift in the perception
16:46:46 of what the AI consensus of reality is, versus
16:46:51 what ground truth would be if we had access to everything, including things that aren’t currently digitized.
16:47:04 I am sure.
16:47:04 So we have a lot to think about there. And of course, from again the paper writing perspective, I really like the Literate programming model where you write the essay about how you’re writing the essay.
16:47:16 In addition to the essay.
16:47:22 so that you can get, basically, the meta-level
16:47:25 of what your research was, in addition to the final output level.
16:47:31 And that would really prevent people from just turning the system loose and submitting work that they hadn’t contributed a significant amount to.
16:47:41 Oh! There are also some very interesting things coming up now: the US
16:47:47 Patent Office refusing to copyright material that was generated by AI, and then some of the people using the tools have come back at the Patent Office, saying, well, we should be able to copyright this, because, yes, this graphic was produced by the AI, but, you
16:48:05 know, I spent 6 days trying different prompts with it, and rejecting vast numbers of generated output images, until I came up with the images that actually matched
16:48:17 what I wanted. So I should be able to copyright it.
16:48:19 And that’s gonna be a real big, raging, deep legal debate.
16:48:21 And I think it’s gonna come down eventually to a question of fact, for a jury to take a look and decide whether you really did enough engagement with the tool,
16:48:30 so the result of you plus the tool should be where you have copyright,
16:48:35 rather than a bright-line thing: if you used that tool, it’s no good.
16:48:38 Well, then, where do you draw the line in the other direction?
16:48:41 Couldn’t someone just as easily argue that you can’t draw well freehand,
16:48:46 but if you use a draw app to produce your nice straight lines and all, you shouldn’t be able to copyright it, because there was an app that figured out where those coordinates should be and drew those lines for you?
16:49:00 I had a discussion with someone this morning on that; well, very parallel, which was: I saw on Facebook that someone said, oh, so exciting,
16:49:10 I was in this place, and I borrowed a friend’s Leica camera, and the pictures look really, really good. And, you know, it’s a little back and forth,
16:49:21 how nice, blah blah blah, what model was it, that kind of talk.
16:49:23 And then one of my friends, a very, very intelligent person,
16:49:28 a close friend, said she doesn’t believe that photography is about the equipment. And that was very interesting, because it relates to what you’re saying. For me,
16:49:36 half a photograph is the subject. The other half is the lighting, the texture, the bokeh, what the lens contributes, all that stuff.
16:49:49 So if you take a picture with an early iPhone, or if you take a picture with a massive medium-format camera, to me artistically those are very different things, because there is a contribution by the equipment. And the reason I said it kind of mirrors a little bit what you’re talking about is
16:50:08 because, you know, what does a photographer do? They click a button.
16:50:12 The people who made the camera, should they own copyright to all the pictures, because they put in much more effort than the photographer, who just pressed a button?
16:50:19 It sounds like a provocative analogy.
16:50:33 A couple of comments on that: wide-ranging, a lot of good stuff, and I largely agree.
16:50:43 It would be interesting to gather a bunch of claims together and sort of sort them
16:50:54 along the lines of: what are things that won’t change over time,
16:51:03 and what are things that will? Right? So, for instance, we’re talking a lot about large language models and the way that they operate.
16:51:12 And that’s exciting. But it’s definitely not…
16:51:15 I mean, it’s just one peg on this journey, and so far the journey’s been ramping up. You know, what’s going to come after transformers?
16:51:23 There’s going to be something; there are obvious limitations of large language models.
16:51:25 It’s not the end-all-be-all. Maybe the next thing will be some idea of a meta-token, you know,
16:51:30 some sort of, you know, zoom-token qualia, right? Which gets close to ontology tracing, which is kind of like what Marvin Minsky was trying to do at the beginning of it all.
16:51:44 There are going to be different approaches, and they are going to do different things.
16:51:49 Well, so it’s useful and good to talk about.
16:51:54 You know, here’s what this is best at.
16:51:57 But it’s at times going too far to say, this is what it can’t do.
16:52:03 We have to make sure: are we talking just about large language models?
16:52:06 Are we talking about ML? And I’m certainly not on the side that says ML can do all things.
16:52:16 I’m actually very suspicious of it, in the way that I’m suspicious of, you know,
16:52:25 government, in a lot of ways. You know, anytime that you automate something, you’re going to pave over something. Anytime
16:52:33 you make an interstate, you’re gonna pave over neighborhoods at some point. That’s mainly me coming off reading The Power Broker.
16:52:41 So that’s where that’s coming from. But there are things that automation will never really be able to do
16:52:48 well, and there’s a lot that large language models are tied to, very similar to automation.
16:52:56 Specifically the point of corpora: sucking in
16:53:01 all of these multiple corpuses of text has the flip side, the double-edged sword,
16:53:10 however the phrase goes, of limiting its knowledge to legacy,
16:53:18 legacy formats, legacy ways of thinking, right? So if we suddenly moved into a virtual-reality world, like, I don’t know, there was some sort of bomb that went off and blew up all the Internet and blew up all paper, and now the only thing that we could do is use
16:53:33 our, you know, hands and gestures in motion inside of virtual space to transmit ideas,
16:53:41 the AI probably wouldn’t be useful for, like, 30 years.
16:53:45 Right? It’s the brute-force work on all of the stuff that’s already been on the Internet
16:53:52 that’s helped it get to this point.
16:53:55 And so, yeah, Peter, I agree with a lot of what you’re saying
16:54:00 there, and I don’t think I disagree with any of it.
16:54:06 But it’s important to maybe decompose some of the forces at play. You know, there are forces around what is copyright protection,
16:54:18 what is genuine and actual. Then there’s the other force
16:54:23 of what can this technology actually do, or its limitations.
16:54:26 What are its limitations this month versus next month? The thing, when I said earlier that I’m optimistic about and excited about, is that I don’t think anybody’s really processed
16:54:41 how GPT is going to affect how we write and how we read, in particular.
16:54:46 And I’m really, really excited about that.
16:54:52 Anyway, that’s just a lot of thoughts. From what I heard, I’m sorry I missed a bunch of it. Yeah, go ahead.
16:54:57 So, as you all know, I’ve told you many times, I’m finishing my thesis, I’m trying to do corrections, and a lot of it is truly bullshit.
16:55:07 A lot of it is: make this sound more academic. A lot of it is:
16:55:11 add some stuff here. And I talked to my advisors about this, and about the whole AI revolution,
16:55:18 though they won’t talk to me while I’m in process, so to speak.
16:55:21 But they are saying that academia will have to change, and I think we’ll have to write more hypertextually and less wordy. One of the hardest things in actual thinking is clarity, right?
16:55:38 I mean, yeah. Yuval Noah Harari, he writes that he meditates twice a day for 2 h.
16:55:44 I’m sorry, once a day for 2 h, and it is to achieve clarity.
16:55:48 I don’t meditate much, so I’m not…
16:55:50 this is not about me meditating, but he uses the term clarity, and when you are trying to learn something you are trying to achieve clarity, inevitably.
16:55:59 But for some reason academia is not about clarity; it is about prestige and sounding cool.
16:56:06 You know, I have a problem with my thesis because I can explain it in 20 s.
16:56:12 Right. There is so much that has to change culturally. When anybody can use an AI to do something, even if we ignore the issue,
16:56:22 which is valid, Peter, of fake citations; even if we have a ChatGPT or similar interface where we say: use no fake citations, I will add them later through a separate mechanism;
16:56:32 you know, write a 30-page piece on so-and-so, based on this and that; even if it’s a really interesting prompt, who the hell benefits?
16:56:41 There is no intellectual benefit for anybody, other than maybe as an inspiration,
16:56:47 as you said, Peter, I agree with that, or as inspiration for some kind of creative-writing task.
16:56:52 So we have to change what it is. I mean, Chris Gutteridge, a colleague at the University of Southampton,
16:56:59 he studied for a PhD many years ago and gave up on it; it wasn’t for him, and he is a ridiculously intelligent person.
16:57:04 It just wasn’t his mode of thought. But one thing he did discover in his research was that most of what a paper is about you can find by reading the last sentence of the abstract.
16:57:18 You know, the abstract is where the author feels they can actually say what it is,
16:57:22 and the last sentence is the key thing. It’s ridiculous.
16:57:25 It basically almost always works, so I’m trying to bring that into Reader as a feature. But what I’m saying here is we need to reinvent writing, because anytime somebody is good at something they will want to go to the next level.
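That last-sentence heuristic is simple enough to sketch in a few lines. This is only a naive toy (the abstract text is invented, and Reader’s actual implementation is not shown here):

```python
import re

def key_claim(abstract):
    """Return the last sentence of an abstract, naively split on . ! ?"""
    sentences = [s.strip()
                 for s in re.split(r"(?<=[.!?])\s+", abstract.strip())
                 if s.strip()]
    return sentences[-1] if sentences else ""

abstract = ("Hypertext systems have a long history. We survey early designs. "
            "We find that link structure, not node content, predicts reuse.")
print(key_claim(abstract))
```

A naive splitter like this stumbles on abbreviations such as “e.g.”, which is exactly the sort of detail a real feature would have to handle.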
16:57:43 If someone is good at being an academic they will be wanting to be more academic, not necessarily more well informed, or a better teacher.
16:57:54 If someone is good at singing, they are going to do many la-la-las, not necessarily a song someone else will like, etc., right?
16:58:02 We all do that in whatever field, to a certain level: we just improve what the thing is, not necessarily how it connects to the rest of the world, right?
16:58:11 So now I’m going to connect that to Fabian.
16:58:20 I’m wondering when you say this.
16:58:22 Yeah, it’s something, I think, I hope at least, that most of us do.
16:58:26 Meaning that we go through the abstracts, we skim through to see if it makes sense; and if somebody recommends us a paper, none of us has unlimited time, so we need to double-check whether, you know, we know too much about it, or we don’t know enough, or it doesn’t
16:58:45 make sense and we need to dig somewhere else first. And indeed there are, not shortcuts, but pretty repeated patterns that help us, and some bad habits, let’s say, like skipping through the experimental protocol, this kind of thing. But I’m wondering, to repeat
16:59:05 a bit what you said about, oh, as soon as Apple releases the headset, then they’re going to own the interaction:
16:59:15 most people, not to say everybody, but most people will just stick to it.
16:59:21 I’m wondering what the equivalent of ChatGPT, or large language models, is for writing.
16:59:26 Isn’t that going to make people, I don’t know,
16:59:30 lazy, in some bad ways? Because, for example, I think a lot of people use ChatGPT to summarize, whereas, indeed, there is the introduction or the executive summary. I mean, as Bob would say, you don’t need to summarize if the structure, the table of contents, everything
16:59:56 is properly done, because that should not be the answer; rather, you need to navigate to the right place for your query or your inquiry.
17:00:05 But I’m wondering how many of us, how many of everybody, is going to use ChatGPT or a large language model as the default way of thinking, meaning what everybody is doing, whereas there already are some structures in place.
17:00:21 So I’m wondering: is GPT going to own some of the interaction with text, making us eventually lazy, and that, I think again, in a badly defined way, because there are good techniques, like the last-sentence-of-the-abstract one.
17:00:40 I’m really, personally very interested in that. Like.
17:00:45 So to go back to where you started about.
17:00:49 You know the I assume that people, you know scan, scim the white paper, whatever you look at the index and everything, and you might be right on that.
17:01:00 But it’s also kind of surprising how many people don’t, I’ve been reading this lately and enjoying it, and it it actually had me thinking a lot about future texts, right?
17:01:12 Because early on just saying, like the value of not just skimming and looking at the table of contents, but also looking back at the index, seeing the words that are used right?
17:01:23 And then maybe sort of randomly picking out a bit in the book and kind of getting the vibe of it, right? Kind of like a beat cop walking around the neighborhood trying to get a sense of, what's this neighborhood about, right?
17:01:33 And I got to the index, and you know, it's just this list
17:01:37 in alphabetical order. And I'm thinking how ridiculously unhelpful that is.
17:01:44 It's helpful in a dictionary, but I wanna see when Andrew Marvell was brought up.
17:01:55 How close is that in relation to Alexander Pope?
17:01:56 That's on the next page, right? So I got to thinking about the graphs that are in Reader and Author, and how, like, why aren't there graphs as just a part of all PDFs, or part of printed books? And graphs doesn't even have to be the right word, but
17:02:16 a clustering of associations. Here’s what you can expect to find in the next 30 pages.
17:02:21 Here you can expect to find overview like.
17:02:25 And then a book has evolved to handle a lot of that.
17:02:32 But a lot of that has gone away in the e-book format.
17:02:32 Right, this is the main point that I’m kind of getting to.
17:02:35 We have 50 years of technology that has actually sort of gone away from our intuition.
17:02:42 And these older crafts that have helped our sense of periphery and our ability to look at things at different scales?
17:02:51 And we're stuck. Like, in a way, a book is a lot like a gun, in that
17:02:57 it can be like a religious object that can't be changed or improved on, right? Like a gun.
17:03:04 The only way you can improve a gun is if it shoots harder, or faster, or more accurately, right?
17:03:09 That's what a gun is, and a book is the same thing. Like, we can't.
17:03:13 We can't. We seem to be unable to
17:03:20 rethink what it could be. That's the troubled analogy.
17:03:24 I'll put that aside for now. But what I hope is that ChatGPT
17:03:29 will kind of break open our current legacy models
17:03:33 of how we think about this.
17:03:35 Yeah. And I’m glad you put that one aside.
17:03:38 That's a bit, yeah, problematic, I think.
17:03:47 No product.
17:03:45 By the way, hi Brandel. We're trying to talk a little bit about what we're gonna talk about on Thursday, which is AI and writing and reading and academia and stuff, and Alan was talking about different ways of reading a document, and things should be in there. This was based on
17:04:03 what I think was right before you came in: a friend of mine, Chris Gutteridge.
17:04:09 He figured out that you can understand most of what an academic paper is by reading the last sentence of the abstract.
17:04:17 That seems to be the sentence where they put what they really mean.
17:04:19 I’m gonna try to build that as a feature.
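As a hedged illustration of what such a feature could look like, the sketch below pulls the last sentence out of an abstract. The regex-based sentence splitting is a naive assumption for illustration, not how any of the apps discussed here actually work.

```javascript
// Naive sketch: extract the last sentence of an abstract.
// Splitting on sentence-ending punctuation followed by whitespace is a
// crude heuristic; a real feature would need proper sentence segmentation
// (abbreviations like "et al." would break this).
function lastSentence(abstract) {
  const sentences = abstract
    .trim()
    .split(/(?<=[.!?])\s+/) // split after ., ! or ? followed by whitespace
    .filter((s) => s.length > 0);
  return sentences.length ? sentences[sentences.length - 1] : "";
}
```

A reading tool could surface this sentence as a one-line preview next to each paper in a list.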
17:04:22 But Alan can expand upon that. So I think what you're talking about there, Alan, is also context awareness.
17:04:29 So yesterday Edgar was not well, and I was sitting down here working early in the morning, and I heard Emily say, "Frode, come up," and when you have a child and you hear that sentence, even in relatively neutral terms, it's the most horrifying effing thing ever. It's really weird, you
17:04:49 know, it's like, is it just nothing? Is it a stomach bug?
17:04:53 Is it a big thing or something? Your mind races like crazy, right?
17:04:56 At any other time, my wife would say, "Come here!" and it would be a very positive thing, and we all know about this. You know, I'm not teaching anything here.
17:05:06 Obviously, I'm just highlighting the fact that when people say something in an academic document, how do you get the real context for that?
17:05:14 Are they praising someone, or are they being spiteful? Like "standing on the shoulders of giants":
17:05:20 When that was used, it was basically a bit of a joke right?
17:05:24 That's not how it comes down later. How can we imbue context better?
17:05:29 And how can we find it out later, better? And Brandel, thanks for coming in at this point, because one of the things we've been talking about is that using AI stuff to write an essay for us doesn't really help anyone.
17:05:42 And there's gonna be a lot of nonsense in it.
17:05:45 So how can we change the language? This is what Alan was talking about in the beginning.
17:05:49 Not just use the simple term AI, but one of the examples he referred to
17:05:53 was flight, the different kinds of flight humans could do. You know, flying was not just how birds fly.
17:06:00 We have many different understandings of what it means to fly you know, we need to expand the vocabulary for what different kinds of ais can do for us, and the really important thing I think we need to do is change writing towards clarity.
17:06:16 You know we all have an intent when we write something we intend to communicate something.
17:06:22 How can these systems help us with that? And how can we help the reader understand?
17:06:30 Understand, critically understand, what that intent was. Over to Fabien.
17:06:37 So it’s going to be much more basic than this.
17:06:41 But it was on the weekend, sometime, at some point.
17:06:50 Let's say we go, we go with our tools, we go through the abstract, introduction.
17:06:54 We see it's actually a relevant article or paper or whatnot. We know
17:06:58 it's going to take a while to read it, and we're on the move, and I end up doing like this:
17:07:04 I'm up and down to scroll and scroll and scroll. For this, on the desktop,
17:07:08 I wrote a little snippet for the browser, so it just scrolled automatically.
17:07:16 I don't have to use the mouse wheel, and it forces me to be focused, because if I'm not in the moment, then at the end of the paragraph, what happened?
17:07:23 I don’t know what’s there, and I’ll briefly share the screen.
17:07:29 But basically, I did a version for the mobile phone that makes you, oh!
17:07:40 And that’s can you see?
17:07:45 Okay. Thank you. And it’s through the browser.
17:07:49 No app to install, little buttons to do it faster, slower, etc.
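A minimal sketch of that kind of auto-scroll snippet might look like the following. The names and structure here are my own assumptions, not the original code; the scroll callback is injected so the core logic stays testable, and in a browser you would pass `(dy) => window.scrollBy(0, dy)` and wire `faster`/`slower` to the buttons.

```javascript
// Hypothetical auto-scroller factory: scrolls by `speed` pixels each tick.
function makeAutoScroller(scrollBy, pixelsPerTick = 1) {
  let speed = pixelsPerTick;
  let timer = null;
  return {
    start(intervalMs = 50) {
      // Begin scrolling on a fixed interval; idempotent if already running.
      if (timer === null) timer = setInterval(() => scrollBy(speed), intervalMs);
    },
    stop() {
      clearInterval(timer);
      timer = null;
    },
    faster() { speed += 1; },                      // "faster" button
    slower() { speed = Math.max(1, speed - 1); },  // "slower" button, floor of 1
    tick() { scrollBy(speed); },                   // one step, exposed for testing
  };
}
```

Usage in a page would be roughly `const s = makeAutoScroller((dy) => window.scrollBy(0, dy)); s.start();`.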
17:07:54 So there is no AI there. I wrote the code the old-school way, with my little hands, and actually not on a keyboard.
17:08:04 That’s the funny part and that’s why I mentioned it.
17:08:09 It’s because I wrote the code on the mobile phone itself.
17:08:11 So I was on my phone, writing the code, so that I can reshape the way I read on mobile, on my bed. You don't need to know the details; it was the weekend, lazy, I guess.
17:08:24 But I was still coding, so that I can read hopefully better, maybe even faster.
17:08:29 And I thought, it's not the most convenient way, but somehow I was even able to share it back on Twitter, and I've done all this with my thumb.
17:08:41 I think maybe I typed with 2 hands. I don’t remember.
17:08:44 But on this tiny slab of glass and plastic.
17:08:50 And I thought, despite all the AI hype at the moment,
17:08:54 that was pretty amazing, to be able to have high-quality content,
17:08:57 the text, some images, all that, but also being able to shape it back with just that.
17:09:03 And I thought, Yeah, that was quite an amazing moment.
17:09:07 Because, evidently, it's a computer,
17:09:12 But it’s like a tiny computer that fits in my pocket, and to be able to reshape it this way.
17:09:16 So that was just quick, small example. But I’m pretty sure I’m going to read more in the subway and elsewhere.
17:09:23 just because I have this little detail that's been done,
17:09:28 not on the move, but without any assistance. So just a tiny example of a tool, an old-school kind of thing, that doesn't necessarily need AI, but might shape or change a bit the way we, or the way at least I, read.
17:09:46 Yeah, definitely. And also there's the kind of blinking, scrolling text system.
17:09:53 Yeah, no, that’s definitely something to look at. Peter.
17:09:58 Yeah, I’m gonna have to leave in a couple of minutes.
17:10:01 But I wanted to do a quick mini demo for you.
17:10:03 Of what I’ve been working on last couple of weeks.
17:10:06 So if I could share my screen real quick.
17:10:09 Yep, please.
17:10:16 Okay, can you see it?
17:10:21 Alright. Basically, what I did was I built a command-line terminal widget for use in the browser or in web-based apps.
17:10:34 And I have that tied to a Cytoscape graph visualization.
17:10:40 So we have a command history here. If we enter something it doesn't know, it'll say it's a non-recognized command, and we can use the up and down arrows to cycle through previously issued commands.
17:10:54 Let’s say we wanna add a node to the graph.
17:10:57 There it is, and let’s add B. Now we have 2 nodes in the graph.
17:11:05 Add C. Maybe we don’t like how they’re laid out.
17:11:09 Shake it up a little bit. And it will redo the layout.
17:11:12 Now let’s link C to B.
17:11:24 Trying to spell. Track me alrighty!
17:11:31 So you get the idea we can type in to link nodes together, we can add new nodes by command.
17:11:48 It'll draw the arrows, and you can hook up the graph and the code to
17:11:52 this sort of generic read-evaluate-print loop that we have over here.
17:12:00 And again we add commands, and then we tie them in to the Api for the graph visualization.
17:12:05 The nice thing about Cytoscape is, it also has built-in functionality to be able to go in and work out a path between given nodes. So we can figure out if there's a series of links that will connect one node to another, and it can also find all of the nodes that are pointing into a given node, and
17:12:24 basically give you a transitive closure on the graph. And I wanna use this in working with grammar development so that I can capture the relationships between different productions inside of a grammar.
17:12:37 And then I can have a grammar production mover, which will have 2 grammars side by side, and the ability to grab a rule from one and drag it into another.
17:12:47 One, and by using this functionality, pull along all of the rules that the rule that I’m dragging over from one gram to another, dependent on so I’ll be able to get the whole dependency of subgrammers from one grammar moved into another one and that should help with some more sophisticated
17:13:04 work with text.
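The pattern described in this demo, a generic read-evaluate-print loop whose commands drive a graph API, can be sketched roughly as below. A plain adjacency map stands in here for the Cytoscape visualization, and the command names (`add`, `link`, `into`) are illustrative assumptions, not the actual widget's vocabulary.

```javascript
// Sketch of a command dispatcher tied to a tiny in-memory graph.
function makeRepl() {
  const nodes = new Set();
  const out = new Map(); // node id -> Set of ids it links to
  const commands = {
    add(id) {
      nodes.add(id);
      if (!out.has(id)) out.set(id, new Set());
      return `added ${id}`;
    },
    link(from, to) {
      if (!nodes.has(from) || !nodes.has(to)) return "unknown node";
      out.get(from).add(to);
      return `${from} -> ${to}`;
    },
    // All nodes that can reach `id`: the transitive closure of incoming links,
    // analogous to following predecessors in the graph library.
    into(id) {
      const found = new Set();
      const visit = (target) => {
        for (const [n, targets] of out) {
          if (targets.has(target) && !found.has(n)) {
            found.add(n);
            visit(n);
          }
        }
      };
      visit(id);
      return [...found].sort().join(",");
    },
  };
  // The REPL proper: parse a line, dispatch, report unrecognized commands.
  return (line) => {
    const [cmd, ...args] = line.trim().split(/\s+/);
    return commands[cmd] ? commands[cmd](...args) : `unrecognized command: ${cmd}`;
  };
}
```

New commands plug in by adding entries to the `commands` object, which is what makes the loop generic.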
17:13:08 When you say grammar, what do you mean by grammar?
17:13:11 In this context.
17:13:11 In this context, I'm talking parsing expression grammars that can be used to recognize
17:13:18 arbitrary language structures.
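The rule-dragging idea described above, pulling along every production a rule transitively depends on, amounts to computing a dependency closure. This sketch assumes a simplified grammar representation (a map from rule name to the names it references), not the actual data structure in the demo.

```javascript
// Collect a rule plus everything it transitively depends on, so the whole
// subgrammar can be moved from one grammar to another in one drag.
function dependencyClosure(grammar, start) {
  const needed = new Set([start]);
  const stack = [start];
  while (stack.length) {
    const rule = stack.pop();
    for (const dep of grammar[rule] || []) {
      if (!needed.has(dep)) {
        needed.add(dep);
        stack.push(dep);
      }
    }
  }
  return needed;
}
```

Dragging `term` out of a grammar would then carry `factor` and `number` along with it, but leave unrelated rules behind.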
17:13:26 It’s nice to see the left hand side of the screen.
17:13:32 Like this?
17:13:36 So that’s how the system operates. Hey? You have some nice.
17:13:47 And this way it’s part of the main visualization.
17:13:49 It’s a little bit more attractive to look at.
17:13:52 I have the core functionality that I want. And again, we have just this is just like a generic framework.
17:13:58 Any code manipulating anything could be using the terminal component over here, and
17:14:07 this sort of layer of tooling will help.
17:14:10 Also, I spent a fair amount of time debating between whether to use Electron
17:14:19 or Neutralino.js. I'm now leaning to Neutralino.
17:14:24 It seems to have a few less bugs, and it has a built-in file watcher I haven't tried out yet, but if it works that will again help in automating tooling.
17:14:35 And best of all, the executables it produces are much smaller, like 2.8 MB files, as opposed to 300-plus MB files for Electron applications
17:15:07 that pull along half of the Internet just to write a small hello world.
17:15:15 Yeah, that’s useful. Very cool. Anyone have any questions or comments before we go over to Randall.
17:15:49 Okay, I'm gonna have to duck out now to provide transport, and I'll see you on Thursday.
17:15:56 And it’s so regular time, right?
17:15:54 Yep, look forward to it. Thanks, Peter. Yes, yes.
17:15:59 Okay, same bat time, same bat channel.
17:16:02 Yes, exactly.
17:16:04 Thanks. Kevin.
17:16:05 Bye, bye!
17:16:06 Bye, bye!
17:16:09 Let’s see what happens to your screen.
17:16:13 And let’s see, stop sharing. There we go, and heading out, bye, guys!
17:16:18 See you later.
17:16:21 Interesting that the shadow, the shadow planes they use in there, just to imply what is sort of a surface,
17:16:30 what's in a surface and what's on a surface, the way that the inspector is kind of presented as though it's an instrument on top of this machine that has a thing inside it.
17:16:42 I don't know if that's an intentional thing on their part, but it's an interesting thing to think about in the context where there are more,
17:16:52 Shall we say?
17:16:56 Intrinsic sort of consequences, more more like deeper consequences.
17:17:02 for things like 3D UIs. Because if you had, like, an actual surface, then it would make it quite clear that something's in a surface.
17:17:13 And that means that this sort of is an attribute or a property, a much more intrinsic attribute
17:17:18 or property, of that object. In the same way that a control panel dipping into a table means that it's something that's sort of fundamental to the table, whereas, like, a screen on top of it would imply a different thing about the relationship between the information and the screen. You
17:17:34 could conceivably have an object that simply is resting on top of it, like the way that my phone is on the table here.
17:17:43 I wanted to talk about the instrument of writing and of books and bookmaking, not of gambling but of constructing books, back to what you were talking about, Alan, with regard to hypertext and links and maps and things like that, and the, oh, is that a tool for
17:18:09 My King!
17:18:09 Oh, lovely! Yeah, that's cool. Something that kind of blew my tiny mind
17:18:16 recently. I was, you know, like, for people who don't, I just collect information like this.
17:18:23 So, just looking back and thinking about the fact that the saxophone was made in the mid-nineteenth century, but also, like, even the valves only came about in about the 1830s, which means, like.
17:18:38 And once you start thinking about how little time that is in cultural years, then you start to think about how poorly integrated the trumpet and the saxophone are into the rest of the aspects of music. You realize how little time we've had to think about what they're for and what
17:18:54 they go with, and how, despite the similarities, the trumpet, the horn, is one of the oldest musical instruments we have.
17:19:05 We've cut things off animals and then blown through a hole in the end for quite a while.
17:19:11 Now, the complexities and the realities of a valve trumpet are such that, despite the similarity to the things that occurred in the past, what we do with them now doesn't have all that much to do with them, because
17:19:27 the way that we understand how to sort of integrate senses is so much more fragile and superficial than I would expect. And in that same way, the sort of connective technologies and the creative technologies of constructing a book have descended
17:19:44 around, if not manuscript-based text, then justified text
17:19:49 and all of these other forms, such that we simply haven't had particularly manageable reproductive methods for information of the form that you're kind of describing. And that's not to say, it's not an excuse,
17:20:04 But it is an invitation to go like we clearly have not just a little bit more, but much more.
17:20:12 now, and realizing how hard it is to grapple with even the margins of a problem-slash-opportunity space like that kind of not only entitles but obliges us to consider what we might be able to do with authentic alternatives, and doing things like reading Bob
17:20:39 Horn's books, and honestly struggling with them sometimes, because it is such an illegible form.
17:20:46 Not by any fault of his, but simply because he's trying something
17:20:49 authentically new. With it, I think it's really interesting and really intriguing to sort of interrogate the form and go, like, well, what do we get out of different forms?
17:21:03 And what are we losing as a consequence of the very real fragility of?
17:21:10 not just ourselves, but everybody's lack of capability to learn how to read these things. Like, the way that we make sense of a form is so much more fragile and so much more contingent than we would initially assume. Like, you sort
17:21:22 of naively, and I don't speak for you guys, assume that if you were to come up with a book, or follow a book, that is even a little bit different, you'd know what to do with it.
17:21:35 But we don't. But we would get different things out of it.
17:21:39 So, yeah, I think that that scale of the history, and what we've done with the form so far, has a massive impact.
17:21:53 And while it doesn't discount the lack of experimentation, it does explain it, and invites us,
17:22:00 and hopefully focuses the way we might direct our investigations, as a consequence of having that much more at our fingertips today.
17:22:11 Yeah, I appreciate you saying that, Brandel. A couple of other thoughts on that, because it does, and also I want to comment on your link about Alice's Adventures in
17:22:24 Wonderland, because it ties into this. But yeah, wholeheartedly agree. We are,
17:22:29 I mean, there's so much around us to get distracted by. We're in a state of kind of like a constant modernity.
17:22:36 And it’s it’s kind of a shift in perspective.
17:22:42 When you realize how recent the keyboard is, or the saxophone is, or things like that, even from reading this book a little bit.
17:22:47 Finally I realized that, like, you know, there's a syntax to books themselves, you know, that includes a table of contents, includes the glossary, the index that we're all familiar with, and the focus has for a very long time been on
17:23:05 the content itself, right? But in the sixteenth century there was far more focus on the table of contents and on, what kind of description should I put around this?
17:23:15 You know, what are page numbers? And thinking through that, I realized that a lot of what we call tools for thinking, tools for thought,
17:23:25 now really should be thought of as tables of content.
17:23:31 They are different kinds of tables of content, or context, and they're kind of experiments in these extra appendages that come with books that we've kind of forgotten to think about.
17:23:44 We've been thinking so much on the narrative and on what's called the body, or the content, that we've left these other things hanging.
17:23:51 And if you, if you put tools for thought in that context that actually gives me some relief because it shows that there’s some deep roots to this struggle of how do we get through book, how do we think about it?
17:24:03 How do we know that we want to invest all this time?
17:24:06 Right. And now, if I think about them as more tables of content rather than tools for thought,
17:24:17 I can add it to this longer, like, you know, list of innovators in the book space.
17:24:25 So I totally agree with you there, and that's what I'm excited to see, even with the shake-up of GPT or large machine learning: that it might make certain things easier, so that suddenly even the standoff properties might not be necessary anymore
17:24:47 from a back-end perspective of capturing metadata, but they may become a really cherished front-end tool, an alternate way of experiencing text. So, to turn it over to your Alice's Adventures in Wonderland interest:
17:25:02 I love what you've done here. Yeah, first, when you start skimming, you do the power skim, and everything goes crazy,
17:25:09 and you're like, well, what's this? But then, after a few seconds, you start skipping like a frog in a pond, and you can just jump through the paragraphs.
17:25:19 I really love it, and it makes me want.
17:25:24 In a way, like, it would be wild if you changed your perspective when you're, you know, scrolling, if you got a kind of topography, right?
17:25:36 So that, like, I met "Pig and Pepper"; if "Pig and Pepper" sort of raised up, not just as text, but sort of above or behind,
17:25:43 you know, the opening text, and they had a kind of mountain range effect, where maybe every time Alice is mentioned there's a kind of, it could be color, or it could be that there's a particular sort of level that Alice is at. And what would that experience be if you're skimming? Now,
17:26:01 you know, that'd be pretty wild. Anyways, I love it. So that's all I have to say about those things.
17:26:10 Okay, plenty of things on that. So just come back from Japan.
17:26:16 It was very nice. I have my iphone 14 pro which I’ve already said is quite insane in terms of quality.
17:26:23 So even little family moments were captured with that, and even though I have an
17:26:28 M1 MacBook Pro with all kinds of RAM and hard drive, whatever, editing is an absolute dog.
17:26:33 It takes so long, parts of it, so that's just really annoying. It's probably the way I've set up the hard drive; I've done something wrong. Doesn't matter.
17:26:43 I also had the opportunity to do 360 video, which I didn't do.
17:26:47 I did a little bit of LiDAR scanning of traditional Japanese houses, which was nice.
17:26:53 So that’s when you talk about mountain ranges and stuff.
17:26:56 To currently author VR stuff like that, building an interactive thing like Brandel has shown us, you know, mixing it, it's really, really a lot of work.
17:27:11 So I think it’s really interesting. You bring it up in terms of our discussion on AI, because I think that is where it’s going to be.
17:27:19 Really really interesting, because if you look at images created, yeah, using these prompts things, it’s neat.
17:27:27 But it's not necessarily what an artist would do, and when it comes to what you're talking about here, there's a very large, all-caps
17:27:37 SO WHAT, right? If you can say to a system, go through this corpus, and what I want to see is Lisa on a mountain,
17:27:47 whatever I wanna do, you know, and it does that. It may not be in the specific artistic style that you first envisioned.
17:27:54 Maybe you can prompt-change that, too. But you have an entirely new way of presenting, where so much of that manual labor of the visual stuff is taken away.
17:28:05 That is absolutely incredible, you know, I’ve been so annoyed for so many years.
17:28:11 People talking about rich media, and oh, it’s so boring with text.
17:28:16 The next thing is video, you know, I’m a professional with this stuff.
17:28:21 And even for me it’s a lot of work, but that’s changing.
17:28:23 It's amazing. So that's a comment on that.
17:28:26 And then there’s another thing in this community. We go all over the place with text, but we have to look at, you know, text for what?
17:28:34 And if we're looking at text in academia, there are several very, very distinct things we have to honestly look at.
17:28:40 Number one is student text. A student text is there to show that certain criteria for a grade have been met. Really boring, really dry.
17:28:52 But that is what it is, and there are legal requirements for that, you know.
17:28:57 When I did my thesis I had to look up.
17:28:59 what the different levels of education in the UK need, and it's really vague and all very conservative. And, by the way, one thing that's never mentioned is citations. According to the law on education, as far as I could judge, you don't need to cite anything; that's a culture
17:29:15 of academia. Of course that's important, but it wasn't there.
17:29:17 So that's interesting. But when we're talking about text, I think in our community what we're really talking about is often text to think. So
17:29:29 it's for yourself. And then, when you decide to share it, it becomes text to get your intent across.
17:29:37 But there is that other stream of academic text, which is just to prove something, which is kind of annoying and boring.
17:29:42 But gotta keep it in mind. That was that.
17:29:50 I have to jump fantastic, seeing you all again.
17:29:54 Yes. Yeah. Thursday.
17:30:02 I've done something similar to the Alice rendering thing recently. So that's just using CSS
17:30:11 to move it around, so it's nothing sophisticated,
17:30:14 but it works, for example, there. But I took, I don't know if anybody's played,
17:30:18 there was a text-based game called A Dark Room, okay?
17:30:23 Using Troika text, I've made that in WebXR, so that you have the ability to sort of click buttons and then get a message log and have messages
17:30:35 be added to it, and fade in and fade out, and things like that.
17:30:37 But it's really exciting, and it makes me realize that the sort of technical backdrop of all of that stuff is absolutely appropriate for some of those more fine-grained experiments with text and scale, position and zoom and distortion, that Alan was referring to in the context of
17:30:55 the Alice in Wonderland part there. So very much looking forward to that.
17:31:00 Yeah, that's a really important point that you were making,
17:31:02 Frode, about the different purposes and functions of text.
17:31:07 I think that one of the things that I've always been troubled by is the sort of neutrality of text entry
17:31:15 and what it is to do things with, and that a lot of people believe that the neutrality of the tool was sacrosanct, to this point, because you could be using it for communicating to other people, or you could be using it for sort of sense-making for oneself,
17:31:45 Hmm, yeah.
17:31:36 despite the fact that one rarely uses, for example, Apple Notes to write to anybody else, and one rarely uses Microsoft Word to write to oneself.
17:31:53 The two functions are relatively distinct, and, you know, they're manifested a little bit by how reachable the text
17:32:02 formatting tools are. The sort of canonical, typical text formatting
17:32:06 tools are present within Notes, but they're actually behind a button click or a keyboard press, whereas they're
17:32:13 ever-present, and in fact pretty difficult to get rid of, in Microsoft Word.
17:32:18 And so there are implications already in the form of the tools that are initially biased towards that
17:32:26 self versus other. But it would be interesting to think a little bit more about taking that further, and where the lines are with people's comfort about the different functions. It'd be interesting to think about,
17:32:38 yeah, Liquid and Author, I think, whether there are ways of emphasizing the self versus others, or that sort of inner versus outer, whether there are dualities between the sort of presentations of the same
17:33:01 pieces of writing, in order to be able to kind of carry that expectation more explicitly about whether things are for one or the other, or if Author is for one versus the other.
17:33:23 One interesting constraint is the text we’re generating.
17:33:29 now, the transcript of this conversation. And if this whole conversation transcript is put into ChatGPT, it'll say it's too long.
17:33:38 So if there was an intermediary tool that could, you know, when we have specific presentations, find the long bits, have those summarized separately, you know, somewhat mechanically, to augment
17:33:58 the AI, that could also maybe be interesting, because,
17:34:04 you know, the text for whom, you know, ours is a very specific text, and you did a magic blanket a year ago,
17:34:13 Brandel. You know, we should be able to do that
17:34:16 more easily!
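The intermediary tool suggested here could be sketched as a simple chunker that splits a transcript into pieces small enough to summarize separately. The character budget below is a crude stand-in for a real token count, and the function name is my own.

```javascript
// Split transcript lines into chunks that each fit under a length budget,
// so each chunk can be summarized on its own before feeding the model.
function chunkTranscript(lines, maxChars) {
  const chunks = [];
  let current = [];
  let size = 0;
  for (const line of lines) {
    // Flush the current chunk when adding this line would exceed the budget.
    if (size + line.length > maxChars && current.length > 0) {
      chunks.push(current.join("\n"));
      current = [];
      size = 0;
    }
    current.push(line);
    size += line.length;
  }
  if (current.length) chunks.push(current.join("\n"));
  return chunks;
}
```

Each chunk's summary could then be concatenated and summarized again, a mechanical pre-pass that works around the length limit.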
17:34:22 So the thing is. Yes, I’m all about the future of text, but for me, text is kind of sacred.
17:34:30 It’s very special, and that doesn’t mean it should only all be text.
17:34:34 It means text should be used in an interesting and intelligent way. So the whole thing I'm talking about, a mountain with somebody's name on top of it:
17:34:45 if you have a system where the text is not a lot of words at all, but the words have more meaning.
17:34:57 Yeah, yeah, definitely.
17:34:57 The future of text need not be more text.
17:35:02 Yeah, back to AI on that. I mean,
17:35:08 my gripe with AI as a tool, and with any tool, in fact, is that the instrumentality of a tool has to kind of prioritize
17:35:18 the mechanism through which we have a fine-grained, low-latency ability to sculpt the sort of characteristics of how that tool is intervening on the task.
17:35:40 And we're so, so far away from that right now, because of the sort of the fact of the AI, where the "I" in the AI is so far removed from the actual sort of surface of what we're doing with it. You know, you type something,
17:35:56 and then you get it written back, and that's just absolutely antithetical to the learnability of any instrument that we might have today, be it a valve trumpet or a keyboard, or any of those things. You know, I was absolutely thrilled by that
17:36:20 tweet that you linked to, Fabien, about the sort of somatosensory integration of the vibratory characteristics of sticks of different lengths:
17:36:32 the reality of how learnable those things are, in turn, and how they can be integrated into,
17:36:39 like, an acute and fundamental sense of vibration, or an impact at a certain distance along the stick,
17:36:45 despite the fact that there's a degree of remove. That comes from the fact that we have the ability to learn these things from whatever signals we happen to get, and it just falls to tool designers to send out those signals in such a way that we can sense them.
17:37:04 But the sort of obsession within computing, led in part by organizations like, I think, the one I work for, is to have clean lines of delineation and separation between the sort of the interface and not-the-interface, so that people only
17:37:26 have this very narrow band to kind of see through and work through.
17:37:29 But, you know, I think that there's a significant problem with that, and that is that these need to be learnable. And I think that's important in all forms of interface,
17:37:42 but it's never been more important than with what we would characterize today as AI,
17:37:48 in that, once you have these really comprehensive systems of intervening on the work, then we need to have ways of being able to intervene on that working that carry the dimensionality and the nuance of being able to make it the thing that I want to make. You kind of said,
17:38:10 Frode, about the AI art prompting systems, and how they're unique,
17:38:16 but they're not what you want necessarily. And that's the crux of it: having what you want, and for that to be sort of not just, like, I want a cute, you know, cat girl, but that there are these multi-dimensional
17:38:38 attributes to the same environment, the sort of the throwbacks to various stylistic kinds of influences and things like that.
17:38:49 Those all need to be carried through. And it doesn't mean that you need to be painstakingly responsible for every brush stroke,
17:38:54 but it does mean that you have to have the ability to kind of imbue it with some level of authorial intent,
17:39:02 unless you're willing to concede that it's not yours, and then, you know, you didn't really want it.
17:39:07 So that's what I would bring to AI discussions, as, you know, what I bring to
17:39:16 every discussion: how do we improve the interface? How do we make sure that the interface is as rich as possible,
17:39:20 in order to make sure that, as the sort of the capacities of these tools become more elaborate, we still remain in control, in authority, of what it is we want them to do for us?
17:39:36 You talked about, you know, basically, whether it's a human on the other end of the interpretation, partly that.
17:39:44 And in one sense it's very easy to say, who cares? But it's an interesting thing. You, however jokingly, referred to a cat girl, and I'm sure you did it in a relatively benign fashion.
17:39:58 But if you look at, not literally, but if you take into account the notion of pornography on the Internet, there's a lot of stuff people can download for free.
17:40:07 But there is also a thriving industry where you will pay someone to take their clothes off,
17:40:13 live, and I think that is really, really relevant. Because what does that say?
17:40:18 It says that the actual human matters a lot, even if there is a screen. You know, why are people spending all that money on that when it's available for free,
17:40:27 is what an economist might ask. So, similarly, here, when you're talking about this, how can we...
17:40:36 I think I’m paraphrasing you.
17:40:39 How can we show that there is a human on the other end?
17:40:41 And to what degree do we want to do that? Because one of the great things about text is, it isn't the same as speech.
17:40:48 So let's say someone has a bad accent, speaks the language badly, or has a stutter, whatever it might be; for them to have less of their personality exposed than it would be face to face is a benefit. You know, so how do you... you used "authorial"; there's also a notion of
17:41:08 authenticity. And what is authentic? Is it dressing up in a nice shirt, or is it wearing a T-shirt?
17:41:14 Same thing with text. Should you try to write impressively, or should you write in your own street slang?
17:41:18 I agree with you. These are really important and not obvious questions.
17:41:23 But can you elaborate more on where you see AI
17:41:27 in that?
17:41:34 So I think that, if we are to accept AI in this sort of ChatGPT form, which is implicitly, or explicitly, I think, presented as though it's an agent,
17:41:53 then it means that, you know, creation with or through ChatGPT has to be understood as
17:42:00 co-creation that's sort of negotiated between agents. And, you know, putting aside arguments about the appropriateness of that, ever versus today, I think those are valid arguments to have.
17:42:17 But if we were to put those aside, then you can say, well, what are the ways in which, you know, agents, the ones that we recognize as agents today, successfully co-create?
17:42:34 And I went to a concert last night. I went to see
17:42:36 They Might Be Giants, and, you know, seeing the way that these multiple players, these musicians, get up on stage and participate with each other: there's a tremendous amount of individual and then group rehearsal in terms of what that performance entails, but even in the moment there's a really rich attentiveness
17:43:00 to all of the factors going on up on the stage and in the audience, of what people should do at different times, and those roles shift, and they shift dynamically.
17:43:13 So somebody who's got a heavier performance burden, like a musical performance burden, on a given track may spend more time, more sort of mental resources, on that and less on the coordinated aspect.
17:43:25 But then that kind of shifts and flips relatively fluidly, based on, you know, a mutual respect and regard, and a genuine interest in being able to respond, to the best
17:43:40 of their ability, to all of the cues together. I would say that's absolutely relevant to the nature and the structure of co-creation with any of the agents as they're presented today.
17:43:53 So, you know, the most flippant thing:
17:43:58 one of the things I did a few years ago,
17:44:00 I think I sort of more threatened to do it
17:44:02 than succeeded in doing it so far, is that, you know, I built a keyboard in WebXR.
17:44:07 I built some other things that look faintly like they could become agents at some point in the future.
17:44:13 They look at your hands, they look at each other, and so, as you play, they might be looking down at your hands, and they might be looking up. One of the things that a ChatGPT of the future could do, rather than being mediated, or prompted, through
17:44:31 typey-typey text, would be to have some kind of visual manifestation of attention, so that it had the ability to look at what you were doing, how you were responding to its creation; to have a sort of aural channel where you were talking to it
17:44:50 and gesturing to it, and having the ability to move your hands, and for it to have the ability and the obligation to parse and process
17:45:00 the grammar of hand motion, such that those things could be used as sort of programmable aspects. And then to be able to have these... because right now, with ChatGPT, you can start it and you can stop it as it's creating these things, as it's
17:45:17 making use of various convolutional things in its fancy brain to produce a stream of things; but for all intents and purposes it appears deterministic.
17:45:30 And that might actually be the truth. But, like, the moment you press the button to send the prompt away, it might be that it's actually all done, and there's an artifice on the other side where they've decided to have
17:45:45 this progressive disclosure of that information. One way or another, it's like something that I did a few years ago.
17:45:53 It's on CodePen. I took the transcripts that The Washington Post printed of Trump in discussion with the Australian Prime Minister, or another leader, and I transposed those into this kind of progressive
17:46:09 disclosure, along with beeps, the way they go in a Japanese role-playing game, where, like, you have one pitch for Trump and one pitch and one color for the other speaker, and then it
17:46:22 goes like that.
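The CodePen demo described here, RPG-style progressive disclosure of a printed transcript with one beep pitch per speaker, boils down to a timing schedule. A minimal sketch; the speaker labels, pitches, and reveal rate are invented placeholders:

```python
def rpg_schedule(transcript, chars_per_sec=30.0, pitches=None):
    """Turn a (speaker, line) transcript into a timed disclosure schedule.

    Lines are revealed one after another at a fixed character rate, and
    each speaker is tagged with a fixed beep pitch, the way dialogue is
    doled out in a Japanese role-playing game. Speaker names, pitches,
    and the reveal rate are all invented placeholders.
    """
    if pitches is None:
        pitches = {"TRUMP": 220.0, "PM": 330.0}  # Hz, one voice per speaker
    schedule, t = [], 0.0
    for speaker, line in transcript:
        schedule.append((t, speaker, line, pitches.get(speaker, 262.0)))
        t += len(line) / chars_per_sec  # next line starts when this one ends
    return schedule
```

A renderer would walk the schedule, typing each line out character by character and sounding the speaker's pitch, which is what produces the temporality effect discussed below.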
17:46:27 And there was a really interesting sort of consequence of seeing dialogue unfold and be delivered in that way, because it really emphasized the temporality of it:
17:46:39 the fact that we sort of know intuitively that whenever something is literally a dialogue, whenever it's a transcript, then there was a moment when the playhead effectively was here, and this was not yet said, and not known. It's an interesting sort of tension: whenever you
17:46:57 read, or watch something with subtitles, and you go ahead of it, it inevitably serves to deflate some of the tension that might be present as a consequence of very punctually timed utterances. But, so, back to
17:47:12 ChatGPT: I think that one of the things that could be done is to expose the realities of what it's doing.
17:47:20 So if that's a friction, then do away with it; if it's a reality, or to the extent that it's a reality, lean into what it is that is as yet ungenerated,
17:47:31 and consider what kind of channels and opportunities we might have to intervene on or manipulate
17:47:36 the sort of outgoing stream as it happens. One of the challenges is that this is antithetical to the notion of unit testing and sort of the singular closure of these things, because it means that there are any number of multi-dimensional channels that could
17:47:57 be changed, and probably would be changed, at any time, based on things like the active inference engine that would have to be done over whether I look like I'm happy with, you know, your outcome, your output.
17:48:09 But, you know, those, I think, are table stakes for turning what
17:48:16 ChatGPT is, which is kind of a starting point,
17:48:19 by virtue of these sort of environmental cues of the progressive disclosure, its agent presentation, into something that really deserves that space in our minds. Hopefully that makes sense.
17:48:35 Yeah, yeah. You triggered something. I've been following a guy named David Shapiro and his work around constitutional AI.
17:48:45 There are other people doing constitutional AI, but essentially, what he's advocating
17:48:52 for is several heuristic imperatives that each input and output would be judged against.
17:49:02 There have been other efforts with constitutional AI;
17:49:07 I think the most famous one is Anthropic. They have one heuristic imperative, which is harmlessness.
17:49:14 What Shapiro is saying is that you really need to have, you know, multiple heuristic imperatives that can at times be at odds with each other, and the ones that he's suggesting and trying to build support for are: one, reduce suffering; two, increase
17:49:38 prosperity; and three, increase understanding. So the increase-understanding one would perhaps be a filter for getting out the fiction,
17:49:50 as, you know, one aspect of how a constitutional AI might work, addressing what you were speaking about.
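As a toy illustration of judging one output against several heuristic imperatives at once: the per-imperative scores would have to come from separate critic models (not shown), and the names, score range, and weighting scheme here are all illustrative assumptions, but a combiner might look like:

```python
def judge(scores, weights=None):
    """Toy combiner for Shapiro-style heuristic imperatives.

    scores maps an imperative name (e.g. "reduce_suffering") to a score
    in [-1, 1] for one candidate output. Returns the weighted total plus
    the imperatives that are in tension (scored negative while another
    is positive), since multiple imperatives can be at odds with each
    other. Purely illustrative names and numbers.
    """
    if weights is None:
        weights = {k: 1.0 for k in scores}
    total = sum(v * weights.get(k, 1.0) for k, v in scores.items())
    any_positive = any(v > 0 for v in scores.values())
    conflicts = [k for k, v in scores.items() if v < 0 and any_positive]
    return total, conflicts
```

Surfacing the `conflicts` list, rather than hiding it inside a single scalar, is the point of having multiple imperatives that can disagree.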
17:50:05 On a slight tangent, what that inspired in me is, you know... I don't know if you saw, Brandel,
17:50:16 but earlier Fabien took a 360 camera round the room and, photogrammetry-type
17:50:21 things, made a virtual model of that room, which is really, really fascinating.
17:50:25 But I've had a thought before about having two or maybe three cameras,
17:50:30 yeah, 360 or iPhone or whatever, doesn't matter, in a row,
17:50:33 so when they film, they're filming more of a real 3D thing that can then instantly be put together.
17:50:39 Even on a 2D screen, I'm thinking, maybe someone who's a trainer could then view that,
17:50:45 and if your posture isn't right, they can tell you exactly what's going on.
17:50:48 A single camera can't see everything. But in a conversation like ours,
17:50:51 now... And I had this very impressive virtual reality experience with Mark Anderson.
17:50:59 I was just so impressed by it. Oh, okay, you were listening, just checking. Anyway,
17:51:03 it was Horizon Workrooms, and, of course, if you look at it against 10 years ago it's technically very impressive, but it's basic.
17:51:09 It's a bit cartoonish with the avatars.
17:51:16 Sorry, my brother just texted; he is in San Diego, so it's good to know that he's okay.
17:51:25 They capture the facial expressions nicely, but then, how is it presented? Still
17:51:29 cartoonish. I'm sure Apple's reality will be blah blah blah amazing.
17:51:34 But it's bizarre sitting in front of this 27-inch screen, or whatever it is, having
17:51:39 you guys in funny little boxes. So that's part of it.
17:51:43 It's easy to capture me, okay? And we could all invent easy systems to capture ourselves.
17:51:48 But how do we then see the result of that? You know, if you put a headset over your head, then you're back into the whole cartoon of generated pixels, which is what the CEO of Nvidia said:
17:51:58 all pixels are going to be generated soon. Anyway, it becomes quite artificial.
17:52:02 So, to have a workspace, whether with text or otherwise, where we can capture more of who we are,
17:52:11 our expressions, our environment, if we want to: how do we then see them?
17:52:15 I mean, I think you both, Brandel and Fabien, have probably experimented with projectors.
17:52:23 It would be really cool to have a projector behind here, behind you guys, where I could see more of you. I know it's a bit of a tangent, but it's just bugging me. Any thoughts on that?
17:52:34 I think capturing reality is really tricky, like the little NeRF experiment I showed before.
17:52:47 It shows one thing, one point of view. And I think we have...
17:52:51 there was an interesting documentary on Google Earth, well, actually a fiction on Google Earth,
17:52:57 and the moral and legal debacle behind it. And I think we have this fantasy of being able to capture reality, and, yeah, I don't think that's possible.
17:53:07 We capture something. I insist on this, because I think as long as we capture something interesting,
17:53:15 that's worthwhile, and it's, of course, related to our task and our culture
17:53:19 and whatnot. But reality? No matter the... all the different cameras and techniques I've tried
17:53:27 show me how far we are from capturing anything, everything.
17:53:33 It's very basic. It's like, no matter the amazing progress...
17:53:37 like, being able to go into a NeRF and get a volume from just a camera is mind-blowing,
17:53:43 and yet it shows it's very superficial.
17:53:47 It shows very little. So I think it remains a whole problem,
17:53:51 capture in general, a worthwhile one, but I don't have an answer.
17:53:57 Yeah, I would say, on a purely concrete, material level, in terms of portraying reality in that sort of greater depth that you're yearning for, I'm really inspired
17:54:13 by the work of folks like Hrvoje Benko; he normally goes by Benko. He used to work at Microsoft Research and now works at Reality Labs, you know.
17:54:26 I'm not currently going anywhere, but he would be the main reason
17:54:29 I would ever move somewhere. People like that are just really important in terms of their impact on thinking about what it is that you do.
17:54:37 Say, Microsoft IllumiRoom: the idea of having, in your video game sort of setup,
17:54:45 you know, your big TV for playing your Xbox game (with Microsoft,
17:54:48 so of course it was Xbox), but also a big wide-throw projector that has a much lower resolution,
17:54:55 targeting more or less the same angular range, but spread out like this,
17:54:59 so that it has nowhere near the same resolution, but it has the effect of letting you characterize the projected, surrounding space as sort of foveated: the center region being very high fidelity and this
17:55:18 other stuff doing what the periphery is there for. And that's the sort of pattern of all of the work that he did.
17:55:24 He also stuck a ring of LEDs around the interior of the eyebox inside a VR headset, so that you have the ability to characterize the periphery with a very low fidelity,
17:55:37 but in such a way that it gave you that sort of comprehensive sense of horizon and lines, so that if you turned your head, then you would get those views. There are challenges with the fact that when you have that lower
17:55:56 fidelity, you end up with some fairly sharp juddering of it.
17:56:01 So it's not perfect, but it's a really interesting, I think, response to the realities of what the limits of the human perceptual system are,
17:56:10 catering to those rather than assuming the worst case of needing the same sort of fidelity of display
17:56:17 everywhere around. To what Fabien was talking about, in terms of the difficulty of capturing and representing reality, similar to the photo-scan kind of stuff that, yeah, you were doing:
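The foveated idea behind IllumiRoom and the peripheral LEDs can be caricatured as a budget curve: dense pixels at the gaze center, sparse context toward the edge. A toy sketch, with all constants invented:

```python
import math


def pixel_density(eccentricity_deg, peak_ppd=60.0, falloff_deg=20.0,
                  floor_ppd=4.0):
    """Toy foveated allocation: a pixels-per-degree budget as a function
    of angular distance from the gaze center.

    The center gets full fidelity and the periphery decays toward a low
    floor, spending resolution where the eye can actually use it rather
    than uniformly. All numbers are invented placeholders.
    """
    falloff = math.exp(-eccentricity_deg / falloff_deg)
    return floor_ppd + (peak_ppd - floor_ppd) * falloff
```

A renderer would sample this curve to decide how finely to draw each region, giving the TV-plus-projector split described above.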
17:56:31 I recently... so, we have a couple of cats that have sort of come in and around in the past, and we're thinking about how to let them out of the house and into a safe space, because we don't want them running around the neighborhood; we don't want
17:56:46 them risking their lives on the road. So we've been designing sort of an escape route through the house and into the backyard, into a sort of cloistered area that we would then have the ability to contain them in.
17:57:01 And so I did... I recently got an iPhone 13 Pro
17:57:04 that has LiDAR and scanning capability,
17:57:07 so I did that scanning and put together the space at full scale, and things like that.
17:57:14 And it was really interesting, because back in Blender I can actually get all of the dimensions and measurements off it directly.
17:57:21 And then if I get, for example, products on IKEA or Amazon, I can either build out the dimensions directly from those specification sheets, or I can download the models
17:57:33 of them directly, and then sort of go, okay, I need this
17:57:37 many things. I don't think I have enough... like, that's, you know, one of the things that we intend to use to keep the cats from leaping out of the corner of the backyard.
17:57:49 And, yeah, like, that was really cool. But at the same time, the map is not the territory there.
17:57:56 It's a very small set of attributes of the dimensionality and the characteristics of that space that are represented.
17:58:06 And it doesn't take much interrogation from my daughter to be left with, like: no, it doesn't do that.
17:58:15 I'm sorry, no, you know, it doesn't have any information like that.
17:58:18 You can't see it that way. It doesn't have that.
17:58:21 No, it's just paper-thin, infinitesimally represented
17:58:28 triangular surfaces, with a bunch of pixels on them that may or may not be correct based on the angle. We can't toggle lights on or off.
17:58:37 We can't see around that corner. Yeah, there's just so much that it isn't.
17:58:39 And, yeah, it's important to understand how intentionally defined all of those things are, for a specific purpose, when we can sort of lull our senses into a false
17:58:53 sense of neutrality and canonicalness
17:58:56 about that information.
17:59:08 So, two quick things. The first is, yeah, it is indeed so superficial, and it's paradoxical that most of us do spend so much time on it, be it photos, be it 3D models, NeRFs, photogrammetry, whatnot. And it's kind of weird that
17:59:26 when we see a recording of it, we want it,
17:59:28 we wanna see it, we wanna be in there. But as soon as we're in it, like, the thing just... the illusion disappears.
17:59:37 It's like, it is indeed so flat, so superficial.
17:59:42 There's nothing there yet, but it's a trace.
17:59:44 So that's why I think we can still come back to it in a couple of weeks, months, years, show the kids, I don't know.
17:59:52 So it's definitely not worthless. That's why I'm kind of counterbalanced: I spend a lot of time on it, and yet I'm so critical about it.
18:00:00 And then the very quick one: I'll share my screen again, and please let me know when you see it.
18:00:12 And it’s this window that I wanna show. Do you see the window with the text scrolling?
18:00:20 Yeah, yeah.
18:00:21 Yeah. And do you see my webcam at the same time?
18:00:26 Oh, yeah.
18:00:26 No oh, right, right right!
18:00:34 Now, the orientation defines the pace of the scrolling.
18:00:40 If you see the number at the top right, that's how tilted it is.
18:00:46 And if I put it flat or vertical, let's say, it doesn't scroll, and if I bring it away it goes back.
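The tilt-to-scroll mapping being demonstrated can be sketched as one pure function: device pitch past a small dead zone sets scroll speed and direction, and flat (or near-flat) means no scrolling. All constants are invented placeholders:

```python
import math


def scroll_velocity(pitch_deg, dead_zone_deg=5.0, max_speed=600.0,
                    max_tilt_deg=45.0):
    """Map device pitch (degrees away from flat) to scroll speed in px/s.

    Within the dead zone nothing scrolls; beyond it, speed grows with
    tilt and saturates at max_tilt_deg. The sign of the pitch picks the
    scroll direction. Numbers are invented placeholders.
    """
    magnitude = abs(pitch_deg)
    if magnitude <= dead_zone_deg:
        return 0.0
    # Normalise tilt beyond the dead zone into [0, 1], then scale.
    t = min((magnitude - dead_zone_deg) / (max_tilt_deg - dead_zone_deg), 1.0)
    return math.copysign(t * max_speed, pitch_deg)
```

On the web this would be fed from a `deviceorientation` event handler each frame; the function itself is platform-neutral.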
18:00:59 That was just from the suggestion that Brandel put in the chat; that was not a feature
18:01:07 at the beginning of this conversation. And I wanna say thank you for this, because that's the kind of discussion that I find especially fruitful, when we have a back-and-forth
18:01:18 and I can get something out of it, and I think, hopefully, it's going to help me think and read.
18:01:24 Yeah, no, that’s so good.
18:01:24 So thanks. Everyone.
18:01:28 Please describe it as closely as possible on Twitter.
18:01:33 I think that, like, getting people to see it... because that's instrumentality, you know.
18:01:39 That's wielding your phone as a reading instrument
18:01:42 in a way that I don't think we have very good reference points for. And, as evidenced, like, not to denigrate your coding skills, but as evidenced by how fast it came together from when I showed
18:01:55 you the thing, it's not that technically complicated. But it's essential in terms of getting at that core of this notion of instrumentality, of what instruments are and what happens when we have the ability to drive them and form with them with the level of sophistication that we
18:02:17 get with just a butter knife, let alone a trumpet. You know, our tools are not even butter knives, and that's a sorry state, you know, and we have to be able to control ChatGPT with at least a freaking butter knife.
18:02:37 I have to show you something, sorry, one sec. Okay, it's really, really brief.
18:02:47 But this is from a couple of years ago... let's say, yeah, 2013. Can you see this tiny image?
18:02:55 It was an app I had called 3DPic. This was before LiDAR or anything else.
18:03:03 So the whole idea was just: on your camera you have a little oval shape, and then it asks you, you click a button, and you have to move, you know, like, to do it.
18:03:13 Let's pretend you're doing a 3D scan of someone, right?
18:03:17 So you move just a little bit. But all it does is create a movie.
18:03:22 So when someone views it, if you roll your iPhone to the side, it'll play the movie in this or that direction,
18:03:33 starting from the middle. So that means that if it's shot properly, it really feels like it's a 3D image.
18:03:42 Right? So, yeah.
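The 3DPic trick, rolling the phone to scrub a short orbit clip outward from its middle frame, amounts to mapping roll angle to a playhead position. A minimal sketch with invented constants:

```python
def playhead(roll_deg, num_frames, max_roll_deg=30.0):
    """Map device roll to a frame index in a short orbit movie.

    Roll 0 shows the middle frame; rolling left or right scrubs toward
    the first or last frame, clamped at max_roll_deg, so the clip reads
    as a pseudo-3D image you can peer around. The clamp angle is an
    invented placeholder.
    """
    # Normalise roll into [-1, 1] and centre on the middle frame.
    t = max(-1.0, min(1.0, roll_deg / max_roll_deg))
    mid = (num_frames - 1) / 2.0
    return round(mid + t * mid)
```

A viewer would call this each frame with the current roll angle and display the resulting frame.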
18:03:41 Yeah, yeah, I mean, those are fabulous.
18:03:46 And QuickTime VR is exactly that:
18:03:49 you know, the ability to reconcile what it is that a video frame is, whether it's sort of supposed to be understood to be temporally stable.
18:03:56 Yeah, yeah. And it's been used so little.
18:04:03 So to say, you're doing this, Fabien, with text, you know.
18:04:06 I would very much like to lazily, you know, read, you know, just tilt it slower or faster to go down the page.
18:04:12 Yeah, that makes a hundred percent sense. So it's nice as an antidote to
18:04:19 all this advanced stuff. As you said earlier, Fabien, there are so many more interesting ways we can look at it, I guess.
18:04:25 Just had a haircut. Let’s have a look.
18:04:26 Come on, show everyone your cut. You look amazing. Look at this young man.
18:04:31 Hello! Hello!
18:04:33 Hey, Greg, you want to tell everyone you were on the other side of the planet last week?
18:04:38 You say it. I just did. This little man was on a flight that was 14 and a half hours, after a flight
18:04:46 that was 2 hours, and a bullet train. Yeah, okay, we gotta go back.
18:04:54 We're finishing the meeting. Can you close the door? Actually... can you close it?
18:04:59 Okay, he's not closing the door. Fine. Anyway, yeah, to capture more stuff like that
18:05:03 and to use it, that's really interesting.
18:05:10 Yeah, you know how currently in Reader the arrow keys go back and forth between pages, and up and down is chapters, right?
18:05:19 Maybe on an iPad... I'm just having fun with what you've been doing here,
18:05:24 because I don't have to code, so I can think without coding.
18:05:27 Imagine reading on an iPad doing this, sure, but then also the way you tilt: page, page,
18:05:33 right? Or section, section, right? On the phone.
18:05:41 This is useful fun.
18:05:43 Yeah, you could even flick between chapters by doing those. I mean, like, there are, or have been, things where you actually do things with two inputs.
18:05:53 But it means that if you do them with both, it's more. So that, in the same way that if you roll a ball on a surface, but then you also push the ball, then those two things, while you could consider them redundant, you could also have it
18:06:14 just move a lot faster. So that, you know, if you are doing a swipe scroll in combination with an angle, then that one changes the inertial properties so that there's no slowdown.
18:06:30 But also, you know... so I think that there's something really...
18:06:32 That's really interesting, isn't it? Like, the idea that, so, like, most of the time when you scroll on a phone today, it has this slowdown here.
18:06:43 But if you have it so that it's responsive to the angle, then it simply doesn't,
18:06:48 because the inertial forces are based on the gravity vector, so that if you have it tilted, then it will continue in perpetuity, or even, like, you play with it.
18:07:01 But I think it's really interesting.
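The tilt-modulated inertia described here, no fling slowdown while the device stays tilted, ordinary friction once it is flat, can be sketched as a tiny physics step. The acceleration and friction constants are invented:

```python
def step_scroll(position, velocity, tilt_deg, dt, friction=3.0,
                tilt_accel=400.0):
    """One physics step for tilt-driven inertial scrolling.

    While the device is tilted, a gravity-like acceleration keeps the
    scroll going (it can continue in perpetuity); when the device is
    held flat, exponential friction decays the velocity like a normal
    fling. All constants are invented placeholders.
    """
    if abs(tilt_deg) > 1e-6:
        velocity += tilt_accel * (tilt_deg / 45.0) * dt  # roll downhill
    else:
        velocity *= max(0.0, 1.0 - friction * dt)        # flat: decay fling
    return position + velocity * dt, velocity
```

Combining this with a swipe (adding the fling velocity before stepping) gives the two-input, ball-rolled-and-pushed behavior mentioned above.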
18:07:01 Hmm! That’s interesting.
18:07:06 One of the things we talked about, Brandel, before you joined...
18:07:09 I know we're over time, but this is too exciting.
18:07:12 ...was learning mentally to use even the tools we have now. Like, in Japan
18:07:18 we started using a watch to do dictation, which was absolutely, amazingly high quality, but it's something it took us a week to even think about.
18:07:26 But the thing we're talking about now, about different kinds of dimensions of reading: even a tablet,
18:07:37 excuse me, Linux people, like an iPad,
18:07:40 does have so many dimensions of possible movement.
18:07:45 You know, what happens if you move it away from you, you know, fast,
18:07:50 or move it towards you? That could actually have interesting effects.
18:07:56 It's nice. It's good also that we have...
18:08:00 okay, hang on, my brain's overloading. Things in the real world
18:08:03 have attributes; in the digital world they have to be given attributes, right? And we can extract them through interactions or metadata
18:08:11 and that kind of stuff. So today, these little devices have an insane amount of sensors: the orientation sensor we talked about, vibration sensors, and all of that.
18:08:24 But they're just not used. So, in terms of the discussion on Thursday:
18:08:28 how can they be used in an interesting way to deal with AI?
18:08:43 So, as a little counter-argument, let's say a lot of them are used with AI. One of the big ones at the moment is large language models, and some hijacking of, let's say, CLIP, and how it's used for image
18:08:57 generation. But there's a sensor there, the camera in the Quest, to know what your environment is.
18:09:06 It powers all the different...
18:09:10 how do you say it... computer vision for tracking where the ground is.
18:09:13 Oh, probably. And what I mean is, how can we, as end users, consciously interact with an AI not have the AI be a translator?
18:09:23 That’s what I meant.
18:09:26 Hmm. I mean, I think what Fabien is saying is still interesting, which is, what I think you're saying is that AI is fed by a phenomenal amount of sensor data in a
18:09:42 phenomenal range of forms, and that all of those things are being used.
18:09:46 But I would also say that, you know, the manner in which that's manifested into our current understanding of AI is through this sort of vast bulk collection process,
18:09:56 this bulk processing of that data, and then being sort of left with a very thinly presented interface from the trained model
18:10:06 As a consequence, and yeah, I think that obviously needs to be very different.
18:10:11 We need to have the ability to to use those sensors.
18:10:15 I think that there's just a basic gap in our capacity to imagine those things as sort of constitutive of interface, and I don't know what to do with that other than to present alternatives, to say, like, here's a way
18:10:36 that you can think about these fingers doing this thing, or this head doing this thing,
18:10:42 given that we know these pieces of information from it:
18:10:44 the position of a Quest headset, or the orientation from your Apple AirPods. Because, like, you can know that. I'm not saying that apps use it; most of them almost
18:10:56 certainly don't, but it's knowable, you know.
18:10:57 It has to be knowable for positional audio, spatial audio, but it could be constitutive of an interface.
18:11:04 I think there’s there’s a conservatism to the closure over an interface.
18:11:09 I bought an Xbox controller, right, and a PlayStation controller
18:11:13 recently, to sort of test out the extent and the limits of the Safari
18:11:16 API for Gamepad. It's really, really pleasing to know how much is already there.
18:11:23 So I’m really excited by it. But but you know, like one of the things that happens in game development is that people wanna hit the lowest common denominator.
18:11:32 So whenever there's any particular one... like, the PlayStation controller has a sort of a swipe pad on the top of it.
18:11:38 But it doesn’t get a lot of use, because a lot of people are sort of apprehensive about being reliant on something that’s available on a subset of those canonical controllers.
18:11:50 And so, for the most part, all of the games rely on things that have the full mapping and overlap between the Xbox and PlayStation controllers.
18:12:00 And that's a challenge. It also masks the fact that people haven't grappled with what the capabilities of these input modalities might actually be, and what they might be best
18:12:08 suited for. And, yeah, like I said earlier, I think that we shouldn't underestimate the fragility of our understanding of something based on the exact form it happens to be in at the moment. And so it requires people sort of willingly and willfully attempting to break that, and really
18:12:29 force the exploration of those things, in order to be able to say, like: actually, maybe the scrolling is awesome. And I think what you've got there,
18:12:37 Fabien, is incredible in that regard.
18:12:43 No, no, no, please!
18:06:41 One thing... oh, go ahead! Oh, I was just gonna say one thing that popped into my head as you were speaking,
18:12:51 Brandel, which was: what you were saying was, we don't really understand how these sensors are working and how to use them.
18:12:59 At least that's what I was taking from it. And I was just imagining an application that would allow you to select one of the sensors, get a visual representation of what it is doing over time,
18:13:12 be able to, you know, set triggers, and then be able to experiment with it:
18:13:17 move your phone around, etcetera, and then observe what it did.
18:13:24 It could be an interesting learning tool for using those sensors.
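The sensor-explorer app being imagined, pick a sensor, watch its values over time, set triggers, experiment, could start from something this small. The threshold semantics and names are invented:

```python
class SensorTrigger:
    """Minimal sketch of the sensor-exploration tool described above:
    feed it raw readings, keep a history for plotting over time, and
    fire a callback when a threshold is crossed. The threshold rule
    and naming are invented placeholders.
    """

    def __init__(self, threshold, on_trigger):
        self.threshold = threshold
        self.on_trigger = on_trigger
        self.history = []  # (sample_index, value) pairs for visualisation

    def feed(self, value):
        """Record one reading and fire the trigger if it crosses the bar."""
        self.history.append((len(self.history), value))
        if value >= self.threshold:
            self.on_trigger(value)
```

In a real app, `feed` would be wired to a platform sensor callback and `history` drawn as the over-time plot.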
18:13:27 Totally. What you're describing is almost exactly my process for discovering a new sensor on a device, and being like, okay, so what do I get out of that?
18:13:36 What do those things mean? And then how can I start building sort of synthetic characterizations of that input data, such that it can be more intrinsically meaningful?
18:13:47 So I wrote a word processor many years ago using the Leap Motion, with finger data, and in Leap Motion you get all of the positions of all of your fingers.
18:13:58 But that's not easily understandable as a thing. So one of the things I used: in geometry and vector algebra there's this idea of a dot product, which is kind of a measure of the angle you get between three points.
18:14:12 So you can tell whether they are collinear, which is, they're all in a straight line, or if they're at 90 degrees, or whether they're bent past it, and things like that.
18:14:21 So what I did was, once I got all of those, I created the compound dot product of the vectors from here to here to here, and then from here to here to here, and then from here to here to here, and then you use those as a single measure of curl, and then
18:14:38 that becomes a dimension which is actually approaching usable.
18:14:42 So then you can tell when a hand is all straight, versus pointing a finger, versus two fingers, versus anything like that, because you have dimensionality of a kind.
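The curl measure described, dot products of successive bone vectors averaged into one scalar per finger, can be sketched directly. The joint positions in the test are made-up illustration data, not Leap Motion output:

```python
import math


def curl(joints):
    """Average straightness of a finger from an ordered list of 3D joint
    positions, via dot products of successive unit bone vectors.

    Returns +1.0 for a perfectly straight finger, about 0 at a right
    angle, and negative values when bent past 90 degrees: one scalar
    "curl" dimension distilled from many raw positions.
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def unit(v):
        n = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
        return (v[0] / n, v[1] / n, v[2] / n)

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    # One unit vector per bone, then compare each bone to the next.
    bones = [unit(sub(b, a)) for a, b in zip(joints, joints[1:])]
    pairs = list(zip(bones, bones[1:]))
    return sum(dot(a, b) for a, b in pairs) / len(pairs)
```

Feeding each finger's joint chain through this gives one number per finger, the "dimension which is actually approaching usable."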
18:14:55 But it requires that sort of a feedback loop of being able to go like.
18:14:59 So what’s this? And how can I understand it? And then how can I drive something on that basis?
18:15:04 So, yeah, I’d like more people to work like me.
18:15:08 I guess is what I’m saying.
18:15:11 A quick bit more quickly. Mark on this, and the sensor.
18:15:16 And one of the typical thing in Xor development is, you do a motion, and then, instead of doing it again again, again, you record that motion.
18:15:25 So the position and rotation of your head and your controllers, or your hands and then, once you have this little moment in time, this capture, then you replace it and do the interaction again, otherwise, like putting it on you, etc.
18:15:38 But I was thinking, and one of the things I did way back when, was to display the camera position over time, like I said, as a curve.
18:15:47 And then the color changes over time, so as it gets newer it's brighter, and darker
18:15:53 if it's older, so you can see the motion over time at a glance.
18:15:57 But what you described made me think we could have this,
18:16:02 still useful, but also, next to each, let's say, sphere that represents the position of the head,
18:16:07 for example, the sensor and the sensor value, and if there are other things, I don't know,
18:16:13 temperature, whatever, then we also present them, either as a curve or whatever else.
18:16:20 And then you'd be able to navigate that space, replay, and then bind them to,
18:16:24 for example, another item, and its rotation, scale, whatever, linked to one of the sensors.
18:16:31 And if you do this in the editor, like, freestyle, you have the motion, then you bind them,
18:16:34 then you see the evolution of the other item over time.
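The recording idea, capturing head pose over time and rendering newer samples brighter along the replayed curve, could be sketched as follows. The field names and the 0.2-to-1.0 brightness ramp are illustrative assumptions, not any particular XR SDK:

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float          # seconds since recording started
    position: tuple   # (x, y, z) of the head
    rotation: tuple   # quaternion (x, y, z, w)

def brightness_for(sample, t_start, t_end):
    """Map a sample's age to brightness: newest renders at 1.0,
    oldest fades toward 0.2, so a replayed path reads as a
    dark-to-bright curve through space."""
    if t_end == t_start:
        return 1.0
    age = (sample.t - t_start) / (t_end - t_start)
    return 0.2 + 0.8 * age

# A tiny hypothetical recording: head moving forward over one second.
recording = [PoseSample(t, (t, 1.6, 0.0), (0, 0, 0, 1))
             for t in (0.0, 0.5, 1.0)]
levels = [brightness_for(s, recording[0].t, recording[-1].t)
          for s in recording]
print(levels)
```

Extra sensor channels (temperature, whatever) could be carried as additional fields on the sample and bound to another item's rotation or scale the same way.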
18:16:39 Yeah, yeah, this is all very good. We need to wrap up, of course, but some kind of a forum for how our devices sense the world, to have an ongoing dialogue with that, would obviously be very useful. On the other side of this, an innovation that I think
18:16:55 is amazing is when the Apple Watch first stopped being black when it wasn't in use, you know.
18:17:02 We're showing something. It's really night and day when you see a TV show where the characters have a black Apple Watch: you know when it was made, but also it seems like the device is dead.
18:17:10 And now, of course, it has an always-on display, and that changes what this device is in really interesting ways, you know.
18:17:22 Obviously it dims and all that stuff, but the messages come up. So I'm just wondering, you're all relatively young, well, maybe not even. But back in the day we had screensavers, flying toasters and space stuff and all kinds of things. Maybe one of the interesting and useful things we could
18:17:39 actually do is build a modern screensaver for the screens we aren't using.
18:17:51 Why not have something useful on them? And if it's a device that's plugged in, it could really give an interesting view.
18:17:59 Imagine when you sit down. Like in the morning, when I wake up,
18:18:02 this tells me the weather and stuff, and that's really nice.
18:18:05 It makes it feel like I'm living in the future.
18:18:08 But imagine sitting down in front of your workstation, and by default there is, not using a lot of energy, a little thing showing what your colleagues are doing,
18:18:18 what the world is doing, but not as linear text.
18:18:22 Some kind of a visualization, so you get a feel of the day. In a sense, you get a temperature, a weather forecast for the day, based on behaviors or something like that.
18:18:35 Absolutely, yeah. Weather for people, mostly.
18:18:37 Could you imagine Tim Cook introducing a new Mac and saying: and the big thing, it has a screensaver?
18:18:44 That’d be fine.
18:18:46 For me it'd be really interesting if I was able to see what the trending conversations were that I'd be particularly interested in each morning, and not have to go to Twitter
18:19:02 And you know all these places to try to figure that out.
18:19:07 Yeah. For me, Bill Buxton has done a really good job of sort of characterizing the difference between a 3-foot interface and a 10-foot interface,
18:19:18 you know, one meter versus three meters away from a computer, and the sort of modalities that might be appropriate for display and interaction.
18:19:27 And one of the things that strikes me at the moment is the total lack of interactivity with the computer that you're three meters away from, where, I think, you know,
18:19:37 obviously you're not positioned or oriented in a way to have fine-grained interaction with it.
18:19:43 But, like, if it knows what you're looking at, and whether you're responding to it, and things like that, there will be ways of intervening on it for it to be able to kind of present the characteristics of what it is that you're interested in there, David. And
18:19:57 I think that would be really valuable, to where, like, if you need that more fine-grained interaction, then you can get up closer to it
18:20:05 in order to have that more detailed sort of interaction. But if there's information of a kind that can be obtained and observed from a distance, then we should be able to do it from a distance. And I think that, you know, the multiple surfaces
18:20:21 that we live our digitally mediated lives on now should start to give us a picture of what kinds of surfaces are appropriate for what kinds of interactions, at what level of granularity. You have a watch being especially good at being like, no, this is really not for that, but it's
18:20:37 good for all of the things that it's good for.
18:20:40 You know, and, like I say, that's not an indictment but a celebration of recognizing where very real limits exist as a consequence of some basic human factors,
18:20:52 among other things. So yeah, like having the ability to be a little bit more open-ended about where those limits are, and having a sort of more gradual tailing off of that capability rather than a hard cut of, like, you don't have the
18:21:07 keyboard? No computing for you, sorry, you know.
18:21:13 Oh, yeah, this is really ridiculous, because of the timing.
18:21:18 But the very last sentence there: of course, there's the camera at me right now, so I should still be able to do stuff without the keyboard, right, which has been discussed.
18:21:28 But just imagine this having a screensaver that is the camera into a VR environment.
18:21:36 So let’s say you have your daily meeting place that Fabian has built, and the screensaver is a camera view of that.
18:21:44 Maybe it doesn’t update every sixtieth of a second, or whatever.
18:21:48 Maybe it’s not so much so. You just walk past.
18:21:51 Or you could see someone there, or you know, what are my colleagues working on?
18:21:56 When I was in London, as I am now, I had a partner for a project in California, and this was many years ago; during work hours we just left Skype on in the background, even when we were not talking to each other, right?
18:22:11 So we just had that thing, like we're in the same rooms.
18:22:12 So there was no hello, there was no hanging up, there was no ringing; it was constant. So maybe a visual background thing like that, and maybe when the machine is out of screensaver mode
18:22:24 it can go into a widget-type thing.
18:22:29 That means that your other work environment is constantly there. These are things that are interesting.
18:22:35 Yeah. Are you familiar with Matt Webb? He
18:22:39 ran an organization called Schulze & Webb, and then they rebranded it to BERG, the British Experimental Rocket Group.
18:22:47 Matt Webb's got a newsletter, and his most recent thing is called "It's time to rethink the phone call."
18:22:54 Really, really fascinating sort of treatise on what a phone call might be in a context where we can kind of transport people to spaces, and have sort of the ability to control what might be presented there, and how somebody might be able to kind of produce an
18:23:11 artifact for asynchronous consumption that still has these performative sorts of attributes and characteristics to it. So really exciting.
18:23:19 I'll forward it to you, and you'd really enjoy some of the outputs, especially on the basis of that last point.
18:23:27 That would be wonderful, I'll see if I can. But just super briefly, on the rethinking of the phone call: Star Trek, you know.
18:23:33 They touch the communicator and say, so-and-so to so-and-so, blah.
18:23:37 That is so implementable, right? Because with Siri we could say: Siri, Brandel, what are you doing now?
18:23:46 And what that does is it sends you an audio message,
18:23:49 and if you choose to reply, I think it can open a FaceTime audio discussion.
18:23:54 Why the heck are these things not made? Why do you have to click a button to dial someone, or expressly say, dial
18:24:00 someone? It's absolutely ludicrous.
18:24:06 Yeah. Very briefly, on the screensaver for your room:
18:24:12 I think it's a brilliant idea, because I've done it.
18:24:14 I mean... well, I don't know if you remember. I...
18:24:18 That is the Frenchest thing that's ever been said.
18:24:22 It's brilliant because I've done it.
18:24:07 Sounds like it, but yeah.
18:24:23 For now I don’t know. If you remember, I had to update my remarkable because I got the keyboard which I don’t recommend.
18:24:31 But that’s for another story. But but before this I don’t know if you remember.
18:24:37 But I had the image here of could plain a cabin of a plane at some point, and that image of the cabin was a screenshot from a perspective from the viewer space.
18:24:49 The idea being that it’s the same perspective I’m sitting when I do this typical direction wasn’t social.
18:24:58 But the recent version of it before I did not maintain it up to date, so I can’t really say how good it would really be if it was really updated let’s say, at the end of each session, for example, or every 10 s, 10 min, whatever but yeah, I mean
18:25:16 it's definitely a good idea. And there is also a French company called Lavit, and what they sell is a 2-meter or 2.5-meter screen. They don't actually sell the screen;
18:25:29 it's just the device and the code, and on it they have micro videos through their software.
18:25:34 And you're supposed to have roughly the same setup on the two sides of the screens.
18:25:40 You have one at home or in an office, and another in another office, like a corporate setup, and then, if you are not doing an actual call, it's all kind of blurry, and then you can knock on the physical screen
18:25:52 as if you could knock on the other side. It's a little bit like a portal, and it does some of those things, yeah.
18:26:00 You have a blurry preview of where you're going to go and interact.
18:26:03 It's not technically difficult, but the interaction, the usage, is.
18:26:09 Actually, it's pretty well done; at points very well done, when the setup is good.
18:26:12 It's pretty pleasant.
18:26:16 Yeah. Another thing: you're talking about it updating, right?
18:26:23 So it's not a screenshot from before.
18:26:25 You're also talking about the idea of: this is the room now-ish.
18:26:29 Right? Well, yeah, now-ish, because it's blurry.
18:26:32 And unless you say it's fine, to stream it live.
18:26:35 I'm looking also at the link that Brandel shared about this 7-foot thing.
18:26:39 Horrific, other than if you’re showing someone you bought a new jacket.
18:26:43 Why would you want to make such an effort to talk to someone?
18:26:45 I mean, really cool as a specific use case and test.
18:26:49 But one of the great things about this is you can hide part of you, you know.
18:26:54 Not everybody needs to see that I'm drinking tea or coffee, you know, or standing up.
18:27:00 Standing up is hard work.
18:27:00 I'm not judging their tool, because it's a different context.
18:27:10 Oh, yeah, apparently that's true. Yes, right? Okay. Thank you guys for today.
18:27:18 And today was nice because we went haywire; we went everywhere, as we should.
18:27:23 I look forward to Thursday, all of us trying to rein the conversation back to VR
18:27:28 and text, back to VR and text, repeatedly. And you know, one thing, please help me, guys: if someone just goes into a big rant about how horrible it is, as long as it's a brief intro, that's okay.
18:27:42 But we know there are going to be issues, you know. We only have 2 hours.
18:27:47 So how we can employ this power is really what we want to focus on, right?
18:27:53 Not that we’re naive technologists. Of course we’re not. But.
18:27:58 Right. Anything else before we wind up today.
18:28:03 It could put us completely over time, and it's very quick and silly.
18:28:07 Quick and silly we have time for.
18:28:09 I’ve made a business card.
18:28:13 So I laser-cut it, and what you might see there is the antenna of the NFC.
18:28:23 Because now everybody has a phone with an NFC reader. It was... I had those NFC tags,
18:28:30 the stickers, for years, actually, 5 years or more,
18:28:33 but nobody had the phone to use it. And now everybody, at least in Belgium, taps to pay.
18:28:40 And now, if I tap it, I get the URL in the browser, and if I click on it I go to my, whatever, world or WebXR experience.
18:28:54 So yeah, it's a business card that I just show, tap, and it's shared.
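For context, a tap-to-open card like this typically carries a single NDEF "well-known URI" record on the tag. A minimal hand-rolled sketch of that byte layout, with a placeholder URL (real tags are normally written with an off-the-shelf phone app, not by hand):

```python
# Build a minimal one-record NDEF message, as an NFC business-card tag
# would carry. Layout follows the NFC Forum NDEF and URI Record Type
# specs; the URL below is a placeholder.

URI_PREFIX_HTTPS = 0x04  # NDEF URI abbreviation code for "https://"

def ndef_uri_message(url: str) -> bytes:
    assert url.startswith("https://")
    # Payload: 1-byte prefix code, then the rest of the URL as UTF-8.
    payload = bytes([URI_PREFIX_HTTPS]) + url[len("https://"):].encode()
    # 0xD1 = MB | ME | SR | TNF=0x01 (well-known); short record,
    # 1-byte type length, type field "U" (URI).
    return bytes([0xD1, 0x01, len(payload), ord("U")]) + payload

msg = ndef_uri_message("https://example.com/myworld")
print(msg.hex())
```

The phone's reader sees the "U" record, re-expands the prefix code, and offers to open the URL in the browser, which is what makes the tap-to-visit card work.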
18:28:59 That’s fabulous.
18:28:58 That’s really cool. Yeah, that’s really cool. By the way, hardly anyone in Japan uses anything but cash, which is bizarre, super bizarre.
18:29:08 So Brandel, since you're at Apple: do you remember, a long time ago, someone had an app where if you tapped the phones together it would exchange contact information?
18:29:20 It was just so clever, because obviously they're just sharing that
18:29:23 they were both bumped at roughly the same time. Why don't you do that for the Apple Watch, please?
18:29:28 For the apple watch.
18:29:30 Yeah. So if you tap another Apple Watch, it exchanges contact details.
18:29:38 That's... yeah, that's interesting. I guess I'm not familiar with how people share contact details.
18:29:47 You could probably do with the phone, too, right? Like, I mean, it sounds like they did do it with the phone.
18:29:51 Yes. By the way, don't worry, I'm not stripping, but I have a cardiac monitor here
18:29:58 that I'm gonna have for 7 days, and it's because of my stroke and whatnot.
18:30:03 But if something specific happens, that's stressful, or I'm exercising, I can actually add a marker to this for the people reading the ECG
18:30:12 later. Do you know how I do that? I double-tap it!
18:30:17 Right. How wild is that? It's come out from our world of, kind of, computer double-taps.
18:30:25 Now it's a normal thing: double-tap your health device.
18:30:30 Yeah. So yeah, Brandel, just a vCard would do.
18:30:38 And obviously it knows when it's on a transport thing here in the UK;
18:30:42 I'm sure it's the same with you. When I go on the Tube
18:30:45 I just do this. So imagine if you did a bump; it would be so cool, and then Fabien would have to start getting into the locked-down Apple environment as well.
18:30:56 The 3 of us are cool. You are not cool.
18:30:55 I'm waiting for that. I look forward to it.
18:31:01 Alright, thanks again. It's so appreciated to see you guys.
18:31:08 Okay, see you Thursday. Bye!