Chat: 11 Feb 2022

Video and transcript:

16:03:43 From Peter Wasilko : Hi All!
16:05:08 From Mark Anderson : Hi Peter!
16:06:20 From Peter Wasilko : @Mark, starting to excavate old CD-ROMs, so far a bunch of SIGWEB discs and a collection from the HCIL at U of Maryland. I’m sure I have a VIKI somewhere, but it must be in my deep archives.
16:07:29 From Peter Wasilko : @Fabien, do you have a URL for the reader?
16:09:20 From Fabien Benetou :
16:09:21 From Peter Wasilko : OH do you have a link for that one too?
16:09:50 From Mark Anderson : @Peter. Great. No hurry. I just know Claus Atzenbeck’s group would love to see ViKI and/or VKB running again. Not sure about ViKI, but IIRC VKB was in Java 1.0. [for others: this relates to recovering early Spatial Hypertext systems]
16:10:25 From Brendan Langen : oh this is awesome
16:12:49 From Fabien Benetou :
16:14:01 From Peter Wasilko : LaTeX uses inline markup of the form \footnote{yada yada yada} and typesets it using TeX.
16:14:04 From Brendan Langen : re: VIKI @mark + @peter, would also love to see that running. think we can get cathy marshall in for a monthly chat?
16:14:20 From Mark Anderson : @Fabien: more on Visual Meta
16:15:07 From Mark Anderson : For later , before I forget, Robert Scoble’s blog post on ‘The Augmented Home’:
16:15:26 From Frode Hegland : Scoble is a funny guy but that is a good piece
16:15:27 From Peter Wasilko : @Fabien Google TeXLive for a free distribution for your platform of choice.
16:16:46 From Fabien Benetou : for the e-ink hardware side, that is the open one; the other is reMarkable, which has a closed UI but is still Linux based.
16:16:51 From Brendan Langen : what activities do you like using/imagine using the ZUI for?
16:17:14 From Peter Wasilko : I am building a suite of display widgets in imba for use in future projects. It is rather a lot of up-front tooling, but it will make building future demos vastly faster.
16:22:21 From Brandel Zachernuk :
16:27:18 From Brandel Zachernuk : Bob discussed this in a great presentation here:
16:31:50 From Peter Wasilko : The HCIL at U Maryland did a lot of ZUI work in the ’90s, but their systems were Java-based and the students who wrote that code moved on, leaving those projects too brittle to maintain in the modern era.
16:32:42 From Peter Wasilko : Ben Bederson was the PI on those projects.
16:34:27 From Brendan Langen : some interactions + concepts that jump to mind for this info mural in digital/VR form:

  • trails through different themes (tangle of problems, timeline, etc.)
  • OCR search on the page
  • intro walkthrough from the creator that highlights how to read
  • add to this and allow people to annotate alongside (crowdsourcing problem solving)
16:34:35 From Peter Wasilko : I saw an intriguing blurb somewhere on using high-powered lasers to somehow transmute radioactive waste into a stable form on a very short timescale. No idea of the power required to run the gizmo though.
16:35:37 From Peter Wasilko : Some of Ben’s projects had a Semantic Zoom to elide details at larger scales.
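Semantic zoom, as described above, can be sketched as a pure function from zoom scale to the level of detail rendered. The thresholds, level names, and node fields below are invented for illustration, not taken from any HCIL or Bederson system:

```javascript
// Sketch of semantic zoom: choose which details of a node to render
// for a given zoom scale. All thresholds and labels are illustrative.
function detailLevel(scale) {
  if (scale < 0.25) return "icon";    // far out: just a glyph
  if (scale < 1.0)  return "title";   // mid range: title only
  if (scale < 3.0)  return "summary"; // closer: title plus summary
  return "full";                      // zoomed in: full text
}

function render(node, scale) {
  switch (detailLevel(scale)) {
    case "icon":    return "▪";
    case "title":   return node.title;
    case "summary": return `${node.title}: ${node.summary}`;
    default:        return `${node.title}\n${node.body}`;
  }
}
```

The point of the technique is that zooming out elides detail rather than merely shrinking it, so the view stays legible at every scale.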
16:35:50 From Brendan Langen : love that. add canonical view to that trail list!
16:36:56 From Peter Wasilko :
16:37:36 From Mark Anderson : I see this poster and see the implicit hypertextual structure
16:39:09 From Fabien Benetou : physical poster could also provide interesting affordance for AR to zoom in or through in a specific part
16:39:45 From Fabien Benetou : precisely because there is a canonical view to start with.
16:39:47 From Peter Wasilko : Oh yes, we could open up one of those blurbs to see the full text behind it overlaid on top
16:40:50 From Peter Wasilko :
16:44:12 From Fabien Benetou : is it WebXR usable?
16:45:03 From Fabien Benetou : like UltraLeap?
16:45:04 From Fabien Benetou : ok
16:47:31 From Fabien Benetou :
16:48:09 From Fabien Benetou : which includes debugging tips; remote debugging from the browser is a relatively efficient process
16:50:46 From Brendan Langen : could imagine something like this very useful in the synthesis process. a lot we can gather around implicit structure if someone was working through this.
16:53:59 From Fabien Benetou : can probably be tailored with n-grams where you go from e.g. 100 then down until maybe 2 or 3 until you get an interesting amount of links
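Fabien's shrinking-n-gram idea could be sketched roughly as follows; the function names, default counts, and the "interesting amount" threshold are all assumptions for illustration:

```javascript
// Sketch: find shared word n-grams between two texts as candidate
// links, starting with long spans and shrinking n until enough
// matches turn up. Defaults echo the "from 100 down to 2 or 3" idea.
function ngrams(words, n) {
  const out = new Set();
  for (let i = 0; i + n <= words.length; i++) {
    out.add(words.slice(i, i + n).join(" "));
  }
  return out;
}

function candidateLinks(textA, textB, { maxN = 100, minN = 2, target = 5 } = {}) {
  const a = textA.toLowerCase().split(/\s+/).filter(Boolean);
  const b = textB.toLowerCase().split(/\s+/).filter(Boolean);
  const top = Math.min(maxN, a.length, b.length);
  for (let n = top; n >= minN; n--) {
    const bGrams = ngrams(b, n);
    const shared = [...ngrams(a, n)].filter(g => bGrams.has(g));
    if (shared.length >= target) return { n, links: shared };
  }
  return { n: 0, links: [] };
}
```

Long shared spans are rare but highly specific; shortening n trades precision for recall until the link count becomes useful.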
16:56:12 From Fabien Benetou : (also that’s “me” I guess beyond the reMarkable based prototype)
16:58:37 From Peter Wasilko :
16:59:45 From Fabien Benetou : making snapping right is the most challenging affordance IMHO 😀
17:00:45 From Brandel Zachernuk :
17:02:32 From Peter Wasilko :
17:03:16 From Peter Wasilko :
17:03:27 From Peter Wasilko :
17:07:28 From Peter Wasilko :
17:11:22 From Frode Hegland : How can we make a test space for this so that less technical people, like me, can tweak the parameters? Can we invest some money in a VR ‘lab’ maybe? Any suggestions? I could contribute some of our lab money for this.
17:11:27 From Mark Anderson : Yes. We need to escape coloured paper and spray glue as a way to break down complex problems (which seems to be where much high-level work is stuck, at least in public service).
17:12:39 From Mark Anderson : Posting order! Sorry, my last was in response to the previous conversation, not @Frode’s comment.
17:13:56 From Frode Hegland : Visual-Meta and HTML and VR.
17:17:49 From Adam Wern : Peter: demo made with threejs + troika-text. I’ve emailed you a zip file with my code now
17:17:54 From Frode Hegland : Super cool
17:18:18 From Peter Wasilko : @Adam, thanks so much!
17:18:38 From Peter Wasilko :
17:29:13 From Peter Wasilko : Imagine this in VR: Trinity_College_Dublin,_Ireland-_Diliff.jpg
17:30:04 From Peter Wasilko : We wouldn’t need the ropes to protect the books.
17:31:24 From Fabien Benetou : and with multiple manipulable content and views, I can definitely imagine 😀
17:33:12 From Mark Anderson : In terms of enhancing existing work, to me, subject matter is less important than the completeness of the starting artefact(s). Getting good metadata is a hard slog, so a good source saves a lot of initial investigation as to the underlying structure.
17:34:27 From Peter Wasilko :
17:34:47 From Peter Wasilko : This might be of interest.
17:37:03 From Peter Wasilko : Should we take a look at the International Image Interoperability Framework (IIIF)? This is the first I’ve heard of it.
17:39:37 From Brendan Langen : fyi – hard stop for me in 5 mins
17:41:01 From Frode Hegland : ok
17:41:07 From Mark Anderson : @brendan hello and (soon) goodbye
17:41:26 From Fabien Benetou : I’m trying to catch up :-#
17:47:15 From Brendan Langen : well said. always enjoy the chat. out next week for travel but back in 2 weeks. cheers!
17:48:09 From Peter Wasilko : I don’t know where it is coming from
17:49:13 From Frode Hegland : Adam, how do you want to write, twitter style like we tested or what is good? Can we all write what we want to do by Monday?
17:49:18 From Peter Wasilko : Finally I killed it.
18:01:37 From Peter Wasilko : Epimarkup™ may be an approach to explore.
18:02:32 From Peter Wasilko : Stylized parsable text we can generate Visual Meta from.
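One rough sketch of "stylized parsable text we can generate Visual Meta from": parse simple `Key: value` lines and emit a BibTeX-style block. The field handling and wrapper strings below are assumptions for illustration, not a specification of Epimarkup or Visual-Meta:

```javascript
// Sketch: turn stylized "Key: value" header lines into a
// BibTeX-flavoured Visual Meta block. Field names and the
// @{visual-meta-start/end} framing are illustrative.
function toVisualMeta(text) {
  const fields = {};
  for (const line of text.split("\n")) {
    const m = line.match(/^(\w+):\s*(.+)$/);
    if (m) fields[m[1].toLowerCase()] = m[2];
  }
  const body = Object.entries(fields)
    .map(([k, v]) => `  ${k} = {${v}},`)
    .join("\n");
  return `@{visual-meta-start}\n${body}\n@{visual-meta-end}`;
}
```

The attraction of this approach is that the source stays human-readable plain text while the metadata remains machine-parsable.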
18:12:46 From Brandel Zachernuk : I have a hard stop in 3 mins
18:12:54 From Frode Hegland : Ok, look forward to monday
18:13:20 From Fabien Benetou : I have to run too unfortunately
18:14:44 From Peter Wasilko : Sorry for the ‘noises off’ today.
18:15:40 From Mark Anderson : We can (ought to be able to) capture relevant tweets into our Journal as they are essentially open. If necessary we could add light annotation if the content (e.g. videos) might not be self-explanatory enough for the wider audience.
