by Claus Atzenbeck – ACM SIGWEB Newsletter Spring 2024 | DOI
Matthias Müller-Prove: My grand topic seems to be cultural computing. It started with digging into the history of hypertext and graphical user interfaces for my master's thesis. Then I realized that computer systems make a difference in how users perceive the world and, in that sense, computers have an impact on society as such (cf. my reboot7 trip report). Interactive systems are not only tools to accomplish certain tasks – networked computers are also a universal communication medium to connect people with each other, to access our history, and to shape our future.
In 2016 I got involved in Coding Da Vinci, a series of cultural hackathons in Germany. Museums and libraries open their collections to coders and designers to initiate an innovation process. This was the ignition point for the Chrono Research Lab. At the core of my research activities stands the design and development of an IIIF viewer to browse vintage maps and old books. More than 7,200 maps and 3,500 geocoded info spots are projected onto a global map, while books with hundreds of pages can be studied from a bird's-eye view down to the level of the smallest blob of ink. I should briefly introduce IIIF. The International Image Interoperability Framework is an open standard for delivering zoomable images over the net. Images, i.e. scanned pages or maps among other media types, can be presented as larger entities like books or atlases. IIIF viewers are Web apps that display such items in a browser. My Chronoscope World offers time travel on old maps and combines it with browsing through millions of books from libraries and archives world-wide.
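To make this a bit more tangible, here is a minimal sketch of how a client could read a IIIF Presentation manifest and list the canvases (scanned pages) it contains. The manifest URL is a placeholder rather than an actual Chronoscope or library endpoint; the snippet only relies on the publicly documented IIIF 2.x/3.0 manifest structure.

```python
# Minimal sketch: fetch a IIIF Presentation manifest and list its canvases.
# MANIFEST_URL is a placeholder – substitute any public IIIF manifest.
import json
from urllib.request import urlopen

MANIFEST_URL = "https://example.org/iiif/atlas-1740/manifest.json"

with urlopen(MANIFEST_URL) as resp:
    manifest = json.load(resp)

# IIIF Presentation 3.0 keeps canvases in "items";
# version 2.x nests them under "sequences".
canvases = manifest.get("items") or manifest["sequences"][0]["canvases"]

for canvas in canvases:
    # "label" is a plain string in 2.x and a language map in 3.0.
    print(canvas.get("label"), canvas.get("width"), canvas.get("height"))
```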
Through the hypertext glass… The globe is the canvas for spatial hypermedia. Of course, the locations are predefined by the content of the maps, but in terms of information architecture and retrieval, spinning and zooming the globe is the first step for finding old maps or info spots that stimulate the user's curiosity. The map coverage is always visible in a highly responsive user interface. The collections of more than 80 openGLAM institutions are meshed together and interconnected with links. (openGLAM is a movement of galleries, libraries, archives, and museums to share collections under Creative Commons licenses or release them into the public domain.) Classical hyperlinks connect two anchors with each other. In Chronoscope, maps are connected with each other regardless of the GLAM server. They are clustered by, e.g., cartographer, topic, timeframe, or location. Some groups are predefined – others are based on dynamic filters and full-text queries, as sketched below.
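As a toy illustration of such dynamic filters – the record fields below are assumptions made for the sake of the example, not the actual Chronoscope data model – a full-text query can simply be combined with a timeframe filter over the map metadata:

```python
# Toy sketch of a dynamic filter over map metadata (hypothetical fields).
from dataclasses import dataclass

@dataclass
class MapRecord:
    title: str
    cartographer: str
    year: int
    institution: str

maps = [
    MapRecord("Africae tabula nova", "Abraham Ortelius", 1570, "Library A"),
    MapRecord("Nova totius terrarum orbis", "Willem Blaeu", 1635, "Museum B"),
]

def matches(rec: MapRecord, query: str, timeframe: tuple[int, int]) -> bool:
    """Full-text match on title and cartographer plus a timeframe filter."""
    lo, hi = timeframe
    text = f"{rec.title} {rec.cartographer}".lower()
    return query.lower() in text and lo <= rec.year <= hi

print([m.title for m in maps if matches(m, "blaeu", (1600, 1700))])
```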
Browsing medieval books provides a first-hand experience of writing culture and history. These books are the predecessors of modern hypertext. Providing easy access and a unified linking scheme with ChronoLinks down to page and pixel coordinates is an exciting R&D mission.
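The actual ChronoLink syntax is not reproduced here, but the IIIF Image API already offers a standard way to address a pixel region of a scanned page. A sketch, assuming a placeholder IIIF endpoint and image identifier:

```python
# Sketch: build a IIIF Image API request for a pixel region of a scan,
# following the template {id}/{region}/{size}/{rotation}/{quality}.{format}.
# The base URL and image identifier are placeholders.
IIIF_BASE = "https://example.org/iiif"

def region_url(image_id: str, x: int, y: int, w: int, h: int) -> str:
    """Return a URL for the w-by-h pixel region at (x, y) of the given image."""
    return f"{IIIF_BASE}/{image_id}/{x},{y},{w},{h}/max/0/default.jpg"

print(region_url("map-1740-amsterdam", 2048, 1024, 512, 512))
# -> https://example.org/iiif/map-1740-amsterdam/2048,1024,512,512/max/0/default.jpg
```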
Ben Shneiderman's keynote at CHI 1998 made a deep impression on me. He delivered his talk »Codex-Memex-Genex« with a zooming user interface technique. It was quite similar to Prezi or some Mural boards today – just a quarter of a century earlier. The concept of zooming has gradually developed over time. Pinch and spread gestures became natural with the advent of mobile touch devices. The intriguing aspect is the explorative flight into deeper zoom levels. Getting lost in cyberspace is less of an issue because you stay oriented thanks to the bigger picture in the mental rear-view mirror, so to speak.
A second aspect touches the tools' architecture. Existing IIIF viewers, like the open-source projects Mirador or Universal Viewer, are usually integrated into the collection's page layout. The nature of a plugin makes them institution-centered rather than user-centered. Furthermore, each site adds a little bit of customization to better present its heritage documents. As a result, each IIIF viewer looks and behaves a little differently across sites. Even though interoperability is a proclaimed objective, from the user's point of view it is hard to accomplish. The user experience feels siloed. There is no obvious and easy way to use any preferred tool on all GLAM sites. Therefore the flexibility to interconnect items from any library with any museum within just one environment is quite limited. A deep technical understanding is required to accomplish average tasks on a set of distributed collections.
My vision would consider infrastructure, open formats, and flexible tools that bridge the gaps between heritage institutions and the public. Maybe an analogy with the Internet will support my reasoning. Originally the Internet provided the infrastructure to interconnect several independent national networks with each other. On top of the Internet, the World Wide Web is a standardized hypertext service that offers unified access to information. Several Web browser projects are still under development to support users with powerful tools to access online media.
The IIIF community has contributed a very important asset – i.e. an open standard to transfer digitized artifacts and metadata – which is widely adopted by heritage institutions. The users should just be free to choose among flexible tools that best support their tasks and scenarios.
This seems like a historical question. When you take a look at the state of the art, not many of the original ideas of hypertext have survived. On the surface you click or touch certain active areas to trigger the program to push new content to you. Hypertext – in Ted Nelson's notion – used to mean the ability of non-sequential writing: transforming ideas into meaningful text and offering pathways through the maze. Mind the difference between authoring and consuming. Mind the totally different software environments that are, on the one hand, required to create a Web site and, on the other hand, needed to browse the Web.
The mission of interaction designers is to provide useful and usable systems that support the user in getting a job done with ease and joy. The product teams deliver tools and complex Web sites that should be easy to navigate and operate. The user should deal with the tasks and not struggle with the user interfaces. The craft of designing comprehensive and clear navigation structures is most closely related to hypertext. The conceptual design to show follow-up actions and next steps is hypertext. Undo – a.k.a. one step backwards – is hypertext. Interaction design is the creation of virtual spaces that offer functions to proceed to the next desired steps and hopefully closer to the completion of a task. This can be a shopping experience or getting a news update on the latest events of the day.
Identifying mutual goals and objectives would support inter- and cross-disciplinary approaches. As far as I can tell, the current research topics in digital humanities are statistical analytics of written corpora (distant reading), AI support for OCR and transcription projects, and knowledge graphs for semantic analyses. W3C Web Annotations are quite popular in the IIIF area. Digital Humanities is not my field, hence I would probably have to apologize for the brevity of my summary. The Chrono Research Lab contributes some perspectives to the field: computer science and software engineering, user-centered interaction design, usability, pragmatic solutions, and hypermedia concepts.
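For readers unfamiliar with them: a W3C Web Annotation is a small JSON-LD document. The following sketch comments on a pixel region of a IIIF canvas; the IDs and the comment text are placeholders, while the structure follows the published Web Annotation Data Model.

```python
# Sketch of a W3C Web Annotation targeting a region of a IIIF canvas.
# The canvas and annotation IDs are placeholders.
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "id": "https://example.org/anno/1",
    "type": "Annotation",
    "motivation": "commenting",
    "body": {
        "type": "TextualBody",
        "value": "Cartouche with the printer's dedication.",
        "format": "text/plain",
        "language": "en",
    },
    "target": {
        "source": "https://example.org/iiif/atlas-1740/canvas/p42",
        "selector": {
            "type": "FragmentSelector",
            "conformsTo": "http://www.w3.org/TR/media-frags/",
            "value": "xywh=2048,1024,512,512",
        },
    },
}

print(json.dumps(annotation, indent=2))
```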
Let me also apologize upfront for any assumptions regarding the hypertext community. I've seen plenty of interesting concepts for how to evolve hypertext, hypermedia, and the Web. However, the Web and all online media are not running on experimental servers anymore. Hypertext in the current flavor of social media is relevant for the well-being of our societies. Many research approaches that aim to address certain problems are destined to fail because they don't get traction in the wild. They are neither adopted by browser projects, nor do they spread from site to site due to their alluring elegance or proven quality.
Hypertext researchers, Web and interaction designers, and also – my current focus – cultural computing professionals should meet and learn about their different perspectives and discuss overlapping areas to the benefit of all parties, last but not least the users.
It is challenging to describe hypertext history as an evolutionary development. In my opinion important concepts have not survived; even worse: they are forgotten. For instance, the idea to edit and publish trails of thought has boiled down to timelines with very limited interactions. Users stare at screens… doomscrolling to the next meme or funny video. When did the meaning of algorithm shift from a computer program to a mysterious black box that shuffles addictive and persuasive micro content without context? And I haven't even touched the area of AI fake news and troll farms yet. Reality hit the fan of hypertext. Hypertext is no longer a mere means of knowledge management. Its grandchildren are the elements of a new omnipresent cyber reality that shapes our perception of the world.
Electronic literature is no longer explored. For a few decades it was a field for avant-garde poets to push the limits of language. For instance, multi-targeted links have been used to convey special meaning in online fiction. I have high regard for artists whose aim is to widen the realm of the thinkable. More expressive structures like typed hyperlinks and tagging had their time. Remember the fashionable tag clouds of the Web 2.0 era? On the other hand, the Semantic Web has been postponed several times. To phrase it more kindly, it has changed its foundations several times, until large language models now seem to solve all issues with one big artificial-neuro-stochastical-language approach.
No. In my opinion there is no linear trajectory from classical hypertext to current research. I can draw a line from Bush to Nelson and Engelbart, to van Dam, then Shneiderman, Atkinson, Berners-Lee. The first decade of the Web culminated in the dot-com bubble. Web 2.0 during the early 2000s is the last phase that I can see in the fading tradition of hypertext. Mobile apps and the so-called social media of course use Internet and Web technology to deliver their services. But the motivation is to attract as many eyeballs as possible to maximize the exposure rate of ad banners or to boost influencers for marketing or ideological reasons.
The continuity broke nearly 20 years ago. It is surprising because usually, in retrospect, historians can construct a clear path of progress. In the area of hypertext I would say that development is driven by hardware innovation; each generation of devices – from PCs to networked computing to mobile to VR/AR and mixed realities and conversational chat and voice user interfaces – adds a new flavor to hypertext interfaces. The more successful it became, the less hypertext-like it remained.
I am just paraphrasing McLuhan’s The medium is the message.
The currently dominating media channels are timeline-based micro-video and chat systems. Micro-blog or vlog do not seem to be proper terms for insta, tiktok, weibo and the late twitter anymore. I'd like to see a framework of categories that captures the current landscape of so-called social media.
Hypertext stands in the tradition of cultural techniques: oral language, writing, the printing press, telegraphy, and text processing on computer screens. Hypertext is the latest form of written communication between people. It used to hold the promise of more flexibility and connectedness compared to static ink on paper. Computers are communication devices. Hypertext could have played a dominant role in capturing the context and dynamics between distributed messages on social media and semi-private chat systems. Instead, the timelines and discussion areas are long scrolls that can hardly be digested. They do not foster discussions and understanding of one another. The tools do not support the users to gain an overview and contribute their points of view to the discourse.
But you've asked about science in general. We are still bound to the concept of rectangular sheets of paper. Since the advent of word processors, academic papers emulate paper on screen with footnotes and appendices. Hardly any active hyperlinks to cited references. Yes, DOIs are used and open access publications become more widely adopted. But classical link decay is terrifying: 38% of web pages from 2013 are no longer accessible. The Internet Archive's Wayback Machine helps only sometimes. Some formats are not even supported anymore. The content of embedded interactive Flash elements is lost. Early modern books come to mind. They are easier to access than some Web sites of the last decade. Ironically, early Web pages with pure HTML and simple CSS work quite well, though.
Imagine a scientific publishing ecosystem that offers links to related papers and research data as simply as a tap with your finger. Initiate a conversation with the authors to ask questions or to provide feedback right there on the screen where you are reading their “paper”. Engage with other readers and exchange ideas on a respectful and open-minded platform. Collaborate with other authors in a safe and trusted hypermedia space (cf. JournalHub alpha).
It is tempting to erase some unfruitful inventions and hope to improve the world. I just don't believe that this would really help. The issues that are really concerning are conflicts between people and selfish ideologies that do not care for peace or a healthy future on our planet Earth. Technology as such is not the reason for so many catastrophic situations right now. It is just misused by parties who strive for their own advantage. Let me quote one of Kranzberg's laws of technology, which, frankly, I am still trying to understand: »Technology is neither good nor bad; nor is it neutral.«
AI and Augmented Reality seem to be obvious. But I hope that we make progress in the area of online communication. There is so much room for improvement to eliminate spam, fraud, hate, and fake content. Political decisions should not be influenced by micro-targeted opaque campaigns. There should be better tools to explain complex and dynamic systems like climate, biodiversity, urban development, and sustainable energy supply, just to name a few. There should be easily accessible simulation tools and safe and easy authoring of texts with well-defined target audiences. Everything is a remix. Well, if so, then please make it easy to look up the original sources and the context with tools that are flexible enough to bridge the gaps between media silos – regardless of whether these silos are economically motivated, rooted in language or culture, or, easiest to address on the digital infrastructure level, just technical hurdles of interoperability.
Matthias Müller-Prove (Dipl.-Inform., U Hamburg, Germany) is an independent innovation and interaction designer. His experience is based on 25 years of designing software for internationally operating companies such as Adobe, Sun, and Oracle. Matthias shares his insights as a lecturer at design academies on topics such as Information Architecture, Branded Interactions, or Social Media Trends.
Matthias's thesis “Vision and Reality of Hypertext and Graphical User Interfaces” was honored with the Wolfgang von Kempelen Prize for Computer Science History. His Chronoscope World was a selected project of the European Cultural Heritage Year 2018. Since 2009, Matthias has been a Senior Member of the ACM.
Cover: CC BY-NC mprove photography – a sculpture at Trinity College Dublin