Interview with Belinda Barnet, author of “Memory Machines: The Evolution of Hypertext”

The following is an interview with Belinda Barnet, author of
Memory Machines: The Evolution of Hypertext

This book is an exploration of the history of hypertext, an influential concept that forms the underlying structure of the World Wide Web and innumerable software applications.

Q: In the Introduction to your book, Stuart Moulthrop describes the current state of networked, computational media as ‘ugly and benighted’. Do you agree with him?

Belinda Barnet: I think we are facing some very real problems with networked computational media that people like Ted Nelson predicted we would face. For example, although the web is a successful globe-spanning archive and publishing system, it has issues as a hypertext system. Links break, content is duplicated all over the shop, copyright is difficult to preserve, the page you visited three months ago has now vanished. Many of the earlier hypertext systems I survey in this book had their own solutions to these problems. So yes, I understand what Stuart means by ‘ugly and benighted’.

Q: How do you account for the success of the Web if earlier versions were ‘in some respects more powerful than the Web’, as you say?

BB: I think the web won because it worked. It was not a proprietary system (like most of the systems I survey in the book), it was not difficult to learn, it provided a cheap and quick and effective way to publish things and create links *between* documents on opposite sides of the world. That’s quite an amazing feat. But just because it works doesn’t mean other systems would not work just as well, or better. Systems like Xanadu for example.

Q: You write that ‘[t]he systems I look at here were exciting and revolutionary in their era, but like mermaid’s gold from children’s storybooks, they turned to ashes when brought to the surface’. Why do you think alternative visions have never made it to the surface?

BB: They all had different reasons. For NLS, although the demo changed the computing world, the system itself ultimately faded out. I think it was because it was difficult to learn and use, but also because Doug was brilliant, a gifted and visionary person, but not really a businessman.

I personally still hold out hope that Xanadu will make it to the surface, but I think it hasn’t so far for a number of reasons that are detailed in the book. Part of it may be that Nelson was at times his own worst enemy. Another reason may be that transclusion, the very powerful and groundbreaking idea that content should be re-used by reference rather than copying (and that this content should be traceable back to its source), has been a little difficult to build. Nelson has always wanted that to work, and has waited a long time for it.

Memex was never built because digital computing swept the planet and it was really an analogue device, so the technology itself became obsolete.

Q: What, in your opinion, would be the aim and purpose of a perfect hypertext?

BB: The perfect hypertext system would be simple and easy to use, which the web already is. But links would not disappear or break, you would not lose documents, you would be able to re-use content by reference and trace that content back to its source. The perfect hypertext system would not require you to festoon content with markup before you publish it – and it would not require search engines to make sense of it for you.

Q: Why would you recommend students read your book?

BB: There is much to be learned from the history of hypertext, especially from the systems that failed. We are currently facing problems that the early pioneers predicted we would face, and they had their own solutions to those problems. So this history is also relevant to people who are currently trying to design solutions to these problems, such as the semantic web. I would like to see computing science as well as media and communications people reading this book.


Find out more about the book and the author on our website: