As We May Link

The history of hypertext reaches as far back as the history of storytelling. Its future, in turn, is characterized by the power of building connections across an ever-expanding body of knowledge.

The world exploded into a whirling network of kinships, where everything pointed to everything else, everything explained everything else.

— Umberto Eco¹

I’ll never forget the first time I used the World Wide Web. It was in the early 1990s. I was in America visiting my girlfriend (now wife) at her college in Massachusetts. This was before Mosaic, the first graphical web browser, was released. There were no images on the web, but I was still stunned by the scope of what I experienced. Even back then, the web seemed limitless, without edges. That Encarta CD-ROM sitting next to the computer suddenly seemed pathetically constrained.

I bet you’ve got a similar story to tell. Telling stories is a universal human trait. Every culture in the world has a history of storytelling. In many ways, a culture is defined by its stories. The details may vary, but almost every distinct human culture has its own story about the creation of the world. These creation myths are often followed by another origin story, that of language.

For the indigenous peoples of Australia, language and creation are intertwined. The land is brought into being through song, and those songs must continue to be sung to keep the land alive. In the Judeo-Christian creation myth, language guarantees man his special place in the world:

And Adam gave names to all cattle, and to the fowl of the air, and to every beast of the field.

— Genesis 2:20

Language is power. If you know the name of something, you have power over it. Using the power of language, you can name not only animals but also objects and ideas. Once something has been converted into information like this, it can be transferred from person to person. All I have to do is move the meat in my mouth while passing air over the vocal cords in my throat and I can vibrate the air between us. As long as you understand the code in which those vibrations are encoded—English, for example—you can decode the information. All I have to do is move some air, and I can change the thoughts held in another person’s brain. This is a remarkable evolutionary hack.

There are limits to how much information can be retained inside the head of any one person. That’s where writing, the offspring of language, comes to our assistance. Writing allows us to document things, ideas, and experiences and keep them outside our brains. I can translate a physical object into a piece of information that can be retrieved later, not only by myself but by anyone capable of understanding my writing system.

There are economies of scale with this kind of information storage and retrieval. The physical world is a very big place filled with a multitude of things bright and beautiful, creatures great and small. If it were possible to use the gift of language to store and retrieve information on everything in the physical world, right down to the microscopic level, the result would be unlimited power. That’s the principle underlying Laplace’s demon, a theoretical being that knows the properties of every particle in the universe and thereby has the power to predict their future states.

An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

— Pierre-Simon Laplace²

This Newtonian idea of a clockwork universe was dented by Heisenberg’s uncertainty principle, but Laplace’s demon remains the logical conclusion to an ongoing human endeavor—the never-ending quest to name and catalog everything we see.

The Garden of Forking Paths

In the eighteenth century, Carl Linnaeus gave us binomial nomenclature as a way of cataloging species. At the same time, the French astronomer Charles Messier was putting together a catalog of celestial objects. Both men were attempting to name specific things: living organisms and objects in the night sky, respectively. One hundred years later, Melvil Dewey attempted to classify all knowledge neatly into a decimal system of ten main classes, each with ten divisions, and each division further partitioned into ten sections. We still use this system for wayfinding in physical libraries today. It was later expanded by the Belgians Paul Otlet and Henri La Fontaine into the Universal Decimal Classification, which used punctuation symbols to unlock further subdivisions of categorization. These people could legitimately be granted the title of true information architects, but they weren’t the first to attempt a classification of everything in existence.

Bishop John Wilkins lived in England in the seventeenth century. He was no stranger to attempting the seemingly impossible. He proposed interplanetary travel centuries before the invention of powered flight. In 1668 he wrote An Essay towards a Real Character and a Philosophical Language, the gist of which is explained by Borges:

He divided the universe in forty categories or classes, these being further subdivided into differences, which was then subdivided into species. He assigned to each class a monosyllable of two letters; to each difference, a consonant; to each species, a vowel. For example: de, which means an element; deb, the first of the elements, fire; deba, a part of the element fire, a flame.

— Jorge Luis Borges³

Borges plays with this idea in his short story “The Library of Babel.” Here, the universe consists of a single library, created from an infinite series of interlocking hexagonal rooms. This infinite library, containing nothing more than different combinations of letters and punctuation, holds every book that has ever been written, as well as every book that could ever possibly be written.

The problem with Bishop Wilkins’s approach will be obvious to anyone who has ever designed a relational database. Wilkins was attempting to create a rigid one-to-one relationship between words and things. Quite apart from the sheer size of the task, this rigidity meant that the project was doomed to fail.

Still, Wilkins’s endeavor was a noble one at heart. One of his contemporaries, Gottfried Wilhelm von Leibniz, recognized the value and scope of what Wilkins was attempting.

Leibniz wanted to create an encyclopedia of knowledge that was free from the restrictions of strict hierarchies or categories. He saw that concepts and notions could be approached from different viewpoints. His approach was more like a network, with many-to-many relationships.

Where Bishop Wilkins associated concepts with sounds, Leibniz attempted to associate concepts with symbols—an alphabet of human thought. But he didn’t stop there. Instead of just creating a static catalog of symbols, Leibniz wanted to perform calculations on them. Because the symbols would correspond to real-world concepts, this would make anything calculable. Leibniz believed that through a sort of algebra of logic, a theoretical machine could compute the answer to any question. He called this machine the calculus ratiocinator. The idea is a forerunner of Turing’s universal machine.

A Turing machine is the brainchild of the brilliant World War II codebreaker Alan Turing. It has two parts: a strip of tape that contains information, and a table of mathematical rules describing how that information should be processed. It sounds simple, but with a long enough strip of tape—and enough time—you could use a Turing machine to simulate anything in the universe, including another Turing machine. At that point it becomes a universal Turing machine—an instantiation of Laplace’s demon.

Turing’s universal machine isn’t real in the sense of being an actual physical object, but it is a very powerful idea. To put it another way, Alan Turing told a story, and that story changed the world. By providing a theoretical framework for information processing, the concept of a Turing machine influenced the history of computing.

There’s another story about a theoretical machine. This equally world-changing story was told in the form of an article, “As We May Think,” published in the Atlantic Monthly in 1945. Written by Vannevar Bush, it describes the memex, a desk-sized machine for collecting and retrieving vast amounts of information stored on microfilm. He introduced the innovative idea of associative trails, which would allow users of the memex to create their own connections between documents. It’s here, in the story of the memex, that we find the first stirrings of hypertext.

The term hypertext, along with hypermedia, was coined by Ted Nelson in the early 1960s. Nelson, the prototypical brilliant mad scientist, produced a series of books that were part manifesto, part comic, and part computer science manual in pursuit of his vision of a hypertext system eventually called Project Xanadu. But the project languished as vaporware for decades.

Small Pieces, Loosely Joined

It would take a young engineer named Tim Berners-Lee to turn the idea of hypertext into reality. The World Wide Web began as a story called “Information Management: A Proposal.” Berners-Lee’s boss approved the proposal with the scribbled words “vague, but exciting.”

Like many brilliant ideas, the World Wide Web is deceptively simple. Resources (usually HTML documents) are located at URLs and transmitted via the HyperText Transfer Protocol. If you want to retrieve a resource directly from the web, you need its URL. In other words, you need to know its name. But this way of naming things is very different from Carl Linnaeus’s or Melvil Dewey’s classification systems. While URLs must abide by a particular syntax, their contents are not predefined.

Instead of trying to create yet another taxonomic system for labeling resources on the web, Tim Berners-Lee left the naming of documents—and therefore the balance of power—entirely in the hands of the individual authors. It was a crazy move that seemed destined to fail.

However, there is one component of the World Wide Web that was predefined: HTML. The HyperText Markup Language that Tim Berners-Lee created was a modest vocabulary of tags that authors could use to structure their documents. It has undergone many revisions over the years, but one element was there from the start and will remain until the end. It is the alpha and the omega.

A stands for anchor. The smallest HTML tag is the most powerful. Using the href attribute, the author of one web document can create a hypertext reference that will point to another resource. The author just needs to know the name of that resource (its URL) and can form a connection without asking for anyone’s permission. The humble href opens up an Einstein-Rosen bridge, a wormhole between two previously separate places on the web.
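To make that concrete, here is roughly what such a link looks like in an HTML document (the URL is a hypothetical placeholder; any address on the web would do):

    <p>Read <a href="https://example.com/another-story">a related story</a>.</p>

That single href attribute is all it takes: no permission to ask for, just the name of the destination.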

For the first time, the power of grouping ideas and objects together ceased to be the province of hierarchical institutions and was placed into everyone’s hands. The result was phenomenal. The web’s growth was explosive. By the time I was introduced to the World Wide Web in that college dorm room in Massachusetts, it was already an incredible labyrinth of wonders—the collective work of ordinary people laboring separately to create the most astonishing collection of information that the world has ever seen.

There were early attempts to create order out of the chaos. Yahoo! started life as a directory of links, but it became clear that no taxonomy could encompass the diversity of resources on the web, and that no company, no matter how successful, could ever hope to keep pace with its growth. Trying to make a single directory for everyone was a hopeless task, but smaller, curated collections of links were more successful. Link-loggers—the precursors to today’s bloggers—were the shamans of the early web, wielding the power that came with knowing the URLs of cool and interesting resources.

This was an early demonstration that the web isn’t just a web of documents but also a web of trust where personal recommendations and a good reputation really matter. It’s a trend that can still be seen in our online social networks today.

Pattern Recognition

Sufferers of the medical condition apophenia are prone to seeing patterns of meaning in random, unconnected data. In truth, we are all somewhat apophenic. We draw constellations in the night sky. We hear music in rivers and streams. We recognize the man in the moon. Hypertext allows us to give full rein to our apophenic nature.

Take any two random URLs; now publish an HTML page that links to both of them—you’ve just generated a completely new connection. You have also added a small part to the ever-expanding story of the human condition as expressed through the medium of the World Wide Web.
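A minimal sketch of such a page might look like this (both URLs are hypothetical placeholders; any two addresses would serve just as well):

    <!DOCTYPE html>
    <html>
      <head>
        <title>Two things that belong together</title>
      </head>
      <body>
        <p>
          Consider <a href="https://example.com/one-thing">this</a>
          alongside <a href="https://example.org/another-thing">this</a>.
        </p>
      </body>
    </html>

The page itself is trivial, but the connection it declares existed nowhere until the moment it was published.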

The web is just twenty years old, and I’m not sure that we have yet come to terms with the power that this new medium grants us. When we create websites, it’s all too easy to fall into old patterns of behavior and treat our creations as independent, self-contained islands lacking in outbound links. But that’s not the way the web works. The sites we build should not be cul-de-sacs for the inquisitive visitors who have found their way to our work by whatever unique trails they have followed.

We should recognize that when we design and publish information on the humblest homepage or the grandest web app, we are creating connections within a much larger machine of knowledge, a potential Turing machine greater than any memex or calculus ratiocinator.

In telling this story of hypertext, I have tried to express the grandeur of the endeavor to which we are all contributing. But these words are not enough. They are tethered to these paper pages and strapped to the linear structure of this book. Imagine how much more powerful this story would be if just some of the words within it were hyperlinks. Those links would act as portals, ready to transport us to related stories that would themselves contain further magical waypoints.

Alas, this is not hypertext. It is simply text.

And so this story ends.

  1. Umberto Eco, Foucault’s Pendulum (Bompiani, 1988).

  2. Pierre-Simon Laplace, A Philosophical Essay on Probabilities (1814).

  3. Jorge Luis Borges, “The Analytical Language of John Wilkins” (Sur, 1952).

Jeremy Keith

Jeremy Keith is an Irishman living in Brighton, England, where he makes websites at the design agency Clearleft. He is the author of HTML5 for Web Designers, and he can be found online at Adactio.

Illustration by Rob Bailey · Portrait by Luke Pearson