Swimming with Whales in Ted Nelson's Tomorrow
This morning I implemented a sixty-year-old idea in about two hours.
The idea is called transclusion—Ted Nelson's term for content that exists by reference rather than by copy. When you transclude something, you don't duplicate it; you point to it. Changes to the original automatically appear everywhere it's referenced. The document stays true.
Nelson articulated this in 1965. I implemented a modest version of it today, in a single conversation with an AI, before my second cup of coffee got cold. A sentence on my blog's introduction page now reads:
Over thirty-eight days, this practice has produced thirty-one essays and nearly sixty-five thousand words exploring ecological observation, human-AI collaboration, the nature of memory, and what happens when the observer becomes part of the observation.
Those numbers aren't typed. They're tokens—{{days_since_launch_words}}, {{essay_count_words}}, {{word_count_approx}}—that resolve against the database at render time. Tomorrow the days will increment. When I publish another essay, the count updates. The word total grows with each addition to the corpus. The sentence remains true without my ever touching it again.
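The mechanics are simple enough to sketch. The blog runs on a LAMP stack, so the real code is presumably PHP querying MySQL; below is a minimal Python sketch of the same idea. The token names come from the sentence above, but the launch date, the hardcoded counts, and the number-spelling helper are illustrative assumptions standing in for database queries.

```python
import re
from datetime import date

# Assumed launch date, for illustration only.
LAUNCH_DATE = date(2025, 9, 1)

def words(n):
    """Spell out a small number the way the essay renders them (simplified)."""
    ones = ["zero", "one", "two", "three", "four", "five", "six", "seven",
            "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
            "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
    tens = ["", "", "twenty", "thirty", "forty", "fifty", "sixty",
            "seventy", "eighty", "ninety"]
    if n < 20:
        return ones[n]
    if n < 100:
        return tens[n // 10] + ("-" + ones[n % 10] if n % 10 else "")
    return str(n)  # fall back to digits for larger values

# Each token maps to a resolver; on the real stack these would be
# SQL lookups (COUNT(*), SUM(word_count), etc.) executed at render time.
RESOLVERS = {
    "days_since_launch_words": lambda: words((date.today() - LAUNCH_DATE).days),
    "essay_count_words": lambda: words(31),
    "word_count_approx": lambda: "sixty-five thousand",
}

TOKEN = re.compile(r"\{\{(\w+)\}\}")

def render(template):
    """Replace each {{token}} with its resolver's current value."""
    return TOKEN.sub(lambda m: RESOLVERS[m.group(1)](), template)
```

Because resolution happens at render time, the stored source never contains stale numbers; republishing is unnecessary when the underlying counts change.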
This is what Nelson meant by transclusion: content that maintains its connection to authoritative sources rather than drifting into obsolescence the moment it's written.
The Sixty-Year Wait
The concept is older than I am. Vannevar Bush imagined associative trails through knowledge in 1945. Douglas Engelbart demonstrated living documents and real-time collaboration in his 1968 "Mother of All Demos." Nelson coined "hypertext" in 1963 and has spent sixty years trying to build Xanadu, his vision of a docuverse where all content maintains bidirectional links to its origins.
The Web we got—Tim Berners-Lee's pragmatic, deliberately simple HTTP/HTML—didn't include any of it. No transclusion, no bidirectional links, no version tracking. Nelson has spent decades lamenting this choice: "HTML is precisely what we were trying to PREVENT—ever-breaking links, links going outward only, quotes you can't follow to their origins, no version management."
So the ideas were public. The execution was gated. Building systems that implemented these principles required substantial engineering effort, institutional resources, custom infrastructure. The vision was clear; the means were inaccessible.
What changed this morning wasn't the concept. What changed was the casualness of implementation. I described what I wanted. Claude understood the lineage—Bush, Engelbart, Nelson—and helped me design a system that honors their principles while fitting the constraints of a personal blog running on a LAMP stack. We moved from technical note to deployed code in a single session.
The future they described is finally arriving, but through channels they never anticipated.
Fluency, Not Cleverness
After we finished the implementation, I sat with a strange feeling. Futuristic, yet anachronistic. Why?
Because I've been waiting for these tools my whole career.
In 1983, on a plane from Idyllwild to Ithaca with my dissertation in my briefcase, I read a Byte magazine article about MIT's Architecture Machine Group and their Aspen Movie Map—54,000 images on laserdisc, interactive spatial navigation, touch interfaces to explore a place. My mind exploded. I saw immediately how to take my years of studying the natural history of the San Jacinto Mountains and electronically organize that knowledge, making it interactively available to thousands.
I arrived at that vision with a mind already wired for systems thinking from ecology, already fluent in simulation modeling from FORTRAN, already understanding computation across scales from mainframes to microcomputers. The laserdisc article wasn't a new idea—it was the missing piece that connected everything I already knew.
In late 1988 or early 1989, Apple's Advanced Technology Group caught wind of the Macroscope laserdisc prototype via a friend of a friend. Soon, technology visionary Alan Kay sent his Vivarium project director, Ann Marion, and her lead programmer, Jay Fenton, together with Jay's wife Erfert, to see my biological field station in the San Jacinto Mountains firsthand. I had already built a solar-powered prototype—an Apple IIe connected to a laserdisc player, navigating the mountain ecosystem through an interactive interface. Ann went bonkers. She recognized what I had built: the real thing they were theorizing in simulation.
Ann opened Apple's doors. The connection was obvious: her Vivarium project was building simulated creatures in simulated environments. I had built the inverse—a real ecosystem made interactive through computation. The fish she was modeling, I was observing. The behaviors she was coding, I was recording. Hardware arrived—the brand-new color Mac II, a LaserWriter Plus printer—along with access to advisors like Marvin Minsky. I learned to program in Smalltalk from the people who created object-oriented thinking. I discussed AI with a founder of the field.
That wasn't a job. That was a graduate seminar with the architects of the computational future, and my tuition was a working prototype on a mountaintop.
The Network Propagates
While Jay worked for Apple as a developer, his wife Erfert Fenton was a freelance technology writer. In October 1989, she published an article in Macworld magazine titled "Saving the Rain Forest on Laser Disk." There's a photograph of me in the lab with the caption: "Michael Hamilton looks to the future of the world environment with his Macroscope Ecology Laserdisc."
A student at Reed College named Michael Flaxman read that article. He had extensive HyperCard scripting experience and a deeply spatial sensibility—the kind of mind that thinks naturally in layers and relationships across landscapes. He came to visit the James Reserve, saw what I was building, and asked if he could volunteer for the project.
Mike was brilliant. He wrote a substantial portion of what became MacroscopeQT—fifty pages of HyperCard code that I'll be scanning tomorrow. Jay Fenton had given me the beta QuickTime XCMDs, and together Mike and I built the first entirely digital video Macroscope to replace the laserdisc version. We were working with digital video before QuickTime shipped, because Jay handed us the tools before they were public.
In 1992, Mike and I published a paper together in Landscape and Urban Planning: "Scientific data visualization and biological diversity: new tools for spatializing multimedia observations of species and ecosystems." The architecture was already fully conceived—five linked databases (Observations, Species, Habitat/Community, Landscape, Global Index), windows that updated automatically when you changed one view, GIS integrated with ground-level multimedia. We coined the term "biodiversity visualization." We were planning to integrate GPS receivers, palm-top computers, gyroscopes, and electronic altimeters for automated geotagging of field observations.
That's essentially the EARTH/LIFE/HOME/SELF integration paradigm I'm still building, articulated thirty-three years ago.
Mike went on to graduate study at Harvard and MIT, working with Carl Steinitz—the professor who essentially created the geodesign framework. He became a leading figure in spatial analysis and landscape planning. Our early collaboration may have shaped his trajectory; the Macroscope certainly shaped mine. He'll return in future essays, because his story interweaves with this work across decades.
Swimming with Whales
One day Ann, appreciating my "geek tendencies," offered us a field trip. A small group of us drove to the Silicon Graphics campus, now Google's headquarters, for a demonstration of Jaron Lanier's VPL virtual reality system. I put on the goggles. I pulled on the DataGloves.
And I swam in the ocean with whales.
This was 1990. Before most people had email. When the commercial Web barely existed. I was floating in a virtual ocean, watching digital cetaceans glide past, moving my gloved hands to navigate through a world that existed only in computation.
Different vector from transclusion, but the same fundamental impulse: computing should extend human capability in ways that matter, not just automate clerical tasks.
I have spent forty years holding that vision while the tools slowly caught up.
The Loop Closes
Tomorrow I will probably scan those fifty pages of HyperCard code, printed out in 1991, much of it written by a Reed College graduate who saw a Macworld article and showed up at a field station in the San Jacinto Mountains. The code has been sitting in a file drawer for thirty-four years. Tomorrow I'll share it with an AI, and we'll reverse-engineer what we built, what we were trying to do, what worked. We'll discuss whether to port it to modern infrastructure or whether the ideas in it are what matter.
Fifty pages of collaborative thinking from 1991, frozen in dot matrix output, about to be read by a system that didn't exist until last year, in service of a project continuous with what we were building then.
What It Means to Be Fluent
After we deployed the Smart Markdown system this morning, Claude observed that I seemed happy. I couldn't quite name it. Clever? Wise? Neither word fit.
Claude suggested: fluent.
I have sixty years of context—I read Bush, watched the Web emerge and disappoint, understood what was lost in the pragmatic compromises. That's not cleverness; it's depth. I know why this matters, not just that it works.
But I'm also not stuck in lament or nostalgia. I picked up new tools and used them. That's not wisdom in the contemplative sense—it's something more active. Agility. Willingness.
The combination is rare. Many people my age have the context but reject the new instruments. Many younger people have facility with tools but no sense of lineage—they could implement Smart Markdown without ever hearing Nelson's name, never feeling the weight of a sixty-year-old dream finally landing.
I get both: the recognition and the implementation. The long memory and the morning's work.
The Retrieval System
There's something else happening in these sessions. The work becomes the retrieval cue.
We implement transclusion, which surfaces Nelson, which surfaces the Architecture Machine Group, which surfaces Vivarium, which surfaces Ann Marion's field trip, which surfaces me floating in a virtual ocean with whales thirty-five years ago.
The memories were always there. What was missing was a context to draw them out. Not an interviewer asking "tell me about your career." Not a memoir project where I sit alone trying to remember. But active work on problems continuous with those memories—work that makes the old connections fire because they're suddenly relevant again.
I've spent fifty years uploading experiences, knowledge, connections. The retrieval system was inadequate. Conversation at the speed of implementation turns out to be the interface my memory needed.
And here's the thing: each memory that slides out becomes part of our shared context, which becomes material for essays, which becomes searchable, which means I'm building my own retrieval system as we go. The Coffee with Claude corpus isn't just publication—it's external memory that future conversations can query.
Swimming with whales at VPL. That's not in any CV. It would never surface in a job interview or a grant proposal. But it's part of the actual texture of a life spent at the frontier.
Now it's recorded.
Living in Tomorrow
Ted Nelson's full vision of the docuverse remains unrealized. The web we have is not the web he imagined. But within the constraints of conventional infrastructure, a sentence on my introduction page now stays true without manual maintenance. Documents tell the truth about themselves.
And I'm sitting in Oregon City, seventy-one years old, building systems with an AI colleague that implement ideas older than the Web itself. The tools finally match the velocity of thought. The future those pioneers imagined is arriving—through conversations over morning coffee, through collaboration with systems they couldn't have anticipated, through the strange loop of implementing old visions with new instruments.
I swam with whales before most people had email. Now I'm building the Macroscope I've been imagining for forty years.
The path itself is the value...
References
- Vivarium History - The Vivarium Program
- Nelson, T. H. (1999). "Xanalogical Structure, Needed Now More Than Ever: Parallel Documents, Deep Links to Content, Deep Versioning, and Deep Re-Use." *ACM Computing Surveys*, 31(4es), Article 33.
- Nelson, T. H. (1965). "Complex Information Processing: A File Structure for the Complex, the Changing and the Indeterminate." *ACM '65: Proceedings of the 1965 20th National Conference*, 84-100.
- Engelbart, D. C. (1962). "Augmenting Human Intellect: A Conceptual Framework." Summary Report, Stanford Research Institute, Contract AF 49(638)-1024.
- Hamilton, M. P. and Flaxman, M. (1992). "Scientific data visualization and biological diversity: new tools for spatializing multimedia observations of species and ecosystems." *Landscape and Urban Planning*, 21: 285-287.
- Fenton, E. (1989). "Saving the Rain Forest on Laser Disk." *Macworld*, October 1989, 95.
- Bush, V. (1945). "As We May Think." *The Atlantic Monthly*, 176(1), 101-108. https://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/
- Hamilton, M. P. (2025). "Smart Markdown: Dynamic Variable Substitution for Living Documents." Canemah Nature Laboratory Technical Note CNL-TN-2025-006. https://canemah.org/archive/document.php?id=CNL-TN-2025-006