Comments from Frankston, Reed, and Friends
Thursday, October 28, 2004
BobF at 9:52 PM [url]:
I'm finally getting around to updating my site. It's been too long since I've posted a fresh essay. One reason is that I have too much to write about. I'm trying to focus on key issues against all the background noise. This year's presidential race is a depressing reminder of how difficult it is to communicate ideas across conceptual divides -- the words seem the same but have entirely different meanings.
The purpose of this essay is to catch up on a lot of interrelated topics by careening through them. I can then relax and start to write about each one in more detail. It's best to think of this as an overture that presents the ideas as a prelude to a set of essays I'm planning to write.
Though I cover a number of topics in this essay, there is a unifying theme: the end-to-end argument, which decouples the communications (or message) from the transport. It changes telecommunications into tele/communications. It also applies to marketplaces -- it is more valuable to provide opportunity than just solutions to old problems.
Short blog pieces are fine for discussion among a group with a common framework but not for explaining new ideas. My VON Magazine columns are limited to 750 words which is about enough to make a statement but not enough for any depth. This is frustrating but forces me to organize my thoughts and I hope it makes people interested in learning more. It's far easier to write email in a discussion thread than to write an essay that stands alone -- that's why I post pointers to letters I write in public discussions such as David Farber's Interesting People list.
Instead of getting lost in the myriad details of tele/communications policy, I'm trying to identify the essential issues or leverage points. I insert the "/" to emphasize the parallel with the / that separates TCP/IP. The two are independent -- TCP (Transmission Control Protocol) is not a layer above IP (Internet Protocol). It seems natural to view TCP as a layer above IP but this creates an unnecessary dependency. When we write telecommunications as a single word we bury this assumption in our language and it becomes difficult to recognize that there are really two independent concepts.
The key to the success of the Internet is the end-to-end argument. It decouples TCP (communications) from IP (transport bits over a distance or "tele"). The Internet was such an improvement over earlier architectures that problems such as the need to maintain the mapping of names to IP addresses were viewed as mere annoyances. Eventually the Domain Name System (the DNS) was created to keep the problem manageable and then governance efforts such as ICANN were created as the difficulties multiplied on the assumption that issues should be resolved by consensus rather than technology. The Internet routing tables have to keep track of all the local networks in order to keep the IP addresses relatively stable -- the resulting complexity is resolved by setting priorities rather than removing the source of the complexity. The application layer is dependent upon the central services to get addresses and domain names. Domain names are considered commercially valuable and thus must be allocated in accordance with social and commercial policies.
I use the term "Internet Inc" to describe the imaginary entity that is supposed to be providing Internet services. Since the Internet is so valuable it is easy to confuse "end-to-end" with "womb-to-tomb" -- the vital nature of the Internet has gotten lost in translation and we find ourselves dependent clients of this imaginary company.
We've lost the essential idea of removing dependencies in order to allow innovation at the edge. The P2P (Peer-to-Peer) community demonstrates the resilience of E2E by taking control at the edge just as the Internet community did when the center was the telephone network. As a reminder of the difficulty of crossing conceptual divides, many people think P2P is about stealing music but that's another topic in its own right -- so-called digital rights.
End-to-end argues that we can find ways to accomplish our tasks without waiting for the "Internet" to give us a solution. In fact that's just what P2P does -- it creates connections despite the impediments in the middle.
A few weeks ago I came across a twenty-five year old Xerox design document that described their routing protocols. They recognized that mobility was an issue and made sure that the Ethernet or MAC address was globally unique. The route to a particular network was only a hint -- if the end point moved, a new route would be established. An IP address, by contrast, is only meaningful on a given network -- if the end point moves it must get a new address and loses all its connections.
Rather than trying to continue to patch around this serious design error we need to rediscover the "real" Internet. We don't have to fix the Internet -- the P2P community demonstrates that we can simply create new opportunities at the edge. Today we know how to generate the identifier at the edge without any central authority and we know the importance of encryption in order to avoid having to naively trust any transport. Though cryptography was explicitly banned from the initial Internet implementation for reasons of "national security," we now know that it is necessary in the same way that people need to lock their doors when living in a city.
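To give a concrete sense of "generating the identifier at the edge," here is a minimal sketch. The function name and the use of a random secret are my illustrative assumptions -- a real system would hash a locally generated public key so the identifier is self-certifying -- but the point is the same: a 256-bit space is large enough that independently generated identifiers effectively never collide, so no central registry is needed.

```python
# Sketch: generating a globally unique end point identifier at the edge,
# with no central authority. Any node can do this independently; the
# 256-bit space makes accidental collisions vanishingly unlikely.
import hashlib
import os

def make_endpoint_id() -> str:
    # In a real system this would hash a public key, making the
    # identifier self-certifying; here a random secret stands in.
    secret = os.urandom(32)
    return hashlib.sha256(secret).hexdigest()

a = make_endpoint_id()
b = make_endpoint_id()
assert a != b        # independently generated, no coordination needed
assert len(a) == 64  # 256 bits, hex-encoded
```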
Once we can create our own end point identifiers we don't need to limit ourselves to one per computer. We can use them to define relationships between any kind of end point just as we use URLs today to relate documents. Unlike URLs the end point identifiers do not depend on the volatile DNS. These end points are stable -- if we create a voice conversation between two end points they can both move without breaking the relationship. The path between the two points will change but the conversation will persist. Today's cellular networks must track all conversations -- by shifting the definition to the edge not only do we get mobility, we get a simpler system. This is a sign of an effective architecture.
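The separation of the relationship from the path can be sketched in a few lines. The names and data structures below are hypothetical, not a real protocol: the conversation is keyed by stable identifiers, while a separate, volatile mapping tracks where each end point currently is.

```python
# Sketch: a conversation keyed by stable end point identifiers rather
# than network addresses. When an end point moves, only the id-to-address
# mapping changes; the relationship itself is untouched.

locations = {"alice-id": "10.0.0.5", "bob-id": "192.168.1.9"}
conversation = ("alice-id", "bob-id")   # the relationship

def current_path(convo):
    a, b = convo
    return (locations[a], locations[b])

before = current_path(conversation)
locations["bob-id"] = "172.16.4.2"      # Bob moves to a new network
after = current_path(conversation)

assert before != after                           # the path changed...
assert conversation == ("alice-id", "bob-id")    # ...the relationship did not
```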
With Ambient Connectivity we define connectivity in terms of relationships, not network paths. These relationships don't depend upon a central authority, a physical path (such as a wire) or a dedicated "frequency" -- they are independent of such mechanisms! We can use any medium and any path to exchange packets.
Traditional radio uses high energy radiation because it must reach its destination in a single hop, and it is based on a century-old technology -- the electronic tuning fork. It's so primitive that one must be licensed to transmit a signal! That would be bad enough, but since we fail to distinguish between the meaning of the communications and the transport, we end up controlling communications (speech). This is the spectrum legacy that I have written about.
With ambient connectivity we can simply ignore this whole system because it takes very little energy to send a puff of information a short distance and that energy is not associated with any particular frequency. Even if you can detect the energy it is meaningless unless you have the corresponding key.
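The claim that detected energy is meaningless without the key can be illustrated with a toy one-time pad -- this is an illustration only, not a real cipher suite, and the message and variable names are mine:

```python
# Toy illustration (one-time-pad XOR, not production cryptography):
# a detected transmission carries no meaning without the key.
import secrets

message = b"meet at noon"
key = secrets.token_bytes(len(message))          # shared out of band
ciphertext = bytes(m ^ k for m, k in zip(message, key))

# An eavesdropper who detects the bits sees only noise...
assert ciphertext != message
# ...while the holder of the key recovers the message exactly.
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
```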
You should be skeptical about all my claims -- I'm only trying to give a sense of the possibilities without going into much detail. The important point is that I'm not writing about technology as such -- just marketplaces. A marketplace is just a complex system. The Internet itself is a marketplace but the ideas apply just as well to more traditional marketplaces.
It's not about "Internet Inc"; it's about opportunity. Solutions to particular problems have only narrow value. This is why a special purpose solution such as the typewriter lost out to the general purpose personal computer. At first the computers were inferior, but they are used for so many purposes that each application adds only a small incremental cost. This is the same reason that Voice over IP is rapidly replacing traditional telephony -- we're replacing a very expensive system that has only one purpose with a simpler system that happens to be able to carry voice as an incidental capability.
I'm positing Frankston's First Law: marketplaces that provide opportunity rather than just solutions allow demand to create supply. This process works best in digital systems, which allow us to regenerate ideas that work and discard those that don't. We call this process evolution -- it's not about probability but about opportunity, since we are not predefining solutions. Biological evolution is simply one example -- it's not about directed progress -- it's about discovering simplicity. No wonder our genome isn't very different from a worm's.
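The regenerate-and-discard process can be sketched as a toy loop. The "target" standing in for demand is my assumption for illustration; nothing about the outcome is predefined in the variation step itself -- candidates are varied at random and only what works is kept.

```python
# Toy sketch of regenerating what works and discarding what doesn't:
# vary a candidate at random, keep improvements, drop the rest.
import random

random.seed(1)
target = 42                                     # stand-in for "demand"

def fitness(x):
    return -abs(x - target)                     # closer is better

candidate = 0
for _ in range(1000):
    variant = candidate + random.choice([-1, 1])  # undirected variation
    if fitness(variant) >= fitness(candidate):    # keep what works
        candidate = variant                       # discard the rest

assert candidate == target
```

Nothing steers the variation toward the target; selection alone does the work, which is the sense in which the process is about opportunity rather than prediction.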
I may seem to be jumping between technical details and high level issues but it's really about trying to understand how systems work be they marketplaces or interconnected devices.
When my computer crashes because I disconnect a remote USB device, I wonder why the operating system itself should care about the relationship between a particular device and an application that uses it. It's a gratuitous dependency that stems from an outdated model of computer systems as expensive machines upon which we layer an operating system and then add layers of applications. Installing a program is necessary because we must carefully craft it to fit into a particular environment so that it fits within tight resource constraints. These dependencies contribute to what I call "bit rot" -- the failure of software due to external changes in the environment -- and make it difficult to create resilient systems.
I do want to write about the particular frustrations -- not just concepts -- but it's hard to fit both kinds of writing in the same framework, so I plan to implement a separate series of writings with its own RSS feed. Whether I implement it myself as on Frankston or use a canned solution as in SATN is my choice. Doing it myself does take extra effort but I learn a lot in doing so. The general concepts about opportunity and marketplaces come from this experience. I learn a lot when things don't work because it forces me to figure out what's gone wrong. While Dan writes about software that lasts 200 years, I'm happy when my systems can run for a single day -- we have met the enemy and it is complexity. Simplicity is the result of effective systems architecture rather than of limiting capabilities.
It's difficult enough to write about technical topics; I find it's even more difficult to write about social issues when there are so many implicit assumptions. This is especially true for politics and next week's presidential election. George Lakoff has written about this problem. It helps me understand a shockingly incurious administration that doesn't admit to mistakes let alone learn from them. How can such an administration understand a world redefined by the end-to-end concepts? How does one deal with the idea that the bits on the Internet have no intrinsic meaning to people who see morality as fundamental? The point of science is that one must challenge one's assumptions and continue to refine one's understanding. It is a way of looking at ideas that is very threatening to those who seek only to reinforce preexisting assumptions.
It's too bad we only teach evolution in biology classes -- it's hard to learn hard science amidst the complexity of biological systems. There is nothing special about biological evolution -- it's just one example of how complex digital systems change. I worry about an electorate that seeks solace in myths rather than being curious about new possibilities.
I like to end essays by tying the ideas together but I've simply opened too many topics -- or, perhaps, created opportunities for future thoughts and writings. So, for now "?!".