Communities can't make progress without some basic foundations: identity, a positive self-image, an economic foundation and meeting places. This is true of peer-to-peer as much as any other movement.
Identity is such an obvious necessity (because you have to know you're part of a group before you can act consciously within it) that you might wonder why I bother mentioning it. But technologists working on the various projects everybody now calls "peer-to-peer" didn't identify themselves with that label until four or five months ago. For years Internet researchers and businesses were opening new territory by distributing resources among users, but each innovator struggled alone with the same problems of resource discovery, authentication, security, caching, and so forth, reinventing all the wheels he needed to carry his carriage over the rough terrain. The announcement of a Peer-to-Peer Working Group at the Intel Developer Forum in August, and a peer-to-peer summit organized by Tim O'Reilly in September, were the beginnings of a real peer-to-peer community. One of the major tasks of the O'Reilly summit was identifying the members of that community (see Remaking the Peer-to-Peer Meme).
Next comes a positive self-image. Oppressed minorities have always struggled to develop pride, and the eras during which they turned deprecation into celebration of their identity were always historical turning points for their communities. The buzz around peer-to-peer has also been accompanied by the sting of prejudice from the start. Outside of O'Reilly Network and some progressive pieces by such thoughtful journalists as Dan Gillmor of the San Jose Mercury News, media coverage of peer-to-peer seemed obsessed with a single application (Napster-style file trading) and with the legal or ethical issues it raised. The broad promise of peer-to-peer for distributed computing, for augmenting users' knowledge through the pooling of multiple sources, and for rich communication media, has only gradually and insufficiently entered the public story.
An economic foundation is also needed to sustain a community. As a positive image developed for peer-to-peer in the past few months, the term finally started to impress a handful of venture capitalists. But for reasons I'll explain later, peer-to-peer is a hard soil to till. Some are managing to cultivate profitably, but the crops look quite different from the foliage of other climes.
Last on my list of prerequisites, a community needs meeting places. And finally, after some 400 words, I have come to the topic I was supposed to write about when O'Reilly Network asked me to summarize last week's Peer-to-Peer conference.
What can I say about the tone and feeling at this first-ever public event on peer-to-peer? It was quite a trip. The elegant Westin St. Francis hotel hosted 850 paying attendees plus an additional 150 journalists and assorted comps. The organizers had to turn away late-comers for lack of space (although they made an exception for Shawn Fanning, who turned up unexpectedly on the second day).
These attendees formed a Symphony of a Thousand that played earnestly throughout the conference. Comments from the audience were routinely as knowledgeable and insightful as the speakers'. New projects were being spec'ed out in every corner, exploiting a variety of computer languages, platforms, and application domains.
Commentators routinely compare new business and research opportunities to America's Wild West (in fact, the Electronic Frontier Foundation was prominently represented at the conference), but the potent combination of aggressive entrepreneurs and venture capitalists made for a particularly breathless experience inside and outside the conference sessions. Everybody knew that deals were going to be made by somebody, somehow, so it felt as if Singles' Night at the local watering hole had drawn a thousand stampeding bison.
Peer-to-peer presents a basic problem, of course. As many people have pointed out, the concept implies decentralization. And if everything is decentralized, there exists neither a central point to collect money nor a means to enforce payment. The explosion of peer-to-peer businesses that I optimistically believe will occur during the next few years will be shaped by the clever ways found by firms to overcome this basic paradox.
Remember, nobody has yet figured out a sure-fire way to put a sustainable business on the Internet, even leaving aside peer-to-peer. I challenge anyone to show me a Web site that you can guarantee will be earning money five years from now. Not even that pinnacle of Internet savvy, O'Reilly & Associates, has proven that it can profitably go online (though we're trying a number of promising experiments).
|"The Domain Name System is almost entirely decentralized, preserving only a single point of control at the root server, yet look at the control nightmare that tiny bit of centralization has led to."|
The key to building a business in this decentralized space is to find some tiny point of central control, because hardly any system can exist in a practical sense without some centralization. Take Gnutella, for instance. A purely decentralized system in theory, it has developed an implicit hierarchy through host caches (still a pretty decentralized architecture). The Domain Name System is almost entirely decentralized, preserving only a single point of control at the root server, yet look at the control nightmare that tiny bit of centralization has led to. (As an aside, the Open Root Server Coalition might counter that every computer has a point of control, located in its resolv.conf file, and that each computer administrator can control DNS by what he or she puts in that file.)
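The host-cache idea is easy to see in miniature. The sketch below is an illustrative toy, not Gnutella's actual protocol: a mostly decentralized network still bootstraps through one small, central-ish cache of known peers, and that cache is exactly the "tiny point of central control" described above. The host names and port numbers are hypothetical.

```python
class HostCache:
    """Toy sketch of a Gnutella-style host cache: new peers ask this
    one well-known service for a few addresses, then discover the
    rest of the network peer-to-peer."""

    def __init__(self, max_hosts=100):
        self.hosts = []
        self.max_hosts = max_hosts

    def register(self, address):
        # A joining peer announces itself so later arrivals can find it.
        if address not in self.hosts:
            self.hosts.append(address)
            # Keep only the most recently seen peers once the cache fills.
            self.hosts = self.hosts[-self.max_hosts:]

    def bootstrap(self, n=5):
        # A newcomer requests a handful of live peers to connect to;
        # after this one centralized step, discovery is decentralized.
        return self.hosts[-n:]


cache = HostCache()
for port in range(6346, 6356):
    cache.register(("gnutella.example.net", port))  # hypothetical peers
print(cache.bootstrap(3))
```

Whoever operates that cache sits at the choke point of an otherwise decentralized system, which is why such tiny centralizations attract both businesses and controversy.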
To repeat an analogy that turned up at the conference -- particularly appropriate to our San Francisco location -- the people most likely to make money during a Gold Rush are the ones selling pickaxes and shovels. Some of the comparable strategies being used in peer-to-peer are:
Keep in mind a very perceptive statement made by a marketing manager of Sun Microsystems after Bill Joy's keynote at the conference. She pointedly used the term "peer-to-peer space" because, "There's no peer-to-peer market any more than there's a client/server market." Peer-to-peer is too big to put in a business plan. In fact, I'm going to stop talking here about peer-to-peer as a business, because it's important to remember that it's more of a community.
Individual communities of highly educated and committed activists have already grown around each major peer-to-peer project, like Freenet, Gnutella and Jabber. But those are just islets in an archipelago that is rapidly self-organizing into a single nation. The wrath of this community erupted when standardization efforts began.
Already, two major corporations are offering substantial resources and commitment to standardizing the infrastructure under peer-to-peer space; community members worry whether this outreach includes a hidden threat of control.
Intel is still getting flak for its early proposal to organize the Peer-to-Peer Working Group as a consortium with fees and an operating style much like the World Wide Web Consortium. At a Friday panel on standardization, and again at the "town hall" or open meeting that ended the conference, Intel representatives insisted they've made a 180-degree turn: that they're not trying to restrict discussion to paying members, that they're active on the pivotal decentralization mailing list, that they'd be willing to open the planning meetings if the community wants it, and that they're reaching out to project leaders all over the P2P space and getting input from everyone. Much happened even in the past week, and Tim O'Reilly assured us that "it's not done yet." Intel's Bob Knighten said the group was starting with the basic question of what needs to be standardized. He told me later that the open-source library Intel recently released to provide peer-to-peer security is an "experiment" and that they look forward to others putting forward their packages.
Before Bill Joy came to give the keynote Thursday, he was rumored to have his own announcement in the peer-to-peer space. By the time I entered the Westin's grand ballroom, I had read several cynical reports (such as Sun's Net Effect by Rael Dornfest) of Sun's recent Sun ONE announcement, which several people thought was mostly an overly hyped combination of old features. I knew Joy was going to make a major announcement in his keynote, but expected nothing better than "ONE squared."
|"Bill Joy explained that many years of tunneling through the dictionary during Java development have left Sun with trouble finding more words that begin with J."|
What we got on Thursday was JXTA, the odd buzzword that Sun derived from the word "juxtaposition." Bill Joy explained that many years of tunneling through the dictionary during Java development have left Sun with trouble finding more words that begin with J. He apparently didn't think it endearing to use the word "jockeying," even though this is obviously what the new initiative is trying to accomplish relative to Microsoft's .NET and Intel's working group.
But there does seem to be solid sense behind JXTA, because Joy and his management staff claim to be disavowing world domination. First, JXTA is meant to be small. Like a well-designed language, Joy said, an infrastructure should provide just enough to fulfill all the requirements at its own level and leave room for people to innovate at a higher level. Second, Sun is trying to avoid a repeat of the grumbling its Java licensees have expressed over its control of that language. Project manager Mike Clary avows that Sun is "just another participant out there that's helping to influence the technology." JXTA will be placed under an Apache-style license. Their goal? Quite modest, said Joy: "We want to develop a platform so we can produce some successful apps and build a community around it. If there are other communities, that's fine."
I was wondering why Intel was slammed so hard by people in the P2P field, when Tim Berners-Lee got away with creating the W3C with pretty much the same structure. (Very rarely has anybody grumbled about the W3C -- and usually just when they weren't making progress on something.) My answer to this question centers on the context in which the W3C started, versus the current P2P field. When Berners-Lee proposed the W3C, Netscape had ravaged Mosaic (through superior technology, to be sure) and emerged as the ferocious lion dominating the WWW savanna. While Berners-Lee's organization was slanted toward heavyweight corporations, that was widely seen as the only group that could cage the lion and save the Web from a looming monopoly.
By contrast, the P2P arena is completely open; there's no way to tell what the relationships are among the players or who will win. People want some coordination and standardization, but they're not going to put up with the faintest attempt to draw a line and say who's in or who's out. Intel, like Sun with their JXTA proposal, is coming into a very different environment from Berners-Lee. It's also, of course, a different era in computing history, characterized by a triumphant open-source movement.
Microsoft, which was present at the conference, has not stepped forward like Intel and Sun, and is probably smart to avoid anything in the peer-to-peer community that people might interpret as embrace-and-extend. They did, however, help promote SOAP as a potential unifying technology between their .NET and Web services developed outside the .NET framework.
On Friday morning, John Perry Barlow made a surprise appearance on the P2P conference stage. He offered a number of pungent observations, including his oft-heard maxim that "information is not a noun, but a verb." If physical commodities are nouns and information is a verb, perhaps peer-to-peer is a preposition. It's the ineluctable, nearly invisible line that ties everything around it together. And if you follow that metaphor, perhaps what's really important in P2P is the 2.
Just when we -- the writers, the researchers, the entrepreneurs -- thought we were getting the point across that peer-to-peer was a critical new technological innovation, BANG!: along came another court decision Feb. 12 with a message of doom for Napster. Once again, just two days before the conference, the newspapers were full of talk about illegal activities as the characteristic trait of peer-to-peer.
Although Napster's name was invoked throughout the conference -- with Clay Shirky giving a stunning keynote at the start of the conference that drew valuable lessons from Napster's popularity -- the music service came up mostly as an example of good social engineering.
Only on the final day did political issues take center stage. The impetus was a rousing keynote by Lawrence Lessig, a professor at Stanford Law School, who generated more energy by far than anything to date. Lessig immediately showed his deep caring for, and knowledge of, the various disciplines relevant to the social meaning of new technologies. He expertly wove computing history, the technological dilemmas confounding current policy, and, of course, the laws of intellectual property into an exhortation that, "The public should be aware of the extraordinary system of control that is being rammed down our throats in the name of the Constitution."
What we're losing is the public domain, the right to build on the work of others and the subtle mesh of rights that copyright law calls "fair use." These include educational uses, citation for the purposes of commentary or criticism, and (in technology) the ability to investigate a vendor's product in order to hook into it or compete with it.
|"Many conferences feature people standing in tight clumps arguing about the best strategy for integrating RMI and COM or some such narrow topic... we also got questions like, 'What could people accomplish if they could share all the resources on a million computers?'"|
I've noted that people do care about policy and demonstrate a desire to influence the political sphere, but rarely do they know how to harness and carry through that desire. Lessig provided that focus on the last day of the conference, and this sensitivity to policy issues carried through the rest of the day.
Overall, the great thing about the P2P conference was that it gave us all a chance to re-examine our fundamental premises and purpose. Many conferences feature people standing in tight clumps arguing about the best strategy for integrating RMI and COM or some such narrow topic. While we got plenty of that at the P2P conference, we also got questions like: "What can allow people to make the best use of the Internet?" or "What could people accomplish if they could share all the resources on a million computers?" Not in a fluffy or superficial way, but a serious exploration of a serious technical question teeming with social implications.
In a session about successors to Web crawlers, Cory Doctorow of OpenCola, Inc. pointed out that artificial intelligence projects use machine intelligence to aggregate human intelligence. For instance, Deep Blue applied machine intelligence to processing huge archives of chess end-games played by grandmasters. Peer-to-peer is a powerful new model for combining machine intelligence (programmed by innumerable programmers) with human intelligence across the world.
There will never be another peer-to-peer conference like this, the first. But there will be another O'Reilly peer-to-peer conference Sept. 17-20, 2001 in Washington, D.C. The second conference will be totally different from this one, I'm sure, because the field will be different. What both will turn out to be like remains unknown until people turn up there.
Andy Oram is an editor for O'Reilly Media, specializing in Linux and free software books, and a member of Computer Professionals for Social Responsibility. His web site is www.praxagora.com/andyo.
Copyright © 2009 O'Reilly Media, Inc.