Surprisingly, the book is still on a few shelves! Which is lovely, of course, but I don’t update this site much these days. I’ve put links to two of the most evergreen chapters in the right-hand column here: Chapter 2, History of Open Discourse and Chapter 6, History of Journalism. Feel free to comment.
These days, you can find me on a variety of services. Click any of the links below to stay in touch.
If you’d like to visit a site that compiles many of the submissions to the above services, check out:
The folks over at ars technica report that Warner Music Group has forged an agreement with video newcomer/behemoth YouTube to host its entire music video catalog. That in itself would be ground-breaking these days, but in addition Warner invites fans to create their own videos using Warner songs as soundtracks. In return, Warner will get a cut of the advertising revenues. This all bodes well for the video hosting upstart, enhancing their revenue possibilities and suggesting a path for financial sustainability. Alex Zubillaga, EVP for digital strategy and business development, sums up Warner’s stand:
“This agreement establishes a model by which content companies can transform consumers’ creativity into a legitimate commercial enterprise that will benefit fans, artists and copyright holders.”
Good luck to Warner and YouTube in this joint venture. It’s a brave move, but one that seems very forward thinking.
Among the points I find myself mulling regularly is the problem of discovery. How do we find blog posts, news items, web pages, and philosophical soulmates amidst all this glut?
One of the best thinkers worrying this problem is Nicolas Carr, who yesterday wrote a post called The Great Unread on his Rough Type blog. In it he laments the concentration of links accruing to a few already well-known blogs and the difficulty the rest of us face in finding readers for our own blogs, as well as finding other bloggers who suit our fancies along that ever lengthening Long Tail. I strongly encourage you to read Carr on the topic, but I’m among those who’d like to believe there may be some relief as better recommendation engines and collaborative filtering solutions come down the pike. A post today by an old workmate, Matt McAlister (who’s now at Yahoo), titled My personal blogger hierarchy echoes some of my own thoughts on the topic. Matt writes:
I suspect that the idea of the blogosphere and the blog elite is a temporary one. The blogger hierarchy does not make the substance of a post any more or less valuable. Ultimately, that value is completely up to me, not some shallow power structure.
I’d love to hear from any readers how they think this may all play out and what any of us might do to help with the problem.
Tim Berners-Lee knows a thing or three about the World Wide Web. He invented it, after all. On his blog, he talks about a lot of issues. Web censorship. Microformats. Protocols. And, with increasing frequency, Net Neutrality. Yesterday’s post is titled, “Net Neutrality: This Is Serious.” In the succinct post, Berners-Lee defines Net Neutrality:
If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.
He makes strong arguments for why we need legislation in the United States that guarantees this access and that doesn’t succumb to the short-sighted quarterly thinking being promoted by corporations and media giants.
On his own blog, Lawrence Lessig weighs in pointing out that one clue to the debate involves watching “what kind of souls are on each side of the debate.” On the one side, we have those who invented the Web along with those who’ve managed to profit from it — Berners-Lee and Microsoft. On the other, we have those who find themselves eating dust — the telcos and cable companies. The United States is in danger of hobbling itself in a global information market. May the smart guys win out.
Anush Yegyazarian over at PC World magazine just published an article titled, “Your Privacy Under Siege.” If you’re looking for a well-reasoned argument for the institution of strong privacy guidelines, this is a good place to start. Yegyazarian begins by cataloging recent U.S. government actions that privacy advocates find troublesome—the NSA’s culling of data from phone companies; the Justice Department demanding search records from Google, MSN, and Yahoo; and Attorney General Alberto Gonzales’ plan to require Internet service companies to keep user activity records. She acknowledges that these measures might occasionally result in the exposure of a terrorist or child pornographer, but raises the question: How can we devise safeguards that protect the rights and privacy of innocent citizens?
Her proposal is straightforward: encrypt all data; make fine print explicit; allow opt-out (except in criminal cases); define government agency parameters; monitor agencies; and impose penalties when agencies overstep.
Comments about the article on the digg.com page linking to the story probably suggest the spectrum of our response as a society. Some agree with Yegyazarian’s pragmatic approach. Some are resigned, convinced that both parties are in cahoots with the communications giants—consumer be damned. Some champion greater security measures, pointing to increasing unrest on almost every front. Some wonder at the trivial number of criminals and terrorists apprehended as a consequence of this mass collation of personal data.
The question will not be answered this round. Nor any round, really, I suppose. A popular understanding and interpretation of privacy is an ongoing process, one that mirrors the savvy and social conscience of each era. We seem a little timid these days. We watch the corporate/governmental panopticon scanning our horizons and try to find ways to call it beneficent. But history would suggest that societies that maintain a healthy vigilance are much better able to maintain their rights, to defend them against inevitable intrusions by whatever hegemonies are in place at the time.
Clashes are occurring in surprising quarters. Two hours ago, the New York Times posted an article about Vice President Dick Cheney defending domestic eavesdropping. “These communications are not unusual — they are the government at work,” says Cheney. Senate Judiciary Committee chairman, Republican Arlen Specter, disagrees. He wants to subpoena telephone company executives to testify in hearings to determine whether the eavesdropping is unconstitutional.
Vigilance surrounding our rights. The new patriotism?
Nature pits Wikipedia against Encyclopedia Britannica and the free, user-edited encyclopedia holds its own.
The good news for Wikipedia began yesterday with a special report by the venerable science magazine, Nature. The periodical oversaw the peer review of 42 entries common to the two encyclopedias and found errors in both. In fact, they discovered 162 errors in Wikipedia and 123 in the Britannica. Among the errors, four in each compendium were dubbed “serious.”
Determined to test the mettle of the online encyclopedia, two of the Wikipedia’s 45,000 registered “editors” carried the math a little further and discovered that the Wiki articles used in the review were, on average, 2.6 times longer than the Britannica’s. The authors are “cautious about drawing conclusions,” but from a purely statistical standpoint this means that the Britannica yielded 3.6 errors for every 2KB of data while the Wikipedia ended up with a mere 1.3 errors per 2KB.
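The editors’ arithmetic is easy to replay. A caveat: their exact per-2KB figures presumably come from measured byte counts, which the report doesn’t give us; normalizing only by the 2.6× length ratio gives a rougher version of the same point, so the sketch below reproduces the spirit of the calculation rather than the exact numbers.

```python
# Replaying the reviewers' arithmetic from the Nature comparison.
# Error counts and the 2.6x length ratio come from the report above;
# without actual byte counts, we can only normalize by the ratio.
wikipedia_errors = 162
britannica_errors = 123
articles = 42
length_ratio = 2.6  # Wikipedia articles averaged 2.6x Britannica's length

per_article_wp = wikipedia_errors / articles   # ~3.86 errors per article
per_article_eb = britannica_errors / articles  # ~2.93 errors per article

# Normalizing for length: Wikipedia's error density relative to Britannica's.
relative_density = (wikipedia_errors / length_ratio) / britannica_errors
print(f"Errors per article: Wikipedia {per_article_wp:.2f}, Britannica {per_article_eb:.2f}")
print(f"Wikipedia's per-length error rate is {relative_density:.0%} of Britannica's")
```

Either way you slice it, Wikipedia looks substantially better per kilobyte than the raw error counts suggest.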
The good news followed hard on the heels of bad. John Seigenthaler, founding editorial director of USA Today, recently accused the encyclopedia of erroneously implicating him in the assassination of Robert Kennedy. Mr. Seigenthaler declined to edit the document.
Plus, there is the class action suit against Wikipedia brought by Baou Inc. Baou is run by Greg Lloyd Smith, who launched and defended the questionable QuakeAID project and was once sued by Amazon for engaging in fraud while using their name. Win or lose, defending cases like this one is costly and distracting.
Wikipedia will never be free of problems. There will be misinformed editors, ham-handed writers, vandals, and prolix types with axes to grind. Even with new mechanisms being put into place to screen entries, the sheer volume of data on the service prohibits any kind of full vetting. As of this writing, there are 3.7 million articles in 200 languages in the Wikipedia.
But the Nature article is thought provoking. First, it’s good to remember that even our most trusted reference tomes can make mistakes and, second, the Wikipedia, with its self-correcting nature and protean body of content, isn’t all that terrible a resource, after all.
Not long ago, corporate wisdom had it that content and customers were best kept behind tall garden walls, but the recent announcement that Microsoft and Yahoo will open up their networks and allow their respective instant messaging users to talk with one another is yet more proof that those walls are tumbling down. While altruism may have played some part in the deal, it is likely that the two partnered in hopes of overtaking AOL with its 56% share of the current market. They’ll have a rough row to hoe. AOL’s software is ubiquitous and friendlier than either Yahoo’s or MSN’s.
Plus, could the handshake be too little too late? Upstarts like Cerulean Studios’ Trillian and Defaultware’s Proteus X for the Mac have been providing free software allowing individuals to chat across all three systems for a while now, and they come with a friendly array of customizable features, among them video and SMS support for forwarding messages to your phone.
This is all well and good for the user who just wants to chat with their friends and colleagues across systems, but the real competitive edge may well turn out to be voice over IP (VoIP). The recent $2.6 billion purchase of VoIP company Skype (whose tagline reads “the whole world can talk for free”) by eBay was considered by many to be chancy, but signs are good that individuals will choose the much cheaper (free!) VoIP services over more traditional telephone providers whenever it’s possible and easy. The Microsoft/Yahoo partnership will make it very easy for users of those two systems.
The real story may lie in rumors of talks between Microsoft and AOL’s owner Time Warner to discuss more interoperability between those two chat and voice systems. There’s been little love lost between the two in the past but market pragmatics could force them to at least kiss for the cameras. Meanwhile, newcomer to the messenger game, Google, is also said to be in talks with AOL. It will be worth watching how this all plays out.
With the second annual Web 2.0 Conference convening in San Francisco on Wednesday, it’s little wonder that the blog world is full of discussion on the topic. Still, it’s a bit of a challenge to get a handle on just what is at the heart of the movement. Among those who lay claim to the Web 2.0 mantle, there are as many definitions as there are definers, but that’s in the spirit of the “radical decentralization” and “architecture of participation” that distinguish the movement.
Advocates are generally in agreement when pointing to examples of Web 2.0 success stories: the active community that’s grown up around photo-upload site Flickr; the ingenious and addictive bookmarking database at del.icio.us; the real-time delivery of video and other large files across BitTorrent’s decentralized network in which every client is a server; the breadth and utility of the user-written Wikipedia project; and, of course, the phenomenal growth of blogs and syndicated feeds.
What distinguishes all these efforts is the fact that they have adopted the web as a platform, delivering services across networks and devices rather than distributing software artifacts. They are all free and, as such, can afford to be in perpetual beta release. Users happily provide testing, feedback, and suggestions. Some even take advantage of the open source code beneath the surface, contributing valuable variations and enhancements.
By design and default, individual users evangelize and populate the databases. The more people use Web 2.0 services, the more robust and useful they become, and all at very little additional cost. Because much of the contributed content is niche, users are rewarded with an unprecedented array of choices.
Indeed, a whole new argot is growing up around the movement. Folksonomy, a portmanteau for folks and taxonomy, describes the popular use of tagging or freely chosen keywords to categorize content from the bottom-up rather than the top-down hierarchies employed by traditional systems. The Long Tail, with its L-shaped distribution curve, illustrates the cost-effective access to niche content allowed by Web 2.0 mechanisms. Remixing is stolen from the pop culture world and is used to describe hybrids like the many homegrown Google Map applications.
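For the programmers in the audience, a folksonomy is almost embarrassingly simple to model — which is rather the point. The sketch below is a hypothetical miniature (the URLs and tags are invented): users attach freely chosen keywords to items, and a shared index emerges bottom-up with no predefined hierarchy at all.

```python
# A folksonomy in miniature: users file items under freely chosen tags,
# and a shared index emerges bottom-up. URLs and tags are invented.
from collections import defaultdict

tagged = [
    ("http://example.com/map-hack", ["maps", "remix", "google"]),
    ("http://example.com/tag-talk", ["folksonomy", "tagging"]),
    ("http://example.com/mashups",  ["remix", "web2.0"]),
]

# Build the inverted index: tag -> every item anyone filed under it.
index = defaultdict(set)
for url, tags in tagged:
    for tag in tags:
        index[tag].add(url)

print(sorted(index["remix"]))  # all items tagged "remix", by anyone
```

Contrast that with a traditional taxonomy, where a librarian would have had to decide in advance that “remix” was a category worth having.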
Whether these grand experiments in “radical trust” and the “wisdom of crowds” survive and evolve viable business plans is yet to be seen, but the next few days should provide us with some intriguing prognostications from those on the frontlines.