A blogosphere thanksgiving

November 23, 2005

Dan Farber yesterday responded to my response to his post extolling the virtues of the blogosphere. I'm now going to respond to his response to my response to his post. I'm not sure whether this back-and-forth supports my thesis about blogging or shreds it into little pieces, but it seems like a good way to sign off before the Thanksgiving holiday.

This year, by the way, my family has decided to avoid all the nuisance involved in putting together a big meal in the meatspace and instead indulge in a digital simulation of the feast through the new Google Holidays (beta) service delivered through the Google Brain Plug-In (beta). What's great about the service is that, because it's supported by ads, you can enjoy your virtual turkey with all the trimmings while at the same time getting a head start on your Christmas shopping. Thanks, Sergey!

Where was I? Oh, yeah: blogging. Farber notes that:

Nick's critique of blogging is really ironic. He started blogging in April and has now become part of what he calls the fantasy community of isolated egos. Clearly, the blogosphere is not as collegial or knowable as the Harvard campus.

I've been struggling with that irony as well. For the time being, at least, I'm going to revel in it rather than resist it. As to the alleged merits of the Harvard campus, I haven't been there in a couple of years and have no plans to return.

Farber goes on to sum up my motivations as a blogger:

Instead of writing longer articles and waiting months for them to appear in print, or just emailing with his colleagues, [Nick] can offer and receive near instantaneous feedback, which, by the way, is all fodder for going 'deeper' and creating end (some revenue-generating) products, such as books, articles and speeches.

I'm not sure about the fodder point - so far, the blog stuff and the other stuff haven't melded much, and the time given to the former has detracted from that given to the latter - but he's right that instantaneous, self-controlled publishing is awfully seductive. Web 2.0 is kind of the apotheosis of the vanity press. But that seductiveness is, I'd argue, part of the problem. It's so easy and cheap to circulate in the blogosphere, or the broader webosphere, that we, as a society, will inevitably tend to spend more and more time there - a trend, it's important to remember, that Google, Yahoo, et al., have enormous economic incentives to propel. Slowly but steadily, the internet comes to mediate the way we take in and disgorge information, ultimately influencing, even reshaping, the very way our minds work. I really think that guy Richard Foreman was on to something when he wrote:

I come from a tradition of Western culture in which the ideal (my ideal) was the complex, dense and "cathedral-like" structure of the highly educated and articulate personality - a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West ...

But today, I see within us all (myself included) the replacement of complex inner density with a new kind of self-evolving under the pressure of information overload and the technology of the "instantly available". A new self that needs to contain less and less of an inner repertory of dense cultural inheritance - as we all become "pancake people" - spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.

That's what scares me.

Jellybeans for breakfast

November 22, 2005

When my daughter was a little girl, one of her favorite books was Jellybeans for Breakfast. (Holy crap. I just checked Amazon, and used copies are going for hundreds of bucks!) It's the story of a couple of cute tykes who fantasize about all the fun stuff they'd do if they were free from their parents and their teachers and all the usual everyday constraints. They'd ride their bikes to the moon. They'd go barefoot all the time. They'd live in a treehouse in the woods. And they'd eat jellybeans for breakfast.

Yesterday, Dan Farber wrote a stirring defense of blogging, illustrated by a picture of a statue of Socrates. "For the most part," he said, "self assembling communities of bloggers hold a kind of virtual Socratic court, sorting out the issues of the day in a public forum, open to anyone, including spammers." After discussing some technologies for organizing the blogosphere, he concluded:

For a journalist, technologist, politician or anyone with a pulse and who doesn't know everything, blogs matter. Every morning I can wake up to lots of IQ ruminating, fulminating, arguing, evangelizing and even dispassionately reporting on the latest happenings in the areas that interest me, people from every corner of the globe. That's certainly preferable to the old world and worth putting up with what comes along with putting the means of production in the hands of anyone with a connection to the Net.

That's one way of looking at it, and most of what Farber says is true. I don't think it's the whole story, though. The blogosphere's a seductive place - it's easy to get caught up in it - and there are lots of interesting thoughts and opinions bouncing around amid the general clatter. But does it really provide a good way of becoming informed? Experiencing the blogosphere feels a lot like intellectual hydroplaning - skimming along the surface of many ideas, rarely going deep. It's impressionistic, not contemplative. Fun? Sure. Invigorating? Absolutely. Socratic? I'm not convinced. Preferable to the old world? It's nice to think so.

For all the self-important talk about social networks, couldn't a case be made that the blogosphere, and the internet in general, is basically an anti-social place, a fantasy of community crowded with isolated egos pretending to connect? Sometimes, it seems like we're all climbing up into our own little treehouses and eating jellybeans for breakfast.

Two-dimensional culture

In another great leap forward for two-dimensional culture, the U.S. Library of Congress is today proposing to build a "World Digital Library" of scanned artifacts from around the globe. In an op-ed in the Washington Post, the library's top dog, James Billington, writes, "An American partnership in promoting such a project for UNESCO would show how we are helping other people recover distinctive elements of their cultures through a shared enterprise that may also help them discover more about the experience of our own and other free cultures."

Does our arrogance have no bounds? Long the world's cultural bulldozer, we're now appointing ourselves to lead the way in creating a digital simulation that, says Billington, "would create for other cultures the documentary record of their distinctive achievements." It's so real you can almost touch it! Needless to say, Google's the primary funder of the effort. As the Register puts it: "All your cultures are belong to us."

Distrust and verify

Microsoft is giving itself a hearty pat on the back for announcing its intention to open up its Office formats. It will, says product manager Brian Jones, "fully document all of our schemas so that anyone can understand how to develop on top of them." It will also change the formats' licensing terms, providing "a very simple and general statement that we make an irrevocable commitment not to sue" anyone using the formats. Crows Jones: "This is obviously a huge step forward and it really helps to increase the value of these document formats because of the improved transparency and interoperability." Adds Alan Yates, another Microsoft executive: "We look forward to the day when people look at this as a milestone, as the beginning of the end for closed documents."

Whether it's a huge step forward remains to be seen - there are a few weasel words in the official announcements - though it does look like a clear step forward. But excuse me if I hold my applause. Microsoft has been an obstructionist on open documents for years, and the reason it's finally changing its ways is that governments have been holding a gun to its head, abandoning or threatening to abandon Office in favor of the open-source alternative OpenOffice. (Microsoft still refuses to make Office compatible with the OpenDocument format that OpenOffice uses.) For Yates to say that Microsoft's announcement is "the beginning of the end for closed documents" is ludicrous. The beginning happened a long time ago, and Microsoft had nothing to do with it. It would be nice if the company acknowledged that.

So, sure, let's welcome this move. But my advice to the governments and other organizations that have spurred it is this: Keep up the pressure.

Process matters

November 21, 2005

Last week, Ross Mayfield posted an interesting essay called The End of Process. In it, he argues that software-mediated social networks will tend to render formal business processes obsolete by reducing the costs of communication and coordination. "I do believe," he says, "the arguments for engineering organizations are being trumped by new practices and simple tools. The first organizations bringing [processes] to an end will have a decided competitive advantage." He goes on to claim that even today "some staid corporations are abandoning process all together."

While provocative, the argument is much too broad, and it floats on a raft of dubious assumptions. "Organizations," Mayfield writes, "are trapped in a spiral of declining innovation led by the false promise of efficiency. Workers are given firm guidelines and are trained to only draw within them. Managers have the false belief engineered process and hoarding information is a substitute for good leadership." Actually, over the past 50 years or so, businesses have become steadily more flexible, more innovative, less bureaucratic, less hierarchical, and less characterized by rigid work flows and fragmented and hoarded information. They aren't free from these problems, of course, but in general they've been getting better rather than worse. There's no "spiral of declining innovation led by the false promise of efficiency."

In fact, meticulously defined and managed processes continue to be a powerful source of competitive advantage for many companies. Look at Toyota, for instance. Its highly engineered manufacturing processes not only give it superior productivity but also provide a platform for constant learning and improvement. The formal structure, which is anything but democratic, spurs both efficiency and innovation - productive innovation - simultaneously. Structured, well-thought-out processes are also essential to most knowledge work, from product development to financial analysis to software engineering to sales and marketing. And the more complex the effort, the greater the need for clear processes. Far from making business less effective and agile, the increasing attention to process has increased effectiveness and agility.

In a response to Mayfield's post, IBM's Irving Wladawsky-Berger says that "an innovative business looks for the proper balance between process – covering those aspects of the business that can be designed, standardized, and increasingly automated – and people – who bring their creativity and adaptability to handle everything else." I see what he's getting at, but I don't agree that there's necessarily a tension between process and people. Bad processes can destroy individual initiative, but well-designed processes, even very formal ones, can encourage individual initiative and, importantly, guide personal and group creativity toward commercially productive ends. I'm not sure you need to balance process and people so much as harmonize them.

If Mayfield had narrowed his argument, focusing on the way knowledge workers collaborate in certain situations, rather than on business processes in general, he would have been much more compelling. The simple group-forming and information-sharing software tools now being introduced and refined will often provide greater flexibility and effectiveness than more complex "knowledge management" systems. But even in these cases, processes aren't going away; they're just changing. There can't be organization without process.

Your brain on Google

November 20, 2005

There's a new book out called The Google Story, subtitled "Inside the Hottest Business, Media and Technology Success of Our Time." I haven't read it, but I did read a review in this morning's New York Times. The reviewer describes a passage that comes at the end of the book:

Sergey Brin, one of the search engine's founders, is marveling, as he and his co-founder, Larry Page, are wont to do, about their product's awesome computational powers. Having hatched a plan to download the world's libraries and begun a research effort aimed at cataloging people's genes, Mr. Brin hungers, with the boundless appetite of a man who has obtained great success at a tender age, for the one place Google has yet to directly penetrate - your mind. "Why not improve the brain?" he muses. "Perhaps in the future, we can attach a little version of Google that you just plug into your brain."

Visionary? Scary? Cute? Hey, give a kid a Fabulous Money Printing Machine, and he's bound to get a little excited.

What struck me, though, is how Brin's words echo something that a Google engineer said to technology historian George Dyson when he recently visited the company's headquarters: "We are not scanning all those books to be read by people. We are scanning them to be read by an AI." I wasn't quite sure when I first read that quote how serious the engineer was being. Now, I'm sure. Forget the read-write web; the Google Brain Plug-In promises the read-write mind.

The theme that computers can help bring human beings to a more perfect state is a common one in writings on artificial intelligence, as David Noble documents in his book The Religion of Technology. Here's AI pioneer Earl Cox: "Technology will soon enable human beings to turn into something else altogether [and] escape the human condition ... Humans may be able to transfer their minds into the new cybersystems and join the cybercivilization ... We will download our minds into vessels created by our machine children and, with them, explore the universe ..."

Here's computer guru Danny Hillis explaining the underlying philosophy more explicitly:

"We're the metabolic thing, which is the monkey that walks around, and we're the intelligent thing, which is a set of ideas and culture. And those two things have coevolved together, because they helped each other. But they're fundamentally different things. What's valuable about us, what's good about humans, is the idea thing. It's not the animal thing ... I guess I'm not overly perturbed by the prospect that there might be something better than us that might replace us ... We've got a lot of bugs, sorts of bugs left over history back from when we were animals."

As I described in The Amorality of Web 2.0, this ethic is alive and well today, and clearly it's held not only by the internet's philosopher class but by those who are actually writing the code that, more and more, guides how we live, interact and, yes, think.

Plug me in, Sergey. I'm ready to be debugged.

Worth reading

November 18, 2005

Two of the best writers on information technology, Robert X. Cringely and Joel Spolsky, have just delivered thought-provoking new articles. Cringely speculates on how Google plans to deploy scores of portable data centers throughout the land, which, combined with its reputed ownership of large amounts of installed fiber-optic cable, will enable it to turn "the entire Internet into a giant processing and storage grid" under its control. "There will be the Internet, and then there will be the Google Internet, superimposed on top. We'll use it without even knowing. The Google Internet will be faster, safer, and cheaper ... And the final result is that Web 2.0 IS Google."

Spolsky offers an alternative, and persuasive, take on why record companies are trying to get Apple to charge different prices for songs at its iTunes Music Store. "What they really want," he argues, "is a system they can manipulate to send signals about what songs are worth, and thus what songs you should buy. I assure you that when really bad songs come out, as long as they're new and the recording industry wants to promote those songs, they'll charge the full $2.49 or whatever it is to send a fake signal that the songs are better than they really are." We usually assume that you increase sales by reducing prices, but as Spolsky shows it's not so simple. Consumers are anything but rational.

The web's porn problem

Huh? Problem? What problem?

Ah, precisely.

A week or so ago, the U.S. Senate held some hearings on pornography, including the internet's vast and various store of the stuff. Orrin Hatch, the Utah Republican, called porn a "problem of harm, not an issue of taste." Nobody, though, paid much attention to the proceedings. Popular blogger Jeff Jarvis, in a post titled A Nation of Hairy Palms, dismissed it all as "silly crap" from "conservative prudes."

Jarvis's reaction is typical of the blogosphere's, and, for that matter, the whole country's, laissez-faire attitude toward online porn: Yeah, there's a whole lot of it out there, but it's basically harmless, even kind of amusing. Anyone who has the temerity to criticize it, or even call attention to it, is just a prude or a loser who deserves to be ridiculed and ignored.

Another common view of digital porn is that it's useful - as a case study for internet businesses. Paul Kedrosky, in a recent post, rehearses this theme: "I think that a valuable startup exercise would be to do a wholesale survey of all emerging technology in the promotion, selling, and distribution of online porn ..." I've probably said or written similar things in the past, as have many others.

But maybe the most common reaction of all is simply denial. When Icann recently proposed setting up an online red-light district, under the .xxx domain, many politicians around the world, led by President Bush, attacked the idea, and Icann shelved the plan. Establishing a porn domain would have acknowledged the fact that the web is crammed with naughty pictures and videos. Without .xxx, we can pretend it doesn't exist - or at least distance ourselves from it.

I don't think I'm a prude (and I like to pretend I'm not a loser), but I'd like to suggest that internet pornography is bad. Very bad, in fact. I'm not talking here about your run-of-the-mill dirty pictures and movies - the stuff you'd find in Playboy or Penthouse or your local video store. I'm talking about the really gruesome stuff. If you have a blog, you're familiar with trackback spam - links that spammers add to your site in order to promote their own sites. My daily chore of deleting trackback spam has, unfortunately, opened for me a window onto the internet pornography industry. Here, for your edification, is a small selection of the headlines I routinely have to delete from my site:

Sex with animals

Rape fantasies

Father-son incest

Gagging

Brutal [fill in the blank]

These are not the worst of them. The long tail of online pornography is a very long tail indeed, meticulously documenting the full scope and intricacy of human depravity, from the simulated rape and torture of women through bestiality and on to child pornography and other criminal diversions. And guess what? Photographs and video clips of all of it are readily available, not only to adults but to children as well. There are no drawn curtains, no blacked-out windows, on the internet. Think you need a credit card to get this stuff? Think again. Think that filters are reliable, or that kids can't get around them? Dream on. (And even if you carefully monitor what your kids do on your family's computers, do you really think all your kids' friends' parents are as diligent? Yeah, right.)

[A few hours after posting this entry, I decided to delete a paragraph that originally appeared here. The paragraph provided an example showing how easy it is for anyone to access the type of stuff I've been describing (and also illustrated how search engines, by cataloguing the material, facilitate its discovery). I came to fear that the example might have the counterproductive effect of promoting what I'm trying to criticize.]

This isn't a call to arms. I don't have the backbone to be a crusader. I just find it curious how easily we've come to accept what just a few years ago would have been unimaginable - both the content and its accessibility. Then again, maybe it's not so curious. In his 1993 article Defining Deviancy Down, published in the American Scholar, Daniel Patrick Moynihan pointed out that:

there are circumstances in which society will choose not to notice behavior that would be otherwise controlled, or disapproved, or even punished. It appears to me that this is in fact what we in the United States have been doing of late. I proffer the thesis that, over the past generation ... the amount of deviant behavior in American society has increased beyond the levels the community can "afford to recognize" and that, accordingly, we have been re-defining deviancy so as to exempt much conduct previously stigmatized, and also quietly raising the "normal" level in categories where behavior is now abnormal by any earlier standard.

Moynihan was writing specifically about criminal behavior, but the analysis holds for pornography as well. As a society, we can't afford to recognize what we all know exists - what in fact lies just a click or two away from whatever we, or our children, happen to be looking at on the web at any given moment. And we can't afford to consider that when Orrin Hatch calls it "a problem of harm, not an issue of taste," he may be right. It's so much simpler to pretend that what he's saying is "silly crap."

The MySpace (bottle) rocket

November 15, 2005

Friendster's fading. Flickr's feeling tired. But MySpace is rocking, Facebook's booming and TagWorld's launching.

It's clear that community sites can have a lot of appeal, particularly to the young. MySpace, for instance, logged nearly 12 billion page views last month - that's more than eBay - according to Business Week. What's less clear is how long the appeal will last. Will those who flock to community sites when they're fashionable hang around indefinitely? Or will they stay only until a hipper joint opens up down the (virtual) street?

My guess is that online hot spots, like their real-world counterparts, will go in and out of fashion fairly quickly - and that those betting on their staying power will be disappointed. One reason is simply the fickleness of the young; as soon as a place gets too popular (and the bald-headed guys with backwards baseball caps start showing up), the trendsetters head for the exits, and the crowd soon follows.

But another, more subtle force may also be at work. On the surface, it would seem that the more you invest in a community site - designing a home page, uploading photos, tagging everything that moves - the harder it would be to leave. After all, if you go somewhere else, you'll have to start all over. But maybe it's just the opposite. Maybe what's really fun about these sites is the initial act of exploring them, putting your mark on them, checking out the marks made by others, spreading the word to friends, and so on. Once you've done that, maybe you start to get bored and begin looking around for a new diversion - a different place to explore and set up temporary quarters. Community sites may be like games: once they become familiar, they lose their appeal. You want to start fresh.

When it comes to Web 2.0 communities, in other words, familiarity may breed not loyalty but contempt. As we're seeing with beleaguered Friendster, their trajectories may follow the paths of bottle rockets: up fast, down fast.

If I were Rupert Murdoch, whose News Corp. bought MySpace a couple of months ago, I wouldn't just be investing in expanding the MySpace property. I'd be building (or buying) the site that's going to displace MySpace as the in-place. It's fine to pitch a business to a capricious clientele; just don't expect stability.

Such a cute gorilla

November 14, 2005

Continuing its masterly replay of Microsoft's old kill-competitors-by-bundling-their-products-as-free-features-in-our-platform strategy, Google today begins giving away tools for analyzing the effectiveness of web advertising. The Google Analytics service is a repackaging of the Urchin software that Google acquired earlier this year. In addition to the competitive benefits - Google Analytics is a hammer blow to small specialist companies like Web Side Story, and it neutralizes analytics as a potential advantage for direct competitors like Yahoo and Microsoft - the free service will likely increase sales for Google's AdWords service. Analytical tools for ad performance are a complement to advertising itself. By giving the complement away, you'll tend to boost demand for the ads. (Joel Spolsky explains the economics of complements well.)

Of course, the service also means that Google will gain access to a ton of new data on ad performance that it can analyze itself. For one thing, Google Analytics can be applied to ad campaigns running outside the Google network, giving Google information on how its competitors' programs perform. For another, Google will gain greater insight into the decision making of its AdWords customers, information that could help it in optimizing its ad pricing (i.e., maximizing its income at customers' expense). No need to worry, though. Google says we can all trust it to do the right thing. A Google executive told CNET: "We have very strict controls on the data. It is only used to provide reporting to customers and people using the analytics."

It's hard to complain about getting stuff for free that you used to have to pay for. This is, after all, how competition is supposed to work - the benefits fall to the customer. And that's great. Beyond the question about what happens to the data Google collects, though, there's another concern, which may or may not be purely theoretical. When Microsoft fought competitors by bundling new features into its operating system, it was frequently attacked for chilling innovation. When you give a product or service away, you take away the economic incentive for inventors and entrepreneurs to improve the product. Google right now is such a rabid innovator that it's hard to think of the company as being a force that impedes innovation. But even a cute 800-pound gorilla is still an 800-pound gorilla.

Love, money and podcasting

November 13, 2005

It’s one of the great questions of our time (or at least this past weekend): Is a podcast a podcast if it isn’t an MP3? Internet audio pioneer Audible set off the debate on Friday when it announced it would enable podcasters to distribute their work in its proprietary, and trackable, .aa format, facilitating advertising and paid subscriptions. Dave Winer declared that "if you're not using MP3, you're probably trying to make podcasting into a replay of previous media." Om Malik accused Audible of "trying to hijack a popular trend." Mitch Ratcliffe, who helped Audible develop the service, shot back, arguing that the denunciation of "any departure from the basic technology of MP3 and the business model of 'no commerce has its claws in us' is like trying to freeze broadcasting in the era of amplitude modulated low-power broadcasts."

I find it hard to see Audible's move as a horrible thing. If its Wordcast service (as it calls it) or the associated pricing model is flawed, then it will flop. If it succeeds, then it, by definition, has value - for content creators or consumers or both. You can't hijack a trend without the market's blessing. In any event, it's hard to see how, as Winer fears, the Audible service threatens the ability of regular folks to continue to distribute podcasts as MP3s, free and without advertising. I, like most people, can play both MP3 files and .aa files on my computer and my portable player, and that's not going to change.

Amateurs should, and will, be able to disseminate their creations, whether podcasts or songs or blogs or films, over the Internet. But those who hope to make a career of writing or talking or making music or shooting video should be able to protect their work and try to earn a living from it. If we don't encourage experimentation with profit-making business models (beyond just search-based advertising) - and with rights-management schemes - we'll end up restricting the creation of web media to amateurs, particularly amateurs of means. And we'll end up with mediocrity. The greatest content is not created by those who do it just for love; it's created by those who are so dedicated to their craft that they have no choice but to do it for both love and money.

Peter Drucker, RIP

November 11, 2005

I had almost come to believe that Peter Drucker was immortal, but sadly it's not so. He died earlier today, at 95. Drucker was one of the great writers on business and management, who over the course of 70 years tirelessly championed the dignity and the intelligence of workers. In his 1946 book Concept of the Corporation, he made an eloquent case against mindless bureaucracy, arguing that managers should give workers the power to make decisions and take the initiative. A radical thought then, it's become the common wisdom today, though few businesses actually live up to Drucker's ideal.

There are good obituaries in the New York Times, Financial Times, and Business Week, and Drucker's grandson, Nova Spivack, has written a tribute on his blog.

Secrets and lies

As Apple's designers and engineers toiled away at creating an iPod that could play video, Steve Jobs kept telling reporters that a video iPod was a dumb idea. Even in late September, a couple of weeks before the video iPod's unveiling, he was saying that "the market isn’t yet right for personal video devices." So is Steve Jobs a big fat liar? No, he's a smart businessman.

"Transparency" is a big buzzword these days. To succeed in today's interconnected world, the common wisdom says, you need to let it all hang out - expose your data, expose your processes, expose your plans. One prominent management consultant's message, according to CIO Insight magazine, boils down to "bare it all and share it all." Call it the slut strategy.

Now, there's a lot to be said for transparency, and for modular processes. But it's easy to go too far - to make your company so transparent, so connectable, that you turn your entire business into an easily copied (and easily discarded) commodity. Even in the Internet Age, a company's competitive advantage still hinges on what, to outsiders, remains hidden, obscure, and hard to replicate. Google has made a lot of its technology transparent, but the essence of the company, the source of its advantage, remains opaque.

Openness and honesty are good things. But let's not lose sight of the enduring power of secrets and lies.

The thinnest client

November 10, 2005

Most of the current discussion about the big changes under way in computing focuses on the software side, particularly on the shift from locally installed software to software supplied as a service over the internet. That's what all the Web 2.0 fuss is about. Less attention has been paid, so far, to the equally dramatic shift that the new utility-computing model portends for hardware. The ability to deliver ever richer applications from distant, central servers will lead to the increasing centralization of hardware components as well - and that, in turn, will open the door to hardware innovation and entrepreneurship.

Take Newnham Research, a startup in Cambridge, England. It's developing the thinnest of thin clients - a simple monitor adapter, with a couple of megabytes of video RAM and ports for a mouse and keyboard, that plugs directly into an ethernet network. All computing is done on a central server, which can be either a traditional server or an inexpensive PC. The only things delivered to the monitor, directly over the network, are compressed pixels. The device is called Nivo - for "network-in, video-out."
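
To make the "network-in, video-out" idea concrete, here's a toy sketch of the general approach (my own illustration, not Newnham's actual wire protocol): the server figures out which screen tiles changed since the last frame, compresses just those, and streams them to a terminal that does nothing but decompress and blit.

```python
# Toy illustration of a "network-in, video-out" display stream (not Nivo's real protocol).
# The server sends only the compressed screen tiles that changed since the last frame.
import socket, struct, zlib

TILE = 64          # tile edge, in pixels
BPP = 4            # bytes per pixel (RGBA)
W, H = 1024, 768   # framebuffer dimensions

def dirty_tiles(prev, curr):
    """Yield (x, y, raw_bytes) for each tile that differs between two framebuffers."""
    row = W * BPP
    for ty in range(0, H, TILE):
        for tx in range(0, W, TILE):
            old = b"".join(prev[(ty + dy) * row + tx * BPP:(ty + dy) * row + (tx + TILE) * BPP]
                           for dy in range(TILE))
            new = b"".join(curr[(ty + dy) * row + tx * BPP:(ty + dy) * row + (tx + TILE) * BPP]
                           for dy in range(TILE))
            if old != new:
                yield tx, ty, new

def send_frame(sock: socket.socket, prev: bytes, curr: bytes) -> None:
    """Compress each dirty tile and send it with a tiny header; the display just blits."""
    for tx, ty, raw in dirty_tiles(prev, curr):
        payload = zlib.compress(raw)
        sock.sendall(struct.pack("!HHI", tx, ty, len(payload)) + payload)
```

A real implementation would take damage hints from the window system and use a smarter codec, but even the toy shows why the terminal can get by with a couple of megabytes of video memory: most frames change only a handful of tiles.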

Newnham, which is being backed by Atlas Ventures and Benchmark, isn't producing its technology in volume yet. On its site, it suggests a few applications tied to the ease with which you can drive multiple monitors from a single PC, but it's easy to think of much broader applications as well, particularly in schools, shops and small offices.

One nonprofit company, called Ndiyo, is already putting Newnham's hardware innovations to work - as a way to deliver computing to people who haven't previously been able to afford it. Ndiyo began with a simple goal: "Instead of starting with a PC and seeing what we could take out, we began with a monitor and asked what was the minimum we had to add to give a workstation fully capable of typical 'office' use." It's created a system, using open-source software like Linux, OpenOffice, Firefox, and Evolution, that allows a half-dozen users to share a single PC simultaneously, all doing different things. Need to add more users? Add another PC to create a little Linux cluster, and off you go. (For more details, you can download a pdf presentation from Ndiyo.)

Bill Gates and Ray Ozzie talk about the disruption of the software market. Hardware's going to go through a disruption, too - and it's about time.

Ozzie ascendant

November 09, 2005

Ray Ozzie's vision is now Microsoft's vision. That may be the most important message of the "leaked" Microsoft memos (I put "leaked" in quotes since it's obvious, as John Battelle points out, that these documents were intended to be made public from the get-go). Bill Gates's memo is just a cover note to Ozzie's agenda-setting memo. It's Ozzie's letter, Gates writes, "which I feel sure we will look back on as being as critical as [my] Internet Tidal Wave memo was when it came out." That's the sound of a baton - a very heavy baton - being passed.

Gates's desktop era is over. Ozzie's internet era has begun. Up until now, Microsoft has looked at the internet through the desktop; now it's looking at the desktop through the internet.

Ozzie's memo is very good. If you want an introduction to what's often called "Web 2.0," you could do worse than start with this document - not least because it doesn't use the term Web 2.0 at all. Three things leapt out at me:

First, Ozzie seems to truly believe that the advertising model may be as lucrative for the software business as the licensing model has been. "In some cases," he writes, "it may be possible for one to obtain more revenue through the advertising model than through a traditional licensing model. Only in its earliest stages, no one yet knows the limits of what categories of hardware, software and services, in what markets, will ultimately be funded through this model. And no one yet knows how much of the world’s online advertising revenues should or will flow to large software and service providers, medium sized or tail providers, or even users themselves." Shifting away from the licensing model represents an enormous risk for Microsoft - a risk that could, under a worst-case scenario, prove fatal. It's a risk that Ozzie doesn't seem to shy away from.

Second is Ozzie's recognition that proprietary data formats, which have always been crucial to Microsoft's success, may be turning into liabilities. He writes: "For all its tremendous innovation and its embracing of HTML and XML, Office is not yet the source of key web data formats – surely not to the level of PDF." This, too, implies another fundamental break with Microsoft's heritage.

Third, and related, Ozzie explicitly embraces a truly open platform: "We’ll design and license Windows and our services on terms that provide third parties with the same ability to benefit from the Windows platform that Microsoft’s services enjoy. Our services innovations will include tight integration with the Windows client via documented interfaces, so that competing services can plug into Windows in the same manner as Microsoft’s services." Now, maybe this is just boilerplate to keep the lawyers happy. But I don't think so. I think it's another admission that the pillars of the old Microsoft are crumbling. They'll hold for a bit longer, but in the meantime a new foundation needs to be poured.

SapOracleSoft

Yesterday, I questioned some undocumented research that SAP claims shows its customers are much more profitable than other companies. Today, the Financial Times reports on research from the Hackett Group, a benchmarking firm, which indicates that it doesn't matter whether your enterprise resource planning system is from SAP or Oracle or PeopleSoft: "they're basically all the same."

Hackett examined a big group of large companies. It culled out the 25% that had the most efficient back-end processes (the ones automated by ERP) and found that about a third of those companies used Oracle, a third used PeopleSoft (now owned by Oracle) and nearly a third were SAP users. (Just a few had other vendors' ERP systems.) It then looked at the other 75% of companies and found that the proportions using each of the main vendors' packages were about the same. The lack of variation suggests, as the FT reports, "that their impact on efficiency and effectiveness is the same."

Here's how Hackett's research guru Philip Carnelley sums up the findings: "Every way you look at it, it doesn't make a difference. There are aspects that are quite different in terms of the architectural side but in terms of features and functions they are all very comparable products." Is this news? Not really. Back in 1998, Oracle's then-president Ray Lane said, "customers can't find 5% difference among SAP, PeopleSoft, and us."

So what does matter? Standardization and simplicity. A lot of big companies run different ERP systems in different units, a messy problem that can be exacerbated by mergers and acquisitions. Such fragmentation leads to complexity, confusion and high costs, and consolidating the systems offers the opportunity for big gains. As Carnelley says, "It doesn't matter which way they simplify, but simplifying is a good thing. The 'world class' organisations tend to have only one ERP system."

ERP is infrastructure. Rationalize it.

The sixth force

If you have an interest in business strategy, you're familiar with Michael Porter's five forces framework. In his landmark 1980 book Competitive Strategy, Porter overturned much of the conventional wisdom about business, showing that the governing assumptions reflected a much too narrow conception of competition. Companies don't just battle their direct rivals for the profits in a market, Porter argued; they also contend with suppliers, buyers, substitute products, and potential new entrants. Together, these five forces of competition shape the structure of industries, determining how much profit is generated and how it's carved up. Porter's framework has come to underpin the way managers think about - and plot - their companies' strategies.

The world has changed, though, since 1980, and I think the time has come to add a sixth force to Porter's framework: the public interest. In The Public Wants Your Profits, an article in the new edition of Forrester Magazine, I argue that in recent years, as the power of unions and government regulators has waned, the public itself has become a force shaping industries and influencing the generation and distribution of profits. (Just look at Wal-Mart's recent travails, or the pressure being placed on oil companies to curtail their windfall gains.) Traditional "corporate social responsibility" programs, which tend to be operated in isolation from companies' central profit-making functions, are not a sufficient response to the growing power and complexity of the public interest. As I argue in the article, managers need to recognize that the public interest now manifests itself as an economic interest - and hence must be a core concern of business strategy.

Lies, damned lies and IT statistics

November 08, 2005

The latest issue of SAP's Business Flash newsletter just popped into my in-box. Under the email's catchy headline "Companies That Run SAP Have 32% More High Fives At Their Staff Meetings" runs this sentence: "A recent study of companies listed on NASDAQ and NYSE found that companies that run SAP are 32% more profitable than those that don’t." At the end of the sentence is an asterisk that leads to a footnote: "Based on a 2005 Stratascope Inc. analysis of publicly available fiscal results of all non-financial companies listed on NASDAQ and NYSE." OK, I'm intrigued. A broad study that links a particular corporate software program to vastly outsized profits is interesting. I want to learn more.

So I click on a link in the email that says "We invite you to see for yourself," figuring it will bring me to a copy of the study, or at least to some details on the research. Wrong. The link brings me to a marketing page on the SAP site filled with the usual slogans, like "enable business flexibility." The only information on the study is this: "Companies that run SAP are 32% more profitable, according to results from a 2005 Stratascope Inc. study, which analyzed publicly available financial results of NASDAQ and NYSE companies. The study also found that these companies delivered 28% more return on capital. Clearly, SAP customers have a strong track record of outperforming their peers."

Clearly? Seems pretty opaque to me. I mean, where's the data?

Not to be put off, I do some searching, assuming that the details of the study have been published somewhere. Nope. I can't find any trace of this research on the web. I do, though, find the home page of Stratascope, the company that did the research. Its business consists mainly, it seems, of providing IT sales forces with financial data on public companies. On every page of its site is a large promotional advertisement highlighting some of its key clients, one of which is SAP. Hmm. I also find that the chairman and president of Stratascope, Juergen Kuebler, "was employed at SAP AG for 9 years where he was responsible for the sales launch of the product 'R/3 on AS/400' as well as the sales training of the complete staff at SAP AG." Now, the fact that SAP and Stratascope are cozy - and that it's in Stratascope's interest to make its client happy - doesn't mean that Stratascope's research is necessarily unsound. But it does raise questions - questions that can only be answered through a careful review of the research methodology and results. A review rendered impossible by the fact that neither SAP nor Stratascope is revealing details about the research.

This isn't an isolated problem. IT companies are always throwing around seemingly precise statistics claiming to show that their hardware or software is associated with competitive advantage or superior financial results. As far as I've been able to discover, the research is almost always dubious. Either the methodology is flawed (tiny or biased samples), or the research is carried out by the company itself or some sycophantic supplier. And rarely are the full details of the study divulged. IT buyers shouldn't pay any attention to such faux statistics. If a vendor dresses up its marketing slogans with research results, then it should show us the data - all of the data.

Cute puppy or fist in face?

November 07, 2005

This may or may not offer any insight into the different ways information technology is viewed in the world, but here's the cover of the Japanese translation of my book:

And here's the cover of the Russian translation:

I'm fond of the fist, personally.

Search is a commodity (again)

Until Google came along, internet search suffered from two big problems as a business. First, it was hard to make money off the end users (as a result, search engines had become commodity services sold to and rebranded by portals) and, second, switching costs were low (there was little to stop users from hopping from engine to engine). Google solved the first problem, cracking the nut on search-based advertising, but it has never really solved the second problem. Users flocked to Google because its PageRank algorithm provided clearly superior results to those of other engines, but most Google users remain ripe for the picking - there's little to stop them from switching to another engine if they're so inclined. (Habit can be strong, but it's not that hard to break, at least on the internet.)

The low switching costs could turn into a big problem for Google for a simple reason: basic internet search is once again a commodity. Do a search on Google or MSN or Yahoo, and you'll find little differentiation in the relevance of the results. Yes, if you're a super-sophisticated searcher, you may be able to point to variations that you think are important, but casual searchers won't notice any difference - and the vast, vast majority of searchers are casual searchers. There may be another great, proprietary breakthrough in internet search in the future, but for the moment Google has lost its lead. As for expanding search to more specialized areas, like 18th century manuscripts or academic working papers on quantum physics, that's not going to make much of a difference to Joe and Jane Searcher, neither of whom gives a toss about musty books or egghead treatises.

Of course, Google knows this, as do its competitors. They're all looking for ways to increase switching costs, or, as we used to say, make search sticky. One way is to make it easier for a user to default to your engine - by embedding a toolbar on his desktop, say, or putting a search box into his browser window. That can be pretty powerful. I've continued to use Google for most of my searches simply because Apple stuck a search box in the corner of my Safari browser window. But it's also a tenuous advantage. If in the next Safari update that box gets switched to Yahoo search, I doubt I would go to the trouble of hacking it back to Google. I'd start using Yahoo. Unless you control the desktop or the browser, in other words, your stickiness is in somebody else's hands. Somebody like Microsoft, who may just happen to be your biggest competitor. Or somebody like Apple, who sooner or later is going to sell its real estate to the highest bidder, squeezing your profit margin, or incorporate internet search into its own Spotlight engine. Or somebody like Dell, or HP, or even IBM.

A better approach is to do what Yahoo's doing with My Search 2.0 - using personalization to embed proprietary data into search services. With My Search, you can tag pages that interest you and then restrict future searches to the tagged set. (You can also join up with friends to tag interesting pages, and then restrict future searches to the "community pages.") Because you can't take your tagging data with you when you go to Google or MSN, you suddenly face a real switching cost. Of course, it remains to be seen how attractive personalized search services will be to Joe and Jane Searcher (who may not give a toss about tagging or communal browsing, either). And you can bet that other search providers will quickly mimic any such service - so while it will increase switching costs, it won't necessarily enhance competitive differentiation.
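
Here's a rough sketch of why that personal data translates into a switching cost (a hypothetical model, not Yahoo's actual implementation): the tag store is just a mapping the engine keeps on its own servers, and every query can be filtered through it.

```python
# Hypothetical model of tag-restricted personal search (not Yahoo's code).
class MySearchModel:
    def __init__(self):
        self.tags = {}  # tag -> set of URLs this user has saved

    def tag_page(self, tag: str, url: str) -> None:
        self.tags.setdefault(tag, set()).add(url)

    def search(self, results: list[str], restrict_to: str | None = None) -> list[str]:
        """Filter an ordinary result list down to the pages the user has tagged."""
        if restrict_to is None:
            return results
        saved = self.tags.get(restrict_to, set())
        return [url for url in results if url in saved]

me = MySearchModel()
me.tag_page("reading", "http://example.com/essay")
print(me.search(["http://example.com/essay", "http://example.com/ad"], restrict_to="reading"))
# -> ['http://example.com/essay']
```

The lock-in is the tags dictionary living on the provider's side: move to another engine and it starts out empty.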

There's also the old-fashioned portal strategy: provide an array of useful services to get users to spend a lot of time at your site, and they'll tend to default to your search service. That's still a good strategy - even if in the long run portals become relatively less important in people's everyday use of the net - which is why Google, despite its promises to the contrary, now offers a portal. But in this model, search inevitably becomes relatively unimportant again - differentiation and switching costs lie elsewhere in the business.

Maybe, then, what we've seen in the last few years is an aberration. Maybe the basic internet search engine is fated to be a cheap commodity running behind the scenes. And maybe those who control the search function - and most of the related ad revenues - won't be the guys running the engine but those who own the desktop or the portal (or whatever replaces the desktop or the portal). Maybe search doesn't really matter.

HITs for HAL

November 04, 2005

Amazon.com has out-googled Google with its creepily brilliant Mechanical Turk service, a means of embedding human beings in software code. If you're writing a program that requires a task that people can do better than computers (identifying buildings in a photograph, say), you can write a few lines of code to tap into the required human intelligence through Mechanical Turk. The request automatically gets posted on the Turk site, and people carry out the Human Intelligence Task, or HIT, for a fee set by the programmer, with Amazon taking a commission.

As Amazon explains, this turns the usual computer-human interface on its, uh, head:

When we think of interfaces between human beings and computers, we usually assume that the human being is the one requesting that a task be completed, and the computer is completing the task and providing the results. What if this process were reversed and a computer program could ask a human being to perform a task and return the results? What if it could coordinate many human beings to perform a task?

I have no clue how useful Mechanical Turk will prove immediately, but Philipp Lenssen (who foresaw the service in a remarkable post earlier this year) thinks the "potential is immense." Certainly, the implications are mind-bending. In an essay I discussed last week, George Dyson described how the Internet provides a platform, or operating system, that enables computers to harness and learn from the work of people: "Operating systems make it easier for human beings to operate computers. They also make it easier for computers to operate human beings." Google uses this capacity implicitly by basing its search engine on human actions and decisions - as we make our daily strolls through the Web, Google gets smarter. Amazon's Mechanical Turk uses the capacity explicitly, turning people into a "human layer" in software.
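
For a sense of what "a few lines of code" looks like in practice, here's a minimal sketch using the present-day boto3 SDK for the Mechanical Turk API (the SDK postdates this post; in 2005 the equivalent CreateHIT request went over Amazon's web-services interface). The question markup, prices and other parameter values are purely illustrative.

```python
# Hypothetical, minimal HIT-posting sketch using boto3's MTurk client.
# Parameter values and the question form are illustrative only.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # sandbox endpoint: no real money changes hands while experimenting
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A real HTMLQuestion form must also post the worker's assignmentId back to
# MTurk's externalSubmit endpoint; that plumbing is omitted here for brevity.
question_xml = """
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <p>Does this photo contain a building? Answer yes or no.</p>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

hit = mturk.create_hit(
    Title="Is there a building in this photo?",
    Description="Look at one image and answer yes or no.",
    Reward="0.05",                    # the fee the programmer sets, in dollars
    MaxAssignments=3,                 # ask three different people
    LifetimeInSeconds=24 * 60 * 60,   # keep the task posted for a day
    AssignmentDurationInSeconds=120,
    Question=question_xml,
)
print("HIT posted:", hit["HIT"]["HITId"])
```

Fetching the answers is equally terse - a later call to list_assignments_for_hit returns whatever the workers submitted - which is the whole point: to the calling program, the human step looks like any other function call.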

But let's not get too comfortable in our new role. No one, after all, is indispensable.

The all-seeing eye

November 03, 2005

An addendum to that last post: Google Print doesn't just raise complicated issues regarding ownership, compensation and copyright. It also provokes tricky questions about how content and form will be influenced over the longer run. At what point does a writer stop writing for the reader and start writing for the scanner?

Fair or unfair use?

Unlikely bedfellows Pat Schroeder and Bob Barr team up to make a case against Google Print in an op-ed in the Washington Times today. The piece is a response to Google CEO Eric Schmidt's recent op-ed in the Wall Street Journal. Unfortunately, the Schroeder/Barr article is as shrill as Schmidt's was self-righteous. At one point, they write, "Not only is Google trying to rewrite copyright law, it is also crushing creativity. If publishers and authors have to spend all their time policing Google for works they have already written, it is hard to create more." That's silly.

But Schroeder/Barr do raise issues that lie at the heart of how we'll think about the ownership of creative work in a world where all that work can be stored in a database operated by one, or a few, profit-making companies:

Our laws say if you wish to copy someone's work, you must get their permission. Google wants to trash that ...

Authors may be the first targets in Google's drive to make the intellectual property of others a cost-free inventory for delivery of its ad content, but we will hardly be the last. Media companies, engineering firms, software designers, architects, scientists, manufacturers, entertainers and professional services firms all produce products that could easily be considered for "fair use" by Google.

Google envisions a world in which all content is free; and of course, it controls the portal through which Internet user's [sic] access that content. It would completely devalue everyone else's property and massively increase the value of its own.

Google's moving forward with its plan to scan copyrighted works into its database without the permission of the copyright owners. Whether you're in favor of that or against it, it's worth pausing a moment to ask where exactly all of this is headed.

Keats vs. Matrix

November 02, 2005

Yes, it's the first Western Civilization Smackdown!

Number of paragraphs in Wikipedia entry:

John Keats: 7
The Matrix: 62

It's over, folks! Cult Sci-Fi Flick Starring Keanu Reeves does some serious whoop-ass on Consumptive Romantic Poet Who Writes Odes!

The mainstream blogosphere

Give little kids a big bowl of free candy, and they'll keep eating until they get sick. Give adults the same bowl, and most of them will pick out a couple of their favorites and then walk away to do something else. That's pretty much the way it goes with any freebie - you consume a whole lot for a while, then you start tapering off, becoming more selective.

Blogs (and, I'd suspect, other free media) are no different. Recently, we've seen people start to fret about the looming attention crisis, which is a highfalutin way of saying they're becoming overwhelmed by the number of blogs in their RSS feeds. Om Malik speaks for many of us when he writes, "I have been overwhelmed, and have started trimming the feed list." This is the blogospheric equivalent of "Mommy, my tummy hurts."

Some people max out at 5 feeds, some at 50. I've even read some people claim they're topping out at a truly nauseating 200. I currently have 27 feeds, and that's way too many. I've gone from adding to pruning. Most of us will ultimately cut back to a handful of blogs that we read regularly, supplementing them with the odd post from here or there. That's only natural.

What it means, though, is that the blogosphere is going to end up looking a lot like the old "mainstream media." Rather than being a great democratic free-for-all, the blogosphere will become steadily more rigid and hierarchical. Structurally, it'll resemble the magazine world. A relatively small number of high-traffic blogs will dominate the market, and then there'll be a whole lot of more specialized blogs with fewer readers. (I'm not including here the zillions of "my diary" blogs, which are not aimed at gaining broad readerships and tend to be short-lived, anyway.) It won't be quite as hard for blogs to climb the hierarchy as it is in the print world (simply because the costs of blogging are so much lower), but it won't be easy, either.

Indeed, the technologies we use to manage our blog reading will reinforce the hierarchy. RSS, for example, imposes the old subscription model on the blogosphere - it's fundamentally anti-democratic, as it tends to lock us into a set of favorite blogs. (Even though blogs are free, the subscription model imposes real switching costs.) Also, the inevitable (in my view) shift away from blog search engines based on posting date (like Technorati's traditional default mode) to ones that use measures of "relevance" based on traffic or link intensity (like Google or Sphere.com or Technorati's "authority" engine) will also make the hierarchy more rigid and less democratic - as will third-party headline aggregators like Memeorandum, which also tend to reflect and reinforce established patterns of popularity.

The fact is, truly democratic media is good in theory but exhausting in practice. Our natural bent toward efficiency in consuming information will turn blogs into another mainstream medium.

Tent shows

The elitist-hippie-boy-scout culture of the Web 2.0 intelligentsia is funny in itself, but that doesn't mean a good parody isn't welcome.

Live and kicking

November 01, 2005

Having been on a plane earlier, I've been trying to catch up on Microsoft's announcements today. (The best rundown on the event I've seen is Tim O'Reilly's; the best analysis is Dana Gardner's.) Much is still fuzzy, particularly the precise tiered-pricing model for Office Live (as opposed to Office Dead, I guess) - and how much is covered by ads alone. But in general it looks like a good, smart move by Microsoft, surprisingly aggressive in its breadth without being excessively risky.

The company's challenge in moving to the software-as-a-service era has always been more about timing than technology: Shift too late, and you risk losing the market; shift too early, and you leave lots of traditional license profit on the table. With its Live plan, it seems to have struck a balance that, on paper, at least, looks smart. As Gardner writes: "By targeting small businesses with Windows- and Office-like services and juxtaposing them to contextual advertising, Microsoft diversifies its business model closer to what Google and what other software-as-a-service vendors do, but does not really dent its historic money making machines: the Windows operating system and Office suite of personal productivity applications. At least not for some time." And by integrating the utility version of Office with the on-premise version (it remains to be seen how that will work), it has the potential to put a tough barrier in Google's path into the business market.

Now, we'll see how well Microsoft can actually execute the plan. Can it make its battleship maneuver like Google's cigarette boat?

One final note: There's a real rush right now to give software away and make money from advertising. The strategy is built on aggressive projections for on-line ad revenues as far as the eye can see. What nobody's talking about is the fact that advertising is a very cyclical business. If you're publishing a newspaper or magazine, you have considerable variable costs (paper, editorial content) that you can trim when there's weakness in the ad market. With software-as-a-service, you don't have that flexibility in the cost structure of your business. (Your customers aren't going to use your software less because there's an ad recession going on.) At some point, and it will probably be sooner than the current rosy forecasts suggest, the on-line ad market will take a dip. Then things will get very interesting very fast.
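
To put rough - and entirely hypothetical - numbers on the point, here's a small Python sketch comparing how a 30% drop in ad revenue hits a publisher that can trim its variable costs against an ad-supported software service whose costs are essentially fixed:

```python
def operating_profit(revenue, fixed_costs, variable_costs):
    return revenue - fixed_costs - variable_costs

ad_drop = 0.30  # a hypothetical 30% slump in ad revenue

# Publisher: 100 in ad revenue, 40 fixed costs, 50 variable costs (paper,
# freelance content) - and it can cut, say, half the variable costs when
# the ad market weakens.
pub_before = operating_profit(100, fixed_costs=40, variable_costs=50)
pub_after = operating_profit(100 * (1 - ad_drop), fixed_costs=40, variable_costs=25)

# Ad-supported software service: 100 in ad revenue, 85 fixed costs (data
# centers, engineers), almost nothing variable to trim - customers keep
# using the software whether or not there's an ad recession.
saas_before = operating_profit(100, fixed_costs=85, variable_costs=5)
saas_after = operating_profit(100 * (1 - ad_drop), fixed_costs=85, variable_costs=5)

print(pub_before, pub_after)    # 10 -> 5: squeezed, but still in the black
print(saas_before, saas_after)  # 10 -> -20: straight into the red
```

The numbers are made up, but the asymmetry isn't: when revenue is cyclical and costs aren't, the downturn lands directly on the bottom line.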

Michael Dell is from Mars

Back in January, I described the bifurcation of the home PC market - into a dirt-cheap low end and a fashion-driven high end - and the challenge it posed to Dell. Back then, the challenge was theoretical. Now, it's real.

The company's announcement of a major sales and revenue shortfall yesterday underscores the problems it's having, particularly in the consumer market. Dell's been extremely successful for many years in riding the IT commoditization wave - streamlining its operation to make money at a price point that's unprofitable for competitors - but now it's finding that sometimes cheap can be too cheap. Unlike in the business market, where Dell has been able to offer attractive value-added services to keep box prices off the floor, a large number of home buyers are just grabbing the cheapest machine available. With competitors, including a resurgent HP, now willing to battle Dell for market share, particularly in the expanding laptop market, Dell's in a squeeze. It's lost its margin on home PC sales. Although the consumer market represents a relatively small portion of the company's overall revenues, it's big enough to wreak havoc with Dell's results, a fact that's led investors to flee the once bullet-proof stock.

Dell's response? To shift away from its traditional, scale-driven commodity strategy and try to boost profits by selling high-end machines to the well-heeled. Because the new positioning goes against the grain of its low-cost, anti-innovation heritage, the shift will be a tough one to carry out. Dell will have to compete more on the terms of high-style companies like Alienware and Apple, rivals it hasn't had to worry about much in the past. It's leaping, in other words, into a new world.

Michael Dell is from Mars, Steve Jobs is from Venus. Planetary convergence is rare, in business as in the heavens.

Wireless 1.0

October 29, 2005

As anybody who's read my work knows, I'm fascinated by the utopianism that springs up whenever a major new technology comes along. I recently picked up a collection of essays on this theme, called Imagining Tomorrow, which was published in 1986 by the MIT Press. One of the essays, by Susan J. Douglas, looks at the excitement set off by Marconi's introduction of radio - the "wireless telegraph" - to the American public in 1899. "Wireless held a special place in the American imagination precisely because it married idealism and adventure with science," Douglas writes.

The invention stirred dreams of a more perfect world, expressed in language that won't sound unfamiliar to today's readers:

Popular Science Monthly observed: "The nerves of the whole world are, so to speak, being bound together, so that a touch in one country is transmitted instantly to a far-distant one." Implicit in this organic metaphor was the belief that a world so physically connected would become a spiritual whole with common interests and goals. The New York Times added: "Nothing so fosters and promotes a mutual understanding and a community of sentiment and interests as cheap, speedy and convenient communication." Articles suggested that this technology could make men more rational; with better communications available, misunderstandings could be avoided. These visions suggested that machines, by themselves, could change history; the right invention could help people overcome human foibles and weaknesses.

The Atlantic Monthly even published a sonnet titled "Wireless Telegraphy" that ended with these lines:

Somewhere beyond the league-long silences,
Somewhere across the spaces of the years,
A heart will thrill to thee, a voice will bless,
Love will awake and life be perfected!

The rise of wireless also set off a popular movement to democratize media, as hundreds of thousands of "amateur operators" took to the airwaves. It was the original blogosphere. "On every night after dinner," wrote Francis Collins in the 1912 book Wireless Man, "the entire country becomes a vast whispering gallery." The amateurs, Douglas reports, "claimed to be surrogates for 'the people.'"

But it didn't last. By the 1920s, radio had become "firmly embedded in a corporate grid." People happily went back to being passive consumers: "In the 1920s there was little mention of world peace or of anyone's ability to track down a long-lost friend or relative halfway around the world. In fact, there were not many thousands of message senders, only a few ... Thus, through radio, Americans would not transcend the present or circumvent corporate networks. In fact they would be more closely tied to both."

Beyond Google and evil

Last March, on the website Edge, the playwright Richard Foreman wrote what might be taken as a draft of an elegy for humankind:

I come from a tradition of Western culture in which the ideal (my ideal) was the complex, dense and "cathedral-like" structure of the highly educated and articulate personality - a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West ...

But today, I see within us all (myself included) the replacement of complex inner density with a new kind of self-evolving under the pressure of information overload and the technology of the "instantly available". A new self that needs to contain less and less of an inner repertory of dense cultural inheritance - as we all become "pancake people" - spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.

Will this produce a new kind of enlightenment or "super-consciousness"? Sometimes I am seduced by those proclaiming so - and sometimes I shrink back in horror at a world that seems to have lost the thick and multi-textured density of deeply evolved personality.

George Dyson, the historian and author, wrote a fascinating response to Foreman, in which he suggested (at least this is what I think he suggested) that we are at a turning point in the history of the computer and, in turn, the world. Up to now, computers have been limited by the fact that "every bit of information has to be stored (and found) in precisely the right place." This rigid system is completely different from the biological model of information processing, "which is based on template-based addressing, and is consequently far more robust. The instructions say 'do X with the next copy of Y that comes around' without specifying which copy, or where." But today we're seeing the biological model begin to be replicated in an electronic information system. Who's creating this new computer? Google. Built on the self-evolving biological model, Google's search engine, according to Dyson, represents the first step toward "true" artificial intelligence - the 'super-consciousness' that already has begun pounding us "into [Foreman's] instantly-available pancakes," turning us into "the unpredictable but statistically critical synapses" of the Google Brain.
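
A toy illustration, in Python, of the distinction Dyson is drawing (the data and names here are invented, not anything from his essay): conventional addressing fetches a value from one exact location, while template-based addressing acts on whatever item next comes along that matches a pattern, without caring which copy it is or where it's stored.

```python
# Conventional addressing: every bit of information must be stored (and
# found) at precisely the right place. Ask for the wrong address and the
# lookup simply fails.
memory = {0x04: "payroll record", 0x08: "protein sequence"}
exact = memory[0x08]  # works only because we know the exact location

# Template-based ("biological") addressing: "do X with the next copy of Y
# that comes around," without specifying which copy, or where.
stream = ["noise", "protein sequence", "noise", "another protein sequence"]

def act_on_next_match(items, template, action):
    for item in items:
        if template in item:      # match by content, not by location
            return action(item)

matched = act_on_next_match(stream, "protein", lambda x: x.upper())
print(exact, "|", matched)
```

The second approach is sloppier about where things live, and for exactly that reason it's far more robust when copies move around or multiply.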

Dyson has now expanded and extended his essay. The inspiration was a trip he recently made to Google's headquarters, where an engineer told him, "We are not scanning all those books to be read by people. We are scanning them to be read by an AI." After reporting this comment, Dyson quotes Alan Turing on the development of AI systems: "In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than we are in the procreation of children. Rather we are, in either case, instruments of His will providing mansions for the souls that He creates."

Dyson ends on an ominous, if enigmatic, note:

Google is Turing's cathedral, awaiting its soul. We hope. In the words of an unusually perceptive friend: "When I was there, just before the IPO, I thought the coziness to be almost overwhelming. Happy Golden Retrievers running in slow motion through water sprinklers on the lawn. People waving and smiling, toys everywhere. I immediately suspected that unimaginable evil was happening somewhere in the dark corners. If the devil would come to earth, what place would be better to hide?"

For 30 years I have been wondering, what indication of its existence might we expect from a true AI? Certainly not any explicit revelation, which might spark a movement to pull the plug. Anomalous accumulation or creation of wealth might be a sign, or an unquenchable thirst for raw information, storage space, and processing cycles, or a concerted attempt to secure an uninterrupted, autonomous power supply. But the real sign, I suspect, would be a circle of cheerful, contented, intellectually and physically well-nourished people surrounding the AI. There wouldn't be any need for True Believers, or the downloading of human brains or anything sinister like that: just a gradual, gentle, pervasive and mutually beneficial contact between us and a growing something else. This remains a non-testable hypothesis, for now. The best description comes from science fiction writer Simon Ings:

"When our machines overtook us, too complex and efficient for us to control, they did it so fast and so smoothly and so usefully, only a fool or a prophet would have dared complain."

Pot. Kettle. Black.

October 28, 2005

Bloggers haven't been shy about pointing out the flaws of traditional print and broadcast journalism - what they often call the "mainstream media." Up until now, the criticism has been mostly a one-way street. The articles about blogging in traditional media outlets have been, on balance, pretty positive. That's changing now. As the blogosphere's influence grows, its own flaws are finally getting the inspection they deserve. In its new issue, for instance, Forbes has a big story by Daniel Lyons that examines how the blogosphere has become "the ultimate vehicle for brand-bashing, personal attacks, political extremism and smear campaigns." It's a charge that's hard to dispute, and Lyons does a good job of documenting the problem. The article's aggressive, to be sure, but that's Forbes's style.

It would be nice to think the blogosphere would use the piece as an occasion for a little bit of soul searching. But instead of addressing the criticism, most bloggers are simply blasting the messenger. Dan Gillmor sums up the article as "a pile of trash ... an alarmist and at times absurd broadside." Paul Kedrosky says the article is "dopey" and asks "how did it ever see print in tech-friendly Forbes." Steve Rubel, who charmingly refers to the real world as the "meatspace," goes into church-lady mode: "Forbes, I am very disappointed that you chose to take such an unbalanced POV when BusinessWeek and Fortune told us both sides of the story."

A common theme in the responses is that Lyons is "damn[ing] all bloggers for the sins of the few," as Doc Searls (in an otherwise balanced response) puts it. That's a misrepresentation. Lyons specifically writes that "attack blogs are but a sliver of the rapidly expanding blogosphere." (He does go on to argue that the problem extends beyond the bad actors themselves - scurrilous or one-sided attacks are naturally amplified in the blogosphere's vast echo chamber - but that's a valid point.) The fact is, in the context of the article's argument, it's clear that references to "blogs" and "blogging" are references to the attack blogs that are the subject of the piece, not to all blogs or bloggers.

Lyons's article isn't beyond criticism. His rhetoric does get overheated at times, and he can stretch too far in trying to make his points as pointed as possible. But those are hardly hanging offenses in magazine writing, and in the "citizen journalism" of the blogosphere they're as commonplace as typos. In rushing to dismiss the article, the blogosphere is simply exposing another of its shortcomings: It can dish it out, but it can't take it.

Shock treatment

October 27, 2005

We're in the early stages of the second great transformation in business computing - a shift from the reigning client/server model (in which individual companies own and maintain their own IT "power plants") to the utility model (in which outside utilities will run the plants). The change is going to take a while, not just because utility computing's underlying technologies, like virtualization, are far from mature but also because managers naturally fear losing control over the IT assets that have become so essential to their operations. Few executives enjoy having to run their own IT plants (Lord knows, it's not their core business), but most of them have at least a little bit of the "box hugger" in them.

What will spur companies to make the leap will in many cases be a crisis. We've already seen an example in the imposition of regulatory regimes, such as Sarbanes-Oxley or, in health care, HIPAA, that require firms to meet tough standards for data security, disaster recovery, and so forth. Faced with having to invest heavily in modernizing their IT infrastructures, policies, and staffs to meet the new requirements, some businesses have opted to unload much of their infrastructure onto utility providers running secure, state-of-the-art data centers. They find it makes economic sense to offload the capital investments and labor costs to an outsider, and, equally important, they like the fact that it gives them a way to get the risk and liability off their own shoulders. (It's kind of like keeping a get-out-of-jail-free card in your back pocket.)

Now, companies suddenly have another good reason to jump to the utility model: electricity costs. Corporate data centers are power hogs, and their gluttony gets worse every year. Earlier this week, TechTarget reported on a new survey by AFCOM, one of the leading IT professional societies, that showed the amount of electricity used by the average data center is increasing at an 8% annual clip, and for some centers the growth rate is as high as 20%. Up until now, though, the increases haven't been severe enough to attract the attention of most business executives. But that's about to change. The spike in oil and natural gas prices is pushing the cost of electricity up dramatically as well. Boston's Beth Israel Deaconess Medical Center, for instance, has just been told by its electric company that rates are going up 27%. Combine that kind of rate jump with the ongoing increase in consumption, and you've got a problem that's going to get noticed. As the hospital's data center manager, Bob Doherty, notes, "If I told my boss that my staff wanted a 27% increase [in pay], I'd be downstairs on the carpet."
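
The two trends compound. Using the figures above and a purely hypothetical baseline power bill, the back-of-the-envelope math looks like this:

```python
# Hypothetical baseline: a data center paying $1,000,000 a year for power.
baseline_bill = 1_000_000

consumption_growth = 0.08  # AFCOM's average annual increase in usage
rate_increase = 0.27       # the kind of rate hike Beth Israel was just quoted

# The two effects multiply rather than add.
next_year_bill = baseline_bill * (1 + consumption_growth) * (1 + rate_increase)

print(round(next_year_bill))                         # ~1,371,600
print(round(next_year_bill / baseline_bill - 1, 2))  # ~0.37 - a 37% jump in a year
```

That combined 37% jump in a single year is exactly the kind of number that gets a line item noticed.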

If energy prices stay high, expect to see another wave of companies embrace the utility model and start to close down their data centers. It looks like box-hugging is about to get considerably more expensive.

Inside Google

October 26, 2005

Dan Farber provides an interesting report on a talk by Google IT executive David Merrill about how the fast-growing company works. What particularly struck me is the simplicity of the company's process for sharing information among the many projects it has under way at any given moment. Actually, it's not even a "process." It's just "an email posting of a list of bullet points" that anybody in the company, "from engineering to sales to folks who sweep the floors," as Merrill puts it, can read and add to. It kind of makes you wonder about all the time and money companies have dumped into complex information systems for "knowledge management." Google uses similarly simple email systems for evaluating the performance of employees and collecting comments on potential new hires - two other processes that companies often "automate" with complicated technology.

Also of note is Google's practice of keeping its people on the move: "Part of Google's innovation strategy is to keep its employees challenged, and the company does that by moving people from project to project, Merrill said. An average project lasts three months or less, and employees spend only a year to 18 months in one area." He notes that this practice creates its own set of problems, such as "maintaining continuity," but despite the drawbacks it seems like a good way to keep smart people engaged and motivated - and thinking about the broad interests of the company rather than their own pet projects.

Less valuable is Merrill's discussion of the company's deliberately messy organization. "We always over hire," he says, and the structure of the organization is in constant flux. Google currently has the luxury of being inefficient because of its enviable position as the most powerful member of an oligopoly controlling an exploding market (Internet advertising). Most other companies don't hold such a lucrative position, and they can't be so cavalier about expenses. Sooner or later, Google will have to run a tighter ship.

Farber also notes that Google's commitment to transparency ends at the boundaries of its own organization. "Working at Google," he writes, "is about openness, flatness and transparency, which is pretty much the opposite of the company's interaction with the outside world ... Google is enlightened on the inside but closed to the outside."

Trouble in Wiki Land

October 24, 2005

The nonprofit Wikimedia Foundation's announcement last Thursday that it is launching a money-making joint venture with for-profit Answers.com has set off a storm of protest among contributors to the on-line, open-source encyclopedia. Many believe the commercial effort runs counter to the ideals of the vast Wikipedia community. "I feel utterly betrayed by the foundation," writes one Wikipedian on a discussion page at the site. "I will withdraw all and any support for the Wikipedia if this satanic project continues," writes another. A third asks: "Has the WikiVatican started selling indulgences?"

The rift comes at a time when the quality of the encyclopedia, which has long been held up as an example of the Internet's ability to harness "collective intelligence," is under debate (a debate set off by a critical post of mine earlier this month). The Register today runs a series of letters on Wikipedia from its readers. "While Wikipedia still has its defenders," the online journal writes, "there's a palpable relief that its shortcomings are finally being given the critical eye." The Guardian Unlimited, which a year ago called Wikipedia "one of the internet's most inspiring success stories," today ran a story headlined "Can you trust Wikipedia?" in which it asked experts to rate some of the encyclopedia's entries. The assessments ranged from "factually accurate" to "not terrible" to "inaccurate and unclear."

Even Web 2.0 guru Tim O'Reilly, in a sharp retort to what he calls my "cynical rhetoric," can't quite bring himself to defend Wikipedia's quality. Instead, he tries to make a case that any criticism of the encyclopedia is somehow politically incorrect: "How can we castigate Wikipedia as flawed when our conservative television news services managed to persuade their viewers that weapons of mass destruction were found in Iraq, and that evidence was found linking Saddam Hussein to the Al Quaida attacks on 9/11!" Now, that's a smokescreen if I ever saw one.

Ross Mayfield, another influential wiki-promoter, does rise to Wikipedia's defense, but only by redefining it as something other than an encyclopedia: "I know of no goal of being authoritative, but the group voice that emerges on a page with enough edits (not time) represents a social authority that provides choice for the media literate." Mayfield may not be aware of it, but the Wikipedia community has explicitly stated that one of its goals is to make the encyclopedia "the most authoritative source of information in the world." Its founder, Jimmy Wales, has also made it clear that he "intends that Wikipedia should achieve a 'Britannica or better' quality," according to Wikipedia's entry on itself. Also, I'm not aware of any attempts to restrict access to Wikipedia's content to what Mayfield calls, in a strange but perhaps revealing outburst of elitism, the "media literate."

Wikipedia wants itself to be judged as an encyclopedia, not as a "group voice [that] represents a social authority" (whatever that means). To fudge the issue and judge it as something other than an encyclopedia is, in my view, to be condescending toward the Wikipedians. It's also a way to avoid a hard discussion about the true nature of "collective intelligence" on the web.
