Wednesday, December 19, 2007

Early Indications December 2007: Prediction Scorecard

How did we do?

Last December I wrote that "we see a collision between systems based
on old and new models of regulation, remuneration, protection,
privacy, and so forth. At base, we are having to redefine some of the
core systems that make the world work: money, contracts, civil rights
and civic responsibilities, identity, possession, and others. This
year, I believe that several of these collisions will reach new
heights of unexpectedness, expense, and impact."

Prediction 1:
"This year, look for a still grander failure of data protection
[relative to the VA], either in one highly visible episode or a
cumulative increase."

Result: Hit
Britain's HM Revenue and Customs loss of 25 million names
is truly spectacular: it's roughly half of England's population, and
the sensitive information included the names of 350 people in witness
protection programs. The costs and risks of providing new identities for those
affected could be extreme. The TJX breach, meanwhile, was initially
reported to have involved 46 million records but according to a recent
court filing could have exposed 94 million credit card-holders -
nobody can say for sure, but the affected banks and the retailer are
said to have settled. The amount of the settlement was undisclosed,
but the company, which books about $18 billion in annual revenue, set
aside over $100 million for litigation settlement.

Prediction 2
"YouTube and related content distribution mechanisms will push the
envelope too hard, with a high-profile episode of unauthorized copy
distribution prompting legislation, litigation, and potentially
business failure."

Result: Too early
Litigation, yes (courtesy of Viacom); business failure, no.

It's also worth watching a court case in the adult entertainment
industry, since that sector is often a forerunner of changes in the
wider business environment. According to the Los Angeles Times, on
December 10 "Vivid Entertainment Group filed [a] lawsuit in Los
Angeles federal court against PornoTube and its parent, Data
Conversions Inc., which does business in Charlotte, N.C., as AEBN
Inc." The YouTube-like web business is said to be posting copyrighted
material, costing one of Vivid's competitors 35% in revenues,
according to the article.

Prediction 3
"Some new activity - whether job referrals, recipe swapping,
rotisserie baseball, genealogy, Christian evangelism, or something
similarly below radar - will break through using a Google-like
monetization model and approach the growth rate we saw for video in…"

Result: Hit
Facebook was clearly the big story of 2007, but even as early as June,
it was reported that digg had passed Facebook in number of unique
visitors, having grown 1400% in one year. May 2007 data from Compete
show digg with 22.6 million unique visitors, while Facebook had 20.2
million. It's important to note, however, that people spend far more
time on Facebook. Fantasy [American] football has about 12 million
players, up 33% since 2005; overall, fantasy sports are a $2 billion
industry, or about 13 times Facebook's estimated 2007 revenues.

Prediction 4
"For all their amazing capabilities, communications and computing
systems still can't cheat physics. 2007 will see the so-called virtual
world continue to encounter the physical environment in important
ways. A few examples suggest the breadth of the issue:
*Data centers are beginning to scale up to the size of factories and
even foundries in their energy consumption."

Result: Hit
So-called "green" computing is indeed front-page news. Google's data
centers are setting the pace as 40-70 megawatt facilities are coming
on line. Such large single points of failure widen the potential
scope of damage when an outage does occur.

Prediction 5
"A different facet of the energy and transportation systems relates to
automobiles. While Chevrolet just introduced a good-looking electric
car, the Volt, that anticipates developments in battery technology,
Tesla Motors will ship over 200 Roadsters at $100,000 apiece that
out-accelerate a Porsche 911 and achieve the equivalent of 135 miles
per gallon fuel efficiency. [The enthusiasm for ethanol will continue,
despite severe limitations.] How politics and markets react to rising
oil prices, from a systems-of-systems perspective, will determine
quite a bit about the shape of the next 10-20 years."

Result: On hold
Tesla found car-building more complicated than the founders thought,
and slipped its ship date again. The Volt is being touted as a signal
of rejuvenation at GM under Bob Lutz, while Honda announced a major
commitment to less expensive hybrid engine technology. Ethanol mania
seems to be subsiding slightly. A huge oil discovery off Brazil must
be countered by growing political instability in many oil-rich
regions, and high prices reflect a combination of that political risk
with booming demand in the developing world.

Prediction 6
"2006 did not see a major disruption to the world's transportation and
communication systems. Such good fortune cannot last indefinitely, yet
readiness for the unexpected remains lower than it could be."

Result: Glancing blow
Yahoo's merchant servers melted down on "Cyber-Monday," leaving many
of its 40,000 businesses searching for new commerce providers after
seven hours of outage and another five of slow performance. Air
travel is suffering meltdowns both macro and micro (as at LAX in
August, when one bad network card shut down the airport and stranded
about 20,000 fliers, or when JetBlue infamously mismanaged weather
delays in February), but we saw nothing that qualified as a major
disruption.

Prediction 7
"Paradoxically, even as people and devices grow more connected, with
access to more information, the need for intermediaries evolves rather
than disappears."

Result: Hit
Apple's iPhone was clearly one of the year's big stories, as were
YouTube, Facebook, Amazon (particularly its Kindle reader, but also
Mechanical Turk's role in the Steve Fossett search) and Google's
unrelenting command of search and advertising. All are
intermediaries, or filters. As Facebook discovered, matching
advertising to audiences in return for money is very appealing in its
revenue potential, but hard to do and easy to get wrong. Microsoft
just announced a major ad placement deal with Viacom. Along with its
Facebook investment, this puts Microsoft in excellent position to
learn at the front-ish edge of ad serving and measurement,
realistically behind Google and perhaps Yahoo. The biggest noise of
the year was made by the social networking model, which is such a
powerful filter we have yet to devise cogent models or names for what
might be possible: the filtering and sheer time-consumption of
MySpace, Flickr, LinkedIn, and the rest may finally have driven the
final nail into the 1990s mantra of disintermediation.

Overall, it was a decent showing: no assertion fell wildly off the
mark, and several other areas appear to be unfolding in line with the
prediction, just not quite in this calendar year.

I hope every reader finds joy and peace in the holiday season, and
we'll start the new year off with an…

Monday, November 19, 2007

November 2007 Early Indications: 10 Predictions for the Next 10 Years

As promised last month, here are ten information-technology-related areas to watch over the next ten years. Rather than attempting to be systematic, this list will merely suggest topic areas and point to some relevant data points; otherwise, a ten-item list would soon get unwieldy. Key areas such as liquidity in financial markets, global immigration policies, warfare and diplomacy, and credibility of government, financial, and cultural institutions also merit close watching, of course, but will be outside our scope for the moment. (Note that this material is also available in a presentation.)

1) The New Physical Layer

Although everything from power grids to bridges and ports to railways is being built or rebuilt, our focus here is on computing and networking. In particular, power and bandwidth will be transformed in the next decade.

Taking power first, cloud computing vendors are waging an arms race as they build data centers to power a range of offerings loosely called "web services." Because of the intensity of their power consumption, these often appear near cheap hydroelectric power sources (which themselves may be affected by global climate changes). It's estimated, for example, that Google's data center, housed in two adjacent buildings in Oregon, contains 1.3 million computing cores on 9,000 racks per structure, and photographs of the cooling towers are staggering.

Something else is going on: Caterpillar reported that its Q2 07 revenues from sales of backup generators, such as those used in data centers, were up 41% at a time when overall U.S. construction equipment sales are slumping. The growth of "cloud computing" feels as though it's related to the trend toward virtualization, where resources can be located, physically and/or logically, away from their locus of deployment. At the end of the day, however, servers have to sit somewhere, and when they do, lots of heat follows.

At the same time, the need for portable power to support an increasingly mobile user base means that fuel cells, batteries, and associated technologies will also attract investment and talent. Solar power, meanwhile, is a complicated issue: there's clearly a lot of froth around silicon panel plays, which compete with the computing sector for resources, talent, and production capacity. How much solar helps address computing's need for portable power and how much it constrains it will be important to watch.

Bandwidth consumption is exploding as video expands farther and farther into a global customer and user population. In both wired and unwired domains, a lot is happening. On one side, perhaps even the term "wired" should be amended as optical connectivity proves its superiority; while glass can be fabricated into cables, maybe the word "wire" has become misleading. Delivered in the U.S. by Verizon and to a lesser extent AT&T, fiber is driving wider delivery of 20, 50, and potentially 100 Mbit/s download speeds along with faster multiplayer gaming action and multiple high-definition television signals. Over the ether, WiMax's future got a bit less rosy recently when Sprint, a stumbling cellular carrier now searching for a new CEO, dissolved its partnership with Clearwire. Even so, whether it's that particular technology or potentially a cellular variant, mobile broadband will be a key area for the next decade.
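As a rough illustration of what those fiber tiers mean in practice, here is a back-of-the-envelope sketch. It assumes a hypothetical 4 GB high-definition film and ignores protocol overhead; note that advertised speeds are megabits per second, not megabytes:

```python
# Download times for a hypothetical 4 GB (gigabyte) HD film at the
# fiber tiers mentioned above. Speeds are advertised in megabits per
# second, so bytes must be converted to bits first.

FILM_BITS = 4 * 8 * 10**9  # 4 GB (decimal gigabytes), expressed in bits

for mbps in (20, 50, 100):
    seconds = FILM_BITS / (mbps * 10**6)
    print(f"{mbps:>3} Mbit/s: about {seconds / 60:.0f} minutes")
```

At 20 Mbit/s the film takes about 27 minutes; at 100 Mbit/s, about five.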

2) Enmeshed

The Japanese have already named a relevant demographic better than Americans have: "oyayubizoku," clan of the thumb, is far more evocative than "digital natives." Whatever they're called, people under 30 around the world are redefining mobility: who is supposed to say (or otherwise convey) what message to whom, in what contexts, with what expectations in return is being defined in fascinating ways. I'm reminded of the need for a new greeting at the introduction of the telephone, as people of manners were not supposed to speak to someone unless they had been introduced. Many languages differentiate between telephonic greetings and spoken ones ("bonjour" vs. "allo" in French), but before "hello" was carried over, Alexander Graham Bell preferred "ahoy" as the English-language telephone greeting.

The distinction between telephones and PCs is getting fuzzier every year, as we have noted, and the iPhone presents a clear case in point: running a Unix variant, it can be spoken at, but performs best moving and manipulating images and data. Mobile phones, ultra-mobile PCs (UMPCs), gaming devices including Nokia's N-Gage, handheld PCs, televisions, and other devices (such as standalone GPS trackers) will continue to converge. Note that the success of this sector depends heavily on commercialization of the power alternatives listed above.

GPS phones are estimated to be a $30 billion segment next year. Some of the most promising applications involve the combination of mobility and convenience, location awareness, and social networking: as Google enters the phone market, expect to see some variation on the Dodgeball service it acquired in 2005. Being able to visualize a list of friends, in their current physical locations, in order to coordinate seems like a truly harmonic convergence of capabilities.

Television over mobile handsets is estimated to reach over 100 million users by 2009, and the number should soar further in conjunction with the 2010 World Cup. Expect to see spirited competition among content owners like News Corp, handset manufacturers, network equipment firms (including heavyweights Qualcomm, Nokia, and potentially Intel), and carriers such as Vodafone and T-Mobile. Finally, given that [lots of] advertising is involved, expect something unexpected from Google. There's little question as to demand, particularly after seeing adoption in Japan and Korea, but allocating the money may prove to be difficult.

3) Healthy, Wealthy, and Wired

Entire books need to be written on various facets of information, technology, and health. A few bullets suggest the reach of potential issues:

-Electronic medical records have the potential to improve care, save money, and enhance the patient's experience with his or her health care system. EMRs also could help transform the economics of health insurance, lead to data breaches of untold pain and economic impact, and alter the role of physicians relative to insurers, employers, and patients. Automating the current, broken U.S. system (I can't speak for other countries) feels unappealing, which means that implementing EMRs implies deeper transformation, parallel to but much bigger than the changes brought about by corporate ERP implementations.

-Better information regarding public health statistics is essential, particularly given the experience with SARS and fears about future pandemics. But once again, social, cultural, economic, and legal questions emerge. Ranging from "who owns the data?" to "who defines how data is shared across jurisdictions?" to "who pays and who benefits?," these questions will test an already under-funded global public-health infrastructure. For an upbeat and visually riveting vital statistics story, see "No More Boring Data," a video of a lecture on global demographics.

-What does it mean to be human? Mechanical joints and prostheses are rapidly becoming more sophisticated and digitized. When does a disability become an unfair advantage? Oscar Pistorius is a South African sprinter whose 400 meter time is about a second shy of Olympic qualifying. He's also a double amputee whose carbon-fiber "legs" are challenging old ideas about fair competition. Or take Jesse Sullivan, a former lineman from Tennessee who lost both arms in an electrical accident. He has a nerve-controlled robotic arm connected to his chest. Told by his doctors not to baby the device, he returned one time carrying his hand, which he had detached while starting a lawn mower. Cochlear implants are already common solutions to hearing loss (Rush Limbaugh has one) and electrical implants also help patients with Parkinson's Disease, so it is a short hop to implanted chips that enhance brain function: when will 14-year-olds start getting "Harvard chips" to enhance test-taking, piano-playing, physical endurance, and other competitive traits that will help college admissions - and beyond?

-What will be the long-term effects of near-field electromagnetic emissions, particularly after they have been focused through the ear directly into people's brains? Cell phone antennas are a potential hazard, but so are earbuds and Bluetooth radios, and nobody knows yet what might or could happen across broad populations with widely varying spectrum allocations, cultural patterns, and governmental regulations.

4) Connection Machines

As more kinds of things get connected to information networks, the potential for unexpected consequences gets ever more interesting to contemplate. Just listing the number of classes of devices that can or will soon interoperate gives a sense of scale:

-telephones, the wireless variety of which can be understood as beacons, bar-code scanners, and network nodes - potentially in a mesh configuration
-motor- and other industrial controllers
-surveillance cameras (of which there are over 2,000 in Chicago alone)
-sensors, whether embedded in animals, affixed to pharmaceutical packaging, or attached to engine components to predict mechanical failure.

All told, there are dozens of billions of items that can connect and combine in new ways.

Look at robotics in the realm of warfare. Small portable robots, literal cousins of the Roomba vacuum cleaner, can investigate caves or tunnels, while the last two DARPA autonomous vehicle challenges (one across open terrain, the most recent at an abandoned Army base simulating urban conditions) have produced multiple successful entrants. Unmanned Aerial Vehicles are flown by crews remote from the battlespace. The pace of successful deployment will certainly continue, raising a wide variety of heretofore purely theoretical questions about the ethics and costs of combat.

Other machines are less visible. Amazon Mechanical Turk was recently used in the search for pilot Steve Fossett: aerial photographs were loaded into the system, which then systematically presented volunteers with images to scan visually for evidence of wreckage, a parachute, or other clues. Combining computing power with human pattern recognition will become more common in a wide variety of domains.
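The workflow just described - tile a large image set, fan each tile out to several volunteers, escalate anything that enough people flag - can be sketched in a few lines. This is an illustrative toy, not Amazon's actual Mechanical Turk API; all names and numbers are invented:

```python
# Toy model of a human-pattern-recognition pipeline: each image tile is
# assigned to several volunteers, and tiles flagged by enough of them
# are escalated for expert review.

from collections import defaultdict

def assign_tiles(tile_ids, volunteers, per_tile=3):
    """Round-robin each tile to `per_tile` distinct volunteers."""
    assignments = defaultdict(list)
    n = len(volunteers)
    for i, tile in enumerate(tile_ids):
        for k in range(per_tile):
            assignments[tile].append(volunteers[(i + k) % n])
    return assignments

def escalate(votes, threshold=2):
    """Return tiles that `threshold` or more volunteers flagged."""
    return sorted(t for t, flags in votes.items() if sum(flags) >= threshold)

tiles = ["NV-17-042", "NV-17-043", "NV-17-044"]   # hypothetical tile IDs
people = ["ann", "bob", "carol", "dave"]
work = assign_tiles(tiles, people)

# Simulated responses: 1 = "possible wreckage", 0 = nothing seen.
votes = {"NV-17-042": [0, 0, 0], "NV-17-043": [1, 1, 0], "NV-17-044": [1, 0, 0]}
print(escalate(votes))  # -> ['NV-17-043']
```

The interesting design question is the threshold: too low and experts drown in false positives, too high and a faint parachute goes unreviewed.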

5) Virtual Fences

It's extremely difficult to delimit this space. Risk, trust, identity, and security are all intertwined, and each has implications for the others. Just this week New York Governor Eliot Spitzer backed off on a plan to issue illegal immigrants New York driver's licenses. This in turn means none of these people can fly on commercial flights unless they hold a passport. The 50 states, meanwhile, are in various degrees of agreement with a federal plan for regularizing driver's licenses to create a de facto national identity card. Both driver's licenses and passports, meanwhile, will get embedded RFID chips, which have been cracked already in a variety of trials. At base, the questions of "who are you," "can you prove it," and "who else knows your information" are all in play, all over the world.

Spam is more prevalent than ever, and creative code-writers are unleashing new technologies to build networks of dormant, compromised computers awaiting future instructions. The so-called "Storm" worm is actually a worm, Trojan, and bot combined: it changes its payload every 30 minutes, effectively mutating far faster than antivirus software definitions can be written, much less applied. It operates on evolving IP addresses and in a peer-to-peer network configuration, so very few infected machines point to a central point of control (thought to be Russian). Between 1 and 50 million machines are believed to be at risk, but because there is no spike of malware traffic, as there was in the incredible spread of the Slammer worm (which spread to 75,000 machines in 10 minutes), Storm is nearly undetectable. Given the numbers of networked devices listed above, one must assume viruses will attack everything from powerplant controls to cellphone networks to several types of security systems.

The biggest data breach I'm aware of is the 46 million credit-card numbers lost by TJX (parent company to TJ Maxx, Marshalls, and HomeGoods) as a result of improperly configured in-store wireless networks. Last month, a group of banks alleged in a court filing that in fact 94 million records were lost. Currently liability rests with the banks and credit-card entities even though the merchant was responsible, so expect new legislation to reallocate the blame (and financial responsibility) when the next leak occurs.

6) Of Memory and Forgetting

As more of humanity's mental output is digitally recorded and preserved, we will see new kinds of challenges and opportunities related to the storage of said output. My colleague John Parkinson was fond of saying that "digits never die," and anyone who posted stupid newsgroup utterances 15 years ago or candid MySpace pictures seen by a potential employer will understand. Insofar as much of the "web 2.0" traffic is about "me" (and my opinions, and my friends, and my pictures, and my goings-on), it feels like there will be an emerging dialectic between asking for attention and asking for, if not privacy, at least some control over one's cumulative bitstreams.

Many questions relating to monetization of data are relevant here. Who owns my trail of digital breadcrumbs that everyone from Acxiom and Amazon to Vodafone and Yahoo is trying to use for commercial purposes? In healthcare, who holds, owns, and controls my lifelong record of prescriptions (filled and unfilled), medical test results, over-the-counter and supplement purchases (helpfully recorded by loyalty cards), public health data, and even caloric intake and, at the health club, expenditure?

Embedded metadata is another area to watch. Many digital cameras embed information into the image file relating to camera, shutter speed, lens, and time and date. If you look at the most recent versions, what the privacy types call PII (personally identifiable information) also shows up: latitude and longitude of the location, the photographer's name (handy for claiming artistic royalties), and other information that is not obvious when looking at the image. Various generations of Microsoft Word embedded sometimes embarrassing information relating to authorship, editorial changes, and the like: more than one consulting firm has been caught repurposing a proposal (or deliverable) when hidden layers of information told their tale.
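To make the embedded-metadata point concrete, here is a minimal sketch that walks a JPEG's segment markers and reports whether an Exif metadata block is present at all. It only detects the container; actually extracting the latitude, longitude, or photographer's name would require a full TIFF/Exif parser:

```python
# Minimal sketch: does a JPEG byte stream carry an APP1 "Exif" segment?
# JPEG files are a sequence of 0xFF-prefixed markers, most followed by a
# two-byte big-endian length that includes the length bytes themselves.

import struct

def has_exif(data: bytes) -> bool:
    """Return True if the JPEG contains an APP1 Exif segment."""
    if data[:2] != b"\xff\xd8":                # no SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            return False                        # corrupt marker stream
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):              # end of image / start of scan:
            return False                        # Exif would have appeared by now
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length                         # jump to the next marker
    return False

# A tiny hand-built JPEG fragment containing an empty Exif APP1 segment:
exif_app1 = b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00"
print(has_exif(b"\xff\xd8" + exif_app1 + b"\xff\xd9"))  # -> True
print(has_exif(b"\xff\xd8\xff\xd9"))                    # -> False
```

The point of the exercise: the metadata sits in a plainly labeled segment that any tool can read, whether or not the photographer knows it is there.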

As more bits are generated and stored in networked contexts, we will see a reinvention of the public record; just this week a D.C. circuit court judge ordered the White House to stop deleting e-mails, given that 5 million are alleged to be missing. At the level of less prominent individuals, we will see extremes from privacy fanatics who try to commit as little as possible to digital media, all the way to Microsoft researcher Gordon Bell, who is attempting to digitize his entire life, from birth certificate forward, the last few years in real time. (Here's a New Yorker story on Bell.) How the rest of us sort out the middle will be unpredictable.

7) The Human Peripheral

Traditionally, people connected to the computer through punch tapes or cards, keyboards, and screens. That list is getting longer, quickly.

It's been five years already since Cambridge and MIT researchers shook hands across the Atlantic. Haptic (3-D touch-based) interfaces are entering the mass market, most visibly via the Nintendo Wii, which is outselling conventional game consoles from Sony and Microsoft.

The Audeo system processes human intentional thought and converts it to speech. That is, it acts on "I want to say 'hello'" rather than broadcasting one's daydreams.

Vyro has developed a Bluetooth device about the size of a gum eraser. It measures stress through sweat gland activity in the skin, so one application is a clever game in which two players race their cars on a Bluetooth phone, the winner being the one who's more relaxed.

-New screens
Organic Light Emitting Diode (OLED) technology is coming to market soon, in Sony televisions for instance. Compared to LCD, OLED is brighter, more power efficient, and thinner - but it reacts badly to water. E-ink and other flexible displays are making similar progress.

While Microsoft's SPOT technology has not made much of an impact, datacasting is still viable. Ambient Devices makes products that convey information at a glance. Those who have been to Boston know that the Prudential building's spire tells the weather: steady blue for clear, blinking red for rain. Ambient's Orb conveys weather, stock market performance, and other complex information by its color, and there's an energy monitor that tracks the price of electricity, weather forecast, and other information relevant to deciding whether or not to run the dryer or air conditioner.

8) Education

Officially, we now live in a services economy: at the global level, the switchover from agriculture happened only last year, which means that at scale, manufacturing was never earth's dominant economic activity. Education systems everywhere are struggling to adapt to digitization, to services, and to new demographic realities. In the U.S. for example, in 2050 there will be a huge blip of elderly women who are now just finishing childbearing. Who will support them, what will they do for both economic and other rewards, and how will they learn to do those things? In the developing world, projected demographic pyramids are even more striking as life expectancy changes dramatically in just a few decades.

How do schools prepare young people for jobs and organizational designs that have yet to be invented? To take two current examples, where did today's generation of sushi chefs and yoga teachers get their training? Where will robot mechanics, Internet addiction counselors, and Chinese lawyers get started? Getting computers (possibly through One Laptop Per Child or Project Inkwell) to the masses will start a process but by no means finish it.

As online course delivery ramps up, questions arise about architecture: what should a virtually-enabled classroom look like? Where should schools be built, particularly in developing environments? What should they look like? What is the role and function of a public library in a world in which the place of print is in major upheaval?

9) {Your Theme Here}

As blogging, social networking, and user-generated content proliferate, we're seeing one manifestation of a larger trend toward delegitimization of received cultural authority. Doctors are learning how to respond to patients with volumes of research, expert and folk opinion, and a desire to dictate rather than receive treatment. Instead of trusting politicians, professional reviewers, or commercial spokespeople, many people across the world are putting trust in each other's opinions: Zagat is a great example of formal ratings systems being challenged by masses of uncredentialed, anonymous diners. Zagat also raises the issue of when crowds can be "wise," cannot possibly be "wise," or generally do not matter one way or the other.

Information markets hold great potential, but like real markets, suffer from bubbles, information asymmetry, and other distortions. Nevertheless, such exemplars as Hollywood Stock Exchange (now owned by financial information giant Cantor Fitzgerald), the Iowa Electronic Markets, and startups like Fluid Innovation are leading the way toward wider implementation. At the same time, we've seen markets process information for a long time: when the NBA addressed its betting-referee scandal, the situation highlighted the secrecy with which the league assigns refs to games. Referees are prohibited from telling anyone but immediate family about travel plans, because the Las Vegas point spread moves if the reffing crews are revealed ahead of game time. That point spread is a highly nuanced information artifact of a market compensating for new information.

So-called crowdsourcing will bear watching. Gracenote, the service that lists a CD's track names when you load it into iTunes, began with volunteer labor. What would happen with Wikipedia if Jimmy Wales followed Gracenote's history and monetized all of the volunteer labor? Another new business, Satisfaction, applies crowdsourcing to customer service issues. As Google moves away from the idiot-proof search bar into applications, who delivers tech support? Two Google employees currently answer queries at Satisfaction, but it remains unclear who pays whom for what in various tiers of service, who's liable for the consequences of advice, and how the system might be gamed.

Clay Shirky has suggested that flame wars are essentially inevitable outcomes, rather than side effects, of social software. Many blogs have comments turned off because of abuse that simply takes too long to monitor and manage. Given that more people will be in contact with more people in new ways, how will new rules of behavior take shape? Will the lack of interpersonal civility (exemplified in the golden age of the ad hominem attack, offline and on-) evolve? If so, in which direction?

10) Silicon Emotion

People are interacting with other people with multiple layers of computing and communications in between. The nature of emotional expression is changing as a result.

-Dancing alone
What does it mean when tens of millions of music lovers listen in isolation, through headphones, rather than in rooms, or concert halls?

Back when the average MySpace user had 347 "friends," what did that really mean? Might Facebook, which has suffered in the eyes of some users from its retreat from exclusivity, be surpassed by a Ning or other network with express provision of firewalls between sub-communities?

-Inhibition deficiency
In addition to flaming, people will say things electronically that they
would be much more hesitant to articulate verbally. Watching teenagers
IM each other fluently and unabashedly, then stand with each other
awkwardly after school, is a fascinating exercise. In the Nordic
countries, the second-most prevalent use of text messaging (after
coordination) is "grooming" - flirting.

-Robot love
The Roomba has inspired tremendous affection in its brief lifetime. (See the fascinating paper by Ja-Young Sung, Lan Guo, Rebecca E. Grinter, and Henrik I. Christensen, all of Georgia Tech, entitled "'My Roomba is a Rambo': Intimate Home Appliances" for compelling evidence on this point.) Sony's Aibo dog and Honda's Asimo can trigger similarly rich emotional responses in some people. iRobot, the Roomba folks, recently introduced a beta version of ConnectR, a "virtual visiting robot" projected to sell for $499. According to the website,

"Combining the latest in Internet communications and robot technology, ConnectR lets you virtually visit with loved ones, relatives and pets anytime you wish – seeing, hearing and interacting with them in their home as if you were there in person."

I can't imagine that this kind of technology will do anything but surprise people with its unintended consequences.

One final word: ten years is probably too long a time horizon for some of these areas, but institutional change, in education for instance, is always the slow part that will balance out some of the blink-of-an-eye things we’re about to witness.

Wednesday, October 24, 2007

October 2007 Early Indications II: Ten big technology-related busts in the past ten years

Earlier this month we marked ten years of this newsletter's publication by noting ten developments that quickly permeated the market after being nonexistent or invisible in 1997. This time out, I'll list ten big failures that at one time or another looked like can't-miss propositions.

1) Online grocery

Grocery is a notoriously tough retail category, with thin margins, fickle and price-sensitive customers, and perishable inventory. At the same time, it's an enormous market -- absolutely everybody eats -- so in the late 1990s, the perceived invincibility of online grocery made for failure of dramatic proportions. Webvan combined aggressive expansion, a long leash from investors, and questionable management to create an $800 million sinkhole. The firm was operating in Chicago, Los Angeles and Orange County, Portland, San Diego, San Francisco, and Seattle at the time of its demise, and many customers were disappointed at the loss of a convenient, time-saving service, particularly after Webvan undid many of the successes of the HomeGrocer chain it acquired. The customer base remains tantalizing, particularly as commutes grow longer and free time shrinks, but the logistics of automating picking out a cart-load of groceries from among 200,000+ SKUs, some fresh, makes this a daunting entrepreneurial challenge.

2) AOL and Excite@Home

For a time, AOL ruled the world of dial-up Internet access. Its carpet-bombed floppy disks (later CDs) helped introduce millions of Americans to the Internet, or at least an isotope thereof. It combined access with content (in some measure, in the form of other people) to reach an astonishing price/earnings ratio of 700. But when broadband delivered by incumbent telcos and cable companies split AOL's access from its content, the supposed synergy broke down and the bubble burst.

Beginning slightly later than AOL, the Excite search engine (like Yahoo and Google, a Stanford creation) was bought by the @Home broadband startup in hopes of another content+pipes goldrush. The merger was a disaster: $7 billion of market capitalization vaporized. Cox, TCI/AT&T, Comcast, and the other cable companies, which owned physical plant and had operational responsibilities, were ill-matched with the Silicon Valley engineering culture that emphasized features and glamour over reliability and customer service. That Kleiner Perkins owned stakes in both @Home and Excite compounded the enthusiasm for a rush to synergy, but the operational realities of rebuilding physical infrastructure, and the regulatory scrutiny drawn by @Home's proprietary relationship with one of several competing portals, combined with the cultural and leadership issues to precipitate a train wreck of epic proportions in 2001.

(On AOL, see Kara Swisher, There Must Be a Pony in Here Somewhere (2004); on Excite/@Home, see Frank Rose, "The $7 Billion Delusion")

3) Iridium

Motorola was a major shareholder in and primary supplier to this satellite telephony venture. After its 1997 IPO, Iridium faced loan covenants that required it to sign up 213,000 customers soon after it began offering service in 1999. When only about 10% of that number materialized, Iridium filed for bankruptcy: $5 billion in assets was liquidated for $25 million, and only last month Motorola -- itself Iridium's largest creditor, to the tune of $2 billion -- appeared to have escaped further liability with a court ruling in New York. The service was never aimed at a mass market, with phones costing $3,000 and calls $7 per minute. Coverage was good in open oceans and deserts, but not in moving cars or cities -- and the handset, while technically sophisticated, was big, heavy, and sported an antenna "the size of a toothbrush," in the words of the Wall Street Journal. Satellites, meanwhile, have been similarly costly to rival radio providers XM and Sirius, which between them have accumulated historic losses of $8 billion and are now trying to merge.

4) Super Audio Compact Disc/DVD-Audio

Roughly 20 years after the launch of the compact disc audio format, which itself came about 35 years after the introduction of the LP record, the entertainment industry brought out competing high-resolution optical disc formats for audio. Sony and Philips introduced SACD in 2000, while the DVD Forum, led by Panasonic and Toshiba, brought out DVD-A at about the same time. Audio quality is much higher than CD from both formats, but market confusion has been a major limiting factor. Customers of a certain age who already had to buy music collections twice over were reluctant to commit to one of two competing formats, and while hybrid players now support multi-channel audio playback from either source, software is not widely available: artists and labels had to bet on one standard or the other, and the slow market penetration has resulted in relatively few, and expensive, titles being available. The format war coincided with the explosion of digital file sharing (hence strict and cumbersome copy protection schemes for both SACD and DVD-A), and customers have widely defected to portable, lower fidelity media such as MP3 files. The net result is that both high-resolution audio formats are essentially irrelevant, and the DVD standard itself is in the early stages of a similar format fight, with potentially similar results.

5) Quokka Sports

Rereading ten years of Early Indications and its predecessors, I was struck by how amazed I was by three or four software demos. One was Keyhole, the technology that became Google Earth after the company was acquired. Another was Quokka, which was devoted to delivering data-rich sports coverage over the web. From its origins in Australia, Quokka began with immersive feeds of long sailing races such as Sydney-Hobart: data relating to biometrics, meteorology, speed, absolute and relative position, and participant narratives made for engrossing viewing. Quokka bought the Internet rights to the Sydney Olympics in 2000 after moving to San Francisco, but the lack of a viable advertising model combined with common dot-com management failures to force a shutdown in April 2001.

Partnerships with NBC and Major League Baseball, along with further Olympic rights, cost money but failed to deliver returns. In retrospect, Quokka was probably better aligned with low-viewership sports like sailing and mountain-climbing that could find Webcast niches than with big-audience events with established television techniques and politics. Sports remains unevenly instrumented: NASCAR races are data-rich, but the single biggest predictor of a pass play's success on a football field -- how long the quarterback holds the ball -- is not recorded. Baseball, meanwhile, has generated hugely popular online fantasy leagues, with football following suit, in ad-supported models of which the Aussies could only dream.

6) OpenFund

If open-source works for software, why not try the model elsewhere? MetaMarkets, founded by two veterans from Barclays Global Investors, launched in August 1999 on the basis of full transparency as fund managers disclosed every trade, often with commentary. The fund started fast out of the gate: at year-end 1999, it was up 91% (by comparison, the NASDAQ was up nearly 50% in the same period). The fund fell 42% in 2000, and dropped another 26% between January and August 2001, when it shut down. In part, the fund was a victim of small scale: whereas most mutual funds need to run at least $100 million in assets for viability, OpenFund was at about $10 million when it was liquidated. Both management and critics compared OpenFund to a finance chatroom with real money: a Morningstar analyst noted after the fund's demise that "the entertainment, the gimmick, doesn't really have anything to do with investing." This sounds plausible: if my money is in free fall, I'm not sure chatting with the fund managers is going to help either my mood or the fund's performance.

7) General-purpose Speech Recognition

Since at least 1997, Bill Gates has been predicting that speech recognition will be an integral aspect of the PC experience. In his 5-to-10-year timeframe, it never happened, but not for lack of trying: Dragon Systems, headquartered in the U.S., was losing money selling speech recognition software before it was bought by Belgian competitor Lernout & Hauspie in the spring of 2000, just after L&H paid $1 billion for Dictaphone. The Dragon founders, however, had the misfortune of watching their company go into reorganization after accounting irregularities made the L&H stock worthless. Revelations of fictitious transactions in Korea and overstated earnings elsewhere eventually sent the L&H founders as well as CEO Gaston Bastiaens (an industry veteran who helped launch the compact disc at Philips and later worked on the Apple Newton) into criminal proceedings that remain ongoing six years later: before Enron, Lernout & Hauspie was the archetype of corporate scandal. ScanSoft, which made optical character recognition products, bought the assets, but even now, neither Nuance (as ScanSoft renamed itself) nor Microsoft has made speech interfaces work for general-purpose computing. In vertical domains, however, speech interfaces -- particularly telephonic customer service and medical transcription -- are working well.

8) Digital Appliances

From high-profile efforts at Oracle (the NC) and Sun (JavaStation) to consumer efforts from the likes of Uniden, the late 1990s witnessed a variety of efforts to displace the personal computer with a network-intensive, easy-to-use, easy-to-manage device. The ideal of plugging a device into the Internet without need for hard-disk-resident applications or storage was motivated by a variety of factors, but ten years on, the vision has yet to catch on. For one thing, wireless devices allow much of the NC's functionality to be experienced on the go (cf. the Blackberry). Terminals and emulators never left the list of enterprise alternatives, as Citrix-based Windows systems illustrate: the PC remains a flexible platform that can be configured into diskless, mobile, or other alternatives. The relentless improvement in PC performance, particularly from 1990 until 2002 or so, made the PC's price-to-performance ratio continually appealing, until processing began to outstrip most of the application stack's needs. Finally, the lack of true broadband, until recently, made the devices slow in many environments.

9) Business-to-Business Exchanges

Talk about a shakeout: of roughly 1,520 exchanges in 2001, only about 10% were still active two years later. VerticalNet, one of the first B2B exchanges, had 1,700 employees and a $10 billion market capitalization at its peak; shortly afterward the CEO was faced with keeping 50 people on the payroll, using the roughly $11 million that remained in the bank. Covisint, designed to make automobile parts-buying more efficient, met a similar fate. That both survive today, albeit operating at minute fractions of their projected volumes, illustrates that while business-to-business commerce is huge, it is also difficult to reinvent.

Sellers stayed on the sideline as auction models presented the specter of purely price-based competition. Buyers, while wanting the price leverage, also realized a) that customer service and relationships matter and b) that bankrupt suppliers (as in the auto industry) are not in the buyers' long-term interest. Many exchange providers turned into merchants of purchasing efficiency inside the firewall, relying more on software and process expertise than on convening power. Running a market is also not necessarily attractive: as this newsletter noted in April 2000, in 1998 the New York Stock Exchange made only $101 million on 169 billion shares traded, totaling $7.3 trillion.

10) Business Models Based on "Free"

At one time, at least two dozen Internet Service Providers offered free connections, usually over dialup. Free-PC was one of multiple attempts to get consumers to watch ads in return for hardware. Netscape famously gave away browsers to sell server software, a strategy that backfired for a number of reasons, one of which was Microsoft's anti-competitive behavior with Internet Explorer. Shares of VA Linux, a company with real hardware sales but ample "free" hype, rose from $30 to $320 on December 9, 1999, the first day of trading, but fell to 54 cents by July 2002.

More recently, eBay has encountered major difficulty making Skype pay off; Sunrocket and other VoIP providers are either shuttered or weathering tough times. There are also many businesses that have been collateral damage in free scenarios, some of them illegal or otherwise of dubious ethical standing. Music companies that have been slow to respond to file-sharing with appealing alternatives are the most visible of these. Even so, it has been repeatedly demonstrated that you can in fact "compete with free" and usually win.

Sunday, October 21, 2007

Early Indications October 2007 issue 1: 10th Anniversary Breakthroughs

In October 1997, the Ernst & Young Center for Business Innovation in Cambridge, Mass., had just hosted its first meeting of a corporate consortium investigating emerging directions in e-commerce. Since that time, the newsletter that was initially called "Networked Commerce Update" and then "Early Indications" has appeared monthly. It has attempted to spot trends, situate developments in broader contexts, and share some of my excitement and occasional dismay over the state of information technology and the many uses thereof.

This month, we'll look at ten developments that, while feeling routine today, still lay in the future only ten years ago. We'll also review ten can't-miss technology stories that somehow went bad. Next month, look for a list of ten trends for the next ten years.

First of all, however, it's important to thank some of the many people who have helped make this ten-year run possible. Jamie Taylor, since before issue 1, and John Parkinson since soon thereafter have served as my go-to technical tutors. Christina Winquist and Dan Stevens from Capgemini, along with the ever-helpful John Parkinson, helped fill in the gaps in my archive as I reread the entire run this summer. My former assistant Lesley Livingstone helped assemble an earlier archive and kept the issues of that era carefully posted; Heather Weikel, my current assistant, is doing those jobs now. My Capgemini research colleagues, particularly Tim Simcoe (now a professor at the University of Toronto), Geoff Cohen, and Karina Funk, guest-wrote columns, tracked down obscure but valuable facts, and saved me from errors of many sorts. Andy Mulholland, Lanny Cohen, Stew Bloom, and John Parkinson delivered executive air cover, market observations, and sage advice. Finally, Lawrence Baxter has been the most visible of a very small number of readers who have been on the list from issue 1, but thanks go, in the end, to the many readers around the world who have found the newsletter useful, told their colleagues, and kept me honest.

And now to the list: Ten breakthroughs that have become mainstream since 1997, in no implied order, and not of equal magnitude.

1) Distributed infrastructure
The power of the personal computer and its associated hardware has given millions of individuals and small businesses the ability to perform tasks that not long ago required technical skills and expensive capital goods. The list of newly technically sophisticated establishments is getting longer every year. Initially, compact discs could be broken apart and recombined, much like mix tapes but at higher quality, so home CD burners supplemented commercial pressing plants: some record chains in 2001 estimated they sold one recordable blank CD for every four music titles. At the same time, prices for audio and video production facilities are now falling from hundreds of thousands of dollars into the nearly free category: last week I bought Apple's iLife software, which includes a reasonably powerful video editing and DVD authoring platform, for $39 at academic discount. Millions of YouTube videos are being made outside a/v production houses. Whether for travel agencies with their formerly prized ticket printers, recording studios, photo labs, or printing of various kinds, the capital base is becoming lighter and cheaper. Capabilities are being distributed at the edge of the network rather than consolidating as they used to. In short, if someone wants to make a demo (or production) music disc, produce a TV commercial (Heinz recently asked for exactly this, paying over $50,000 to a contest winner), manipulate a color image, print a book, broadcast an editorial, print a boarding pass, or create an animated short, he or she can likely find an inexpensive desktop production environment.

2) Offshoring
In 1997, the Year 2000 bug was beginning to be addressed. As volumes of code rewrites climbed, several firms discovered the excellent quality and low prices offered by Indian firms in particular. After the turn of the century, several astute businesspeople began repositioning the offshore firms from code remediators to code writers, architects, and business process outsourcers. At the same time, India's heritage of English-language education helped drive call center business in much the same way. By 2005, it was impossible to find any sizeable services company or software company that had not moved aggressively into India. The industry will never be the same: whether the low-cost producer of the moment is the Philippines, China, Vietnam, Estonia, Portugal, or someplace else, services-labor arbitrage, made possible in large measure by the Internet and voice over IP, has become perhaps the dominant factor in tech-sector economics.

3) Always-on People
The phrase is Chris Shipley's, but the phenomenon is widely observed: countless newspaper articles have focused on the etiquette of checking your mobile message device away from the office, whether at home (one guy ducked into his closet), out socializing, or in business meetings. That the RIM device is so often called the "Crackberry" gives some sense of the addictiveness in play, but the phenomenon is as broad as it is intense: in April of this year, RIM broke the 8 million subscriber barrier, and millions of GSM phones allow their owners to maintain seamless global connectivity. In 1997, by contrast, text pagers were in their earliest stages, only plumbers and doctors had beepers, and world phones were strictly a niche luxury. Now, whole negotiations are carried out in motion, with little regard for time or place. For millions of managers, the notion of being “out of the office” is almost quaint, and the line between work and personal time is blurrier than ever before.

4) Architectures of Participation
The phrase is, I believe, Tim O'Reilly's. The Internet has allowed entirely new kinds of social groups to identify themselves, assemble, mobilize, and persist. Whether it's Linux and the associated Internet infrastructure tools and environments, Wikipedia, the social networking businesses, or user feedback currencies at Craigslist, eBay, or Amazon, we are seeing the voices of identifiable individuals connected to much larger assemblages to build fashion, trust, and, sometimes, insight. In addition, one in four eligible Americans (and many ineligible Americans as well) uses an online dating service, of which there are now over 1,000. According to one measure, the average MySpace account-holder had 347 "friends," which raises the question of what indeed a friend is as opposed to an acknowledged network contact. In such settings, opting out is known as "Facebook suicide," suggesting that we are also witnessing the emergence of new architectures of exclusion.

5) The Telephonic Inversion
Despite (or perhaps because of) being some of the oldest tech firms on earth, telecommunications companies have had a tumultuous decade. Customers are defecting from landline service at staggering rates: according to the Telecommunications Industry Association, U.S. landline subscriptions declined by over 20 million in the five years to 2005, and perhaps another 10 million since then. But 2005 was the year U.S. wireless subscriptions surpassed wireline -- and on the global scale, this is pretty late. Technical developments such as dense wave division multiplexing made infrastructure investments in fiber optics stretch farther, and new revenue sources -- particularly texting and ringtones -- helped offset the wireline decline. Any way you slice it, however, the telecom business model of 2007 is upside down from what it was a decade ago as mobility surpasses fixed connections, data traffic outpaces analog (goodbye fax machines), and perhaps the most troubling competitor -- Skype and its 200 million users of nearly free international calling -- is itself a major headache to eBay, which has yet to monetize its original $2.6 billion investment.

6) The Digital Home
According to the U.S. Consumer Electronics Association, DVD players went from zero in March of 1997 to 132 million a decade later, in roughly 100 million households. Broadband penetration (using an admittedly generous definition of the term) went from zero to 84% of connecting U.S. households in that same period. HDTV penetration stood at roughly 25% in 2006 and is estimated to reach 50% in 2008. Five years after launching, iPods can be found in one in five US households. Digital video recorders, which hadn't been invented in 1997, are estimated by Jupiter to be in one third of US homes by sometime next year. Digital cameras were estimated by IDC to reach 70% market penetration in 2007. Roughly 10% of U.S. households have a wireless data network. Taken together, the uptake of all these new technologies represents a wholesale reinvention of the entertainment platform in just a few years.

7) Search
Remember Lycos? It began as a research project at Carnegie Mellon in 1994, went through an IPO in 1997, and was sold in 2000 to Terra Networks, an arm of the Spanish phone company Telefonica, for $5.4 billion. Four years later, Terra sold Lycos to the Korean firm Daum Communications for $95 million - less than 2% of the purchase price. What about AltaVista? Originally a research project inside Digital Equipment, it was for a moment the troubled company's most powerful brand, making it logical to extend the search engine's name to . . . firewalls and other products. After DEC was sold to Compaq, CMGI (remember them?) bought AltaVista for $2.3 billion. AltaVista was subsequently sold to Overture, and then Overture was bought by Yahoo. Prime mover Louis Monier remains a force in the industry, recently having left eBay to join Google.

The rapid growth in the scale of the web presented new challenges to the search companies, making Google's PageRank and related algorithms particularly valuable: rather than focusing on text-matching, Messrs. Page and Brin looked at the structure of networked documents, cracking the problem in an elegant and, from a subsequent advertising-centric perspective, extremely profitable form. In the meantime, advances in image, geospatial, video, and domain-specific search continue to push both the state of the art and the potential for new business models.
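For readers curious about the mechanics, the core intuition can be sketched in a few lines of Python: a page's score is the probability that a "random surfer" lands there, following an outbound link most of the time and jumping to a random page otherwise. This is a toy illustration of the published PageRank idea, not Google's production system, and the three-page "web" is invented for the example:

```python
def pagerank(links, d=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to.

    Returns a dict of scores that sum to 1.0. d is the damping factor:
    the chance the surfer follows a link rather than jumping at random.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with a uniform guess
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # random-jump share
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)  # split rank over out-links
                for q in outs:
                    new[q] += d * share
            else:
                for q in pages:              # dangling page: spread evenly
                    new[q] += d * rank[p] / n
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(web)
# C collects links from both A and B, so it ends up ranked highest
```

The elegance is that nothing here inspects the text of a page at all; authority flows purely through the link structure, which is exactly the shift away from text-matching described above. Real deployments compute the same fixed point over billions of pages with sparse-matrix methods.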

8) Mapping
In 1996, GM introduced the OnStar navigation and assistance service in high-end models. The division has yet to drive significant revenues for the parent company, but there's no question that GPS and related technologies have exploded in the intervening decade. The widespread use of Google Earth in television is one indicator of the underlying trend, as is the fact that the top two sites ranked by traffic (Yahoo then Google), as well as #4 Microsoft and #13 The Weather Channel rely heavily on interactive mapping. Handheld GPS units are doubling in sales every year, in North America anyway, to an expected total of five million this year. As the technology is integrated into mobile phones, the social networking market is expected to drive far wider adoption. Google's Dodgeball and other capabilities, numerous startups, and the telecom carriers are expected to deliver applications linking "who," "where," and "when." A powerful indication of this tendency came earlier this month when Nokia bought Navteq, the "Intel inside" of many online mapping applications, for $8.1 billion.

9) Peer-to-Peer
It's impossible to envision what the 2007 Internet would look like without peer-to-peer file distribution. While the business model disruption of the music and telecommunications industries has been significant, the sheer volume and velocity of information in motion (much of it admittedly of the copyrighted variety) staggers the imagination. A recent Siemens patent application claimed that 50 to 80% of all Internet traffic is handled by p2p arrangements. Starting in 1999 with Napster and Gnutella, continuing through Kazaa and BitTorrent, and now with Morpheus, BearShare, Skype, Joost, and dozens of others, it's clear that these services are a permanent part of the landscape.

10) Networked Pestilence
Not all the developments have been improvements. Spam was certainly with us in 1997, entering as it did the Oxford English Dictionary in 1998, but the volumes have skyrocketed: according to the IEEE, spam increased 100,000% between 1997 and 2004, and recent trends, including the remote enlistment of so-called "zombie" computers, are raising the total to the point where legitimate e-mail could be only 5% of total traffic. Phishing is a newer blight, but potentially more profitable; the potential for identity theft is higher as well. Data breaches have been well cataloged, ranging from a local government agency that prints personally identifiable information in directories, to the 47 million names exposed in the break-in through one TJX store's wireless network, to lost data backup tapes, to the infamous (and unencrypted) 26 million records lost on a Veterans Administration laptop.

It's clear that the last ten years have been a time of momentous change, but it's also sobering to see what hasn't happened: we have no cure for AIDS or malaria, commuting times get longer rather than shorter, incarceration is up, bridges and other critical infrastructure are decaying, air travel is in many ways quantitatively and qualitatively worse. Before we look ahead to the next ten years, in the second October letter we'll look at some of the biggest busts of the past decade.

Sunday, September 30, 2007

Early Indications September 2007 - Web 2.0 and the Enterprise: Beneath the Surface

As managers of enterprise computing environments confront both perennial and emerging challenges, a new set of technologies is complicating the situation. While so-called Web 2.0 was born of such consumer-driven sites as Wikipedia, YouTube, and various blogs and blog-related efforts, a growing number of observers and participants is arguing for the utility of Web 2.0 principles and tools in workplace computing. At the end of the day, the question is more subtle than it may appear at first glance.

Rather than hedge with the standard "it depends" conclusion, I believe that the various tools will prove to reinforce existing competitive advantages rather than confer new ones. That is, the cultural attributes necessary for successful Web 2.0 behavior are in and of themselves powerful differentiators, and the tools will amplify either the presence or absence of such traits as accountability, openness, receptiveness to change, sensitivity to customer needs and preferences, and the like.

The term and concept of "enterprise 2.0" appear to have originated with Harvard Business School professor Andrew McAfee, most explicitly in a Sloan Management Review article from this past spring. He argues that "the new technologies are significant because they can potentially knit together an enterprise and facilitate knowledge work in ways that were simply not possible previously." (p. 22; citation below) Specifically, McAfee points to search, links, "authoring" (blogs and wikis), tags, "extensions" (algorithmic extrapolation), and "signals" (mostly RSS) as the primary enabling technologies.

On its face, much of the argument seems straightforward and even exciting: having the ability to develop nuggets of business functionality quickly, from the edge of the organization inward, presents a stark contrast to many software development efforts. Being able to identify the right people with relevant skills and knowledge in minutes makes many document-centric "knowledge repositories" feel frustratingly ill-conceived. Assuming that experts on a subject would voluntarily articulate their expertise and create metadata would have been naive only a few years ago.

In the right situation, any of the above behaviors may, in McAfee's word, "emerge" as the result of bottom-up self-organization and effort rather than the mandated top-down kind. But emergence is a very tricky business -- the sciences of understanding its sources, implications, and results are still immature. Let's look at a few complicating factors that could stand between certain flavors of corporate reality and the ideal of enterprise 2.0.

-Of Computation and Communications

Corporate IS organizations have traditionally been responsible for the electronic automation of business tasks and processes: order entry, accounts receivable, warranty service, and more recently customer contact management and new product development. In contrast, web 2.0 technologies don't automate much; they facilitate richer, sometimes better organized and more widely distributed, communications. The first complication comes as IS organizations look at the conventional questions that have surrounded application development: what is the ROI, what are the payoff metrics, where is the audit trail, who will manage access and permissions? More simply, issues of control show up almost immediately, as the need to specify goals, metrics, and chains of responsibility encounters notions of wide participation, of distributed authority, and of "shoot first, aim later (if at all)."

-Of Signals and Noise

The core assumptions of web 2.0 -- that users own the content they create, and that said content is of interest to someone else in a long tail of taste and proclivities -- have led to a veritable explosion of original and republished (in a variety of forms) content: whether as a MySpace profile, a YouTube video, a self-published movie review or political rant, or a wiki entry, content is everywhere. The larger problem of editing remains an issue even at "formal" publications, but it's intensified in a workplace where people may not have the same ability to opt out, and who, at 5:00 pm or whenever, really want to go home with more rather than fewer tasks completed. The incessant blurring of personal and work time, and of personal and business modes of behavior, is playing out vividly in the Web 2.0/Enterprise 2.0 debate. As long as the tools for publishing and distribution develop faster than the tools for managing and filtering, web 2.0 has the potential for unpalatable signal-to-noise ratios, particularly with captive or semi-captive audiences.


This emphasis on communication is already having dramatic effects, according to 40- and 50-something peers of mine, particularly in knowledge-driven industries such as advertising, accounting, and consulting. I frequently see generational differences working with university students, but from the reports of many colleagues, the sharp differences in communications platforms across generations are radically complicating the task of management. It's not unheard-of for senior executives to have admins print off their e-mails, and voicemail remains the medium of choice in some firms. At the other demographic extreme, e-mail is often disregarded in favor of some combination of twitter, text messaging, PC-based instant messaging, and social-network message tools.

People who grew up with a web-centric social sensibility often communicate rather more freely than their elders (or regulators, in some cases) would prefer. Enterprise IS has the unenviable task of logging all material communications, and sometimes of turning off some of the most powerful web 2.0 exemplars. The aforementioned middle-aged managers, meanwhile, must communicate across an increasingly wide variety of technologies, each with its own particularities of convenience, cultural norms, interoperability, and security and privacy. Add to this cultural dynamic the technical incompatibilities among communications tools. It feels a bit like the days of CompuServe vs. Prodigy: my Facebook message won't cross over to your MySpace page. Being a contact on LinkedIn doesn't mean I can see you on Spoke.

-What's the platform, Kenneth?

Once upon a time, a phone was a phone and a computer was a computer -- even when it connected to phone lines. Then phones went mobile, but it was still easy to tell a StarTAC from a ThinkPad. These days, however, gaming devices, smart phones, ultra-mobile PCs, and other hybrid devices have blurred the old easy distinctions. The iPhone is a computer, no question, but it is neither marketed nor used like a PC. 200 million Skype users have proven powerfully that voice is just another data type over the network. More in Asia than in North America, the mobile phone is a television "set" -- even the old words are antiquated. In the enterprise setting, this proliferation and polymorphism of devices combines with the content explosion and the communication imperative to create unprecedented complexity: complexity for users of various tools and platforms, complexity for application specification, complexity for network design and security officers.

The many costs of these multiple layers of complexity begin to illustrate how web 2.0 tools can, in the wrong setting, extract far more than they contribute. Flame wars provide an accessible case in point: even though there may be wisdom in crowds (whether through various forms of voting, prediction markets - which McAfee doesn't mention - or simply an unexpected discovery of domain expertise), there will be far more instances of threadjacking, name-calling, bad information, and other forms of noise.

At the same time, in the right organization, web 2.0 tools can enhance existing forms of positive dialogue. Given the technologies' emphasis on communication, for example, the contradiction between operations and marketing might be creatively discussed and addressed. Why does marketing so highly value (and expensively pursue) depth and duration of customer interaction while call centers are designed and run to minimize the company's contact with precisely the people marketing is struggling to reach? In such fluid, indeterminate situations, McAfee's characterization of "emergent collaboration" may indeed be realized.

So the question comes down not to "are web 2.0 technologies applicable to enterprise IT?" but rather "in what kinds of cultures and in the context of what kinds of business processes can wikis, tags, blogs, and their associated tools make a difference?" That is, once we shift the focus of inquiry from the technologies to the locus of their deployment, the believers and doubters can both begin assembling the relevant evidence for what promises to be a long, strange experiment and discussion.

Andrew P. McAfee, "Enterprise 2.0: The Dawn of Emergent Collaboration," Sloan Management Review 47:3, 21-28.

Friday, August 24, 2007

August 2007 Early Indications: China's Changing Role in the Tech Sector

At base, technological change and globalization cannot be cleanly
distinguished, and thus will be interlinked for the foreseeable
future. The shipping container is arguably one of the five great
breakthroughs of the twentieth century. Cellular telephony's
revolution of participation, the impact of voice over IP on
international calling, offshore call centers and code factories, price
transparency, and many more facts of global life originated in a lab
or startup.

Given that China's rapid growth and wide impact have become
essentially synonymous with globalization, it makes sense to examine
the current state of the tech industry relative to this awakening
giant. Worldwide interest in the question has been on the upswing,
prompted by two developments: the acquisition by Lenovo of IBM's PC
operation, and Apple's reasonably prominent branding of the iPod's
Chinese manufacturing. More recently, the UK's Mail on Sunday
newspaper ran a critical story in August 2006 on the Chinese factories
from which the devices originate. Since then, attention has been
focused on wages, working conditions, and the business models behind
the influx of Chinese-made devices and components. (A bibliography
appears at the end.)

James Fallows, who writes for The Atlantic Monthly, recently reported
from Shenzhen, the port city home to the contract manufacturing
factory linked to Apple. One theme that reappears throughout the
article is Fallows' amazement at the scale of Chinese activity:

-The port of Shenzhen and Hong Kong (only about 30 miles away)
dispatched 40 million cargo containers, or the equivalent of one per
second, in a calendar year. (The U.S. exports that return to China in
those containers consist primarily of scrap paper and scrap metal,
along with empty containers.)

-Shenzhen is a planned city that 25 years ago was a fishing town of
maybe 75,000 citizens. It is now bigger than New York, having grown
100-fold, or more, in 25 years.

-At the Foxconn manufacturing plant, a vast number of employees work
12-hour shifts turning out all manner of electronic goods: the precise
number is not made public or perhaps known, but estimates range
between 200,000 and 300,000 people, many of them young women from the
countryside who have migrated to the factory. The facility serves
150,000 lunches per day.

At the macro level, accounts of the impact of Chinese exports on the
global economy appear to be mostly anecdotal, and that impact is
probably overstated. In
selected markets, however, China's combination of low wages and
manufacturing scale has driven prices lower in much of the rest of the
world. A famous example is bicycles, but for our purpose, the low
prices of many advanced items -- including cell phones, laptop
computers, cameras, some medical devices, and electronics equipment in
general -- derive in part from China's impact on the industry. That
is, the availability of such items as Motorola Razrs for (apparently)
free and laptop computers for $500 and potentially $100 owes as much
to China's economics as it does to Dell's direct business model or
Moore's law.

The companies driving this transition are, for the most part, not
household names. The electronics manufacturing services (EMS)
industry, formerly known as contract manufacturing, is itself only
about ten to fifteen years old, but growing about 20% per year. In
the mid-1990s, Nokia, Cisco, Sony, and other major brands began
exiting the manufacturing business, leaving it in the hands of such
companies as Solectron, Flextronics, and Jabil.

The largest current EMS, Hon Hai Precision, is the parent of Foxconn.
It is expected to grow from $40 billion to $54 billion in revenues
this year after having grown 44% in 2006. The founder, Terry Gou, is
a native of Taiwan worth $10 billion, according to the Wall Street
Journal; he does not appear on Forbes Magazine's list of the world's
richest people, where he would rank in the top 65. Hon Hai, a
publicly traded company, is China's largest exporter.

As EMS companies seek to increase margins and avoid commoditization,
they take on more upfront work, moving toward becoming so-called
Original Design Manufacturers (ODM). A quick quiz: what do Quanta,
Compal, Inventec, Wistron, and Asustek do? According to Fallows, they
collectively account for 90% of global production of laptop computers;
at one factory, he saw machines from three different major brands
coming off the same assembly line. As a quick check of these
companies' websites illustrates, many laptops we might associate with
HP, Dell, or other major brands began life in one of these Asian
firms, which broadly speaking are higher in the food chain than EMS providers.

The final step up the margin ladder is for a manufacturer to design,
make, and label its own offerings for market, as an Original Brand
Manufacturer (OBM). Brand is in fact a major story at Lenovo,
formerly the Chinese Legend PC firm, which bought the IBM business in
2005. The company's marketing is focusing heavily on sporting events,
with Olympic sponsorship at both the Turin and Beijing games.
Lenovo's story is fascinating: the CEO, Bill Amelio, is an American
with a karate black belt, hired away from Dell, while the chairman,
Yang Yuanqing, is Chinese. The company's ownership is split among
public shareholders (35%), the state-run Chinese Academy of Sciences
(the original investor in Legend at 27%), employees, IBM, and private
equity firms. Lenovo sells in 66 countries and recently announced
plans to open factories in India and Mexico, the better to shorten
supply chains and thus accelerate inventory flow.

Lenovo's headquarters moved from Beijing to Raleigh, NC shortly after
the IBM transaction, but Amelio lives in Singapore. The culture of
the company is in flux as Chinese managers take courses in directness
and accountability and IBM, Legend, and Dell habits are sorted out.
The legacy IBM business, meanwhile, is being upgraded with investments
in IT, R&D (moved increasingly to China from the U.S.), and supply
chain. With 8.3% global market share, the company ranks #3 worldwide
in PC shipments, barely ahead of Acer and lagging HP (19.3%) and Dell
(16.1%). Competition is intense: Dell recently invested $16 billion in
one year in Chinese capacity, more than Lenovo's entire revenues.
Lenovo has responded by cutting costs, including a layoff of 1,400
employees announced earlier this year, and by reinventing its channel
model outside China.

While the whole world is watching to see how Lenovo fares as China's
first global brand, another company from the other side of the ocean
is trying to create a hybrid Chinese-American firm. 3Com has had a
wild ride in its nearly 30 years of existence. After being co-founded
by Ethernet inventor Bob Metcalfe in 1979, the company sold a variety
of networking equipment including interface cards, and attempted
several consumer plays including USRobotics (modems) and its
subsidiary Palm Computing that were later spun out, as well as the
Kerbango Internet radio that never came to market and the Audrey
Internet appliance, which lasted less than a year.

In 2003 3Com formed a joint venture with Huawei, now an $8 billion
company of 62,000 employees that sells networking gear primarily to
telecom operators. Earlier this year, 3Com bought back Huawei's stake
in the JV, now known as H3C, for $882 million. Total headcount in the
company is now heavily weighted toward Asia (5,000, mostly in China)
with about 1,200 employees still in the U.S. The company now enjoys a
similar R&D situation to Lenovo, in that engineers are about 1/5 as
expensive in China as in the U.S., so investment can go a lot farther.
3Com also will encounter some of the cultural issues that slowed
Lenovo after the IBM acquisition, but like Lenovo gained global scale
via a trans-Pacific deal.

So what's the overall picture? Software creation is generally a
non-issue, except domestically, where the Baidu search engine has had
some success and Lenovo has introduced some functionality specific to
the home market. Chinese firms have proven they can build electronics
to order, and build from original designs in certain segments.
Quality control and material provenance remain problematic. Lenovo
has proven it can sell lots of PCs in its home market and that
spending lots of money can build a brand. Unlike India, China has not
produced a generation of globally prominent managers and executives,
with the exception of Lenovo's Yang Yuanqing.

The dominant business model of China's role in the global technology
industry, however, is probably still represented by a man James
Fallows calls "Mr. China," an Irishman named Liam Casey. Casey runs
PCH China Solutions, a firm built up from Casey's personally-acquired
Rolodex of factory locations, contract outcomes, manufacturing
capabilities, roads, and many other factors. If someone needs a
widget built, Casey is likely to know who can build it, who has
capacity, who can supply appropriate materials, and how much it should
cost. For outsiders entering the country, as they are in droves, such
knowledge can be found only with informed intermediaries like Casey;
as Fallows notes, "foreigners don't know where to start or whom to
deal with in the chaos of small, indistinguishable firms."

The rapid growth, corruption, and lack of supply chain transparency
have led to predictable consequences, as when Mattel could not name
its suppliers of tainted toys until long after lead was discovered.
Pollution, the classic externality, is fast becoming a front-burner
issue, and could play a dramatic role in the Beijing Olympics.
Working conditions don't measure up to western standards, but at the
same time, China's industrialization has alleviated severe issues of
rural poverty. Furthermore, the process is probably safer and more
humane than what weavers experienced in Manchester, spinners
encountered in the Carolinas, or early auto workers endured in Flint.
To some extent, comparing historical examples of industrial
misery is an apples-and-oranges exercise, but it serves to remind us
that any judgment of these conditions is relative, and for better and
worse, Chinese factory workers are generally better off than they were
on a farm. The various winners and losers remain to be fully sorted
out, but China's emergence will continue to reshape many aspects of
the global order.

"Bold fusion; Face value," The Economist, Feb. 17, 2007, p. 74.

Steve Hamm and Dexter Roberts, "China's First Global Capitalist,"
Business Week, Dec. 11, 2006.

James Fallows, "China makes, the world takes," Atlantic Monthly,
July-August 2007, p. 48.

Jane Spencer, "Lenovo Looks to Expand Global Reach," Wall Street
Journal, July 27, 2007, p. B4.

Jason Dean, "The Forbidden City of Terry Gou," Wall Street Journal,
August 11, 2007, p. A1.

"The stark reality of iPod's Chinese factories," The Mail on Sunday,
August 18, 2006.

Bruce Einhorn, "The Tech Dragon Stumbles," Business Week, May 17, 2007, p. 44.

Friday, July 27, 2007

July 2007 Early Indications: From Programming to Programming

In the past ten to fifteen years, many barriers between traditional
industries have broken down. We're in the early stages of another
big, blurry brawl, but to set some context, here are a few examples
and data points:

-Entertainment and computing now overlap in many significant ways.
According to Nielsen, Americans between 8 and 34 spend more time
gaming than watching television. Globally, computer gaming has become
about a $30 billion industry, compared to worldwide box office
receipts of about $26 billion in 2006, which was an all-time record.

-Telecommunications and media are battling into each other's
territory. Cable television and voice providers, as of 1990, were
separate and distinct. By 2000, cable providers led telecoms, in
North America anyway, in Internet access. Starting early in the
decade, the cable providers have gathered significant share
(approaching 30% in some markets) of the wireline voice market, but
DSL has gained back some share in Internet access. In the next five
years, look for telecoms to provide television content; Verizon has
announced FiOS1, a "hyperlocal" channel in the Washington, D.C. area.
A recent study by Motorola found that 45% of Europeans, led by 59% of
the French, watch some TV over the Internet. Cellular telephony is
shaping up as the next media platform: Japanese phones routinely
include television tuners already, and growth is expected to be rapid
in many areas of Asia.

-Retail has been redefined along several dimensions. The U.K. grocery
chain Tesco punches far above its weight in petrol sales: with only
4.3% of the retail locations, it has captured over 12% of the market,
lagging only BP, which controls 16.5% of the market but has three
times as many locations. Here in the U.S., it's hard to believe that
Wal-Mart expanded from general merchandise into the grocery business
less than 10 years ago, but it controls at least 20% of a highly
fragmented market. eBay, which began as a secondary market, now also
includes many new, branded goods from established sellers.

Our focus today, however, is on a different "invasion" of an adjoining
market. Ten years ago, when investors were looking for "the next
Microsoft," they held certain assumptions about what a highly
successful software company looked like:

-Winners choose the right platform, picking well according to the
market size and share of the hardware on which the software, of
whatever sort, runs. IBM's OS/2 operating system in the early 1990s
had some technical advantages over Windows, but IBM never established
the application layer which would make its OS competitive.

-Winners develop mechanisms for user lock-in and network effects.
Word processing programs stand as an obvious example, where switching
is hard and expensive, and it makes sense to be on the same product as
all of your co-workers.

-Winners manage upgrade cycles efficiently: that locked-in user base
will eventually have to buy the new, improved version, delivering a
major revenue infusion to the software seller and perhaps the wider ecosystem.

-Winners sell software to large customer bases one consumer or one
business at a time. This reality of the market implies effective
management of brand, retail channels, and enterprise sales forces.

-Winners care about software functionality and performance; data as it
is generated or managed by the application falls out of scope.

-Winners hire strong technical teams because functionality is
specified early in the new product development cycle and must be
hard-coded into the package.

-Winners think of the world in rows and columns. Whether in
calendaring, spreadsheets, databases, project management (swim lanes),
presentation graphics, or customer contact management, most programs
of the 1985-2000 period deeply embedded a grid metaphor and/or a relational data model.

Times have changed. Whether one looks at Google, a clear challenger
to Microsoft's dominance, or at the new crop of companies all seeking
to ride the "web 2.0" bandwagon, many of these assumptions about
software no longer hold true. For example, in a survey of 25 startups
to watch compiled by Business 2.0, fully 20 had revenue models at
least partly based on advertising. Greatness in software now requires many of the old-world skills and positioning, plus a healthy dose of new elements as well.

To set some context, look at some familiar companies listed by market
capitalization and price/earnings ratio as of 19 July:

Company       Cap     P/E

Microsoft     301B    23
Oracle        105B    25
SAP            67B    26
Disney         68B    16
Time Warner    78B    13

Google        171B    48
Yahoo          35B    51

Ten years ago, anyone looking for "the next Microsoft" probably would
not have looked to Viacom or Disney as models. And for good reason:
the role of "pushed" content is itself in transition. Yet the core of
the media model -- the packaging of audiences for sale to advertisers
-- is fueling growth at Google, presenting both technical and cultural
challenges at Yahoo, and the source of deep concern among Microsoft's
top leadership. The changing of the guard is further emphasized by
Microsoft's experience with the most recent exemplar of old-school
software, its Vista operating system. The product shipped three years
late, with a stripped-down feature set, and effectively cost several
senior executives their jobs: Brian Valentine, Jim Allchin, and, to a
degree, Bill Gates. It also has yet to sell in large numbers, in part
because enterprise buyers are waiting for the first updated release,
when many of the first-run glitches will be addressed.

What are the emerging dynamics for software dominance? Compared to
the standards for success circa 1997, a few factors have been inverted
while most still hold true, with a twist.

Rather than developing for Unix, Windows, Mac OS, Symbian, set-top
boxes, and a variety of other operating systems, Google and Amazon
have led the way toward development of services for the Internet as a
platform. Among other things, this stance greatly simplifies product
distribution: the differences between Google Maps and my 1998 version
of Rand McNally's Windows package are striking. Every time a new road
is built, or interstate exits renamed, or a pedestrian mall built,
millions of CDs become obsolete. Google (or NavTeq or whoever) makes
one change to the base map and every subsequent query is answered with accurate information. Getting the platform right still matters, but the definition of the term is changing from local to virtual, solitary to distributed, and product to environment.
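The map example can be sketched as a toy model (all class names and data below are hypothetical, chosen only for illustration): with shrinkwrapped software, every installed copy carries its own frozen data, while a hosted service applies one server-side change that every subsequent query sees.

```python
# Toy contrast between shrinkwrap data and a hosted service (hypothetical names).

class ShrinkwrapAtlas:
    """Each buyer gets a private copy of the map data, frozen at purchase."""
    def __init__(self, base_map):
        self.data = dict(base_map)  # private copy on the buyer's CD

    def lookup(self, road):
        return self.data.get(road, "unknown road")


class HostedAtlas:
    """All users query the same server-side map."""
    data = {}

    @classmethod
    def update(cls, road, status):
        cls.data[road] = status  # one change, visible to every later query

    @classmethod
    def lookup(cls, road):
        return cls.data.get(road, "unknown road")


base = {"I-80 exit 24": "open"}
cd_copy_1 = ShrinkwrapAtlas(base)
cd_copy_2 = ShrinkwrapAtlas(base)
HostedAtlas.data = dict(base)

# A new road is built: the hosted service reflects it everywhere at once,
# while every shipped CD silently becomes obsolete.
HostedAtlas.update("Main St extension", "open")
print(cd_copy_1.lookup("Main St extension"))   # unknown road
print(HostedAtlas.lookup("Main St extension"))  # open
```

The same asymmetry applies to distribution: the hosted version never needs to be shipped, installed, or patched on the client at all.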

This aspect still concerns financial analysts, particularly because
switching costs can be so low. If I change from Yahoo Finance to,
say, Fidelity's investor workbench, apart from my investment in the
old interface, there's very little to restrain me from leaving. Tim
O'Reilly, who helped formulate the very notion of Web 2.0, asserts
that users own their data in these sorts of scenarios, but the
exceptions to his assertion prove that Web 2.0 is hardly the last
word. My eBay reputational currency, iTunes preferences, and Hotmail
account are neither open nor portable -- by design.

Network Effects
There's no question that successful software still exploits network
effects. The more developers who code to a given platform --
Facebook, Salesforce, or Google Maps -- the more authority that
standard gains; note that none of those businesses counts as merely a
website. One of the platform pioneers illustrates the point
perfectly: Amazon just noted in its earnings conference call
that it has 265,000 developers signed up to use its web services.
There are also powerful network effects among users, whether at eBay,
MySpace, or BitTorrent: the more people who use the service, the more
valuable it becomes. Compare that one fact to consumer products,
banking, automobiles, or pharmaceuticals, and we are reminded how
significantly online dynamics depart from those of the widget business
or even most of the service sector.
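The newsletter doesn't cite a formula, but user-side network effects of this kind are often formalized (for example, as Metcalfe's law, an assumption I'm adding here, not something the author invokes) by counting the potential pairwise connections among users, which grow roughly with the square of the user base. A minimal sketch:

```python
def potential_links(n):
    """Distinct user pairs possible in a network of n members: n choose 2."""
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the potential connections,
# which is why each new user makes the service more valuable to the rest.
print(potential_links(1_000))  # 499500
print(potential_links(2_000))  # 1999000
```

Real services realize only a fraction of these links, so the count is an upper bound on connection value, not a measurement.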

Upgrade (and therefore revenue) Cycles
No longer is the objective to leverage a large installed base onto a
new version of the product. Google makes money every hour of every
day, and apart from acquisitions, we don't expect spikes in its
revenues. Indeed, the escape from the cyclicality of product upgrade
cycles may not yet be fully appreciated as analysts assess the new
breed of software companies. The dependence of shrinkwrap software
companies on secondary revenue streams may become problematic: Larry
Ellison noted in an interview with FT last year that Oracle was
getting 90% margins on maintenance. Customers can't be, and aren't,
happy with those economics, so it is likely only a matter of time
until competition and/or customer resistance change the model. To what, nobody can say.

Selling Software as a Product, One at a Time
On July 19, Google reported quarterly revenues of $3.87 billion, a
year-over-year improvement of 58%. Did its sales force grow by 60%
in a year? I highly doubt it. Although the company offers a few
software products a customer can purchase, they amount to mere drops
in that $15 billion annual bucket: enterprise search hardware and
software, hosted applications, GIS tools. An important facet of the
(lowercase) software as a service trend is that in an increasing
number of cases, users don't have the software on their own devices,
but access a server, its location irrelevant, to get something done.
As a result, the customer base (of advertisers) is dramatically
smaller than the user base, delivering favorable sales force
performance metrics.

Accordingly, software distribution channels are being completely
reinvented: the old goal used to be to get your product onto a shelf
and/or catalog page at Computer City, Egghead, or Micro Warehouse.
Note that all of those businesses are defunct, another indication of
deeper change in the industry. In a related development that sheds
further light on a complicated situation, PC Magazine subscriptions
have dropped from 6.1 million in 2003 to 4.8 million.

People Buy Features and Performance
There's a wonderful video that embodies this thinking perfectly: enter
"microsoft ipod" into the YouTube search bar. Microsoft apparently
produced this spoof internally, illustrating the trend toward "speeds
and feeds" in stark contrast to Apple's aura and powerful design
sense. Just run down the standard old-school software questions in
regard to Hotmail or Mapquest:

-What is the recommended processor?
-How much free disk space is required?
-What is the minimum memory required?
-How many transactions per second can the application handle?
-How fast can the application render/calculate/save/etc.?

The very mention of these former performance criteria in regard to the
most successful "applications" of our time highlights the
discontinuity between where we are and where we were. It's critically
important to note that the path from Lotus Organizer or the original
Encarta to Basecamp or Wikipedia involved a step-function change
rather than evolutionary progression.

Hire the Best Technical Team
There's no question that high-caliber architects and developers
matter. Look at the arms race among Microsoft, Amazon, Google, and
Yahoo to hire the giants of the industry: Gordon Bell, Brian Valentine
(see above), Adam Bosworth, and Larry Tesler, respectively, only begin
a very long list. But the outside-in dynamic of user-generated
content also allows such sites as Grouper (now Crackle)
to thrive. In these kinds of businesses it's certainly imperative to
get top-flight operations and data-center professionals, no question,
but these folks are of a different breed compared to the breakthrough
innovators of the caliber mentioned above.

Quality is Built from the Inside Out
This area is tricky. Certainly the core application functionality and
engineering need to be built into the base architecture, as eBay
discovered a few years back. But no longer is the internal team the
only resource: many of the best businesses balance internal and
external talent, Amazon being exhibit A. In contrast, efforts built
on pure volunteer collaboration, such as the Chandler PIM and Mozilla
browser, have been outpaced by commercial ventures. It's also worth
reiterating that Apple runs a very closed shop very successfully: the
iPod and iPhone feel antithetical to the Web 2.0 mantra. It would
appear that in this regard, as in many others, several successful paths remain available.

Rows and Columns
While I don't want to oversimplify and assert that value has migrated
from nodes to links, the fact remains that the structure of business,
personal connections, and information is looking much more like a
spider web than a library card catalog. As scholarship from Rob Cross
at Virginia and others has illustrated, informal networks of personal
contacts, once exposed, often explain a corporation better than the
explicit titles and responsibilities. More recently, Mark Anderson at
Strategic News Service has connected some of the dots around Google
and Apple, at both the board level and elsewhere, contending that an
ecosystem is taking shape to challenge Microsoft. At the engineering
level, the very concept of social networking behind Twitter, Flickr,
and the Dodgeball startup scooped up by Google represents a departure
from a conventional relational database mentality.
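To make the contrast concrete, here is a minimal sketch (names and data are hypothetical) of a social graph stored as adjacency sets rather than rows and columns. A link-centric query such as "friends of friends," which would require repeated self-joins against a flat table, falls out in a few lines:

```python
# Hypothetical social graph as adjacency sets -- links, not rows and columns.
graph = {
    "ann":   {"bob", "carol"},
    "bob":   {"ann", "dave"},
    "carol": {"ann"},
    "dave":  {"bob"},
}

def friends_of_friends(person):
    """People reachable in exactly two hops, excluding the person and direct friends."""
    direct = graph.get(person, set())
    two_hop = set()
    for friend in direct:
        two_hop |= graph.get(friend, set())
    return two_hop - direct - {person}

print(sorted(friends_of_friends("ann")))  # ['dave']
```

The same traversal generalizes to longer paths, which is exactly the kind of query a grid-shaped schema handles awkwardly.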

Calling this a trend would be premature, but the corporate
architectures at Microsoft, Google, and Apple mirror their varying
approaches to the market. Apple's share price includes a healthy
dose of respect for the management ability of Steve Jobs, in that
particular context, to both envision and execute. Conversely, the
achievement of Google, with the jury out on the model's staying power,
may lie in leadership's balancing of individual brilliance at
different layers of the hierarchy with financially realistic corporate
objectives. Finally, Microsoft appears to be working hard to define
an emerging management model as the founding generation hands off to
new COO Kevin Turner (from Wal-Mart) and CTO Ray Ozzie, long ago at Lotus.

While it certainly includes a substantial element of buzzword-mania,
the shift from rows and columns to graphs -- whether in software
architecture (cf. Metaweb), business model (Facebook), or management
structure (Linux still matters here) -- merits watching for several
reasons. First, the combination of cheap and (remotely available)
processing, effectively infinite online storage, and functionality
tuned to these realities means that graphs are required to handle the
sheer scale of available data. Second, the ability to map and model
networks allows their structures to be better understood and utilized.

Finally, social groups get larger than could be managed in an
unconnected world -- according to a recent survey of 18,000 people
conducted by Nickelodeon, MTV, and Microsoft, "Globally, the average
young person connected to digital technology has 94 phone numbers in
his or her mobile phone, 78 people on a messenger buddy list and 86
people in his or her social networking community." This requires both
new ways to understand social connections and tools with which to
manage them. To underscore this shift, the North Carolina Attorney
General announced earlier this week that MySpace just ceased hosting
pages for 29,000 known sex offenders.

Taken together, these tendencies are reshaping the software business:
programming (as in putting content together) has joined programming
(as in coding) as a core competency for many kinds of businesses that
fall in the gaps between computing and media. The fusion also shakes
up conventional media, as we have noted earlier. The purely
push-based media model, used to advertise things primarily for largely
unmeasurable brand impact (unmeasurable at the level of the ad,
particularly), is being challenged by viewers and readers who want
more participation in both the experience (what used to be called
consumption) and the process (formerly known as publishing or content
creation). The YouTube-CNN debates feel to some extent like a
gimmick, but they appear to be a harbinger. As blogs, social
networks, and professional content get further jumbled, as Rupert
Murdoch seems to be intent on doing, the business models of media,
software, gaming, and transport will continue to feel the effects.

Wednesday, June 27, 2007

Early Indications June 2007: Miles Davis, CEO?

As technologies, cultural attitudes, demographics, and economics change, people have both the opportunity and the need to reinvent organizational models. When industrialization drew farmers into cities and factories, the military provided a convenient reference: the army of labor was directed by captains of industry. Symphony orchestras provided another authoritarian model. As the corporation matured, it invented its own characteristics. Henry Ford fathered process-centric division of labor with his refinements to the assembly line, while Alfred Sloan pioneered many organizational and financial practices, such as divisions (another military offshoot?) and ROI, that made the corporation the model for other entities, such as schools, foundations, and some sports teams.

Today's business environment presents new challenges to old models. A long list of factors combine to reshape work and organization:

-prosperity (Maslow's hierarchy of needs)
-the shift from manufacturing to services
-the rise of intangible forms of value such as brand and intellectual property
-global markets for risk
-urban congestion and telecommuting
-safety and security considerations
-China's resource hunger
-the unique nature of software as an invisible asset
-increased monetization of data
-the Internet and its associated technologies such as e-mail
-mobility, particularly the impact of cellular and other wireless data networks
-global enterprise software packages
-work-family issues that followed mass entry of women into universities and the workforce
-problem-solving vs. assembly-line routinization
-shorter product life- and use-cycles
-offshoring and outsourcing
-widespread cultural resistance to positional authority
-intensity of task and knowledge specialization
-mass air travel

and many more. As Erik Brynjolfsson recently noted in Sloan Management Review (spring 2007, p. 55), we need to rethink the very nature of firms, beginning with Ronald Coase's famous theory: "The traditionally sharp distinction between markets and firms is giving way to a multiplicity of different kinds of organizational forms that don't necessarily have those sharp boundaries."

Given the uncertainty and rapid change implied by this list, it's no surprise that academics and other management thinkers have focused on improvisation. Rather than looking at the everyday sense of the word having to do with makeshift or ad hoc solutions, however, these theorists see considerable structure in musical and dramatic improvisation. One researcher went so far as to live with Chicago's Second City comedy troupe to investigate these structures, but our focus here will be on jazz. (A particularly valuable resource can be found in the September-October 1998 issue of Organization Science devoted to jazz and many of its organizational implications and parallels.)

According to Kathleen Eisenhardt of Stanford (in "Strategic Decisions and All That Jazz," Business Strategy Review 8 (3), 1-3), improvisation involves both intense communication between players in real time and a small number of well-understood rules within which improvising is performed. The practice is not a matter of the soloist "making it up as he goes along," but something much richer and more collectively created. Paul Berliner, whose 1994 book Thinking in Jazz is a milestone, goes even further:

[T]he popular definitions of improvisation that emphasize only its spontaneous, intuitive nature -- characterizing it as the 'making of something out of nothing' -- are astonishingly incomplete. This simplistic understanding of improvisation belies the discipline and experience on which improvisers depend, and it obscures the actual practices and processes that engage them. (p. 492, quoted in Weick, "Improvisation as a Mindset," in the Organization Science volume noted above, p. 544)

To give some indication of just how complex the practice of improvisation can be, the organizational scholar Karl Weick explains that it in fact exists on a continuum, with the progression of different techniques implying "increased demands on imagination and concentration." The continuum runs from interpretation, the simplest form, through embellishment and then variation, all the way to full improvisation, which implies time pressure and a lack of precomposition. Thinking about the organizational equivalents of these techniques is a compelling but highly imprecise exercise. (Weick pp. 544-545)

Perhaps because it evolved in parallel with the information age, jazz appears to be well suited to collaborative work by impermanent teams of skilled workers. It is also more applicable to performance than to decision-making: few great quartets or quintets have been democratic, and many leaders of bands large and small have been solitary, poor, nasty, brutish, or short, to borrow from Thomas Hobbes. Improvisation found little place in the classic big bands of Goodman or Ellington. More recently, until his death James Brown fined band members, many of whom were truly A-list musicians, in mid-performance for breaking his rules.

So improvisation in and of itself does not solve the organizational dilemma of managing real-time knowledge work. Michael Gold, who lectures on the intersection of jazz and business after having been both a bassist and a banker, posits an acronym - APRIL - to denote the five traits that carry over:

The members of a jazz ensemble possess and practice a set of shared behaviors that we call the Five Dynamics of Jazz.

* Autonomy -- self-governing, self-regulating, adaptable and independent - yet in support of (and interdependent with) the larger organism.

* Passion -- the quality of emotional vibrancy, zest, commitment, and energy to pursue excellence and the course one believes to be true.

* Risk -- the ability to take chances and explore new territory and methods in pursuit of shared goals, and the ability to support others in their explorations.

* Innovation -- the skill to invent, recombine, and create new solutions to problems using either old or new forms, methods, and/or resources.

* Listening -- the ability to truly hear and feel the communication of passion, meaning, and rhythms of others.

Gold's Five Dynamics are useful but not sufficient, and raise operational questions presumably addressed in his lectures: how do good managers channel both passion and the need to show up on time? Innovation is of course vital, but how do the other members of his quartet know what to do when the improvised bass solo is over?

Another jazz player/business speaker (and a classmate of Gold's) has combined his education and work as a drummer with lessons from jobs in consulting and startups to present a potentially more rigorous view. Operating from his home bases in Norway and Boston, Carl Stormer has been addressing banks, consulting firms, telecom companies, and CPG firms on the topic of "Cracking the Jazzcode." The presentation itself, which I have not yet seen, is innovative in both structure and message.

Stormer begins with a brief welcome, then proceeds to play drums in a band of three or four players who have never before performed as an ensemble (every performance is different). These are high-grade professionals: Cameron Brown has played bass for Archie Shepp, Art Blakey, Joe Lovano, and Dewey Redman. Saxophonist Rob Scheps has recorded with John Scofield, Carla Bley, and Steve Swallow. Guitarists Jon Herington and Georg Wadenius have both toured with Steely Dan.

So the musicianship is top-shelf. What can managers learn? Stormer has developed a rich set of insights. First among these is the notion of instruments: improvisation is key to jazz, but does not in and of itself define the genre. What functions does each instrument perform at what time? In other words, why don't we hear trios of drummers or quartets of saxophones? What are the rules for passing a solo? What are the responsibilities of the horn player during the guitar solo? Instruments have different roles in an ensemble, roles that ensure that players don't have to fight for the same functions. (Conversely, when functions overlap, as with a guitar and piano, players must work out who leaves room for whom.) In addition, the ownership of instruments ensures that players match their skills with their task.

While improvisation may look individual, jazz is inherently made by groups. What are the elements that define an ensemble? Why are sextets more than twice as difficult to manage and play in as trios? How do groups communicate? Why don't quartets have teambuilding exercises? Why can the Jazzcode band of the moment work effectively without rehearsal?

The Jazzcode lecture also includes important ideas about shared cultural references: if my tenor solo quotes from "Round Midnight," the drummer will do a better job faster if he can pick up on the source of the riff. If the band gets a request not everyone knows, what happens? What is the score from which a group plays? What are the differences between notes on paper and music in performance, and what do they tell us about business processes?

Many other thought-starters emerge in Stormer's conversation. What are the benefits of deepening your competence on your own instrument vs. cross-training on other instruments, most notably piano? For all the emphasis on improvisation and traded soloing, why do arrangers play such an important role in certain ensembles? What are the payoffs of increased competence on my instrument: do I get more solos, will better musicians want to play with me, will I make more money? To that end, how should I practice: improving on my weak points or developing deeper insights into my favorite techniques and songs?

I don't want to give away Stormer's trade secrets, but jazz -- as a music and not just as a vague concept thought to involve chaos and unscripted soloing -- is rich with business implications. In short, I believe there may well be a Jazzcode for business and that if there is, Carl Stormer is uniquely positioned to discern and explain it. Furthermore, the emerging business and technology climate will only amplify the wisdom of his approach.