Sunday, September 30, 2007

Early Indications September 2007 - Web 2.0 and the Enterprise: Beneath the Surface

As managers of enterprise computing environments confront both perennial and emerging challenges, a new set of technologies is complicating the situation. While so-called web 2.0 was born of such consumer-driven sites as Wikipedia, del.icio.us, YouTube, and various blogs and blog-related efforts, a growing number of observers and participants are arguing for the utility of Web 2.0 principles and tools in workplace computing. At the end of the day, the question is more subtle than it may appear at first glance.

Rather than hedge with the standard "it depends" conclusion, I believe that the various tools will prove to reinforce existing competitive advantages rather than confer new ones. That is, the cultural attributes necessary for successful Web 2.0 behavior are in and of themselves powerful differentiators, and the tools will amplify either the presence or absence of such traits as accountability, openness, receptiveness to change, sensitivity to customer needs and preferences, and the like.

The term and concept of "enterprise 2.0" appear to have originated with Harvard Business School professor Andrew McAfee, most explicitly in a Sloan Management Review article from the spring of 2006. He argues that "the new technologies are significant because they can potentially knit together an enterprise and facilitate knowledge work in ways that were simply not possible previously." (p. 22; citation below) Specifically, McAfee points to search, links, "authoring" (blogs and wikis), tags, "extensions" (algorithmic extrapolation), and "signals" (mostly RSS) as the primary enabling technologies.

On its face, much of the argument seems straightforward and even exciting: having the ability to develop nuggets of business functionality quickly, from the edge of the organization inward, presents a stark contrast to many software development efforts. Being able to identify the right people with relevant skills and knowledge in minutes makes many document-centric "knowledge repositories" feel frustratingly ill-conceived. Assuming that experts on a subject would voluntarily articulate their expertise and create metadata would have been naive only a few years ago.

In the right situation, any of the above behaviors may, in McAfee's word, "emerge" as the result of bottom-up self-organization and effort rather than the mandated top-down kind. But emergence is a very tricky business -- the science of understanding its sources, implications, and results is still immature. Let's look at a few complicating factors that could stand between certain flavors of corporate reality and the ideal of enterprise 2.0.

-Of Computation and Communications

Corporate IS organizations have traditionally been responsible for the electronic automation of business tasks and processes: order entry, accounts receivable, warranty service, and more recently customer contact management and new product development. In contrast, web 2.0 technologies don't automate much; they facilitate richer, sometimes better organized and more widely distributed, communications. The first complication comes as IS organizations look at conventional questions that have surrounded application development: what is the ROI, what are the payoff metrics, where is the audit trail, who will manage access and permissions? More simply, issues of control show up almost immediately, as the need to specify goals, metrics, and chains of responsibility encounters notions of wide participation, of distributed authority, and of "shoot first, aim later (if at all)."

-Of Signals and Noise

The core assumptions of web 2.0 -- that users own the content they create, and that said content is of interest to someone else in a long tail of taste and proclivities -- have led to a veritable explosion of original and republished (in a variety of forms) content: whether as a MySpace profile, a YouTube video, a self-published movie review or political rant, or a wiki entry, content is everywhere. The larger problem of editing remains an issue even at "formal" publications, but it's intensified in a workplace where people may not have the same ability to opt out, and where, at 5:00 pm or whenever, they really want to go home with more rather than fewer tasks completed. The incessant blurring of personal and work time, and of personal and business modes of behavior, is playing out vividly in the Web 2.0/Enterprise 2.0 debate. As long as the tools for publishing and distribution develop faster than the tools for managing and filtering, web 2.0 has the potential for unpalatable signal-to-noise ratios, particularly with captive or semi-captive audiences.

-Generationality

This emphasis on communication is already having dramatic effects, according to 40- and 50-something peers of mine, particularly in knowledge-driven industries such as advertising, accounting, and consulting. I frequently see generational differences working with university students, but from the reports of many colleagues, the sharp differences in communications platforms across generations are radically complicating the task of management. It's not unheard of for senior executives to have admins print off their e-mails, and voicemail remains the medium of choice in some firms. At the other demographic extreme, e-mail is often disregarded in favor of some combination of Twitter, text messaging, PC-based instant messaging, and social-network message tools.

People who grew up with a web-centric social sensibility often communicate rather more freely than their elders (or regulators, in some cases) would prefer. Enterprise IS has the unenviable task of logging all material communications, and sometimes of turning off some of the most powerful web 2.0 exemplars. The aforementioned middle-aged managers, meanwhile, must communicate across an increasingly wide variety of technologies, each with particularities of convenience, cultural norms, interoperability, and security and privacy. Add to this cultural dynamic the technical incompatibilities among communications tools. It feels a bit like the days of CompuServe vs. Prodigy: my Facebook message won't cross over to your MySpace page. Being a contact on LinkedIn doesn't mean I can see you on Spoke.

-What's the platform, Kenneth?

Once upon a time, a phone was a phone and a computer was a computer -- even when it connected to phone lines. Then phones went mobile, but it was still easy to tell a StarTAC from a ThinkPad. These days, however, gaming devices, smart phones, ultra-mobile PCs, and other hybrid devices have blurred the old easy distinctions. The iPhone is a computer, no question, but is neither marketed nor used like a PC. Skype's 200 million users have powerfully proven that voice is just another data type over the network. More in Asia than in North America, the mobile phone is a television "set" -- even the old words are antiquated. In the enterprise setting, this proliferation and polymorphism of devices combines with the content explosion and communication imperative to create unprecedented complexity: complexity for users of various tools and platforms, complexity for application specification, complexity for network design and security officers.


***
The many costs of these multiple layers of complexity begin to illustrate how web 2.0 tools can, in the wrong setting, extract far more than they contribute. Flame wars provide an accessible case in point: even though there may be wisdom in crowds (whether through various forms of voting, prediction markets -- which McAfee doesn't mention -- or simply an unexpected discovery of domain expertise), there will be far more instances of threadjacking, name-calling, bad information, and other forms of noise.

At the same time, in the right organization, web 2.0 tools can enhance existing forms of positive dialogue. Given the technologies' emphasis on communication, for example, the contradiction between operations and marketing might be creatively discussed and addressed. Why does marketing so highly value (and expensively pursue) depth and duration of customer interaction while call centers are designed and run to minimize the company's contact with precisely the people marketing is struggling to reach? In such fluid, indeterminate situations, McAfee's characterization of "emergent collaboration" may indeed be realized.

So the question comes down not to "are web 2.0 technologies applicable to enterprise IT?" but rather "in what kinds of cultures and in the context of what kinds of business processes can wikis, tags, blogs, and their associated tools make a difference?" That is, once we shift the focus of inquiry from the technologies to the locus of their deployment, the believers and doubters can both begin assembling the relevant evidence for what promises to be a long, strange experiment and discussion.

Andrew P. McAfee, "Enterprise 2.0: The Dawn of Emergent Collaboration," Sloan Management Review 47:3 (Spring 2006), 21-28.
http://sloanreview.mit.edu/smr/issue/2006/spring/06/

Friday, August 24, 2007

August 2007 Early Indications: China's Changing Role in the Tech Sector

At base, technological change and globalization cannot be cleanly
distinguished, and thus will be interlinked for the foreseeable
future. The shipping container is arguably one of the five great
breakthroughs of the twentieth century. Cellular telephony's
revolution of participation, the impact of voice over IP on
international calling, offshore call centers and code factories, price
transparency, and many more facts of global life originated in a lab
or startup.

Given that China's rapid growth and wide impact have become
essentially synonymous with globalization, it makes sense to examine
the current state of the tech industry relative to this awakening
giant. Worldwide interest in the question has been on the upswing,
prompted by two developments: the acquisition by Lenovo of IBM's PC
operation, and Apple's reasonably prominent branding of the iPod's
Chinese manufacturing. More recently, the UK's Mail on Sunday
newspaper ran a critical story in June 2006 on the Chinese factories
from which the devices originate. Since then, attention has been
focused on wages, working conditions, and the business models behind
the influx of Chinese-made devices and components. (A bibliography
appears at the end.)

James Fallows, who writes for The Atlantic Monthly, recently reported
from Shenzhen, the port city home to the contract manufacturing
factory linked to Apple. One theme that reappears throughout the
article is Fallows' amazement at the scale of Chinese activity:

-The ports of Shenzhen and Hong Kong (only about 30 miles apart)
dispatched 40 million cargo containers, or more than one per second,
in a calendar year. (The U.S. exports that return to China in
those containers consist primarily of scrap paper and scrap metal,
along with empty containers.)

-Shenzhen is a planned city that 25 years ago was a fishing town of
maybe 75,000 citizens. It is now bigger than New York, having grown
100-fold, or more, in 25 years.

-At the Foxconn manufacturing plant, a vast number of employees work
12-hour shifts turning out all manner of electronic goods: the precise
number is not made public or perhaps known, but estimates range
between 200,000 and 300,000 people, many of them young women from the
countryside who have migrated to the factory. The facility serves
150,000 lunches per day.

At the macro level, evidence for the impact of Chinese exports on the
global economy appears to be mostly anecdotal, and that impact is
probably overstated. In selected markets, however, China's
combination of low wages and manufacturing scale has driven prices
lower in much of the rest of the world. A famous example is bicycles,
but for our purposes, the low
prices of many advanced items -- including cell phones, laptop
computers, cameras, some medical devices, and electronics equipment in
general -- derive in part from China's impact on the industry. That
is, the availability of such items as Motorola Razrs for (apparently)
free and laptop computers for $500 and potentially $100 owes as much
to China's economics as it does to Dell's direct business model or
Moore's law.

The companies driving this transition are, for the most part, not
household names. The electronics manufacturing services (EMS)
industry, formerly known as contract manufacturing, is itself only
about ten to fifteen years old, but growing about 20% per year. In
the mid-1990s, Nokia, Cisco, Sony, and other major brands began
exiting the manufacturing business, leaving it in the hands of such
companies as Solectron, Flextronics, and Jabil.

The largest current EMS provider, Hon Hai Precision, is the parent of Foxconn.
It is expected to grow from $40 billion to $54 billion in revenues
this year after having grown 44% in 2006. The founder, Terry Gou, is
a native of Taiwan worth $10 billion, according to the Wall Street
Journal; he does not appear on Forbes Magazine's list of the world's
richest people, where he would rank in the top 65. Hon Hai, a
publicly traded company, is China's largest exporter.

As EMS companies seek to increase margins and avoid commoditization,
they take on more upfront work, moving toward becoming so-called
Original Design Manufacturers (ODM). A quick quiz: what do Quanta,
Compal, Inventec, Wistron, and Asustek do? According to Fallows, they
collectively account for 90% of global production of laptop computers;
at one factory, he saw machines from three different major brands
coming off the same assembly line. As a quick check of these
companies' websites illustrates, many laptops we might associate with
HP, Dell, or other major brands began life in one of these Asian
firms, which broadly speaking are higher in the food chain than EMS
companies.

The final step up the margin ladder is for a manufacturer to design,
make, and label its own offerings for market, as an Original Brand
Manufacturer (OBM). Brand is in fact a major story at Lenovo,
formerly the Chinese Legend PC firm, which bought the IBM business in
2005. The company's marketing is focusing heavily on sporting events,
with Olympic sponsorship at both the Turin and Beijing games.
Lenovo's story is fascinating: the CEO, Bill Amelio, is an American
with a Karate black belt hired away from Dell, while the chairman,
Yang Yuanqing, is Chinese. The company's ownership is split among
public shareholders (35%), the state-run Chinese Academy of Sciences
(the original investor in Legend at 27%), employees, IBM, and private
equity firms. Lenovo sells in 66 countries and recently announced
plans to open factories in India and Mexico, the better to shorten
supply chains and thus accelerate inventory flow.

Lenovo's headquarters moved from Beijing to Raleigh, NC shortly after
the IBM transaction, but Amelio lives in Singapore. The culture of
the company is in flux as Chinese managers take courses in directness
and accountability and IBM, Legend, and Dell habits are sorted out.
The legacy IBM business, meanwhile, is being upgraded with investments
in IT, R&D (moved increasingly to China from the U.S.), and supply
chain. With 8.3% global market share, the company ranks #3 worldwide
in PC shipments, barely ahead of Acer and lagging HP (19.3%) and Dell
(16.1%). Competition is intense: Dell recently invested $16 billion in
one year in Chinese capacity, more than Lenovo's entire revenues.
Lenovo has responded by cutting costs, including a layoff of 1,400
employees announced earlier this year, and by reinventing its channel
model outside China.

While the whole world is watching to see how Lenovo fares as China's
first global brand, another company from the other side of the ocean
is trying to create a hybrid Chinese-American firm. 3Com has had a
wild ride in its nearly 30 years of existence. Co-founded by Ethernet
inventor Bob Metcalfe in 1979, the company sold a variety of
networking equipment, including interface cards, and attempted
several consumer plays: USRobotics (modems) and its subsidiary Palm
Computing, both later spun off; the Kerbango Internet radio, which
never came to market; and the Audrey Internet appliance, which lasted
less than a year.

In 2003 3Com formed a joint venture with Huawei, now an $8 billion
company of 62,000 employees that sells networking gear primarily to
telecom operators. Earlier this year, 3Com bought back Huawei's stake
in the JV, now known as H3C, for $882 million. Total headcount in the
company is now heavily weighted toward Asia (5,000, mostly in China)
with about 1,200 employees still in the U.S. The company now enjoys a
similar R&D situation to Lenovo, in that engineers are about 1/5 as
expensive in China as in the U.S., so investment can go a lot further.
3Com will also encounter some of the cultural issues that slowed
Lenovo after the IBM acquisition, but like Lenovo it has gained global
scale via a trans-Pacific deal.

So what's the overall picture? Software creation is generally a
non-issue, except domestically, where the Baidu search engine has had
some success and Lenovo has introduced some functionality specific to
the home market. Chinese firms have proven they can build electronics
to order, and build from original designs in certain segments.
Quality control and material provenance remain problematic. Lenovo
has proven it can sell lots of PCs in its home market and that
spending lots of money can build a brand. Unlike India, China has not
produced a generation of globally prominent managers and executives,
with the exception of Lenovo's Yang Yuanqing.

The dominant business model of China's role in the global technology
industry, however, is probably still represented by a man James
Fallows calls "Mr. China," an Irishman named Liam Casey. Casey runs
PCH China Solutions, a firm built up from Casey's personally-acquired
Rolodex of factory locations, contract outcomes, manufacturing
capabilities, roads, and many other factors. If someone needs a
widget built, Casey is likely to know who can build it, who has
capacity, who can supply appropriate materials, and how much it should
cost. For outsiders entering the country, as they are in droves, such
knowledge can be found only with informed intermediaries like Casey;
as Fallows notes, "foreigners don't know where to start or whom to
deal with in the chaos of small, indistinguishable firms."

The rapid growth, corruption, and lack of supply chain transparency
have led to predictable consequences, as when Mattel could not name
its suppliers of tainted toys until long after lead was discovered.
Pollution, the classic externality, is fast becoming a front-burner
issue, and could play a dramatic role in the Beijing Olympics.
Working conditions don't measure up to western standards, but at the
same time, China's industrialization has alleviated severe issues of
rural poverty. Furthermore, the process is probably safer and more
humane than what weavers experienced in Manchester, spinners
encountered in the Carolinas, or early auto workers endured in Flint.
To some extent, comparing historical examples of industrial
misery is an apples-and-oranges exercise, but it serves to remind us
that any judgment of these conditions is relative, and for better and
worse, Chinese factory workers are generally better off than they were
on a farm. The various winners and losers remain to be fully sorted
out, but China's emergence will continue to reshape many aspects of
the global order.

"Bold fusion; Face value," The Economist, Feb. 17, 2007, p. 74.

Steve Hamm and Dexter Roberts, "China's First Global Capitalist,"
Business Week, Dec. 11, 2006.

James Fallows, "China makes, the world takes," Atlantic Monthly,
July-August 2007, p. 48.

Jane Spencer, "Lenovo Looks to Expand Global Reach," Wall Street
Journal, July 27, 2007, p. B4.

Jason Dean, "The Forbidden City of Terry Gou," Wall Street Journal,
August 11, 2007, p. A1.

"The stark reality of iPod's Chinese factories," The Mail on Sunday,
August 18, 2006.

Bruce Einhorn, "The Tech Dragon Stumbles," Business Week, May 17, 2007, p. 44.

Friday, July 27, 2007

July 2007 Early Indications: From Programming to Programming

In the past ten to fifteen years, many barriers between traditional
industries have broken down. We're in the early stages of another
big, blurry brawl, but to set some context, here are a few examples
and data points:

-Entertainment and computing now overlap in many significant ways.
According to Nielsen, Americans between 8 and 34 spend more time
gaming than watching television. Globally, computer gaming has become
about a $30 billion industry, compared to worldwide box office
receipts of about $26 billion in 2006, which was an all-time record.

-Telecommunications and media are battling into each other's
territory. Cable television and voice providers, as of 1990, were
separate and distinct. By 2000, cable providers led telecoms, in
North America anyway, in Internet access. Starting early in the
decade, the cable providers have gathered significant share
(approaching 30% in some markets) of the wireline voice market, but
DSL has gained back some share in Internet access. In the next five
years, look for telecoms to provide television content; Verizon has
announced FiOS1, a "hyper-local" channel in the Washington, D.C. area.
A recent study by Motorola found that 45% of Europeans, led by 59% of
the French, watch some TV over the Internet. Cellular telephony is
shaping up as the next media platform: Japanese phones routinely
include television tuners already, and growth is expected to be rapid
in many areas of Asia.

-Retail has been redefined along several dimensions. The U.K. grocery
chain Tesco punches far above its weight in petrol sales: with only
4.3% of the retail locations, it has captured over 12% of the market,
lagging only BP, which controls 16.5% of the market but has three
times as many locations. Here in the U.S., it's hard to believe that
Wal-Mart expanded from general merchandise into the grocery business
less than 10 years ago, but it controls at least 20% of a highly
fragmented market. eBay, which began as a secondary market, now also
includes many new, branded goods from established sellers.

Our focus today, however, is on a different "invasion" of an adjoining
market. Ten years ago, when investors were looking for "the next
Microsoft," they held certain assumptions about what a highly
successful software company looked like:

-Winners choose the right platform, picking well according to the
market size and share of the hardware on which the software, of
whatever sort, runs. IBM's OS/2 operating system in the early 1990s
had some technical advantages over Windows, but IBM never established
the application layer that would have made its OS competitive.

-Winners develop mechanisms for user lock-in and network effects.
Word processing programs stand as an obvious example, where switching
is hard and expensive, and it makes sense to be on the same product as
all of your co-workers.

-Winners manage upgrade cycles efficiently: that locked-in user base
will eventually have to buy the new, improved version, delivering a
major revenue infusion to the software seller and perhaps the wider
ecosystem.

-Winners sell software to large customer bases one consumer or one
business at a time. This reality of the market implies effective
management of brand, retail channels, and enterprise sales forces.

-Winners care about software functionality and performance; data as it
is generated or managed by the application falls out of scope.

-Winners hire strong technical teams because functionality is
specified early in the new product development cycle and must be
hard-coded into the package.

-Winners think of the world in rows and columns. Whether in
calendaring, spreadsheets, databases, project management (swim lanes),
presentation graphics, or customer contact management, most programs
of the 1985-2000 period deeply embedded a grid metaphor and/or
architecture.

Times have changed. Whether one looks at Google, a clear challenger
to Microsoft's dominance, or at the new crop of companies all seeking
to ride the "web 2.0" bandwagon, many of these assumptions about
software no longer hold true. For example, in a survey of 25 startups
to watch compiled by Business 2.0, fully 20 had revenue models at
least partly based on advertising. Greatness in software now requires a lot of the old world skills and positioning, plus a healthy dose of some new elements as well.

To set some context, look at some familiar companies listed by market
capitalization and price/earnings ratio as of 19 July:

Company         Cap    P/E

Microsoft       301B    23
Oracle          105B    25
SAP              67B    26
Disney           68B    16
Time Warner      78B    13

Google          171B    48
Yahoo            35B    51

Ten years ago, anyone looking for "the next Microsoft" probably would
not have looked to Viacom or Disney as models. And for good reason:
the role of "pushed" content is itself in transition. Yet the core of
the media model -- the packaging of audiences for sale to advertisers
-- is fueling growth at Google, presenting both technical and cultural
challenges at Yahoo, and causing deep concern among Microsoft's top
leadership. The changing of the guard is further emphasized by
Microsoft's experience with the most recent exemplar of old-school
software, its Vista operating system. The product shipped three years
late, with a stripped-down feature set, and effectively cost several
senior executives their jobs: Brian Valentine, Jim Allchin, and, to a
degree, Bill Gates. It also has yet to sell in large numbers, in part
because enterprise buyers are waiting for the first updated release,
when many of the first-run glitches will be addressed.

What are the emerging dynamics for software dominance? Compared to
the standards for success circa 1997, a few factors have been inverted
while most still hold true, with a twist.

Platforms
Rather than developing for Unix, Windows, Mac OS, Symbian, set-top
boxes, and a variety of other operating systems, Google and Amazon
have led the way toward development of services for the Internet as a
platform. Among other things, this stance greatly simplifies product
distribution: the differences between Google Maps and my 1998 version
of Rand McNally's Windows package are striking. Every time a new road
is built, an interstate exit is renamed, or a pedestrian mall goes in,
millions of CDs become obsolete. Google (or NavTeq or whoever) makes
one change to the base map, and every subsequent query is answered
with accurate information. Getting the platform right still matters,
but the definition of the term is changing from local to virtual,
solitary to distributed, and product to environment.
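
To make the contrast concrete, here is a minimal sketch in Python --
with invented place names, and emphatically not Google's or Rand
McNally's actual interface -- of the difference between data frozen
onto each customer's CD at press time and data held once, centrally,
by a service:

# A packaged map: the data is frozen onto each customer's CD at press time.
cd_map_1998 = {"Exit 12": "Old Mill Road"}          # hypothetical entry

def lookup_on_cd(exit_name):
    return cd_map_1998.get(exit_name, "unknown")    # stale until a new CD ships

# A hosted map service: one authoritative copy, updated in place.
service_map = {"Exit 12": "Old Mill Road"}

def rename_exit(exit_name, new_label):
    service_map[exit_name] = new_label              # one change, visible to everyone

def lookup_via_service(exit_name):
    return service_map.get(exit_name, "unknown")    # every query sees the update

rename_exit("Exit 12", "Old Mill Road / Commerce Parkway")
print(lookup_on_cd("Exit 12"))        # Old Mill Road (obsolete)
print(lookup_via_service("Exit 12"))  # Old Mill Road / Commerce Parkway

The packaged copy is stale the moment the world changes; the hosted
copy is corrected once, for everyone.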

Lock-in
This aspect still concerns financial analysts, particularly because
switching costs can be so low. If I change from Yahoo Finance to,
say, Fidelity's investor workbench, apart from my investment in the
old interface, there's very little to restrain me from leaving. Tim
O'Reilly, who helped formulate the very notion of Web 2.0, asserts
that users own their data in these sorts of scenarios, but the
exceptions to his assertion prove that Web 2.0 is hardly the last
word. My eBay reputational currency, iTunes preferences, and Hotmail
account are neither open nor portable -- by design.

Network Effects
There's no question that successful software still exploits network
effects. The more developers who code to a given platform --
Facebook, Salesforce, or Google Maps -- the more that standard gains
authority: note that none of those businesses counts as a mere
website. One of the platform pioneers illustrates the point
perfectly: Amazon just noted in its earnings conference call
that it has 265,000 developers signed up to use its web services.
There are also powerful network effects among users, whether at eBay,
MySpace, or BitTorrent: the more people who use the service, the more
valuable it becomes.
Compare that one fact to consumer products,
banking, automobiles, or pharmaceuticals, and we are reminded how
significantly online dynamics depart from those of the widget business or
even most of the service sector.
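
A back-of-the-envelope way to see why: the number of potential
pairwise connections among n users grows with roughly the square of n
(the familiar Metcalfe-style argument), so each new member adds more
potential links than the last. The Python sketch below simply
tabulates that arithmetic; the user counts are arbitrary.

# Potential pairwise connections among n users: n * (n - 1) / 2.
def potential_connections(n):
    return n * (n - 1) // 2

for users in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{users:>9,} users -> {potential_connections(users):>15,} possible pairs")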

Upgrade (and therefore revenue) Cycles
No longer is the objective to leverage a large installed base onto a
new version of the product. Google makes money every hour of every
day, and apart from acquisitions, we don't expect spikes in its
revenues. Indeed, the escape from the cyclicality of product upgrade
cycles may not yet be fully appreciated as analysts assess the new
breed of software companies. The dependence of shrinkwrap software
companies on secondary revenue streams may become problematic: Larry
Ellison noted in an interview with the Financial Times last year that Oracle was
getting 90% margins on maintenance. Customers can't be, and aren't,
happy with those economics, so it is likely only a matter of time
until competition and/or customer resistance change the model. To what, nobody can say.

Selling Software as a Product, One at a Time
On July 19, Google reported quarterly revenues of $3.87 billion, a
year-over-year improvement of 58%. Did its sales force grow by nearly
60% in a year? I highly doubt it. Although the company offers a few
software products a customer can purchase, they amount to mere drops
in that $15 billion annual bucket: enterprise search hardware and
software, hosted applications, GIS tools. An important facet of the
(lowercase) software as a service trend is that in an increasing
number of cases, users don't have the software on their own devices,
but access a server, its location irrelevant, to get something done.
As a result, the customer base (of advertisers) is dramatically
smaller than the user base, delivering favorable sales force
performance metrics.

Accordingly, software distribution channels are being completely
reinvented: the goal used to be to get your product onto a shelf
and/or catalog page at Computer City, Egghead, or Micro Warehouse.
Note that all of those businesses are defunct, another indication of
deeper change in the industry. In a related development that sheds
further light on a complicated situation, PC Magazine subscriptions
have dropped from 6.1 million in 2003 to 4.8 million.

People Buy Features and Performance
There's a wonderful video that embodies this thinking perfectly: enter
"microsoft ipod" into the YouTube search bar. Microsoft apparently
produced this spoof internally, illustrating the trend toward "speeds
and feeds" in stark contrast to Apple's aura and powerful design
sense. Just run down the standard old-school software questions in
regard to Hotmail or MapQuest:

-What is the recommended processor?
-How much free disk space is required?
-What is the minimum memory required?
-How many transactions per second can the application handle?
-How fast can the application render/calculate/save/etc.?

The very mention of these former performance criteria in regard to the
most successful "applications" of our time highlights the
discontinuity between where we are and where we were. It's critically
important to note that the path from Lotus Organizer or the original
Encarta to Basecamp or Wikipedia involved a step-function change
rather than evolutionary progression.

Hire the Best Technical Team
There's no question that high-caliber architects and developers
matter. Look at the arms race among Microsoft, Amazon, Google, and
Yahoo to hire the giants of the industry: Gordon Bell, Brian Valentine
(see above), Adam Bosworth, and Larry Tesler, respectively, only begin
a very long list. But the outside-in dynamic of user-generated
content also allows such sites as del.icio.us or Grouper (now Crackle)
to thrive. In these kinds of businesses it's certainly imperative to
get top-flight operations and data-center professionals, no question,
but these folks are of a different breed compared to the breakthrough
innovators of the caliber mentioned above.

Quality is Built from the Inside Out
This area is tricky. Certainly the core application functionality and
engineering need to be built into the base architecture, as eBay
discovered a few years back. But no longer is the internal team the
only resource: many of the best businesses balance internal and
external talent, Amazon being exhibit A. In contrast, efforts built
on pure volunteer collaboration, such as the Chandler PIM and Mozilla
browser, have been outpaced by commercial ventures. It's also worth
reiterating that Apple runs a very closed shop very successfully: the
iPod and iPhone feel antithetical to the Web 2.0 mantra. It would
appear that in this regard, as in many others, several successful paths remain available.

Rows and Columns
While I don't want to oversimplify and assert that value has migrated
from nodes to links, the fact remains that the structure of business,
personal connections, and information is looking much more like a
spider web than a library card catalog. As scholarship from Rob Cross
at Virginia and others has illustrated, informal networks of personal
contacts, once exposed, often explain a corporation better than the
explicit titles and responsibilities. More recently, Mark Anderson at
Strategic News Service has connected some of the dots around Google
and Apple, at both the board level and elsewhere, contending that an
ecosystem is taking shape to challenge Microsoft. At the engineering
level, the very concept of social networking behind Twitter, flickr,
and the Dodgeball startup scooped up by Google represents a departure
from a conventional relational database mentality.

Calling this a trend would be premature, but the corporate
architectures at Microsoft, Google, and Apple mirror their varying
approaches to the market. Apple's share price includes a healthy
dose of respect for the management ability of Steve Jobs, in that
particular context, to both envision and execute. Conversely, the
achievement of Google, with the jury out on the model's staying power,
may lie in leadership's balancing of individual brilliance at
different layers of the hierarchy with financially realistic corporate
objectives. Finally, Microsoft appears to be working hard to define
an emerging management model as the founding generation hands off to
new COO Kevin Turner (from Wal-Mart) and CTO Ray Ozzie, long ago at
Lotus.

While it certainly includes a substantial element of buzzword-mania,
the shift from rows and columns to graphs -- whether in software
architecture (cf. Metaweb), business model (Facebook), or management
structure (Linux still matters here) -- merits watching for several
reasons. First, the combination of cheap (and remotely available)
processing, effectively infinite online storage, and functionality
tuned to these realities means that graphs are required to handle the
sheer scale of available data; a short sketch at the end of this
passage makes the contrast with rows and columns concrete. Second,
the ability to map and model networks allows their structures to be
better understood and utilized.

Finally, social groups get larger than could be managed in an
unconnected world -- according to a recent survey of 18,000 people
conducted by Nickelodeon, MTV, and Microsoft, "Globally, the average
young person connected to digital technology has 94 phone numbers in
his or her mobile phone, 78 people on a messenger buddy list and 86
people in his or her social networking community." This requires both
new ways to understand social connections and tools with which to
manage them. To underscore this shift, the North Carolina Attorney
General announced earlier this week that MySpace just ceased hosting
pages for 29,000 known sex offenders.
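
As promised above, here is a minimal sketch in Python, with invented
names, of the difference between a rows-and-columns view of contacts
and a graph view of the same people. It illustrates the data
structures only; it is not a claim about how any particular site is
built.

# Row-and-column view: each contact is an isolated record.
contacts = [
    {"name": "Ana",  "phone": "555-0101"},
    {"name": "Ben",  "phone": "555-0102"},
    {"name": "Caro", "phone": "555-0103"},
]

# Graph view: the links themselves are the data, kept as adjacency sets.
from collections import defaultdict

graph = defaultdict(set)

def connect(a, b):
    graph[a].add(b)
    graph[b].add(a)

connect("Ana", "Ben")
connect("Ben", "Caro")

# "Friends of friends" is awkward to ask of the flat table but is a
# one-liner over the graph.
def friends_of_friends(person):
    return {fof for friend in graph[person]
                for fof in graph[friend]
                if fof != person and fof not in graph[person]}

print(friends_of_friends("Ana"))   # {'Caro'}

Queries that follow links -- friends of friends, shortest paths,
clusters -- fall naturally out of the second representation and only
awkwardly out of the first.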

********
Taken together, these tendencies are reshaping the software business:
programming (as in putting content together) has joined programming
(as in coding) as a core competency for many kinds of businesses that
fall in the gaps between computing and media. The fusion also shakes
up conventional media, as we have noted earlier. The purely
push-based media model, used to advertise things primarily for largely
unmeasurable brand impact (unmeasurable at the level of the ad,
particularly), is being challenged by viewers and readers who want
more participation in both the experience (what used to be called
consumption) and the process (formerly known as publishing or content
creation). The YouTube-CNN debates feel to some extent like a
gimmick, but they appear to be a harbinger. As blogs, social
networks, and professional content get further jumbled (a mixing
Rupert Murdoch seems intent on accelerating), the business models of media,
software, gaming, and transport will continue to feel the effects.

Wednesday, June 27, 2007

Early Indications June 2007: Miles Davis, CEO?

As technologies, cultural attitudes, demographics, and economics change, people have both the opportunity and the need to reinvent organizational models. When industrialization drew farmers into cities and factories, the military provided a convenient reference: the army of labor was directed by captains of industry. Symphony orchestras provided another authoritarian model. As the corporation matured, it invented its own characteristics. Henry Ford fathered process-centric division of labor with his refinements to the assembly line, while Alfred Sloan pioneered many organizational and financial practices, such as divisions (another military offshoot?) and ROI, that made the corporation the model for other entities, such as schools, foundations, and some sports teams.

Today's business environment presents new challenges to old models. A long list of factors combine to reshape work and organization:

-prosperity (Maslow's hierarchy of needs)
-the shift from manufacturing to services
-the rise of intangible forms of value such as brand and intellectual property
-global markets for risk
-urban congestion and telecommuting
-safety and security considerations
-China's resource hunger
-the unique nature of software as an invisible asset
-increased monetization of data
-the Internet and its associated technologies such as e-mail
-mobility, particularly the impact of cellular and other wireless data networks
-global enterprise software packages
-work-family issues that followed mass entry of women into universities and the workforce
-problem-solving vs. assembly-line routinization
-shorter product life- and use-cycles
-offshoring and outsourcing
-widespread cultural resistance to positional authority
-intensity of task and knowledge specialization
-mass air travel

and many more. As Erik Brynjolfsson recently noted in Sloan Management Review (Spring 2007, p. 55), we need to rethink the very nature of firms, beginning with Ronald Coase's famous theory: "The traditionally sharp distinction between markets and firms is giving way to a multiplicity of different kinds of organizational forms that don't necessarily have those sharp boundaries."

Given the uncertainty and rapid change implied by this list, it's no surprise that academics and other management thinkers have focused on improvisation. Rather than looking at the everyday sense of the word having to do with makeshift or ad hoc solutions, however, these theorists see considerable structure in musical and dramatic improvisation. One researcher went so far as to live with Chicago's Second City comedy troupe to investigate these structures, but our focus here will be on jazz. (A particularly valuable resource can be found in the September-October 1998 issue of Organization Science devoted to jazz and many of its organizational implications and parallels.)

According to Kathleen Eisenhardt of Stanford (in "Strategic Decisions and All That Jazz," Business Strategy Review 8 (3), 1–3), improvisation involves both intense communication between players in real time and a small number of well-understood rules within which improvising is performed. The practice is not a matter of the soloist "making it up as he goes along," but something much richer and more collectively created. Paul Berliner, whose 1994 book Thinking in Jazz is a milestone, goes even further:

[T]he popular definitions of improvisation that emphasize only its spontaneous, intuitive nature -- characterizing it as the 'making of something out of nothing' -- are astonishingly incomplete. This simplistic understanding of improvisation belies the discipline and experience on which improvisers depend, and it obscures the actual practices and processes that engage them. (p. 492, quoted in Weick, "Improvisation as a Mindset," in the Organization Science volume noted above, p. 544)

To give some indication of just how complex the practice of improvisation can be, the organizational scholar Karl Weick explains that it in fact exists on a continuum, with the progression of different techniques implying "increased demands on imagination and concentration." To summarize, the simplest form is interpretation, moving through embellishment and then variation, all the way to full improvisation, which implies time pressure and a lack of precomposition. Thinking about the organizational equivalents of these techniques is a compelling but highly imprecise exercise. (Weick pp. 544-545)

Perhaps because it evolved in parallel with the information age, jazz appears to be well suited to collaborative work by impermanent teams of skilled workers. It is also more applicable to performance than to decision-making: few great quartets or quintets have been democratic, and many leaders of bands large and small have been solitary, poor, nasty, brutish, or short, to borrow from Thomas Hobbes. Improvisation found little place in the classic big bands of Goodman or Ellington. More recently, until his death, James Brown fined band members, many of whom were truly A-list musicians, in mid-performance for breaking his rules.

So improvisation in and of itself does not solve the organizational dilemma of managing real-time knowledge work. Michael Gold, who lectures on the intersection of jazz and business after having been both a bassist and a banker, posits an acronym - APRIL - to denote the five traits that carry over:

The members of a jazz ensemble possess and practice a set of shared behaviors that we call the Five Dynamics of Jazz.

* Autonomy -- self-governing, self-regulating, adaptable and independent - yet in support of (and interdependent with) the larger organism.

* Passion -- the quality of emotional vibrancy, zest, commitment, and energy to pursue excellence and the course one believes to be true.

* Risk -- the ability to take chances and explore new territory and methods in pursuit of shared goals, and the ability to support others in their explorations.

* Innovation -- the skill to invent, recombine, and create new solutions to problems using either old or new forms, methods, and/or resources.

* Listening -- the ability to truly hear and feel the communication of passion, meaning, and rhythms of others. (http://www.jazz-impact.com/about.shtml)

Gold's Five Dynamics are useful but not sufficient, and raise operational questions presumably addressed in his lectures: how do good managers channel both passion and the need to show up on time? Innovation is of course vital, but how do the other members of his quartet know what to do when the improvised bass solo is over?

Another jazz player/business speaker (and a classmate of Gold's) has combined his education and work as a drummer with lessons from jobs in consulting and startups to present a potentially more rigorous view. Operating from his home bases in Norway and Boston, Carl Stormer has been addressing banks, consulting firms, telecom companies, and CPG firms on the topic of "Cracking the Jazzcode." The presentation itself, which I have not yet seen, is innovative in both structure and message.

Stormer begins with a brief welcome, then proceeds to play drums in a band of three or four players who have never before performed as an ensemble (every performance is different). These are high-grade professionals: Cameron Brown has played bass for Archie Shepp, Art Blakey, Joe Lovano, and Dewey Redman. Saxophonist Rob Scheps has recorded with John Scofield, Carla Bley, and Steve Swallow. Guitarists Jon Herington and Georg Wadenius have both toured with Steely Dan.

So the musicianship is top-shelf. What can managers learn? Stormer has developed a rich set of insights. First among these is the notion of instruments: improvisation is key to jazz, but does not in and of itself define the genre. What functions do each instrument perform at what time? In other words, why don't we hear trios of drummers or quartets of saxophones? What are the rules for passing a solo? What are the responsibilities of the horn player during the guitar solo? Instruments have different roles in an ensemble, roles that ensure that players don’t have to fight for the same functions. (Conversely, when functions overlap, as with a guitar and piano, players must work out who leaves room for whom.) In addition, the ownership of instruments ensures that players match their skills with their task.

While improvisation may look individual, jazz is inherently made by groups. What are the elements that define an ensemble? Why are sextets more than twice as difficult to manage and play in as trios? How do groups communicate? Why don't quartets have teambuilding exercises? Why can the Jazzcode band of the moment work effectively without rehearsal?

The Jazzcode lecture also includes important ideas about shared cultural references: if my tenor solo quotes from "Round Midnight," the drummer will do a better job faster if he can pick up on the source of the riff. If the band gets a request not everyone knows, what happens? What is the score from which a group plays? What are the differences between notes on paper and music in performance, and what do they tell us about business processes?

Many other thought-starters emerge in Stormer's conversation. What are the benefits of increasing your competence on an instrument vs. cross-training on other instruments, most notably piano? For all the emphasis on improvisation and traded soloing, why is it that arrangers play such an important role in certain ensembles? What are the payoffs of increased competence on my instrument? Do I get more solos, will better musicians want to play with me, will I make more money? To that end, how do I practice: improving on my weak points or developing deeper insights into my favorite techniques and songs?

I don't want to give away Stormer's trade secrets, but jazz -- as a music and not just as a vague concept thought to involve chaos and unscripted soloing -- is rich with business implications. In short, I believe there may well be a Jazzcode for business and that if there is, Carl Stormer is uniquely positioned to discern and explain it. Furthermore, the emerging business and technology climate will only amplify the wisdom of his approach.

http://www.carlstormer.com/jazz/

Thursday, June 21, 2007

May 2007 Early Indications

The following is based on the opening talk presented at the Center for
Digital Transformation's spring 2007 research forum.

I.
Roughly 20 years ago, Citibank CEO Walter Wriston said that
"information about money has become almost as important as money
itself." Since that time, complex secondary and tertiary risk markets
have grown into a massive global financial information-processing
mechanism. Stocks and bonds, traded on primary markets, are hedged by
futures, options, and derivatives, as well as a variety of arcane (to
the public) devices such as Enron's famous special purpose entities.
These instruments are nothing more than information about money, and
their growth helps prove the truth and wisdom of Wriston's comment.

Data, what Stan Davis once called "information exhaust" or the
byproduct of traditional business transactions, has become a means of
exchange and a store of value in its own right. Hundreds or even
thousands of business plans are circulating, each promising to
"monetize data." While Google is an obvious poster child for this
trend, there are many other, often less obvious, business models
premised on Wriston's core insight, that information about stuff is
often more valuable and/or profitable than the stuff.

Internet businesses are the first that come to mind. Both Linux and
eBay have captured reputational currency and developed communities
premised on members' skills, trustworthiness, and other attributes.
These attributes are, in the case of eBay, highly codified and make
the business much more than a glorified classified ad section.
Information about retail goods is used by 7-Eleven Japan to drive
new-product hypotheses in much the same way that analytical credit
card operations such as Capital One develop offers in silico. An
astounding 70% of SKUs in a 7-Eleven are new in a given year, and such
innovation in a seemingly constrained market is only possible because
of effective use of data.

Amazon's use of purchase and browsing data remains unsurpassed. I
recently compared a generic public page -- "welcome guest!" -- to my
home page, and at least eighteen different elements were customized
for me. These included both "more of the same" items, continuing a
trend begun with a previous author or recording artist purchase, and
"we thought you might like" recommendations based on the behavior of
other customers
deemed similar to me. Of the eighteen elements of that home page,
each had a valid reason for inclusion and was a plausible purchase.
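
A greatly simplified sketch of the second kind of element -- and only
a sketch, not Amazon's actual algorithm -- is co-purchase counting:
for a given item, tally what else shows up in the baskets that
contain it. The order histories below are invented.

from collections import Counter

# Hypothetical order histories: each set is one customer's purchases.
orders = [
    {"history book", "jazz cd", "travel guide"},
    {"history book", "jazz cd"},
    {"history book", "cookbook"},
]

def also_bought(item, top_n=2):
    """Rank the items most often co-purchased with `item`."""
    counts = Counter()
    for basket in orders:
        if item in basket:
            counts.update(basket - {item})
    return counts.most_common(top_n)

print(also_bought("history book"))   # e.g. [('jazz cd', 2), ('travel guide', 1)]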

Another less visible example of this trend is the Pantone system.
Information about color is almost certainly more profitable than paint
or ink. Pantone has a monopoly on the precise definitions for colors
used in commerce, whether in advertising or branding - Barbie pink and
Gap blue are omnipresent - or in production processes: every brownie
baked for use in Ben & Jerry's ice cream is compared to two Pantone
browns to ensure consistency. Pantone is also global: Gap blue is the
same in Japan as in New Jersey, and on shopping bags, neon signs, and
printed materials. The private company does not disclose revenues,
but it is now branching out into prediction businesses, selling
briefings telling fashion, furniture, and other
companies whether olive green will be popular or not next year.

II.
A second trend crossing business, science, and other fields can
colloquially be called "big data." We are seeing the growth of truly
enormous data stores, which can facilitate both business decisions and
analytic insights for other purposes.

Some examples:

-The Netflix Prize invites members of the machine learning community
to improve the prediction algorithms behind "if you liked X you might
like Y" recommendations. While it is not clear that the performance
benchmark needed to win the $1 million top prize can be reached
incrementally, one major attraction for computer scientists is the size
and richness of Netflix's test data set, the likes of which are scarce
in the public domain: it consists of more than 100 million ratings
from over 480 thousand randomly-chosen, anonymous customers on nearly
18 thousand movie titles.
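
The flavor of the exercise can be captured in a few lines. Entries
are scored by root-mean-square error against held-out ratings, and a
natural first baseline is simply each movie's average rating. The
Python sketch below uses toy data and makes no claim about how the
contest leaders actually work.

import math
from collections import defaultdict

# Toy ratings: (customer_id, movie_id, stars). The real set has ~100M rows.
train = [(1, "A", 4), (2, "A", 5), (1, "B", 2), (3, "B", 3), (2, "C", 4)]
test  = [(3, "A", 5), (2, "B", 2)]

# Baseline predictor: each movie's mean training rating.
totals = defaultdict(lambda: [0.0, 0])
for _, movie, stars in train:
    totals[movie][0] += stars
    totals[movie][1] += 1
movie_mean = {m: s / n for m, (s, n) in totals.items()}

def rmse(pairs):
    return math.sqrt(sum((p - a) ** 2 for p, a in pairs) / len(pairs))

predictions = [(movie_mean[m], stars) for _, m, stars in test]
print(round(rmse(predictions), 3))   # 0.5 on this toy data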

-Earlier this month a new effort, the Encyclopedia of Life, was
launched to provide an online catalog of every species on earth. In
the past several years, meanwhile, geneticist Craig Venter sailed around
the world on a boat equipped with gene sequencing gear. The wealth of
the results is staggering: at least six million new genes were
discovered.

-The data available on a Bloomberg terminal allows complex inquiries
across asset classes, financial markets, and time to be completed
instantaneously. Before this tool, imagine answering a simple
question using spreadsheets, paper records, multiple currencies, and
optimization: "What basket of six currencies - three short and three
long - delivered the best performance over the past two years?"
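
With the data in hand, the question itself is a small brute-force
search: enumerate every combination of three long and three short
currencies and score each basket. The sketch below uses invented
return figures and a deliberately naive scoring rule; the point is
that the hard part is the data, not the computation.

from itertools import combinations

# Hypothetical two-year returns versus the dollar (invented numbers).
returns = {"EUR": 0.12, "JPY": -0.04, "GBP": 0.09, "CHF": 0.05,
           "AUD": 0.15, "CAD": 0.07, "SEK": 0.02, "NZD": 0.11}

def basket_return(longs, shorts):
    # Long positions earn the return; short positions earn its negative.
    return sum(returns[c] for c in longs) - sum(returns[c] for c in shorts)

best = max(
    ((longs, shorts) for longs in combinations(returns, 3)
                     for shorts in combinations(returns, 3)
                     if not set(longs) & set(shorts)),
    key=lambda pair: basket_return(*pair),
)
print(best, round(basket_return(*best), 2))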

-The Church of Jesus Christ of Latter-day Saints has gathered genealogical records
into an online repository. The International Genealogical Index
database contains approximately 600 million names of deceased
individuals, while the addendum to the International Genealogical
Index contains an additional 125 million names. Access is free to the
public.

In the presence of such significant data sets, various academic
disciplines are debating how their fields should progress. Quantitative vs.
qualitative methods continue to stir spirited discussion in fields
ranging from sociology to computer science. The continuing relevance
of such essays as C.P. Snow's The Two Cultures and David Hollinger's
"The Knower and the Artificer" testify to the divide between competing
visions of inquiry and indeed truth.

A fascinating question, courtesy of my colleague Steve Sawyer,
concerns the nature of errors in data-rich versus data-poor
disciplines. Some contend that data-rich disciplines tend to be wary
of type I errors (false positives) and thus miss many opportunities by
committing false negatives (type II) that are less visible. Data-poor
communities, meanwhile, may be unduly wedded to theories given that
evidence is sparse and relatively static: in contrast to Venter's
marine discoveries, historians are unlikely to get much new evidence
of either Roman politics or Thomas Jefferson's.

III.
Given that data is clearly valuable, bad guys are finding ways to get
and use it. Privacy is becoming a concern that is both widely shared
and variously defined. Indeed, our commentator Lawrence Baxter, who
used to be a law professor at Duke, noted that defining what privacy
is has proven to be effectively impossible. What can be defined are
the violations, which leads to a problematic state of affairs for both
law and policy.

Data breaches are growing both in number and in size: in the past year
and a half, there have been roughly 50 episodes that involved loss of
more than 100,000 records. The mechanisms for loss range from lost
backup tapes (that were not encrypted) to human error (government
officials opening or publishing databases containing personally
identifiable information) to unauthorized network access. In the
latter category, retailer TJX lost over 45 million credit- and
debit-card numbers, with the thieves, thought to be connected to
Russian organized crime, gaining access through an improperly
configured wireless network at a Marshalls store in Minnesota. Bad
policies, architecture, and procedures compounded the network problem,
to the point where TJX cannot decrypt the files created by the hackers
inside the TJX headquarters transactional system.

Part of data's attractiveness is its scale. If an intruder wanted to
steal paper records of 26 million names, as were lost by the Veterans
Administration last year after a single laptop was stolen, he or she
would need time, energy, and a big truck: counting filing cabinets,
the records would weigh an estimated 11,000 pounds. A USB drive
holding 120 gigabytes of data, meanwhile, can be as small as a 3" x 5"
card and a half-inch thick.

Redefining risk management in a data economy is proving to be
difficult, in part because IT workers have been slow to lead the way
in both seeing the value of data and treating it accordingly. To take
one notable example, the Boston Globe printed green-bar records
containing personal data relating to 240,000 subscribers, then
recycled the office paper by using it to wrap Sunday Globes for
distribution. Not surprisingly, an arms race is emerging between bad
guys, with tools such as phishing generators and network sniffers, and
the good guys, who all too often secure the barn after the horse has
run away.

IV.
Who will be the winners? That is, what companies, agencies, or
associations will use data most effectively? Acxiom, Amazon, American
Express, and your college alumni office might come to mind, but it is
so early in the game that a lot can happen. Some criteria for a
potential winner, and there will of course be many, might include the
following:

-Who is trusted?
-Who has the best algorithms?
-Who has, or can create, the cleanest data?
-Who stands closest to real transactions?
-Who controls the chain of custody?
-Who can scale?
-Who has the clearest value proposition?
-Who understands the noise in a given system?
-Who can exploit network externalities?

Whoever emerges at the front of the pack, the next few years are sure
to be a wild ride.

Monday, April 09, 2007

Early Indications March-April 2007: Can Lightning Strike Twice?

The Apple iPhone announcement has created an extraordinary degree of
market speculation and interest. Can Steve Jobs, who in many respects IS
Apple, duplicate the success of the iPod in a new but adjacent market?
In the end, no matter how many commentators expound at whatever length,
nobody really knows what will happen. Taking two extreme scenarios as
endpoints on a continuum of potential outcomes, here are some arguments
for why events might break one way or the other.

If the iPhone fails to duplicate the success of the iPod, it will likely
be because:

-The price point is too high: $499 (which AT&T/Cingular is not allowed
to discount) yields an estimated 50% margin, according to the market
research firm iSuppli. That profitability, however, could allow Apple
to cut the price once the early adopters have paid a premium and
economies of scale drop the cost of the inputs. Even so, knock-offs
will enter the market faster than they did after the iPod launch.

-The form factor proves cumbersome. Other tablet-like devices with rich
visual interfaces have failed to translate well as handsets put up to
the ear: the Blackberry is great from the thumbs' point of view but less
attractive to mouth and ears. Looking at global markets, many young SMS
users can text blindfolded, but the iPhone's smooth screen offers no
tactile feedback for that kind of typing.

-The Swiss Army knife factor: all-in-one devices reduce footprint, but
few chefs rely on a red pocket knife to slice cheese, bone meat, or dice
carrots. And who's ever used the saw for anything? The point here is
that the iPhone's range of capabilities might make it marginally
acceptable at everything yet less appealing for the specialized tasks
that a Blackberry, conventional handset, or iPod will perform better.

-Functionality is poor. If battery life, overall durability, voice
quality, or data security falls short, word of mouth can turn negative in
a hurry. Recall that some iPods had issues with screens that scratched
easily, Sony has had significant battery problems, and smartphones such
as the Sidekick fared poorly at voice transmission. Given the device's
complexity (involving accelerometers and proximity sensors) and
expansive screen real estate in a demanding context (purses and
pockets), keeping large numbers of iPhones in real-world service could
be challenging.

-Apple has to rely on partners. The retail channels for the iPod are
countless. iTunes runs on both Mac and PC operating systems. Most
significantly, the major music labels signed over access to most of
their catalog. This time around, Apple will rise or fall with AT&T
Wireless, which will have far more to do with the experience of iPhone
ownership than any iPod partner did. AT&T, Cingular, and BellSouth, at
the time of their merger earlier this year, employed over 300,000 people
among them. If the wireless business is only a fifth of that headcount,
that's still a 50,000+ person business with which Apple needs to
coordinate technology, customer service, marketing message, and
performance incentives.

-Apple is launching into a mature market with powerful incumbents, high
capital intensity, and well-defined roles. In contrast to the music
market in 2002, which was characterized by tumbling share prices at the
labels, distribution of the capital base (recording studios and pressing
plants) to millions of PC owners, and few mega-selling titles, the
wireless industry has consolidated to a small number of global network
operators, equipment manufacturers, and handset firms. The barrier to
entry remains high, and although Wimax could alter the landscape
eventually, the sort of bottom-up revolution represented by the original
Napster, or even Skype on the wireline side of telecom, is unlikely to
affect mobile providers. The scale of wireless is almost unfathomable:
Apple has sold about 100 million iPods in five years, which is a huge
number in the PC industry. In Q4 of 2006, Nokia all by itself sold 102
million handsets.

-The demographics fail to align: the people most comfortable with
constant connectivity and multifunction devices -- 15-to-30-year-olds --
may be the least able to afford the devices.

-Apple moves outside its comfort zone. Audio and video integration has
been a hallmark of the Mac environment since day zero, and music has
been part of that integration. The Newton failed for many reasons that
are still likely painful for Apple executives to recall, but the iPhone
runs similar risks as the company enters markets only marginally
connected to its original business. Consumer electronics in the early
part of this decade was ripe for innovation -- it still boggles the
imagination that Sony missed the music-player market -- but the complex
world of telecom and handset manufacturers is yet another significant
step away from PC pricing, ecosystems, customer expectations, product
lifecycles, externalities, supply chains, etc.

If the iPhone follows the iPod as a success, it will be because:

-Apple invents a new category of device, learning from previous
failures. In the music player case, Apple integrated a far superior
music management software application with the MP3 player, and
implemented copy protection to satisfy the labels that their 99-cent
songs would not be copied indefinitely. In the iPhone case, Apple took a
variety of lessons from Motorola's ROKR music player+phone, which has
had its iTunes license pulled. Rather than being a follow-up to the
groundbreaking RAZR, the ROKR is essentially the answer to a trivia
question. The iPhone is a bet of a different order entirely.

-The user interface transcends anything else in the category, not with
bells-and-whistles complexity but with intuitive simplicity. Compare the
owner's manual of even a mid-price mobile handset to the iPod to get a
feel for what the iPhone user experience should be. The human finger
replaces the stylus that doomed everything from Apple's Newton to Palm's
Pilot, the device is aware of itself in space (much like the
surprisingly successful Wii), and the display's colors and lighting are
rich and vivid.

-Apple has once again used superior industrial design, elevated to the
level of art, to create unsurpassed "cool" factor in a category. The
microscopic attention to coherence and detail in the iPod, from
marketing to packaging to peripherals to product endorsers, creates an
emotional appeal found in few electronic devices.

-As venture capitalist John Doerr recently noted, Apple has a vast army
of users trained to synch their device with a computer. It's an
installed base of user behavior that could give the iPhone a jump-start
in adoption.

-The iPhone captures momentum amidst industry disruption. As mobile
broadband emerges from competing standards and platforms, the iPhone
could dominate a multi-radio niche just at the moment that heterogeneous
coverage becomes a reality. Going abroad? Working in a Starbucks
hotspot? Surfing on a train in the northeast corridor? Experimenting
with Clearwire or Sprint Wimax? Having a unified device to maintain
connectivity across access technologies could become extremely valuable.

-The demographics align: the price points, through whatever mechanism,
drive adoption in both the niche knowledge-worker and
technology-as-jewelry segments, along with 20-somethings who replaced
landline phones with mobiles and may augment PCs with a
tablet-phone-music player.

-The beautiful device is powered by a "killer application." In the case
of the iPod it was clearly iTunes (not necessarily the music store), and
with the iPhone, it could be visual voice mail, being able to browse and
manage voice messages from the screen rather than having to listen to
them serially.

-The iPhone, rather than being a phone, is treated by enterprise IT
shops as a Unix terminal, with the caliber of security that implies. For
remote or salesfloor (think of boutique menswear or automobile
dealerships) sales forces needing both a catalog and a conversation
piece, or executives carrying valuable information on hard drives, or
mobile professionals needing secure communications and storage in a
variety of contexts, the iPhone could turn out to be highly relevant.
Furthermore, given how many people are likely to want (crave) the
device, there will be at least a few IT organizations that delight their
business clients with these secure, robust, engaging remote connection
devices.

However the iPhone plays out, I can't remember a product launch that
generated so much attention. (Microsoft's Vista launch, by contrast,
stirred only a faint echo of the fever generated by Windows 95's entry
into the market.) Industry analysts, Mac fans, gadget fiends, and future-scanners
are all watching closely to see if we're present at the creation of a
new industry with new possibilities for communications and lifestyle, or
if the audacity of the claims can't match the complexity and rigors of
real-world supply chains, sales channels, and use scenarios. Apple is
sticking with the June launch date, so we won't have to wait long to
find out.

Thursday, March 01, 2007

Early Indications February 2007: The Evolving Enterprise Application

Until recently, enterprise software came in one of two basic shapes. If the firm built an application from scratch, the process was frequently long and expensive. Considerable effort was required for technologists to understand and translate business requirements into code. Once that code was tested and debugged, a particular application reflected the time and place of its creation. Accordingly, each stratum of the heterogeneous software environment often required maintenance from the very people who initially built it, because it was often difficult to translate their knowledge of the application into documentation. Maintenance processes thus varied widely across the portfolio, contributing to complexity in the environment that exacted tolls in time, money, and performance.

Buying an application from an outside vendor solved many of these problems. Microsoft, Oracle, and their kin had thousands of installed programs from which to learn, and used that wide experience to build fixes and upgrades for the installed base. Even so, installing and maintaining enterprise packages such as SAP, Manugistics, or even Access could be expensive and difficult: different companies, functions, and, often, individual users require customization. Training remains a challenge even after web-based interfaces replaced many packages' idiosyncratic, proprietary screens and commands. Cost was also a concern, as software license expenses were compounded by hardware upgrade requirements, consulting fees, and upgrades over the life of the application.

In both instances, scaling up could be difficult but scaling down -- after a change in corporate direction, spinoff, or shift in the company's market -- was often impossible: 500 software seats, with their attendant human and hardware infrastructure, didn't translate into 250 seats at anything like half the cost.

Six recent developments may portend some major shifts in the enterprise application market. All of them introduce deep changes to prevailing funding, development, and support models, and all will be worth tracking.

*Software as a service, vertical flavor

On February 27, Salesforce.com announced a 25,000-seat deal with Merrill Lynch, whose financial advisors will use the tool to help manage clients' portfolios. Salesforce had already established a foothold at such sector accounts as Suntrust and Aon, but the Merrill Lynch deal is a clear breakthrough into the top tier of the industry, at scale.

*Software as a service, horizontal flavor

Google Apps Premier Edition launched February 22. Like Salesforce, Google rolled out blue-chip adopters: GE and Procter & Gamble, in addition to many universities and smaller businesses already enrolled. For $50 per user per year, enterprises get

-10GB of storage, about 100 times the typical corporate e-mail box
-99.9% uptime guarantee (which still translates to over 8 hours a year of downtime; see the quick arithmetic after this list)
-24x7 help desk support
-APIs for single sign-on, data migration, and other integration tasks
-Gmail for Blackberry
-Google Docs and Spreadsheets, which allow multiple employees to work on the same document at the same time but which are not yet fully interoperable with Microsoft Office files
-Administrative access to ensure compliance with corporate policies (who sees which calendars, for example, or what attachments are and are not permitted)
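
The downtime arithmetic behind the uptime guarantee above is simple enough
to show directly; a short sketch in Python:

  HOURS_PER_YEAR = 365 * 24                       # 8,760
  allowed_downtime = HOURS_PER_YEAR * (1 - 0.999)
  print(f"About {allowed_downtime:.2f} hours of downtime per year")  # ~8.76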

*Corporate mashups

FedEx CIO Robert Carter recently told an audience at Wharton a little about a highly secretive pilot project the company is running. High-value packages, particularly biotech-related shipments such as bone marrow, are tagged with active radio sensors that transmit the parcel's location to the company over public wi-fi networks. Rather than building or licensing GIS software -- an expensive proposition for a pilot project -- FedEx used the readily available Google Earth APIs. The resulting application aims to blend mapping data, video from the trucks, and package status (presumably including temperature, shock and motion records, and tampering indicators) to create a new generation of service for shippers with particularly sensitive items in transit.
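
FedEx's actual implementation is, as Carter noted, under wraps, so treat the
following purely as an illustration of the mashup pattern -- pushing sensor
readings into Google Earth as standard KML rather than building GIS software.
It is a minimal Python sketch; the parcel IDs, coordinates, and temperature
readings are hypothetical.

  # Emit a KML file that Google Earth can open or overlay; all fields invented.
  import xml.etree.ElementTree as ET

  readings = [  # e.g., relayed by the parcel's radio sensor over public wi-fi
      {"parcel": "BIO-4411", "lat": 35.1495, "lon": -90.0490, "temp_c": 4.1},
      {"parcel": "BIO-4411", "lat": 36.1627, "lon": -86.7816, "temp_c": 4.3},
  ]

  kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
  doc = ET.SubElement(kml, "Document")
  for r in readings:
      placemark = ET.SubElement(doc, "Placemark")
      ET.SubElement(placemark, "name").text = r["parcel"]
      ET.SubElement(placemark, "description").text = f"Temperature: {r['temp_c']} C"
      point = ET.SubElement(placemark, "Point")
      # KML expects longitude,latitude,altitude
      ET.SubElement(point, "coordinates").text = f"{r['lon']},{r['lat']},0"

  ET.ElementTree(kml).write("shipments.kml", xml_declaration=True, encoding="UTF-8")

The point is less the dozen lines of XML plumbing than the division of labor:
the expensive mapping and visualization stay in Google's freely available
client, while the shipper supplies only the data.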

*RSS inside the firewall

Serendipity Technologies, an Israeli startup I saw at the DEMO conference, encapsulates enterprise application data into RSS feeds. It's not hard to envision an end user with the WorkLight product assembling her own desktop: in-house data such as available-to-promise inventory, customer order history, and current pricing could join Mapquest, Weather Channel, Yahoo Finance, and newswire feeds to get a sales rep ready to do a day's calls. I've asked around informally, and something like 40% of many enterprises' application portfolios consists of reporting tools. Making the relevant feeds available as services for business users to assemble as needed would seem to be a way around the long-cycle, expensive, rigid development process for non-transactional applications.
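
I have no visibility into the vendor's actual product, so treat the
following as a minimal sketch of the general pattern -- wrapping rows from
an internal system in standard RSS 2.0 so that any feed reader or portal
page can consume them. The inventory rows, field names, and internal URL
are invented for illustration.

  # Wrap hypothetical available-to-promise rows in an RSS 2.0 feed.
  import xml.etree.ElementTree as ET
  from email.utils import formatdate

  atp_rows = [  # e.g., the result of an ERP query
      {"sku": "A-100", "warehouse": "Reno", "available": 1240},
      {"sku": "B-220", "warehouse": "Memphis", "available": 85},
  ]

  rss = ET.Element("rss", version="2.0")
  channel = ET.SubElement(rss, "channel")
  ET.SubElement(channel, "title").text = "Available-to-promise inventory"
  ET.SubElement(channel, "link").text = "http://erp.example.internal/atp"
  ET.SubElement(channel, "description").text = "ATP positions for the sales team"

  for row in atp_rows:
      item = ET.SubElement(channel, "item")
      ET.SubElement(item, "title").text = (
          f"{row['sku']} at {row['warehouse']}: {row['available']} units")
      ET.SubElement(item, "pubDate").text = formatdate()  # RFC 822 date, per RSS

  ET.ElementTree(rss).write("atp.xml", xml_declaration=True, encoding="UTF-8")

A sales rep's start page could then pull this feed alongside Mapquest and
the newswires -- precisely the self-assembled desktop described above.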

*Appliances

Building on its integration of analytic functionality on a customized Intel-based device to create the "business intelligence accelerator," SAP will be launching an enterprise search appliance sometime this year. Other vendors have had similar success in bundling software onto special-purpose hardware to speed deployment, lower cost of ownership, and improve performance relative to a general-purpose platform. Familiar examples include antispam (Ironport), data warehousing (Teradata), and network security (Symantec).

*Show, don't tell

The requirements definition phase for an enterprise software project can be a long, frustrating exercise as businesspeople and technologists struggle to define what's possible, what's cost-effective, and what's necessary. There's been a long-running debate as to how well people in IT shops can learn the ins and outs of business process compared to how well users can learn to use yet-to-be-invented lightweight "development" tools (Excel macros are the standard existence proof) to build the applications they need. Neither argument has prevailed.

I recently encountered a company that's straddling the line with an intuitively appealing tool that presents an alternative to translating interviews, focus groups, and other forms of task analysis into a text document often hundreds of pages long. By their very size and complexity, these documents themselves inject delays and ambiguity into the processes of both package deployment and custom development. Instead, teams at such companies as Wachovia, Agilent, and Dow Jones are using tools from a company called iRise to see realistic simulations (not prototypes) of the desired functionality, relationships, and usability. Both requirement definition and testing phases are said to be accelerated, developers can cut rework, and time to deployment has decreased in real-world projects.


What traits do these six software forms share? To one degree or another, all six include an aggressive integration story. This posture is a refreshing change from previous generations of enterprise apps that were built for greenfield deployment yet purchased by established companies with already-complex environments. Whether it's Google's industry-standard APIs, SAP's Netweaver investment, Salesforce's Apex platform, or Serendipity's creative use of AJAX and RSS, there appears to be a general intent to plug things together rather than build siloed functionality (data warehousing appliances may be an exception, however).

In contrast to many traditional applications, software as a service, enterprise RSS feeds, and mashups should, in concept, scale both up and down. If a mashup takes a week or two to build, and then requirements change, is there even a need to call its retirement "sunsetting"? The notion of on-demand functionality has held appeal for several years for precisely this reason, and the combination of a "faucet" on usage, along with throw-away integration, should reduce impediments to change. To this end, one prominent CIO recently told me that services architectures won't save him any money, but not having them would definitely get expensive: in his view, paying the price, in the form of data discipline, to obtain flexibility was well worthwhile.

Finally, payback on initial investment should come quickly for these kinds of software, designed as they are for quick deployment and reasonably simple integration. Google Apps, for example, charges less than half what a typical e-mail host does, for far more functionality. FedEx's Carter said that licensing his mapping functionality would potentially have been too expensive, particularly for a speculative effort. This tightening of the connection between cost and value should, in turn, make the CIO's charges to business unit customers easier to justify. Being able to dial service levels, volumes, and functionality up or down has the potential to improve the chargeback process: instead of presenting business units with lump sums of fixed costs proportionately divided, the CIO can move in the direction of menu pricing for some portion of his or her cost base.

Changing the current model will not happen fast. The life span of application portfolios is frequently measured in decades. Existing cost structures cannot be exited overnight. Appliances introduce their own particular type of lock-in. Many CIOs and other administrators hesitate to move enterprise data off site to a Google or Salesforce. Customer service and support remain question marks, particularly for mashups, which may present users and help desks with unpredictable performance characteristics: is the problem on the desktop, with Comcast's network or the Starbucks wi-fi hotspot, or in some faraway server?

But for all of these impediments, the industry is headed toward a blend of customization and standardization, on-site and off-site, lightweight and industrial-strength. The proof is easy to find: the same industry leaders who thrived under the old model are moving to embrace various forms of the new one, whether it's Intel teaming with SAP to build appliances, Dell supplying special-function servers to Ironport, Microsoft's Live suite of offerings, or IBM's support for software as a service. As work, organization, and markets change dramatically, it's fitting that the business toolkit do so as well.

Thursday, January 18, 2007

Early Indications January 2007: A Different Kind of Prediction Letter

"No man is an island, entire of itself; every man is a piece of the continent, a part of the main."

Before we can talk about systems, we need to start smaller. A standalone technological artifact can be called a device, which is technically "a machine used to perform one or more relatively simple tasks." Electronic calculators and electric drills are common examples, illustrating that devices are generally more complex than simple tools, such as a hammer or pencil.

A set of devices sharing communications rules and facilities can combine into a platform, in which the devices gain in utility by being able to work together. A hard drive, display, processor, and peripherals like cameras and printers constitute the personal computer platform. Like a device, some platforms are used by individuals, but others (such as computer-controlled machine tools) constitute commercial infrastructure.

When platforms are connected by communications standards and technologies, they become systems. An airline network is a system in that runways, pilot training, technically precise language (as used in the cockpit and in communications with control towers), refuelling inventories, jetbridge heights, catering cart dimensions, and countless other aspects are coordinated in order that passengers can get where they want to go at minimal cost and a high degree of safety. Perhaps the most salient fact of systems is their complexity, which often scales nonlinearly with attributes such as size, population, and so forth.

Because systems are big, expensive, and complex, they must be governed in special ways. Private ownership of the railroads created a backlash that still curtails the power of airlines. The power that AT&T derived from owning the U.S. telephone system, literally down to the kitchen handset, eventually led to its being broken apart by regulators. Because systems often appear to be natural monopolies, they tend to be either government-owned or highly regulated. Even so, systems often provide economic opportunity on a very large scale: Motorola could not sell over 50 million RAZR wireless handsets at a high premium without existing networks for the devices to connect to.

Systems typically support and depend on other systems. An airline relies on banks for working capital, on hotel chains for traveler (and crew) accommodations, and on government for certifications of various sorts. If the government doesn't provide a certificate of airworthiness for a new plane, no bank will lend the money and an outside insurer won't underwrite the liability. A personal computer without a telecommunications system and an electric power grid is essentially useless. At a deeper level, a keyboard without a language and syntax of expressions, both grammatical and mechanical, is quickly reduced in value. These dependencies among systems, and the resulting complexities, drive our 2007 predictions.

Recall that the invention of a gasoline-powered automobile took over a half-century to generate related systems of gas stations, factories and factory jobs, financing companies, limited-access highways, suburbs, enthusiast and consumer rating magazines, race tracks, drive-in restaurants and bank windows, safety regulators and regulations, environmental impacts, insurance companies, vacation destinations, and dozens of other artifacts that marked the United States of the 1950s and afterward. Not long ago one out of every seven U.S. jobs was connected to the automobile. Systems of systems - in this case, mechanical, financial, governmental, legal, and economic - take a long time to develop but, once running, exhibit "flywheel effects" in that they possess considerable momentum and continue to spin for a long time even after the source of driving energy is removed. Systems also resist change.

The so-called ICT (information and communications technologies) sector has long been powerful, but in the recent past it has begun exhibiting the characteristics of a system of systems. A key development in this transition is the rise of third-party funding, often through advertising. Just as television broadcasts are "free," so too are more and more technology benefits funded through online advertising revenue. The Internet of 1997 was a system, no doubt, a "network of networks," but we now have human, economic, technical, and political systems all riding on the same technology roadbed. Brides and grooms, job-hunters, election-seekers, terror networks, stock-market investors and manipulators, stamp collectors, wildlife biologists, role-players, and nearly every other identifiable population now has a significant digitally connected component. In addition, so-called Web 2.0 business methods and models are using those human networks as economic engines.

This preamble brings us to the 2007 predictions. It's getting difficult to find economic actors that don't need to be actively involved in technologically-constituted systems. The Internet flourished without substantial government regulation until a large number of constituencies sought redress and/or protection, most in the past several years. Online gambling, voice over IP services, "net neutrality," alternative broadband providers, targeted advertising, sexual material, maliciously edited public profiles on Wikipedia and elsewhere, unauthorized use of copyrighted digital content - the list goes on and on, but in each instance we see a collision between systems based on old and new models of regulation, remuneration, protection, privacy, and so forth. At base, we are having to redefine some of the core systems that make the world work: money, contracts, civil rights and civic responsibilities, identity, possession, and others.

This year, I believe that several of these collisions will reach new heights of unexpectedness, expense, and impact. Some candidates follow:

-Last year an agency of the U.S. government lost 26 million names with personally identifiable information on a single laptop. Monetary losses from online fraud, extortion, and related exercises were not reported, but are estimated to range in the hundreds of millions of dollars worldwide. This year, look for a still grander failure of data protection, either in one highly visible episode or a cumulative increase.

-YouTube and related content distribution mechanisms will push the envelope too hard, with a high-profile episode of unauthorized copy distribution prompting legislation, litigation, and potentially business failure.

-Some new activity - whether job referrals, recipe swapping, rotisserie baseball, genealogy, Christian evangelism, or something similarly below radar - will break through using a Google-like monetization model and approach the growth rate we saw for video in 2006.

-Around the time of Gerald Ford's administration, a bumper sticker was distributed as speed limits on federally funded highways were reduced to save on petroleum consumption: "55: It's not just a good idea, it's the law." Science nerds quickly came out with their version: "186,000 miles per second: It's not just a good idea, it's the law." For all their amazing capabilities, communications and computing systems still can't cheat physics. 2007 will see the so-called virtual world continue to encounter the physical environment in important ways. A few examples suggest the breadth of the issue:

*Data centers are beginning to scale up to the size of factories and even foundries in their energy consumption.
*Intel continues to struggle with the heat output of microprocessors, apparently more than AMD does.
*The long-term effects of electromagnetic radiation (especially introduced to the brain through a hole in the protective skull) remain unknown.
*The digital bread crumbs left behind by automobiles, cell phones, PCs, credit cards, and identification technologies (toll passes, fingerprint readers, browser cookies) continue to accumulate on a massive scale. The potential for abuse grows apace.

-A different facet of the energy and transportations systems relates to automobiles. While Chevrolet just introduced a good-looking electric car, the Volt, that anticipates developments in battery technology, Tesla Motors will ship over 200 Roadsters at $100,000 apiece that out-accelerate a Porsche 911 and achieve the equivalent of 135 miles per gallon fuel efficiency. Many thousands of similar electric cars could be deployed on the current power grid once Tesla and other firms approach mass-market pricing driven by economies of scale. In contrast, ethanol technology is energy-inefficient (but getting better), and there is no infrastructure in place to store or move ethanol at the required scale: because alcohol conducts electricity while gasoline is a dielectric, for example, fuel pumps need to be redesigned. More critically, alcohol at 85% concentration attacks the metals and gaskets used in current fuel pipelines and tankers. E85 also delivers inferior gas mileage compared to gasoline. In short, innovation -- at the device level -- to improve electric cars produces a much lighter load on the current supporting systems than would a mass adoption of ethanol. How politics and markets react to rising oil prices, from a systems-of-systems perspective, will determine quite a bit about the shape of the next 10-20 years.

-After 9/11 and 7/7, Katrina and the other hurricanes of 2005, the Indian Ocean tsunami, SARS, the 2003 power grid failure, and the Long Beach port strike, 2006 did not see a major disruption to the world's transportation and communication systems. Such good fortune cannot last indefinitely, yet readiness for the unexpected remains lower than it could be, especially after the many upheavals that followed the Y2K non-event. Some observers see video peer-to-peer networks, such as Joost from the founders of Skype, as potentially crippling the current Internet, for example. And try telling someone who came down with norovirus in the past month that it's a less serious threat than avian flu.

-Paradoxically, even as people and devices grow more connected, with access to more information, the need for intermediaries evolves rather than disappears. In the 1960s, most Americans could watch one of three network news shows and subscribe to one of two or possibly three daily newspapers. Retail selection was vast, compared to the 1930s or 40s, but still constrained to what merchandisers chose to present. Now, consumers can shop across vast inventories of food, fashion, entertainment, and other products (the so-called "long tail"), theoretically making each person his or her own merchandiser, video producer, or editor. Just as the late 1990s was the heyday of disintermediation predictions before brokerage firms, real estate agents, car dealers, and other middlemen responded, so now we will see how newspapers, magazines, book publishers, movie studios, universities, and other content businesses respond to the increase in direct access to many categories of content. The fact that the American Film Institute noted in its "2004 Moments of Significance" that "one of the best sources of news today is [Jon Stewart's] faux news show" would have been inconceivable in the 1970s: Chevy Chase more authoritative than John Chancellor? Expect more, not fewer, of such puzzling inversions going forward.

In short, 2007 will continue the trend toward more complex types of change as information and communications technologies intertwine the world in more ways. When John Donne wrote the famous words in the headnote nearly 400 years ago, he anticipated a system of connection that remains incomprehensible in its scale, its impact, and its possibilities.

Early Indications December 2006: How'd We Do?

[distributed 12/13/06]

In January of this year, we published eight predictions. At this
point, the score is six hits, an incomplete, and a slight miss.
Overall, it was a pretty good year for technology and innovation, with
a variety of new wireless technologies getting closer to market, a
breakout year for web video, and an ambitious set of web services from
Amazon (including everything
from artificial intelligence to warehouse shelf space).

On the misstep front, the litany of privacy breaches grew longer and
louder: a stolen laptop would have cost the U.S. Veterans
Administration a projected $160 million had it not been recovered.
In the midst of absorbing losses in the hundreds of millions of
dollars because of defective batteries, Sony launched an expensive,
powerful gaming platform that it couldn't supply in holiday
quantities. The "netroots" bloggers made a lot of noise but in the end
did not get their U.S. Senate candidate elected.

1) The second half of the year will be stronger than the first half in
the PC sector

Score: Hit
Even though Microsoft's Vista operating system has yet to generate any
PC sales, because consumers can't yet buy it and enterprises will have
to certify their existing applications on the new platform, this
prediction did in fact come to pass: at HP, the Personal Systems
group saw revenues for the three months ending October 31 rise 13%
over the quarter ending July 31. At Dell, only preliminary numbers
for the quarter ending November 3 have been reported, so it's
difficult to say with certainty what's going on.

2) "Services" will become the corporate IT buzzword outside IT

Score: Hit
SAP recently announced that its core positioning will focus on what it
calls Enterprise SOA or ESA, with half of R&D spending committed to
creating up to 30,000 SOA-driven business processes. Accenture is
spending an announced $450 million on SOA, HP $500 million, and IBM a
total of $1 billion on SOA over the coming years. Oracle, BEA, Sun,
and most software vendors apart from Microsoft, which is not branding
services so aggressively, are joining the gold rush.

3) Google will launch a breakthrough business outside web advertising

Score: Hit
Counting only their distribution deal with BSkyB, the YouTube
acquisition, the launch of the core of an online office suite (Docs
and Spreadsheets), and the challenge to PayPal implied by Google
Checkout, it was a big year for Google. Going forward, the company is
joining Amazon, Yahoo, and other search firms in building enormous
data centers to support further expansion of so-called "cloud
computing." George Gilder's recent article on these data centers in
Wired, apart from its purple prose, is required reading, particularly
for its take on the electricity consumption issues:
http://www.wired.com/wired/archive/14.10/cloudware_pr.html

4) HDTV will have collateral effects

Score: Hit
According to market researchers DisplaySearch, HDTV displays broke
through and accounted for over half of the North American TV market in
2006. Price drops continue, even in the face of such strong demand,
in part because the big producers see additional production capacity
coming on line in the near future. Verizon's bet on fiber to the
home, meanwhile, may be concerning the cable TV operators, whose
industry lab was reported this summer to have questioned the wisdom of
long-term investment in a coaxial infrastructure with strict, and low,
limits on HD traffic.

5) The relentless reinvention of business markets by the Internet and
digitization will continue

Score: Hit
"Who might be next? Television is my best guess." With YouTube and
BitTorrent both getting content distribution deals from major players,
and with cellular continuing its push toward video, the motion picture
and other video incumbents are confronting a dramatically different
landscape. A big story here was ESPN pulling its cellular phone
service after only about six months.

6) The quiet march of robot progress will continue

Score: Incomplete
The march was so quiet I couldn't hear it. There was no high-profile
story on par with 2005's DARPA challenge, which will be re-run in 2007
in simulated urban traffic, rather than last year's open albeit
obstacle-strewn desert environment.

7) Sensors and other location-awareness technologies will make the
news for an unexpected consequence

Score: Miss
RFID in the supply chain is finally providing suppliers (as opposed to
retailers like
WalMart) with a compelling cost justification: promotion
effectiveness. Think about a consumer products manufacturer (a
fictional example would be a battery company before Halloween): if I
deploy an expensive, time-sensitive end cap or other display, I want
to know if a chain's thousands of stores are in fact displaying my
promotional material and inventory. If they aren't on the sales
floor, customers could be confronted with stockouts and/or I must take
returns of seasonally-specific merchandise, such as Halloween
packaging in November. Apart from that realization, the toll bridges,
automated thermostats, automobile black boxes, and their kin
apparently worked well enough not to draw notice. RFID-equipped
passports are concerning industry observers who see how easy it is to
read them from a distance, and a company that sells tags for human
implantation (with readers being given away to emergency departments)
is raising fears, but neither was a major story.

8) The developing world will once again make headlines for innovation
and not just cheaper
production costs

Score: Hit
A Nobel Prize for microfinancing in Bangladesh, Brazil's leadership in
ethanol, and Korea's
launching a mobile WiMax service years ahead of the U.S. or Europe all
seem to count here. It's noteworthy that two of these three examples
involve organizational rather than primarily industrial innovation.

Also in January, we discussed six macro trends, and several of them
certainly made an impact:

Climate change: Just this week, a preprint of an article in
Geophysical Research Letters suggested that the Arctic could be open
water by 2040, maybe sooner.

Avian flu: Nothing on the epidemic front so far

Unstable energy prices: Big news here, especially mid-summer

The end of the bi-polar world: Russia, Venezuela, the Middle East,
North Korea and Darfur certainly proved that regional instability can
reach far and wide.

Decreased faith in government and authority: The U.S. midterm
elections, a wide lack of confidence in the United Nations, and the
privatization of large-scale humanitarian efforts by everyone from the
Gates Foundation to Rick Warren suggest this trend is continuing.

Increased evidence of class conflict: "Conflict" may be the wrong
word, because there was less overt class-related rhetoric this year
than during Katrina or the Paris riots. But the word "separation"
still applies, with significant implications for the middle class in
industrialized nations: in 2005, the average CEO of a U.S. company
with more than $1 billion in revenues made 262 times what the average
worker was paid, the second-highest multiple on record. The average
income of the top 126 hedge fund managers this year, according to
Barron's, was $363 million, up 45% over the previous year. At Goldman
Sachs, meanwhile, average compensation across all 26,000 employees,
from administrative assistants to top management, was $622,000 in
2006.

On that note, happy holidays. Watch for the 2007 predictions in January.