Friday, September 30, 2011

September 2011 Early Indications: The Innovation Moment?

What follows is a selection from the opening to a book I'm completing. Recent tech-related news -- Apple's dominance, Amazon's alternative axis of competition, tablet and other woes at HP and Research in Motion, social media and social change all over the world, Anonymous, and sustained global un- and underemployment -- seems to reinforce the hypothesis that the rules of the game are in transition.

Thoughts and reactions are welcome, though they might not make it into the final product.
**********

Given the changes of the past 40 years—the personal computer, the Internet, GPS, cell phones, and smartphones—it’s not hyperbole to speak of a technological revolution. This book explores the consequences of that revolution, particularly but not exclusively for business. The overriding argument is straightforward:

1) Computing and communications technologies change how people view and understand the world, and how they relate to each other.

2) Not only the Internet but also such technologies as search, GPS, MP3 file compression, and general-purpose computing create substantial value for their users, often at low or zero cost. Online price comparison engines are an obvious example.

3) Even though they create enormous value for their users, those technologies do not create large numbers of jobs in Western economies. At a time when manufacturing is receding in importance, information industries are not yet filling the employment gap as economic theory would predict.

4) Reconciling these three traits will require major innovations going forward. New kinds of warfare and crime will demand changes to law and behavior, the entire notion of privacy is in need of reinvention, and getting computers to generate millions of jobs may be the most pressing task of all. The toolkit of current technologies is an extremely rich resource for that work.

Cognition

Let’s take a step back. Every major technological innovation of the past 300-plus years has augmented humanity’s mastery of the physical world. Steam, electricity, internal combustion engines, and jet propulsion provided power. Industrial chemistry provided new fertilizers, dyes, and medicines. Steel, plastics, and other materials could be formed into skyscrapers, household and industrial goods, and clothing. Mass production, line-and-staff organization, the limited liability corporation, and self-service were among the many managerial innovations that enhanced companies’ ability to organize resources and bring offerings to market.

The current revolution is different. Computing and communications augment not our muscles but our brains and our sociability: rather than expanding control over the physical world, the Internet and the smartphone can combine to make people better informed and cognitively enhanced, if not wiser. Text messaging, Twitter, LinkedIn, and Facebook allow us to maintain both "strong" and "weak" social ties—each of which matters, albeit in different ways—in new forms and at new scales. Like every technology, these tools are value-neutral in themselves, yet they also have a dark side: they can be used to exercise forms of control such as bullying, stalking, surveillance, and behavioral tracking. After about 30 years—the IBM PC launched in 1981—this revolution is still too new to assess well, and it is different enough from its predecessors that comparisons are only minimally useful.

For a brief moment let us consider the "information" piece of "information technology," the trigger to that cognitive enhancement. Claude Shannon, the little-known patron saint of the information age, conceived of information mathematically; his fundamental insights gave rise to developments ranging from digital circuit design to the blackjack method popularized in the movie 21. Shannon made key discoveries, of obvious importance to cryptography but also to telephone engineering, concerning the mathematical relationships between signals and noise. He also disconnected information as it would be understood in the computer age from human uses of it: meaning was "irrelevant to the engineering problem." This tension between information as engineers see it and information that people generate and absorb is one of the defining dynamics of the era. It is expressed in the Facebook privacy debate, Google’s treatment of copyrighted texts, and even hedge funds that mine Twitter data and invest accordingly. Equally important, however, these technologies allow groups to form that can collectively create meaning; the editorial backstory behind every Wikipedia entry, collected with as much rigor as the entry itself, stands as an unprecedented history of meaning-making.
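To make that signal-and-noise relationship concrete, one illustration (not drawn from Shannon's remark above, but his best-known result on the subject) is the channel-capacity theorem:

C = B log₂(1 + S/N)

Here C is the maximum rate at which information can be transmitted reliably over a channel, B is the channel's bandwidth, and S/N is the ratio of signal power to noise power. Meaning appears nowhere in the formula, which is precisely the point: the engineering problem is indifferent to what the bits say.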

The information revolution has several important side effects. First, it stresses a nation’s education system: unlike 20th-century factory work, many information-driven jobs require higher skills than large segments of the work force can demonstrate. Finland’s leadership positions in education and in high technology are related. Second, the benefits of information flow disproportionately to people who are in a position to understand it. As the economist Tyler Cowen points out, "a lot of the internet’s biggest benefits are distributed in proportion to our cognitive abilities to exploit them." This observation holds at both the individual and the collective level. Hence India, with a strong technical university system, has been able to capitalize on the past 20 years in ways that its neighbor Pakistan has not.

Innovation

Much more tangibly, this revolution is different in another regard: it has yet to generate very many jobs, particularly in first-world markets. Put another way, it is becoming clear that there is no free lunch. The Internet has created substantial value for consumers: free music, both illegal and now legal. Free news and other information such as weather. Free search engines. Price transparency. Self-service travel reservations and check-in, stock trades, and driver’s license renewals. But the massive consumer surplus created by the Internet comes at a cost: jobs, shareholder dividends, and tax revenues formerly generated by the winners in less efficient markets.

In contrast to the broad economic ecosystem created by the automobile—repair shops, drive-in and drive-through restaurants, road-builders, parking lots, dealerships, parts suppliers, and final assembly plants—the headcount at the core of the information industry is strikingly small and doesn’t extend out very far. Apple, the world’s most valuable company by market capitalization in 2011, employs roughly 50,000 people, more than half of whom work in its retail operation. Compare Apple’s roughly 25,000 non-retail employees to industrial-era headcounts at IBM, General Motors, and General Electric, each of which topped 400,000 at one time or another. In addition, the jobs that are created fall within a very narrow window of technical and managerial skill. Contrast the hiring at Microsoft or Facebook with the automobile industry, which in addition to the best and the brightest could also give jobs to semi-skilled laborers, tollbooth collectors, used-car salesmen, and low-level managers. That reality of small workforces (along with outsourcing deals and offshore contract manufacturing), high skill requirements, and the frequent need for extensive education may become another legacy of the information age.

In the past 50 years, computers have become ubiquitous in American businesses, and in many others around the world. They have contributed to gains in efficiency and productivity through a wide variety of mechanisms, whether self-service websites, ATMs, and gas pumps; improved decision-making supported by data analysis and planning software; or robotics on assembly lines. The challenge now is to move beyond optimization of known processes. In order to generate new jobs—most of the old ones aren’t coming back—the economy needs to use computing and communications resources to do new things: treat suffering and disease with new approaches, teach with new pedagogy, and create new forms of value. Rather than optimization, in short, the technology revolution demands breakthroughs in innovation, which as we will see is concerned with more than just patents.

There are, of course, winners in the business arena. But in the long run, the companies that can operate at a sufficiently high level of innovation and efficiency to win in brutally transparent and/or low-margin markets are a minority: Amazon, Apple, Caterpillar, eBay, Facebook, and Google are familiar names on a reasonably short list. Even Dell, HP, Microsoft, and Yahoo, leaders just a few years ago, are struggling to regain their competitive swagger. Other former leaders have tumbled from the top rank: Merrill Lynch was bought; GM and Chrysler each declared bankruptcy. Arthur Andersen, Lehman Brothers, and Nortel are gone entirely. How could decline happen so quickly?

Consider Dell, which achieved industry leadership in the 1990s by optimizing inventory control, demand creation, and the matching of the two. The 2000s have treated the company less well. Apple, which like Dell boasts extremely high levels of supply chain performance, has separated itself from the PC industry through relentless innovation. Seeing Apple pull away with the stunning success of the iPhone, Google in turn mobilized the Android smartphone platform through a different, but similarly effective, series of technical and organizational innovations. In contrast to Apple and Google, optimizers like Dell are suffering, and unsuccessful innovators, Nokia among them, are making desperate attempts to compete. Successful innovation is no longer a matter of building a better mousetrap, however: the biggest winners are the companies that can innovate at the level of systems, or platforms. Amazon's repeated innovations, many of which came as stunning surprises, reflect a profound understanding of this truth. At the same time, Microsoft's efforts to shift from the PC platform onto something new have met with mixed success: the Xbox has done well in a small market, while the results of the Nokia mobile bet will obviously be a top story for the coming year.

Given our place in the history of technology, it appears that structural changes to work and economic life are under way. To set some context, consider how mechanization changed American agriculture after 1900. Fewer people were needed to till the land, leading to larger farms and the migration of displaced laborers to cities. Manufacturing replaced agriculture at the core of the economy. Beginning around 1960, computers helped optimize manufacturing. Coincident with the rise of enterprise and then personal computing, services replaced manufacturing as the main employer and value generator in the U.S. economy. In short, innovation could be to information what mechanization was to agriculture: the agent of its marginalization and the gateway to a new economic era.

How information technology relates to this shift from manufacturing to services and, potentially, to a new wave of innovation is still not well understood; to take one example, as Michael Mandel argued in Bloomberg Businessweek, a shortfall of innovation helps explain the misplaced optimism that contributed to the financial crises of recent years. But rather than merely incanting that "innovation is good," I believe that each phase of economic history runs up against structural limits, and that computers’ propensity for optimization may be encountering one such limit. It takes people to innovate, however, and identifying both the need and the capabilities and resources necessary for them to do so may be a partial path out of the structural economic stagnation in which we find ourselves.