Sunday, November 14, 2010

Review Essay: Kevin Kelly, What Technology Wants

In 35 years of reading seriously and often professionally, I have never read a book like What Technology Wants. I dog-eared at least 30 pages and filled several margins with reactions. Over two long plane rides, I was by turns absorbed, consternated, and counter-punching. I think What Technology Wants gets the story wrong, but it lays out a bold, original, and challenging position with a complex array of evidence, analysis, and conviction. The core hypothesis is untestable, however, and enough counterexamples can be summoned that substantial uncertainty undermines Kelly's deterministic argument.

Make no mistake, optimism is the operative motif. As Kelly notes, when sages or prophets foretold the future in ages past, the outlook was usually bad. The very notion of progress, by contrast, is itself a relatively modern invention. As we will see, Kelly's book is best understood as part of a larger conversation, one that has found particularly fertile ground in America.

What exactly is the technology that "wants" things? From the outset, Kelly finesses a sweepingly broad definition:

"I've somewhat reluctantly coined a word to designate the greater, global, massively interconnected system of technology vibrating around us. I call it the _technium_. The technium extends beyond shiny hardware to include culture, art, social institutions, and intellectual creations of all types. . . . And most important, it includes the generative impulses of our inventions to encourage more tool making, more technology invention, and more self-enhancing connections." (11-12)

Several of the book's key themes become apparent early. Most centrally, technology is read as, if not alive ("vibrating" with "impulses"), then something very close to alive: connections between technology and biology, moving in both directions, are drawn throughout the book. For example, "if I can demonstrate that there is an internally generated direction within natural evolution, then my argument that the technium extends this direction is easier to see." (119)

The second, and more regrettable, tendency of the book is to argue along multiple slippery slopes. In the initial definition, for example, the technium includes everything from churches (both buildings and people) to cloned sheep to George Foreman grills to the Internet. If it includes so much, what is the technium _not_? I believe that understanding "social institutions and intellectual creations of all types" and their role in the technology artifacts that more commonly concern us -- things like end-of-life treatment protocols, ever-nastier methods of warfare, or high levels of carbon dioxide output -- requires a sharper knife.

The aforementioned slippery slope argumentative technique may have been most brilliantly parodied in the student court trial scene in Animal House:
***
But you can't hold a whole fraternity responsible for the behavior of a few sick, perverted individuals. If you do, shouldn't we blame the whole fraternity system?

And if the whole fraternity system is guilty, then isn't this an indictment of our educational institutions in general?

I put it to you, Greg. Isn't this an indictment of our entire American society?

Well, you can do what you want to us, but we won't sit here, and listen to you badmouth the United States of America!
***
Several sections of What Technology Wants raised red flags that suggest similarly deft rhetoric may be in play elsewhere in the book. In an argument structurally very similar to the Animal House logic, for example, the technium is given almost literally biological properties: "Because the technium is an outgrowth of the human mind, it is also an outgrowth of life, and by extension it is also an outgrowth of the physical and chemical self-organization that first led to life." (15) If, like me, one does not grant him this chain of logic linking single-celled life forms to Ferraris or credit default swaps, Kelly's argument loses some of its momentum: for him, the quasi-sentient life force that is the sum of humanity's efforts to create is ultimately life-enhancing rather than destructive or even indifferent.

Nowhere is this faith more clearly stated than in the book's conclusion. "[The technium] contains more goodness than anything else we know," Kelly asserts. Given that the technium is everything that people have ever made or written down, what is the alternative that could be "more good"? Pure nature? But the technium is awfully close to nature too: "the technium's wants are those of life." In fact, like Soylent Green, the technium is (at least partially) people: "It will take the whole technium, and that includes us, to discover the tools that are needed to surprise the world." (359)

But the fact of the matter is that much of the technium is built to kill, not to want life: the role of warfare in the advancement of technology dates back millennia. From swords and plowshares, to Eli Whitney's concept of interchangeable parts in musket-making, to nuclear weapons, people and governments have long used technical innovation to subdue each other. Even Kelly's (and my) beloved Internet can trace its origins directly to the game theoretics of John von Neumann and mutual assured destruction. Statecraft shapes technology, sometimes decisively, yet this influence is buried in Kelly's avalanche of technological determinism.

To be sure, some of Kelly's optimism has convincing grounding; it's his teleology I question. In What Technology Wants, the strongest sections combine clever data-gathering and analysis to express the power of compounding innovation: particularly where they can get smaller, things rapidly become cheaper and more powerful at a rate never before witnessed. Microprocessors and DNA tools (both sequencing and synthesis) are essential technologies for the 21st century, with Moore's law-like trajectories of cost and performance. In addition, because software allows human creativity to express and replicate itself, the computer age can advance very rapidly indeed. The key question, however, relates less to technological progress than to our relation to that progress.

In my discussions with Kelly back when we were affiliated with the same think tank in the 1990s, he had already identified the Amish as a powerful resource for thinking about the adoption of technology. Chapter 11, on Amish hackers, raises the issue of selective rejection to a level of depth and nuance that I have seen nowhere else. Four principles govern the Amish, who are often surprising in their technology choices, as anyone who has seen their skilled and productive carpenters (with their pneumatic nail guns carried in the back of pickup trucks) can attest.

1) They are selective, ignoring far more than they adopt.

2) They evaluate new things by experience, in controlled trial scenarios.

3) Their criteria for evaluation are clear: enhance family and community while maintaining distance from the non-Amish world.

4) The choices are not individual but communal. (225-6)

Remarkably, Amish populations are growing (fast), unlike the Shakers of New England who attempted similar removal from the world but could not sustain their existence either individually or collectively. Instead, the Amish often become expert in the use of a technology while eschewing its ownership. They are clever hackers, admirable for their ability to fix things that many non-Amish would simply throw away. At the same time, there are no Amish doctors, and girls have precisely one career trajectory: motherhood or a close equivalent thereof. As Kelly notes, the people who staff and supply grocery stores or doctor's offices, participate in a cash economy, and pay taxes for roads and other infrastructure enable their retreat. In the end, the Amish stance cannot scale to the rest of us, in part because of their radical withdrawal from the world of television, cell phones, and automobiles, and because of the sect's cohesive religious ethos.

Speaking of governments and economies, the role of money and markets is also remarkably limited for Kelly. Technologies evolve through invention and innovation. Those processes occur within a lattice of investors, marketers, sales reps, and other businesspeople who have different motivations for getting technologies into people's hands or lives. Not all of these motives support the wants of life, as Bhopal, cigarette marketing, and Love Canal would attest.

The capitalist underpinnings beneath so much western technology are ignored, as in this summary passage: "Like personality, technology is shaped by a triad of forces. The primary driver is preordained development -- what technology wants. The second driver is the influence of technology history, the gravity of the past . . . . The third force is society's collective free will in shaping the technium, or our choices." (181)

Profit motives, lock-in/lock-out, and the psychology of wants and needs (along with business's attempts to engage it) are all on the sideline. Furthermore, a "collective free will" feels problematic: what exactly does that mean? Market forces? I don't think that reading is in play here. Rather than economics, Kelly seems most closely aligned with biology, to an extreme degree at some points: "The most helpful metaphor for understanding technology may be to consider humans as the parents of our technological children." (257)

But understanding ourselves as "parents" doesn't help solve real technological problems: how do we address billions of discarded plastic beverage bottles (many fouling the oceans), or the real costs of long-term adoption of the internal combustion engine, or the systems of food and crop subsidies and regulations that shape diet in an age of simultaneous starvation and obesity? How does the technium want goodness in any of those scenarios? Maybe the polity and the increasingly vibrant non-profit sector are part of the technology superstructure, seeing as they are human inventions, but if that's the case, Kelly's definition is so broad as to lose usefulness: the book gives little idea of what lies outside the technium. If money and markets (and kings and congresses, as well as missiles and machine guns) are coequal with cathedrals and computers, getting leverage on questions of how humans use, and are used by, our technologies becomes more difficult.

With all of its strengths and shortcomings, Kelly has written a book at once unique and rooted in a deep tradition: for well over a century Americans in particular have simultaneously worried and effused over their machines. The distinguished historian of technology Thomas P. Hughes noted in 1989 that the 1960s had given many technologies a bad name, so that cheerleaders had become scarce even as technology was infusing itself into the conceptual and indeed existential groundwater: "Today technological enthusiasm, although much muted as compared with the 1920s, survives among engineers, managers, system builders, and others with vested interests in technological systems. The systems spawned by that enthusiasm, however, have acquired a momentum -- almost a life -- of their own." (American Genesis, 12) The technology-is-alive meme is a familiar one, and a whole other study could position Kelly in that tradition as well.

For our purposes, it is sufficient to note that Kelly stands as a descendant of such enthusiasts as Edison, Ford, Frederick W. Taylor, Vannevar Bush, and, perhaps most directly, Lewis Mumford, now most famous as an urban theorist. Like Kelly, Mumford simultaneously delighted in the wonders of his age while also seeing causes for concern. Note how closely his 1934 book Technics and Civilization anticipates Kelly, excepting the fact that Mumford predated the computer:

"When I use the word machines I shall refer to specific objects like the printing press or the power loom. When I use the term 'the machine' I shall employ it as a shorthand reference to the entire technological complex. This will embrace the knowledge and skills and arts derived from industry or implicated in the new technics, and will include various forms of tool, instrument, apparatus and utility as well as machines proper." (12)

One man's technium is another man's machine. For all their similarity of definition, however, Mumford kept human agency at the center of his ethos, compared to Kelly's talk of inevitability and other semi-biological tendencies of the technium super-system: "No matter how completely technics relies upon the objective procedures of the sciences, it does not form an independent system, like the universe: it exists as an element in human culture and it promises well or ill as the social groups that exploit it promise well or ill." (6) Mumford focuses on the tool-builder; Kelly gives primacy to the cumulative (and, he asserts, mostly beneficent) sum of their tool-building. In the end, however, that technium is a mass of human devices, institutions, and creations so sprawling that it loses conceptual usefulness since no human artifacts are excluded.

The critical difference between the two perspectives becomes clear as Mumford resists the same determinism in which Kelly revels: "In order to reconquer the machine and subdue it to human purposes, one must first understand it and assimilate it. So far, we have embraced the machine without fully understanding it, or, like the weaker romantics, we have rejected the machine without first seeing how much of it we could intelligently assimilate." (6) Mumford's goal -- consciously understanding and assimilating technologies within a cultivated human culture -- sounds remarkably like the Amish notion of selective rejection that Kelly admires yet ultimately rejects as impractical at scale.

It is a tribute to Kevin Kelly that he forced me to think so hard about these issues. What Technology Wants deserves to be widely read and discussed, albeit with red pencils close at hand; it is a book to savor, to consider, to challenge, and to debate. The book is not linear by any stretch of the imagination, and strong chapters (such as on deep progress and on the Amish) sit alongside weaker discussions of technology-as-biology and an arbitrary grocery list of the technium's attributes that could have been organized more coherently.

Those shortcomings help define the book: by tackling a hard, messy topic, Kelly was bound to have tough patches of tentative prose, partially unsatisfying logic, and conclusions that will not be universally accepted. For having the intellectual courage to do so, I tip my hat. Meanwhile I look for a latter-day Lewis Mumford to restore human agency to the center of the argument while at the same time recognizing that governments, markets, and above all people interact with our technologies in a contingent, dynamic interplay that is anything but deterministic.

Tuesday, November 02, 2010

Early Indications October 2010: The Analytics Moment: Getting numbers to tell stories

Thanks in part to vigorous efforts by vendors (led by IBM) to bring
the idea to a wider public, analytics is coming closer to the
mainstream. Whether in ESPN ads for fantasy football, or
election-night slicing and dicing of vote and poll data, or the
ever-broadening influence of quantitative models for stock trading and
portfolio development, numbers-driven decisions are no longer the
exclusive province of people with hard-core quantitative skills.

Not surprisingly, the term resists precise definition. At the
simple end of the spectrum, one Australian firm asserts that
"Analytics is basically using existing business data or statistics to
make informed decisions." At the other end of a broad continuum,
TechTarget distinguishes, not completely convincingly, between data
mining and data analytics:

"Data analytics (DA) is the science of examining raw data with the
purpose of drawing conclusions about that information. Data analytics
is used in many industries to allow companies and organization to make
better business decisions and in the sciences to verify or disprove
existing models or theories. Data analytics is distinguished from data
mining by the scope, purpose and focus of the analysis. Data miners
sort through huge data sets using sophisticated software to identify
undiscovered patterns and establish hidden relationships."

To avoid a terminological quagmire, let us merely assert that
analytics uses statistical and other methods of processing to tease
out business insights and decision cues from masses of data.
In order to see the reach of these concepts and methods, consider a
few examples drawn at random:

-The "flash crash" of May 2010 focused attention on the many forms and
roles of algorithmic trading of equities. While firm numbers on the
practice are difficult to find, it is telling that the regulated New
York Stock Exchange has fallen from executing 80% of trades in its
listed stocks to only 26% in 2010, according to Bloomberg. The
majority occur in other trading venues, many of them essentially
"lights-out" data centers; high-frequency trading firms, employing a
tiny percentage of the people associated with the stock markets,
generate 60% of daily U.S. trading volume of roughly 10 billion
shares.

-In part because of the broad influence of Michael Lewis's bestselling
book Moneyball, quantitative analysis has moved from its formerly
geeky niche at the periphery to become a central facet of many sports.
MIT holds an annual conference on sports analytics that draws both
sell-out crowds and A-list speakers. Statistics-driven fantasy sports
continue to rise in popularity all over the world as soccer, cricket,
and rugby join the more familiar U.S. staples of football and
baseball.

-Social network analysis, a lightly practiced subspecialty of
sociology only two decades ago, has surged in popularity within the
intelligence, marketing, and technology industries. Physics, biology,
economics, and other disciplines all are contributing to the rapid
growth of knowledge in this domain. Facebook, Al Qaeda, and countless
startups all require new ways of understanding cell phone, GPS, and
friend/kin-related traffic.

Why now?

Perhaps as interesting as the range of its application are the many
converging reasons for the rise of interest in analytics. Here are
ten, drawn from perhaps a multitude of others.

1) Total quality management and six-sigma programs trained a
generation of production managers to value rigorous application of
data. That six-sigma has been misapplied and misinterpreted there can
be little doubt, but the successes derived from a data-driven approach
to decisions are, I believe, informing today's wider interest in
statistically sophisticated forms of analysis within the enterprise.

2) Quantitative finance applied ideas from operations research,
physics, biology, supply chain management, and elsewhere to problems
of money and markets. In a bit of turnabout, many data-intensive
techniques, such as portfolio theory, are now migrating out of formal
finance into day-to-day management.

3) As Eric Schmidt said in August, we now create in two days as much
information as humanity did from the beginning of recorded history
until 2003. That's measuring in bits, obviously, and as such Google's
estimate is skewed by the rise of high-resolution video, but the
overall point is valid: people and organizations can create data far
faster than any human being or process can assemble, digest, or act on
it. Cell phones, seen as both sensor and communications platforms,
are a major contributor, as are enterprise systems and image
generation. More of the world is instrumented, in increasingly
standardized ways, than ever before: Facebook status updates, GPS,
ZigBee and other "Internet of things" efforts, and barcodes and RFID
on more and more items merely begin a list.

4) Even as we as a species generate more data points than ever before,
Moore's law and its corollaries (such as Kryder's law of hard disks)
are creating a computational fabric which enables that data to be
processed more cost-effectively than ever before. That processing, of
course, creates still more data, compounding the glut.

5) After the reengineering/ERP push, the Internet boom, and the
largely failed effort to make services-oriented architectures a
business development theme, vendors are putting major weight behind
analytics. It sells services, hardware, and software; it can be used
in every vertical segment; it applies to every size of business; and
it connects to other macro-level phenomena: smart grids, carbon
footprints, healthcare cost containment, e-government, marketing
efficiency, lean manufacturing, and so on. In short, many vendors
have good reasons to emphasize analytics in their go-to-market
efforts. Investments reinforce the commitment: SAP's purchase of
Business Objects was its biggest acquisition ever, while IBM, Oracle,
Microsoft, and Google have also spent billions buying capability in
this area.

6) Despite all the money spent on ERP, on data warehousing, and on
"real-time" systems, most managers still cannot fully trust their
data. Multiple spreadsheets document the same phenomena through
different organizational lenses, data quality in enterprise systems
rarely inspires confidence, and timeliness of results can vary widely,
particularly in multinationals. I speak to executives across
industries who have the same lament: for all of our systems and
numbers, we often don't have a firm sense of what's going on in our
company and our markets.

7) Related to this lack of confidence in enterprise data, risk
awareness is on the rise in many sectors. Whether in product
provenance (Mattel), recall management (Toyota, Safeway, or CVS),
exposure to natural disasters (Allstate, Chubb), credit and default
risk (anyone), malpractice (any hospital), counterparty risk (Goldman
Sachs), disaster management, or fraud (Enron, Satyam, Societe
General), events of the past decade have sensitized executives and
managers to the need for rigorous, data-driven monitoring of complex
situations.

8) Data from across domains can be correlated through such ready
identifiers as GPS location, credit reporting, cell phone number, or
even Facebook identity. The "like" button, by itself, serves as a
massive spur to inter-organizational data analysis of consumer
behavior at a scale never before available to sampling-driven
marketing analytics. What happens when a "sample" population includes
100 million individuals?
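
The cross-domain linkage described above can be illustrated with a toy sketch; the records, keys, and phone numbers here are all invented for the example:

```python
# Toy illustration of linking records from two domains through a
# shared identifier -- here, an invented phone number as the join key.
purchases = {"+1-555-0100": ["running shoes"], "+1-555-0101": ["cookbook"]}
locations = {"+1-555-0100": "gym", "+1-555-0102": "library"}

# Build a combined profile keyed on the shared identifier.
linked = {
    phone: {"bought": items, "seen_at": locations.get(phone)}
    for phone, items in purchases.items()
}
print(linked["+1-555-0100"])  # {'bought': ['running shoes'], 'seen_at': 'gym'}
```

The same pattern, scaled up to GPS traces, credit files, and social-network identities, is what turns isolated data sets into the population-scale "samples" discussed above.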

9) Visualization is improving. While the spreadsheet is ubiquitous in
every organization and will remain so, the quality of information
visualization has improved over the past decade. This may result
primarily from the law of large numbers (1% of a boatload is bigger
than 1% of a handful), or it may reflect the growing influence of a
generation of skilled information designers, or it may be that such
tools as Mathematica and Adobe Flex are empowering better number
pictures, but in any event, the increasing quality of both the tools
and the outputs of information visualization reinforce the larger
trend toward sophisticated quantitative analysis.

10) Software as a service puts analytics into the hands of people who
lack the data sets, the computational processing power, and the rich
technical training formerly required for hard-core number-crunching.
Some examples follow.

Successes, many available as SaaS

-Financial charting and modeling continue to migrate down-market:
retail investors can now use Monte Carlo simulations and other tools
well beyond the reach of individuals at the dawn of online investing
in 1995 or thereabouts.
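
The flavor of such a retail-grade Monte Carlo tool can be conveyed in a few lines; this is a minimal sketch, with every parameter invented and annual returns drawn from a normal distribution, which is a simplifying assumption rather than a market model:

```python
import random

def simulate_portfolio(start=500_000, withdrawal=40_000, years=30,
                       mean=0.07, stdev=0.15, trials=10_000, seed=42):
    """Estimate the probability a portfolio survives `years` of withdrawals.

    Each trial draws one annual return per year from a normal
    distribution (a deliberate simplification) and deducts a fixed
    withdrawal; the function returns the fraction of trials in which
    the balance never hit zero.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        balance = start
        for _ in range(years):
            balance = balance * (1 + rng.gauss(mean, stdev)) - withdrawal
            if balance <= 0:
                break
        else:
            survived += 1
    return survived / trials

print(f"Estimated survival probability: {simulate_portfolio():.1%}")
```

Commercial tools layer better return models, taxes, and inflation on top, but the core loop -- thousands of randomized futures summarized as a probability -- is the same.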

-Airline ticket prices at Microsoft's Bing search engine are rated
against a historical database, so purchasers of a particular route and
date are told whether to buy now or wait.

-Wolfram Alpha is taking a search-engine approach to calculated
results: a stock's price/earnings ratio is readily presented on a
historical chart, for example. Scientific calculations are currently
handled more readily than natural-language queries, but the tool's
potential is unbelievable.

-Google Analytics brings marketing tools formerly unavailable anywhere
to the owner of the smallest business: anyone can slice and dice ad-
and revenue-related data from dozens of angles, as long as it relates
to the search engine in some way.

-Fraud detection through automated, quantitative tools holds great
appeal because of both labor savings and rapid payback. Health and
auto insurers, telecom carriers, and financial institutions are
investing heavily in these technologies.

Practical considerations: Why analytics is still hard

For all the tools, all the data, and all the computing power, getting
numbers to tell stories is still difficult. There are a variety of
reasons for the current state of affairs.

First, organizational realities mean that different entities collect
the data for their own purposes, label and format it in often
non-standard ways, and hold it locally, usually in Excel but also in
e-mails, or pdfs, or production systems. Data synchronization efforts
can be among the most difficult of a CIO's tasks, with uncertain
payback. Managers in separate but related silos may ask the same
question using different terminology, or see a cross-functional issue
through only one lens.

Second, skills are not yet adequately distributed. Database
analysts can type SQL queries but usually don't have the managerial
instincts or experience to probe the root cause of a business
phenomenon. Statistical numeracy, often at a high level, remains a
requirement for many analytics efforts; knowing the right tool for a
given data type, or business event, or time scale, takes experience,
even assuming a clean data set. For example, correlation does not
imply causation, as every first-year statistics student knows, yet
temptations to let it do so abound, especially as scenarios outrun
human understanding of ground truths.
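
A toy example makes the trap concrete; the two series below are invented, and each simply grows over time:

```python
# Two series that merely share a time trend correlate almost perfectly
# even though neither causes the other (all values are invented).
years = range(10)
ice_cream_sales = [100 + 12 * t for t in years]   # grows with time
drownings = [20 + 3 * t for t in years]           # also grows with time

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(pearson(ice_cream_sales, drownings), 3))  # 1.0
```

Both series are linear functions of the year, so the correlation is exactly 1.0; a lurking variable (here, time itself) produces a flawless statistical relationship with no causal content at all.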

Third, odd as it sounds in an age of assumed infoglut, getting the
right data can be a challenge. Especially in extended enterprises but
also in extra-functional processes, measures are rarely sufficiently
consistent, sufficiently rich, or sufficiently current to support
robust analytics. Importing data to explain outside factors adds
layers of cost, complexity, and uncertainty: weather, credit, customer
behavior, and other exogenous factors can be critically important to
either long-term success or day-to-day operations, yet representing
these phenomena in a data-driven model can pose substantial
challenges. Finally, many forms of data do not readily plug into the
available processing tools: unstructured data is growing at a rapid
rate, adding to the complexity of analysis.

In short, getting numbers to tell stories requires the ability to ask
the right question of the data, assuming the data is clean and
trustworthy in the first place. This rare skill blends process
knowledge, statistical numeracy, time, narrative facility, and both
rigor and creativity in proper proportion. Not surprisingly, the
managers who possess it are more than technicians, and they are
difficult to find in many workplaces. For what analytics actually
delivers to match its promise, the biggest breakthroughs will likely
come in education and training rather than algorithms or database
technology.