If you’re an author with a knack for conjuring up nightmare scenarios,
these are in fact the best of times: George R. R. Martin (Game of
Thrones) and Suzanne Collins (The Hunger Games) are absolutely rolling
in money and fame. There’s something going on when
bleak futures capture national mindshare in books, TV shows, and movies.
Look at 1963: with far fewer options, mass audiences converged on
distraction. Shows such as Petticoat Junction, Candid Camera, and My
Favorite Martian made the top ten, all lagging The
Beverly Hillbillies and Bonanza. In 1968, with riots in the streets and
the assassinations of Robert Kennedy and Martin Luther King calling the
American ideal into question, Bonanza hung in at #3, behind Gomer Pyle,
U.S.M.C. and Rowan and Martin’s Laugh-In.
That Stanley Kubrick was able to confront the madness of nuclear war
with the brilliant black comedy of Dr. Strangelove (1964) helps prove
the point: dystopias have historically been uncommon cultural
touchstones; now they’re everywhere.
Could it be that these cultural artifacts capture our zeitgeist? Whether
in the Dark Knight Batman films, The Wire, Breaking Bad, or The
Sopranos, our most popular entertainments of the past 15 years present a
pretty bleak vision, diametrically opposed to New Deals, New
Frontiers, or “Don’t Stop Thinking About Tomorrow” campaign
songs. Along the dystopian line of thinking, it’s easy to find evidence
that the world is heading in a very bad direction. A gloomy sample:
*Ocean levels are rising faster than predicted, but the local effects in
New York, Miami, the Netherlands, and Bangladesh will all vary
considerably. Millions of people will be displaced; where will they go?
Norfolk Naval Base will lose acres of land; how fast, nobody knows.
*What appears to be the largest distributed denial-of-service
(DDoS) attack to date was mounted earlier this month using at least
150,000 compromised cameras and other poorly secured Internet of Things
(IoT) devices. It’s quite possible our cars, garage-door
openers, thermostats, and personal devices can be turned against us.
*Guns kill a lot of people in the U.S. Exactly how many is hard to
determine, in part because the gun lobby discourages public health
officials from calculating statistics. But whether it is suicides
(20,000 a year, or 2/3 of all gun deaths), mass shootings,
police violence against citizens, or the average of 82 shootings per
week in Chicago alone, the numbers are depressing but apparently
acceptable, given the lack of action. One statistic provides food for
thought. Despite its wide error range, a Harvard study
released earlier this month estimated (a key word) that 7.7 million
people (3% of U.S. adults) own half the country’s guns. These
“super-owners” collect between 8 and 140 firearms apiece.
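A quick back-of-the-envelope check puts those numbers in perspective. The sketch below assumes a total U.S. civilian stock of roughly 265 million firearms, a figure associated with the same 2016 survey; treat it as an assumption, not a finding of this piece:

```python
# Rough sanity check of the "super-owner" statistic above.
# Assumption: ~265 million civilian firearms in the U.S. in total
# (a figure widely associated with the same 2016 survey).
total_guns = 265_000_000
super_owners = 7_700_000       # ~3% of U.S. adults, per the study

# If super-owners hold half the national stock, the implied average is:
avg_per_super_owner = (total_guns / 2) / super_owners
print(f"{avg_per_super_owner:.1f}")  # → 17.2
```

An average of about 17 guns apiece sits squarely inside the reported 8-to-140 range, so the headline figures are at least internally consistent.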
*Globally, millions of people are being lifted out of poverty, but in
the U.S., tens of millions of middle-class people find their fortunes
stagnant or, increasingly, declining. Whether from plant closures, downsizing,
inadequate skills, offshoring, or automation’s
various effects, people can’t get ahead the way previous generations
did. For many complex reasons, class conflict is showing itself in
various ways: racial tensions, protests in places like San Francisco
where homelessness and extreme wealth collide, and
anti-trade sentiment. Immigration and refugees are super-sensitive
issues from Turkey to British Columbia.
*At the same time that Colorado and Washington state are finding
benefits in legal marijuana, recreational drugs are killing people. In
addition to the violence in Chicago noted above, some of which is
drug-related, the toll of opioid drugs is shocking. Especially
when heroin is cut with fentanyl, overdoses are swamping local EMS and
other responders. Columbus saw 27 in 24 hours, while Cincinnati had to
cope with 174 heroin overdoses in 6 days. Huntington, WV had calls for
27 overdoses in under four hours last month.
Prescription OxyContin was likely a tragic gateway drug in many of these
cases. “Just say no” and a “war” on drugs clearly didn’t work; what’s
next?
*On the ethical drug front, meanwhile, we live in scary times.
Antibiotic-resistant “superbugs” are making hospitalization in any
country a frightening proposition. In 2013, an estimated 58,000 babies
died of antibiotic-resistant infections in India alone, and in a global
age of travel, those bacterial strains are moving elsewhere. An
estimated 23,000 people died in the US last year from
antibiotic-resistant infections, and just this past May, the CDC
reported that a Pennsylvania woman who had not recently traveled out of
the
country tested positive for a strain of E. coli carrying the mcr-1
gene. This variant resists colistin, widely regarded as the “last
resort” antibiotic,
though the woman in question _did_ respond to other treatments. Still,
the CDC’s language is sobering: “The CDC and federal
partners have been hunting for this gene in the U.S. ever since its
emergence in China in 2015. . . . The presence of the mcr-1 gene,
however, and its ability to share its colistin resistance with other
bacteria such as CRE raise the risk that pan-resistant
bacteria could develop.”
None of these problems have easy answers; some don’t even have hugely
difficult answers. Zeroing in on the technology-related world (thus
leaving aside climate change, gun violence, and drug issues for the
moment), I see four nasty paradoxes that, taken together,
might explain some of how we arrived at a juncture where dystopian
fantasies might resonate.
1) Automation brings leisure and productivity; robotics threatens to
make many job categories obsolete. From radiologists to truck drivers to
equity analysts, jobs in every sector are threatened with
extinction. The task of making sure technologies of artificial
muscle and cognition have widely rather than narrowly shared benefits
runs counter to many management truisms regarding shareholder value,
return on investment, and optimization.
2) We live in a time of unprecedented connection as most adults on the
planet have access to a phone and will soon get smartphones;
interpersonal dynamics are often characterized by savagery (at worst) or
distractedness. (Google “Palmer Luckey” for a case in
point.) Inside families and similar relationships, meanwhile, the
psychologist Sherry Turkle argues persuasively that we are failing each
other, and especially our kids, when we interact too much with screens
and too little with flesh-and-blood humans.
3) The World Wide Web brought vast stores of the world’s cultural and
scientific knowledge to masses of people; a frightening amount of public
debate is now “post-factual,” with conspiracy theories and plain old
ignorance gaining large audiences. Climate science,
GMO crops, and vaccinations are but three examples. The assumptions
behind the Web have too often failed: access to knowledge by itself
cannot counter fads (hello Justin Bieber), longstanding ignorance, or
intolerance. Compare the traffic to YouTube or Facebook
with that to the Library of Congress, Internet Archive, or even
Wikipedia. At some level, maybe people don’t like eating their
intellectual vegetables; junk food is too hard to resist.
4) Billions of sensors, smartphones, and cloud computing virtual
machines enable an increasingly real-time world, where information flows
faster and wider every year; historical context is lacking for many
public assertions and private opinions. In September,
a Republican party official claimed there was no racism before 2008. For
years, only a minority of people have been able to identify in which
century the American Revolution or Civil War occurred. Nuanced views of
Reconstruction or the Gilded Age, hugely formative
of and relevant to today, are difficult to find.
Together, these paradoxes add up to a truly dystopian vision at odds
with what seemed inevitable just a few years ago. It’s difficult to be
optimistic, but to close I’ll suggest some reasons why solutions are so
difficult.
*The digital world doesn’t respect traditional organizational
boundaries. Examples abound: Russia is said to be meddling in the US
election cycle. Certainly the superpowers have influenced local
elections in the past, but the thought of major media outlets
and voting machines being compromised by a global adversary calls the
whole notion of sovereignty into question. Whether it’s in regard to
spam, child porn, copyright, compromised hardware at the chipset level,
digital privacy, or the handling of video and
music streaming, the global, borderless nature of the mobile/digital
platforms calls basic facts of jurisdiction, evidence, and recourse into
question.
*At the same time that “where” needs to be redefined, so too must we
confront what work is. Who does what, how much they are paid or
otherwise valued, how they learn the job, what happens when jobs or
entire labor markets disappear — none of our current answers
can be assumed to hold stable 10 years from now. Education, unemployment
and disability benefits, collective bargaining, workplace health and
safety (does sitting really “kill” you?), pensions, internships,
retirement, job-hunting, and corporate education and
training will all assume new shapes. Some of this will be messy; I can’t
see anyone getting it all right the first time.
*Technologies of communication and transportation have usually been a
double-edged sword. Trade brings benefits to many parties, but smallpox,
influenza, and the AIDS virus all crossed oceans on new modes of
transport. Given the essentially free, multimedia,
borderless nature of digital communications, what equivalent maladies
will be given broad distribution, and what will be their consequences?
*In a pluralistic world, what can serve as a moral compass for an
individual, a group, a nation, a continent? The teachings of Muhammad,
Jesus, Yahweh, Confucius, and the Buddha all have served to guide people
over the centuries, but so too have they justified
crimes against humanity. We live in a connected world where religious
conflict becomes more likely than in eras with less physical mobility.
Given global communications and mobility, how is coexistence possible,
given increases in both fundamentalism and secularism
in many places, and the ongoing tendency for the major religions to
splinter internally, often violently? In a post-factual world, people
assert their own beliefs as truths, but without sufficiently binding notions
of a common identity, purpose, or ideology, we
are left less with states of free-thinkers than with new sources of
conflict — and fewer resources for building group identity.
To be sure, there are many hopeful signals, and plenty of today’s
entertainment is mindless diversion not unlike the television hits of
the 1960s. That dystopias can find audiences may be more a function of
the multitude of distribution options than of national
mood. In any event, I do believe the challenges we confront will test
moral resolve, institutional flexibility, and intellectual creativity
unlike ever before. It may be that meta-questions are in order: rather
than asking how we solve internet security or
rising ocean levels, we (a tricky word all on its own) need to ask: what
are the political forms, grounds of legitimacy, and resources of the
institutions we will design to address these new challenges?