In the many years that I’ve been tracking technological, cultural, and economic change, it’s become quite clear that 2022 is radically different from 1997. It’s not that there was one epochal moment — the dropping of atomic bombs, the US withdrawal from Vietnam, the assassination of a King or a Kennedy — but something happened. This month I want to tease out how the future we were promised in 1997 isn’t the future we got, and why that might be so.
Some salient facts about 1997:
- Cell phones were phones, and texting was not at all common in the US
- Software was sold on physical media, either at consumer “computer stores” or via system integrators large (Andersen Consulting) and small (Microsoft’s certified developer network)
- Computers lived in special rooms on corporate premises and/or on desktops (possibly after having been carried in a briefcase or backpack)
- Computing was big but not huge business: Microsoft’s revenues of $11 billion were on par with those of Weyerhaeuser, outside the Fortune 100
- Human attention was not yet a dominant business: the top-ranked media/entertainment company on the Fortune 100, Disney, ranked 55th
- Information intermediaries, including political parties, news-gatherers, and scientific organizations, were still regarded as credible
Compare that state of affairs to our time:
- Mobile phones are becoming ubiquitous in more and more countries, even Afghanistan (with 65 phones per 100 people five years ago), with smartphones being the dominant product category (77% market share)
- Software is harder to buy on physical media, even when you really want to do so (as with Adobe Creative Suite). Most software updates occur without the user’s knowledge.
- Computers live on our persons and are no longer called computers. US residents are estimated to check them 250 times a day, an average of roughly once every four waking minutes.
- Companies that rely on machine learning and other advanced forms of computing trade in human attention for at least part of their revenue. Amazon (#2 on the Fortune 500), Apple (#4), Google/Alphabet (#11), Microsoft (#21), and Facebook/Meta (#46) are the prime examples.
Rewind
At the dawn of the World Wide Web, access to knowledge was presumed to lead to better decisions. In 1992 Tim Berners-Lee explicitly referred back to Vannevar Bush’s “As We May Think” essay from 1945 that envisioned what was essentially an infinite microfilm reader built into a desk. As early as 1998, however, a Rutgers professor (James Katz) pointed out the two-edged sword of wide access to media: “unlike newspapers and other media outlets, there are often no quality control mechanisms on Web sites that would permit users to know what information is generally recognized fact and what is spurious.” The death of gatekeepers meant the demise of fact-checking, but in the age of Yahoo, AOL, and Geocities, the problem was relatively contained.
The next big wave revolved around Google, in particular its business model. As priorities shifted in light of the 2001 tech recession and profitability became imperative, providing the most accurate search results did not pay the bills: getting people to click on ads did. The slight change in emphasis from technical accuracy and performance to managing human behavior portended the rise of the planet-scale platforms in the 2005-15 decade: Facebook, YouTube, Reddit, Instagram. Note the change in both language and conceptual underpinnings: Tripod was a collection of personal pages on an umbrella website; Facebook is a platform. Geocities-style personalization was unavailable on Facebook; rather, the value was in the links, including to those in your reference group who were living more interesting lives than you were. Thus the big emphasis on behavioral aspects of computing: software companies need to get people to follow, like, subscribe, click, watch, refer, recommend.
It’s worth pointing to the role of the smartphone in this context. As computing goes from something that sits on a desk and maybe connects via a modem to something that lives in your pocket, our interactions with computing evolve. I cannot imagine Facebook getting the degree of user engagement it did had it only been available as a desktop website, nor could YouTube have grown to its current scale had video cameras not become ubiquitous. Doing searches — for restaurants, for directions, for recommendations — in the moment as opposed to at the desktop prior to venturing forth is a completely different process.
At the same time, Amazon’s re-invention of time-share computing under the cloud rubric, along with Salesforce and the other software-as-a-service plays, further de-emphasized traditional software practices. This was observed first in B2B markets, then in prosumer (Adobe Creative Suite), then in consumer software: nobody downloads Google, and many aspects of software functionality now are paid for with ads. It may be almost a full decade since I last installed software off a physical disk. Note that apart from cryptocurrency, which I will address in another newsletter, there hasn’t been a US-based breakout software play since 2009, when Uber and Airbnb came to market. Similarly, between Apple/Samsung’s conquest of the smartphone market and Amazon et al’s conversion of many enterprise systems to off-premise computing, when people say they work in computing circa 2022, they probably don’t do things or buy things the way their progenitors of 15 years ago did.
2015-pandemic
Because this chapter hasn’t really played out yet, I’ll suggest two themes and trust that readers will suggest more that I’m missing. First, machine learning has been in a period of rapid improvement — in performance, in range of application, in scale — since about 2012, as a result of the aforementioned cloud-scale compute farms, of bigger and better training data, and of some decisive advances in algorithms. If you haven’t played with the DALL·E 2 AI image generator (https://openai.com/dall-e-2/), it’s worth a look to see how far the state of the art has come.
The signal “contribution” of machine learning to date, however, is not in images but in hacking human behavior. Whether it’s the McDonald’s video menu board, ads on social media, recommendations for YouTube, or something as simple yet frequently disturbing as Google type-ahead, people are being manipulated throughout the day by companies operating under little oversight and under regulations ill-fitted to the task of preserving human autonomy, privacy, and independence from lock-in. (Note that autonomous passenger vehicles, Elon Musk’s promises and branding to the contrary, are not yet ready for prime time.) How tech gets regulated in the coming years will be key to both human and economic futures.
It may be that I’m operating with the blinders of someone completing a book on online video, but the second big story since 2015 for me is the rise of TikTok. Consider a few facts that mark it as a departure:
- Parent company ByteDance’s core competency is algorithmic control of human attention. TikTok is but one app that operationalizes this capability; it’s fascinating to consider where the parent company might pivot next: mobile telephony, enterprise software (think Slack for countries that might not want to buy from US firms), even banking.
- TikTok was built on a vertical, full-screen visual model: there is no scroll, only one video at a time. As a result, the app can map visual cause to behavioral effect with extreme accuracy. Scrolling apps, meanwhile, have very little idea of what is actually on the screen when a user clicks “like” or clicks away.
- TikTok takes action — starts a video — rather than presenting a user with alternatives.
- The short-video format means that many more behavioral data points are collected per hour of interaction with the app. In any machine-learning scenario, the better the training data, the better the performance.
- TikTok was built as an app and ported to the web only as a secondary consideration. Like Uber, it was built mobile-first, so old assumptions applied to “Internet companies” need to be tested.
- TikTok has stakeholders ranging from VCs (SoftBank, Sequoia) to private equity (General Atlantic, KKR) to the Chinese Communist Party. Its governance, exit strategies, and accountability are unlike those of any other tech firm, and are still very much fluid. To date, no other Chinese firm has established anywhere near this big a consumer presence outside China, so how it moves forward as a company will be important to watch.
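The training-data point above can be made concrete with a toy calculation. The numbers below are illustrative assumptions, not platform data: if every video a viewer finishes or swipes away from yields one labeled behavioral signal, shorter videos mean proportionally more signals per hour of viewing.

```python
# Illustrative back-of-envelope arithmetic (assumed figures, not real data):
# treat each completed or skipped video as one labeled behavioral signal
# (watch time, like, share, or swipe-away).

def signals_per_hour(avg_video_seconds: float) -> float:
    """Rough count of per-video feedback signals in one hour of viewing."""
    return 3600 / avg_video_seconds

short_form = signals_per_hour(30)   # hypothetical 30-second short-video clips
long_form = signals_per_hour(600)   # hypothetical 10-minute videos

print(f"short-form: {short_form:.0f} signals/hour")  # 120
print(f"long-form:  {long_form:.0f} signals/hour")   # 6
print(f"ratio: {short_form / long_form:.0f}x")       # 20x
```

Under these assumed durations, the short-video app gathers an order of magnitude more per-item feedback per viewing hour, which is exactly the kind of dense training signal a recommendation model thrives on.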
What next?
History has a way of, if not repeating, then at least rhyming. The LP record is a more viable format than the compact disc, 40 years after the latter promised “perfect sound — forever.” Moleskine paper planners and diaries are big sellers. Old-school sports and games from bowling to axe-throwing attract hipster audiences. Will we see more analog record-keeping in the days ahead? Even people who burned photo libraries to DVD can have a hard time playing physical media these days.
Similarly, will some people rediscover the desktop, reserving the phone for on-the-go mapping, weather, and other emotionally lightweight uses? Anecdotally, the number of people I’ve heard trying to limit phone presence seems to be growing, and the surge in visits to U.S. wilderness areas suggests that getting unplugged might be part of the appeal. (Note that the most recent Apple iOS has multiple controls for shutting functionality off.) There’s also the matter of an aging user base: no matter the screen resolution, older eyes need bigger print.
Finally, how long and how deep will public dissatisfaction with the platforms run? Facebook/Meta’s stock-price drop has many causes, but falling engagement metrics are part of the landscape. Market exit and regulatory limits will both be factors, but the countervailing force of algorithmic “addictiveness” (the term often used in association with TikTok) also bears watching. Part of ByteDance’s success is in not overdriving its headlights: when Facebook got adopted by grandmothers, its days among teens were numbered, and TikTok so far hasn’t repeated that pattern. Rather than building a single mass-audience platform, look for ByteDance to maintain distinct audience segments with unique products — or that’s my guess anyway. In an issue coming soon, you'll see just how many times my projections have been wrong . . . .
To those of you stateside, best wishes for a safe and enjoyable Independence Day holiday.