Sunday, July 31, 2022

June 2022: Locating the demarcations

In the many years that I’ve been tracking technological, cultural, and economic change, it has become quite clear that 2022 is radically different from 1997. It’s not that there was one epochal moment — the dropping of atomic bombs, the US withdrawal from Vietnam, the assassination of a King or a Kennedy — but something happened. This month I want to tease out how the future we were promised in 1997 isn’t the future we got, and why that might be so.

Some salient facts about 1997:


-Cell phones were phones and texting was not at all common in the US

-Software was sold on physical media either at consumer “computer stores” or via system integrators large (Andersen Consulting) and small (Microsoft’s certified developer network)

-Computers lived in special rooms on corporate premises and/or on desktops (possibly after having been carried in a briefcase or backpack)

-Computing was big but not huge business: Microsoft’s revenues of $11 billion were on par with those of Weyerhaeuser, outside the Fortune 100

-Human attention was not yet a dominant business: the top-ranked media/entertainment company on the Fortune 100, Disney, ranked 55th

-Information intermediaries including political parties, news-gatherers, and scientific organizations were still regarded as credible.


Compare that state of affairs to our time:

-Mobile phones are becoming ubiquitous in more and more countries, even Afghanistan (with 65 phones per 100 people five years ago), with smartphones being the dominant product category (77% market share)

-Software is harder to buy on physical media, even when you really want to do so (as with Adobe Creative Suite), and most software updates occur without the user’s knowledge.

-Computers live on our persons and are no longer called computers. US residents are estimated to check them 250 times a day — roughly once every four waking minutes (see the quick arithmetic after this list).

-Companies that rely on machine learning and other advanced forms of computing trade in human attention for at least part of their revenue. Amazon (#2 on the Fortune 500), Apple (#4), Google/Alphabet (11th), Microsoft (21st), and Facebook/Meta (46th) are the prime examples.
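
As a sanity check on that phone-checking figure, here is the arithmetic in a few lines of Python. The 16 waking hours is my assumption, not a number from the estimate itself:

# Back-of-the-envelope: how often is "250 phone checks a day"?
CHECKS_PER_DAY = 250
WAKING_HOURS = 16  # assumption: 8 hours of sleep

waking_minutes = WAKING_HOURS * 60                    # 960 minutes awake
minutes_per_check = waking_minutes / CHECKS_PER_DAY
print(f"one check every {minutes_per_check:.1f} waking minutes")  # ~3.8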


Rewind

At the dawn of the World Wide Web, access to knowledge was presumed to lead to better decisions. In 1992 Tim Berners-Lee explicitly referred back to Vannevar Bush’s “As We May Think” essay from 1945, which envisioned what was essentially an infinite microfilm reader built into a desk. As early as 1998, however, a Rutgers professor (James Katz) pointed out the double-edged sword of wide access to media: “unlike newspapers and other media outlets, there are often no quality control mechanisms on Web sites that would permit users to know what information is generally recognized fact and what is spurious.” The death of gatekeepers meant the demise of fact-checking, but in the age of Yahoo, AOL, and Geocities, the problem was relatively contained.


The next big wave revolved around Google, in particular its business model. As the technical goal shifted in light of the 2001 tech recession and profitability became imperative, providing the most accurate search results did not pay the bills: getting people to click on ads did. That slight change in emphasis, from technical accuracy and performance to managing human behavior, portended the rise of the planet-scale platforms of the 2005-15 decade: Facebook, YouTube, Reddit, Instagram. Note the change in both language and conceptual underpinnings: Tripod was a collection of personal pages on an umbrella website; Facebook is a platform. Geocities-style personalization was unavailable there; the value lay instead in the links, including links to the people in your reference group who were living more interesting lives than you were. Thus the big emphasis on the behavioral aspects of computing: software companies need to get people to follow, like, subscribe, click, watch, refer, and recommend.


It’s worth pointing to the role of the smartphone in this context. As computing went from something that sits on a desk and maybe connects via a modem to something that lives in your pocket, our interactions with it evolved. I cannot imagine Facebook getting the degree of user engagement it did had it been available only as a desktop website, nor could YouTube have grown to its current scale had video cameras not become ubiquitous. Doing searches — for restaurants, for directions, for recommendations — in the moment, as opposed to at the desktop before venturing forth, is a completely different process.


At the same time, Amazon’s re-invention of time-share computing under the cloud rubric, along with Salesforce and the other software-as-a-service plays, further de-emphasized traditional software practices. This was observed first in B2B markets, then in prosumer software (Adobe Creative Suite), then in consumer software: nobody downloads Google, and many aspects of software functionality are now paid for with ads. It may be almost a full decade since I last installed software off a physical disk. Note that apart from cryptocurrency, which I will address in another newsletter, there hasn’t been a US-based breakout software play since 2009, when Uber and Airbnb came to market. Similarly, between Apple/Samsung’s conquest of the smartphone market and Amazon et al.’s conversion of many enterprise systems to off-premise computing, people who say they work in computing circa 2022 probably don’t do things or buy things the way their predecessors of 15 years ago did.


2015 to the pandemic

Because this chapter hasn’t really played out yet, I’ll suggest two themes and trust that readers will suggest more that I’m missing. First, machine learning has been in a period of rapid improvement — in performance, in range of application, in scale — since about 2012, as a result of the aforementioned cloud-scale compute farms, of bigger and better training data, and of some decisive advances in algorithms. If you haven’t played with the DALL·E 2 AI image generator (https://openai.com/dall-e-2/), I suggest you do so, just to have something to wrap your mind around. Add Amazon’s announcement this week that Alexa can speak in the voices of people no longer living (https://www.npr.org/2022/06/23/1107079194/amazon-alexa-dead-relatives-voice), and we need to get very serious very soon about the issue of deep fakes in multiple audio and video formats.


The signal “contribution” of machine learning to date, however, is not in images but in hacking human behavior. Whether it’s the McDonald’s video menu board, ads on social media, recommendations on YouTube, or something as simple yet frequently disturbing as Google type-ahead, people are being manipulated throughout the day by companies operating under little oversight and under regulations ill-fitted to the task of preserving human autonomy, privacy, and independence from lock-in. (Note that autonomous passenger vehicles, Elon Musk’s promises and branding to the contrary, are not yet ready for prime time.) How tech gets regulated in the coming years will be key to both human and economic futures.
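
To make “optimizing for engagement” concrete, here is a minimal sketch in Python of an epsilon-greedy bandit steering traffic toward whatever content earns the most clicks. The content labels and click rates are invented for illustration; this shows the general shape of engagement optimization, not any platform’s actual code:

import random

# Toy epsilon-greedy bandit: show whichever content variant has earned
# the best click rate so far, exploring occasionally.
VARIANTS = ["calm_news", "outrage_take", "cute_animals"]   # invented labels
TRUE_CLICK_RATES = {"calm_news": 0.02, "outrage_take": 0.08, "cute_animals": 0.05}
EPSILON = 0.1   # fraction of impressions spent exploring

shows = {v: 0 for v in VARIANTS}
clicks = {v: 0 for v in VARIANTS}

for _ in range(100_000):
    if random.random() < EPSILON:
        choice = random.choice(VARIANTS)   # explore
    else:                                  # exploit the best observed click rate
        choice = max(VARIANTS, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)
    shows[choice] += 1
    if random.random() < TRUE_CLICK_RATES[choice]:   # simulate a user click
        clicks[choice] += 1

print(shows)   # traffic piles onto whatever clicks best

Nothing in that loop asks whether a variant is true, healthy, or good for the viewer; whatever clicks, wins. That asymmetry is the regulatory problem in miniature.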


It may be that I’m operating with the blinders of someone completing a book on online video, but the second big story since 2015 for me is the rise of TikTok. Consider a few facts that mark it as a departure:


-Parent company ByteDance’s core competency is algorithmic control of human attention. TikTok is but one app that operationalizes this capability; it’s fascinating to consider where the parent company might pivot next: mobile telephony, enterprise software (think Slack for countries that might not want to buy from US firms), even banking.


-TikTok was built on a full-screen vertical video model: one item occupies the screen at a time, with no scrolling feed. As a result, the app can map visual cause to behavioral effect with extreme accuracy. Scrolling apps, meanwhile, have very little idea of what might be on the screen when I click “like” or click away (see the sketch after this list).


-TikTok takes action — starts a video — rather than presenting a user with alternatives.


-The short-video format means that many more behavioral data points are collected per hour of interaction with the app. In any machine-learning scenario, the better the training data, the better the performance.


-TikTok was built as an app, with its web presence ported later as a secondary consideration. Like Uber, it was built mobile-first, so old assumptions about “Internet companies” need to be tested.


-TikTok has stakeholders ranging from VCs (SoftBank, Sequoia) to private equity (General Atlantic, KKR) to the Chinese Communist Party. Its governance, exit strategies, and accountability are unlike any other tech firm’s, and are still very much in flux. To date, no other Chinese firm has established anywhere near this big a consumer presence outside China, so how the company moves forward will be important to watch.
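
Two of those points, the cause-and-effect mapping and the data density, can be made concrete in a few lines of Python. This is my own illustration of the logging problem, not ByteDance code; the event fields are invented:

# Full-screen feed: each event names exactly one video, so every
# dwell-time or skip becomes an unambiguous training label.
fullscreen_events = [
    {"video": "v1", "watched_s": 15, "action": "completed"},
    {"video": "v2", "watched_s": 3,  "action": "swiped_away"},
]
labels = [(e["video"], e["action"]) for e in fullscreen_events]  # one label per event

# Scrolling feed: a "like" fires while several items share the screen,
# so the platform must guess which item actually earned it.
scroll_event = {"on_screen": ["post_a", "post_b", "post_c"], "action": "like"}
candidates = scroll_event["on_screen"]   # three possible targets, one real one

# Data density: 15-second clips yield ~240 explicit watch/skip signals
# per hour, versus a handful for hour-long videos.
signals_per_hour = 3600 // 15
print(labels, candidates, signals_per_hour)   # ..., 240

Cleaner labels, and many more of them per hour, is exactly the recipe for better-performing recommendation models.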


What next?

History has a way of, if not repeating, then at least rhyming. The LP record is a more viable format than the compact disc, 40 years after the latter promised “perfect sound — forever.” Moleskine paper planners and diaries are big sellers. Old-school sports and games from bowling to axe-throwing attract hipster audiences. Will we see more analog record-keeping in the days ahead? Even people who burned photo libraries to DVD can have a hard time playing physical media these days. 


Similarly, will some people rediscover the desktop, reserving the phone for on-the-go mapping, weather, and other emotionally lightweight uses? Anecdotally, the number of people I’ve heard trying to limit the phone’s presence in their lives seems to be growing, and the surge in visits to U.S. wilderness areas suggests that getting unplugged might be part of the appeal. (Note that the most recent Apple iOS has multiple controls for shutting functionality off.) There’s also the matter of an aging user base: no matter the screen resolution, older eyes need bigger print.


Finally, how long and how deep will public dissatisfaction with the platforms run? Facebook/Meta’s stock price drop has many causes, but falling engagement metrics are part of the landscape. Market exit and regulatory limits will both be factors, but the countervailing force of algorithmic “addictiveness” (the term often used in association with TikTok) also bears watching. Part of ByteDance’s success lies in not overdriving its headlights: when Facebook got adopted by grandmothers, its days among teens were numbered, and TikTok so far hasn’t repeated that pattern. Rather than building a mass-audience single platform, look for ByteDance to maintain distinct audience segments with unique products — or that’s my guess, anyway. In an issue coming soon, you'll see just how many times my projections have been wrong . . . .


To those of you stateside, best wishes for a safe and enjoyable Independence Day holiday.

July 2022 Early Indications: Actual Reality

As much as it’s tempting to address a wave of news in the virtual reality market from Meta, Apple, and elsewhere, I’m more concerned about the fate of ground truth — actual reality — in the current technology landscape. Whether it involves physical goods, news, or personal identities, fakery has never been easier to carry out or harder to regulate. At some level, this is also a question of trust, specifically of the eclipse of human-scale mechanisms as we build planet-scale platforms that move too fast to keep up with. Let’s start with examples, then move to the concerns.

The aptly named The RealReal is a consignment seller of luxury goods. The business is crucially dependent on human authenticators who can vouch for a six-figure Gucci or Louis Vuitton. The site has never had trouble attracting traffic, but neither has it achieved profitability. As is happening at many companies these days, the founder was replaced in part because of investors’ concerns over cost controls. Said founder’s previous startup — pets.com — should have provided some clues in this department, one would think, but in any event, authentication is both an information-systems challenge and a real-world one. eBay provides authentication for select wristwatches (a Patek Philippe from a Boca Raton jeweler is currently listed for $1.25 million) as well as sneakers. Third parties are introduced into the workflow, adding complexity, time, and shipping costs to the transaction, but the all-important trust relationship is maintained. How long can either authentication model — The RealReal’s in-house staff, eBay’s external network — persist in the face of better-financed fakers, higher transaction volume, or recessionary cost pressure?


A widely circulated video appeared to portray House Speaker Nancy Pelosi as drunk. With no gatekeepers of the CBS News or Washington Post sort to stop the flow, millions of people were fooled by the simple act of slowing down the video’s frame rate. As far as I can determine, such manipulation is not prohibited at Facebook; at YouTube, a clause in the terms of service about political figures might disqualify the video. Coming very soon, deep-fake videos will make such crude manipulation seem quaint. Hollywood-grade effects tools are becoming widely available, and there are no standards in place to watermark or otherwise authenticate first-party video originality and fidelity. Given the tenor of US and other political debate since 2016 or so — in which blatant lies are routinely repeated, transmitted, and amplified — we should probably expect to see deep fakes play a role in an upcoming election, whether in the UK, EU, or US. Teaching digital literacy is already an uphill battle: people still click on dangerous email links every day. Now imagine trying to teach Internet users of any age, in any nation, that what they are seeing and hearing might be fake while there are limited tools for testing any given artifact.
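
The cryptographic core of such authentication is not exotic; the hard parts are standardization and deployment. Here is a minimal sketch, using Python’s third-party cryptography package, of signing a video file’s digest so that any later alteration is detectable. It is a toy stand-in for emerging provenance efforts such as C2PA, not an implementation of any of them, and the file name is hypothetical:

import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def file_digest(path: str) -> bytes:
    """SHA-256 of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()

# The camera maker or publisher would hold the private key.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

signature = private_key.sign(file_digest("speech.mp4"))  # hypothetical file

# Later, a platform or viewer re-hashes the file and checks the signature.
try:
    public_key.verify(signature, file_digest("speech.mp4"))
    print("authentic: file matches what was signed")
except InvalidSignature:
    print("altered: file does not match what was signed")

Slowing a clip’s frame rate means re-encoding it, so the digest changes and verification fails. The unsolved problems are key distribution and getting capture devices, editing tools, and platforms to participate.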


Long ago an investigative reporter was told to “follow the money” on the Watergate cover-up. It’s still good advice. There’s a reason Meta until recently carried a near-trillion-dollar valuation while any news-focused organization you can name is worth a fraction of that. Meta is concerned only with “engagement,” a metric that conveniently increases not with the propagation of truth but with falsehood: whatever gets people riled up does the job nicely. The company is seeing its usage and reputation decline as Apple’s forced opt-in policy has taken millions of users off Facebook’s cross-site behavioral tracking.


Meta is now taking steps to loosen controls on false or deceptive anti-vaccine and related content, apparently because it cannot afford to keep falsehoods out of users’ feeds. A 2020 article in Nature showed that


“anti-vaccination clusters offer a wide range of potentially attractive narratives that blend topics such as safety concerns, conspiracy theories and alternative health and medicine, and also now the cause and cure of the COVID-19 virus. . . . By contrast, pro-vaccination views are far more monothematic. Using aggregation mathematics and a multi-agent model, we have reproduced the ability of anti-vaccination support to form into an array of many smaller-sized clusters, each with its own nuanced opinion, from a population of individuals with diverse characteristics . . . .”

In other words, the person-to-person “feel” of Facebook plays well with deliberate or incidental micro-targeting: anti-vaccine adherents can find whatever flavor of argument is most congruent with their preexisting beliefs. Very few public-health workers are expert in such behavioral-marketing nuance, so the “monothematic” messaging continues, accompanied by an ample dose of inconsistency: what is the virus called this week? (Both nation of first observation and Greek letters seem to have left the building.) Should young children get vaccinated? When? How often? Does masking help? When and where is it required? Is there a BA.4/5-specific vaccine yet?

The social critic (he co-founded Spy magazine, for those of a certain age) Kurt Andersen plausibly connects the rise of the Internet — massive communications power with few gatekeepers — to the anti-authoritarian impulse of the 1960s. Now, nobody’s word can be said to be definitive and every voice gets its own free channel: “Before the Internet, crackpots were mostly isolated and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the Web, just like actual news. Now all the fantasies look real.” 

How much deeper will much of the world venture into fantasyland? How far will our various publics veer from reality, whether wittingly or via deception? How will science education, instructors in civic virtue, and news media react when the lies and falsehoods get bigger and more consequential as the tools for their dissemination grow stronger? How much more damage will civic institutions — schools, government agencies, public safety — sustain? 

What might a positive path forward look like? As Andersen notes, people often prefer falsehood to fact, so any effort that involves behavioral engineering needs to be carefully considered, monitored, and adjusted. Old-style public-service messaging like “Only you can prevent forest fires” or “Just say no” turns out to be a) less true than the taglines once were and b) disconnected from the current zeitgeist. Thus, addressing the root cause of rampant fakery — insatiable demand for it — relies on both messages and institutions that no longer command the respect they once did. 

All of this adds up to seriously confusing times. At the same time that the Webb telescope opens entirely new chapters of deep-space astronomy to millions, cultists believe the most outlandish quackery or worse. In fact, one observer (on Twitter, so I can’t find the post any more) posited that movie scripts are now the measure of plausibility: forget ground truth; if Nicolas Cage or some Marvel character can be imagined doing it, I’ll choose that over the mundanity of record-keeping, basic math, and legal reasoning. There’s something to this: what else would explain the seemingly infinite appetite for dystopian plot lines in the face of overall good news: an almost instant vaccine for a global pandemic, fewer people (by percentage) in poverty than ever, improved legal rights for many minority populations? (I first noted this trend in the September 2016 newsletter, but we don’t seem to have turned any corners.) In the midst of such confusion, “keeping it real” is getting more difficult every year.