Sunday, July 31, 2022

July 2022 Early Indications: Actual Reality

As tempting as it is to address the wave of virtual reality news from Meta, Apple, and elsewhere, I’m more concerned about the fate of ground truth — actual reality — in the current technology landscape. Whether it involves physical goods, news, or personal identities, the opportunities for fakery have never been easier to exploit or harder to regulate. At some level this is also a question of trust, specifically of the eclipse of human-scale mechanisms as we build planet-scale platforms that move too fast to keep up with. Let’s start with examples, then move to the concerns.

The aptly named The RealReal is a consignment seller of luxury goods. The business depends crucially on human authenticators who can vouch for a six-figure Gucci or Louis Vuitton. The site has never had trouble attracting traffic, but neither has it achieved profitability. As is happening at many companies these days, the founder was replaced in part because of investors’ concerns over cost controls. Said founder’s previous startup — pets.com — should have provided some clues in this department, one would think, but in any event, authentication is both a systems challenge and a real-world one. eBay provides authentication for select wristwatches (a Patek Philippe from a Boca Raton jeweler is currently listed for $1.25 million) as well as sneakers. Third parties are introduced into the workflow, adding complexity, time, and shipping costs to the transaction, but the all-important trust relationship is maintained. How long can either authentication model — The RealReal’s in-house staff, eBay’s external network — persist in the face of better-financed fakers, higher transaction volumes, or recessionary cost pressure?


A widely circulated video appeared to portray House Speaker Nancy Pelosi as drunk. With no gatekeepers at CBS News, the Washington Post, or similar organizations to stop the flow, millions of people were fooled by the simple act of slowing down the video’s playback speed. As far as I can determine, such manipulation is not prohibited at Facebook; at YouTube, a clause in the terms of service about political figures might disqualify the video. Deep-fake videos, coming very soon, will make such crude manipulation seem quaint. Hollywood-grade effects tools are becoming widely available, and there are no standards in place to watermark or otherwise authenticate the originality and fidelity of first-party video. Given the tenor of US and other political debate since 2016 or so — in which blatant lies are routinely repeated, transmitted, and amplified — we should probably expect deep fakes to play a role in an upcoming election, whether in the UK, the EU, or the US. Teaching digital literacy is already an uphill battle: people still click on dangerous email links every day. Now imagine trying to teach Internet users of any age, in any nation, that what they see and hear might be fake when there are few tools for testing any given artifact.
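To make the watermarking idea concrete, here is a minimal sketch, entirely my own illustration rather than any existing standard, of how a publisher might attach a verifiable fingerprint to a video file at release time. The filename and the shared secret are hypothetical, and a real scheme would use public-key signatures; an HMAC stands in here so the example stays within Python's standard library.

```python
import hashlib
import hmac
import json
import pathlib

# Hypothetical sketch only: the filename and the shared secret are invented.
# A real provenance standard would sign the digest with the publisher's
# private key; an HMAC stands in so the example stays stdlib-only.
SECRET = b"publisher-signing-key"

def make_record(path: str) -> dict:
    """Fingerprint a video file at publication time."""
    data = pathlib.Path(path).read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    tag = hmac.new(SECRET, digest.encode(), hashlib.sha256).hexdigest()
    return {"file": path, "sha256": digest, "tag": tag}

def verify(path: str, record: dict) -> bool:
    """Re-fingerprint the file and compare against the published record.
    Any re-encoding -- including slowing the playback speed -- changes the
    bytes, so the digest no longer matches and verification fails."""
    data = pathlib.Path(path).read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    tag = hmac.new(SECRET, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(tag, record["tag"])

if __name__ == "__main__":
    record = make_record("speech_original.mp4")   # hypothetical file
    print(json.dumps(record, indent=2))
    print("authentic?", verify("speech_original.mp4", record))
```

Even a scheme this simple would flag the slowed-down Pelosi clip, since re-encoding changes the file's bytes; the hard problems are key management, adoption by capture devices and platforms, and handling legitimate edits.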


Long ago an investigative reporter was told to “follow the money” on the Watergate coverup. It’s still good advice. There’s a reason Meta was, until recently, valued at nearly a trillion dollars while any news-focused organization you can name is worth a small fraction of that. Meta is concerned only with “engagement,” a metric that conveniently increases not with the propagation of truth but with falsehood: whatever gets people riled up does the job nicely. The company has seen its usage and reputation decline since Apple’s forced opt-in policy took millions of users off Facebook’s cross-site behavioral tracking.


Meta is now taking steps to loosen controls on false or misleading anti-vaccine and related content, apparently because it cannot afford to keep falsehoods out of users’ feeds. A 2020 article in Nature showed that


“anti-vaccination clusters offer a wide range of potentially attractive narratives that blend topics such as safety concerns, conspiracy theories and alternative health and medicine, and also now the cause and cure of the COVID-19 virus. . . . By contrast, pro-vaccination views are far more monothematic. Using aggregation mathematics and a multi-agent model, we have reproduced the ability of anti-vaccination support to form into an array of many smaller-sized clusters, each with its own nuanced opinion, from a population of individuals with diverse characteristics . . . .”

In other words, the person-to-person “feel” of Facebook plays well with deliberate or incidental micro-targeting: anti-vaccine adherents can find whatever flavor of argument is most congruent with their preexisting beliefs. Very few public-health workers are experts in such behavioral-marketing nuance, so the “monothematic” messaging continues, accompanied by an ample dose of inconsistency: what is the virus called this week? (Both the nation of first observation and Greek letters seem to have left the building.) Should young children get vaccinated? When? How often? Does masking help? When and where is it required? Is there a BA.4/BA.5-specific vaccine yet?
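To see why a single monothematic message loses ground to an array of tailored ones, here is a toy simulation, my own invention rather than the Nature authors' model; the concern categories and the 60 percent pull of a tailored narrative are assumptions made purely for illustration.

```python
import random

# Toy illustration of the clustering dynamic described above -- not the
# Nature authors' model. The concern categories and the 60% "attraction
# rate" of a tailored narrative are invented for the sake of the sketch.
CONCERNS = ["safety", "conspiracy", "alternative medicine", "covid origins"]

random.seed(42)
anti_clusters = {c: [] for c in CONCERNS}   # one niche narrative per concern
pro_cluster = []                            # one monothematic message

for agent in range(1000):
    concern = random.choice(CONCERNS)       # each agent cares most about one topic
    if random.random() < 0.6:               # tailored narrative wins 60% of the time
        anti_clusters[concern].append(agent)
    else:
        pro_cluster.append(agent)

print("pro-vaccine cluster:", len(pro_cluster), "members")
for concern, members in anti_clusters.items():
    print(f"anti-vaccine cluster ({concern}): {len(members)} members")
# Typical result: one large pro-vaccine cluster alongside an array of
# smaller anti-vaccine clusters, each organized around its own nuance.
```

Run repeatedly, the shape the paper describes emerges: many smaller anti-vaccine clusters, each with its own nuanced appeal, arrayed against one undifferentiated pro-vaccine message.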

The social critic (he co-founded Spy magazine, for those of a certain age) Kurt Andersen plausibly connects the rise of the Internet — massive communications power with few gatekeepers — to the anti-authoritarian impulse of the 1960s. Now nobody’s word can be said to be definitive, and every voice gets its own free channel: “Before the Internet, crackpots were mostly isolated and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the Web, just like actual news. Now all the fantasies look real.”

How much deeper will much of the world venture into fantasyland? How far will our various publics veer from reality, whether wittingly or via deception? How will science education, instructors in civic virtue, and news media react when the lies and falsehoods get bigger and more consequential as the tools for their dissemination grow stronger? How much more damage will civic institutions — schools, government agencies, public safety — sustain? 

What might a positive path forward look like? As Andersen notes, people often prefer falsehood to fact, so any effort that involves behavioral engineering needs to be carefully considered, monitored, and adjusted. Old-style public-service messaging like “Only you can prevent forest fires” or “Just say no” turns out to be a) less true than it once was and b) disconnected from the current zeitgeist. Thus, addressing the root cause of rampant fakery — insatiable demand for it — relies on both messages and institutions that no longer command the respect they once did.

All of this adds up to seriously confusing times. At the same time that the Webb telescope opens entirely new chapters of deep-space astronomy to millions, cultists believe the most outlandish quackery, or worse. In fact, one observer (on Twitter, so I can’t find the post any more) posited that movie scripts are now the measure of plausibility: forget ground truth; if Nicolas Cage or some Marvel character can be imagined doing it, I’ll choose that over the mundanity of record-keeping, basic math, and legal reasoning. There’s something to this: what else would explain the seemingly infinite desire for dystopian plot lines in the face of overall good news, such as an almost instant vaccine for a global pandemic, fewer people (by percentage) in poverty than ever, and improved legal rights for many minority populations? (I first noted this trend in the September 2016 newsletter, but we don’t seem to have turned any corners.) In the midst of such confusion, “keeping it real” is getting more difficult every year.