Monday, November 30, 2020

Early Indications October 2020: Why this time will be different


Nearly 30 years ago, businesses across the world began a surge of investment in information technologies. Spurred by desktop computers that were fast enough to run graphical user interfaces, by the adoption of the Internet starting with email, and by the combination of management consultants and enterprise software vendors, companies began “reengineering the corporation,” as one best-selling book termed it. Chief Information Officers were named and, with great rapidity, often replaced. SAP, Siebel, Oracle, Sun Microsystems, and Microsoft all minted millionaires out of programmers and architects, while the big systems integrators — IBM, Accenture, and many others — hired, trained, and churned through thousands of young hires to staff the many and massive projects. As for results, opinions varied in both academic and managerial circles, but every CEO, no matter what industry, had to confront the decision as to how (not whether) to invest in IT.


There are reasons to wonder whether the time has come for a similar wave. Machine learning has dropped in price and proven its ability to address new kinds of problems. Cloud services have changed the capital-spending landscape sufficiently that IBM is spinning off its managed infrastructure business to focus on hybrid cloud. Internets of things, industrial and consumer alike, create stunning possibilities for improvements in efficiency, safety, and innovation. New technologies and architectures, including TensorFlow, containers, adaptive processing platforms like FPGAs, and no-code development tools, extend both the making and the using of enterprise computing to new scenarios. And no longer are programmers limited to Microsoft, Oracle, and the usual developer toolsets: Zoom just announced it wants to be a platform, joining the likes of Slack, Dropbox, and of course Salesforce.


There’s a growing list, most recently driven by Covid-19, of consequential business and social problems that require attention. The good news is that the aforementioned technical developments make new solutions (and types of solutions) possible. Finally, businesses need the kinds of transformations that IT makes possible. Amazon’s scale, Airbnb’s flexibility, Google’s reinvention of the advertising industry: none of these competitive forces could run on 1990s computing. Whether it is the lengthening of global supply chains, the offshoring of millions of western jobs, or the complex financial instruments that make modern banking possible, modern business relies on ever more complex software applications, ideally accessible anywhere, any time, on any device.


Why will this time be different? I’ve spotted many reasons, and readers will no doubt have even more. Let’s look at a few.


1) The global pandemic will make capital spending extremely challenging

Yes, interest rates look likely to remain low, facilitating borrowing. But uncertainty about the pandemic’s duration, the virus’s long-term effects, its susceptibility to vaccines and/or therapies, and its potential for mutation will not resolve for years. Hard-hit industries continue to multiply: travel and tourism, hospitality, oil, retail, and the public sector only begin the list. Corporate spending on anything exploratory is likely to remain constrained for three years, probably more.


2) The IT vendor landscape is changing

In the old days, “nobody ever got fired for buying IBM.” Later, duopolies or oligopolies emerged in sector after sector: databases (IBM, Oracle, and Sybase or another flavor of the month), enterprise operating systems (Unix variants, Windows NT), networking (Cisco and Juniper), and so on. Now, several factors represent a changed landscape from 15 years ago.

A) The installed base is a complex mess of new and old; on-premises and cloud; startups, warhorses, and unsupported failed companies; mobile and desktop; top-down and bottom-up; free and licensed; and backbones from accumulated merged entities. “Greenfield” is a ridiculous notion in this day and age.

B) Virtualization has destroyed the old OSI “layer cake” model of who did what. Is a storage area network storage or networking? Given that a social graph is a database, can our DBAs run one? Is Google Maps data or application? VMware, a software company roughly 80% owned by Dell, a hardware company, sells virtualization that runs on Amazon's AWS. Who's the lead dog on that sled? Figuring out which vendor can, should, and will do what, especially in the aforementioned complex legacy environments, is nontrivial.

C) The IS shop can no longer play gatekeeper. Beginning with Salesforce, business units and even departments began bypassing central IT and expensing SaaS seats that solved real problems, ramped up in days rather than months, and avoided the types of IT staffers who were either too intrusive (“you need to do it this way”) or unresponsive (“fill out a trouble ticket and we’ll sort them by priority”). In some companies, central IT tried to block Facebook at the firewall, smartphones notwithstanding.

D) While there are still big vendors selling IT, the buying environment is more complex. “One-stop shops” are harder to find. SAP stock fell 20% in one day a week ago. IBM is reorganizing, again. Amazon does AWS extremely well, but it’s not a holistic solution, especially without fairly capable developers on the client side to deploy it. The trend toward process outsourcing means that Accenture might run a big chunk of a company’s supply chain, but the outsourcer’s handoff points to both IS and internal SCM are far from standard or fixed. And what the contract says might not be how things actually work on the ground. Thus the days of Oracle, Deloitte, and Compaq teaming up on a big bid look very different: recall that a recent government cloud contract came down to AWS and Microsoft after Oracle and IBM were eliminated.


3) Today’s business problems are different from 1995’s

Whether it’s computational biology at a pharma company, chatbots in customer service, detecting synthetic media (deepfakes) in social-media video, or administering fair tests in online instruction, the enterprise software of past decades won’t have, or be able to accommodate, the machine learning and other computational resources these evolving business needs demand. The jokes about ERP and poured concrete resonated for a reason, and in a world where international trade agreements can change weekly, terrorism and sabotage take new forms, and markets can evaporate overnight (hello air travel), agility has become a watchword, and it is a language that big software packages typically don’t speak.


4) Today’s organization is different from 1995’s

Remote workers and the new childcare realities that they bring, contractors with evolving legal status, offshore factories migrating closer to (maybe not “back”) home, pressure from multiple sides to address racial and gender equity, people at retirement age staying on given insufficient 401(k) savings, and the need to recruit millennials whose skills, norms, and values differ from the core staff’s — it’s hard to see much business as usual, and the upheaval will increase, not decrease, for the foreseeable future. The call to be “data-driven” echoes from the C-suite outward, but people with the skills to do and to manage such processes are not yet available in sufficient numbers. (If anything, scientific and quantitative literacy in the U.S. is declining.) Deciding to do the right things is a key part of management, but doing those things right is tough if the requisite skills are simply not available.


Finally, this time will be different in part because some things stay the same. People still resist change, personally and especially collectively. Most organizations, ERP investments notwithstanding, still run on Excel + email. Work/life don’t balance. Risk-taking has been suppressed by “risk management” departments to the point where the status quo becomes organizational law. 


If it’s going to be different, what should we look for this time around?

Privacy, algorithmic transparency, and naïveté all need to be addressed by substantive debate and laws with teeth: people have proven incredibly easy to game, and the massive behavioral experiments being run by Google, Facebook, and Amazon have concrete consequences. In enterprise IT, meanwhile, when cash is king, customers can extract real change from vendors. Finally, maybe some organizations won’t waste a good crisis and will instead begin the hard work of reinventing the cost structure rather than nibbling at it, the value proposition by truly engaging with customers, and the organization by taking a hard, fresh look at what’s possible and sustainable. (The Nordic countries are one place to start.)


Jeanne Ross is a longtime fixture at MIT’s Center for Information Systems Research, having served as director and now principal research scientist over nearly 30 years there. In her new book Designed for Digital, she and two co-authors look at the IS organization emerging in the post-ERP/CRM era: those backbones are necessary but not sufficient, and IoT, machine learning, analytics, blockchain, and many other emerging technologies need to be assessed and, where applicable, utilized. Her exemplars of “big companies that get it,” drawn from CISR’s global roster, might be surprising, but I can attest, after seeing hundreds of mid-career master’s students, that Schneider Electric and Philips are in fact making positive moves.


There is a lot to learn from the book, but three core lessons from the leading companies apply in the context of this newsletter.


1.  They experiment repeatedly.

2.  They co-create with customers.

3.  They assemble cross-functional development teams.


In Ross’s words, “These three challenges attack your habits because they tend to be different from what you have done.” In other words, it’s the behaviors that matter, not the change management consultants, or the CIO who has “a seat at the table” (most still don’t), or the project’s projected ROI.  

In learning from startups, Ross notes that big companies don’t really get the notion of the pivot, of starting down a road and then changing course after new information becomes available. Yes, the established company has channels to market, capital, and brand equity that the startup lacks, but, in the end, the startup poses a threat to the incumbent for this single reason: small, open-minded organizations are better at changing their minds and shifting direction in light of experimental evidence.

Thus the short answer to the opening question — why will things be different this time? — is that size matters, and small is beautiful because agility often matters more than mass. For many reasons, look not for five-year, $50 million transformations, but rather for quick-hit, small-scale pilots with the freedom to evolve and, ideally, to pivot.

Early Indications November 2020: Intel Outside

It’s been a little over 9 years since the Silicon Valley venture capitalist Marc Andreessen proclaimed that “software is eating the world.” At the time, his credentials included playing a key role in the invention and commercialization of the web browser before becoming an investor whose portfolio companies included Facebook, Zynga, LinkedIn, Foursquare, Skype, Groupon, and Twitter. In the almost-decade since he staked out his intellectual position, software has created billions of dollars of wealth at the aforementioned Facebook, at Google, and at Netflix. As Andreessen’s own list of ostensible world-eaters illustrates, however, it’s unclear (judging from the Microsoft acquisitions Skype and LinkedIn, or from Groupon and Foursquare) how much his thesis came true. Think back to 1994-2000, and recall the software companies that indeed turned the world upside down: Akamai, Netscape, Google, PayPal, MySQL, VMware, and Salesforce.


As those last two companies foretold (as did Andreessen’s second startup, Loudcloud — later renamed Opsware), the “where” of software was changing from running on a device or customer premises to connecting via the Internet to some vast data center at an undisclosed location. As we look at 2000 to 2020, many of the companies “eating the world” may in fact build software, but their competitive differentiation includes hefty portions of hardware and infrastructure. It’s easy to see that Amazon writes clever software at huge scale, but with dozens of data centers and more than a million employees, many of whom have nothing to do with code, it can’t meaningfully be called a software company. Similarly, ByteDance writes clever algorithms to power TikTok and its other addictive services around the world, but absent server farms, there’s nothing happening.


Scanning US-based innovation in the 9+ years since Andreessen wrote, there’s really not much to see apart from Airbnb and Uber/Lyft: important companies, for sure. Using software to arbitrage everyday people’s capital (at Airbnb, this was true at the outset but is less so now) is a game changer, but it remains to be seen whether the three companies can be consistently profitable: right now the ride-sharing companies keep losing prodigious sums of money while compressing drivers’ wages and flexibility further each year. As I argued in a journal article a couple of years ago, it remains unclear how much Uber’s notoriously bad behavior derived from a crappy culture fostered by the co-founder and how much it resulted from the realization that the entire model could never work. The whole point of taxi medallions was to limit supply in order to maintain profitable pricing; absent limits on supply, prices race to the bottom of a market.


All of this is a somewhat circuitous way of suggesting that while software is incredibly important, hardware not only matters but is actually where the more interesting things are currently happening. Intel is (to use this year’s most overused word, albeit correctly) an iconic company, on par with AT&T, GM, and Wal-Mart in that it helped define an entire epoch in business history. Former CEO Andy Grove’s paranoia about being disrupted in Christensenian fashion eventually came true: Arm chips, so easily dismissed at their launch for low benchmark performance despite their power efficiency, were the blow that finally cracked Intel’s industry dominance.


2020 has been a fascinating year on the microprocessor front. SoftBank agreed to sell Arm to Nvidia to raise cash to help right the sinking Vision Fund ship. In October, AMD announced it would buy Xilinx, the dominant player in the broadly defined system-on-a-chip market. Taiwan Semiconductor Manufacturing Company, meanwhile, is fab-to-the-world, most visibly to Apple, and its stock has more than doubled off its Covid low in March. Like Apple, Google designs its own silicon and outsources manufacturing to either TSMC or GlobalFoundries, AMD’s spun-off manufacturing operation now owned by the Abu Dhabi sovereign wealth fund. Amazon bought an Israeli chip-design firm in 2015 that now designs the Arm-based custom silicon (reportedly manufactured by TSMC) that powers big chunks of AWS. Facebook appears to rely heavily on Intel, with AI chips rumored to be in development.


Given how few of us work in close proximity to Google TPUs or Amazon Gravitons, Apple’s recent launch of its M1 chip is the first experience everyday people can have of this chip revolution. The M1 is in broad outlines a close relative of the A14 Bionic chip that powers the most recent iPhones. It is available in the Mac mini and some laptop computers (which, remarkably, are on sale for Cyber Monday). Given that both chipsets are Arm-based, the recent announcement (in June) and release (in November) of Apple products that migrate off Intel processors is extremely significant. The performance gains are simply staggering: my favorite is that the M1 emulating an Intel chip — relatively efficient, though not as fast as Apple’s Arm-native code — still outruns an Intel chip at full throttle. This performance comes with battery life measured in days rather than hours, without cooling fans, and at low price points. The fact that TSMC can produce chips on a 5-nanometer process while Intel is announcing delays, out into 2022, in its 7-nm products helps explain the disparity.
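
For readers who want to see that emulation boundary on their own machines, here is a minimal Python sketch, assuming a Mac running macOS; it reads the sysctl.proc_translated flag, which macOS exposes on Apple silicon to mark processes running under Rosetta 2 translation:

import platform
import subprocess

def apple_silicon_status():
    """Best-effort check: is this process running natively on Apple silicon,
    under Rosetta 2 translation, or on Intel hardware?"""
    arch = platform.machine()  # 'arm64' for a native Apple-silicon process
    try:
        # On Apple silicon, sysctl.proc_translated is 1 for a Rosetta-translated
        # process and 0 for a native one; the key is absent on Intel Macs.
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        translated = (out == "1")
    except (subprocess.CalledProcessError, FileNotFoundError):
        translated = False
    if arch == "arm64":
        return "native Apple silicon (Arm)"
    return "x86-64 under Rosetta 2" if translated else "x86-64 on Intel hardware"

if __name__ == "__main__":
    print(apple_silicon_status())

Run with an Arm-native Python build on an M1 machine, this reports native execution; run with an x86-64 build on the same machine, it reports Rosetta translation, which is exactly the translated path the reviewers are benchmarking against Intel hardware.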


It’s unclear what markets Intel will have left: Apple is phasing out Intel chips, desktop computing is in the midst of a steep decline, and Intel captures little of the tablet market. The Wintel duopoly, meanwhile, no longer dictates the industry’s direction, and Microsoft is focusing its energies on the cloud rather than the desktop. Lots of laptops were sold to support Covid-driven work-from-home initiatives, but that blip in sales will most likely not last, given general economic softness and the size of this sudden (and one-time) refresh cycle.


I’ll spare my readers the fanboy-like praises of the many reviewers (here’s one summary), but it did warm my heart to remember seeing technology breakthroughs on a regular basis: the Mosaic browser, AltaVista then Google search, Keyhole EarthViewer, Gapminder Trendalyzer, YouTube, the iPhone, then . . . what? The iPad, EarPods, the Apple Watch, endless meal-delivery apps that will bankrupt already-drowning restaurants — none of these really engages me the way the M1 reviewers report being engaged. Just to take one example, opening the lid of an M1 Mac laptop has become a kind of sport, given the instant screen readiness. Even browser windows and simple operations are said to snap in a way that redefines the computing experience, reminding a certain kind of user why we went into this business in the first place.