Monday, January 25, 2016

Early Indications October 2015: Of colleges, jobs, and analytics

It's funny how careers unfold. As a result of being in a particular place at a particular time, I find myself teaching analytics, supply-chain management, and digital strategy, mostly at the master's level. Not only did I not study any of these subjects in graduate school, but none of these disciplines existed under its current name as recently as 20 years ago. What follows are some reflections on careers, skills, and patterns in education, prompted by my latest adventures as well as some earlier ones.

1) What should I major in?
Across the globe, parents and students look at the cost of college, salary trends, layoffs, predilections, and aspirations, then take a deep breath and sign up for a major. I have seen this process unfold multiple times, and people sometimes miss some less obvious questions that are tough to address but even tougher to ignore.

The seemingly relevant question, "What am I good at?", is tough to answer with much certainty: we require students to declare a major before they've taken many (sometimes any) courses in it, and coursework and salaried work are of course two different things as well. While it's tempting to ask "Who's hiring?", it's much harder to ask "Where will there be good jobs in 20 years?" Very few Chief Information Officers in senior positions aspired to that title in college, mostly because it didn't exist. Now that CIOs are more common, it's unclear whether the title and skills will be as widely required once sensors, clouds, and algorithms improve over the next decade or two.

It's even more difficult to extrapolate what the new "hot" jobs will be. In the late 1990s, the U.S. Bureau of Labor Statistics encouraged students to go into desktop publishing, based on projected demand. In light of smartphones, social networks, and "green" thinking, the projected demand for paper media never materialized, and then tablets, e-readers, and wearables cut into it still further. It's easy to say the Internet of Things or robotics will be more important in 20 years than they are today, but a) will near-term jobs have materialized when the student loan payments come due right after graduation, and b) are there enough relevant courses at a given institution? One cause of the nursing shortage that emerged about 15 years ago was a shortfall in the number of nursing professors: there were unfilled jobs and eager students, but not enough capacity to train sufficient numbers of people to ease the hiring crunch.

2) English (or psychology, or fill in the blank) majors are toast

Many politicians are trying to encourage STEM career development in state universities and cite low earning potential for humanities graduates as a reason to cut funding to these fields. As Richard Thaler would say, it matters whether you make French majors pay a premium, or give chemical engineers a discount: the behavioral economics of these things are fascinating. The University of Florida led the way here about three years ago, but it's hard to tell how the experiment panned out.

At the same time, the respected venture investor Bill Janeway wrote a pointed piece in Forbes this summer, arguing that overcoming the friction in the atoms-to-bits-to-atoms business model (Uber being a prime example) demands not just coding or financial modeling, but something else:


"Unfortunately for those who believe we have entered a libertarian golden age, freed by digital technology from traditional constraints on market behavior, firms successful in disrupting the old physical economy will need to have as a core competency the ability to manage the political and cultural elements of the eco-systems in which they operate, as well as the purely economic ones. . . .

In short, the longer term, sustainable value of those disrupters that succeed in closing the loop from atoms to bits and back to atoms will depend as much on successful application of lessons from the humanities (history, moral philosophy) and the social sciences (the political economy and sociology of markets) as on mastery of the STEM disciplines."

http://www.forbes.com/sites/valleyvoices/2015/07/30/from-atoms-to-bits-to-atoms-friction-on-the-path-to-the-digital-future/

On the whole, as the need for such contrarian advice illustrates, we know little beyond the stereotypes of college majors. The half-life of technical skills is shrinking, so learning how to learn becomes important in building a career rather than merely landing an entry-level position. Evidence for the growing ability of computers and robots to replace humans is abundant: IBM bought the Weather Channel in part to feed the Watson AI engine, Uber wants robotic cars to replace its human drivers, and even skilled radiologists can be outperformed by algorithms. A paper by Carl Frey and Michael Osborne at Oxford convincingly rates most career fields by their propensity to be automated. It's a very illuminating, scary list (skip to the very end):

http://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf

To bet against one's own career, in effect short-selling an occupational field, requires insight, toughness, and luck. At the same time, the jobs that require human interaction, memory of historical precedent, and tactile skills will take longer to automate. Thus the liberal-arts orientation toward teaching people how to think, rather than how to be a teacher, accountant, or health-club trainer, will win out, I believe. This is a long-term bet, to be sure, and in the interim there will be unemployed Ivy Leaguers looking with some envy at their more vocationally focused state-school kin. Getting the timing right will be more luck than foresight.

3) What is analytics anyway?
As I've developed both a grad course and a workshop for a client in industry, I'm coming to understand this question differently. A long time ago, when I taught freshman composition, it took a few semesters to understand that while effective writing uses punctuation correctly, a course in expository writing (as it was called) was really an attempt to teach students how to think: to assess sources, to take a position, and to buttress an argument with evidence. All too frequently, however, colleges see the labor-intensive nature of freshman writing seminars as a cost to be cut, whether through grad students, adjuncts, automation, or bigger section sizes. Each of these detracts from the close reading, personal attention, and rigorous exercises that neither scale well nor can be delivered capably by many grad students or overworked adjuncts.

I'm seeing similar patterns in analytics. Once you get past the initial nomenclature, the two disciplines look remarkably similar: while courses are nominally about different things (words and numbers), each seeks to teach the skills of assessing evidence, sustaining a point of view, and convincing a fair-minded audience with analysis and sourcing. To overstate, perhaps, analytics is no more a matter of statistics than writing is a matter of grammar: each is a necessary but far from sufficient element of the larger enterprise. Numbers can be made to say pretty much whatever the author wants them to say, just as words can. In this context, the recent finding that only about a third (36%) of published research findings in psychology could be replicated stands as a cautionary tale. (https://www.washingtonpost.com/news/speaking-of-science/wp/2015/08/27/trouble-in-science-massive-effort-to-reproduce-100-experimental-results-succeeds-only-36-times/) Unfortunately, American numeracy -- quantitative literacy -- is extremely low, leaving millions of people ill-equipped to manage businesses, households, and retirement portfolios. Producing sound academic research, meanwhile, looks to be even rarer than we thought.
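To make the malleability of numbers concrete, here is a minimal sketch in Python (simulated data only, not drawn from any of the studies cited) of one mechanism behind failed replications: test enough arbitrary slices of pure noise and something will clear the conventional significance bar by chance.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05          # conventional significance threshold
n_subgroups = 20      # slice the same noise twenty different ways

hits = 0
for _ in range(n_subgroups):
    treatment = rng.normal(0, 1, 50)   # no real effect in either group
    control = rng.normal(0, 1, 50)
    _, p_value = stats.ttest_ind(treatment, control)
    if p_value < alpha:
        hits += 1

print(f"'Significant' findings in pure noise: {hits} of {n_subgroups}")

Report only the hits and the write-up looks persuasive; ask someone to replicate any one of them and it will usually evaporate, which is part of the pattern behind the 36% figure.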

A paradox emerges: at the moment when computational capability is effectively free and infinite relative to an individual's needs, the skills required to deploy that power are very unevenly distributed, with little sign of improvement any time soon. How colleges teach, whom we teach, what we teach, and how it gets applied are all in tremendous upheaval: it's exciting, for sure, but the collateral damage is mounting (in the form of student loan defaults and low completion rates at for-profit colleges, to take just one example). Are we nearing a perfect storm of online learning, rapidly escalating demand for new skills, sticker shock or even buyer refusal to pay college tuition bills, abuses of student loan funding models, expensive and decaying physical infrastructure (much of it built in the higher-education boom of the 1960s), and demographics? Speaking of paradoxes, how soon will the insights of analytics -- discovering, interpreting, and manipulating data to inform better decisions -- take hold in this domain?