In the last years of the nineteen-eighties, I worked not at startups but at what might be called finish-downs. Tech companies that were dying would hire temps—college students and new graduates—to do what little was left of the work of the employees they’d laid off. This was in Cambridge, near M.I.T. I’d type users’ manuals, save them onto 5.25-inch floppy disks, and send them to a line printer that yammered like a set of prank-shop chatter teeth, but, by the time the last perforated page coiled out of it, the equipment whose functions those manuals explained had been discontinued. We’d work a month here, a week there. There wasn’t much to do. Mainly, we sat at our desks and wrote wishy-washy poems on keyboards manufactured by Digital Equipment Corporation, left one another sly messages on pink While You Were Out sticky notes, swapped paperback novels—Kurt Vonnegut, Margaret Atwood, Gabriel García Márquez, that kind of thing—and, during lunch hour, had assignations in empty, unlocked offices. At Polaroid, I once found a Bantam Books edition of “Steppenwolf” in a clogged sink in an employees’ bathroom, floating like a raft. “In his heart he was not a man, but a wolf of the steppes,” it said on the bloated cover. The rest was unreadable.
Not long after that, I got a better assignment: answering the phone for Michael Porter, a professor at the Harvard Business School. I was an assistant to his assistant. In 1985, Porter had published a book called “Competitive Advantage,” in which he elaborated on the three strategies—cost leadership, differentiation, and focus—that he’d described in his 1980 book, “Competitive Strategy.” I almost never saw Porter, and, when I did, he was dashing, affably, out the door, suitcase in hand. My job was to field inquiries from companies that wanted to book him for speaking engagements. “The Competitive Advantage of Nations” appeared in 1990. Porter’s ideas about business strategy reached executives all over the world.
Porter was interested in how companies succeed. The scholar who in some respects became his successor, Clayton M. Christensen, entered a doctoral program at the Harvard Business School in 1989 and joined the faculty in 1992. Christensen was interested in why companies fail. In his 1997 book, “The Innovator’s Dilemma,” he argued that, very often, it isn’t because their executives made bad decisions but because they made good decisions, the same kind of good decisions that had made those companies successful for decades. (The “innovator’s dilemma” is that “doing the right thing is the wrong thing.”) As Christensen saw it, the problem was the velocity of history, and it wasn’t so much a problem as a missed opportunity, like a plane that takes off without you, except that you didn’t even know there was a plane, and had wandered onto the airfield, which you thought was a meadow, and the plane ran you over during takeoff. Manufacturers of mainframe computers made good decisions about making and selling mainframe computers and devising important refinements to them in their R. & D. departments—“sustaining innovations,” Christensen called them—but, busy pleasing their mainframe customers, one tinker at a time, they missed what an entirely untapped customer wanted, personal computers, the market for which was created by what Christensen called “disruptive innovation”: the selling of a cheaper, poorer-quality product that initially reaches less profitable customers but eventually takes over and devours an entire industry.
Ever since “The Innovator’s Dilemma,” everyone is either disrupting or being disrupted. There are disruption consultants, disruption conferences, and disruption seminars. This fall, the University of Southern California is opening a new program: “The degree is in disruption,” the university announced. “Disrupt or be disrupted,” the venture capitalist Josh Linkner warns in a new book, “The Road to Reinvention,” in which he argues that “fickle consumer trends, friction-free markets, and political unrest,” along with “dizzying speed, exponential complexity, and mind-numbing technology advances,” mean that the time has come to panic as you’ve never panicked before. Larry Downes and Paul Nunes, who blog for Forbes, insist that we have entered a new and even scarier stage: “big bang disruption.” “This isn’t disruptive innovation,” they warn. “It’s devastating innovation.”
Things you own or use that are now considered to be the product of disruptive …