The Limits of Strategy: Lessons in Leadership from the Computer Industry - Softcover

Von Simson, Ernest

9781440192609: The Limits of Strategy: Lessons in Leadership from the Computer Industry

Synopsis

1992 was a killing year for the four computer companies most important to business buyers over the decade. All four had been dominant suppliers of minicomputers for the previous fifteen or twenty years. But on July 16, the CEOs of both Digital Equipment and Hewlett-Packard were pushed into retirement. On August 8, Wang Laboratories declared bankruptcy. In December, IBM halved its dividend for the first time ever, forcing the resignation of its CEO a month later. How did this happen? All four CEOs were clever and experienced. Two were founders of their companies; the other two were highly successful career executives at their respective companies. All four were simply overwhelmed.
 
And while there was no single explanation for what happened, there were definite common themes.  They recur again and again in the many stories of this book.  Are the deadliest changes unavoidable because strategy is too easily thwarted by cluster bombs like technological velocity, cultural inertia, obsolete business models, executive conflict, and investor expectations?

The year 1992 is the fulcrum of this book, but the underlying theme is company transitions in the face of massive changes in markets, technologies, or business models – or, in other words, the limits of strategy.

The synopsis may refer to another edition of this title.

About the Author

With his wife, Naomi Seligman, Ernest von Simson ran the Research Board, the quietly powerful think tank that observed, shaped, and guided the development of the computer industry. They got to know and admire the giants of those years, including Michael Dell, Bill Gates, Andy Grove, and many more.

Excerpt. © Reprinted by permission. All rights reserved.

The Limits Of Strategy

Lessons in Leadership from the Computer Industry
By Ernest von Simson

iUniverse, Inc.

Copyright © 2009 Ernest von Simson
All rights reserved.

ISBN: 978-1-4401-9260-9

Contents

Preface ....... vii
Introduction ....... xi
1: A Mad Dash Through History ....... 1
2: The Strategic Gold Standard: The Watsons ....... 12
3: Reorganizing To Rearm: Frank Cary at IBM ....... 35
4: The Competitive Limits Of Technology: Amdahl versus IBM ....... 52
5: Transient Technology: Travails of the Mini Makers ....... 66
6: First Movers: The Dawning of the Personal Computer ....... 87
7: Defeated In Succession: An Wang at Wang Labs ....... 109
8: Retrospective Strategy: John DeButts at AT&T ....... 133
9: Foreign Cultures: AT&T's Recruit from IBM ....... 151
10: The Perils Of Incumbency: Sun and Oracle Take Over the Neighborhood ....... 167
11: Self-Accelerating Economies Of Scale: Apple, Microsoft, and Dell ....... 194
12: Choosing The Wrong War: IBM Takes On Microsoft ....... 214
13: Powering To The Apogee: Ken Olsen at DEC ....... 231
14: Tumbling To Collapse: The Palace Guard Ousts Olsen ....... 250
15: Field Force And Counterforce: DEC, HP, and IBM in Battle Mode ....... 274
16: Distracted By Competition: IBM Battles Fujitsu and Hitachi ....... 290
17: Navigating The Waves At IBM: Akers Runs Aground, and Gerstner Takes the Helm ....... 305
18: Squandered Competitive Advantage: IBM Mainframes and Minicomputers ....... 324
19: Building A Great Business: Paul Ely at Hewlett-Packard ....... 341
20: CEO Tumbles: Hewlett-Packard's Horizontal Phase ....... 355
21: Limits Of Strategy? ....... 374
Index ....... 387

Chapter One

A MAD DASH THROUGH HISTORY

Before we start, let's consider a highly compressed synopsis of the computer industry's self-immolating and resurrecting history to set the book's timeline and a few overarching trends. Information Technology began modestly enough in 1822 when Charles Babbage introduced a forerunner to the computer with his beautifully handcrafted mechanical calculator. Herman Hollerith pushed the still-fuzzy concept a key step closer to what we now know as the computer with his punch-card tabulating equipment. First used in the 1890 census, punch cards were gradually adopted for business use. Two decades later, Hollerith was able to sell his tabulating business for the then princely sum of $1 million, assuring his comfortable retirement.

Heading up the group of entrepreneurs that made Hollerith a wealthy man in 1911 was the pioneering Charles Flint, who merged a time-clock company and a scales company with the tabulating business to form the Computing-Tabulating-Recording Company, or C-T-R. It was this entity that CEO Thomas J. Watson Sr. would rechristen as International Business Machines in 1924. And when James Rand Jr. bought Porter Punch, a small tabulating company, a year later, he initiated a nose-to-nose sparring match between his Remington Rand and Watson's IBM that would survive for sixty years.

Though Hollerith punch cards became indispensable to various business operations, the decks were prone to flightiness as cards were lost, missorted, and otherwise abused. One well-traveled tale concerned cards soaked in a water-pipe break and then dried in the oven of a friendly pizza joint.

The first actual computers were built from vacuum tubes during World War II; the Brits built the Colossus, and two fellows from the University of Pennsylvania, J. Presper Eckert and John Mauchly, came up with the ENIAC (Electronic Numerical Integrator and Computer). Meanwhile, IBM was sponsoring Howard Aiken's construction of the Mark I at Harvard. Essentially a giant electromechanical tabulating device, the Mark I was first programmed by Grace Murray Hopper, a phenomenon in her own right.

Hopper was a mathematician, physicist, serial innovator, and U.S. Navy Captain, a rank attained after she joined the Naval Reserve to support her country in wartime. During these early days, when even one of her multiple accomplishments was considered unusual for a woman, Hopper recalled a summer evening in Cambridge when the lab doors had been left open to dissipate the day's heat. When the computer choked the next morning, a moth was found caught in one of its electromechanical switches: "the first bug," she later quipped, and, indeed, she is widely credited with discovering exactly that.

The Magnetic Fifties

Commercial computing began with Eckert and Mauchly's Universal Automatic Computer (Univac), and, perhaps more important, with their substitution of magnetic tape for those pesky and problematic punch cards. The two inventors had left the University of Pennsylvania on March 31, 1946, to form a company called first the Electronic Controls Corporation and soon the Eckert-Mauchly Computer Corporation. That company was sold in 1950 to IBM's longtime rival, Remington Rand. At first, Tom Watson Sr. resisted the move to electronics, largely out of fear that magnetic tape would kill IBM's immensely lucrative business in punch cards. Tom Jr.'s longer vision prevailed.

Before the decade ended, the computer was in its second "generation," with transistor technology supplanting the vacuum tube. Simultaneously, computers made their first real penetration into the business office, as punch-card records were slowly transferred to magnetic tape. Soon, mainframes were pervasive, often visible in "glass houses" located near the headquarters lobby so that visitors could marvel at a company's modernity as captured in the herky-jerky movement of the tape drives.

The Do-It-Yourself Sixties

The 1960s marked my entry into the industry, eventually affording me a front-row seat from which to view the computer revolution. Naomi entered the industry in 1965 as a freelance market researcher working mostly for IBM. Around 1963, I designed and programmed a business application on a pair of transistor-based IBM computers that supported an entire insurance company with less memory and fewer cycles than today's wristwatch.

By mid-decade, the industry consisted of IBM and the so-called seven dwarfs: the Burroughs Corporation, the Control Data Corporation (CDC), the General Electric Company (GE), Honeywell, NCR (officially the National Cash Register Company until 1974), the Radio Corporation of America (RCA), and the Univac division of Remington Rand, by then part of the Sperry Rand Corporation. Every dwarf took shelter under IBM's pricing umbrella to mark up the cost of its hardware fivefold for 80 percent gross margins. Big Blue could hold to its 15 percent annual profit growth and surround its major customers with armies of "free" sales representatives and systems engineers, who invaded executive offices with one idea after another, many half-baked.
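The markup arithmetic above is consistent: a fivefold markup over cost is the same thing as an 80 percent gross margin. A quick sketch (the function and numbers are illustrative, not from the book):

```python
# Gross margin as a fraction of selling price: (price - cost) / price.
def gross_margin(price, cost):
    """Fraction of the selling price retained after hardware cost."""
    return (price - cost) / price

# Hardware cost marked up fivefold, per the text's description of the dwarfs.
margin = gross_margin(price=5.0, cost=1.0)  # (5 - 1) / 5 = 0.8, i.e., 80 percent
```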

Efficiency was no better among the seven dwarfs. All were shielded from competition by the handcrafting of software; a customer couldn't switch to a different computer without laboriously rewriting and then retesting every applications program. "Switching cost" was the iron advantage undergirding the entire computer industry's flabby business model.

Given that restrictive oligopoly, computer vendors could benignly double the price/performance ratio every five years, more or less. And computer power presumably increased with the square of cost, as stipulated by Grosch's "law" (named for Herb Grosch, the gifted computer scientist and grumpy industry gadfly who was serially hired and fired from IBM by both Watsons). Though this "big is beautiful" price/performance relationship was widely accepted, its validity was questionable. Most computing-power metrics are horribly unreliable and too easily manipulated by computer marketers. Besides, the pricing wizards at IBM and elsewhere set prices with an eye toward encouraging customers to buy bigger computers than they really needed. Grosch's law owed less to electronics than to complacent business models and oligopolistic pricing.
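Grosch's law is usually stated as computing power growing with the square of system cost, which is why it made bigger machines look like bargains. A minimal sketch of the relationship (the constant `k` is arbitrary and purely illustrative):

```python
# Grosch's law: power = k * cost^2, so doubling the budget
# notionally quadruples throughput under oligopoly-era pricing.
def grosch_power(cost, k=1.0):
    """Relative computing power predicted by Grosch's law for a given cost."""
    return k * cost ** 2

# A machine costing twice as much should deliver four times the power.
ratio = grosch_power(2.0) / grosch_power(1.0)
```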

In the late 1960s, IBM won what was arguably the largest bet ever made in the computer industry. Tom Watson Jr. had invested heavily in the development of System/360, a line of small to large computers that were software-compatible and that used the same peripherals: tapes, disks, printers, and so on. Previously, customers couldn't switch to a larger or newer computer without reprogramming all of their applications, a deal breaker if there ever was one. Watson's gamble changed all that and gave IBM products an edge its competitors lacked.

The appeal of IBM compatibility was enormous, and System/360 completely upended the existing computer industry. RCA and GE quickly exited the field, with Honeywell eventually following, and CDC became a computer-services company. Against IBM, the only real survivors from the "mainframe" era were, ironically, Tom Sr.'s two fiercest opponents: Unisys, the stepchild of Jim Rand after Sperry (by then the home of Univac) and Burroughs merged in 1986; and NCR, the brainchild of John H. Patterson, the man who had brought the elder Watson into the office-equipment business and then fired him.

The Chips Fall in the Seventies

In 1973, Naomi and I formed the Research Board and began almost three decades of studying the computer industry during its most innovative and formative period. From our vantage point, we saw that success brings its own challenges, which for IBM meant both an antitrust suit and, more important, scores of new market entrants.

First came the leasing companies, clippers in hand, to undercut IBM's prices with discounts on secondhand gear. Plug-compatible peripherals and mainframes followed, and they used IBM's own 360 operating system to cut the equipment newcomers' research-and-development and field sales expenses. Worse yet, compatibility wore down a customer's apprehension about linking its own applications to a vendor of uncertain business viability. Should the fledgling die, the customer could quickly and painlessly go running back to Big Blue.

At the same moment, the minicomputer industry was birthed. Starting around 1968, dozens of small companies formed in response to early-mover DEC's successful introduction of the Programmed Data Processor (PDP) line. Most of these start-ups built business models with lower product costs and gross margins than those burdening mainframers. For one thing, the minis used high-volume circuit technology, which was both cheaper to buy and simpler to deploy than the exotic ware the mainframes demanded. The minis were also cheaper to operate, since they didn't require a special priesthood or glass houses; regular office workers could fire up the machines without much training.

Grosch's law was quickly repealed. Now small was better, in a sense. The computing power provided by minicomputers, and then microprocessor-based servers, was far less expensive than what came from mainframes, a result of the minis' lower-cost technology and leaner gross margins. Most of the new wave was still burdened with the disadvantage of "proprietary operating systems," however, meaning that every manufacturer's software was incompatible with its peers.

But lurking just over the near horizon was the microprocessor, which carried the essentials of a computer processor on a single silicon chip. Developed first by Intel in 1971, and very shortly thereafter by Texas Instruments, the chips revolutionized computer development and radicalized the entire industry in the process. Many chief executives failed to appreciate the threat in time to save their companies. But so did the heads of Intel, the National Semiconductor Corporation, Motorola, and AT&T's legendary Bell Labs. First movers into PCs like Commodore, Radio Shack, and a kite string of lesser pennants fared no better.

Meanwhile, Grace Hopper had left Univac to lend her talents to the U.S. Navy, becoming the computing world's transcendent figure and bridging the gap between Howard Aiken's mechanical marvel and the microprocessor. Captain Hopper had begun mentoring Naomi, whom she sponsored in 1968 for the American Management Association's Leadership Council. With their matching Vassar pageboy haircuts, one white and the other chestnut, they noodled, with Grace providing two pieces of stellar advice: "learn knitting" to avoid talking too often, and "leave your prestigious Diebold Group vice presidency to start the new firm with Ernie."

Captain Hopper was wonderful with young people and new ideas. Our interviews with her at the Pentagon were always attended by the twenty-something Navy ensigns and electrician mates whom she had somehow identified as computing wizards. She was godmother to the newest forms of computing that are only today becoming fully realized. She was certainly among the earliest proponents of replacing the exotically powered and priced mainframes with cheaper, more approachable minicomputers. She also imagined that hundreds, even thousands, of microprocessors might one day perform computationally intensive tasks that would overwhelm even the largest supercomputer. "When our pioneer forebears were trekking westward and their wagons were too heavy for the oxen, they didn't grow larger oxen, they harnessed more of them," she liked to say. "They didn't harness a herd of rabbits, either," we'd mutter under our breaths.

But Captain Hopper was much closer to the truth than most of us. To illustrate her argument when speaking at Research Board meetings and other venues, she would hand out roughly keyboard-long pieces of wire: "That's a nanosecond," she'd tell her admirers, who numbered in the thousands. "It's the maximum distance that light, or an electrical signal, can travel in a billionth of a second." And, by implication, that was the maximum dimension of a computer targeted at optimal throughputs. Today, microprocessors operating together are a given. Google alone harnesses hundreds of thousands of these rabbits.
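Hopper's wire length is easy to verify: light travels just under 30 centimeters, roughly a foot, in one nanosecond. A quick check (the speed-of-light constant comes from standard physics, not the text):

```python
# Maximum distance a signal can travel in one nanosecond - the physical
# limit behind Hopper's keyboard-long handout wires.
C_M_PER_S = 299_792_458   # speed of light in a vacuum, meters per second
NANOSECOND = 1e-9         # one billionth of a second

distance_cm = C_M_PER_S * NANOSECOND * 100  # about 30 cm, nearly a foot
```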

A clear counterpoint to Hopper's concept came from the legendary supercomputer builder Seymour Cray. Dr. Cray was reputed to have begun designing each new model by building a box sized to provide the proximity his ultimate computing targets required, and then seeing whether all the components could be crammed inside. But the required amount of ultra-high-performance circuitry creates enormous heat, comparable to the surface of an electric iron. So Cray mined his considerable genius to develop the "packaging" (e.g., the circuit boards) and especially the cooling mechanism. One of his most famous deca-million-dollar masterpieces was shaped like a banquette (complete with seat cushions) with liquid Freon running through the "backrest" to draw off the heat.

Dr. Cray had a curious personal ritual that could characterize the computer industry as a whole. Every spring he'd begin building a sailboat on the cliffs overlooking his Wisconsin lake that he'd finish in time for summer sailing. Then in the autumn, he'd burn the boat to ashes to clear his mental pathways for starting again the next year.

"Burn the place down," replied Steve Jobs to my question on how Apple could have escaped the Mac's success (after Steve had founded NeXT). The remark was simultaneously typical Steve and a terse, if inadvertent, reflection of the heavy baggage inherent in outdated business models. The only way to escape prior success is to burn it down?

Minis Fade in the Eighties

The beginning of the end for the minicomputer companies was preordained by three separate events. First, microprocessors replaced minis embedded in the machinery produced by assorted companies. Second, IBM finally entered the market with a half-dozen of its own minicomputer models. And finally, software compatibility eroded the customer's cost of switching to another vendor. After that, the old proprietary model was dead.

Minicomputer companies, led by Digital Equipment, followed the IBM System/360 approach and created hardware lines with a single operating system. Then the circle of compatibility widened beyond the product line of a single supplier when Larry Ellison began writing his Oracle database in a "higher-level" language that could be readily "ported" to different operating systems. It was a three-bagger for the industry's most envied iconoclast. Larry drew customers by lowering their switching costs across computer suppliers. As his customer count grew, so did Oracle's appeal to the developers of applications packages-first in accounting and payroll, later in supply-chain management and other fancy stuff. And more third-party software lured even more customers to Oracle.

Switching costs were hammered down again by the spread of UNIX (popularly Unix), an operating system first written around 1969 by Bell Labs scientists. The initial version of Unix attracted scientists and hobbyists but was ill-suited for business use, lacking reliability and productivity tools for average programmers. By the early 1980s, though, Unix was being commercialized by Sun Microsystems and NCR. Some old-line hardware vendors tried to stem the assault by creating their own Unix flavors, such as IBM's AIX and Hewlett-Packard's HP-UX. But the different Unix brands were still enough alike to draw the independent developers of applications software-initially, scientific and engineering tools and, eventually, business applications.

The Disappearing Act of the Nineties

The draw of large numbers was flattening industry profit margins. Larger volumes permitted sharply lower prices; success bred more success. Bill Gates was separating Microsoft and its Windows operating system from IBM's over-engineered, underperforming OS/2. He began to appear at industry meetings with a chart like Table 1.1 that illustrated economies of scale on operating systems costing roughly $500 million each to develop.
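Gates's scale argument reduces to fixed-cost amortization: when development is a one-time cost, the per-copy cost collapses as volume grows. The $500 million figure is from the text; the unit volumes below are purely illustrative:

```python
# Per-copy cost of an operating system whose development is a fixed cost.
def per_unit_cost(fixed_cost, units_sold):
    """Amortized development cost per copy sold."""
    return fixed_cost / units_sold

DEV_COST = 500_000_000  # roughly $500M to develop an OS, per the text

# Illustrative volumes only: a mass-market OS vs. a niche one.
mass_market = per_unit_cost(DEV_COST, 50_000_000)  # $10 per copy
niche = per_unit_cost(DEV_COST, 2_000_000)         # $250 per copy
```

The twenty-five-fold gap in per-copy cost is the self-accelerating economy of scale the chapter describes: lower unit costs permit lower prices, which draw more volume, which lowers unit costs further.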

Having a single Microsoft operating system would assure compatibility all around, both for PC makers like Compaq and Dell and for the all-important independent software vendors. Of the fifty midrange computing players active in the 1970s, only IBM and Hewlett-Packard survived, joined by latecomer Sun. Digital Equipment (acquired by Compaq), Wang, and all the rest had either disappeared completely or were severely reconstituted.

(Continues...)


Excerpted from The Limits Of Strategy by Ernest von Simson. Copyright © 2009 by Ernest von Simson. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

"About this title" may refer to another edition of this title.

Other popular editions of the same title

9781440192586: The Limits of Strategy: Lessons in Leadership from the Computer Industry

Featured edition

ISBN 10: 1440192588 ISBN 13: 9781440192586
Publisher: iUniverse, 2010
Hardcover