A review of George Szpiro’s 2011 book on the history of the Black-Scholes option-pricing formula uses Southwest Airlines’ famous fuel-price-hedging strategy as a key piece of its explanation for why firms might want to use options. Southwest’s hedging has received a lot of attention; the gains and losses on these financial trades have rivaled operating profits and losses on its income statement. Most commentators have applauded this aggressive trading activity, merely cautioning that sometimes Southwest guesses wrong about future oil prices and loses a lot of money.
What no one seems to ask is why Southwest shareholders would want the firm to be speculating in the fuel market in the first place. Unless these hedges materially reduced the risk of bankruptcy–and Southwest’s balance sheet is typically stronger than its rivals’–the classic argument applies: Shareholders should not want corporate managers to hedge industry-specific risks, such as swings in fuel prices, because they can very easily deal with these risks themselves by holding a diversified portfolio of stocks (including oil firms) or even by buying their own options on oil prices. Southwest’s financial risk reduction via hedging conveys little or no benefit to the owners of the firm.
But wait, many will object–doesn’t hedging give Southwest a cost advantage over its rivals when oil prices go up? And since these hedges are often accomplished by options, isn’t there an asymmetry, since when Southwest guesses wrong, it only loses the price it paid for the option? Doesn’t the airline therefore lower its costs by these trades, gaining a leg up on its rivals?
The answer is No. These hedges have no impact whatsoever on Southwest’s cost of being an airline operator. They constitute an independent, speculative financial side business, a business that is exactly as good for Southwest shareholders as the CFO’s team is at outguessing the fuel market. Even when Southwest guesses right, it is not improving the airline business’s competitiveness.
To see why this is true, think about the incremental fuel cost to Southwest of running a flight with or without the hedge. If the spot price of fuel is $x/gallon at the time of the flight and it consumes y gallons, then the fuel cost is xy. If Southwest has successfully hedged the oil price, then it will make a bunch of money after closing out its position, but it would still independently save $xy by not running the flight. If Southwest has guessed wrong and lost money on the hedge, it would also save $xy by not running the flight. So the cost of operation–the increment in expenditure caused by producing another unit–is unaltered by the hedging strategy.
This situation should be easy to visualize because the hedges are on oil rather than jet fuel and because they are settled for cash rather than physical delivery. But even if the hedges were denominated in physically delivered jet fuel, successful or unsuccessful hedging would have no impact on airline operating costs. If Southwest just bought fuel early for $(x-a)/gallon and stored it until the spot price was $x/gallon, the opportunity cost of the flight would still be $xy, since the airline could cancel the flight and sell y gallons for that amount. The incremental expenditure difference between flying and not flying is exactly the same. (If opportunity cost confuses you, visualize that Southwest has some fuel on hand purchased at the lower hedged price and some at the spot price, and note that it doesn’t matter which barrel of gas goes into which plane–all the fuel is fungible, and it is all worth $x/gallon if that’s what it could be sold for.)
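The incremental-cost argument is simple enough to check with a few lines of arithmetic. The sketch below (all prices, quantities, and hedge payoffs are made-up illustrations, not Southwest figures) computes total cash outlay with and without the flight: whatever the hedge gains or loses, the difference between flying and not flying is always the spot cost xy.

```python
# Illustrative sketch with hypothetical numbers: the hedge's payoff accrues
# whether or not the flight operates, so it cancels out of the incremental
# cost of flying, which is always spot_price * gallons.

def total_outlay(fly: bool, spot_price: float, gallons: float,
                 hedge_pnl: float) -> float:
    """Net cash outflow: fuel is bought (or, equivalently, not sold)
    only if the flight operates; the hedge P&L happens either way."""
    fuel_cost = spot_price * gallons if fly else 0.0
    return fuel_cost - hedge_pnl

x, y = 3.00, 5000.0  # hypothetical spot price ($/gallon) and gallons consumed

for pnl in (+4000.0, -4000.0):  # hedge guessed right, hedge guessed wrong
    incremental = total_outlay(True, x, y, pnl) - total_outlay(False, x, y, pnl)
    # The hedge P&L cancels: incremental cost is x*y = $15,000 in both cases.
    assert incremental == x * y
```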
Now, risk-averse behavior by managers may be in their own interest, depending on the form of their compensation, the structure of the labor market, and their perceived ability differential over their peers. But it is of little help to the owners of public firms that are far from bankruptcy. That’s a point that should not be hedged.
A long time ago, in a blog far, far away, I outlined the idea of a “new-wave utility.” The idea was that some innovative high-growth service businesses were transitioning into utility-like systems whose large and diverse customer bases implicitly depended on them for ubiquity, reliability, and stability of offering. One example I mentioned in passing was Starbucks. Apparently, in Manhattan, Hurricane Sandy has revealed the truth of this classification. From the story in the link, access to bathrooms has been a key issue in the Big Apple. That’s less of a factor in L.A., but power outlets, WiFi, and table space in a congenial environment have certainly put Starbucks (and its smaller rivals such as the Coffee Bean and Tea Leaf Co.) in the category of utilities for the city’s horde of writers, students, and deal-makers.
Alex Tabarrok’s pictorial commentary on patent policy, drawn on a napkin, posits that the current patent system is somewhat too strong and thereby decreases innovation (the link to his original post is below). I have to say, however, that I don’t think patent strength is the problem. The problem is that the growth in patent applications over the last two decades has vastly exceeded the growth in resources available to the patent office, resulting in 1) long delays between patent application and granting (which can render patents completely pointless in fast-moving industries), and 2) inadequate ability to examine patent applications for novelty, usefulness, and non-obviousness. This lowers the value of good patents (because they aren’t granted quickly enough or may be spuriously challenged) and increases the likelihood of bad patents being granted. As a result, for many individuals and firms, the expected net gains from manipulating the patent system for the purposes of extortion (hostage taking, patent trolling) now exceed the expected net gains from using the patent system to actually innovate.
It’s difficult to assess how patent strength affects innovation without first making sure that patents are being granted and used the way the system had originally intended.
Alex Tabarrok’s original post can be found here: http://marginalrevolution.com/marginalrevolution/2012/09/patent-theory-on-the-back-of-a-napkin.html?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+marginalrevolution%2Ffeed+%28Marginal+Revolution%29
Over at the Atlantic, Jordan Weissmann argues that the Obama administration’s claim to be pursuing an “all of the above” energy strategy is unrealistic because the EPA’s new CO2 emission rules will make traditional coal plants untenable while “clean coal” technology is uneconomical relative to natural gas. Fine.
But Weissmann goes on to argue that clean-coal R&D is a big waste of money because of the lack of a cap-and-trade policy that would put a price on CO2 emissions. That’s dead wrong. With a price on carbon dioxide, just as with the EPA’s technology or emissions standards, power producers would look for the cheapest alternative to coal. That would be natural gas (given the same forecasts Weissmann relies on). So the clean-coal subsidies are unlikely to pay off regardless of what kind of policy we pursue, be it a CO2 tax, cap and trade, or emissions or technology standards for power plants. Cheaper is cheaper. It’s amazing how often people fail to grasp principles of competitive advantage.
(We could, of course, come up with a convoluted policy to keep coal miners employed, similar to how the 1977 amendments to the Clean Air Act were set up to penalize low-sulphur Western coal so as to keep Eastern mines open, but I’m hopeful that we can avoid that kind of perversity this time. If you see laws or regulations that punish natural gas use in electric power, though, you’ll know my hopes have gone unfulfilled.)
An article recently posted in Slate reviews research showing that a significant portion of the variation in IQ tests is attributable to motivation rather than ability. In one striking study, researchers measured children’s IQs and split them into High, Average, and Low groups. They then reran the test, offering the low group an M&M for every correct answer. As a result of this simple incentive, the low group’s score went from 79 to 97 – on par with the average group.
Ok, so incentives work. Perhaps not a big surprise on many levels.
On the other hand, there is a large OB/HRM literature invested in the conclusion that performance increases are associated with hiring employees with a higher IQ. The assumption there is that IQ measures ability as opposed to motivation.
This raises a critical question for strategy scholars. Is motivation an immutable attribute of human capital? Read the rest of this entry »
For a PDW, I was asked to develop a short list of paradoxes linked to strategic human capital (spoiler alert for those of you planning to be at the session at 8am on Friday). I’m sure some of them would not surprise you in the least. Others might spur some discussion, though. Here is the short list:
- Rent from human capital may not show up in profitability
- “Who” is a firm?
- Firm-specificity isn’t as important as we might think
- HR Departments may not matter much
- High performance work systems don’t tell us much about such advantages
Rent. The first point is what you would expect from me so let me dismiss it quickly. Obviously, if rent is linked to human capital, some portion of it is likely to be captured by people. Nuff said.
Who is a firm? A sharp distinction is made between hiring on the spot market and an internal labor market. Rightly so. However, one might think that once labor is “internal” such people are part of the firm. Read the rest of this entry »
Some of you may remember Mason Carpenter’s old teaching web page with experiential exercises, videos, and other tips for teaching strategy. I’ve repackaged his content, added some of my own materials, and it can now be found at:
A quick tip is that you can now sort the resources by topic (click the category list on the right). I included the most common broad topics in a core strategy course, so this should get you to something useful quickly. Probably most importantly, there is a mechanism so people can submit new tools and comment on existing tools to keep the site fresh.
To give you a feel for it, here are links to a few exercises and resources that you might find particularly useful:
- Global Alliance Game (focus is on the search for complementarities and hazards in negotiating to take advantage of them).
- Read the rest of this entry »
The current issue of McKinsey Quarterly features an interesting article on firms crowd-sourcing strategy formulation. This is another way that technology may shake up the strategy field (see also Mike’s discussion of the MBA bubble). The article describes examples in a variety of companies. Some, like Wikimedia and Red Hat, aren’t much of a surprise given their open innovation focus. However, we should probably take notice when more traditional companies (like 3M, HCL Technologies, and Rite-Solutions) use social media in this way. For example, Rite-Solutions, a software provider for the US Navy, defense contractors, and fire departments, created an internal market for strategic initiatives:
Would-be entrepreneurs at Rite-Solutions can launch “IPOs” by preparing an Expect-Us (rather than a prospectus)—a document that outlines the value creation potential of the new idea … Each new stock debuts at $10, and every employee gets $10,000 in play money to invest in the virtual idea market and thereby establish a personal intellectual portfolio Read the rest of this entry »
I read about Microsoft’s acquisition of patents from AOL with some interest. They note that this reflects a price of $1.3M/patent and compare it to other recent escalations in the IP arms race. Analysts estimate that Google only paid $400k/patent in the $12B acquisition of Motorola Mobility. Nortel patents recently went for about $750k each. Of course, given the wide variance in the value of a patent, clearly the average is not particularly informative — it treats all of these patents as homogeneous which is certainly not the case. Nevertheless, the escalating prices do suggest that the arms race is unlikely to create much value for the firms (and certainly not for consumers).
However, buried in the stories is another rather interesting observation – some of the key players earn more from selling rivals’ handsets than their own. Read the rest of this entry »
After watching Jeremy Lin (Knicks) score 38 points against the Lakers tonight, I’m now on the Lin bandwagon. I don’t really even follow basketball that closely, but this seems like an intriguing story.
How on earth did someone like this go unnoticed? Seriously. He happened to get an opportunity to show his stuff as Carmelo Anthony and Amare Stoudemire are injured – and boy has he delivered.
Here’s a kid who didn’t get recruited for college ball, despite a tremendous record in high school. He was a superstar at Harvard but went undrafted by the NBA after graduating (in economics) in 2010. He played a few games for Golden State and Houston but was cut by both. He played D-League basketball this year, until a few weeks ago. As of last week, he did not have a contract.
But come on: is basketball truly this inefficient at identifying and sorting talent? The comparison and transfer of ability across “levels” (high school, college, professional) is of course tricky, though you would think that with time there would be increased sophistication.
Now, four games of course don’t make anyone a star. But even if Lin proves to “just” be a solid bench player, it seems that talent scouts clearly undervalued Lin (who lived in his brother’s apartment until recently). How much latent talent is out there? (I think there are significant problems in identifying talent at the quarterback position in professional football, but that’s another story.)
There are of course also some very interesting player-context/team-fit, interaction-type issues here, and I’m not sure that this really gets carefully factored beyond just individual contribution (thus not recognizing emergent positive, or negative, player*player effects). It’ll be interesting to see what happens, for example, when Carmelo Anthony is added back into the mix.
Well, it’ll be interesting to see how all this plays out. There is in fact a sabermetrics-type, stats-heavy, Moneyball-like thing in basketball as well – called ABPRmetrics. I would be curious to know whether there are ways to statistically identify Lin-type undervaluation and potential, and whether phenoms like this lead to better metrics for identifying talent.
UPDATE: Here’s ONE analyst/statistician who saw Lin’s potential in 2010.
Many of you will have heard by now that Kodak is likely to file for Chapter 11 bankruptcy sometime soon. Their present strategy appears to be to wind down the business by selling off many of their patents. I guess my main surprise upon seeing them back in the news was that they were still in business. Apparently, it takes an extended period for these behemoths to fold for good.
The source of my surprise was the fact that I used to teach Kodak managers in both the executive and part-time MBA programs at the Simon School in Rochester, the home of Kodak’s headquarters. Just to be clear, these men and women were great students … bright, curious, open-minded, and typically well-trained. My purpose here is not to jump on the current bandwagon and blame Kodak’s present troubles on the stupid, selfish, rapacious tendencies of its 1%-er senior managers. Quite the opposite. Rather, I’d like to question what role, if any, the things they learned in b-school strategy classes played in the formation of, ultimately, misguided business plans.
Harking back to the late 90s, when I was teaching EMBA classes that were populated with about 1/3 senior managers from Kodak, I vividly remember initiating class discussions about the disruption digital technology was going to have on Kodak’s legacy business. Unlike the automobile manufacturers of the 70s, who really missed the significance of Japanese competition, Kodak managers fully understood that the new digital technologies were going to change their industry forever. Sure, there was a lot of uncertainty about the speed and path by which transformation would occur. But, it wasn’t the case that these smart people didn’t see it coming. They got it. And they were optimistic and dedicated, in my experience to a person, to implementing strategies that would permit Kodak to successfully ride the new technological wave.
Why were they so optimistic? When challenged to discuss it in class, they proudly explained that Kodak’s “core competency” was “color”. The reasoning went something like, “We understand color and its application to photography better than any other firm. This knowledge will be as important for success in digital applications as it was in analog film. Therefore, we are wonderfully positioned for whatever challenges the market presents.” The problem, from my perspective was two-fold: a) the thinking did not seem to go much deeper than this; and, b) the strategy literature did not have much to offer to help them think deeper than this.
Many have complained that the RBV, which is the source of this core-competency thinking, is a tautology: core competencies are unique resources that cause a firm to persistently outperform its peers; all firms that persistently outperform their peers have core competencies. I don’t agree with this complaint. Indeed, my sense that the pioneers of the RBV were on to something substantially influenced my desire to study strategy. That said, the “theory” underlying the RBV doesn’t go much more than one step beyond the tautology. And, much of where it goes is wrong (e.g., having resources that are inimitable is neither necessary nor sufficient for persistent performance advantage).
So, my energetic, smart, dedicated EMBA students, when presented with a strategy theory that was frustratingly close to a tautology, developed a strategic conceptualization of their firm that was – not surprisingly – frustratingly close to one as well. At the end of the day, it seemed to be an article of faith among my students that “knowledge resources about color” were going to save the day. (As we are all only too aware, smart people are masters at locking onto a favored idea and finding all kinds of arguments to support it.) As a teacher, it was incredibly difficult to push them deeper into a critical analysis of how, specifically, this “color know-how” was going to be their lifeline. New competitors, new product distribution channels, radical changes to how photographs are shared and consumed? No problem — we know color!
Part of my teaching frustration, which became part of my research motivation, was that the extant literature did not offer much in the way of tools to help these folks think about such issues in a complete, consistent, and efficacious way. Worse, in my judgment, teaching those folks a shallow set of ideas actually facilitated their transition into a dangerous state of groupthink. Holding up a piece of tautological thinking as the pinnacle of scholarly theory doesn’t exactly encourage students to think beyond tautology.
Our field has more than its share of interesting conjectures (i.e., informally generated speculations). What we need now are more scholars who are willing to roll up their sleeves and dig into the details. And patience. Lots of patience.
This article in Forbes argues that a new book by the Dean of the Rotman School provides an antidote to the rampant excesses of modern-day capitalism. The principal swipe is against the landmark paper (over 29,000 Google Scholar citations) by Jensen and Meckling on both the prevalence of the principal-agent problem in the governance of firms and the various solutions to overcome it, including creating incentives that maximize shareholder value. Quoting Jack Welch, former CEO of GE, the article says that maximizing shareholder value is the dumbest idea in the world. I myself am not sure this is THE dumbest idea in the world – in fact, many others would easily surpass P-A problem resolution – but I am sure this will ignite a debate about why firms exist, what the best governance mechanism for them is, and the role of economic theory and action in our lives. I for one need to go back and read the article and then read the book.
Since Jeff is my colleague, I’d better plug his recent (2011) book The Innovator’s DNA: Mastering the Five Skills of Disruptive Innovators, Harvard Business Review Press. The book is co-authored with Hal Gregersen (INSEAD) and Clayton Christensen (Harvard).
I just talked to Jeff about the book – interesting to learn about the intricacies of publishing a practitioner book. The book is doing very well. (I was at O’Hare airport a few weeks ago and it’s hard to miss the separate showcase that the book has at various bookstores.)
Jeff, Hal and Clay have also published a few articles related to the book in 2009:
Entrepreneur behaviors, opportunity recognition, and the origins of innovative ventures. Strategic Entrepreneurship Journal.
The Innovator’s DNA. Harvard Business Review.
Be sure to check the book and articles out!
Last week there was a very useful WSJ article reporting on an analysis of the supplier relationships at the core of the new iPhone 4S (here … while it lasts). This seems like a nice mini-case analysis to see how our theories seem to explain actual outcomes.
They note that Qualcomm “is the big winner” because it is supplying a suite of chips that adds up to $15 per phone. Intel is a loser because it acquired Infineon and then those chips were dropped from the product.
Samsung lost out on the memory chips to its Korean rival Hynix — a surprise, since Samsung is known to have a more reliable product. However, interestingly, Samsung did retain its role as the manufacturer of Apple’s proprietary A5 processor, which provides the iPhone 4S and the iPad 2 with the bulk of their computing power.
Here is an article listing the names of 20 retail firms that have been more profitable than their competitors over the last 5 years (plus, there is an interactive chart showing average analyst recommendations for the stocks of the top six through October 2011). When I tell you that, across a broad spectrum of measures, Target persistently outperforms its direct competitors, what explanations leap to mind?
Whether you are an active strategy scholar or someone whose only exposure to strategy was a first-year course in an MBA program somewhere, my guess is your train-of-thought goes something like: the fact that Target’s performance superiority is repeated year after year makes “sheer luck” an unlikely explanation; therefore, Target must have superior capabilities; moreover, these capabilities must exhibit certain features (e.g., inimitability), otherwise competition would quickly erase its relative advantage. Then, you would be off and running, digging deeper into such possibilities as: superior information technology, better understanding of customers, economies of scale, excellent brand equity, more flexible something-or-other for “competing on the edge of chaos,” etc.
Sadly, it turns out, you may have jumped the gun when you ruled out the “sheer luck” hypothesis. Denrell, Fang and Zhao have a new paper forthcoming in SMJ entitled, “Inferring Superior Capabilities from Sustained Superior Performance.” In it, they explore the problem of inferring superior skills from data on relative firm performance. The paper is an instance of what I have come to consider classic Denrell: take a central piece of conventional wisdom in management scholarship (the roots of which are probably the informal, intuitive conjectures in some famous paper from the 1980s), analyze the issue with a stochastic model that is at once simple and general, and demonstrate — yet again — that human intuition with respect to statistical inference is really, really bad. This is a salient trait in even the smartest among us. In this paper, the analysis shows why persistent superior performance may not imply superior capabilities. As a bonus, they go on to apply Bayesian methods to a large data set to infer the luck versus ability components of observed sustained advantages.
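To get a feel for the inferential trap, here is a minimal simulation in the same spirit (the parameters are invented for illustration and are not from the paper, and treating “beats the median” as an independent 50/50 draw is a deliberate simplification): even when every firm has identical skill, a large enough industry will reliably produce firms that outperform for several consecutive years by chance alone.

```python
# Sketch of the luck-vs-skill point: identical firms, pure coin-flip
# performance, yet some firms "persistently outperform."
import random

random.seed(0)
N_FIRMS, N_YEARS = 500, 5  # hypothetical industry size and observation window

# Each firm has the *same* skill: a 50/50 chance of beating the industry
# median in any given year, independent across years (a simplification).
streaks = sum(
    all(random.random() < 0.5 for _ in range(N_YEARS))
    for _ in range(N_FIRMS)
)

# Expected count of five-year "winners" from luck alone: 500 * (1/2)^5 ≈ 15.6
expected = N_FIRMS * 0.5 ** N_YEARS
print(f"Firms outperforming all {N_YEARS} years by pure luck: {streaks} "
      f"(expected ≈ {expected:.1f})")
```

So observing a handful of retailers with five-year winning streaks is roughly what a no-skill-differences world would generate anyway; the streak by itself doesn’t discriminate between luck and capability.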
I always think of papers like this when I hear someone jumping on the “dude, your ivory tower social science is totally irrelevant to real-world business practitioners like me” bandwagon. (In fairness, this bandwagon is presently a fashionable place for fellow academics to be as well.) The moment we accept that humans are inherently bad at logical and probabilistic processing is the moment we flag the importance of learning rigorously derived general principles. Knowing how to decide whether the sustained ass-kicking you are receiving from your competitor is due to luck or ability is, actually, an important skill in real-world business management.
Teppo recently asked us whether the fundamental questions of strategy have changed since Rumelt, Schendel & Teece’s classic work. Relatedly, Mike wondered if strategy has lost sight of foundational questions and is now ceding territory to economists.
One critical shift has been away from corporate strategy (multi-business firms and M&A). To an extent, this was fueled by debates over whether industry matters (see classic articles by Rumelt and McGahan and Porter) as well as the rise of the resource-based view. Lost in the shuffle were the prospects for corporate strategy research…
This has practical implications. I’m about to begin a module on corporate strategy (can you tell it’s my teaching semester?), and this question looms large. I need to justify to my students why, given minuscule corporate effects, I am spending so much time on this topic. I think it points to a fundamental flaw in the way some of this research has been interpreted. Read the rest of this entry »
I’ve been reading Charles Hill’s (Yale) book Grand Strategies: Literature, Statecraft and World Order. The book is rather aggressive. It is essentially an effort to provide a tutorial for how big-picture thinking and decision-making at the global level might be informed by literature, specifically the classics.
The premise of the field of “grand strategy” (essentially an off-shoot of political science and history) is that this type of practical big-picture thinking–strategic decisions under uncertainty–can’t be taught and understood by focusing on the minutiae that extant social science studies. Well, that’s the story anyways. So, in this book Hill turns to the classics.
Throughout the book Hill discusses the influence that specific classics have had on various leaders (Leaves of Grass and Lincoln, The Dream of the Red Chamber and Mao, Don Quixote and Mann, etc.) and also vets the analogies between fictional events and narratives in the classics and real-world correlates. You get sort of a dazzling history of modernity (post Thirty Years’ War), where various classics essentially provide possibilities, foreshadow or mirror emergent realities about wars, nation states, forms of governance, etc. The result is a pretty breathtaking tour de force where fiction and reality blend all too readily. Hill’s book seems to imply both a great man theory as well as a (Campbellian, of sorts) great narrative(s) theory of, well, everything. I should note – Hill’s political stripes are pretty evident from the book, so that might bother some readers.
But I am really enjoying the book. I’m of the mind that the classics indeed have much to teach us, strategy included. I would recommend this book over most of the practitioner strategy books that you might pick up at your local bookstore.
I’ll put up another post about the book once I finish it.
Below the fold you’ll find a video of Charles Hill speaking about the above book. Read the rest of this entry »
A recent paper by Chad Syverson (2011, JEL) has created a bit of a buzz in strategy research circles. Entitled, “What determines productivity?” this paper surveys work in economics over the past decade or so on persistent differences in levels of firm productivity. As I have pointed out elsewhere, one of the traditional differences between economics and strategy is the focus on efficiency versus distributional issues. Economists tend to be motivated by the former – loosely, how much aggregate value is created in an industry or economy? – and strategy scholars by the latter – loosely, how and why is that value divvied up? Going by the title of this paper, my strategy colleagues might easily pass it up as yet another example of work in economics that doesn’t quite match up with our specific interests.
That would be a mistake. Indeed, I urge all strategy scholars (and those working to become strategy scholars) to take a very careful read of this paper and the papers it cites. Firm “productivity” is, of course, theoretically defined as output per unit of input. In practice, however, empirical measures are often constructed using revenues in the numerator and costs in the denominator. Simply subtract the denominator from the numerator and you have … profit. Thus, many of the citations here are about persistent differences in profit performance among firms … which is exactly the object of interest in strategy.
What surprised me about this paper is just how much work has been done on these issues in the last ten years. Among the sources of systematic and persistent performance heterogeneity examined in recent work are: competition, sunk costs, technology spillovers, organizational structure, human capital, incentive systems, information, learning-by-doing, and managerial talent & practices. I believe that list includes pretty much all the sources of competitive advantage contemplated in the strategy literature. By the way, the paper focuses only on empirical contributions, no theory. One of the striking things about this is the sophistication of the methods used to identify causal effects. Among a raft of others, work by Bloom and Van Reenen (2010) and Forbes and Lederman (2011) stands out as exemplary. Over 100 papers published in the past decade are cited and, again, that’s not counting any theoretical contributions.
On the one hand, it is great to see so much interesting research activity on a topic near and dear to strategy. On the other hand, it seems that strategy has lost interest in the foundational questions that animated it early-on. In the 1990s our field felt alive with big-think empirical work by people like McGahan, Porter, Rumelt, and Henderson. Are we ceding our core issues to economics? Or, does the increasing popularity of these issues in economics just mean more outlets for our work? Either way, my guess is that the indirect competition is good and, to the extent journal editors are paying attention, will tend to raise the research quality bar.
The management field has long been obsessed with the oddly loose idea of fit, whether it be between strategy and the environment, strategy and structure, or the functional-area components of an overall strategy. Pieces have to fit together into some sort of design in order for the whole thing to work; firms that get all the pieces to fit better (whether by skill or luck) win the competitive struggle. This orientation generates a kind of rhetorical, emotional glow around the idea of complementarity–who wouldn’t want to be coherent and consistent and have everything work together?
My concern with this attitude is that it overlooks the importance of substitution. Opportunities to outcompete rivals often involve being the first or the only player to recognize that one action, factor, resource, or level of coordination can profitably replace another. I’ll probably return to this theme in future posts; here I want to discuss one particular area of substitution–between strategic acumen and skill at execution.
As a student at Reed College, Steve Jobs came to believe that if he ate only fruits he would eliminate all mucus and not need to shower anymore. It didn’t work. He didn’t smell good. When he got a job at Atari, given his odor, he was swiftly moved onto the night shift, where he would be less disruptive to the nostrils of his colleagues.
The job at Atari exposed him to the earliest generation of video games. It also exposed him to the world of business and what it meant to build up and run a company. Some years later, with Steve Wozniak, he founded Apple in Silicon Valley (in a garage, of course) and quite quickly, although just in his late twenties, became a management phenomenon, featured in the legendary business book by Tom Peters and Bob Waterman, In Search of Excellence.
But, in fact, shortly after the book became a bestseller, by the mid-1980s, Apple was in trouble. Although their computers were far ahead of their time in terms of usability – mostly thanks to the Graphical User Interface (based on an idea he had cunningly copied from Xerox) – they were just bloody expensive. Too expensive for most people. For example, the so-called Lisa retailed for no less than $10,000 (and that is in 1982 dollars!). John Sculley, then CEO, recalled: “We were so insular that we could not manufacture a product to sell for under $3,000.” Steve Jobs was fantastically able to assemble and motivate a team of people that managed to invent a truly revolutionary product, but he was also unable to turn it into profit. Read the rest of this entry »