16 December 2025

The Ten Trillion Dollar Gamble

Recommendation

A timely book on personal finance with actionable advice is rare, but investment expert Russ Koesterich delivers specific guidance for investors. He jumps into the current debate on the US budget deficit and suggests ways to prepare your portfolio for an uncertain, “deficit-driven” future. Koesterich details which strategies, asset classes and instruments offer potential safe harbors and good returns amid the grim reality of structural deficit economics. BooksInShort finds his discussion of the deficit informative and useful, though at times repetitive, and recommends his book to anyone weighing how to act wisely in financially challenging times.

Take-Aways

  • Growing US deficits will trigger higher taxes, greater budget cuts, falling entitlement benefits and increased interest rates.
  • In this “deficit-driven” environment, investors will encounter greater stock and bond market volatility, incipient pressure on the US dollar and rising inflation.
  • Profitable investing opportunities will still exist, but not necessarily in the same areas nor with the same techniques people used in the past.
  • Investment timing is critical: Closely monitor economic indicators for rising interest rates and inflation.
  • Lock in your long-term borrowings at current low rates, and keep your cash short term.
  • Rising interest rates will lower bond prices, so “ladder” your bond maturities.
  • Stocks are in a “secular bear market,” which means prices won’t appreciate for years.
  • Look for equity opportunities overseas, in foreign companies and US exporters.
  • Add commodities to your portfolio; they perform well in rising interest rate environments.
  • Real estate is a good investment in inflationary times.

Summary

Facing Reality

The US budget deficit in fiscal year 2010 was $1.3 trillion. That amount represents approximately all the debt America amassed from its founding in 1776 until 1984. So, in one year, the country overspent by the equivalent of roughly 200 years of boom, bust and war. Experts predict that between 2010 and 2019, the US will add an annual average of $900 billion to its obligations.

“The fact that we’ve gotten away with our financial profligacy in the past does not mean that we will get away with it in the future.”

Despite former vice president Richard Cheney’s 2002 comment, “Reagan proved that deficits don’t matter,” the truth is they do matter. While deficits are nothing new in the American economy, these staggering numbers will change the global economic climate and trigger more taxes, greater budget cuts, falling entitlement benefits and increased interest rates. The 2009 and 2010 deficits each amounted to about 10% of US GDP, the largest proportion since World War II. Granted, the outsized spending went toward stabilizing a sluggish economy, but the more debt a nation incurs, the higher the chances of destabilizing effects.

“Investors need to adjust their portfolios to reflect the new economic realities that are coming.”

The US’s total debt now stands at $14 trillion, an astonishing 90% of GDP. When indebtedness reaches such a high percentage of a nation’s output, further government borrowing invariably leads to higher interest rates. As rates rise, inflation threatens, and investors hesitate to buy bonds, because they don’t want to lock up their money at low interest rates. So bond prices drop, pushing interest rates up even more. As the deficit grows, the US has to sell more Treasury bonds to finance it, and this new supply of bonds further drives down bond prices.
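
The inverse relationship between rates and bond prices that powers this spiral is easy to verify with a discounted-cash-flow calculation. Here is a minimal Python sketch; the 10-year maturity, 3% coupon and $1,000 face value are our own illustrative assumptions, not figures from the book.

```python
# Why rising rates depress bond prices: discount a bond's fixed cash flows
# at higher yields and its present value falls. All terms are hypothetical.
def bond_price(face: float, coupon_rate: float, yield_rate: float, years: int) -> float:
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + yield_rate) ** years
    return pv_coupons + pv_face

for y in (0.03, 0.04, 0.05):
    print(f"Yield {y:.0%}: price ${bond_price(1_000, 0.03, y, 10):,.2f}")
# Yield 3%: $1,000.00; yield 4%: about $918.89; yield 5%: about $845.57.
```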

Investor Beware

In this “deficit-driven” environment, investors will encounter greater stock and bond market volatility, incipient pressure on the dollar, and rising inflation. A stagnant economy will impair people’s ability to save and invest. This comes at a precarious time for many Americans, especially homeowners. Between 2001 and 2008, US mortgage debt doubled, pushing consumer debt to 120% of net income. Personal debt-to-income ratios are at a historic high. An International Monetary Fund study found that interest rates rose more in countries with large national deficits and high personal debt levels than in countries with high deficits but manageable individual debt. Since the US has huge deficits and overextended citizens, higher interest rates will significantly affect the housing market, keeping home prices down and slowing appreciation.

“The fiscal situation has now reached a point where even if politicians behave responsibly, there are no easy choices.”

Higher interest rates make mortgages, car loans, student loans and credit cards more expensive. Inflation also makes the US dollar less attractive to foreign investors. The US economy will likely continue to grow slowly relative to the economies of developing markets. These scenarios hold dangers for private investors, who have little control over such macro issues but need to watch their personal portfolios carefully. Profitable investing opportunities will still exist, but not necessarily in the same areas as in the last few decades, nor will they be accessible with the same techniques used in the past.

It’s All About Timing

Timing determines investment success: Moving too quickly into and out of investments can be as costly as reacting too slowly. While the US economy has meandered along for some time, interest rates are not likely to rise until the end of 2011, with inflation following about a year later. In this setting, you should hold fewer US bonds and equities, and more cash and commodities.

“A long-term rise in inflation is one of the risks that investors need to account for in constructing their portfolios and managing their finances.”

To manage your investment timing, watch economic indicators that can foretell the movement of interest rates. For instance, while the government is likely to be a heavy borrower for some time, look for increasing corporate demand for loans. As economic conditions improve, firms will seek credit, and banks’ commercial and industrial (C&I) loans will edge upward. Check the Federal Reserve’s website every Friday for that week’s C&I data. Consumer debt is also critical, because it reflects public sentiment: The better the job market, the more readily people will spend. However, they “are unlikely to start to borrow aggressively again until the debt-to-income ratio is back to around 90%.”

“No matter what, the era of cheap money is over.”

Similarly, imminent inflation sends out its own warning signs: Be on the lookout for increased job growth (check monthly Labor Department statistics), rising manufacturing ability (via the “capacity utilization” rate the Federal Reserve reports monthly) and swelling money supply (look for ongoing growth in excess of 6% per year in the Fed’s M2 money calculation).
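
These trigger points reduce to a short checklist. The minimal sketch below flags the warning signs named above; the 6% M2 and 90% debt-to-income thresholds come from the summary, while the capacity-utilization cutoff and all sample readings are our own illustrative assumptions.

```python
# Minimal sketch: flag the rate and inflation warning signs named above.
# Sample readings are hypothetical, not real Federal Reserve data.
def check_warning_signs(ci_loan_growth_pct: float,
                        debt_to_income_pct: float,
                        m2_growth_pct: float,
                        capacity_utilization_pct: float) -> list:
    signals = []
    if ci_loan_growth_pct > 0:
        signals.append("C&I loans rising: corporate credit demand is returning")
    if debt_to_income_pct <= 90:
        signals.append("Debt-to-income near 90%: consumers may borrow again")
    if m2_growth_pct > 6:
        signals.append("M2 growing more than 6%/yr: inflation pressure building")
    if capacity_utilization_pct > 80:  # assumed cutoff for "tight" capacity
        signals.append("High capacity utilization: manufacturing slack is gone")
    return signals

for signal in check_warning_signs(2.5, 95.0, 7.1, 78.0):
    print(signal)
```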

Managing Debt, Cash and Bonds

After a nearly uninterrupted 30-year decline in interest rates, a rising-rate environment calls for handling your debt and liquidity differently: Lock in your long-term borrowing (such as mortgages) now, while rates are still low, and put your cash only in short-term instruments, so you can reinvest it as rates move upward. Avoid adjustable-rate mortgages (ARMs) – your costs go up as interest rates rise – unless you plan to repay your mortgage quickly, before the rate resets. If you’re buying a new home or refinancing, consider making less of a down payment or borrowing more. Inflation eats away at what you owe, so a bigger fixed-rate mortgage costs you less in real terms over time. Keep your cash available in savings accounts, money market funds and one- to two-year certificates of deposit.
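
A quick calculation shows why the author favors big, fixed-rate, long-term borrowing: inflation shrinks the real value of what you owe. The sketch below assumes a hypothetical $300,000 balance and steady 3% inflation; neither figure comes from the book.

```python
# How inflation erodes a fixed debt in real (today's-dollar) terms.
balance = 300_000.0   # hypothetical mortgage balance
inflation = 0.03      # assumed steady annual inflation

for year in (0, 10, 20, 30):
    real_value = balance / (1 + inflation) ** year
    print(f"Year {year:2d}: real burden of the debt = ${real_value:,.0f}")
# After 30 years at 3% inflation, the fixed $300,000 balance weighs only
# about $123,600 in today's dollars; the lender bears the erosion.
```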

“Munis have one significant advantage over Treasuries or corporate bonds: Unlike the federal government, most states are legally prohibited from running a budget deficit.”

Bonds provide income to investors and are usually a safe, secure addition to most portfolios, especially in low-growth economies. Traditional rules of thumb advise most investors to keep about one-third of their holdings in bonds, and suggest more for those nearing retirement. But as interest rates rise, all investors should trim their exposure to bonds, especially US Treasuries.

“At its root, a structural deficit is as much a political problem as an economic one.”

Given a deficit-driven scenario, investors younger than 50 should hold no more than 20% of their portfolios in bonds, while pre-retirees should look to buy dividend-paying and preferred stocks. For the bonds you do hold, “ladder” their maturities: Buy bonds that mature at different dates, so that as shorter-term bonds mature, you can reinvest the proceeds in higher-yielding securities. Consider individual municipal (muni) bonds or muni bond funds; keep them geographically dispersed, with maturities of less than five years. Prioritize top-rated corporate and international bonds over US Treasuries for your taxable bond portfolio.
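
A bond ladder is easy to picture in code. In the minimal sketch below, the five rungs, amounts and yields are all invented for illustration; the point is simply that some cash matures every year for reinvestment at whatever rates then prevail.

```python
# Toy bond ladder: stagger maturities so cash frees up on a regular schedule.
from dataclasses import dataclass

@dataclass
class Rung:
    matures_in_years: int
    face_value: float
    yield_pct: float

ladder = [
    Rung(1, 10_000, 1.0),
    Rung(2, 10_000, 1.5),
    Rung(3, 10_000, 2.0),
    Rung(4, 10_000, 2.4),
    Rung(5, 10_000, 2.8),
]

for rung in ladder:
    print(f"Year {rung.matures_in_years}: ${rung.face_value:,.0f} matures; "
          f"reinvest at the then-current five-year rate")
```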

The Deficit’s Impact on Stocks

Stock investors saw 20% annual returns from 1995 to 2000, but shares in the first decade of the 21st century had their poorest showing since the Great Depression. Sluggish growth, higher interest rates and creeping inflation all bode ill for future stock performance: Corporate profits will suffer, firms’ borrowing costs will rise, and inflation will depress earnings and share prices. A “secular bear market” – a lackluster period, sometimes lasting 10 or more years, with mostly sideways share-price movements – will likely define the deficit-driven future.

“Smaller deficits, more modest debt, a bond market priced for an inflationary train wreck, and a convenient place for large, developing nations to park their money – this is why the United States has managed to overspend for 40 years with so little damage.”

Investors should reduce their exposure to equities, particularly US companies’ shares, but they shouldn’t eliminate stocks entirely from their portfolios. Be alert to buying opportunities in cheaply valued shares, international companies, inflation-favored industries and uncorrelated investment strategies.

“Whether you are running a huge hedge fund or managing a small personal portfolio, timing is critical to your success.”

Stock markets can post short-term rallies in a secular bear market, but the overall trend stays flat to lower. Determine your risk appetite and your required return on investment. Buy shares of companies in industrialized nations when their price-to-earnings (P/E) ratios drop well below 15. These cheaper valuations can turn into good bets, but you should probably not have more than 55% of your overall portfolio in equities. When P/E ratios rise above 20, reduce your equity allocation to about 30%.
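
This valuation rule reduces to a simple function. In the sketch below, the 15 and 20 P/E thresholds and the 55% and 30% equity weights come from the summary above; the straight-line blend between the two anchors is our own illustrative assumption.

```python
# The P/E-driven equity allocation rule described above, as a function.
def equity_allocation(pe_ratio: float) -> float:
    if pe_ratio < 15:
        return 0.55   # cheap market: up to 55% in stocks
    if pe_ratio > 20:
        return 0.30   # expensive market: trim to about 30%
    # Linear blend between the two anchor points (our assumption):
    return 0.55 - (pe_ratio - 15) / 5 * (0.55 - 0.30)

for pe in (12, 15, 17.5, 20, 24):
    print(f"P/E {pe:>5}: {equity_allocation(pe):.0%} equities")
```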

“Just as secular bear markets begin when investors are too optimistic, they end when they’re too pessimistic.”

Most investors act based upon “home country bias.” They feel more comfortable in, and therefore buy too much of, the securities issued by their nation’s companies. Because the US represents only one-quarter to one-third of global stock activity, your equities portfolio should be mostly in foreign shares. In the future, certain international and emerging markets will become more attractive for diversification and profit potential, both because they weathered the 2008 financial crisis better than the US, and because they have more promising growth prospects and demographics. For example, Australia and Canada have solid banking systems, less debt than the US relative to their GDPs and abundant natural resources. In inflationary times, the prices of natural resources surge. Emerging markets such as South Korea, South Africa, Taiwan and Chile promise good growth uninhibited by high national debt levels.

“We should also try to forget the buy-and-hold mantra that was endlessly repeated at the peak of the bull market...Stocks can go down.”

Another way to capitalize on foreign diversification is to invest in US companies that derive most of their revenues from international sales. American industrial and technology firms tend to have highly profitable overseas operations. For instance, Intel conducted 80% of its sales in foreign markets in 2009. If the dollar weakens, US exporters will benefit because their products will become cheaper in international markets.

“Even good companies can make for bad stocks if you overpay.”

In addition, look for companies in industries that perform well in the face of rising interest rates. The earnings of energy, health care and technology firms tend to withstand inflation better than those of banking, utility and “consumer discretionary” enterprises, like restaurants and retail businesses. Experienced investors should also consider alternative investments such as hedge funds – actively managed and adaptable investment pools.

“What may eventually save real estate is a bit of inflation. For while higher rates and slower growth hurt real estate, inflation does not.”

And when you can’t find stocks good enough to invest in, you can always trade market volatility: Price spikes and plunges occur even during sideways markets, and instruments tied to the VIX Index, also known as the “fear index,” let you speculate on how much the US stock market will swing over a set time period. These positions should amount to only a small portion of your portfolio, but they can provide an interesting – and lucrative – alternative if you do your homework.

“Buy Stuff”

The prices of physical assets, such as commodities and real estate, benefit from rising inflation. While near-term inflation may not yet be in the cards for the US economy, prepare your portfolio for its eventual return. Adding real assets makes a lot of sense, but it also calls for close study and planning.

When inflation first emerges, investors suffer “inflation shock,” and their initial impulse is to dump stocks and bonds, and to acquire commodities. Since inflation erodes the buying power of paper assets, commodities become more attractive. In the past, commodity prices have jumped anywhere from 3.8% to almost 10% for every 1% rise in the yearly US inflation rate.
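
That sensitivity range translates directly into expected price moves. The snippet below applies the cited 3.8%-to-10% historical range to a hypothetical two-point rise in annual inflation.

```python
# Rough arithmetic on the historical inflation sensitivity cited above.
inflation_rise_pct = 2.0         # hypothetical two-point inflation surprise
low_beta, high_beta = 3.8, 10.0  # historical range cited in the summary
print(f"Implied commodity move: {inflation_rise_pct * low_beta:.1f}% "
      f"to {inflation_rise_pct * high_beta:.1f}%")
# A 2-point inflation surprise implies roughly a 7.6% to 20% commodity move.
```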

But inflation isn’t the only reason to allocate at least 10% of your portfolio to commodities. Globalization and the growth of newly industrializing countries mean that demand for items like crude oil and metals will surge; concurrently, supplies of these commodities will come under pressure as nations deplete their energy supplies and natural resources. Research prospective supply-and-demand data for commodities from emerging users like China and India. Include choices from the whole range of commodity classes: precious metals, energy, agricultural and industrial. Investors seeking portfolio diversification can buy shares of funds based on baskets of commodities or even speculate on prices in the commodities futures markets.

Real Estate

For many people, their only real estate exposure – their homes – may constitute the largest portion of their net worth. In a slow-growth, noninflationary economy, that home is all the real estate you’ll want to own. But if inflation roars back, consider investing in a second home, undeveloped land or commercial property, provided you can hold the asset for at least a decade and finance the purchase with borrowed money. Real estate, unlike other physical assets, can potentially provide you with rental income, and it holds its value relatively well over the long term, as long as there is some inflation in the system. Land and housing prices – and their prospects for appreciation – vary greatly by market and property type, so consider carefully whether adding real estate to your portfolio makes sense for you.

About the Author

Russ Koesterich is chief investment strategist for iShares and global head of investment strategy for BlackRock Scientific Active Equities.


The Ten Trillion Dollar Gamble: The Coming Deficit Debacle and How to Invest Now (McGraw-Hill)



16 December 2025

The Optimization Edge

Recommendation

Information technology and management consultant Steve Sashihara offers a complete guide to optimization, starting by explaining that it is the process of using the speed and power of computers to analyze information and recommend decisions. He goes beyond yes-and-no decisions and delves into such issues as determining optimal package delivery routes (for UPS) and finding the best way to sell hotel rooms (for Marriott). Sashihara provides a history of operations research and how it works, and he elucidates what computers have added to the process. He shows managers how to present an optimization program to all levels of their companies, starting with decision-making managers. Sashihara also explains the problems that can arise with optimization programs. BooksInShort recommends his book to those who want to understand the decision-making process and wish to know why – and how – to optimize.

Take-Aways

  • Information is the primary tool of business. “Optimizing” will help you make the most of your data.
  • Decision makers have too much information to use it effectively.
  • Managers can use computers to sort and store information, and to make decisions based on it. That is optimization.
  • Companies that optimize usually outperform companies that do not.
  • Cutting staff is not the best way to save money; make better use of your resources.
  • To optimize, you must first know fully what information you currently have. Then identify your issues and build a team.
  • The use of technology often has unintended consequences, so optimize carefully and monitor your progress.
  • Optimization works best when executives use it to solve a specific business issue.
  • Optimization will be far more common in the future, so welcome it now.
  • Computer-based optimization is not a threat to human decision makers, but a way to enable them to do their jobs better.

Summary

Asset Management: The Ultimate Goal

While people must rethink many of the old ways of doing business, some goals and basic realities remain the same, such as the need to make effective decisions about using your organization’s assets. Today, companies – particularly start-ups – have fewer fixed assets. Staffs are smaller. Competition is fiercer, more constant and more global. Executives instantly turn to downsizing as a way to manage assets and save money, but this is not how to grow. For that, firms must find better, more effective ways of using their resources.

What Is Optimization?

Optimization is the use of computer programs to analyze all the available information that relates to a decision. It suggests the best decision based on this data and explains the reasons for that recommendation. Optimization is the most outstanding way to determine how to deploy your company’s assets effectively. It can offer more than just yes-or-no choices; in fact, firms apply it to very complicated decisions. UPS, the package-moving company, used optimization to find more efficient routes for shipping packages – a complex task. Optimization in UPS’s case included planning driving routes with minimal left turns. To turn left, a truck driver might waste time and fuel waiting for an opening in traffic, but a driver usually can make a right turn immediately.
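
To see the flavor of such a calculation, consider the toy sketch below, which is emphatically not UPS's system: it enumerates delivery orders for four stops, adds a made-up time penalty standing in for left turns, and picks the cheapest route. All distances and the penalty rule are invented.

```python
# Toy route optimization: brute-force search with a "left turn" penalty.
from itertools import permutations

stops = ["A", "B", "C", "D"]
distance = {("A", "B"): 4, ("A", "C"): 2, ("A", "D"): 5,
            ("B", "C"): 3, ("B", "D"): 2, ("C", "D"): 4}

def leg_cost(a: str, b: str) -> float:
    base = distance.get((a, b)) or distance[(b, a)]
    # Hypothetical rule: travelling "backward" in the alphabet stands in
    # for a left turn, adding time spent waiting for a gap in traffic.
    left_turn_penalty = 1.5 if a > b else 0.0
    return base + left_turn_penalty

best = min(permutations(stops),
           key=lambda route: sum(leg_cost(a, b) for a, b in zip(route, route[1:])))
print("Cheapest route:", " -> ".join(best))
```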

Who Uses Optimization?

All managers make decisions every day. A firm’s economic survival can depend on its leaders’ choices and their ability to monitor which decisions worked and which went wrong. Correcting mistakes also involves decisions. Managers of successful companies “possess an uncanny ability to make complex decisions faster, more accurately and more consistently than their competition.” Optimization supplies an additional competitive edge. UPS, which uses the system for ground packages, is highly successful; FedEx is its only major competitor. When Airborne Express tried to compete with UPS by using old-style “seat of the pants” decision making, it fell behind and no longer exists. UPS used optimization to design smarter routes, increase efficiency, and save fuel and money.

“There is no substitute for human brainpower, experience and judgment. But decision making is becoming more complex, the need for speed is ever greater and the value placed on the ‘right decision’ is increasing.”

Walmart also uses optimization for decision making and, in particular, for working out the logistics of sending goods where they need to go. Buying in bulk helps Walmart keep its prices reasonable, but the ability to deliver products to appropriate stores, at appropriate times, is even more important. Walmart uses optimization to examine all stages of its supply chain continually, from supplier to customer. Walmart tracks changes at each point. Consider air conditioners. Walmart’s former top competition, Kmart, tended to use simple assumptions, like stocking up on air conditioners in the summer. Walmart looks at buying patterns for air conditioners. Its logisticians examine weather reports, note when areas expect heat waves and prepare to divert air conditioners from stores in cooler places.

“Optimization may begin as a single project, but once it is successful, changing the way decisions are made can permeate people’s thinking and change the culture of an organization.”

Two US hotel and restaurant chains founded in the 1920s, Marriott and Howard Johnson, spent 50 years as prime competitors. For decades, they followed similar development paths. Both opened their first travel lodges in the mid-1950s and did well. By the mid-1970s, facing increased oil prices and economic stagnation, Americans cut back on road trips. Howard Johnson’s profits declined. The firm cut costs, downsized staff and served cheaper food. The loss of customers continued. In 1979, management accepted a buyout offer from a British company. Ironically, Marriott bought Howard Johnson’s assets from that firm in 1985 and resold them.

“Optimization is a woefully underutilized capability in many industries.”

Marriott thrived by taking a different course. As early as 1938, Marriott’s restaurants – the Hot Shoppes – supplied box lunches for 22 daily flights from Washington, DC, to New York. When airlines turned to optimization to improve their scheduling, Marriott began to use similar systems for its hotel bookings, particularly for groups, which tend to reserve rooms much further in advance than individuals. Hotel rates vary greatly, so Marriott turned to optimization to find a better way to set prices. The company’s effort to promote internal acceptance of its optimization system began with gaining upper management support and convincing staff that optimization did not threaten their jobs. Marriott also uses optimization to build its image. If a Marriott hotel’s lowest rate is more than a group can pay, or if the hotel lacks rooms, its staff uses the company’s information base to recommend a nearby Marriott or even a competitor. The hotel might not get the group’s business, but Marriott earns goodwill.

Foundations of Optimization

Managers who want to adopt optimization should ask how their company is making repetitive decisions about major assets, and whether it has underutilized assets and is using the right factors to justify its decisions. Notice if certain strategic decisions or operational issues repeatedly generate debate, and determine if your current forecasts are accurate.

Optimization is both a new and a very, very old concept. People have always wanted to get the most from their assets, including their teams. “The cave people who had to decide who should go on the hunt and who should stay behind to guard the camp faced an optimization problem.” Even if they thought out mental models of how to form a hunting party, cave dwellers lacked the math concepts to frame optimal questions. Basic modeling emerged as early civilizations worked out calendars not only to measure the passage of time but also to predict events, such as full moons and seasons. Mathematics existed in some form among the ancient Greeks and Romans. The Arabs made major contributions by developing algebra and introducing a better numbering system. The more direct ancestor of the math behind modern optimization arose around 1600 from efforts to predict the future – namely, attempts to find more reliable methods than trying to divine the will of the gods who controlled fate.

“Optimization is about taking an upside-down look at things and moving into unexplored areas. Rather than starting in the boardroom...optimizers are creative thinkers continuously on the prowl for optimal solutions – starting from the bottom up.”

French 17th-century mathematician Blaise Pascal contributed to “prediction science” when he tried to make gambling less risky by finding a way to predict the outcome of games. These early betting odds became the basis of modern gambling – both illegal and legal. One variation was the idea that the gambler’s odds should account for both the probabilities of certain outcomes and their desirability. In investing, this becomes the “expected return.” Science, particularly theoretical science, often runs in advance of the technology it needs to work. In the 19th century, the lag in technology kept mathematician Charles Babbage from building the two impressive calculating machines he had designed, including one he could program with punch cards. The first version of his machine – built 150 years after he designed it – weighed tons.
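
The expected-return idea fits in a few lines. The hypothetical game below weights each outcome by both its probability and its payoff, the combination Pascal's successors formalized.

```python
# Worked "expected return" example; the game and its odds are invented.
outcomes = [
    (0.50, -10.0),   # 50% chance: lose the $10 stake
    (0.40, +5.0),    # 40% chance: small win
    (0.10, +60.0),   # 10% chance: big win
]
expected_return = sum(p * payoff for p, payoff in outcomes)
print(f"Expected return per play: ${expected_return:+.2f}")  # +$3.00
```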

Optimization in the 20th Century

The invention of the transistor and the microchip exemplified technology’s decreasing size and increasing calculating speed. This set the stage for modern optimization, even as more direct advances unfolded, such as the development of more precise methods for calculating odds of occurrence and degrees of desirability, and of actuarial methods for life expectancy, the basis of life insurance and pension calculations. The first person to work significantly in this area was Edmond Halley, discoverer of Halley’s Comet.

“Game theory involves using mathematics to model behavior in strategic and tactical situations, where an individual’s success depends on the individual’s wins at the expense of another.”

The advent of technology and operations research – the direct ancestors of optimization – accelerated in the first half of the 20th century. The British first used radar in 1940 to detect approaching German bombers during the Battle of Britain, and analysts studied enemy attack patterns to improve defenses and plan raids. As the war went on, “operations analysis” greatly improved the effectiveness of Allied bombings, antiaircraft fire and artillery targeting. The Allies built faster computers to use in artillery targeting, technological research and decoding. Mathematicians developed linear programming to analyze problems of allocating scarce resources under constraints. The process – later streamlined by the debut of the “simplex algorithm” – solved such problems by following a series of steps. Mathematicians’ next pivotal theoretical development was “game theory,” which helps predict an opponent’s behavior.
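
A miniature linear program captures the wartime flavor. The sketch below uses SciPy's linprog solver (whose modern methods have superseded the original simplex algorithm) on an invented allocation problem: maximize the value of two mission types given limited fuel and crews.

```python
# Tiny linear program: an invented resource-allocation problem.
from scipy.optimize import linprog

# Maximize 3x + 2y, i.e., minimize -(3x + 2y).
c = [-3, -2]
A_ub = [[2, 1],   # fuel:  2x + y <= 100
        [1, 1]]   # crews:  x + y <= 80
b_ub = [100, 80]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("Optimal plan:", result.x, "value:", -result.fun)  # x=20, y=60, value 180
```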

“Theory never rests.”

When World War II ended, computers still filled entire rooms, and only governments could afford them. By 1951, Remington Rand had built the Univac I for the US Census Bureau for $1 million (equal to $8 million today). IBM demonstrated just how fast technology advances, even in peacetime, with its transistorized IBM 608 for $83,000 (a sum that today would buy hundreds of vastly more powerful laptops). The 608 was smaller, faster and cheaper than the Univac I, and consumed less electricity. The technology needed for optimization had arrived and would continue to improve. The first computer programs specialized for optimization debuted in 1955, followed by the introduction of user-friendly computer languages and the microchip, which further shrank the size and cost of computers.

Optimization Today: The Need

Optimization is not the first step in computers’ takeover of society, despite any number of science fiction villains, such as HAL in 2001: A Space Odyssey. Humans are not likely to become cyborgs, despite remarkable advances in medicine and prosthetics, and they likely will always have the final “sign off” on decisions, no matter how computers develop. However, optimization reaches good decisions, often faster than a person might. Decision making must include human values, but are those values necessary in “operational decisions” – that is, ways of doing things? People cling to the familiar because they know it, not because it fills some deep moral purpose or even because it is better. Optimization can show what works best and when it can work better. People will correct failures, but they are less psychologically able to correct what seems like success.

Where Does Optimization Work Best?

Optimization works well where similar decisions must be made again and again, like when figuring out how to route packages. Optimization also functions best:

  • When a large number of variables or permutations enter into the relevant calculations.
  • When the decision making involves quickly sorting an enormous number of facts.
  • When the decision requires information humans can’t readily gather or process.
  • When a decision calls for combining probabilities (if A, then what happens? If A and B, then what?) – a calculation sketched after this list.
  • When emotions are not part of the decision.
  • When the situation does not call for protracted discussion and negotiation – that is, when compromise cannot be optimized.
  • When the decision-making process requires continuous processing or monitoring.
  • When mathematics can model the decisions.
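
To illustrate the probability-combining item above, the sketch below chains two invented, independent event probabilities into four scenario likelihoods, the kind of repetitive, math-friendly calculation a machine can rerun continuously.

```python
# Combining probabilities: "if A, then what? If A and B, then what?"
p_a = 0.30   # hypothetical: P(supplier is late)
p_b = 0.20   # hypothetical: P(demand spikes)

scenarios = {
    "A only":  p_a * (1 - p_b),
    "B only":  (1 - p_a) * p_b,
    "A and B": p_a * p_b,
    "neither": (1 - p_a) * (1 - p_b),
}
for name, p in scenarios.items():
    print(f"{name:>8}: {p:.2%}")
```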

How to Design, Plan and Examine an Optimization Program

When establishing an optimization system, pay attention to organizational culture. Optimization has to deal with the existing culture, though its success can change that culture. People who are content with current methods need good reasons to switch to optimization, particularly decision makers who may consider themselves good at making decisions and not wish for computerized support. Those who fear that optimization will cost them their jobs will oppose it. To set up an optimization program:

  • First, assess your existing information so you know exactly what you already have.
  • Identify a compelling business issue that is solvable with math-based decision making.
  • Assemble a team with the right attitude, energy, and a variety of complementary skills.
  • Present the reasons for an optimization system. Introduce it gradually.
  • Anticipate, and move to ameliorate, any unintended consequences of optimization. For example, how will success in departments that have adopted optimization affect departments that have not?
  • Give staffers preparation, the skills to use the program and clarity about its goals.
  • Be sure employees can operate the software correctly and can provide feedback.
  • Develop methods to monitor results, fix the negatives and build on the positives.
  • Insist on feedback and provide it to upper management.
  • Deliver value quickly. Establish momentum when introducing the program. Expand it to other areas of the company as soon as feasible. Be open and maintain transparency.

The Future

Some corporations use optimization for only some operations. UPS, for example, has not yet introduced optimization for its air service, though it is under consideration. This is in keeping with the best practice of establishing such programs piece by piece. Optimization’s use expands when companies see how well it works and when people understand that computer-based optimization is a tool, not a threat to them or their jobs. It will be far more prevalent in the future, so welcome it. People will, for the foreseeable future, have to program assumptions into computers and run them, though these computers will continue to do more tasks.

About the Author

Steve Sashihara is president and CEO of Princeton Consultants Incorporated, an information technology and management consulting firm.


The Optimization Edge: Reinventing Decision Making to Maximize All Your Company's Assets (McGraw-Hill)



16 December 2025

Fire in the Valley

Recommendation

Authors Paul Freiberger and Michael Swaine offer the second edition of their extremely popular 1984 chronicle of the birth of the personal computer. They recount how the PC industry began, who fueled its growth and why things happened as they did. The central stories cover the emergence of MITS, IMSAI, Apple, Tandy and Microsoft. This second edition adds the development and maturation of the hardware and software industries. Apple’s and Microsoft’s sagas still dominate, but new stories emerge, including tales of Dell, Oracle, Netscape and the Internet. The second edition shows how the PC child has grown up. You’ll see how the nerds took a hobby and remade the world using Boolean logic, integrated circuits, motherboards and chips. BooksInShort recommends this book to everyone with an interest in the computer industry and particularly to those who are hungry for the real stories behind the growth of the 20th century’s most pivotal industry.

Take-Aways

  • The first "thinking" machines arose from the marriage of electricity and Boolean algebra.
  • Charles Babbage conceived an analytical engine, which he proposed to power with steam.
  • The Mark I computer was introduced to high praise but was quickly rendered obsolete by advancing technology.
  • The vacuum tube ENIAC computer could process commands 1,000 times faster than computers using electrical switching relays.
  • Transistors changed the computer industry, and silicon - on which transistors were made - created Silicon Valley.
  • The first personal computer, the Altair 8800, was sold via mail order and delivered unassembled and without programs.
  • Hobbyists developed many early computer games.
  • Oracle was a CIA code name for a software project that company founder Larry Ellison helped develop.
  • Hewlett-Packard senior engineers rejected the original Apple I design.
  • IBM signed Microsoft to develop DOS for its new PCs after a software competitor missed a meeting.

Summary

Boolean Algebra, Vacuum Tubes and Steam

The question that launched the personal computer industry was, "Can a machine be programmed to think?" In 1833, British inventor Charles Babbage claimed that steam could be harnessed to run an analytical engine that could solve mathematical problems. Although Babbage never developed his concept, American logician Charles Sanders Peirce determined in 1888 that Boolean algebra could be used as a model for electrical switching circuits. Logic, therefore, could be represented by electrical circuitry.
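
Peirce's insight is easy to demonstrate with modern tools. In the minimal sketch below, Boolean expressions stand in for relays wired in series, in parallel and as inverters; the rain example is our own.

```python
# Boolean algebra as switching circuitry: AND = switches in series,
# OR = switches in parallel, NOT = a normally closed relay.
def series(a: bool, b: bool) -> bool:
    return a and b

def parallel(a: bool, b: bool) -> bool:
    return a or b

def inverter(a: bool) -> bool:
    return not a

# A syllogism-flavored check: "if it is raining and I am outside, I get wet."
raining, outside = True, True
print("wet:", series(raining, outside))                        # True
print("dry:", parallel(inverter(raining), inverter(outside)))  # False
```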

“The success of the first product in a market wasn’t always just a matter of somebody filling an existing niche first. It was sometimes a matter of creating the niche, of inventing a new kind of thing in the world.”

If logic could be applied to switching circuits, then electronic machines could be built to solve logical problems. In 1936, Benjamin Burack did just that; he built a logic machine. His machine could process statements made in the form of logical syllogisms. International Business Machines (IBM), which built non-thinking calculating machines, entered the fray at that time by giving $500,000 to Harvard professor Howard Aiken to develop a calculating device inspired by Babbage’s analytical engine. The Mark I was introduced in 1944 to high praise, though without acknowledgement of IBM’s support. But Aiken’s device suffered from a larger problem, which would haunt the new industry: Advancing technology already was making the machine obsolete.

“Those early computer enthusiasts had no choice but to write their own software. No one imagined that anyone would actually buy software from someone else.”

By the time the Mark I was announced, electronic vacuum tubes were replacing electric switching relays in computer designs. ENIAC, introduced in 1946, was the first computer to use vacuum tube technology. The government used ENIAC, which ran 1,000 times faster than the Mark I, to perform atomic bomb testing calculations at Los Alamos. ENIAC’s creators eventually became part of Remington Rand, whose computer division became Sperry Univac, giving birth to the mainframe computer industry. Within 10 years, Sperry Univac had competition from IBM, Control Data Corporation, Honeywell, Burroughs, General Electric, RCA and NCR. IBM would come to dominate the mainframe computer market, but computer technology was never static. By the 1960s, technological advancements created the new minicomputer market. Even as Digital Equipment Corporation (DEC) and Hewlett-Packard (HP) made cheaper, smaller minicomputers, newer technology always nipped at their heels.

The Transistor

Creation of the personal computer required the invention of the transistor. Vacuum tubes were large, hot and subject to burnout. Transistors were small, generated less heat and, most importantly, could be integrated into a single semiconductor. These integrated circuits, or chips, were made from silicon, and the area of California where they were developed and manufactured became known as Silicon Valley.

“The Apple name was actually Jobs’s idea. He later insisted that he picked the name at random, but it may have been inspired by either the Beatles’ record label or by Jobs’s experience working in apple orchards in an Oregon commune.”

Intel, a new Silicon Valley company, received a commission from a Japanese calculator company for a line of chips to run its calculators. Marcian "Ted" Hoff, Intel employee number 12, was assigned to design the new calculating chips. In 1969, Hoff proposed a set of chips that included a chip that could run programs. In actuality, Hoff had created the microprocessor, a computer without memory or peripherals. The chip was called the 4004. In 1971, when Intel launched the 4004 chip, it found that it needed to provide customer support. Intel assigned Adam Osborne, future developer of an eponymous portable computer, to write a documentation manual. Thus a new profession was born. At the same time, Intel hired professor Gary Kildall to write a high-level programming language. The language he wrote was called PL/M (Programming Language for Microcomputers). Again, a new profession was born.

MITS, the Altair and BASIC

In 1975, Albuquerque, New Mexico’s Micro Instrumentation Telemetry Systems (MITS) became the first firm to sell a no-frills microcomputer, the Altair 8800. MITS delivered the bare-bones computer with a CPU and 256 bytes of memory, but no terminal or keyboard. Buyers had to assemble their own units and - to make the $397 machines do anything - write their own programs. MITS could not produce kits fast enough to satisfy demand. When MITS began selling the Altair 8800, Bill Gates was a Harvard freshman and Paul Allen was working for Honeywell in Boston. After seeing an article on the computer in Popular Electronics, Gates and Allen called MITS founder Ed Roberts and offered to sell him a version of BASIC customized for the Altair 8800. Roberts told the young entrepreneurs that he would buy the first version of BASIC that he saw running on an Altair. They told him they had the software. Six weeks later, after actually writing the software, they flew to Albuquerque. Before leaving Boston, they changed the name of their company from Traf-O-Data to Micro-Soft.

Homebrew Computer Club

The Homebrew Computer Club began in the San Francisco Bay Area in 1975 as a hobbyists’ exchange community. Sometimes, representatives from Intel showed up at meetings and distributed newly designed chips to club members in exchange for feedback on the chips’ performance. The intellectual ferment at these meetings spawned several companies, including Processor Technology, Cromemco, North Star, Vector Graphics and Godbout. However, conversation at the meetings always returned to a discussion of the "big boys": When was IBM or some other big company going to produce a personal computer? Meanwhile, dozens of new computer-related companies emerged, including Apple Computer, Commodore, IMSAI, Digital Microsystems, Alpha Micro Systems, Heathkit and Ohio Scientific.

“Wozniak was unarguably an outstanding engineer, but he could only work on projects that interested him, and then only for as long as they interested him.”

Software also interested the Homebrewers. By necessity, the Altair 8800 had forced hobbyists to write their own programs. When a club member figured out how to play music on an Altair 8800, club members became tremendously interested in making these personal computers do something. Many turned to creating games, because making games was a good way to learn to program these machines. They developed such games as Star Trek, Breakout, Target, Adventure Land, Pirate Adventure and MicroChess, which was particularly significant because it was one of the first personal computer games sold to the general public. Peter Jennings, the creator of MicroChess, reinvested his earnings into marketing a business program through a new company called Personal Software. The product was VisiCalc.

The Rise of Software Empires

VisiCalc made Personal Software part of the new wave of software companies that sold both to hobbyists and business owners. Two Atlanta high school friends launched Structured Systems Group to market business software for microcomputers and sold a general ledger program by mail for $995. Eventually, the company changed its name to Peachtree Software. Another company, MicroPro, began selling SuperSort, a data-sorting program, and WordMaster, a text-editing program, through retailers. Following customers’ requests, MicroPro created WordStar, a new word processing program that fixed the word-wrapping problems in the more established word processing program, Electric Pencil. Together, Personal Software, Peachtree Software and MicroPro International created the industry standards for consumer sales of software products.

“There’s money to be made in this business.”

Convinced that money could be made in software, Philippe Kahn started Borland International. Larry Ellison started SDL, which became Oracle (the CIA code name for a software project that Ellison helped develop). George Tate and Hal Lashlee founded Ashton-Tate. Gordon Eubanks founded C&E Software, which acquired Symantec and took its name. Most of these companies developed microcomputer products, although Oracle focused on the minicomputer market. Ashton-Tate’s dBase was the largest cash cow, with millions of users, and Borland’s Turbo Pascal was prized for its speed. Microsoft competed against Borland by launching QuickBasic, which also supported the firm’s reputation in programming languages. The software wars were hot and heavy, and any issues that weren’t decided in the marketplace were decided in the courts.

“The transistor was the technological breakthrough that made both the minicomputers of the 1960s and the personal computer revolution of the 1970s possible.”

Magazines, clubs, shows and stores emerged to distribute personal computer products and information. The earliest magazine to capture the excitement was Byte. Other successful magazines were Dr. Dobb’s Journal, Recreational Computing, Personal Computing, PC Magazine, PC World and MacWorld. The Whole Earth Software Catalog, one of the first books in the new field, received a $1.1 million publisher’s advance. On the retail side, computer stores opened, offering advice and gear. The Computer Store, first of its breed, opened in Los Angeles in 1975. Soon the Byte Shop opened, and eventually national operators, such as ComputerLand and Radio Shack, entered the market with in-store computer salespeople who handled branded and private-label products.

“Logic, in other words, could be replaced by electrical circuitry.”

Apple

Hewlett-Packard was a typical Silicon Valley company. It made everything from mainframe computers to pocket calculators. But when an engineering employee presented his design for a personal computer to senior engineers to evaluate, they rejected his idea as not being an appropriate HP product. The engineer, who lacked a college degree, went back to his parents’ garage to build the rejected computer - he was Steve Wozniak and it was the Apple I.

“The early purchasers of the Altair had no choice but to write their own programs.”

Wozniak didn’t build the Apple I alone. His friend Steve Jobs took Woz’s engineering skill and pushed the business concept, selling 50 Apple I machines to the Byte Shop. Wozniak and Jobs incorporated their new company, Apple, on April 1, 1976. Their first task was to find financing to fulfill the Byte Shop’s order, but they were already looking ahead to creating the Apple II. To get financing for that, Jobs sought the advice of Atari’s founder Nolan Bushnell. He introduced Jobs and Wozniak to Silicon Valley venture capitalist Don Valentine, who, in turn, introduced them to retired Intel engineer Mike Markkula, who invested in the company. Apple’s success opened the door to the personal computer market for software companies.

Microsoft

A few years before the two "Steves" founded Apple Computer, Paul Allen and several of his friends were working for Computer Center Corporation. Their job was to debug DEC programs. One of Allen’s friends, Bill Gates, was known as the local expert at subverting computing system security. Gates took great pleasure in invading the DEC systems he and his friends were debugging. He was, in industry terms, a hacker - at 13 years old.

“The hobbyists of the day looked at their new machines and asked themselves what they could do with them. Play games, they answered.”

By 1980, Allen and Gates were running Microsoft, which sold more than $8 million of software annually and employed 32 people. Although they sold products, mostly programming languages, for all types of computers, the BASIC product written for the Altair dominated their sales. Microsoft had a problem with the 6502 processor that Apple used: Its software had to be translated before it could run on Apple machines, which was costly and time-consuming. As Microsoft was trying to figure out what to do, it received a call from IBM.

“Instead of researching the market, the programmers simply decided at the outset, what features to put in their software.”

IBM sought an operating system and other software for its new personal computers. IBM had selected two companies as potential software suppliers, Microsoft and Digital Research. When IBM representatives tried to purchase the CP/M operating system from Microsoft, they were told that Digital Research owned the product. When they arrived for a meeting with Digital’s owner, he was out flying his plane. Eventually, they met with him, but they did not agree to terms for Digital’s CP/M. IBM went back to Microsoft, which negotiated to convert a local Seattle company’s operating system into DOS, the operating system for the IBM PC. Thus, Microsoft started its fast climb to the top of the software universe.

About the Authors

Paul Freiberger is the co-author of Fuzzy Logic, winner of the 1993 Los Angeles Times Book Prize. He has written for the San Jose Mercury News, the San Francisco Examiner and National Public Radio, and now works at the Interval Research Corporation in Palo Alto, California. Michael Swaine is editor at large for Dr. Dobb’s Journal. He is also a popular columnist for print and electronic magazines in the United States, Italy and Germany, and maintains Swaine’s World, a Web site that tracks computer industry news, at www.swaine.com.


Fire in the Valley: The Making of the Personal Computer (McGraw-Hill; first edition 1984)



