Wednesday, December 29, 2010

A Brief History - 7 - A New Hope

PART 7: Silicon Wars: A New Hope

"A Brief History of Code", by Rudedog Hawkins

A new hope for realizing AI came down the pipe.  Enter the microprocessor and the microcontroller, circa 1971, from Intel and Texas Instruments respectively.  Intel was based in Santa Clara, California, and TI in Dallas, Texas.  The established industry leaders of the time were in the Northeast.  Digital Equipment Corporation, DEC, was based out of Maynard, Massachusetts.  International Business Machines, IBM, was based out of Armonk, New York.  Other soon-to-be major players in the coming Silicon War were MOS Technology in Pennsylvania, Motorola in Illinois, and Zilog in California.  One each was located in the East, West, and Midwest.

In retrospect, company location may have proven critical.  Innovation without obvious usefulness was in the West.  Stagnation and intractability were rampant in the East.  At the time, California was the cultural hot spot in the U.S., with the rest of the country following its lead.  Mass-market, consumer-oriented companies producing products such as movies, albums, and fashions were moving to California in droves.  The East Coast held itself up as the standard bearer of American society: Ivy League schools, the Statue of Liberty, and apple pie.  The Midwest found itself feeding off the best of both worlds.

I should note that MOS Technology’s 6502 CPU, designed by ex-Motorola engineers, shared its core design philosophy with Motorola’s earlier 6800.  Curiously, Zilog’s Z80 series was designed by an Intel ex-employee and ran a superset of the Intel 8080’s instruction set.  The political stage for the coming Silicon War was being set.  It would be a battle that would dramatically transform all of the companies involved.  Of the survivors that remain today, all but one, Intel, have stopped relying on microprocessors and/or personal computers for the bulk of their profits.

The 6xxx and 8xxx CPU “families” were most notable for fundamental differences in interrupt handling and in the size of their instruction sets.  The 6xxx family used a smaller instruction set with hardware-vectored interrupts, while the 8xxx family used a much larger set with software interrupts.  These same fundamental differences existed between the industry-leading DEC and IBM mainframes and mini-computers of the period.

That first Intel microprocessor, the 4004, was one chip in a multi-chip set that together performed the functions of a general-purpose computer.  The TI microcontroller, the TMS1000, scaled most of the general-purpose functions of such a multi-chip set down into a single device designed for dedicated applications.  These devices incorporated what is known as LSI, Large-Scale Integration.

Integrated circuitry had been around for several years.  Ever since the invention of the transistor in the late 1940s, engineers had been building complex analog and digital circuitry on a single silicon slab.  But never had miniaturization and integration been done on the scale demonstrated by these new products.  The first computers had been fabricated from vacuum tubes.  The 6502 CPU was no larger than a postage stamp, but its equivalent built from vacuum tubes would have been the size of a commercial cruise ship.  Today’s chips would have vacuum-tube equivalents the size of the island of Manhattan, and in some cases orders of magnitude larger than that.  The digital revolution had just landed its first man on the moon.

This miniaturization carried the side benefit of higher speed, for several reasons.  For one, signals had less distance to travel, which greatly reduced transmission-line effects from circuit board traces.  And since all of the circuitry was cast from the same silicon slab, all of it could be perfectly matched, negating manufacturing variations in silicon purity.

Previously, a CPU comprised dozens of discrete digital chips that filled at least one entire circuit board.  Most CPUs had separate boards for the separate functional areas of the design.  Now the equivalent of a few circuit boards was etched onto a single chip!  The microprocessor represented brute-force miniaturization, LSI, on a massive scale.

The microcontroller took a slightly different direction by going for all-out integration.  An entire advanced multi-chip set was placed on a single chip.  Putting it all on one chip allowed for better performance at the cost of versatility.  While this may have locked you into a specific chip set, the intended use of the product was as a dedicated CPU running only one program, all of the time.  They made for some great alarm clocks, wristwatches, and calculators.  Every consumer had to have at least one digital gadget.  A decade later, microcontrollers were being created with onboard interpreters for high-level computer languages like BASIC and FORTH.

This naturally led to a wave of entrepreneurs who started up companies in hopes of getting rich quick.  It didn’t work for most of them.  The startup costs and proprietary technology required for manufacturing chips proved prohibitive for the small guy.  Instead, established manufacturers in the industry funded divisions or entirely separate companies to produce integrated circuit chips.  Everyone realized the double profit to be had from the big-ticket items, microprocessor-based CPUs and microcontrollers.  And large profits were there just for the taking.

Even the stalwart DEC, whose mini-computers had helped put the first man on the moon, eventually jumped into the fray.  DEC was one of the last major companies holding back from entering a market perceived as too volatile, and soon to be short-lived for many of the players and wannabes.  To a large degree, that assessment was quite accurate.  Too many companies were developing products that would be obsolete before they could bring them to market.

DEC had preferred to stick to its mainstay cash cow, producing expensive mini-computers for business use.  Prior to microprocessors, most commercially sold computers came in two sizes.  A mainframe was a high-performance computer in a rack as large as a home refrigerator.  A mini-computer was a much smaller, less powerful system, varying in size from a modern microwave oven to a dishwasher.

Low-cost microprocessors led directly to the commercial availability of low-cost computers for consumers.  At least low cost compared to the tens or hundreds of thousands of dollars that most mini-computers of the day cost.  Hobbyists could now purchase complete kits and build their own computers for the cost of a television set.  But you had to be a real enthusiast or an engineer to understand how to construct and use the things.  You had to be a hacker who didn’t mind losing some sleep.  Lots of sleep.

Low-cost computers were an existential threat to both IBM and DEC, but both were too blinded by their own size and momentum to see just how deadly the new threat could be.

Rudy  =8^D
