-
A history of the Amiga, part 3: The first prototype
From: Stephen Walsh@39:901/280 to All on Wed Oct 24 22:52:46 2007
Prototyping the hardware
Modern chips are designed using high-powered workstations that run very expensive chip simulation software. However, the fledgling Amiga company could not afford such luxuries. It would instead build, by hand, giant replicas of the silicon circuitry on honeycomb-like plastic sheets known as breadboards.
Breadboards are still used by hobbyists today to rapidly build and test simple circuits. The way they work is fairly simple. The breadboard consists of a grid of tiny metal sockets arranged in a large plastic mesh. Short vertical strips of these sockets are connected together on the underside of the board so that they can serve as junctions for multiple connectors. Short pieces of wire are cut to precise lengths and bent into a staple-like shape, with the exposed ends just long enough to drop neatly into the sockets. Small chips that perform simple logic functions (such as adding or comparing two small numbers in binary code) straddle the junctions, their centipede-like rows of metal pins precisely matching the spacing of the grid.
At the time, nobody had ever designed a personal computer this way. Most personal computers, such as the IBM PC and the Apple ][, had no custom chips inside them. They consisted of little more than a simple motherboard that defined the connections between the CPU, the memory chips, the input/output bus, and the display. Such motherboards could be designed on paper and transferred directly to a printed circuit board, ready to be filled with off-the-shelf chips. Some, like the prototypes for the Apple ][, were designed by a single person (in this case, Steve Wozniak) and manufactured by hand.
The Amiga was nothing like this. Its closest comparison would be to the minicomputers of the day—giant, refrigerator-sized machines like the DEC PDP-11 and VAX or the Data General Eagle. These machines were designed and prototyped on giant breadboards by a team of skilled engineers. Each one was different and had to be designed from scratch—although to be fair, the minicomputer engineers had to design the CPU as well, a considerable effort all by itself! These minicomputers sold for hundreds of thousands of dollars each, which paid for the salaries of all the engineers required to construct them. The Amiga team had to do the same thing, but for a computer that would ultimately be sold for under $2,000.
There were three chips, and each chip took eight breadboards to simulate. Each board measured about three feet by one and a half feet, and the boards were arranged in a circular, spindle-like fashion so that all the ground wires could run down the center. Each board was populated with about 300 MSI logic chips, giving the entire unit about 7,200 chips and an ungodly number of wires connecting them all. Constructing and debugging this maze of wires and chips was a painstaking and often stressful task. Wires could wiggle and lose their connections. A slip of a screwdriver could pull out dozens of wires, losing days of work. Or worse, a snippet of cut wire could fall inside the maze, causing random and inexplicable errors.
However, Jay never let the mounting stress get to him or to his coworkers. The Amiga offices were a relaxed and casual place to work. As long as the work got done, Jay and Dave Morse didn't care how people dressed or how they behaved on the job. Jay was allowed to bring his beloved dog, Mitchy, into work. Mitchy sat by Jay's desk and even had a separate nameplate manufactured.
Jay even let Mitchy help in the design process. Sometimes, when designing a complex logic circuit, one comes to a choice of layout that could go either way. The choice may be an aesthetic one, or merely an intuitive guess, but one can't help but feel that it should not be left merely to random chance. On these occasions Jay would look at Mitchy, and the dog's reaction would determine the choice Jay made.
Slowly, the Amiga's custom chips began to take shape. Connected to a Motorola 68000 CPU, they could accurately simulate the workings of the final Amiga, albeit more slowly than the final product would run. But a computer, no matter how advanced, is nothing more than a big, dumb pile of chips without software to run on it.
Stephen
--- GoldED+/BSD 1.1.5
* Origin: -:- Dragon's Lair BBS -:- telnet: bbs.vk3heg.net (39:901/280)
-
From: Stephen Walsh@39:901/280 to All on Wed Oct 24 22:54:22 2007
By Jeremy Reimer
Raising the bar for operating systems
Every computer since the very first electronic calculators has required some kind of "master control program" to handle basic housekeeping tasks such as running application programs, managing the user environment, talking to peripherals such as floppy and hard disks, and controlling the display. This master program is called the operating system, and for most personal computers of the day it was a very simple program that was only capable of doing one thing at a time.
Jay's specialty was designing hardware, not software, so he had little input on the design of the Amiga's operating system. But he did know that he wanted his computer to be more advanced than the typical personal computers of the time, which ran such primitive operating systems as Apple DOS and MS-DOS. His hire for chief of software engineering, Bob Pariseau, did not come from a microcomputer background. He came from Tandem, a mainframe computer company whose massive machines were (and still are) used by the banking industry.
Bob was used to powerful computers that could handle many tasks and transactions at once. He saw no reason why microcomputers should not be capable of the same thing. At the time, no personal computer could multitask, and it was generally felt that the small memory capacities and slow CPU speeds of these machines made multitasking impossible. But Bob went ahead and hired people who shared his vision.
The four people he hired initially would later become legends of software development in their own right. They were RJ Mical, Carl Sassenrath, Dale Luck, and Dave Needle. Carl's interview was the simplest of all: Bob asked him what his ultimate dream job would be, and he replied, "To design a multitasking operating system." Bob hired him on the spot.
Carl Sassenrath had been hired from Hewlett-Packard, where he had been working on the next big release of a multitasking operating system for HP's high-end server division. According to Carl:
"What I liked about HP was that they really believed in innovation. They would let me buy any books or publications I wanted... so I basically studied everything ever published about operating systems. I also communicated with folks at Xerox PARC, UC Berkeley, MIT, and Stanford to find out what they were doing.
In 1981-82 I got to know CP/M and MS-DOS, and I concluded that they were poor designs. So, I started creating my own OS design, even before the Amiga came along."
So the Amiga operating system would be a multitasking design, based on some of Carl's ideas that would later be called a "microkernel" by OS researchers in academia. Carl had invented the idea before it even had a name; the kernel, or core of the operating system, would be small, fast, and capable of doing many things at once, attributes that would then pervade the rest of the operating system.
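To give a concrete flavor of the message-passing idea at the heart of Carl's kernel, here is a small C sketch. This is a hypothetical example, not actual Amiga source code: it uses real exec.library calls (AddPort, FindPort, PutMsg, WaitPort, GetMsg, ReplyMsg), though CreateMsgPort() assumes the later V36 (AmigaOS 2.0) version of the library, and the port name "ping.port" and the PingMsg structure are invented for illustration. The key idea is that a message is linked onto the receiving task's port rather than copied, and each task simply sleeps until something arrives.

/* Hypothetical sketch of Exec-style message passing between two tasks.
 * Real exec.library calls; CreateMsgPort()/DeleteMsgPort() assume V36+. */
#include <exec/types.h>
#include <exec/ports.h>
#include <proto/exec.h>

struct PingMsg {
    struct Message msg;    /* standard Exec message header; must come first */
    LONG payload;          /* application-defined contents (invented here) */
};

/* Receiver task: publish a port, handle one message, reply to the sender. */
void serve_one(void)
{
    struct MsgPort *port;
    struct PingMsg *in;

    port = CreateMsgPort();
    if (port == NULL) return;
    port->mp_Node.ln_Name = "ping.port";   /* invented name, for FindPort() */
    AddPort(port);                         /* make the port public */

    WaitPort(port);                        /* task sleeps; no polling */
    in = (struct PingMsg *)GetMsg(port);
    /* ... act on in->payload ... */
    ReplyMsg(&in->msg);                    /* hand the message back, waking the sender */

    RemPort(port);
    DeleteMsgPort(port);
}

/* Sender task: find the public port, post a message, wait for the reply. */
void send_ping(LONG value)
{
    struct MsgPort *replyport, *port;
    struct PingMsg ping;

    replyport = CreateMsgPort();
    if (replyport == NULL) return;

    ping.msg.mn_Node.ln_Type = NT_MESSAGE;
    ping.msg.mn_Length       = sizeof(ping);
    ping.msg.mn_ReplyPort    = replyport;
    ping.payload             = value;

    Forbid();                              /* FindPort() is only safe with task switching off */
    port = FindPort("ping.port");
    if (port != NULL)
        PutMsg(port, &ping.msg);           /* links the message in; nothing is copied */
    Permit();

    if (port != NULL) {
        WaitPort(replyport);               /* sleep until the receiver replies */
        GetMsg(replyport);
    }
    DeleteMsgPort(replyport);
}

The same mechanism runs through the rest of the finished operating system: device drivers, timers, and even Intuition's event delivery all speak through Exec message ports.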
The decision to make a multitasking kernel would have a huge impact on the way the Amiga computer would perform, and even today the effects can still be felt. The mainstream PC market did not gain true multitasking until 1995 (with Windows 95), and the Macintosh not until 2001 (with OS X), so an entire generation of software developers grew up on those platforms without knowing or understanding its effects. The Amiga, which had this feature from its inception, immediately gave its developers and users a different mindset: the user should never have to wait for the computer. As a result, programs developed for the Amiga tend to have a different, more responsive feel than those developed for other platforms.
Adding a GUI
There was one more significant design decision that was made about the Amiga at this time: to design it with a graphical user interface. Most personal computers at the time were controlled by a command line interface; the user had to type in the name of a program to run it and enter a long series of commands to move files or perform maintenance tasks on the computer.
The idea of a graphical user interface was not new. Douglas Engelbart had demonstrated most of its features along with the world's first computer mouse in 1968, and researchers at Xerox PARC had created working models in the mid-1970s. At the beginning of the 1980s, it seemed everyone was trying to cash in on the graphical interface idea, although developing it on the primitive computers of the day was problematic. Xerox itself released the Star computer in 1981, but it cost $17,000 and sold poorly, serving mostly as an inspiration for other companies. Apple's version, the Lisa, came out in 1983. It cost $10,000 and also sold poorly. Clearly, personal computers were price-sensitive, even if they had advanced new features.
Apple solved the price issue by creating a stripped-down version of the Lisa. It took away the large screen, replacing it with a tiny 9-inch monochrome monitor. Instead of two floppy drives, the new machine would come with only one. There were no custom chips to accelerate sound or graphics. And as much hardware as possible was trimmed from the base model, including memory—the operating system was completely rewritten to squeeze into 128 kilobytes of RAM. The stripped-down operating system was only capable of running one application at a time—it couldn't even switch between paused tasks.
This was the Macintosh, which was introduced to the world in dramatic fashion by Steve Jobs in January of 1984. What most people don't remember about the Macintosh was that initially it was not a success—it sold reasonably well in 1984, but the following year sales actually went down. The Mac in its original incarnation was actually not very useful. The built-in word processor that came with the machine was limited to only eight pages, and because of the low memory and single floppy drive, making a backup copy of a disk took dozens of painful, manual swaps.
The Amiga operating system team wasn't thinking like this. The hardware design group wasn't compromising and stripping things down to the bare minimum to save money, so why should they?
[Image: The Amiga user interface, Workbench.]
One of the more difficult parts of writing a graphical user interface is doing the low-level plumbing, called an API, or Application Programming Interface, that other programmers will use to create new windows, menus, and other objects on the system. An API needs to be done right the first time, because once it is released to the world and becomes popular, it can't easily be changed without breaking everyone's programs. Mistakes and bad design choices in the original API will haunt programmers for years to come.
RJ Mical, the programmer who had come up with the "Zen Meditation" game, took this task upon himself. According to Jay Miner, he sequestered himself in his office for three weeks, only coming out once to ask Carl Sassenrath a question about message ports. The resulting API was called Intuition, an appropriate name given its development. It wound up being a very clean, easily understandable API that programmers loved. In contrast, the API for Windows, called Win16 (later updated to Win32), was constructed by a whole team of people and ended up as a mishmash that programmers hated.
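As an illustration of why programmers found Intuition clean, here is a minimal hypothetical program, a sketch rather than period-verified code. It uses the real intuition.library calls OpenWindow() and CloseWindow() with their 1.x-era flag names; the title and geometry are invented. The programmer fills in a single NewWindow description and Intuition handles drawing, dragging, and event delivery. Note that the close-gadget event arrives on an ordinary Exec message port (the window's UserPort), which may explain why RJ's one question for Carl was about message ports.

/* Hypothetical sketch: opening a window through Intuition (1.x flag names). */
#include <exec/types.h>
#include <intuition/intuition.h>
#include <proto/exec.h>
#include <proto/intuition.h>

struct IntuitionBase *IntuitionBase;

int main(void)
{
    struct NewWindow nw = {0};   /* zero everything we don't set explicitly */
    struct Window *win;

    IntuitionBase = (struct IntuitionBase *)OpenLibrary("intuition.library", 0L);
    if (IntuitionBase == NULL) return 20;

    /* Describe the window declaratively; Intuition does the rest. */
    nw.LeftEdge   = 20;    nw.TopEdge  = 20;
    nw.Width      = 300;   nw.Height   = 100;
    nw.DetailPen  = 0;     nw.BlockPen = 1;
    nw.IDCMPFlags = CLOSEWINDOW;                  /* events we want delivered */
    nw.Flags      = WINDOWCLOSE | WINDOWDRAG | SMART_REFRESH;
    nw.Title      = (UBYTE *)"Hello, Intuition";  /* invented title */
    nw.Type       = WBENCHSCREEN;                 /* open on the Workbench screen */

    win = OpenWindow(&nw);
    if (win != NULL) {
        WaitPort(win->UserPort);            /* sleep on the window's Exec message port */
        ReplyMsg(GetMsg(win->UserPort));    /* consume the CLOSEWINDOW event */
        CloseWindow(win);
    }
    CloseLibrary((struct Library *)IntuitionBase);
    return 0;
}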
Stephen
--- GoldED+/BSD 1.1.5
* Origin: -:- Dragon's Lair BBS -:- telnet: bbs.vk3heg.net (39:901/280)
-
From: Stephen Walsh@39:901/280 to All on Wed Oct 24 22:55:37 2007
Working 90-hour weeks
RJ Mical recalled what life was like back in those busy early days:
"We worked with a great passion... my most cherished memory is how much we cared about what we were doing. We had something to prove... a real love for it. We created our own sense of family out there."
Like the early days at Atari, people were judged not on their appearance or their unusual behavior but merely on how well they did their jobs. Dale Luck, one of the core OS engineers, looked a bit like a stereotypical hippie, and there were even male employees who would come to work in purple tights and pink fuzzy slippers. "As long as the work got done, I didn't mind what people looked like," was Jay Miner's philosophy. Not only was it a family, but it was a happy one: everyone was united by their desire to build the best machine possible.
Why was everybody willing to work so hard, to put in tons of late (and sometimes sleepless) nights just to build a new computer? The above-and-beyond dedication of high-tech workers has been a constant ever since Silicon Valley became Silicon Valley. Companies have often reaped the rewards from workers who were willing to put in hundreds of hours of unpaid overtime each month. Managers in other industries must look at these computer companies and wonder why they can't get their workers to put in that kind of effort.
Part of the answer lies with the extreme, nearly autistic levels of concentration that are achieved by hardware and software engineers when they are working at peak efficiency. Everyday concerns like eating, sleeping, and personal hygiene often fade into the background when an engineer is in "the zone." However, I think it goes beyond that simple explanation. Employees at small computer companies have a special position that even other engineers can't hope to achieve. They get to make important technical decisions that have far-reaching effects on the entire industry. Often, they invent new techniques or ideas that significantly change the way people interact with their computers. Giving this kind of power and authority to ordinary employees is intoxicating; it makes people excited about the work that they do, and this excitement then propels them to achieve more and work faster than they ever thought they could. RJ Mical's three-week marathon to invent Intuition was one such example, but in the story of the Amiga there were many others.
The employees of Amiga, Inc. needed this energy and passion, because there was a hard deadline coming up fast. The Consumer Electronics Show, or CES, was scheduled for January 1984.
The January CES and the buyout of Amiga
CES had expanded significantly since its inception in 1967. The first CES was held in New York City, drawing 200 exhibitors and 17,500 attendees. Among the products that had already debuted at CES were the VCR (1970), the camcorder (1981), and the compact disc player (also 1981). CES was also home to the entire nascent video game industry, which would not get its own expo (E3) until 1995.
Amiga, Inc. didn't have a lot of money left over for shipping its prototype to the show, and the engineers were understandably nervous about putting such a delicate device through the rigors of commercial package transport. Instead, RJ Mical and Dale Luck purchased an extra airline seat between the two of them and wrapped the fledgling Amiga in pillows for extra security. According to airline regulations, the extra "passenger" required a name on the ticket, so the Lorraine became "Joe Pillow," and the engineers drew a happy face on the front pillowcase and added a tie! They even tried to get an extra meal for Joe, but the flight attendants refused to feed the already-stuffed passenger.
The January 1984 CES show was an exciting and exhausting time for the Amiga engineers. Amiga rented a small booth in the West Hall at CES, with an enclosed space behind the public display to showcase their "secret weapon," the Lorraine computer. A guarded door led into the inner sanctum, and once inside people could finally see the massive breadboarded chips, sitting on a small table with a skirt around the edges. Skeptical customers would often lift the skirt after seeing a demonstration, looking for the "real" computer underneath.
The operating system and other software were nowhere near ready, so RJ Mical and Dale Luck worked all night to create software that would demonstrate the incredible power of the chips. The first demo they created was called Boing and featured a large, rotating checkered ball bouncing up and down, casting a shadow on a grid in the background, and creating a booming noise in stereo every time it hit the edge of the screen. The noise was sampled from Bob Pariseau hitting the garage door with one of the team's celebrated foam baseball bats. The Boing ball wound up becoming an iconic image and a symbol for the Amiga itself.
The January CES was a big success for the Amiga team, and the company followed it up by demonstrating actual prototype silicon at the June CES in Chicago. But the fledgling company was rapidly running out of money. CEO Dave Morse gave presentations to a number of companies, including Sony, Hewlett-Packard, Philips, Apple, and Silicon Graphics, but the only interested suitor was Atari, which lent the struggling company $500,000 as part of a set of painful buyout negotiations. Under the contract, Amiga had to pay back the $500,000 by the end of June or Atari would own all of its technology. "This was a dumb thing to agree to, but there was no choice," said Jay Miner, who had already taken out a second mortgage on his house to keep the company going.
Fortunately for Amiga (or unfortunately, depending on how you imagine your alternate histories), Commodore came calling at the last minute with a buyout plan of its own. It gave Amiga the $500,000 to pay back Atari, briefly considered paying $4 million for the rights to use the custom chips, and then finally went all in and paid $24 million to purchase the entire company. The Amiga had been saved, but it now belonged to Commodore.
This concludes the first three parts of our history of the Amiga platform. Watch this space for the next installment, which will cover the Amiga's official launch and its early years.
Stephen
--- GoldED+/BSD 1.1.5
* Origin: -:- Dragon's Lair BBS -:- telnet: bbs.vk3heg.net (39:901/280)