Dave Patterson

by Dave Patterson
patterson@cs.berkeley.edu
Professor in Computer Science at UC Berkeley

September 25, 1998

[Italic comments are from Ray Holt]

While I was a grad student I worked at Hughes Aircraft on the computer for the radar system, starting in 1971 or 1972, so reading this document brought back old memories.

Notice that he has lots of references to microprogramming in the document, which is typical of that era. (The aerospace computers I did were microprogrammed too.) The word “microprocessor” was used before the 4004 by some people of the time to mean the processor in a microprogrammed computer, so it’s not shocking to see the word microprocessor in a document of that era. Intel kind of used the term ambiguously, but it was a single-chip CPU. [Ray – Why does a microprocessor have to be single chip? It appears that is revisionist thinking.]

What this document describes is a microprogrammed set of custom LSI chips that can be assembled into multiple configurations and then microprogrammed, apparently in binary, to perform some application. [Ray – All computers are binary programmed until software tools are developed.]

I think the minimum system has 3–4 chips for the CPU, but I’m not positive from the material I read; you could use many more to get more performance. It depends whether you include the ROM (microcode memory) and the RAM (“RAS”) as part of the CPU. Then it would be at least 2 more chips (5–6 total) in the CPU. It’s plausible to include them, since the RAM only had 16 words, so it’s like a register file, and the microcoded CPUs of the time would include the ROM in the CPU. His design scaled, so there could also be lots more chips in his computer without it being a multiprocessor. The designer just had to write the microcode that made it all work. [Ray – The multiple chips allow this chip set to operate with parallel execution, which might be called co-processing, but whatever the term used, it was decades ahead of other applications. Intel apparently does not include ROM and RAM in its CPU definition.]
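For readers unfamiliar with the term: in a microprogrammed CPU, each operation is carried out by a sequence of micro-instructions fetched from a control ROM and applied to a small register file. The C sketch below illustrates only that general idea; the micro-operation names, instruction fields, and the program in the ROM are hypothetical, not the CADC’s actual microinstruction format.

    /* Toy model of a microprogrammed CPU (hypothetical, not the CADC):
       micro-instructions fetched from a control ROM drive operations
       on a small register file. */
    #include <stdio.h>
    #include <stdint.h>

    #define NUM_REGS 16  /* a 16-word RAM acting as a register file */

    enum { UOP_LOAD_IMM, UOP_ADD, UOP_HALT };  /* hypothetical micro-ops */

    typedef struct {
        uint8_t op;        /* which micro-operation */
        uint8_t dst, src;  /* register-file indices */
        int16_t imm;       /* immediate operand */
    } MicroInst;

    /* The "ROM": a fixed microprogram that computes 2 + 3 into r0 */
    static const MicroInst rom[] = {
        { UOP_LOAD_IMM, 0, 0, 2 },  /* r0 <- 2 */
        { UOP_LOAD_IMM, 1, 0, 3 },  /* r1 <- 3 */
        { UOP_ADD,      0, 1, 0 },  /* r0 <- r0 + r1 */
        { UOP_HALT,     0, 0, 0 },
    };

    int main(void) {
        int16_t regs[NUM_REGS] = {0};  /* the register-file-like RAM */
        size_t upc = 0;                /* micro-program counter */
        for (;;) {
            MicroInst mi = rom[upc++];          /* fetch from microcode ROM */
            switch (mi.op) {
            case UOP_LOAD_IMM: regs[mi.dst] = mi.imm; break;
            case UOP_ADD:      regs[mi.dst] += regs[mi.src]; break;
            case UOP_HALT:     printf("r0 = %d\n", regs[0]); return 0;
            }
        }
    }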

You should classify it as a microprogrammed special purpose computer, using a variable number of custom chips packaged as DIPs that could provide good performance in a small footprint. A classic aerospace thing to do, although most aerospace engineers of the time would design computers using standard TTL chips to reduce development costs. [Ray – “The thing to do” is just the point.  Any engineer would recognize the final design fits the specification which was not a desktop calculator. Apparently a working microcomputer system on 14 sq inches working in a military environment is the same as standard TTL.]

I agree that it’s likely that the 4004 wasn’t fast enough [Ray – nor did it exist until two years later]; it had to fit into a single chip, and that meant sacrificing performance to make it fit [Ray – The 4004 chip set was NOT single chip, nor was it intended to be]. Holt used the technology to solve a fixed problem, and that problem wasn’t a desktop calculator (which led to the 4004) but signal processing for the F14A. [Ray – The 4004 was TWO years later, so what is the point? This microprocessor used the EXACT same technology as the later 4004, operated at mil-spec, had denser chips, and worked. If the 4004 was an accomplishment, then this was a great accomplishment.]

No way is Holt’s computer a microprocessor, using the word as we mean it today. [Ray – Apparently, Intel disagreed, and “today” is 30 years later.] Arguing that it is amounts to revisionist history, trying to claim a glory that isn’t deserved. Unless there is some patent deal going on, I don’t know why people would do this. Holt must understand the real issues; he looks to be a good computer engineer. [Ray – Mr Patterson, what is your point? What glory? Defending the United States with a high-tech microprocessor? Revisionist? How? Two years before the 4004 is revisionist? Why not look at this design for what it is instead of defending some design that was not even on the drawing board at this time? This design is probably worth mentioning in a textbook on architecture.]

Hughes actually did a wafer-scale integrated CPU a couple of years later, which I worked on, and it was probably one of the first real ones, but who cares? There was no path from it to anything that had commercial impact, and it’s not like engineers at commercial companies were studying aerospace computers to get ideas to steal and put into their commercial computers. [Ray – Did the Hughes “wafer” work? Was it published? Or is it just talk and paper design?] [Ray – See my Legacy link to see if this design had any impact on the microprocessor world. There might be some big surprises.]

Dave

[Ray – It appears that Mr Patterson published a book in 1997, “Computer Organization and Design: The Hardware/Software Interface”, and the CADC announcement made the book obsolete. It is also interesting that none of his later books mention this major accomplishment. Why would Mr Patterson ignore one of the major accomplishments in computer history? I am sure he understands the real issues; he seems to be a real professor. I don’t know why professors do this.]

[Ray – I also find it interesting that Mr Patterson has never replied to my comments.  Maybe his knee-jerk reaction to this major accomplishment was too sensitive and embarrassing to his Emeritus career. Comments are still welcome.] August 6, 2016
