Aucbvax.2693
fa.works
utzoo!decvax!ucbvax!works
Fri Aug 14 15:30:45 1981
WorkS Digest V1 #6

>From DUFFEY@MIT-AI Fri Aug 14 15:25:28 1981

WorkS Digest            Fri, 14 Aug 1981        Volume 4 : Issue 6

Today's Topics:
         Workstations - IBM's Personal Computer, Micro Benchmarks
----------------------------------------------------------------------

Date: 13 Aug 1981 1159-EDT
From: Willie Lim
Subject: IBM Personal Computer

IBM announced its new 8088-based personal computer yesterday.  The
prices range from $1500 to $6000 (approx.).  It seems that the system
has color graphics.  Does anyone have any more details on the system?
What is the OS on the system, and what are the options?

Willie

------------------------------

Date: 13 Aug 1981 0813-PDT
From: Stevan Milunovic
Subject: Micro Benchmark Units

The units in the benchmark chart sent Aug 12 were bytes for code and
microseconds for execution time.  Sorry for the omission.

-Steve

------------------------------

Date: 14 Aug 1981 0448-PDT
From: SCHIFFMAN at SRI-KL
Subject: Benchmarking new Micros

I forget where I first heard it said that benchmarking was Advanced
Lying With Statistics....

I looked rather carefully at the EDN benchmarks when they first came
out.  I took these more seriously than usual because:

  They were in assembly language; therefore they measure programmer
  skill plus machine performance, as opposed to higher-level-language
  benchmarks, which measure programmer skill plus machine performance
  plus compiler quality.  (Worse yet are benchmarks written in
  different languages for different machines, which throw "language
  quality" into the mix... or benchmarks run on time-sharing systems,
  which end up measuring scheduler fairness and disk performance.)

  They were written by employees of the respective manufacturers, who
  were likely to be skilled with the given architecture.  This also
  removes the possibility of unfair advantage due to hidden
  prejudices.

  A reasonable coverage of routine types was provided, which
  collectively might represent "general performance".  (Including
  interrupt service routines was a good move, for example.)

Nevertheless, the benchmarks were about as useless as benchmarks
usually are!  To pick some specific nits:

  Although there was ONE environmental parameter supplied (the clock
  speed for the given processor), there was no mention of what memory
  performance is required to run at that speed without wait states.
  I believe a 10MHz 8086 can run with memory much slower (and
  therefore cheaper) than a 10MHz 68000 requires.  Nor is it
  mentioned how available the part is at that clock rate.  Did you
  know that a 10MHz 8086 is $200 cheaper than a 10MHz 68000?  Maybe
  if you paid that kind of money to Intel they would sell you a
  16MHz part!

  The benchmark specifications had loopholes in them, which were
  taken (quite understandably) to differing advantage.  For example
  (as best as I can remember), the interrupt service benchmark did
  not specify that context had to be completely saved.  The Intel
  programmers went ahead and saved all registers anyway (a reasonable
  thing for a service routine to do).  The programmers for the other
  machines saved only the minimal context necessary to meet the
  specification.

So much for GOOD benchmarks!  And yet one often hears that "CPU X is
20% faster than CPU Y" based on even less careful comparisons.

{BTW, I'm planning to use the 68000 in my next system.  And I do
think that it is much faster than the 8086 for the things I want to
do.  But it's very likely a bit slower (for what I want) than the
Z8000.  Don't forget that there are other reasons for choosing a CPU
than how fast it goes.}

Most CPU designers, when pressed, will agree that there is no
reasonable way to collect a small set of general metrics that will
characterize machine performance.  Finding the fastest among several
computers "in the same performance class" can only be done by
carefully attempting to model the application for which the machine
is to be used.  If you are lucky, this can be as simple as designing
your program and coding the inner loops for each machine under
consideration.  If you're not so lucky, you might spend a year
building a workload simulator and still not know how different things
will be if you get a different disk controller.

Doing it right, of course, can be very hard.  It's much easier to
refer to a list of how long a quicksort of 100 items takes on every
machine ever invented.
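As a rough illustration of the "code your own inner loops" approach,
here is a minimal, hypothetical C sketch of such a timing harness.
The inner_loop body and the ITERATIONS count are stand-ins for
whatever the real application actually spends its time on, and the
number it reports still folds in compiler quality and system load,
per the caveats above:

    #include <stdio.h>
    #include <time.h>

    #define ITERATIONS 1000000L

    /* Placeholder inner loop: a stand-in for whatever the real
       application actually spends its time doing.  Returning the
       running sum keeps the compiler from discarding the work. */
    static long inner_loop(long n)
    {
        long i, sum = 0;
        for (i = 0; i < n; i++)
            sum += i * 3 + (i >> 2);
        return sum;
    }

    int main(void)
    {
        clock_t start, stop;
        long result;
        double usec_per_iter;

        start = clock();
        result = inner_loop(ITERATIONS);
        stop = clock();

        /* Report microseconds per iteration -- the same units as
           the benchmark chart discussed earlier in this digest. */
        usec_per_iter = (double)(stop - start) * 1.0e6
                        / CLOCKS_PER_SEC / ITERATIONS;
        printf("%.4f usec per iteration (checksum %ld)\n",
               usec_per_iter, result);
        return 0;
    }

Compiled the same way on each candidate machine, the figure it prints
is only as meaningful as the placeholder loop is representative of
the real work.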
Joseph Weizenbaum (in "Computer Power and Human Reason") tells the
story of the drunk repeatedly walking around a lamp post at night.  A
passing policeman asks him for an explanation.  The drunk replies
that he lost his keys--

  Cop:    "Oh, so you lost them under the lamp post?"
  Drunk:  "Naw, lost 'em over there."  (Waving at the distant
          darkness.)
  Cop:    "So why look for them under the lamp?"
  Drunk:  "Silly!  'Cause the light's so much better here!"

-Allan

------------------------------

End of WorkS Digest
*******************
-----------------------------------------------------------------

gopher://quux.org/ conversion by John Goerzen
of http://communication.ucsd.edu/A-News/

This Usenet Oldnews Archive article may be copied and distributed
freely, provided:

1. There is no money collected for the text(s) of the articles.

2. The following notice remains appended to each copy:

The Usenet Oldnews Archive: Compilation Copyright (C) 1981, 1996
Bruce Jones, Henry Spencer, David Wiseman.