Blog: "Early research on economies of scale for computer systems"

I found this rather interesting, and it has good links. It’s all about whether it’s always most efficient to buy the biggest computer you can, as opposed to buying several smaller ones. And it’s a historical take.

… the observed fact that the cost of compute was decreasing over time is unrelated to the claim that the cost of compute decreases as the size of the computer increases.

The plot below uses Knight’s 1953-1961 data, and shows operations per second against seconds per dollar (a confusing combination, but what Knight used), with fitted regression lines for three years using Knight’s model.
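In case it helps to see how a fitted exponent comes out of data like this, here is a minimal sketch in Python. It uses made-up (cost, performance) pairs rather than Knight’s actual data, and fits the usual Grosch-style power law performance = a * cost^b by regression in log-log space; this is not necessarily the exact model the blog post fits.

```python
# Hypothetical sketch: fitting a Grosch-style power law
#   performance = a * cost^b
# by linear regression in log-log space. The numbers below are made up for
# illustration and are not Knight's data.
import numpy as np

cost = np.array([50e3, 200e3, 1e6, 3e6, 10e6])        # dollars
ops_per_sec = np.array([1e3, 1.5e4, 4e5, 3e6, 4e7])   # operations per second

# Fit log(perf) = log(a) + b * log(cost); b is the fitted exponent.
b, log_a = np.polyfit(np.log(cost), np.log(ops_per_sec), 1)
print(f"fitted exponent b = {b:.2f}")  # Grosch's law corresponds to b close to 2
```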

One of the links is to a previous post on the blog, where we get links to two papers by Knight:

Changes In Computer Performance and Evolving Computer Performance 1963-1967, by Kenneth Knight, are the references to cite when discussing the performance of early computers… Both papers were published in Datamation, a computer magazine whose technical contents could rival that of the ACM journals in the 1960s, but which later became more of a trade magazine.

Both papers contain lots of interesting performance and cost data on computers going back to the 1940s. … This week I found high quality OCRed copies of the papers on the Internet Archive; my effort was reduced to fixing typos, which felt like less work.

See DOI:10.1364/josa.43.000306 for Grosch’s 1953 High Speed Arithmetic: The Digital Computer as a Research Tool, mentioned below:

It so happens that the value of Knight’s fitted exponent is close to that proposed in a 1953 paper (“High speed arithmetic: The digital computer as a research tool”, no online copy):

  It used to cost one cent to do a multiplication on a
  desk calculator; now it is more like four cents; but
  with these big machines we can do a million in an hour
  for $400, and that means twenty-five multiplications
  for a cent! I believe that there is a fundamental rule, 
  which I modestly call Grosch's law, giving added
  economy only as the square root of the increase in
  speed-that is, to do a calculation ten times as cheaply
  you must do it one hundred times as fast.

which did indeed become widely known as Grosch’s law.
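As a quick sanity check on the arithmetic in the quote, and on how the “square root” statement is usually restated, here is a small sketch; the numbers come straight from the quote, and the algebra is the standard reading of the law rather than anything Grosch wrote out:

```python
# "a million in an hour for $400" -> multiplications per cent
mults_per_cent = 1_000_000 / (400 * 100)  # $400 is 40,000 cents
print(mults_per_cent)  # 25.0, i.e. "twenty-five multiplications for a cent"

# "added economy only as the square root of the increase in speed":
# cost per calculation scales as 1 / sqrt(speed)
for k in (10, 100):
    print(f"{k}x as fast -> {k ** 0.5:.1f}x as cheap per calculation")

# Equivalently, price per unit time ~ (cost per calculation) * speed ~ sqrt(speed),
# so speed ~ price^2: doubling the price should quadruple the performance.
```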


Might as well drop a link to Gordon Bell’s discussion of Grosch’s Law, which I happened upon in the video of the lecture he gave at MIT in 1972 on “The Future of Computers”. I’m fairly sure that the ‘empirical studies’ he names are the Knight papers, because they’re both in the Computer Structures bibliography and “Changes in Computer Performance” is cited on pp. 50-51 (and AFAICT there doesn’t seem to be much else he could be referring to).

Transcript excerpt

There’s another law, in comp … that’s never been proven, with respect to computers, and that is that you move in … as you move, as you compare two machines of a given time, then this goes up, this has to be at … let’s see … I guess, a slope like … like this, namely that performance is equal to the cost of the machine squared, so at a given point in time, you … as you double the price of the machine you must go up by a factor of four in performance. That’s a phenomenon known as Grosch’s Law, in which he formulated it to say that the more … the more money you put into a computer, you should get four times, or you double the price, you should quadruple the performance. And there’s been lots of empirical work, there’s been some empirical work to decide what that is, and … To a certain extent that’s a pricing problem too, because people may price according to that law, too. But in general … yeah, there’s a point that we can put more and more in a machine and get more out until it reaches this point of diminishing returns. And, or as you hit a limit in the technology, you just can’t build the machine. And that’s what’s happening with some of these very large machines. That … they—everything you can put in because of the generality pays, but the trouble is, something hits them before they can get all of that junk in, that is, light, speed of light, you know, the power supply, all these dumb things [laughs] that have no business of going after them, but … but, with large machines, it’s usually those … mechanical considerations, physical considerations that limit long before you get all of the stuff in that you really would like to put in.


Logic is only half the cost; you need ample memory to run programs. The other thing is that you need to normalize costs to the same word length.
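A minimal sketch of the kind of normalization I mean; the machines, prices, speeds, and the linear-in-word-length scaling are all made up for illustration, not taken from Knight’s papers:

```python
# Hypothetical sketch: compare cost per unit of work after normalizing for
# word length. All figures and the scaling rule are assumptions for
# illustration, not Knight's actual normalization.
machines = [
    # (name, price in dollars, additions per second, word length in bits)
    ("A", 250_000, 20_000, 36),
    ("B", 100_000, 12_000, 18),
]

for name, price, adds_per_sec, word_bits in machines:
    # Crude normalization: count an n-bit addition as n/36 of a "standard"
    # 36-bit addition, so short-word machines are not overrated.
    normalized_rate = adds_per_sec * (word_bits / 36)
    print(f"{name}: ${price / normalized_rate:,.2f} per normalized add/sec")
```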