Date: Thu, 15 Jul 2021 02:31:11 -0700 (PDT)
Injection-Info: gal.iecc.com; posting-host="news.iecc.com:2001:470:1f07:1126:0:676f:7373:6970"; logging-data="97708"; mail-complaints-to="email@example.com"
Keywords: architecture, history, comment
Posted-Date: 15 Jul 2021 13:22:55 EDT
On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:
> Hello Compiler Experts!
> As I understand it, computers were originally designed to do arithmetic
> computations and in the old days nearly 100% of a CPU's work involved
> arithmetic computations.
It seems that people might have believed that, even for a long time, but
I suspect it was rarely true. There are stories about the IBM 704 Fortran
compiler: the authors believed that they had to make optimal use of the
hardware, or no one would use their compiler. At the time, the competition
was assembly programmers, in some form or other. Then, when they were
testing the compiler, they were surprised to find the generated code doing
things better than they had thought of doing themselves.
Early computers were sold with minimal, if any, software.
Then IBM designed System/360 and OS/360 to go along with it.
About that time (I am sure some will disagree about when) the cost of
writing software surpassed the cost of hardware. So anything that can
reduce the cost of software is worth considering, even if it spends more
hardware. Hence more and more use of high-level languages, even at the
cost of wasted CPU time.
I remember wondering, in the Cray-1 days, whether it was a waste to run a
compiler on a machine designed to be very fast at floating point. It seemed
to me that it would have been better to use a cross compiler, so the Cray's
floating-point hardware could be put to its best use. As far as I know,
that mostly was not done.
> I look at what I now do on a daily basis with computers and it is primarily
> text processing. My guess is that "text processing" at the machine level
> mostly means doing comparisons and moving things into and out of
> memory/registers; that is, not much in the way of arithmetic computations. Is
> that correct?
Good text processing is reasonably numerically intensive. TeX uses dynamic
programming to find the optimal line-breaking points for a paragraph. It is
less thorough about page breaks, as computers weren't so fast at the time
it was written. But computers have gotten much faster since, so the time
that takes has shrunk.
> These days what percentage of a CPU's work involves doing arithmetic
> computations versus other, non-arithmetic computations?
Close to zero. Remember, most of the time the CPU is sitting there waiting
for you to do something. Some systems have an actual "null job" that
accumulates the CPU time not used for anything else. Others don't tell you
about it, but may keep track of how much is used. IBM S/360 processors
have a "wait state" that stops the CPU when there isn't anything to do.
Rental charges depended on how much of the time the machine was actually
computing.
But note also that the power used by CMOS logic (most CPUs today) depends
almost linearly on how much work is being done. The CPU gets much hotter
when it is actually working. This wasn't always true: ECL power use is
almost independent of how much the chip is doing.
[I generally agree except to note that modern PCs and particularly phones
display a lot of high quality images and video, both of which require
extensive arithmetic to get from the internal representation to the bitmap on
the screen. General purpose CPUs have extended instruction sets like
Intel's SSE and AVX, and often there are GPUs on the same chip as the
CPU, as in the Apple M1. I get the impression that compilers don't
deal very well with these things, so vendors provide large libraries
of assembler code to use them. -John]