Subject : Re: evolution of bytes, The joy of FORTRAN
From : antispam (at) *nospam* fricas.org (Waldek Hebisch)
Groups : alt.folklore.computers comp.os.linux.misc
Date : 03. Mar 2025, 02:50:33
Organisation : To protect and to server
Message-ID : <vq31t7$14njc$1@paganini.bofh.team>
References : 1 2 3 4 5 6 7
User-Agent : tin/2.6.2-20221225 ("Pittyvaich") (Linux/6.1.0-9-amd64 (x86_64))
In alt.folklore.computers Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
> On Sun, 2 Mar 2025 14:58:22 -0000 (UTC), Waldek Hebisch wrote:
>> IBM deemed decimal arithmetic to be necessary for the commercial market.
> Interesting to see a recognition nowadays that even scientific users might
> be interested in decimal arithmetic, too. Look at how the latest version
> of the IEEE 754 spec includes decimal data types in addition to the
> traditional binary ones.
Pushing decimal numbers into modern hardware is practically idiocy.
Basically, IBM wanted a selling point, so they pushed for inclusion
in the standards. And apparently Intel decided to hedge its bets,
so instead of opposing the IBM standard they extended it with their
own variation. Actually, thanks to Intel the standard is clearly
unworkable; if Intel wanted to kill it, it would be hard to
do better.
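
As an aside, the behaviour those decimal types are meant to address is
easy to show entirely in software. A rough sketch in Python (its decimal
module does software decimal arithmetic; no decimal hardware involved):

    from decimal import Decimal

    # Binary floating point cannot represent 0.1 or 0.2 exactly.
    print(0.1 + 0.2)                        # 0.30000000000000004
    # Software decimal arithmetic gives the expected answer.
    print(Decimal("0.1") + Decimal("0.2"))  # 0.3

In other words, decimal semantics by themselves do not require decimal
hardware.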
To put it differently, modern processes allow putting quite a lot
of transistors on a single chip. Manufacturers are not sure how
to use them efficiently, and due to heat they cannot use all
transistors simultaneously. So there is a temptation to include
features giving marginal benefits, like decimal hardware.
Concerning commercial use, the 1401 was intended to replace card
equipment, which strongly influenced its architecture. To
have a reasonable migration path from the 1401, the 360 series needed
decimal hardware. But even in 1965 there were good reasons to
use binary for commercial data processing. Namely, in
typical cases most data was in "master files", stored either
on tape or disks. Keeping master files in binary would
give better density and consequently faster I/O, and computation
in binary was faster. Other input and output, such as to printers,
had smaller volume, so it could use conversion. Of course,
actual usage varied between sites, and on low-end machines the
relative cost of binary/decimal conversion was higher than
on high-end machines, but the tendency was to have most compute
power in high-end machines. In modern times, cases where
decimal hardware makes a difference are rare (if any).
Or more precisely, users want to run unmodified old programs
which use decimal instructions at high volume. But
recompilation could replace decimal instructions with
binary ones at a modest performance loss (and most of this
loss would be recovered due to the ability to run on faster
hardware). And in modern times such code tends to be a
tiny part of the actual computational load in commercial
data processing.
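
To make "binary inside, conversion only at the edges" concrete, here is a
rough sketch in Python (a hypothetical example, not any particular site's
code): keep amounts as plain binary integers of cents and convert to
decimal text only when printing:

    # Hypothetical illustration: amounts stored as binary integers of cents.
    balance_cents = 1999              # $19.99
    deposit_cents = 250               # $2.50

    balance_cents += deposit_cents    # arithmetic stays ordinary binary
    dollars, cents = divmod(balance_cents, 100)
    print(f"${dollars}.{cents:02d}")  # decimal conversion only at output
    # prints: $22.49

The only decimal work is the final conversion for output, which is the
low-volume part.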
--
Waldek Hebisch