In alt.folklore.computers John Levine <johnl@taugh.com> wrote:
According to Peter Flass <peter_flass@yahoo.com>:
A lot of older machines were character-addressable. The term “byte” hadn’t
been invented yet. The 1401 (etc), 1620, and many 70xx machines.
Oh sure but I am fairly sure that the 360 was the first machine that
was both character and word addressable with the words at power-of-two
addresses, and a design that allowed word operations to work as a unit
rather than serially by character.
As far as I know it was also the first character addressable binary
machine. The earlier ones were decimal.
The Honeywell 200 had 8-bit memory units and was binary addressable.
While the "intended" computations were decimal, IIRC it had binary
address arithmetic. There were several machines where 4-bit BCD
digits were addressable and which used binary addresses. Those
machines operated serially, but that was mostly a matter of
implementation: machine instructions were defined as operating on
sequences.
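
To illustrate the addressing arithmetic, here is a minimal C sketch
of a digit-addressable store built on binary-addressed 8-bit units
(my own illustration, not modeled on the Honeywell 200 or any other
specific machine):

    #include <stdint.h>

    /* Hypothetical digit-addressable store: 4-bit BCD digits packed
       two per 8-bit memory unit, with ordinary binary addresses. */
    static uint8_t store[4096];

    /* Read the BCD digit at digit address d.  d/2 selects the byte,
       d%2 selects the high or low nibble. */
    uint8_t get_digit(uint32_t d)
    {
        uint8_t b = store[d >> 1];
        return (d & 1) ? (uint8_t)(b & 0x0F) : (uint8_t)(b >> 4);
    }

    /* Write a BCD digit (0-9) at digit address d. */
    void put_digit(uint32_t d, uint8_t v)
    {
        uint8_t *p = &store[d >> 1];
        if (d & 1)
            *p = (uint8_t)((*p & 0xF0) | (v & 0x0F));
        else
            *p = (uint8_t)((*p & 0x0F) | ((v & 0x0F) << 4));
    }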
The 360 was built on the philosophy that putting more circuits into
a machine could be done at reasonable cost and that the extra
circuits were worth it. So the initial idea of 8-bit characters and
24-bit words lost, and 32-bit words won.
The 360 had the goal of offering both low-end and high-end machines.
Good performance at the high end required fixed-length words and at
least 18 address bits, and an address should fit in a word, so a
word length of at least 18 bits. IBM deemed decimal arithmetic
necessary for the commercial market. Theoretically they could have
used character-based arithmetic as in the 1401, but that is
wasteful. Also, the 1401 had 6-bit characters, which were limiting
for some applications, so bigger characters were needed at least as
an option. The more natural choice was therefore BCD and limited
1401 compatibility, and with BCD it is natural to have a word
length divisible by 4. Low-end models were very important to the
profitability of the 360, and at the hardware level they had to
work with a fraction of the word length. That leaves a relatively
small number of reasonable combinations: 24-bit architectural words
with 8- or 12-bit hardware, 32-bit architectural words with 8- or
16-bit hardware, or 36-bit architectural words with 12-bit hardware.
36 bits could give floating-point compatibility with the earlier
"scientific" models, but otherwise I do not see advantages. 24 bits
may be attractive from a purely hardware point of view, but clearly
complicates things for characters bigger than 6 bits. That leaves
32 bits. At least the character-processing instructions need
character addressability, and since there are enough address bits
it is natural to use character addresses for all instructions.
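
To make the packing arithmetic concrete, here is a small C sketch of
the 360-style choice (my own illustration with a made-up address,
not actual IBM material): a 32-bit word holds four 8-bit characters
or eight packed BCD digits, a 24-bit byte address fits comfortably
in a word, and word operands sit at byte addresses that are
multiples of 4.

    #include <assert.h>
    #include <stdint.h>

    enum {
        WORD_BITS  = 32,   /* architectural word          */
        BYTE_BITS  = 8,    /* character/byte              */
        DIGIT_BITS = 4,    /* packed BCD digit            */
        ADDR_BITS  = 24    /* 360 byte-address width      */
    };

    int main(void)
    {
        /* One word holds 4 characters or 8 BCD digits. */
        assert(WORD_BITS / BYTE_BITS  == 4);
        assert(WORD_BITS / DIGIT_BITS == 8);

        /* A 24-bit byte address fits in a 32-bit word/register. */
        assert(ADDR_BITS <= WORD_BITS);

        /* Everything is byte (character) addressed; a word operand
           lives at a byte address that is a multiple of 4, so the
           word index is just the byte address shifted right by 2. */
        uint32_t byte_addr = 0x001234;   /* hypothetical address */
        assert((byte_addr & 0x3) == 0);  /* word aligned         */
        uint32_t word_index = byte_addr >> 2;
        (void)word_index;
        return 0;
    }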
In the seventies demand for at least 7-bit characters was strong and
BCD remained relevant, so even if mainframes had used a different
word length, microprocessors would naturally have tended to use
8-bit bytes, for the same reasons as in the IBM 360 case.
So IBM may have been first with this specific combination of
features, but there were strong reasons for the choice, and since
IBM was one of the leading companies those reasons acted on them
with extra force.
-- Waldek Hebisch