On 2/28/25 9:19 AM, Scott Lurndal wrote:
> c186282 <c186282@nnada.net> writes:
>> On 2/26/25 7:22 PM, Rich Alderson wrote:
>>> Lawrence D'Oliveiro <ldo@nz.invalid> writes:
>>>> On Tue, 25 Feb 2025 15:47:15 -0700, Peter Flass wrote:
>>>>> The idea of defining different-sized bytes is a real plus.
>>>> What they meant by "bytes" was really just "bitfields".
>>> Look up the original definition of "byte" in the signal processing
>>> literature, and you'll find that "arbitrary bitfield" is the
>>> original meaning, dude.
>>> The restriction of "byte" to "bitfield of a particular size for the
>>> manufacturer's architecture, especially 8 bits" is the odd choice.
>> 8 bits kinda emerged with microprocessors.
> Surely the IBM 360 was responsible for the popularity
> of 8-bit bytes - and that drove the adoption by other
> computer manufacturers, if only to support common I/O
> peripherals.
IBM was, and to a degree still is, a big and influential
player. Lots of 360/370 systems were sold - and you
saw 'em at big govt / NASA installations, which
amounted to 'prestige' and a sort of 'standard'.
They were good boxes.
Came across a 360 (model 20? the little one) in
use by a parts distributor about 20 years ago.
Nowadays they're always emulated, like with the
Hercules system - probably for the best, since
I doubt you'll get parts/service for yer ancient
360/370 now.
Bytes were bound to migrate towards some power
of two. Six or seven bits ... nah. 8 was kind of
the useful minimum. 4 bits was for microcontrollers
and some BCD calculations. 12- and 18-bit words were
seen in various minis/mainframes for quite a while.
Hexadecimal is handy and fairly comprehensible -
better than octal IMHO. 8/16/32/64 works out very
well with hex, since each hex digit is exactly
four bits, i.e. two digits per 8-bit byte.
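
A little C sketch to make that concrete (my own
illustration, nothing from upthread): a 32-bit word
prints as exactly 8 hex digits, two per byte, so the
bytes fall out by eye, while octal digits carry 3 bits
each and straddle every byte boundary.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t word = 0xDEADBEEF;   /* arbitrary 32-bit example */

        /* 8 hex digits: 4 bits per digit, 2 digits per byte */
        printf("hex:   %08X\n", (unsigned)word);

        /* 11 octal digits: 3 bits per digit, none byte-aligned */
        printf("octal: %011o\n", (unsigned)word);

        /* peeling out bytes is a whole-digit operation in hex */
        printf("bytes: %02X %02X %02X %02X\n",
               (unsigned)(word >> 24) & 0xFF,
               (unsigned)(word >> 16) & 0xFF,
               (unsigned)(word >> 8)  & 0xFF,
               (unsigned)word & 0xFF);
        return 0;
    }

The hex line reads DEADBEEF and splits into DE AD BE EF
without any arithmetic; the octal line (33653337357)
doesn't break anywhere useful.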