Subject : Re: The integral type 'byte' (was Re: Suggested method for returning a string from a C program?)
From : janis_papanagnou+ng (at) *nospam* hotmail.com (Janis Papanagnou)
Newsgroups : comp.lang.c
Date : 25 Mar 2025, 19:18:14
Organisation : A noiseless patient Spider
Message-ID : <vrus18$3srn9$1@dont-email.me>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Thunderbird/45.8.0
On 25.03.2025 10:38, David Brown wrote:
> Personally, I think [...]
(I'll skip most of that in your post.)
> Thus pretty much any programmer in the last 50 years sees "byte" as
> synonymous with 8-bit octet, including C programmers,
Be careful when you are not speaking just for yourself, and especially
when you extrapolate over such a long period of time.
50 years ago was 1975 (and about the time I wrote my first programs).
And it was even some years later that I programmed on a CDC 175 or 176,
machines with a word length of 60 bits and 6-bit characters, where
Pascal's 'alfa' data type was a 'packed array [1..10] of char'. (Just
to give an example.) Computer scientists generally had a much broader
view back in those days.
If you had said 40 years ago, about the time when MS-DOS systems
became popular, I would have agreed about the prevalent opinion. OTOH,
with all that popularization a lot of quality degradation entered the
IT scene (at least as far as my observation goes); things were no
longer treated as rigorously as would have been appropriate.
> and for the last
> 30 years or so it has been the ISO standard definition of the term.
I suppose you meant the "ISO _C_ standard definition"?
I'm asking because in my post I was already referring to international
standards (ISO, CCITT/ITU-T, etc.) that have defined 'octet' for the
purpose of unambiguously identifying an 8-bit entity. The term 'octet'
went into the ASN.1 protocol notation (which you will nowadays also
find in IETF's RFC standards).
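(For reference, a minimal C sketch of the distinction; it assumes
nothing beyond a hosted, conforming C implementation. The C standard
only guarantees CHAR_BIT >= 8, so a C 'byte' is an 'octet' in the
ISO/ITU-T sense only where CHAR_BIT happens to be exactly 8.)

  #include <limits.h>
  #include <stdio.h>

  int main (void)
  {
      /* sizeof counts in bytes, and sizeof(char) is 1 by
         definition; CHAR_BIT gives the width of that byte. */
      printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);

      /* A C byte is an octet only where CHAR_BIT is 8. */
      printf("byte == octet here: %s\n",
             CHAR_BIT == 8 ? "yes" : "no");
      return 0;
  }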
Janis
[...]