Subject : Integral types and own type definitions (was Re: Suggested method for returning a string from a C program?)
From : janis_papanagnou+ng (at) *nospam* hotmail.com (Janis Papanagnou)
Groups : comp.lang.c
Date : 24. Mar 2025, 16:37:49
Organisation : A noiseless patient Spider
Message-ID : <vrru8f$174q6$1@dont-email.me>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Thunderbird/45.8.0
On 20.03.2025 16:11, bart wrote:
> On 20/03/2025 14:32, Scott Lurndal wrote:
>> bart <bc@freeuk.com> writes:
>>> On 20/03/2025 13:36, Scott Lurndal wrote:
>>>>> then it's surprisingly rare in source code.
>>>>
>>>> Long is useless, because Microsoft made the mistake of defining
>>>> 'long' as 32-bits on 64-bit architectures, while unix and linux
>>>> define it as 64-bits.
>>>
>>> Unix and Linux define it as 32 bits on 32-bit architectures and 64 bits
>>> on 64-bit ones.
>>
>> That's what I said. Thanks for the confirmation. It doesn't change
>> the fact that Microsoft didn't define long as 64-bit on 64-bit
>> architectures, creating incompatibilities that didn't exist in the
>> 32-bit world between the two dominant operating systems.
>>
>> Remainder of bart's typical windows-centric complaints elided.
>
> But your typical anti-Microsoft remarks are fine? Since you called it a
> 'mistake' to keep 'long' the same between 32/64-bit machines, even
> though both OSes kept 'int' the same.
Many things (more or less related) come to mind when reading that.
Of primary interest here is certainly what the "C" standard defines.
It's not that enlightening (IMO) what Microsoft did or does (or Linux);
these are just two [common contemporary] examples.
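Still, to make the concrete difference visible, a trivial probe (just
a sketch, assuming a hosted C99 compiler) prints what a given platform
actually uses:

  #include <stdio.h>

  int main (void)
  {
      /* On 64-bit Windows (LLP64) 'long' is typically 4 bytes, on
       * 64-bit Unix/Linux (LP64) it is 8 bytes; 'int' is 4 on both. */
      printf ("int %zu, long %zu, long long %zu, pointer %zu\n",
              sizeof (int), sizeof (long), sizeof (long long),
              sizeof (void *));
      return 0;
  }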
When I started with "C" or C++ there were not only 8-bit multiples
defined for the integral types; there were 9-bit or 36-bit entities
on some machines. And an 'int' could be 16 or 32 bits (or 36 bits);
'int' reflected (sort of) the "machine register size". The other
types were arranged around it: 'short' not larger than 'int', 'long'
not smaller than 'int'. To leave room for such optimizations it was
even possible for all the numeric integral types to share one and the
same actual size. Unless your development is focused on just a single
machine architecture you may choose the appropriate types by their
specific "C" language names.
In our applications we needed certainty about the actual sizes, so
(as many others did) we introduced our own types; like the entities
that you nowadays find defined in <stdint.h>. (Back in those days no
such standard header was available.)
Janis
[...]
PS: I'm a bit late here, and a week's absence has already let this
thread grow into a tapeworm (as so often). So later responses in this
thread may still be unnoticed by me, or may already have been answered
or commented on.