Subject : Re: Loops (was Re: do { quit; } else { })
From : bc (at) *nospam* freeuk.com (bart)
Newsgroups : comp.lang.c
Date : 17. Apr 2025, 20:18:59
Organisation : A noiseless patient Spider
Message-ID : <vtrk73$19iev$1@dont-email.me>
User-Agent : Mozilla Thunderbird
On 17/04/2025 19:47, Keith Thompson wrote:
> bart <bc@freeuk.com> writes:
>> Or maybe you can tell me where I can buy a computer (not some board
>> with an embedded microcontroller) where a C compiler for it has an
>> 'int' type other than 32 bits (with either fewer bits, or more!).
>
> Not the point. It's about what "int" *means*. It doesn't mean "a
> 32-bit integer".
>
>> At present it only seems to bother you. Since if it REALLY bothered
>> anyone here, nobody would be using 'int'; they'd be using int32_t or
>> long, unless they're happy that their int values and intermediate
>> calculations would be using only 16 bits.
>
> POSIX and Windows both guarantee that int is at least 32 bits wide, and
> it's perfectly reasonable to rely on those guarantees on those platforms.
>
> It would be so easy for you to get this right, but you don't even
> acknowledge the possibility that I might have a valid point.
It's an obscure, pedantic point.
A better one is that, if we consider only Windows and POSIX (what about MacOS or Android?), then you admit that int will be 32 bits (at least that, though I'm not holding my breath for 64).
That means that if either of those OSes is implied, then we can assume, with near-100% likelihood, that 'int' means a 32-bit type.
That saves thousands of posters having to qualify every reference to 'int' with '[on POSIX or Windows]' or 'when it is exactly 32 bits'.
It would be ironic if they had been making that assumption in their source code for years and decades, where it would be critical, but were not allowed to do so in long-running Usenet discussions where it doesn't really matter.
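(Incidentally, in actual source code that assumption costs one line to state explicitly. A minimal sketch, using nothing beyond standard C11 headers, and not taken from anyone's code in this thread:

    #include <assert.h>   /* provides static_assert in C11 and later */
    #include <limits.h>

    /* Refuse to compile at all if 'int' is narrower than 32 bits. */
    static_assert(INT_MAX >= 2147483647,
                  "this code assumes int is at least 32 bits");

With that in place, a 16-bit-int target fails loudly at compile time instead of quietly truncating.)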
So, just as you get tired of me constantly defending my stances against attacks on multiple fronts, I get tired of having to remember that some specialised hardware that runs C may have a 16-bit int or something equally unusual.
In that case, why don't I save time by stating:
* Whenever I mention 'int' type in the context of C, it means the
32-bit type
* If I intend it to mean anything other than that, I will say so.