On 17/04/2025 03:18, Keith Thompson wrote:
> bart <bc@freeuk.com> writes:
>> [...] The increment is part of its behaviour. At the language level,
>> I could explain how FOR is implemented, but I'd rather not do that.
>
> Whereas for a language like C which is defined by a written standard,
> the standard *must* specify the behavior at the language level
> (not in terms of generated intermediate or machine code).

At the language level, the behaviour of a FOR loop variable iterating
over A to B inclusive is simply that the variable assumes each of
those values in turn. It doesn't say how it does that.

However, it becomes more important if the language has anything to say
about the value of the loop variable after the loop terminates (when
its scope is not limited to the loop body, as is the case in my
language).

Then the value might be B + step or B - step, depending on which way
it iterates, under the current implementation.

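As a sketch of what that can look like (a hypothetical C rendering of
the lowering, not the actual generated code), an inclusive loop over a
to b with step s tests before incrementing, so on exit the variable
holds one step past the last value:

    #include <stdio.h>

    /* Hypothetical lowering of an inclusive "for i := a to b by s" */
    int main(void) {
        long long a = 1, b = 5, s = 1;
        long long i = a;
        while (i <= b) {
            /* loop body would go here */
            i += s;          /* increment runs after the body */
        }
        printf("%lld\n", i); /* prints 6, i.e. b + s */
        return 0;
    }
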
> The fact that
>
>     for (int i = INT_MAX-1; i <= INT_MAX; i++);
>
> has undefined behavior can be rigorously inferred from the language
> definition.

A note that upper and lower limits must be in the range i64.min + s
.. i64.max - s (where s is the step size) can suffice.

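(The reason that example is undefined: for an int i, i <= INT_MAX is
always true, so the only way out of the loop is for i++ to overflow,
and signed overflow is undefined in C.) The limit rule above can be
checked once before the loop starts; a minimal sketch, with the name
and exact test being mine for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative guard: both limits must lie within
       i64.min + s .. i64.max - s, assuming step s > 0. */
    static int range_ok(int64_t lo, int64_t hi, int64_t s) {
        return lo >= INT64_MIN + s && lo <= INT64_MAX - s
            && hi >= INT64_MIN + s && hi <= INT64_MAX - s;
    }

    int main(void) {
        printf("%d\n", range_ok(0, INT64_MAX, 1));     /* 0: rejected */
        printf("%d\n", range_ok(0, INT64_MAX - 1, 1)); /* 1: accepted */
        return 0;
    }
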
> You know, of course, that C's int type is not necessarily 32 bits.
>
>> What's far more useful is that it naturally works with i64 so has
>> limits 4 billion times bigger than you get with C's default 'int'
>> type.
>
> There is no "default" for the width of int. If you had written "with
> 32-bit int", that would have been clear and correct.

I think it's perfectly reasonable for such discussions to assume 'int'
is 32 bits, and to qualify it only when it can matter.

For computers in general, that assumption is likely to have held for
at least 20 years, possibly even 30.

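Anyone who wants that assumption checked rather than silently relied
on can do it in one line at file scope (C11's _Static_assert; testing
INT_MAX rather than sizeof avoids quibbles about padding bits):

    #include <limits.h>

    /* Fails to compile wherever INT_MAX is not 2^31 - 1, i.e.
       wherever int does not have the usual 32-bit range. */
    _Static_assert(INT_MAX == 2147483647,
                   "this code assumes 32-bit int");
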
Or maybe you can tell me where I can buy a computer (not some board
with an embedded microcontroller) for which a C compiler has an 'int'
type other than 32 bits (with either fewer bits, or more!).

At present it only seems to bother you. If it REALLY bothered anyone
here, nobody would be using 'int'; they'd be using int32_t or long,
unless they're happy that their int values and intermediate
calculations might be using only 16 bits.
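To make the 16-bit point concrete (a small sketch; 40000 is just a
value that exceeds the guaranteed 16-bit range of int):

    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        int     a = 40000;  /* not portable: int may be only 16 bits,
                               and 40000 > 32767 */
        int32_t b = 40000;  /* always fits: exactly 32 bits */
        long    c = 40000L; /* always fits: long is at least 32 bits */
        printf("%d %" PRId32 " %ld\n", a, b, c);
        return 0;
    }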