Subject : Re: VMS
From : c186282 (at) *nospam* nnada.net (c186282)
Newsgroups : comp.os.linux.misc
Date : 01. Jul 2025, 04:12:21
Message-ID : <m9idnVZaSs-Kz_71nZ2dnZfqn_SdnZ2d@giganews.com>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Thunderbird/78.13.0
On 6/30/25 3:36 AM, The Natural Philosopher wrote:
> On 30/06/2025 00:09, c186282 wrote:
>>>> These days you CAN 'usually' get away with assuming an
>>>> int is 16 bits - but that won't always turn out well.
>>> I thought the default int was 32 bits or 64 bits these days.
>>> ISTR there is a definition of uint16_t somewhere if that is what you want.
>> Well, you can TEST that easily enough with your
>> favorite compiler. Declare an unsigned int, init
>> to zero, then count up until it wraps back to zero.
>> With MOST compilers, I think, ints are still almost
>> always 16-bits as a holdover from the good old days.
>> You can declare long and long long ints of course,
>> but for a plain int, expect it to be 16-bit.
>> Actually it's become rather annoying ... seems
>> like there are way TOO many 'types' these days.
>> Everybody invents new ones, and then there's the
>> ones M$ invents. Often the same thing under many names.
>> Anybody for int8, int16, int32, int64, int128
>> and that's that ??? It'd make things LOTS easier,
>> a lot less conversion/casting involved.
>> Guess we'll have to add int256 ... but keep the
>> naming simple and no BS.
> A rapid google shows no one talking about a 16-bit int. Today it's reckoned to be 32-bit.
> But if it matters, use int16_t or uint16_t.
> I can find no agreement as to what counts as a short, long, or int, at all.
> If it matters, use the length-specific type names.
My gripe exactly ... the plethora of 'types', the
evolution of chips, it's just TOO MUCH these days.
More chances to screw up for no good reason.
The 'programming community' needs to fix this, no
external force can. AGREE on plain clean obvious
type defs and USE them everywhere.