Subject : Re: VMS
From : c186282 (at) *nospam* nnada.net (c186282)
Groups : comp.os.linux.misc
Date : 27 Jun 2025, 18:24:06
Message-ID : <fPmcnaKaU81_TsP1nZ2dnZfqn_adnZ2d@giganews.com>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Thunderbird/78.13.0
On 6/27/25 3:37 AM, Richard Kettlewell wrote:
candycanearter07 <candycanearter07@candycanearter07.nomail.afraid>
writes:
Robert Riches <spamtrap42@jacob21819.net> wrote at 03:34 this Tuesday (GMT):
<candycanearter07@candycanearter07.nomail.afraid> wrote:
Aren't you supposed to multiply by sizeof as well?
Multiply by sizeof what? sizeof(char)? This was in the
pre-Unicode days. Even now with Unicode, IIUC sizeof(char) is
still always 1.
I still multiply by sizeof(char), half out of habit and half to
make it clear to myself I'm making a char array, even if it's
"redundant". I kinda thought that was the "canonical" way to do that,
since you could have a weird edge case with a system defining char as
something else?
Whatever the representation of char, sizeof(char) is 1. That's how
sizeof is defined: char is the unit it counts in.
From the language specification:
When sizeof is applied to an operand that has type char, unsigned
char, or signed char, (or a qualified version thereof) the result is
1. When applied to an operand that has array type, the result is the
total number of bytes in the array. When applied to an operand that
has structure or union type, the result is the total number of bytes
in such an object, including internal and trailing padding.
A programmer can adopt a personal style of redundantly multiplying by 1
if they like, it’ll be a useful hint to anyone else reading the code
that the author didn’t know the language very well. But in no way is
anyone ‘supposed’ to do it.
"Best practice" sometimes means a little bit of
redundant/clarifying code.
Some of us are old enough to remember when CPUs were
not always 4/8/16/32/64 ... plus even now they've
added a lot of new types like 128-bit ints. Simply
ASSUMING an int is 16 bits is 'usually safe' but
not necessarily 'best practice' and limits future
(or past) compatibility. 'C' lets you fly free ...
but that CAN be straight into a window pane :-)