Subject : Re: Hex string literals (was Re: C23 thoughts and opinions)
From : david.brown (at) *nospam* hesbynett.no (David Brown)
Newsgroups : comp.lang.c
Date : 21. Jun 2024, 12:06:14
Organisation : A noiseless patient Spider
Message-ID : <v53mr6$34g49$3@dont-email.me>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Thunderbird/102.11.0
On 21/06/2024 09:13, Lawrence D'Oliveiro wrote:
> On Wed, 19 Jun 2024 10:49:24 +0200, David Brown wrote:
>> On 19/06/2024 09:25, Lawrence D'Oliveiro wrote:
>>> On Tue, 18 Jun 2024 15:54:15 +0200, David Brown wrote:
>>>> ... C++ could not use underscores due to their use in user-defined
>>>> literals, and C followed C++.
>>>
>>> C can still offer the option for them, though.
>>
>> Sometimes it makes sense for C to do the same thing in a different way
>> from C++ - but it is rare, and needs very strong justification.
>
> The fact that it is something of a de-facto standard among other popular
> languages would count.

The apostrophe was already the standard - not just a "de-facto standard" - in the language that is most relevant for cooperation with C.
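
As a minimal illustration (the values here are arbitrary), this is what the
separator looks like in C++14 and, now, in C23:

    unsigned long long flags = 0xFFFF'0000'00FFull;  /* hex with ' separators */
    int population = 1'000'000;                      /* decimal */
    unsigned bits = 0b1010'0110;                     /* binary literal: C++14 / C23 */
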
> Is C doomed to remain forever a strict subset of C++?

C is not a subset of C++. Their intersection covers most of C, but not all of it.
But C and C++ are often used together and compiled together in the same binaries. A large proportion of C and C++ programmers work with both languages, while almost none of them have any use for, say, Ada with its underscore digit separator.
It makes sense when introducing new features to either language to be compatible with the other (if the feature is relevant to both languages). C thus copies from C++, and C++ copies from C. Sometimes there must be differences, but gratuitous differences are bad for everyone, even if they might seem a little nicer in one language in isolation.
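
For anyone who hasn't seen why the underscore was off the table in C++, here
is a minimal sketch (the suffix name _kB is invented for the example):

    /* A user-defined literal suffix may begin with an underscore, so a
       token like 1_000 already parses as the integer literal 1 with the
       ud-suffix "_000" - it is not available as a separated number. */
    constexpr unsigned long long operator"" _kB(unsigned long long n)
    {
        return n * 1024;
    }

    constexpr auto size = 64_kB;     /* 65536 */
    /* constexpr auto x = 1_000; */  /* would need a literal operator "_000" */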