Subject : Re: Hex string literals (was Re: C23 thoughts and opinions)
From : ldo (at) *nospam* nz.invalid (Lawrence D'Oliveiro)
Newsgroups : comp.lang.c
Date : 21 Jun 2024, 08:13:00
Organization : A noiseless patient Spider
Message-ID : <v5395s$3280j$1@dont-email.me>
User-Agent : Pan/0.158 (Avdiivka; )
On Wed, 19 Jun 2024 10:49:24 +0200, David Brown wrote:
> On 19/06/2024 09:25, Lawrence D'Oliveiro wrote:
>
>> On Tue, 18 Jun 2024 15:54:15 +0200, David Brown wrote:
>>
>>> ... C++ could not use underscores due to their use in user-defined
>>> literals, and C followed C++.
>>
>> C can still offer the option for them, though.
>
> Sometimes it makes sense for C to do the same thing in a different way
> from C++ - but it is rare, and needs very strong justification.
The fact that it is something of a de-facto standard among other popular
languages would count.
Is C doomed to remain forever a strict subset of C++?
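
For what it's worth, here is roughly how the two notations compare. This
is only a sketch: the "_000" suffix and the example values below are
invented for illustration, not taken from anyone's real code.

#include <cstdio>

// C++14 digit separator; C23 adopted the same apostrophe syntax:
constexpr unsigned long long world_pop = 8'019'876'189ULL;
constexpr unsigned int       hi_mask   = 0xFF'FF'00'00u;

// Why C++ could not take the underscore: since C++11 a suffix beginning
// with '_' introduces a user-defined literal, so 1_000 is not the
// number 1000 at all -- it is a call to operator""_000 with argument 1.
constexpr long operator""_000(unsigned long long n)   // invented suffix
{
    return static_cast<long>(n * 1000);
}

int main(void)
{
    long x = 1_000;   // resolves to operator""_000(1), i.e. 1000, by accident
    std::printf("%llu %u %ld\n", world_pop, hi_mask, x);
    return 0;
}

Languages that did pick the underscore (Java, Python, Rust, C#, Ada)
write the same value as 8_019_876_189, which is the de-facto standard
referred to above.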