Subject : Re: Downwardly Scalable Systems
From : nospam (at) *nospam* example.net (D)
Newsgroups : comp.misc
Date : 15. Apr 2024, 11:17:04
Organisation : i2pn2 (i2pn.org)
Message-ID : <1f47f9e0-ad94-817e-3b0f-9ea84f24195d@example.net>
On Mon, 15 Apr 2024, Computer Nerd Kev wrote:
> D <nospam@example.net> wrote:
>> On Sun, 14 Apr 2024, Computer Nerd Kev wrote:
>>>
>>> In fact, in terms of memory and disk storage, GCC keeps going
>>> backwards, even for C/C++. Compiling large C/C++ programs with
>>> -Os in ever newer GCC versions keeps producing ever bigger binaries
>>> for unchanged code. Of course other compilers are available, and I'm
>>> not sure how other popular ones compare.
>>
>> Why do they go backwards?
>
> I'd be quite interested to find out as well. When it comes to the
> more fine-tuned optimisation options (a set of which -Os enables),
> the GCC documentation is often very lacking in detail, especially
> when it comes to changes between versions.
>
>> I mean larger binaries must come with some benefit, right?
>
> The benchmarks that they're chasing are for speed rather than
> binary size. -Os turns on some optimisations which may make a
> program run a little slower in return for a smaller binary. My
> guess is that the GCC developers aren't very interested in -Os
> anymore, but I haven't seen an easy path to understanding why
> exactly it keeps getting less effective than it was in earlier
> GCC versions.
Got it! Thank you for the information. I guess it's similar to the old argument that Emacs is "too big": with today's disks/SSDs it matters less and less.