Subject: Re: Downwardly Scalable Systems
From: not (at) *nospam* telling.you.invalid (Computer Nerd Kev)
Newsgroups: comp.misc
Date: 14 Apr 2024, 23:12:36
Organization: Ausics - https://newsgroups.ausics.net
Message-ID: <661c54d3@news.ausics.net>
References: 1 2 3 4
User-Agent: tin/2.0.1-20111224 ("Achenvoir") (UNIX) (Linux/2.4.31 (i586))
D <nospam@example.net> wrote:
> On Sun, 14 Apr 2024, Computer Nerd Kev wrote:
>> In fact in terms of memory and disk storage GCC keeps going
>> backwards, even for C/C++. Compiling large C/C++ programs with
>> -Os in ever newer GCC versions keeps producing ever bigger
>> binaries for unchanged code. Of course other compilers are
>> available and I'm not sure how other popular ones compare.
>
> Why do they go backwards?
I'd be quite interested to find out as well. When it comes to the
more fine-tuned optimisation options (a set of which -Os enables),
the GCC documentation is often very lacking in detail, especially
about changes between versions.
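One place to start, at least: GCC can report which sub-options a
given level enables. Something like

  gcc -Os -Q --help=optimizers

prints each optimisation flag along with whether it's enabled at
that level. Saving that output from two installed GCC versions and
diffing it would show what actually changed between them, even
where the manual says nothing. (I haven't done that systematically
myself, so treat it as a starting point rather than an answer.)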
> I mean larger binaries must come with some benefit, right?
The benchmarks that they're chasing are for speed rather than
binary size. -Os turns on optimisations which may make a program
run a little slower in return for a smaller binary. My guess is
that the GCC developers aren't very interested in -Os anymore, but
I haven't seen an easy path to understanding exactly why it keeps
getting less effective than it was in earlier versions.
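A crude way to test that guess, assuming two GCC versions are
installed side by side (the gcc-7/gcc-13 names below are just
placeholders for whatever a given distro actually provides):

  /* probe.c - identical source for every compiler under test */
  #include <string.h>

  void clear(char *buf, unsigned n)
  {
      /* At -Os this should stay a plain memset() call; speed-tuned
         defaults may instead inline an unrolled loop. */
      memset(buf, 0, n);
  }

  $ gcc-7  -Os -c probe.c && size probe.o
  $ gcc-13 -Os -c probe.c && size probe.o

Comparing the "text" column from size(1) on a real project, rather
than a toy file like this, would show whether the growth really
comes from the compiler's code generation.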
--
 __ __
#_ < |\| |< _#