Subject : Re: Python recompile
From : bc (at) *nospam* freeuk.com (bart)
Newsgroups : comp.lang.c
Date : 18. Mar 2025, 11:59:27
Organisation : A noiseless patient Spider
Message-ID : <vrbjme$2bne2$1@dont-email.me>
User-Agent : Mozilla Thunderbird
On 18/03/2025 09:53, Muttley@DastardlyHQ.org wrote:
> On Mon, 17 Mar 2025 17:10:59 +0000
> bart <bc@freeuk.com> wibbled:
>> On 17/03/2025 16:32, Muttley@DastardlyHQ.org wrote:
>>> On Mon, 17 Mar 2025 14:25:46 +0000
>>> bart <bc@freeuk.com> wibbled:
>>>> On 17/03/2025 12:07, Muttley@DastardlyHQ.org wrote:
>>>>> Anything C could do so long as you don't include all the standard C
>>>>> libraries in "anything".
>>>>
>>>> Another mysterious remark. You seem to consider it your job to put down
>>>> anything I do or say!
>>>>
>>>> So, what do the standard C libraries have to do with anything here?
>>>
>>> They're generally the interface to the OS on *nix. No idea about windows.
>>
>> I think you can assume that the tool I used was up to the job.
>
> I'm assuming nothing since all we have is your word for it.
>
>>> So presumably your amazing build system checks the current module build
>>> dates and doesn't rebuild stuff that it doesn't have to?
>>
>> Why would it matter? I can compile code at one million lines every two
>> seconds, and my largest project is 50K lines - do the math.
>
> You'll have to excuse me if I take that figure with a large packet of salt
> unless the code does nothing particularly complicated.
If you don't believe my figures, try Tiny C on actual C programs.
Tiny C is single-pass; mine does multiple passes, so it is a little slower.
What the code does is not that relevant:
c:\cx\big>tim tcc fann4.c
Time: 0.855
c:\cx\big>dir fann4.exe
18/03/2025 10:44 10,491,904 fann4.exe
So tcc can generate 12MB per second in this case, for a test file of nearly 1M lines.
What you should find harder to believe is this figure:
c:\cx\big>tim gcc fann4.c
Time: 50.571 (44.2 on subsequent build)
c:\cx\big>dir a.exe
18/03/2025 10:51 9,873,707 a.exe
gcc can only manage 0.2MB per second for the same quality of code.
How about making such compilers faster first before resorting to makefile tricks?
Here is my C compiler on the same task:
c:\cx\big>tim bcc fann4
Time: 1.624
c:\cx\big>dir fann4.exe
18/03/2025 10:55 6,842,368 fann4.exe
Throughput is only 4MB/second, but it is generating a smaller executable.
>> I find it astonishing that even with machines at least a thousand times
>> faster than I've used in the past, you have to resort to tricks to avoid
>> compilation.
>
> Why not?
You're missing the point. I mentioned a throughput of 500Klps above; divide that by 1000, and it means a machine from 40 years ago was able to build programs at 500 lines per second, which seems plausible.

So what do you think is a more realistic figure for today's machines: 20Klps for an unoptimised build? (The gcc test managed 22Klps.) That would mean a compilation speed of 20 lines per second on an early-80s PC, which is ludicrous.
Something is badly wrong.