Subject : Re: Python recompile
From : bc (at) *nospam* freeuk.com (bart)
Newsgroups : comp.lang.c
Date : 12 Mar 2025, 12:14:08
Organisation : A noiseless patient Spider
Message-ID : <vqrqa1$2itot$1@dont-email.me>
User-Agent : Mozilla Thunderbird
On 12/03/2025 10:37, David Brown wrote:
> On 11/03/2025 22:18, bart wrote:
>> I needed support for big numbers in my interpreter. My product is well-integrated, and represents 5% of the 250KB interpreter size. The above gmp DLL is 666KB so is a poor match.
>
> Out of curiosity - how often are such big numbers needed and used with your interpreter, excluding demos to show how to use big numbers? 64-bit arithmetic is enough for the huge majority of purposes outside of cryptography and mathematics, and 128-bit arithmetic covers almost all of the rest. Your code is not suitable for cryptography or mathematics. So what is it used for? And could it have been handled much more simply by using 128-bit fixed sizes?
How often are big numbers used in Python? There, the integer type transparently overflows into a big integer when needed (so C-style algorithms that rely on results being masked to 64 bits won't work as written).
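For example, a quick Python 3 sketch of that masking point:

    # Python ints grow transparently, so 64-bit wraparound has to be
    # emulated with an explicit mask.
    MASK64 = (1 << 64) - 1

    a = 2**63
    b = 3
    print(a * b)             # 27670116110564327424 (just a bigger int)
    print((a * b) & MASK64)  # 9223372036854775808 (what a wrapping uint64 gives)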
In my language you have to explicitly request the bignum type. It is used mainly for recreational code, or sometimes to get accurate reference results to compare against when working with 64-bit ints and floats.
(How big is a 128-bit range for example? I can just do 'print 2L**128'.)
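In Python the equivalent is just as direct:

    print(2**128)            # 340282366920938463463374607431768211456
    print(len(str(2**128)))  # 39 decimal digits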
I did have 128-bit support in my implementation language once, better than gcc's in C since mine also supported 128-bit literals and Print. I did think about using that instead of, or as well as, bignums, maybe as the default int type instead of 64 bits.
But, as you say, real needs beyond 64 bits are rare. And when the need does arise, the numbers can be arbitrarily large (forget 2**128, what about 2**1000000? Or you might want to look at Fibonacci numbers beyond fib(92), which is the limit with int64).
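To put a number on that fib(92) limit, a small Python sketch (iterative, using the usual fib(0)=0, fib(1)=1 convention):

    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    INT64_MAX = 2**63 - 1
    print(fib(92) <= INT64_MAX)   # True:  7540113804746346429 still fits
    print(fib(93) <= INT64_MAX)   # False: 12200160415121876738 does not

Python, of course, just keeps going past fib(93) without comment.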
In the end I dropped 128-bit numbers, because the only use-case, apart from bragging about it, was supporting 128 bits in the self-hosted compiler.