Subject : Re: Can't Avoid That Shit Rust - Even On Gentoo
From : 186283 (at) *nospam* ud0s4.net (186282@ud0s4.net)
Newsgroups : comp.os.linux.misc
Date : 02 Oct 2024, 05:57:28
Organization : wokiesux
Message-ID : <E8-dnc9jS_FeT2H7nZ2dnZfqnPidnZ2d@earthlink.com>
References : 1 2 3 4
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Thunderbird/78.13.0
On 10/1/24 12:41 PM, Charlie Gibbs wrote:
> On 2024-10-01, 186282@ud0s4.net <186283@ud0s4.net> wrote:
>> I think 32-bit is (already) kinda obsolete whether
>> anybody wants to admit it or not.
>>
>> Didn't take long, did it ?
>>
>> LONG back some guy at a computer store (remember
>> those ?) asked me why anybody would WANT 16-bit
>> chips/vars. This was in the c64/Atari-800 days.
>> I told him "graphics !" - and was right.
>>
>> NOW I wonder if 128-bit should be the aim of
>> all future standards. It'd be HARD to use up
>> 128 bits for almost anything.
> In other words, "128 bits ought to be enough for anybody." :-)
It's a really BIG number ...
But there SHOULD be a few 256/512-bit types in ye
olde library :-)
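(For what it's worth, 128-bit integer types already exist natively in
some languages. Here's a minimal sketch in Rust - fitting, given the
thread subject - assuming a current stable toolchain; the num-bigint
crate named in the comments is just one illustrative option for going
wider, not something anyone in the thread suggested.)

  fn main() {
      // Native 128-bit integers ship with the language itself.
      let big: u128 = u128::MAX; // 2^128 - 1
      println!("u128 max = {big}");

      // 256/512-bit fixed-width types are NOT built in; an
      // arbitrary-precision crate such as num-bigint (an assumption,
      // one option among several) covers that ground instead:
      //   let huge = num_bigint::BigUint::from(1u8) << 512;
  }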
Now CPUs ... maybe 128-bit IS what future-lookers
need to switch to right away. Haven't heard many
complaints about 64-bit chips yet, but it doesn't
hurt to plan ahead. Circuitry can be made SO small
now that the extra stuff for 128 bits all the way
through may not be such a burden.
OR ... are 'CPUs' even the bulk of The Future ?
Somehow I see "AI" - implemented on large
distributed systems of diverse composition,
likely even some 'quantum' thrown in - being
the coming thing. They can emulate old CPUs.
Individual users, well, 99.5% of what they run
actually runs in the 'cloud' over 6G and they just
need enough chip to do the pretty graphics.