Subject : Re: Remember "Bit-Slice" Chips ?
From : 186283 (at) *nospam* ud0s4.net (186282@ud0s4.net)
Newsgroups : comp.os.linux.misc
Date : 12 Dec 2024, 05:14:18
Organisation : wokiesux
Message-ID : <loGcnS4oQKCg_sf6nZ2dnZfqnPGdnZ2d@earthlink.com>
User-Agent : Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Thunderbird/78.13.0
On 12/11/24 11:51 AM, The Natural Philosopher wrote:
On 11/12/2024 16:43, John Ames wrote:
On Wed, 11 Dec 2024 00:51:47 -0500
"186282@ud0s4.net" <186283@ud0s4.net> wrote:
>
Better innovate SOMETHING, otherwise we're gonna see 'peak computing'
when it's become clear we need thousands of times that for the Really
Cool Stuff.
>
I've long been of the opinion that things're gonna get Real Interesting
when Moore's Law finally hits the wall and "throw a beefier rig at it!"
is no longer a viable pitch for any "your X isn't delivering Y fast
enough for project Z!" problems.
>
We have already hit it.
Hence the proliferation of multiple cores.
Which works for multi-user and multi-threaded operations, but not necessarily for linear single thread code.
Very correct.
In some respects it's a "how do we DEFINE computing ?"
sort of thing. A bunch of Nvidia chips doing "AI" is
one way of looking at it. The peta+ FLOPS you may want
for simulating a supernova implosion or grinding through some
horrific math equation is another. In short 'performance' now has
to take a cue from "What We WANT done" rather than the olde-tyme
from "What We WANT done" rather than the olde-tyme
benchmarks.
Not everything can be 'parallelized' either ... some
jobs just require vast instructions-per ...
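(The usual way to put a number on that is Amdahl's Law - a
rough sketch below in Python; the 5% serial fraction and the
amdahl_speedup() name are just mine, picked for illustration.)

  # Amdahl's Law: speedup from N cores when a fraction 's' of the
  # work is stubbornly serial. Numbers are illustrative, not measured.
  def amdahl_speedup(serial_fraction, cores):
      return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

  for cores in (2, 8, 64, 1024):
      print(cores, round(amdahl_speedup(0.05, cores), 2))
  # Even with only 5% serial code the ceiling is ~20x,
  # no matter how many cores you throw at it.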
As said elsewhere, 'pure photonic' tech MAY provide
the ultimate - 10x to 100x what we're seeing with
the low-nanometer transistor designs. But, in this
universe, that's kinda IT.
We need a new universe.