> On 4/5/25 9:07 PM, The Natural Philosopher wrote:
>>> temperature conditions that add errors in.
>>
>> Not really. If you use close-to-zero-temperature-coefficient
>> resistors and enough feedback, the circuits are insensitive to
>> temperature.
>
> Ummm ... I'm gonna kinda have to disagree.
> There are several factors that lead to errors in
> analog electronics - simple temperature being
> the worst.

Not really. That was mostly sorted years ago.
> Digital FP *can* be done to almost arbitrary precision.

And get a very accurate 'wrong answer'.

> Keep carrying those errors through several stages and soon
> all you have is error, pretending to be The Solution.

So no different from floating point based current climate models,
then...

> If you're running, say, a climate or 'dark energy' model
> then you use a LOT of precision.
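For what it's worth, both floating-point claims above are easy to
demonstrate in a few lines of Python (illustrative only; the precision
setting and loop count are made-up numbers, not from any real model):

```python
from decimal import Decimal, getcontext

# "Digital FP *can* be done to almost arbitrary precision":
# the stdlib decimal module lets you dial in the working precision.
getcontext().prec = 50          # 50 significant digits
third = Decimal(1) / Decimal(3)
print(third)                    # 0.333... to 50 digits

# "Keep carrying those errors through several stages": 0.1 has no
# exact binary representation, so summing it 10,000 times drifts
# away from the exact answer of 1000.
total = 0.0
for _ in range(10_000):
    total += 0.1
print(total)                    # close to, but not exactly, 1000.0
print(total - 1000.0)           # the accumulated error, small but nonzero
```

The error per step is tiny, but it never cancels exactly - which is
the "very accurate wrong answer" point in miniature.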
>> Most complex dynamic systems are 'analogue', and anyone who has
>> modelled them using analogue electronics can tell you that if they
>> do not have the right negative feedback they become unstable - and
>> that's how we engineers know that 'positive feedback' in the
>> climate is bullshit.
>
> To say the least :-)
>
> Again, perhaps some meta-material that's NOT sensitive
> to what typically throws-off analog electronics MIGHT
> be made.
>
> I'm trying to visualize what it would take to make
> an all-analog version of, say, a payroll spreadsheet :-)

An awful lot of op-amps.
> CAN be done, but is it WORTH it ???
>
> But, I suppose, a whole-budget CAN be viewed
> as an analog equation IF you try hard enough.
No, it isn't.

>> The thing is that analogue computers were useful for system
>> analysis years before digital stuff came along. You could examine
>> a dynamic system and see if it was stable or not.
>
> Well, *how* stable it is ........ Digital is always right-on.
> So what do you NEED most - speed or accuracy ?

That is the point. If not, you did it another way.

>> People who dribble on about 'climate tipping points' have no clue
>> really as to how real life complex analogue systems work.
>
> I'm just gonna say that "climate" is beyond ANY kind
> of models - analog OR digital. TOO many butterflies.
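The feedback-stability argument above is easy to see in a toy Euler
integration (the constants here are illustrative, not from any real
system):

```python
# Toy model: dx/dt = k*x, stepped with simple Euler integration.
# Negative k (negative feedback) decays back toward equilibrium;
# positive k (positive feedback) runs away.
def simulate(k, x0=1.0, dt=0.01, steps=1000):
    x = x0
    for _ in range(steps):
        x += k * x * dt
    return x

print(simulate(-0.5))   # decays toward 0
print(simulate(+0.5))   # grows without bound
```

A system dominated by positive feedback doesn't hover near a tipping
point - it blows up immediately, which is the engineers' objection in
a nutshell.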
In certain cases it is a better solution.

>> I can envisage a chip comprised of many, many linear amplifiers
>> whose gain, frequency response and interconnections were
>> programmable by digital logic, to allow one to model an enormously
>> interconnected system very quickly, to at least see what its
>> sensitive areas in fact were....
>
> And we shall see ... advantage, or not ?
>
> Now discrete use of analog, as you suggested, doing
> multiplication/division/logs initiated and read by
> digital ... ?

It's being thought about.
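The "programmable gains and interconnections" idea can at least be
pictured in ordinary digital code - a toy sketch with made-up gain
values, not anything from an actual chip:

```python
# A tiny "programmable" linear network: state vector x, gain matrix A.
# Iterating x -> x + (A x)*dt mimics wiring amplifiers together;
# re-running with one "programmed" gain changed shows which
# interconnections the system is sensitive to.
def settle(A, steps=2000, dt=0.01):
    x = [1.0] * len(A)
    for _ in range(steps):
        x = [xi + dt * sum(a * xj for a, xj in zip(row, x))
             for xi, row in zip(x, A)]
    return x

A = [[-1.0, 0.2],
     [0.1, -1.0]]        # strong negative self-feedback: stable
print(settle(A))          # both states decay toward zero

A_bad = [[-1.0, 0.2],
         [0.1, 0.5]]      # flip one programmed gain positive
print(settle(A_bad))      # second state now runs away
```

On a digital machine each sweep costs thousands of multiply-adds; the
hypothetical analog chip would settle all the amplifiers at once.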
> Maybe, horrors, "depends" .....

Well, that means nothing. That's just anti-tech speak for 'I don't
know what I am talking about, and I am as good as you, so therefore
you don't either'...
> The "real world" acts as a very complex analog
> equation - until you get down to quantum levels.
> HOW the hell to best DEAL with that ???

The point is, you don't. If your system is so unstable that one
atomic decay renders the cat dead, it doesn't last long in the 'real
world'.
> I'd say digital traded precision for speed ...

No. Digital traded speed for precision. Digital is SLOW: many
hundreds of cycles to do what analogue can do *approximately* in one
or two.

> Oh well, we're out in sci-fi land with most of this ...
> may as well talk about using giant evil brains in
> jars as computers :-)

Well no, we are not.
>> Massive parallelisation will definitely do *some* things faster.
>
> Agreed ... but not EVERYTHING.
> Sometimes there's just no substitute for clock
> speed and high-speed mem access.

Well tough, because you ain't gonna get that: the speed of light is
the speed of light.
>> Oh of course. Think 4096-core GPU processors... I think that's the
>> way it will happen: decline of the general-purpose CPU and the
>> emergence of specific chips tailored to specific tasks. It's
>> already happening to an extent with on-chip everything....
>
> I kinda understand. However, that whole chip chain
> will likely need to be fully, by design, integrated.
> This is NOT so easy with multiple manufacturers.
> ANYway ... final observation ... it keeps looking
> like we're far closer to the END of increasing
> computer power than the beginning. WHAT we want
> computed is kinda a dynamic equation, but OVERALL
> we're kinda near The End.
>
> THEN what ?

I think we have taken, in around 75 years, Turing's basic concept of
a Turing machine to the end of the line, more or less. There is still
room for improvement, but not at the level of general-purpose
computing cores.