Subject: Re: Win11 explorer bug?
From: newspam (at) nospam nonad.co.uk (Martin Brown)
Newsgroups: sci.electronics.design
Date: 13 Dec 2024, 10:55:18
Organization: A noiseless patient Spider
Message-ID: <vjh0ab$3ben5$1@dont-email.me>
References: 1 2 3 4 5 6 7 8 9 10 11 12 13
User-Agent: Mozilla Thunderbird
On 12/12/2024 21:09, john larkin wrote:
> On Thu, 12 Dec 2024 04:00:23 -0700, Don Y
> <blockedofcourse@foo.invalid> wrote:
>
>> On 12/12/2024 2:59 AM, Martin Brown wrote:
>>
>>> Probably because it is *so* bug.
>>
>> (typo for big but Freudian slip seems OK)
>>
>> Once something becomes "complex" (i.e., too large to fit in a
>> single brain), it becomes difficult to understand the repercussions
>> of specific design decisions -- because you can't remember
>> EVERYTHING with which they interact.
> Engineers design giant systems - cars, airplanes, bridges, buildings
> - with lots of parts, and nobody understands all the parts. And they
> work first time.
There are hundreds of years' experience of building large physical objects, and customers can more or less understand engineering diagrams - and now virtual 3D renderings of their new building, made possible by software.

That didn't stop someone during the build phase connecting a high-pressure steam pipe to a stairway handrail on one plant that I know of. Big engineering diagrams can also be confusing when loads of similar-diameter pipes (and non-pipes) go through a partition.

Software is still in the medieval cathedral-building era, but without the "make the walls thicker just in case" strategy. It is still a good heuristic that if it is still standing after five years then it was a good 'un.

Ely cathedral in the fens and the crooked spire at Chesterfield are examples that didn't quite fall down but don't quite look as designed.

https://www.elycathedral.org
https://en.wikipedia.org/wiki/Church_of_St_Mary_and_All_Saints,_Chesterfield
> Software is different, and it never works first time. Most programs
> don't even compile first try.
It is better if they don't compile at all until they are nearly correct. The more faults that are found at compile time the better. Static code analysis has done a lot to improve software quality in the past decade.
The big problem is that software developers get lumped with last-minute changes: salesmen promising new features to customers, and hardware defects that the electronics engineers left in and that have to be remediated in software because manufacturing has already started.

Mission creep (or starting out with nothing even resembling a coherent, self-consistent requirements specification) is a big factor in large-scale software failures. We are stuck with the suits saying "ship it and be damned - we can always update the software later with something that actually works". Hardware tends to be immutable; even when there is a significant fault present, software is expected to kludge around it.

Unfortunately most projects at universities are sufficiently small that anyone who is even reasonably good at programming can hack a solution out of the solid more quickly, and without using the processes needed for large-scale software development.
> I could probably code a "Hello, world!" program that would run first
> try.
That is the problem. With anything under about 3 man-months you can get away with murder (and that means most university teaching projects). Things start to get a bit sticky when you are talking 3 man-years and above.

If you so despise software, why are you using Spice, and why are you not still cutting up bits of red and blue sticky tape? Software mostly works, and you have to learn to live with its quirks or write your own.
--
Martin Brown