> And business targets "average performers" as the effort to hire

> Will an average *coder* (someone who has managed to figure out how to
> get a program to run and proclaimed himself a coder thereafter)
> see the differences in his "product" (i.e., the code he has written)
> and the blemishes/shortcomings it contains?

Well, we had the developers we had, and the team was large enough that
they cannot all be superstars in any language.
> I went to school in the mid 70's. Each *course* had its own

> Because computers were quite expensive then (circa 1982), and so

I was still scratching my head about why Pascal was so different than
C, so I looked for the original intent of the founders, which I found
in the Introductions to the Pascal Report and K&R C: Pascal was
intended for teaching Computer Science students their first
programming language, while C was intended for implementing large
systems, like the Unix kernel.
Wirth maintained a KISS attitude in ALL of his endeavors. He
failed to see that requiring forward declarations wasn't really
making it any simpler /for the coders/. Compilers get written
and revised "a few times" but *used* thousands of times. Why
favor the compiler writer over the developer?
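For concreteness, a minimal illustration of the bookkeeping in
question, in C, which imposes the same declare-before-use rule when
functions are mutually recursive (Pascal spells it with the "forward"
directive):

/* is_odd() is called before it is defined, so it must be declared
 * first -- the same bookkeeping Pascal's "forward" directive exists for. */
#include <stdio.h>

static int is_odd(unsigned n);              /* forward declaration */

static int is_even(unsigned n) { return n == 0 ? 1 : is_odd(n - 1); }
static int is_odd(unsigned n)  { return n == 0 ? 0 : is_even(n - 1); }

int main(void)
{
    printf("7 is %s\n", is_even(7) ? "even" : "odd");
    return 0;
}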
Pascal was optimized to make the compiler's task as simple as
possible, given that teaching languages are used to solve toy
problems, the focus being learning to program, not to deliver
efficient working code for something industrial-scale in nature.
> Yes. The same is true of eking out the last bits of performance

> Yes, lots. They were generally written in assembler, and it was

Prior operating systems were all written in assembly code, and so were
not portable between vendors, so Unix needed to be written in
something that could be ported, and yet was sufficient to implement an
OS kernel. Nor can one write an OS in Pascal.
You can write an OS in Pascal -- but with lots of "helper functions"
that defeat the purpose of the HLL's "safety mechanisms".
It was estimated that about 20% of the code would have to be in
assembly if Pascal were used, based on a prior project that had done
just that a few years earlier.
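Those "helper functions" exist because standard Pascal has no portable
way to treat a numeric bus address as a device register, while C does
this natively. A sketch of the sort of access in question (the device,
addresses, and register layout below are made up):

/* Memory-mapped device access, the kind of thing a kernel or driver does
 * constantly.  The device, its address, and its register layout are all
 * made up for illustration; this fragment is target-specific by nature. */
#include <stdint.h>

#define UART_BASE   0xFFFF0000u                    /* hypothetical bus address */
#define UART_STATUS (*(volatile uint32_t *)(UART_BASE + 0x0u))
#define UART_TXDATA (*(volatile uint32_t *)(UART_BASE + 0x4u))
#define TX_READY    0x1u

static void uart_putc(char c)
{
    while ((UART_STATUS & TX_READY) == 0)          /* spin until transmitter idle */
        ;
    UART_TXDATA = (uint32_t)(unsigned char)c;      /* write straight to the register */
}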
The target computers were pretty spare, multiple Motorola 68000
single-board computers in a VME crate or the like. I recall that a
one megahertz instruction rate was considered really fast then.

> Even the 645 ran at ~500KHz (!). Yet, supported hundreds of users
Much was made by the Pascal folk of the cost of software maintenance,
but on the scale of a radar, maintenance was dominated by the
hardware, and software maintenance was a roundoff error on the total
cost of ownership. The electric bill was also larger.

> There likely is less call for change in such an "appliance".
> One can write reliable code in C. But, there has to be discipline

The conclusion was to use C: It was designed for the implementation
of large realtime systems, while Pascal was designed as a teaching
language, and is somewhat slow and awkward for realtime systems,
forcing the use of various sidesteps, and much assembly code. Speed
and the ability to drive hardware directly are the dominant issues
controlling that part of development cost and risk that is sensitive
to choice of implementation language.

This did work - only something like 4% of Unix had to be written in
assembly, and it was simply rewritten for each new family of
computers. (Turned out to be 6%.)
> If you can't explain the bulk of a solution "seated, having a drink",

So the Pascal crowd fell silent, and C was chosen and successfully
used.
The Ada Mandate was rescinded maybe ten years later. The ISO-OSI
mandate fell a year or so later, slain by TCP/IP.
I had to make a similar decision, early on. It's really easy to get
on a soapbox and preach how it *should* be done. But, if you expect
(and want) others to adopt and embellish your work, you have to choose
an implementation that they will accept, if not "embrace".
>
And, this without requiring scads of overhead (people and other
resources) to accomplish a particular goal.
>
Key in this is figuring out how to *hide* complexity so a user
(of varying degrees of capability across a wide spectrum) can
get something to work within the constraints you've laid out.
Hidden complexity is still complexity, with complex failure modes
rendered incomprehensible and random-looking to those unaware of
what's going on behind the pretty facade.
I prefer to eliminate such complexity. And not to confuse the
programmers, or treat them like children.

> By picking good abstractions, you don't have to do either.
> I don't eschew pointers. Rather, if the object being referenced can

War story from the days of Fortran, when I was the operating system
expert: I had just these debates with the top application software
guy, who claimed that all you needed was the top-level design of the
software to debug the code.
He had been struggling with a mysterious bug, where the code would
crash soon after launch, every time. Code inspection and path tracing
had all failed, for months. He challenged me to figure it out. I
figured it out in ten minutes, by using OS-level tools, which provide
access to a world completely unknown to the application software folk.
The problem was how the compiler handled subroutines referenced in one
module but not provided to the linker. Long story, but the resulting
actual execution path was unrelated to the design of the application
software, and one had to see things in assembly to understand what was
happening.
(This war story has been repeated in one form or another many times
over the following years. Have kernel debugger, will travel.)
E.g., as I allow end users to write code (scripts), I can't
assume they understand things like operator precedence, cancellation,
etc. *I* have to address those issues in a way that allows them
to remain ignorant and still get the results they expect/desire.
>
The same applies to other "more advanced" levels of software
development; the more minutiae that the developer has to contend with,
the less happy he will be about the experience.
>
[E.g., I modified my compiler to support a syntax of the form:
handle=>method(arguments)
an homage to:
pointer->member(arguments)
where "handle" is an identifier (small integer) that uniquely references
an object in the local context /that may reside on another processor/
(which means the "pointer" approach is inappropriate) so the developer
doesn't have to deal with the RMI mechanisms.]
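A sketch of one way such a call could be lowered -- the names, table
layout, and stubbed-out messaging layer below are all hypothetical,
not the implementation described above:

/* Sketch: "handle" is just a small integer indexing a table that says
 * where the object lives.  The caller never sees a raw pointer and
 * never touches the messaging machinery.  motor_speed_local() and
 * rmi_send() are stand-ins for a real method and a real transport. */
#include <stdint.h>
#include <stdio.h>

typedef uint16_t handle_t;

struct object_entry {
    uint8_t node;              /* processor that owns the object          */
    void   *local_obj;         /* meaningful only when node == LOCAL_NODE */
};

#define LOCAL_NODE  0u
#define MAX_OBJECTS 256
static struct object_entry object_table[MAX_OBJECTS];

static int motor_speed_local(void *obj, int arg) { (void)obj; return arg * 2; }

static int rmi_send(uint8_t node, handle_t h, int arg)
{
    printf("marshalling call to node %u, handle %u, arg %d\n",
           (unsigned)node, (unsigned)h, arg);
    return 0;                  /* a real system would queue or await the reply */
}

/* What "handle=>speed(arg)" might compile into. */
static int call_speed(handle_t h, int arg)
{
    struct object_entry *e = &object_table[h];
    if (e->node == LOCAL_NODE)
        return motor_speed_local(e->local_obj, arg);   /* ordinary local call    */
    return rmi_send(e->node, h, arg);                  /* remote: marshal instead */
}

int main(void)
{
    object_table[1].node = LOCAL_NODE;   /* object 1 lives on this processor */
    object_table[2].node = 3;            /* object 2 lives on node 3         */

    printf("local call returned %d\n", call_speed(1, 21));
    call_speed(2, 21);
    return 0;
}

The caller sees only the small-integer handle; whether the object is
local or on another processor is the dispatcher's problem.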
Pascal uses this exact approach. The absence of true pointers is
crippling for hardware control, which is a big part of the reason that
C prevailed.
I assume that RMI is Remote Module or Method Invocation. These are
inherently synchronous (like Ada rendezvous) and are crippling for
realtime software of any complexity - the software soon ends up
deadlocked, with everybody waiting for everybody else to do something.

> The latter. Like RPC (instead of IPC) but in an OOPS context.

> There is nothing that inherently *requires* an RMI to be synchronous.
This is driven by the fact that the real world has uncorrelated
events, capable of happening in any order, so no program that requires
that events be ordered can survive.

> You only expect the first event you await to happen before you
There is a benchmark for message-passing in realtime software where
there is a ring of threads or processes passing messages around the
ring any number of times. This is modeled on the central structure of
many kinds of radar.

> So, like most benchmarks, it is of limited *general* use.
Even one remote invocation will cause it to jam. As
will sending a message to oneself. Only asynchronous message passing
will work.
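A minimal sketch of that ring, assuming buffered (non-rendezvous)
message passing so a sender never waits for its receiver; the thread
count and lap count are arbitrary, and pipes stand in for whatever
queueing a real system would use:

/* N worker threads in a ring; a single small message circulates LAPS
 * times.  Pipes buffer each send, so a sender never waits for its
 * receiver.  N and LAPS are arbitrary illustration values. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define N    8
#define LAPS 1000

static int ring_fd[N][2];                  /* ring_fd[i] feeds thread i */

static void *worker(void *arg)
{
    long id  = (long)arg;
    int  in  = ring_fd[id][0];
    int  out = ring_fd[(id + 1) % N][1];   /* next hop around the ring  */
    int  token;

    for (;;) {
        if (read(in, &token, sizeof token) != (ssize_t)sizeof token)
            return NULL;                   /* pipe closed: ring shut down  */
        if (id == 0 && --token == 0)       /* thread 0 counts laps         */
            return NULL;
        write(out, &token, sizeof token);  /* buffered send, no rendezvous */
    }
}

int main(void)
{
    pthread_t t[N];
    int token = LAPS;

    for (long i = 0; i < N; i++)
        pipe(ring_fd[i]);
    for (long i = 0; i < N; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);

    write(ring_fd[1][1], &token, sizeof token);   /* inject the message   */
    pthread_join(t[0], NULL);                     /* done after LAPS laps */

    for (int i = 0; i < N; i++)                   /* EOF stops the others */
        close(ring_fd[i][1]);
    for (int i = 1; i < N; i++)
        pthread_join(t[i], NULL);

    printf("message made %d laps around a ring of %d threads\n", LAPS, N);
    return 0;
}

With a rendezvous-style send instead, a stage that sends a message to
itself blocks forever -- the jam described above.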
Joe Gwinn