In article <vefvo0$k1mm$1@dont-email.me>, <Muttley@DastartdlyHQ.org> wrote:
>On Sat, 12 Oct 2024 16:36:26 -0000 (UTC)
>cross@spitfire.i.gajendra.net (Dan Cross) boring babbled:
>>In article <vee2b1$6vup$1@dont-email.me>, <Muttley@dastardlyhq.com> wrote:
>>>Unlikely to be running *nix in that case.
>>
>>We're discussing the concept of a "standalone binary"; you seem
>>to think that means a binary image emitted by a linker and meant
>>to run under a hosted environment, like an operating system. It
>>does not.
>
>It can mean either. Essentially it's a binary that contains directly runnable
>CPU machine code. I'm not sure why you're having such a conceptual struggle
>understanding this simple concept.
Oh, I understand what you mean; it's your choice of non-standard
terminology that I object to. Admittedly, Microsoft uses the
term "standalone binary" to describe one other people might call
"statically linked", but you seem to mean any binary that comes
out of, say, invoking the `gcc` or `clang` driver and getting an
executable object image. And in context, you seem to mean that
a "compiler" is a program that _only_ generates such artifacts,
but that's nonsense, and in fact, many compilers simply don't
work that way. Even LLVM might generate an intermediate
language that is then in turn processed to generate object code
for some particular target; that target might be an ISA for
which physical silicon exists, or it might not.
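To make that concrete, here's a deliberately trivial example; the
file name and exact command line are only illustrative, though the
flags themselves are real:

/* add.c -- "clang -S -emit-llvm add.c" produces add.ll, a file of
 * textual LLVM IR, not machine code.  That IR is then lowered for
 * whatever backend is selected, whether or not silicon for that
 * target actually exists.
 */
int add(int a, int b)
{
    return a + b;
}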
Consider, for example, the MMIX CPU designed by Knuth; a
compiler may generate code for that, even though there is no CPU
implementing the MMIX instruction set that I can pop on down to
Microcenter and buy. Does that make that compiler less of a
compiler? Or, keeping with the theme of MMIX, I'll bet someone
has done an HDL implementation of it suitable for loading into
and running on an FPGA; so is a compiler targeting it now a
real compiler?
Or consider x86; most modern x86 processors are really dataflow
CPUs, and the x86 instruction encoding is just a bytecode that
is, in fact, interpreted by the real CPU under the hood. So
where does that fit on your little shrink-to-fit taxonomy? What
about a compiler like LLVM that can target multiple backends,
some of which may not actually be hardware (like, say, eBPF)?
Is any compiler that generates an intermediate language not a
Real compiler, since it's not generating executable object code
directly? What about a compiler that _only_ outputs an object
file and defers to an explicitly programmer-driven separate link
step? The Plan 9 compiler suite works that way, and indeed,
actual instruction selection is deferred to the linker.
https://9p.io/sys/doc/compiler.html

Or consider the APEX compiler for APL, which generated output as
code in the SISAL programming language; the SISAL compiler, in
turn, output either C or FORTRAN. This was actually quite
useful; APL is great at very high-level optimizations ("multiply
these matrices this way..."), SISAL was great at medium-level
inter-procedural optimizations, and of course the system C and
FORTRAN compilers excel at low-level register-level
optimization. The effect was a program that was highly
optimized when finally distilled down into an executable image.
To assert that these weren't compilers is inane.
>>>Now you're just being silly.
>>
>>*shrug* Not my problem if you haven't dealt with many embedded
>>systems.
>
>I could bore you with the number I've actually "dealt with" including
>military hardware but what's the point.
Weird appeals to experience, with vague and unsupported claims,
aren't terribly convincing.
>You've probably programmed the
>occasional PIC or arduino and think you're an expert.
Ok, Internet Guy.
>>>Are they? That's debatable these days. I'd say Linux is a lot closer to
>>>the philosophy of BSD and SYS-V than MacOS which is a certified unix.
>>
>>Yes, they are.
>
>I disagree. Modern linux reminds me a lot of SunOS and HP-UX from back in
>the day.
Then I can only guess that you never used either SunOS or HP-UX.
>Not something that can be said for MacOS with its roll-your-own,
>Apple-specific way of doing pretty much everything.
Well, since we're talking about "standalone binaries" and my
example was the Unix kernel, I should confess that I was really
thinking more like the Unix kernel, or perhaps the standalone
installation program that came on the V7 tape.
>>>Standalone in the sense that the opcodes in the binary don't need to be
>>>transformed into something else before being loaded by the CPU.
>>
>>Yeah, no, that's not what anybody serious means when they say
>>that.
>
>Anybody serious presumably meaning you.
Sorry, you've shown no evidence why I should believe your
assertions, and you've ignored directly disconfirming evidence
showing that those assertions don't hold generally. If you want
to define the concept of a "compiler" to be what you've narrowly
defined it to be, you'll just have to accept that you're in very
short company and people aren't going to take you particularly
seriously.
>>>I'd say it's a grey area because it isn't full compilation is it, the p-code
>>>still requires an interpreter before it'll run.
>>
>>Nope.
>
>Really? So java bytecode will run direct on x86 or ARM will it? Please give
>some links to this astounding discovery you've made.
Um, ok.
https://en.wikipedia.org/wiki/Jazelle

Again, I bring up my earlier example of a CPU simulator.
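And if hardware execution is the bar, look at what such a simulator
(or any bytecode interpreter) actually does. Here's a sketch; the
opcodes are made up for illustration and aren't any real VM's
instruction set:

/* A toy stack-machine interpreter: fetch, decode, execute. */
#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

static void run(const int *code)
{
    int stack[16];
    int sp = 0;

    for (int pc = 0; ; ) {
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];         break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
        case OP_PRINT: printf("%d\n", stack[--sp]);      break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                            OP_PRINT, OP_HALT };
    run(program);   /* prints 5 */
    return 0;
}

Something like Jazelle effectively does that fetch/decode/execute
loop in silicon; a simulator does it in software for a "real" ISA.
Whether the loop is hardware or software isn't a property of the
binary being executed.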
>>>Compiling is not the same as converting. Is a javascript to C converter a
>>>compiler? By your definition it is.
>>
>>Yes, of course it is. So is the terminfo compiler, and any
>
>So in your mind google translate is a "compiler" for spoken languages is it?
To quote you above, "now you're just being silly."
>>number of other similar things. The first C++ compiler, cfront,
>>emitted C code, not object code. Was it not a compiler?
>
>No, it was a pre-compiler. Just like Oracle's PRO*C/C++.
Nope.
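cfront did everything you would expect of a compiler front end:
full parsing, type checking, overload resolution, and lowering of
C++ semantics; it simply used C as its target language. As a rough
sketch (illustrative only, not cfront's actual output), a member
function becomes an ordinary C function taking an explicit `this`
pointer:

/* C++ input, shown as a comment:
 *
 *   struct Point {
 *       int x, y;
 *       int sum() { return x + y; }
 *   };
 *   int use(Point *p) { return p->sum(); }
 *
 * A cfront-style translation to C might look roughly like this;
 * the mangled function name is invented for the example.
 */
struct Point { int x; int y; };

static int Point_sum__(struct Point *this_p)
{
    return this_p->x + this_p->y;
}

int use(struct Point *p)
{
    return Point_sum__(p);
}

That isn't "pre-compilation"; it's compilation with C as the
target language.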
>>>Only heard of one of them so mostly irrelevant. Mine come from the name of
>>>tools that compile code to a runnable binary.
>>
>>It's very odd that you seek to speak from a position of
>>authority when you don't even know who most of the major people
>>in the field are.
>
>I know the important ones. You've dug out some obscure names from google
>that probably only a few CS courses even mention, never mind study the work of.
Ok, so you aren't familiar with the current state of the field
as far as systems go; fair enough.
In that case, let's just take a look at an authoritative source
and see what it says. From Chapter 1, "Introduction to
Compiling", section 1.1 "Compilers", first sentence of
"Compilers: Principles, Techniques, and Tools" (1st Edition) by
Aho, Sethi, and Ullman: "Simply stated, a compiler is a program
that reads a program written in one language -- the _source_
language -- and translates it into an equivalent program in
another language -- the _target_ language."
In the second paragraph, those authors go on to say, "...a
target language may be another programming language, or the
machine language of any computer".
Note "any computer", could also be a kind of virtual machine.
And of course, if the target langauge is another programming
language, that already covers what is under discussion here.
So it would seem that your definition is not shared by those who
quite literally wrote the book on compilers.
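If it helps, the definition is easy to satisfy with something
absurdly small. The toy below reads a "program" in one language
(single-digit sums such as "1+2+3") and emits an equivalent program
in another (a made-up textual stack-machine assembly). No hardware
executes its output, and by the textbook definition it is a
compiler all the same; everything in it, the "instruction set"
included, is invented for illustration:

/* A deliberately tiny compiler: source language in, target
 * language out, no machine code anywhere in sight.
 */
#include <ctype.h>
#include <stdio.h>

static int compile(const char *src)
{
    if (!isdigit((unsigned char)src[0]))
        return -1;                          /* syntax error */
    printf("PUSH %c\n", src[0]);
    for (const char *p = src + 1; *p != '\0'; p += 2) {
        if (p[0] != '+' || !isdigit((unsigned char)p[1]))
            return -1;                      /* syntax error */
        printf("PUSH %c\n", p[1]);
        printf("ADD\n");
    }
    return 0;
}

int main(void)
{
    /* Emits: PUSH 1, PUSH 2, ADD, PUSH 3, ADD */
    return compile("1+2+3") == 0 ? 0 : 1;
}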
Look, I get the desire to pin things down into neat
little categorical buckets, and if in one's own experience a
"compiler" has only ever meant GCC or perhaps clang (or maybe
Microsoft's compiler), then I can get where one is coming from.
But as usual, in its full generality, the world is just messier
than whatever conceptual boxes you've built up here.
- Dan C.