On 14/03/2025 00:37, Waldek Hebisch wrote:
> bart <bc@freeuk.com> wrote:
>> This task can be done with a program called a 'compiler'.
>
> You ignore fact that people are developing programs.
And many, for assorted reasons, just need to build programs from source code ...
> And developement
> is much more than "turning a bunch of source files into an executable".
... then this /is/ pretty much it. If the process /has/ to involve multiple source languages (rather than just the nearest language the developer had on hand to solve some aspect), then support for those becomes part of the set of dependencies.
Those dependencies become part of the configuration needed at the build-site, and ought to be carefully considered.
Instead, the expectation seems to be that the user of the build-machine has EVERYTHING the developer has, even if poorly justified.
> GPL says "The source code for a work means the preferred form of
> the work for making modifications to it". And that nicely captures
> the idea: sources are in form that is convenient for developer
> and may require several steps before one gets the executable.
And my idea is different; there is:
* The development source code
* <Intermediate>
* The runnable binaries
What I'm suggesting goes in the middle. A minimal, streamlined set of sources, possibly amalgamated (which helps if the user wants to incorporate this product into their own), with a minimal set of dependencies.
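To make 'amalgamated' concrete, it can be as simple as one generated file that pulls in the module sources (just a sketch, with invented file names):

    /* prog.c - single-file bundle produced from the development sources */
    #include "lex.c"
    #include "parse.c"
    #include "gen.c"
    #include "main.c"

so the user's whole build is one compiler invocation on one file.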
But even if users want to be able to develop the program further, I have doubts as to how much the personal choices and programming styles of the main developer(s) should be inflicted on others.
>
> FYI, is started using "Linux tools" on MS-DOS, before I
> heard about Linux. And tools are widely available,
> IIUC classic zOS is problematic, but I think that USS
Before I started working with microprocessors on basically bare systems, I used DEC and ICL computers. There I never used any such tools. There were compilers, and there were linkers.
On 1980s PC-style machines, meanwhile, programs were typically not huge. I used my own fast compilers and fast loaders (to combine object files into one program).
I would be very, very familiar with the modules of my application. And I would know the dependencies.
In the language I used, all modules of an app shared the same program-wide header. Any changes in a header usually meant a full recompile, unless I was confident that wasn't needed. In any case, a full build was still a matter of seconds.
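In that scheme every module started the same way (a sketch with made-up names):

    #include "prog.h"   /* the one program-wide header, shared by every module */

so there was no per-module dependency graph worth tracking.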
So for me, tools like 'make' would have been pointless.
(I didn't use any other third party tools, so why 'make' of all things? How would it even work when my compiler was built into my IDE as a resident program? Make needs to invoke it as a discrete program. It would make things slower rather than faster!
And of course now, it wouldn't work AT ALL with my whole-program compiler - it's either all files or nothing.)
> is OK. Some systems may be prone to breakage, when
> there is small number of _active_ users and nobody bothers
> to timely report real problems.
> Coming back to developement, in many cases it is desirable
> to generate some C files. C preprocessor can do some
> transformations, but is may be convenient to transform
> source in a way hard to do via proprocessor. Also,
> this is C group, but big programs frequently have parts
> in different languages, an then it is good to have
> language-neutral tools. Simple tools to transform
> or generate files are 'sed' and 'awk'. Some people
> use 'perl' or 'python' for such purpose and in principle
> there is much more possiblities.
I think a built-in, /capable/ but still compact script language in an OS would have been a good idea, even better if it was the same one across OSes. Shell scripts don't really count, and ones like Python are quite massive (and then, people coding in Python are 99% guaranteed to need to import third-party modules that have to be installed).
So it hasn't happened. In that case, supply scripts tuned to the OS where the build is to be done, or avoid them, or use a language which /has/ to be present (e.g. C, if the main app is already in C).
Or provide something that Just Works, without it being some behemoth that involves multi-GB downloads, takes an hour, and has a thousand failure points, just to end up with that 0.5MB binary.
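That 'use C itself' option needn't be anything clever: the build 'script' can itself be a tiny C program, compiled once with the same compiler the app already needs. A rough sketch, with invented file names:

    /* build.c - hypothetical one-shot build script written in C itself */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* one fixed command; no shell dialects, no make, no configure */
        const char *cmd = "gcc -O2 lex.c parse.c gen.c main.c -o prog.exe";
        int rc = system(cmd);
        if (rc != 0) printf("Build failed (%d)\n", rc);
        return rc;
    }

Compile it once (gcc build.c -o build) and from then on the whole build is just running it.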
> More specialised are compiler-writing tools like 'bison'
> and 'flex'. You can write a parser by hand, but using
> 'bison' one does not need to know much to write working
> parser. 'flex' helps in writing scanners, but IMO
> is quite useful for text processing task (its support
> for regular expressions is nicer than Perl or Python
> offer, and speed frequently is much better too).
Help! I've played with those tools, and believe me you don't want such dependencies. (For a start, there are a hundred variations from 100KB to 10MB.)
You build your app, but then you want someone else to build it. Now THEY have to source the exact tools you used.
> Users beside binary need also documentation. In principle
> documentation could be provided as plain text,
Users of applications? Come on, this is not about creating a binary anymore, it is about packaging an application. That is an entirely different endeavour. It might not even involve running any compilers.
It might however involve writing additional applications such as installers. Or web-programming if it's partly done on-line.
How far do you take this, how much of the computer industry are you going to drag in, to justify a complex makefile - and configure script - for some Mickey Mouse project I'm trying to build?
> but there
> is now tendency to offer nicely formatted documentation
<long snip>
> (they can be implemented in few megabytes of code)
> incentive to make smaller tools is rather low.
To reiterate, I am talking about this particular subtask:
* You have N source files to translate into 1 binary file, using a compiler.
That Is All. You want to justify not caring about how badly this one task is done, because there may be a whole lot of other stuff that could be needed on top.
But it is this one task that is my biggest stumbling block with other people's programs. And usually I don't need anything else.
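Concretely, for most programs that subtask ought to look something like this (compiler name and file names purely illustrative):

    gcc file1.c file2.c file3.c -o prog.exe

One tool, one command.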
In the case of GMP.DLL (please don't send me any more links; it was just an example of something I wanted to try out many years ago), the end result is one binary of 0.4-0.7MB. Nothing else; no installation, no manuals written in LaTeX, no nothing.
To that end, the build process is over-the-top in my view.
Have you thought that maybe such things have gotten out of hand, and if so, why that is? Maybe too many people not caring, because the process works (eventually, after bending over backwards on non-compliant OSes), and it is apparently nobody's job to keep on top of such things.
Just add extra layers to solve problems, or buy faster machines with more cores!
(I would love to know how long such a build process would have taken on a 1980s machine with floppy disks! Possibly a week.)
>> It should not rely on anything that is not native to the target platform.
>
> Why not? You already admited that one need a compiler. While
> a bunch of tiny tools should be a blocker?
Odd attitude. So if I need one small dependency, I might as well have a hundred bigger ones?
With such attitudes, I think we're talking at cross purposes.
>> And I'm not suggesting reinventing those 2/4 either, just using the ones
>> I have on my OS.
>
> Well, one trouble is that things you have on your preferred OS are
> not free, one can not use them legally on a different OS.
I haven't suggested that (which things?). And compilers are generally free on Windows too.
(Also I'm not sure why people have a problem with a paid-for OS. Is Mac-OS free?
Is Android free? If so, should people be allowed to complain if there's something wrong with it? According to Scott Lurndal, they're not allowed to complain about free, open source software. They should write their own!)
> it comes to developement tools what cames with basic install of
> your OS seem to be quite limited. No law prevents you from installing
> free tools and that is simplest solution.
Yeah, but it's not just a couple of tools, is it? It's basically duplicating most of the Linux eco-system on Windows.
Suppose you want to run some software on Linux, but it is only available as source code, and can only be built within Visual Studio.
What would you think about installing the whole of VS under Linux, under a complex set of emulation layers? What about the end result being an EXE, not an ELF file, itself needing to run under Wine?
It's messy, isn't it? And completely out of proportion to the perhaps 300KB end-result. And with plenty of failure points (I don't think I've had anything work properly under Wine).
Wouldn't you rather just build it natively on Linux with the tools you already have?
Perhaps you're starting to see my point of view!
> Alternative is extra work for developers to reproduce what tool do
> within program, that would be reinventing the wheel.
Reinventing a wheel is fine, if it means I end up with a 26" wheel exactly the right size for my bike, rather than a monstrous 100' one that threatens to destroy my town!
(26" is 0.7m, 100' is 30m.)
> Or some
> intermedate stages between binary and source, which require extra
> work and there seem to be small interest in such things (say
> hundreds of people among milions using free software).
Take the argument further: why even bother with binary anymore? What is the point? Just provide EVERYTHING as source code.
Apparently it is completely error-proof provided Bart isn't involved.
But if you admit to there being some advantages to binary, then some apply to streamlined source code too.
> Otherwise
> they would have to spend effort on incompatible "Windows
> native toolkit" which would fragment developement effort.
A simplified, streamlined build process would be simpler and faster on any target, not just Windows.
After all there wouldn't be any Windows dependencies - as you say there's nothing in Windows for them to use!
> Using no tools may work for very simple projects, but
> means too much extra work in other cases.
>> (Those developers also seem to think that the only alternative to MSYS2,
>> configure scripts etc is to use the monstrosity that is MS Visual Studio
>> and all it comprises.)
>
> You see, people that use Windows also want to use tools.
Voluntarily? Or because they have to, since so much stuff needs them, or because they were exposed to developing software in academia where they used Linux?
>> If an application is written in C, then a C compiler should suffice.
>>
>>> That is fine, as long
>>> as you keep this to yourselfs. But refusing other
>>> folks right to use tools they want to use is not
>>> fair.
>>
>> Developers can do what they like. But they shouldn't inflict their
>> choices on other people, especially those using other OSes.
>
> Well, I use Linux and can maintain X based GUI. Since I do
> not have Windows I can not create native Windows GUI. And
> while there were several significant contributors to the
> program (some working on Windows), nobody volunteered to
> create Windows GUI. So ATM for this program the only way
> to get GUI on Windows is to use Cygwin and X library provided
> by Cygwin or run Linux version under WSL.
About a decade ago, I needed a console library for my scripting language, designed to work on Windows or Linux. Again it was an API layer which either worked on top of WinAPI, or used Linux functions (mostly escape codes).
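To give a flavour of what such a layer looks like, here is a sketch (the names are invented, not those of my actual library):

    /* conlib.c - one function, same API on both OSes */
    #include <stdio.h>
    #ifdef _WIN32
    #include <windows.h>
    #endif

    /* Move the cursor to a 1-based row/column */
    void con_setpos(int row, int col) {
    #ifdef _WIN32
        COORD c = { (SHORT)(col - 1), (SHORT)(row - 1) };  /* X=column, Y=row */
        SetConsoleCursorPosition(GetStdHandle(STD_OUTPUT_HANDLE), c);
    #else
        printf("\x1b[%d;%dH", row, col);   /* ANSI escape sequence */
        fflush(stdout);
    #endif
    }

Everything above that calls con_setpos, and neither knows nor cares which OS it is running on.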
I had similar plans for GUI/graphics, but I lost interest, and also didn't relish working with X11, which seemed even more impossible than WinAPI.
However there are some cross-platform libraries, and for some stuff I could use those.
So I think you used the wrong approach by being too dependent on your native GUI library. Actually I'd use a simpler API layer anyway even if I wasn't planning to go cross-platform.
> I do not know why you develop software.
The stuff I do now is language related: compilers, interpreters, assemblers, backends, and (shortly) emulators.
It's fascinating getting an idea, especially a bold one, and running with it to see what happens.
Like a new module scheme: it's not so great at first, so you refine it further, and now it is gorgeous.
Whole-program compilation - can it work? Will it be fast enough? Is it fast enough to run from source?
What happens if I try and interpret C code? What happens if I interpret sql.c (a 250Kloc file)? What happens if I interpret the compiler which then interprets sql.c?
Can a C compiler (not interpreter) work without a linker?
How tidily can such tools be packaged? How small can they be? How self-contained? How fast?
How can I minimise dependencies? How easy is it to distribute source code for someone else to build on a remote machine? (Getting on topic!)
How much faster is my dynamic language than equivalents? How much more adept is it?
I can do all this with no outside help (I need a file system and a C library, both provided by Windows; on Linux I'd also need a local C compiler, as mine targets Windows ABI.)
So, what I do is fun for me and challenging. Having to use what I consider inferior and dated tools is not. Neither would I want to use massive GUI apps to develop software.
I haven't even touched on language development itself.
> For me significant
> part is fun, in particular that I can do "interesting"
> things with moderate effort. But all that fun disappears
> if somebody want we to do a lot of work for no good
> reason (beyond "I want it that way").
That is exactly my view of makefiles.
Note that virtually all my development is done with a small, simple, text-mode IDE. That's been the case since 1981, when the IDE included a resident primitive editor and compiler.
The IDE displays the modules of my project, and allows me to edit, compile, link (in the old days, or for C now) and run.
> same hardware), 'configure' needed 2 hours on Linux, but
> went fine. Then some extra hours to compile the program.
How long ago and what size was the final result?
>> C is supposed to be that famous portable HLL, but in developer's minds,
>> 'portable' seems to mean working only on any Unix-like system!
>
> All general purpose OS-es that I know provide files and some
> form of 'open'. Of course, that needs system specific headers.
> On systems providing POSIX headers one can just use POSIX, for
> others do whatever given system need to be done.
>
>> (My language, not C, has a std library that includes some OS
>> functionality. But those are wrapped in a set of functions that reside
>> in one OS-specific module, that provides the same API across Windows and
>> Linux.
>>
>> For example, the function 'os_getdllinst' is implemented on top of
>> 'LoadLibrary' on Windows, and 'dlopen' on Linux.
>>
>> What I don't do is directly use LoadLibrary, and insist that people
>> running it on Linux have to install some emulation library to provide
>> that Windows functionality. Which wouldn't work anyway as it would also
>> work with .dll files rather than .so files.
>>
>> You see the kind of thought I put into this stuff? I wish others would
>> do the same! Instead of being so 'provincial'.)
>
> Your library stuff is nice. I am not sure why you find 'open'
> troubling? AFAIR it was available in C libraries of DOS and
> Windows compilers that I used (IIRC there was some part talking
> about '_open' which was supposed to be more official interface,
> but one could make plain 'open' to work).
>
>
>>>>> To put it differenly, you could compile program in one computer
>>>>> and get relatively small program which runs OK on different
>>>>> Windows machines because libraries it needs are already present.
>>>>> This is quite different compared to Cygwin, where you need to
>>>>> install Cygwin libraries before normal Cygwin program can run
>>>>> (I write about normal programs because actual compiler in
>>>>> Cygwin and Mingw used to be the same and with some tweaks one
>>>>> could use Cygwin compiler to produce MinGW execuatbles).
>>>>>
>>>>
>>>> I'm not interested in whatever Cygwin or Mingw are about.
>>>
>>> Oh, so you do not want to know. In such case why you started
>>> discussing it?
>>
>> Because it comes up everwhere that gcc is used on Windows?
>>
>> I started by saying I didn't know exactly what Mingw was. I used to
>> think it was the compiler, and indeed obtaining gcc used to involve
>> visiting sites where 'mingw' or 'mingw64' names figured heavily.
>>
>> Some of my gcc files also have 'mingw32' in them; I don't know why:
>>
>> c:\tdm\bin>fc gcc.exe x86_64-w64-mingw32-gcc.exe
>> Comparing files gcc.exe and X86_64-W64-MINGW32-GCC.EXE
>> FC: no differences encountered
>>
>> Apparently it is to do with some boring functions discussed in parallel
>> subthreads.
>
> The 'x86_64-w64-mingw32' part is so called target triple, it
> specifies on which system binaries produced by the compiler
> are supposed to run. 'mingw' really means using Microsoft
> libraries. I am not sure what _exactly_ this combination
> means, it could mean 32-bit code which can only be run on
> 64-bit Windows. Or 64-bit code that uses library confusingly
> named 'mingw32'.
>
> If I was really interested I could try to build on Linux
> gcc with such a target and look what it produces, but
> I do not thing we are interested enough in the answer to
> do this.
>
>>>> If I were to
>>>> use libraries not part of the OS, then it would be ones like SDL2 to get
>>>> interesting things done. Not try and emulate bits of Linux that I'd
>>>> never heard of.
>>>
>>> I mentioned one case: about 90% of code in "my" library was
>>> provided in fact by other libraries. So using code provided
>>> by other saved me a lot of work, turning something that
>>> otherwise could be multiyear job into reasonably small effort.
>>
>> Sure. Then just provide the binaries which somebody (or even some
>> script) can build once, on a machine where everything is known to work.
>>
>> This suits Windows which has famous binary compatibility. (If there is a
>> 32-bit version of Windows 11, it can probably still run my 1990s 16-bit
>> binaries!)
>>
>>> To use the libraries I had to work out build process, thanks
>>> to tools it was very easy.
>>
>> Thanks to tools designed to work within a labyrinthine build
>> environment, working within such an environment.
>>
>> Purely as an exercise, could you have produced a minimal viable bundle
>> of source files, compiler (or recommendation for one), readme and what
>> ever else was necessary, that would work in an alien environment? Say,
>> Windows.
>
> IIRC I produced a CDROM containg sources and MinGW, installing it
> on Windows and running 'make' would recompile my library from
> sources. I do not remember exactly, but I think that it would only
> recompile "my" files and simply link in libraries that I used.
>
> If you mean source bundle for everything, then there were 6
> general libraries involved: libz (compression, needed by
> libtiff), libjpeg (needed by libtiff), libtiff, lapack,
> BLAS (needed by lapack), hull (geometry code). Each had its
> own Makefile. lapack and BLAS needed fortran compiler.
> I am not sure if I tried to build the libraries on Windows,
> but I think that possibly with small tweaks to Makefiles it
> would work (using MinGW compiler + tools). I write about tweaks
> as libtiff needed to run a program, but probably did not provide
> an '.exe' extention.
In the 1990s I depended on an Intel 'IJL' DLL for loading JPEGs. Then Intel withdrew the binary from their website (fortunately I still had copies).
Instead, they provided a developer's download, which included the source code for you to build it yourself. But it was part of a much bigger package which was a 75MB download. At the time, I was still using a modem that might have run at 33K baud at most.
The effort needed to create that DLL (I didn't even have a C compiler), together with the small chance of success, was completely off the scale compared with just downloading a 350KB ready-to-use binary.