Interesting, I didn't know that. Is that standard behavior (i.e. does it work with Optlink, VisualStudio link.exe, the LLVM linker and maybe unilink)?

Well, my knowledge of such things is highly Linux-specific (I never really used anything else). Wikipedia suggests, in the overview section of:

that some linkers do include the complete library in the output, not only the referenced modules. In that case two mains would probably conflict, but only if the linker does not treat the library symbols as weak symbols. I don't know whether there is some standard behaviour for linkers, or where I would find such a specification. The easiest way would be to try out all major linkers/platforms and see whether it works everywhere. I believe Jacob has a Mac, and you seem to have a Windows installation; maybe you could give it a try?

The test case I used: Provide two source files, one with some function and one with main; link them together into a static and into a dynamic library.
Write another two source files, one with main and one without (it does not really matter what it contains). Put a printf/cout/writeln statement in each function. Try to link the no-main file with the library, then the main file. If both links succeed, run the binaries and check the output to see which main gets called. In the external main file you may reference the function from the non-main library file, but no symbol from the main-defining file.

dub would not need to rely on this information; if the registry cannot provide it, dub can fall back to either:

  1. dumb behaviour (every time version statements are changed, a new version of the library is built)

I actually think that this is sufficient. Usually you don't change version statements all the time in isolation, but rather switch configurations or build types. If there is one binary per configuration/build type, there is no need to be more clever. Even for tagged versions.

We should definitely start this way; if it works well for most or all users, there is no need to implement something more sophisticated, and if something more sophisticated does seem necessary, it can be added at any time.

Can you go into more detail? I'd like to understand the problem.

When I simply ran dub on the package (almost empty, with dlibgit as its only dependency), it did not build the dependency; it only added its import paths, not its source files. Using --rdmd fixed the problem. I need to track this down a bit more; I very soon drifted off to thinking about building in general and how to do it. It could easily have been my fault.

The former. With incomplete support to build dependencies as separate libraries.

-> Great, may I jump in and complete it, at least for Linux? Maybe some pointers to where you started? generators/build.d? What is already done, and what needs to be done next?

This is the case right now. How well it scales depends a lot on how CTFE/template-heavy the code is, but you can definitely hit walls.

- One module at a time?


- Some combination?

and the aforementioned per-dependency builds are planned.

Could per-dependency library builds be /the/ way to do it, dropping all others? Or are there reasons to support alternative methods? Libraries are of course no silver bullet; even a single package can be too much to compile at once, so the "one module at a time" option, or some deterministic way of splitting things up, would be needed anyway.

I have one machine that would be suited, but it is in the internal network and I don't really feel comfortable using it for external stuff, security-wise. Running builds on the virtual server that hosts the registry would probably hamper the other services considerably (especially considering its mere 2 GB of RAM). Maybe someone has a few spare bucks for an EC2 instance? I guess it could stay turned off most of the time and should come out quite cheap.

I see. I believe we will find a solution; first I should have something implemented ;-)

The question I have in mind is how official building with rdmd will become. With some voices saying that IDEs should also use the external build tool to build a project, and other voices that always want to build with --rdmd, it sounds like we should possibly provide rdmd support for all kinds of projects, and maybe even go back and base everything on it...

Why do people care about --rdmd or not? How does it affect them, as long as it works correctly?

The other issue with rdmd, of course, is that it currently is not sufficient as a general build tool, so all this would mean a partially dysfunctional package system until all the issues and missing features of rdmd (or dmd) are sorted out.

Wasn't the purpose of rdmd scripting? I myself would never have thought of using it to build whole projects plus their dependencies.

Best regards,