On Tue, 19 Mar 2013 16:57:03 GMT, Robert wrote:

Yes, so if a custom "configurations" section or a custom "sourcePaths" definition is specified, it overrides the default choices. See package_.d line 101 ff. It first searches for all standard folder names and adds them, then parses the package.json file to let the user override those settings, and finally generates default configurations if none have been specified.
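
For illustration, a minimal package.json that overrides the defaults could look roughly like this (the field names are the ones discussed above, the package name and values are made up, and the exact schema may differ at this point):

    {
        "name": "myapp",
        "sourcePaths": ["source", "thirdparty/source"],
        "configurations": [
            { "name": "application" },
            { "name": "library" }
        ]
    }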

Ah, yeah, now I see: parsing the package.json overwrites the values and does not append to them. Perfect :-)

I don't see how static/dynamic libraries would solve that. If vibe.d were compiled as a dynamic library, the linker should not consider its main function at all, and if compiled as a static library it should produce conflicting-definition errors, just as in the source-library case. It would be great if the linker could somehow be employed for this, but I don't see how yet.

Since the linker only looks up symbols in libraries when they cannot be found in the provided .o files, it will only use the main from the library if you don't provide one yourself. You only get a conflict if you depend on some symbol that lives in the same module as main. I just tested it on Linux, and it works: if I don't provide my own main, the one from the dynamic/static library is used; if I provide one, mine is used. I only get a conflict if I define main and also depend on a symbol that sits in the same file as the library's main. To be honest, I would not even have thought of putting main in a library, but your statement about the absence of conflicts made me realize that it actually works this way.
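
A rough sketch of that test, with made-up file names (the command lines are from memory and may need adjusting, and the exact behavior ultimately depends on the linker):

    // libmain.d -- goes into the library and provides a fallback main
    module libmain;
    import std.stdio;
    void main() { writeln("library main"); }

    // app.d -- optional; if it defines its own main, the linker never
    // pulls the library's main in, because libraries are only searched
    // for symbols that the object files leave undefined
    module app;
    import std.stdio;
    void main() { writeln("application main"); }

    // Build and link (static library case):
    //   dmd -lib -oflibwithmain.a libmain.d
    //   dmd app.d libwithmain.a     -> runs "application main"
    //   dmd other.d libwithmain.a   -> other.d has no main, so the
    //                                  library's main is used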

So, in summary: if we use libraries/rdmd, then dub does not need to know whether the application provides its own main or not, nor in which file it would be. For rdmd it still needs to know the "app file", but whether that file contains main is irrelevant. The main/no-main problem is gone.

Interesting, I didn't know that. Is that standard behavior (i.e. does it work with OPTLINK, Visual Studio's link.exe, the LLVM linker and maybe UniLink)?

I think the approach of handling modules, HTML documentation and also the version statements is a good idea. But I don't think we should rely on that information for the build, if only because we may not be able to continually generate it for branches when there are many active projects. Avoiding unnecessary rebuilds of dependencies also sounds like a low-priority issue, so as long as we don't introduce a roadblock for it somewhere, I think we can get to that once everything else is settled. Let's first get the build semantics right (...and some important stuff that has nothing to do with semantics at all).

dub would not need to rely on this information; if the registry cannot provide it, dub can fall back to one of the following:

  1. Dumb behaviour (every time the version statements change, a new build of the library is made).

I actually think that this is sufficient. Usually you don't change version statements all the time (in isolation), but rather switch configurations or build types. If there is one binary per configuration/build type, there is no need to be more clever (see the sketch after this list). Even for tagged versions.

  2. Parse the code itself to gather the version/debug statements. (Bad, if only because it would add another dependency to dub.)
  3. Use rdmd. (We could, for example, simply require rdmd for branch builds.)
  4. Do as C/C++ does and hope for the best (i.e. assume that changes to the version/debug command line do not affect the build). I don't really like that, but if dub simply printed a warning that it does not have this information and that you have to take care yourself, it might be fine for branch builds, since the user can always fall back to rdmd.

Actually, I started to think about this because I had some problems building software with dub (dub seemed not to build the dependency at all when using the default build method).

Can you go into more detail? I'd like to understand the problem.

In my quest for the reason, I started to think about how I would do it, and the last post was what I came up with ;-) So in essence this proposal was my concept of getting the build semantics right. But to be honest, I don't even understand exactly what your current approach is; maybe you could briefly explain what it looks like? And what the problems are?

  • How do you decide what to build? Simply all modules of all dependencies? Or track them somehow via imports?

The former, with incomplete support for building dependencies as separate libraries.

  • Once you have decided what to build, how do you build it?
    • All source files passed to dmd at once? Does this scale? If not, how do you split it up?

This is the case right now. How well it scales depends a lot on how CTFE/template-heavy the code is, but you can definitely hit walls.

    • One module at a time?

This

    • Some combination?

and the aforementioned per-dependency builds are planned.

Also, when implementing this, I would make the whole thing a small separate build server that is controlled by a simple REST interface right from the start, so that it can run as a separate process with different user rights and on a different server (or on multiple servers).
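
Just to make the idea a bit more concrete, a rough sketch of what such an interface could look like with vibe.d's REST generator (the routes, names and parameters are made up, and the exact vibe.d API may differ):

    import vibe.d;

    // Hypothetical build-server interface; method names map to routes by
    // vibe.d's usual convention (postBuild -> POST /build,
    // getStatus -> GET /status).
    interface BuildServerAPI
    {
        // enqueue a build of the given package/version/configuration
        void postBuild(string packageName, string ver, string configuration);
        // query the state of a previously queued build
        string getStatus(string packageName, string ver);
    }

    class BuildServer : BuildServerAPI
    {
        void postBuild(string packageName, string ver, string configuration)
        {
            // ...put the job into a queue that a worker process picks up...
        }

        string getStatus(string packageName, string ver)
        {
            return "pending";
        }
    }

    shared static this()
    {
        // relies on vibe.d's default main, fittingly enough
        auto router = new URLRouter;
        registerRestInterface(router, new BuildServer);

        auto settings = new HTTPServerSettings;
        settings.port = 8080;
        listenHTTP(settings, router);
    }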

Sounds good. Btw, do you have the resources to host such a server? Or do you know someone who does?

I have one machine that would be suited, but it's in the internal network and I don't really feel comfortable using it for external stuff, security-wise. Running builds on the virtual server that runs the registry would probably hamper the execution of the other services considerably (especially considering its mere 2 GB of RAM). Maybe someone has a few spare bucks for an EC2 instance? I guess it could stay turned off most of the time and should come out quite cheap.

The question I have in mind is how official building with rdmd will become. With some voices saying that IDEs should also use the external build tool to build a project, and others always wanting to build with --rdmd, it sounds like we should possibly provide actual rdmd support for all kinds of projects, and maybe we could even go back and base everything on it...

I finally need to look up how rdmd works in detail; I am just guessing at the moment.

The other issue with rdmd, of course, is that it currently is not actually sufficient as a general build tool, so all of this would mean a partially dysfunctional package system until all the issues and missing features of rdmd (or dmd) are sorted out.