Something better than Unity Build in the future?

Started by
7 comments, last by l0calh05t 7 years, 2 months ago

Hi,

Currently we use a unity build to improve compile speed and reduce library size.

Is something better planned for C++ in the future?

Thanks


C++ build speeds are pretty good if your project is structured well, though that isn't always practical. But I am not aware of any changes to C++'s compilation and link model coming any time soon.

https://blogs.msdn.microsoft.com/vcblog/2015/12/03/c-modules-in-vs-2015-update-1/

Hey Dirk,

Awesome link, let's hope it will be approved for C++17.

I stand corrected - or at least, reminded. Still, modules won't be in C++17, certainly not as a full feature; maybe as an experimental one. It's probably not worth expecting them to slash build times either, though there will be some improvement. A well-structured C++ program today already much resembles one built with the new module system, so if your problem is that you change interfaces a lot, you're still going to be doing a lot of rebuilds. The main benefit of a unity build is to reduce file I/O time, and that is arguably best addressed these days by throwing an SSD at the problem rather than trying to subvert the system.

The main benefit of a unity build is to reduce file I/O time, and that is arguably best addressed these days by throwing an SSD at the problem rather than trying to subvert the system.

Not really.

When you compile N files that all include the same header, that header gets parsed N times, macros and all. That's where unity builds shine.

Precompiled headers help with that problem; but like you said, if your project isn't well structured then your precompiled header will contain a lot of non-shared stuff, and changing any header in it causes an almost full rebuild.
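For reference, a precompiled header is just a header collecting the widely shared includes, compiled once up front. The file name `pch.h` is an assumption; the flags in the comment are the standard GCC and MSVC ones.

```cpp
// pch.h - only headers shared by (nearly) every .cpp belong here;
// anything that changes often will trigger near-full rebuilds.
#include <vector>
#include <string>
#include <memory>

// Build it once, then every .cpp that starts with #include "pch.h"
// reuses the precompiled result instead of re-parsing the text:
//   GCC:  g++ -x c++-header pch.h      (produces pch.h.gch)
//   MSVC: /Yc"pch.h" on one stub .cpp, /Yu"pch.h" on the rest
```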

The problem gets aggravated if a set of 7 files includes the same headers and another set of 10 files includes a different set of headers. Unity builds deal with this well, but a precompiled header has to include the union of the headers used by all 17 files.

Modules would be the ideal solution: both modularity and the parsed data stored in precompiled form. But it has been years since "modules are around the corner" was first promised, and they still haven't arrived.
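For what it's worth, the draft module syntax looks roughly like this (the module name and function are made up, and the exact keywords have varied between drafts and implementations; MSVC's experimental support uses the `.ixx` extension):

```cpp
// math.ixx - module interface unit; the compiler turns this into a
// binary representation that importers load directly.
export module math;

export int square(int x) { return x * x; }

// --- main.cpp, a separate file ---
// import math;                  // loads the compiled interface:
// int main() {                  // no textual re-parsing of a header
//     return square(3) == 9 ? 0 : 1;
// }
```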

that header gets parsed N times, macros and all

My understanding is that modern compilers are generally too sensible to do that unless they truly need to. #pragma once helps.

Obviously if you fill your headers with macros that vary from one compilation unit to the next, the compiler has to re-parse them - but no module system is going to save you from interfaces that your project chooses not to keep static.

It depends on how the project is compiled.

If the compiler is called once with all the files, it may have that option - but that would be an enormous command line for large projects.

Every major project I've worked on started a separate compiler instance for each file.

As empirical evidence, on the project I have open right now, watching Visual Studio and Task Manager, I can see that compiling spawns one compiler process per file, up to one per CPU core. Checking their command lines, each was invoked for a single file. Parsed header data can't easily be preserved across separate running instances of the compiler.

Hey Dirk,

Awesome link, let's hope it will be approved for C++17.

It won't: https://botondballo.wordpress.com/2015/06/05/trip-report-c-standards-meeting-in-lenexa-may-2015/

However, Microsoft and Clang both have implementations of the TS already.

