Phil123

Reducing Compile/Link Times



I'm currently working on optimizing a moderately sized code base with around 40 external libraries.  I've already made quite a few changes that helped, but the time it takes to link is still a serious productivity problem.

 

I understand what contributes to slow compile and link times in general, but I need a way to quickly and efficiently locate the problems in this code base.  I don't have time to manually go through and "guess," so I've been trying to develop a process for finding the most significant problems (so I can fix them).  To that end, I've looked into Microsoft's DumpBin tool (using VS2015).  I've tried to make sense of the data it generates, but I've had little immediate success, and on top of that there doesn't seem to be a wealth of information on this topic.

 

So, as per usual, here I am, looking for suggestions.


Many times, programmers will include a header in another header when they only need a typedef from it. This leads to every header in a project including 20 other headers from the project, when adding a few typedefs or forward declarations is all that's needed.

Template-heavy code is also pretty expensive to compile. Headers that use template metaprogramming should NOT be included in other headers, unless they're needed for a function that would incur a significant performance penalty if not inlined.

But really, large projects always take a long time to compile. You might put in hours of work to shave a few seconds off the compile time.

If you really want to do this, just "guess" which in-project header is included most often, and start by shaving that down. It'll give you the most bang for your buck.

Edited by nfries88


You can also compile your external libraries into a single utility DLL that is loaded at runtime.

 

You might also need to be a bit more careful about how you structure your headers. When a header that is included everywhere changes, every file that includes it is forced to recompile. Luckily it won't go too deep, but it makes a significant difference when you edit something like Unreal's BaseObject class and then recompile: nearly 70% of the code base depends on it, so 70% of the code base needs to recompile.


In older projects, there are often a lot of obsolete #include lines.

We add #include lines when they're needed, but nothing warns us when one can be removed.

 

I know of someone who wrote a script that takes out each #include in turn and then tests whether the code still compiles. If it does, the include can obviously be removed.

In particular, #include lines in header files can cost a lot of time.

 

Such a script could be a start for finding candidate #include lines that may be removable.

Edited by Alberth


If you have link-time code generation enabled in Visual Studio, it can increase the link time by an order of magnitude in my experience. Disabling it (for all projects) made a big difference in my build time and didn't affect runtime performance much.
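For reference, in a .vcxproj this corresponds roughly to the `WholeProgramOptimization` compiler setting (/GL) and the linker's `LinkTimeCodeGeneration` setting (/LTCG). A sketch of a project-file fragment that switches both off for a Release configuration (the exact `Condition` strings depend on your project):

```xml
<!-- .vcxproj fragment (sketch): disable /GL and /LTCG for Release -->
<PropertyGroup Condition="'$(Configuration)'=='Release'">
  <WholeProgramOptimization>false</WholeProgramOptimization>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)'=='Release'">
  <Link>
    <LinkTimeCodeGeneration>Default</LinkTimeCodeGeneration>
  </Link>
</ItemDefinitionGroup>
```

In the IDE these live under Project Properties as "Whole Program Optimization" and Linker > Optimization > "Link Time Code Generation".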



But really, large projects always take a long time to compile. You might put in hours of work to shave a few seconds off the compile time.

If you really want to do this, just "guess" which in-project header is included most often, and start by shaving that down. It'll give you the most bang for your buck.

 

Thanks for the reply, and yeah, I wouldn't find it acceptable to just tell myself "it's a large project, there's nothing I can do about it."  I also don't plan on guessing anything; rather, I need to develop a process that identifies the problems so I can solve them.

 


You might also need to be a bit more careful about how you structure your headers. When a header that is included everywhere changes, every file that includes it is forced to recompile. Luckily it won't go too deep, but it makes a significant difference when you edit something like Unreal's BaseObject class and then recompile: nearly 70% of the code base depends on it, so 70% of the code base needs to recompile.

 

Indeed.  I think it's preferable to avoid this as much as possible.

 


In older projects, there are often a lot of obsolete #include lines.

We add #include lines when they're needed, but nothing warns us when one can be removed.

I know of someone who wrote a script that takes out each #include in turn and then tests whether the code still compiles. If it does, the include can obviously be removed.

In particular, #include lines in header files can cost a lot of time.

Such a script could be a start for finding candidate #include lines that may be removable.

 

Agreed.  I really like this idea, and I'm going to add it to the list of solutions to explore.

 


Andy Firth wrote a series of blog posts about his experiences optimising the compile and link times at Bungie. Unfortunately, they were originally published on AltDevBlogADay, so they are very hard to track down:



http://web.archive.org/web/20140719071550/http://www.altdev.co/2011/09/20/codebuild-optimisation1/

http://web.archive.org/web/20140719091139/http://www.altdev.co/2011/11/04/code-build-optimization-part-2/

http://web.archive.org/web/20140719083140/http://www.altdev.co/2011/11/21/code-build-optimisation-part-3/

http://web.archive.org/web/20140719073415/http://www.altdev.co/2011/12/26/code-build-optimisation-part-4-incremental-linking-and-the-search-for-the-holy-grail/

 

I'll read all of these.  Thanks.

 


If you have link-time code generation enabled in Visual Studio, it can increase the link time by an order of magnitude in my experience. Disabling it (for all projects) made a big difference in my build time and didn't affect runtime performance much.

 

I'll try this, thank you.

 

 

 

Thanks for the replies, though I'd still like to be able to analyze each .obj (or any other generated file, as necessary) to find out what's costing me so much link time.



Thanks for the replies, though I'd still like to be able to analyze each .obj (or any other generated file, as necessary) to find out what's costing me so much link time.


Linking is the stage at which external references are resolved to actual memory addresses (technically, RVAs). In tightly coupled code, or code that uses a lot of global state, this can take a lot of time. There's not much you can do about tightly coupled code besides refactoring, but the global-state problem could be addressed by moving all code that requires that global state into the same compilation unit. Note that string literals are also, to the linker, external references (to the string table in the final executable). If the code makes heavy use of string literals, that may be somewhere to start.


While you're concentrating on reducing compile/link times now, a different strategy is to avoid recompiling entirely.

If you change code, there's not much of a way around it, but if you're, for example, tweaking numbers, you could load those from a file. A much more invasive change is to add a scripting language, so you can write logic in script.

Limited forms of logic, such as state machines, could be loaded from YAML or XML.


Pre-compiled headers can speed things up.

Link-time code generation is slower.

Optimization is slower.

Conditional compilation is faster.

Nothing should be in a header that doesn't have to be public for the module's API.

Don't feel bad; I develop in full release builds only. The reason is that, on rare occasions, things can stop working when you turn debugging or the options mentioned above on or off, so I develop only in the final release configuration to prevent this: no precompiled headers, no conditional compilation, all optimizations on, and link-time code generation.  For a DirectX game with 3 source files and about 130,000 lines of code, link time is about two minutes.
