# C++ 40 seconds to compile a .cpp file - is it totally normal?

## Recommended Posts

In my most complex gameplay source file (a single .cpp) with a lot of necessary #includes,

... whenever I edit it (e.g. add a single space character), it takes 40 seconds to recompile.

• Is this to be expected?
• Do you also face this issue?
• Should I just buy a new computer or more RAM?

I am using Visual Studio 2017.  I am not using a unity build (the single-compilation-unit technique).

Thinking that the cause might be my code, I can't eat or sleep well.  It even gives me nightmares at bedtime. Please help.

##### Share on other sites

How many lines long is it? If it is huge, you should refactor it into smaller files.

How long is the .cpp file? How much RAM and what CPU do you have?

##### Share on other sites

Compilation time depends on many circumstances. First the compiler has to handle your preprocessor directives: includes, but also anything conditional you define with a # in front. The preprocessor has to resolve all of these statements from top to bottom before anything else can take place.

Every include is processed: the file is loaded and added to the translation unit too. After the conditionals are resolved and the preprocessor knows what code to push to the compiler, macro replacement takes place. Any macro (a define, possibly with arguments) is resolved recursively wherever you used it in code. It is truly recursive: expansion continues until the preprocessor doesn't find any more defined identifiers in the replacement text, or the macro names itself, at which point the preprocessor stops. These steps may happen at the same time; this is preprocessor dependent.
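The rescanning behaviour described above can be seen in a small sketch (the macros here are made up for illustration):

```cpp
// Sketch of recursive macro expansion: after a macro is replaced, the
// result is rescanned for more macros. Expansion stops when no defined
// identifiers remain, or when a macro names itself.
#define TWICE(x) ((x) + (x))
#define QUAD(x)  TWICE(TWICE(x))  // rescanning expands the inner TWICE too

int quad_of_three = QUAD(3);  // expands to ((((3) + (3))) + (((3) + (3))))
```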

The preprocessor I wrote in C# to detect dependencies between C++ files does all of this on the fly for example.

Templates are instantiated (specific code is generated for each template for each distinct set of arguments passed to it, which is why templates can cause code bloat), and then the code is finally pushed to the compiler proper.

So to answer your question, it depends: how many include files do you have, how many and how complex are the macros you use, and how many templates do you use with different arguments? Did you set your include guards correctly (so a file is not included twice after it has already been processed in this compilation unit)? Did you include more files than necessary? Could a simple forward declaration be used instead of including the whole header?
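For reference, an include guard looks like this, a minimal sketch (the header name and guard macro are hypothetical):

```cpp
// World.h (hypothetical) — the guard makes a second #include of this file
// in the same translation unit expand to nothing, so it is parsed once.
#ifndef WORLD_H_INCLUDED
#define WORLD_H_INCLUDED

struct World
{
    int tick = 0;
};

#endif // WORLD_H_INCLUDED
```

With Visual Studio you can also put `#pragma once` at the top of the header for the same effect.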

By the way, 40 seconds is nothing. Huge projects like game engines (Unreal, for example) use so-called "unity files", where everything is included at once in one file. This is an attempt to reduce the huge build times that may otherwise occur; our Unreal project, for example, took more than 10 minutes to compile before we switched it to generate a unity file.
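For illustration, a unity file is just a .cpp that includes the other .cpp files (the file names here are made up), so shared headers are parsed once for the whole group instead of once per file:

```cpp
// UnityBuild.cpp (hypothetical) — the build compiles only this file;
// Player.cpp, Enemy.cpp and Physics.cpp are no longer compiled separately.
#include "Player.cpp"
#include "Enemy.cpp"
#include "Physics.cpp"
// Caveat: all included files now share one translation unit, so statics
// and anonymous-namespace names can collide across them.
```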

What I like to do in such cases is have our custom build tool generate a dependency graph of each include and where it is used. This not only helps avoid circular dependencies and helps modularize the project, but also shows unnecessary include directives that can cause much higher compile times.

##### Share on other sites

• My computer is an Intel Core i5-4460 CPU @ 3.20GHz, 8GB of memory, Windows 7 64-bit.
• My problematic .cpp is 230 lines.  It has around 30 #includes.    I don't know how to count the total number of lines including everything pulled in by the #includes (5000? not sure).
• My program (100K lines in all) is currently split into a lot of headers.  Most implementation is in source files (not headers).
20 minutes ago, Shaarigan said:

our Unreal project for example took more than 10 minutes to compile before we toggled it to generate a Unity File.

Thanks.  How long did recompiling take after you generated the unity file?

Is the unity file you mentioned part of the unity build technique?

##### Share on other sites
2 hours ago, hyyou said:

My problematic .cpp is 230 lines.  It has around 30 #includes.

There is an option in Visual Studio; enabling it lists how inclusion dependencies are resolved, so while compiling it prints all header files in order, and from that you can spot unnecessary cycles. It helps to fix them.

I think it's the /showIncludes option (Project Properties → C/C++ → Advanced → Show Includes).

A helpful workaround is to make certain files compile faster by building them as in a debug build, even if a release build is selected.

A snippet from @Hodgman:

```cpp
#if defined _MSC_VER
    #define MAKE_DEBUGGABLE __pragma(optimize("", off))
#elif defined __clang__
    #define MAKE_DEBUGGABLE _Pragma("clang optimize off")
#else
    #define MAKE_DEBUGGABLE
#endif
```


With this included, just add MAKE_DEBUGGABLE at the top of your slowly compiling files (whether it goes before or after the includes matters, of course).

It is also very helpful when debug builds run too slowly to be useful.
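Usage would look something like this (the macro definition is repeated so the sketch stands alone; the function is made up):

```cpp
// Put the MAKE_DEBUGGABLE definition in a small shared header, then mark
// slow-compiling translation units with it.
#if defined _MSC_VER
    #define MAKE_DEBUGGABLE __pragma(optimize("", off))
#elif defined __clang__
    #define MAKE_DEBUGGABLE _Pragma("clang optimize off")
#else
    #define MAKE_DEBUGGABLE
#endif

MAKE_DEBUGGABLE  // everything below in this file is built unoptimized

int update_score(int score, int delta)
{
    return score + delta;  // still correct, just not optimized
}
```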

##### Share on other sites
3 hours ago, hyyou said:

Is the unity file you mentioned part of the unity build technique?

To avoid any misapprehension: the unity file doesn't involve anything from Unity3D or Unity Technologies; it is just named that because it has all dependencies in one file. The only resource I was able to find about it, without pointing you at the source code, is the UnrealBuildTool Build Configuration chapter.

Our compile time was reduced to roughly 3 minutes after turning Unity Build mode on.

##### Share on other sites

Without seeing code, my guess is that you include unnecessary headers, or else you're using a lot of 'header-only' libraries.

You can reduce the number of unnecessary headers by including only those you need, by using forward declarations (which, for compile safety, can be placed in small, simple headers), and by making sure you use include guards properly.

Header-only libraries make distribution of the library easier, but the cost is increased compilation time.
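A forward declaration works whenever the header only needs pointers or references to a type; a sketch with made-up names:

```cpp
// Enemy.h (hypothetical) — only a pointer to World is stored, so a
// forward declaration replaces #include "World.h" entirely.
class World;  // forward declaration: no full definition needed here

class Enemy
{
public:
    explicit Enemy(World* world) : world_(world) {}
    World* world() const { return world_; }

private:
    World* world_;  // pointers/references don't require a complete type
};
```

Only the .cpp files that actually call into World need its full header.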

My next guess is that you're using template-heavy code, which increases compilation times.  Header-only libraries and template-heavy code go hand in hand, and both are compile-time performance killers.
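One mitigation for template-heavy code (not mentioned above, but standard C++) is explicit instantiation: instantiate the common argument sets once in a single .cpp and declare them `extern template` elsewhere, so other translation units skip the codegen. A sketch with a made-up function:

```cpp
// Math.h (hypothetical): a template that would normally be instantiated
// again in every translation unit that uses it.
template <typename T>
T clamp01(T v)
{
    return v < T(0) ? T(0) : (v > T(1) ? T(1) : v);
}

// In widely-included headers, declare (suppresses per-TU instantiation):
//   extern template float clamp01<float>(float);
// and in exactly one Math.cpp, instantiate it once:
//   template float clamp01<float>(float);
```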

Sometimes you just need to go back to the olden days.  When I was young, compilation could take hours between when you submitted the job at the card reader and when you got the results back at the line printer.  The best time-saving technique was to make sure your code was correct in the first place, a practice called 'desk checking'.  It seems modern technology is expanding to consume all of the time- and labour-saving convenience it has introduced.

##### Share on other sites
1 hour ago, JoeJ said:

A helpful work around is to make certain files compile faster by using debug build, even if release build is selected.

Thanks!  I set the optimization flag (only) of the most-frequently-changed project to false, and it now takes 25-30 seconds.

There is a noticeable runtime performance hit, but it is acceptable.      Thanks again, JoeJ!!

##### Share on other sites
6 hours ago, hyyou said:

In my most complex gameplay source file (single .cpp) ... whenever I edit it (e.g. add a single space character), it needs to recompile 40 seconds. ... Should I just buy a new computer or more RAM?

8GB of RAM is a lot. So is 3.2 GHz.

##### Share on other sites
13 hours ago, Acosix said:

8GB of RAM is a lot. So is 3.2 GHz.

For a dev machine in 2019? Not really. A 3.2GHz i5 is fine, but not excessive. 8GB is definitely on the low side.

That said, for a 230-line file, it should be fine.
