Reidu

compilation time


I've got a question about compilation time in C/C++. Is the time required to compile a source file roughly proportional to the size of the file? Obviously some constructs take longer to compile than others, and compilers may differ, but is this a reasonable guess? I know a little about this stuff, but I've never taken a compilers class.

Thanks,
Reid

That's not a bad assumption at a very coarse level. It's probably close enough that it would be hard to dispute for most files.

However, it's not completely accurate. Consider two files of the same byte size: one contains a single class with many methods, the other contains many classes. The one containing many classes would take longer to compile, due to the bookkeeping overhead of introducing each new type.
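A quick sketch of what that shape difference might look like (file and class names are made up for illustration):

// fileA.cpp -- one class with many methods: a single type to bookkeep.
class Gadget {
public:
    void step0() {}
    void step1() {}
    void step2() {}
    void step3() {}
};

// fileB.cpp -- roughly the same amount of text, but four distinct types,
// each of which the compiler must record in its type tables.
class Gadget0 { public: void step() {} };
class Gadget1 { public: void step() {} };
class Gadget2 { public: void step() {} };
class Gadget3 { public: void step() {} };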

Having said that, you can produce a counter-example using templates.

And all bets are off if you use template metaprogramming; that can cause a 20-line file to compile more slowly than an entire Linux kernel build :)
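To give a feel for it, here's a minimal sketch (not from the thread; just the classic compile-time Fibonacci): the recursion happens through template instantiation, so the compiler does the work rather than the running program.

#include <iostream>

// Each distinct Fib<N> instantiation is a new type the compiler must
// create and record, so raising N makes the *compiler* do the recursion.
template <unsigned N>
struct Fib {
    static const unsigned long value = Fib<N - 1>::value + Fib<N - 2>::value;
};

// Base cases, as explicit specializations.
template <> struct Fib<1> { static const unsigned long value = 1; };
template <> struct Fib<0> { static const unsigned long value = 0; };

int main() {
    // Computed entirely during compilation; the binary just prints it.
    std::cout << Fib<40>::value << std::endl;
    return 0;
}

This toy stays cheap, but metaprogramming-heavy libraries push the same mechanism hard enough that build times grow dramatically.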

Cheers,
Chris

Mhhhh, I would say it depends more on the size of the compilation unit. A file containing just a few #includes can take ages to compile because of what's in the included files. That's why it is important to reduce as much as possible the number of #includes you put in .h files, and to use forward declarations instead (see the sketch below).
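A small sketch of that idiom (the Widget/Texture names are made up):

// widget.h
// Forward-declaring Texture means anyone who includes widget.h does NOT
// also pay to parse texture.h and everything *it* includes; only
// widget.cpp needs the full definition.
class Texture;  // forward declaration instead of #include "texture.h"

class Widget {
public:
    void setTexture(Texture* texture);  // pointers and references to an
                                        // incomplete type are allowed
private:
    Texture* m_texture;
};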

From the language point of view, I believe that templates can take quite a while to compile. That's why code containing a lot of different STL structures can suddenly take ages to compile. Precompiled headers become your friend then.

This is maybe a bit off-topic, but templates usually increase compilation times quite a bit.

Hmm, all very innnnnteresting...

The reason I ask is that I'm working on a distributed compilation project, and I wanted to know whether looking at file sizes would be a reasonable heuristic for determining how to distribute compile jobs fairly. This might not even come into play, but I figured it would be good to know.

So it sounds like, #includes and templates aside, most *ordinary* code compiles at about the same speed?

Thanks for all the quick replies.
- Reid

[Edited by - Reidu on June 15, 2005 3:22:24 PM]

I would personally assume file size is a good guess. Perhaps file size after running it through the preprocessor, but that might add too much overhead to be justified.
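A rough sketch of that heuristic (assuming a Unix-like system with g++ on the PATH; popen is POSIX, not standard C++, and the names here are made up): run only the preprocessor (-E) and count the output bytes, so everything the #includes drag in is reflected in the estimate.

#include <stdio.h>

// Returns the size of 'path' after preprocessing, in bytes, or -1 on
// failure. No shell escaping is done on the file name -- illustration only.
long preprocessedSize(const char* path) {
    char command[512];
    snprintf(command, sizeof(command), "g++ -E \"%s\" 2>/dev/null", path);

    FILE* pipe = popen(command, "r");
    if (!pipe) return -1;

    long bytes = 0;
    char buffer[4096];
    size_t n;
    while ((n = fread(buffer, 1, sizeof(buffer), pipe)) > 0)
        bytes += (long)n;

    pclose(pipe);
    return bytes;
}

int main(int argc, char** argv) {
    for (int i = 1; i < argc; ++i)
        printf("%s: %ld bytes after preprocessing\n", argv[i], preprocessedSize(argv[i]));
    return 0;
}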
