
Artemiye

Member Since 08 Jul 2012
Offline Last Active Aug 21 2016 02:47 PM

Posts I've Made

In Topic: Determine What Game To Make

19 August 2016 - 07:47 PM

Or am I missing something? What is your purpose for making a game, and why don't you already know what you want to work on?

 

The main purpose (aside from the obvious enjoyment) is to see the entire development cycle of a non-trivial project through to the end as the only programmer on the team, both for learning purposes and because it's a major goal I've set for myself since graduating from school.  I should probably have rephrased the last part as "...what the hell should I work on, taking into account points 1 and 2?"  I think I know what I'd want to work on if I stopped worrying about #2 so much, but I'll admit it's been hard to get over that, so I'm looking for some advice to make sure I take a step in the right direction.


In Topic: Reducing Compile/Link Times

17 March 2016 - 06:50 PM


Linking is the stage at which external references are resolved to actual memory addresses (technically, RVAs). In tightly coupled code or code that uses a lot of global state, this can take a lot of time. There's not much you can do about tightly coupled code besides refactor, but the global state problem could be resolved by moving all code requiring this global state to the same compilation unit. Please note that string literals are also, to the linker, external references (to the string table in the final executable). If the code makes heavy use of string literals, this may be somewhere to start.

 

Hmm, yeah, this is one of the major problems.  The code is tightly coupled and I won't have the time to refactor it.  There is, indeed, an enormous amount of string literals as well.  In a perfect world I'd be able to fix both of these issues, but right now I'm going to explore other less-invasive options first (writing something to tell me what should be added to the precompiled header, writing something that compiles each .cpp minus one include to report which includes can be removed, combining .cpps or splitting header files as required, and potentially implementing unity builds).
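For the global-state point above, here's a minimal sketch (the file and variable names are made up) of what pushing the definitions, and the code that depends on them, into a single compilation unit looks like, with only extern declarations left in the header:

    // globals.h -- only extern declarations are exposed here, so other
    // translation units depend on the declarations, not the definitions.
    #pragma once
    #include <string>

    namespace cfg
    {
        extern int         g_maxPlayers;
        extern std::string g_buildLabel;

        std::string DescribeConfig(); // code that uses the globals lives with them
    }

    // globals.cpp -- the single compilation unit that owns the definitions
    // and the code operating on them.
    #include "globals.h"

    namespace cfg
    {
        int         g_maxPlayers = 16;
        std::string g_buildLabel = "internal";

        std::string DescribeConfig()
        {
            return g_buildLabel + ", max players: " + std::to_string(g_maxPlayers);
        }
    }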

 


While you concentrate on reducing compile/link times now, a different strategy is to avoid recompiling completely.

If you change code there is not much of a way around it, but if you're e.g. just tweaking numbers, you could load those from a file. A much more invasive change is to add a scripting language, so you can write the logic in script.

Limited forms of logic, such as state machines, could be loaded from YAML or XML.

 

Yeah, a portion of it is data-driven, which helps, though it isn't possible to avoid recompiles completely in this scenario.
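For reference, the tweak-numbers-from-a-file idea can be as simple as this minimal sketch (the tuning.txt file and key names are hypothetical):

    // tuning.cpp -- load "name value" pairs at startup so numbers can be
    // tweaked and reloaded without recompiling anything.
    #include <fstream>
    #include <iostream>
    #include <string>
    #include <unordered_map>

    static std::unordered_map<std::string, float> LoadTuning(const std::string& path)
    {
        std::unordered_map<std::string, float> values;
        std::ifstream file(path);
        std::string name;
        float value = 0.0f;
        while (file >> name >> value)      // e.g. "player_speed 4.5"
            values[name] = value;
        return values;
    }

    int main()
    {
        auto tuning = LoadTuning("tuning.txt");
        std::cout << "player_speed = " << tuning["player_speed"] << "\n";
    }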

 


Pre-compiled headers can speed things up.

Link-time code generation is slower.

Optimization is slower.

Conditional compilation is faster.

Nothing should be in a header that doesn't have to be public for the module's API.

 

Agreed...I'm really starting to favour having barely anything in header files these days.  If I can get away with passing dependencies to stand-alone static functions (which live entirely in the .cpp file), then I do so.
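For example (a hypothetical module, just to illustrate the pattern): the header carries only the public API, and the helpers are internal-linkage functions in the .cpp, so editing them never ripples out to other translation units:

    // pathfinding.h -- the header exposes only the public API.
    #pragma once
    #include <vector>

    struct Node { int x, y; };
    std::vector<Node> FindPath(const Node& start, const Node& goal);

    // pathfinding.cpp -- helpers live here as static (internal-linkage)
    // functions, so changing them only recompiles this one file.
    #include "pathfinding.h"
    #include <cstdlib>

    static int Heuristic(const Node& a, const Node& b)
    {
        return std::abs(a.x - b.x) + std::abs(a.y - b.y);
    }

    std::vector<Node> FindPath(const Node& start, const Node& goal)
    {
        (void)Heuristic(start, goal); // placeholder body for the sketch
        return { start, goal };
    }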

 


A unity build can also be a lot faster than a traditional compile.

If you are using CMake, there is a module that can automatically apply strategies such as precompiled headers and unity builds to your project.

 

Premake is used, and I am strongly considering taking some spare time to set up a separate Premake script that includes some unity builds.
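The unity translation unit itself is nothing exotic; it's just a .cpp that includes other .cpps (the file names below are hypothetical), so the shared headers get parsed once instead of once per file:

    // unity_render.cpp -- a hand-rolled unity (jumbo) translation unit; the
    // build compiles this one file instead of the individual listed files.
    #include "render/Device.cpp"
    #include "render/Texture.cpp"
    #include "render/Mesh.cpp"
    #include "render/Material.cpp"

The usual caveat is that file-local names (statics, anonymous namespaces, using-directives) from the merged files can now collide, so some cleanup may be needed first.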

 


If you get troubled by compile times and there are more computers available in development, some source control/build systems offer farm (distributed) compiling. It works great: it can make your large compiles multiple times faster, and it displays the progress of what is compiled and where.

 

Yep, that would be quite amazing; however, for reasons I won't get into, this won't be possible.

 

 

Thanks for the suggestions, guys.  It looks like DumpBin won't help me after all, but I should be able to survive without it.


In Topic: Reducing Compile/Link Times

12 March 2016 - 05:51 PM


But really, large projects always take a long time to compile. You might go through hours of work to shave a few seconds off the compile time.

But if you really want to do this, just "guess" which in-project header is included most often, and start by shaving that down. It'll give you the most bang for your buck.

 

Thanks for the reply, and yeah, I wouldn't find it acceptable to just say to myself "yeah, it's a large project, there's nothing I can do about it."  I also don't plan on guessing anything; rather, I need to develop a process that identifies the problem so I can solve it.
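As a first step in that process, something as crude as the following sketch (C++17; the "src" directory is an assumption) can rank project headers by how often they're pulled in, instead of guessing:

    // include_census.cpp -- count how often each project header is #included
    // across the source tree, so the widest-reaching headers stand out.
    #include <algorithm>
    #include <filesystem>
    #include <fstream>
    #include <iostream>
    #include <map>
    #include <string>
    #include <utility>
    #include <vector>

    namespace fs = std::filesystem;

    int main()
    {
        std::map<std::string, int> counts;

        for (const auto& entry : fs::recursive_directory_iterator("src"))
        {
            const auto ext = entry.path().extension();
            if (ext != ".cpp" && ext != ".h" && ext != ".hpp")
                continue;

            std::ifstream file(entry.path());
            std::string line;
            while (std::getline(file, line))
            {
                // Only count project includes of the form: #include "Foo.h"
                const auto start = line.find("#include \"");
                if (start == std::string::npos)
                    continue;
                const auto nameBegin = start + 10;
                const auto nameEnd   = line.find('"', nameBegin);
                if (nameEnd != std::string::npos)
                    ++counts[line.substr(nameBegin, nameEnd - nameBegin)];
            }
        }

        // Print headers sorted by how often they are pulled in.
        std::vector<std::pair<std::string, int>> sorted(counts.begin(), counts.end());
        std::sort(sorted.begin(), sorted.end(),
                  [](const auto& a, const auto& b) { return a.second > b.second; });
        for (const auto& [header, count] : sorted)
            std::cout << count << "\t" << header << "\n";
    }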

 


You might also need to be a bit more careful about how you structure your headers. A header that is included everywhere will need to get recompiled, which will force the files that include those headers to recompile. Luckily it won't go too deep. But it makes a significant difference when you edit something like Unreal's BaseObject class and then recompile: nearly 70% of the code base is dependent on that, so that's 70% of the code base that needs to recompile.

 

Indeed.  I think it's best to avoid this as much as possible.
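One way to cut that fan-out (not something mentioned above, just a common trick) is to lean on forward declarations; a sketch with made-up types:

    // EnemySpawner.h -- a forward declaration keeps World.h (and everything
    // it drags in) out of this widely included header.
    #pragma once

    class World; // forward declaration instead of #include "World.h"

    class EnemySpawner
    {
    public:
        explicit EnemySpawner(World& world) : m_world(world) {}
        void SpawnWave(int count);

    private:
        World& m_world; // references/pointers only need the forward declaration
    };

    // EnemySpawner.cpp -- the complete World type is only needed here.
    #include "EnemySpawner.h"
    #include "World.h"

    void EnemySpawner::SpawnWave(int count)
    {
        (void)count; // ... uses m_world, so World's definition is required in this file only
    }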

 


In older projects, there are often a lot of obsolete #include lines.

We add #include lines when needed, but nobody warns us if it can be removed.



I know of someone who wrote a script that tries to take out each #include in turn, and then tests whether the code still compiles. If it does, the include can obviously be removed.

In particular, #include lines in header files can cost a lot of time.



Such a script could be a start to find candidate #include lines that may be removable.

 

Agreed.  I really like this idea...and I'm going to add it as one of the solutions to explore.
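A rough sketch of that script, written as a small C++17 tool here just to show the shape of it (the compiler command line and temp file name are assumptions to adjust for the actual toolchain):

    // include_pruner.cpp -- for each #include in the given file, write a copy
    // with that include commented out and check whether it still compiles.
    #include <cstdlib>
    #include <fstream>
    #include <iostream>
    #include <string>
    #include <vector>

    static bool Compiles(const std::string& path)
    {
        // Syntax-only check; substitute the real compiler and flags here.
        const std::string cmd = "g++ -fsyntax-only " + path;
        return std::system(cmd.c_str()) == 0;
    }

    int main(int argc, char** argv)
    {
        if (argc < 2) { std::cerr << "usage: include_pruner <file.cpp>\n"; return 1; }

        std::ifstream in(argv[1]);
        std::vector<std::string> lines;
        for (std::string line; std::getline(in, line); )
            lines.push_back(line);

        for (std::size_t skip = 0; skip < lines.size(); ++skip)
        {
            if (lines[skip].find("#include") == std::string::npos)
                continue;

            // Write a temporary copy with this one #include commented out.
            const std::string tempPath = "include_pruner_temp.cpp";
            std::ofstream out(tempPath);
            for (std::size_t i = 0; i < lines.size(); ++i)
                out << (i == skip ? "// " + lines[i] : lines[i]) << "\n";
            out.close();

            if (Compiles(tempPath))
                std::cout << "Candidate for removal: " << lines[skip] << "\n";
        }
    }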

 


Andy Firth wrote a series of blogs about his experiences in optimising the compile and link times at Bungie. Unfortunately, they were originally published on AltDevBlogADay, so they are very hard to track down:



http://web.archive.org/web/20140719071550/http://www.altdev.co/2011/09/20/codebuild-optimisation1/

http://web.archive.org/web/20140719091139/http://www.altdev.co/2011/11/04/code-build-optimization-part-2/

http://web.archive.org/web/20140719083140/http://www.altdev.co/2011/11/21/code-build-optimisation-part-3/

http://web.archive.org/web/20140719073415/http://www.altdev.co/2011/12/26/code-build-optimisation-part-4-incremental-linking-and-the-search-for-the-holy-grail/

 

I'll read all of these.  Thanks.

 


If you have link-time code generation enabled in Visual Studio, that can increase the link time by an order of magnitude in my experience. Disabling it (for all projects) made a big difference in my build time and didn't affect runtime performance much.

 

I'll try this, thank you.

 

 

 

Thanks for the replies, though I'd still like to be able to analyze each .obj (or any other generated file, as necessary) to find out what's costing me so much in link time.
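In the meantime, even just ranking the .obj files by size gives a rough first-pass idea of which translation units hand the linker the most work; a quick C++17 sketch (the "build" directory name is an assumption):

    // obj_sizes.cpp -- list .obj files under the build output directory,
    // largest first, as a rough proxy for per-unit link cost.
    #include <algorithm>
    #include <cstdint>
    #include <filesystem>
    #include <iostream>
    #include <utility>
    #include <vector>

    namespace fs = std::filesystem;

    int main()
    {
        std::vector<std::pair<std::uintmax_t, fs::path>> objects;

        for (const auto& entry : fs::recursive_directory_iterator("build"))
            if (entry.path().extension() == ".obj")
                objects.emplace_back(fs::file_size(entry.path()), entry.path());

        std::sort(objects.rbegin(), objects.rend()); // largest first

        for (const auto& [size, path] : objects)
            std::cout << size / 1024 << " KB\t" << path.string() << "\n";
    }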


In Topic: Trustworthiness of LinkedIn

09 January 2016 - 11:03 PM


And so The Nightmare Begins.

 

 /pushthreadhijack

 

https://soundcloud.com/l-spiro/zeal-island

What software do you use for this?

 

 /popthreadhijack


In Topic: Support Multiple Shading Models (Deferred Rendering)

23 December 2015 - 08:24 PM

The standard solution for ambient at the moment is image-based lighting, preconvolved with (parts of) your BRDF and corrected with a LUT. See the Brian Karis Unreal 4 lighting course notes for the details that everyone is copying/refining.

 

That makes sense.  I read a portion of the course notes, and damn, now I have even more questions (though they aren't completely relevant to this thread's title, so I'll post them in a new thread).
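For anyone else landing here, the split-sum ambient specular term from those notes boils down to something like the sketch below; it's a CPU-side illustration only, and the two sampling functions are stand-ins for the prefiltered environment cubemap and the 2D BRDF LUT:

    // ibl_ambient.cpp -- split-sum ambient specular: a prefiltered environment
    // map indexed by roughness, plus a BRDF LUT indexed by (NdotV, roughness).
    #include <algorithm>
    #include <iostream>

    struct Vec3 { float x, y, z; };
    struct Vec2 { float x, y; };

    // Stand-in for sampling the preconvolved cubemap mip chain at this roughness.
    static Vec3 SamplePrefilteredEnv(const Vec3& /*reflectionDir*/, float roughness)
    {
        const float dim = 1.0f - 0.5f * roughness; // rougher = dimmer placeholder
        return { dim, dim, dim };
    }

    // Stand-in for the 2D LUT holding the preintegrated BRDF (scale, bias) terms.
    static Vec2 SampleBrdfLut(float nDotV, float roughness)
    {
        return { std::clamp(nDotV - 0.1f * roughness, 0.0f, 1.0f), 0.02f };
    }

    static Vec3 AmbientSpecular(const Vec3& reflectionDir, const Vec3& f0,
                                float nDotV, float roughness)
    {
        const Vec3 prefiltered = SamplePrefilteredEnv(reflectionDir, roughness);
        const Vec2 envBrdf     = SampleBrdfLut(nDotV, roughness);

        // specular = prefilteredColor * (F0 * scale + bias)
        return { prefiltered.x * (f0.x * envBrdf.x + envBrdf.y),
                 prefiltered.y * (f0.y * envBrdf.x + envBrdf.y),
                 prefiltered.z * (f0.z * envBrdf.x + envBrdf.y) };
    }

    int main()
    {
        const Vec3 result = AmbientSpecular({0, 1, 0}, {0.04f, 0.04f, 0.04f}, 0.8f, 0.3f);
        std::cout << result.x << " " << result.y << " " << result.z << "\n";
    }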

