About Phil123

  1. Phil123

    Determine What Game To Make

      The main purpose (aside from the obvious enjoyment) is to see the entire development cycle of a non-trivial project through while I'm the only programmer on the team, both for learning purposes and because it's a major goal I've set for myself since graduating.  I probably should have rephrased the last part as "...what the hell should I work on, taking points 1 and 2 into account?"  I think I know what I'd want to work on if I stopped worrying so much about #2, though I'll admit it's been hard to get over that, so I'm looking for some advice to make sure I take a step in the right direction.
  2. Phil123

    Determine What Game To Make

    What are some methods that you've used to determine what game you're going to make (and eventually release)?

    My main concerns are the following:

    1. Over-scoping the project and its asset requirements (specifically things I cannot do myself: art, music, and sound effects).
    2. The project ending up too similar to a game that's already been released.

    There are quite a few games I've played where I would love to take the core gameplay concepts and greatly improve upon them, but this comes back to #2 as a potential problem.  I don't want to simply clone a game (or have it labelled as such), but I'm also not confident enough in my design ability to take something and make it truly unique (yeah, I know, "unique" is being used loosely here).

    I do have some self-imposed constraints to keep the scope reasonable (2D, no online features, no major story), but this hasn't helped me narrow down what I want to make.  I'm stuck in a limbo of "I'd love to work on a game, but what the hell should I work on?"

    Thoughts?
  3. Phil123

    Reducing Compile/Link Times

      Hmm, yeah, this is one of the major problems.  The code is tightly coupled and I won't have the time to refactor it.  There is, indeed, an enormous number of string literals as well.  In a perfect world I'd be able to fix both of these issues, but right now I'm going to explore other, less-invasive options first: writing something that tells me what should be added to the precompiled header, writing something that compiles each .cpp minus one include to report which includes can be removed, combining .cpp files or splitting header files as required, and potentially implementing unity builds.

      Yeah, a portion of it is data-driven, which helps, though it isn't possible to avoid recompiles completely in this scenario.

      Agreed... I'm really starting to favour having barely anything in header files these days.  If I can get away with passing dependencies to standalone static functions (which live 100% in the .cpp file), then I do so.

      Premake is used, and I am strongly considering taking some spare time to set up a different Premake script that includes some unity builds.

      Yep, that would be quite amazing; however, for reasons I won't get into, it won't be possible.

      Thanks for the suggestions, guys.  It looks like DumpBin won't help me after all, but I should be able to survive without it.
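The "barely anything in header files" approach mentioned above can be sketched as follows. This is a single-file illustration, with comments simulating the header/source split; `Renderer`, `Widget`, and `draw` are hypothetical names, not from the thread.

```cpp
#include <cassert>

// --- widget.h (sketch) ---
// Forward-declare the dependency instead of #including its header:
// only the .cpp needs the full type, so edits to Renderer's header
// never trigger a recompile of everything that includes widget.h.
class Renderer;
class Widget {
public:
    explicit Widget(int id) : id_(id) {}
    int id() const { return id_; }
private:
    int id_;
};

// --- widget.cpp (sketch) ---
// The full definition is only needed here.
class Renderer {
public:
    int draw_calls = 0;
};

// A file-local free function that takes its dependencies explicitly;
// nothing about Renderer leaks into the header.
static int draw(Renderer& r, const Widget& w) {
    ++r.draw_calls;
    return w.id();
}
```

The same idea extends to the pimpl idiom when member data (not just free functions) would otherwise drag includes into the header.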
  4. Phil123

    Reducing Compile/Link Times

      Thanks for the reply, and yeah, I wouldn't find it acceptable to just tell myself, "it's a large project, there's nothing I can do about it."  I also don't plan on guessing anything; rather, I need to develop a process that identifies the problems so I can solve them.

      Indeed.  I think it's preferable to avoid this as much as possible.

      Agreed.  I really like this idea, and I'm going to add it as one of the solutions to explore.

      I'll read all of these.  Thanks.

      I'll try this, thank you.

      Thanks for the replies, though I'd still like to be able to analyze each .obj (or any other generated file, as necessary) to find out what's costing me so much link time.
  5. I'm currently working on optimizing a moderately sized code base with maybe 40 or so external libraries.  I've done quite a few things that have helped, but the time it takes to link is still a serious productivity problem.

     I understand what contributes to slow compile and link times in general, but I need to be able to quickly and efficiently locate the problems in code.  I don't have time to manually go through and "guess," so I've been trying to develop a process to locate the most significant problems in the code base (so I can fix them).  To that end, I've looked into using Microsoft's DumpBin tool (with VS2015).  I've tried to make sense of the data it generates, though with little immediate success so far, and on top of that, there doesn't seem to be a wealth of information on this topic.

     So, as per usual, here I am, looking for suggestions.
  6. Phil123

    Trustworthiness of LinkedIn

       /pushthreadhijack What software do you use for this?    /popthreadhijack
  7.   That makes sense.  I read a portion of the course notes, and damn, now I have even more questions (though they aren't completely relevant to this thread's title, so I'll post them in a new thread).
  8.   Sounds easy enough, I could definitely implement that, thank you.  I'd appreciate your thoughts on how ambient lighting should be handled with PBR, as I'm quite unsure about this myself.
  9. Here's the problem: I'd prefer to use deferred rendering; however, I'd also like to support multiple shading models (Strauss, Ward, Ashikhmin-Shirley, etc.).  The necessary data (position, normal, diffuse, material, etc.) is output to various textures which, as is standard, are read by a shader that creates the final picture.

     Right now, I would have to switch on the material type in this shader and use the appropriate formulas to get the result, though I've read that heavy branching like this should be avoided (and for legitimate reasons).  What's the standard solution here?

     Also, how should ambient lighting be calculated in a scene for non-Blinn-Phong shading models?  Normally you could just leverage the mesh's diffuse texture to trivially set a base "lit" value, but this won't work for shading models that are drastically different (as parts of the mesh that are lit dynamically will look very different from those lit only by ambient lighting).
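A minimal CPU-side sketch of the switch-on-material-ID approach the post describes; `GBufferSample`, `shade`, and both lighting formulas are toy stand-ins, not real shading models. On the GPU this switch is the "heavy branching" in question; the common alternatives are one fullscreen pass per material ID (masked via stencil) or tiled/clustered binning per shading model.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical G-buffer sample: what the geometry pass would write out.
struct GBufferSample {
    int   materialId;   // 0 = Lambert, 1 = glossy stand-in
    float ndotl;        // precomputed N.L for this toy example
    float albedo;
};

// Uber-shader style lighting pass: one switch per sample.
float shade(const GBufferSample& s) {
    switch (s.materialId) {
        case 0:  // Lambert diffuse
            return s.albedo * std::fmax(s.ndotl, 0.0f);
        case 1:  // toy "glossy" falloff, standing in for a second model
            return s.albedo * std::pow(std::fmax(s.ndotl, 0.0f), 8.0f);
        default: // unknown material: output black rather than garbage
            return 0.0f;
    }
}
```

With per-material passes instead, each pass runs only the one formula and the stencil test skips pixels belonging to other materials, trading branch divergence for extra passes.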
  10.   Agreed.  I try to do this as much as I can in my personal projects.  If I can design code such that it does not require obscene amounts of error checking, that's a good thing.

      Also agreed.  Unless you mean that ignoring an error is only using an assert and not an if check, in which case, that's what I'm not sure about yet, heh.

      This seems more reasonable than using if checks everywhere in some functions (in this case, internal functions).

      Yeah.  I guess my next questions would be: how do you personally decide what to do when an error has occurred?  Where do you draw the line between performance and error handling?  As an extreme example: are you really going to handle NaNs with if checks in a math/physics library?

      And this is why I'm on the fence about how I should approach error handling.  If you use even a single assert (without a matching if check to exit the function or otherwise handle the problem) to eliminate one possible code path in a debug build, your code is technically broken (or has a bug, at least) in release builds.

      Maybe what I should also ask is: in what cases should you only use asserts, use asserts and some appropriate if checks*, or use asserts and if-check* absolutely everything?  Or is the slight performance hit and the development-time cost of gracefully handling every single error worth it?

      *By if check, I mean either setting the value to something appropriate so the function can continue running, exiting the function, or deciding to crash the program (if the error is severe enough).
  11. I believe that handling the input and output errors of each and every function is absolutely vital, and that doing so helps a code base's long-term debuggability because errors are caught immediately.  However, I'm not exactly sure what the industry standard is for doing this (both from an efficiency standpoint and a maintainability standpoint, as introducing additional code paths increases code complexity).  I believe the long-term cost of assert statements is very minimal, as they don't introduce additional code paths that you have to maintain, and they're generally compiled out of release builds anyway for efficiency.  So my questions are as follows:

      1 - What errors do you if-check?  (Or more specifically...)
      2 - Do you if-check every single potentially null pointer?
      3 - Do you if-check programmer errors, or do you just leave it up to the assert to catch them?
      4 - If you believe you should if-check absolutely everything, do you think this has an impact on the maintainability and readability of the code base?
      5 - How would your answers change in a team of 5, 20, or 100 people?

      Thoughts?
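One common middle ground for questions 1-3 can be sketched as below: assert in debug builds so programmer errors fail loudly, *and* return an error code so release builds fail safely instead of crashing. `CHECK_ARG` and `Status` are hypothetical names for illustration, not an industry standard.

```cpp
#include <cassert>

enum class Status { Ok, InvalidArgument };

// Debug: the assert fires and pinpoints the broken call site.
// Release: assert compiles out (NDEBUG), but the if-check still
// exits gracefully, so the code path isn't "technically broken".
#define CHECK_ARG(cond)                                   \
    do {                                                  \
        assert((cond) && "precondition failed");          \
        if (!(cond)) return Status::InvalidArgument;      \
    } while (0)

// Example function: scale a vector so its elements sum to 1.
Status normalize(float* v, int n) {
    CHECK_ARG(v != nullptr);   // programmer error: assert + graceful exit
    CHECK_ARG(n > 0);
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += v[i];
    if (sum == 0.0f)           // data error: if-check only, no assert,
        return Status::InvalidArgument;  // since bad data isn't a bug
    for (int i = 0; i < n; ++i) v[i] /= sum;
    return Status::Ok;
}
```

The split shown (assert on preconditions the caller controls, plain if-check on data the caller doesn't) is one way to draw the line raised in questions 1-3.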
  12. Phil123

    Beautiful code - part 2 - top down design

    I'm not convinced this is beautiful code, and here's why:

    int a; for (a=0

    Declaration and initialization on two separate lines is unnecessary in this case.  In fact, declaring "int a" there is completely unnecessary, as that value will always be MAXTGTS after the loop ends unless one of those methods secretly modifies it.

    a=0; a<MAXTGTS

    Writing code like this becomes unreadable quickly.

    sstgt[a].active == 0

    I'm not sure what sstgt means.  Also, you're iterating through an entire array of objects despite some of them (likely) being inactive, and thus ignored.

    if (sstgt[a].active == 0) { continue; } if (camera_can_see_tgt(a) == 0) { continue; }

    There's some duplicate code here.

    I'm hoping this method is nested inside a class; otherwise, it means you're throwing around globals.  Though perhaps I'm missing something here.  Can you elaborate on your methods?
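A sketch of how the criticized loop might be restructured, reconstructing `Target` and `camera_can_see_tgt` from the quoted fragments (both are guesses at the original code, and the visibility test here is a stand-in):

```cpp
#include <cassert>
#include <vector>

// Hypothetical reconstruction of the structures implied by "sstgt".
struct Target {
    bool active;
    int  id;
};

// Stand-in for the real visibility test quoted in the post.
static bool camera_can_see_tgt(const Target& t) {
    return t.id % 2 == 0;
}

// The loop variable is scoped to the loop, the two early-outs are
// merged into one condition, and the container carries its own size
// instead of relying on a MAXTGTS constant.
int count_visible(const std::vector<Target>& targets) {
    int visible = 0;
    for (const Target& t : targets) {
        if (!t.active || !camera_can_see_tgt(t))
            continue;
        ++visible;
    }
    return visible;
}
```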
  13. Phil123

    Anyone tried Screeps, the MMO for programmers?

    I'd love to try this.  Though I don't know how much time I can actually sink into it :(
  14. Phil123

    Game Loop Design

      Yeah, you're right.  I think I'll interpolate between the two previous states instead of estimating where they'll be.

      Good to know.  Thank you.

      Whoops... that'll definitely be removed in my actual loop.

      Yeah, I see what you mean.

      Thanks for the input, everyone.  I now have an even clearer idea of how I'll be implementing this.
  15. Phil123

    Game Loop Design

      Yeah, I could see a networked game being interpolated in this fashion, though I feel it might become problematic without additional prediction methods.

      t is for time, and I chose the values 1 and 2 for a simple example.

      Interpolating physics-based objects is quite simple, as you will always have that data in memory and in use.  The question is what to do when some objects don't use physics-based movement and you need to interpolate those as well.  That's also simple, until you consider that some objects won't even be rendered, and therefore you shouldn't waste time saving their previous-frame data.  I'm looking for a clean solution for that.
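The interpolate-between-the-two-most-recent-states idea from this thread can be sketched as follows (`State` with a single `x` field is illustrative; a real game would interpolate full transforms):

```cpp
#include <cassert>

struct State {
    float x;  // position along one axis, for illustration
};

// alpha in [0, 1]: how far the render time sits between the last two
// fixed-timestep updates.  alpha = accumulator / fixed_dt in the usual
// fixed-timestep loop.  Note this renders slightly in the past (by at
// most one fixed step) rather than extrapolating into the future.
State interpolate(const State& prev, const State& curr, float alpha) {
    return State{ prev.x + (curr.x - prev.x) * alpha };
}
```

For the "not every object is rendered" concern, one option is to snapshot previous-frame state lazily, only for objects that passed visibility culling last frame, accepting a one-frame pop when an object first becomes visible.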