how important are variable types for efficiency?

Started by hoihoi8
8 comments, last by rip-off 18 years, 3 months ago
Hello, I'm currently coding my game and am looking to optimize it a little. Most of the variables are ints, but the numbers contained in these ints are relatively small (< 30,000). I'm going to go through and change eligible ints to shorts or bools, but I was wondering: 1. Does this really add efficiency in the scope of an entire PC game? 2. Is there a type between short and bool?
That is a bad idea: the int type exists because coders needed a type that represents what the CPU works with most efficiently, so in almost all cases operations on int will be at least as fast as operations on short. There is no guarantee that short is exactly 16 bits, so check that before relying on it; likewise you can't be formally sure an int is big enough, but on any CPU released in the last 20 (maybe exaggerating a little) years you can be almost certain it is. Between short and bool there is the type char, which represents one byte (8 bits on most current CPUs). About the only reason to use short is to make the type smaller.
The only places you should use smaller integers are where they make sense from a programmer's-understanding point of view (bool), or where they have a noticeable effect on cache performance, such as an array you will iterate over that only needs to store values in the range 0..255 or similar.
Or if they are required to fill in a file structure or a communication protocol.
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." — Brian W. Kernighan
I guess my assumption that smaller variables = faster code was wrong. Can anyone give a good argument for turning many of my ints into shorts when coding a game? Thanks for the replies.
Quote:Original post by hoihoi8
I guess my assumption that smaller variables = faster code was wrong. Can anyone give a good argument for turning many of my ints into shorts when coding a game? Thanks for the replies.


If you have many, many ints in an array, and shorts are fewer bytes than ints on your platform, then depending on how they're used you may see a speed-up because less memory is being accessed (but you may also see a slow-down because they're smaller than the native type).

For what it's worth, this is the usual justification for using floats in 3D even though the usual advice is "use double unless you have a good reason not to" (hmm... should you default to long double in C99?).
One good argument could be that you want to send the data over a network, so instead of sending 32 bits you would send 16 (on a normal PC; sizeof(int) and sizeof(short) may differ on other platforms). In most cases I think it would be faster to use int everywhere and convert to short just before sending. You might even be able to use a hack (if you can make assumptions about your compiler) like this:
union {
    int i;
    short s;
};

I could imagine this working on most compilers. Also, if your target platform doesn't have a lot of memory but has plenty of processing power, short might be the better choice. I would say use int until you find a problem in a particular piece of code, then try replacing some ints with shorts and see if you get a performance improvement (not enough to see visually, of course, but you might be able to measure the difference with a profiler).
Quote:Original post by Anonymous Poster

If you have many, many ints in an array, and shorts are fewer bytes than ints on your platform, then depending on how they're used you may see a speed-up because less memory is being accessed (but you may also see a slow-down because they're smaller than the native type).


I'm not a compiler or memory-management expert, so I guess I'll just leave my ints the way they are for now until I get more experience with the subject.

Quote:Original post by CTar
I could imagine this working on most compilers. Also, if your target platform doesn't have a lot of memory but has plenty of processing power, short might be the better choice. I would say use int until you find a problem in a particular piece of code, then try replacing some ints with shorts and see if you get a performance improvement (not enough to see visually, of course, but you might be able to measure the difference with a profiler).


I'll keep this in mind if part of my program is slowing things down. I was trying to do general optimizing at the moment, and haven't measured particular parts of the code to find the bottlenecks yet.

Thanks!

Quote:Original post by hoihoi8
I'll keep this in mind if part of my program is slowing things down. I was trying to do general optimizing at the moment, and haven't measured particular parts of the code to find the bottlenecks yet.

Thanks!


Wrong way to do it. The type of optimisation you are doing right now will (probably) have little effect. The chances are that most of your CPU time is spent doing one or two things. In my game, those things are drawing and collision detection. Optimisations that cut down the number of objects drawn and the number of objects attempting collision detection will do far more for me than changing my data types.

My advice: find those one or two things. See what high-level optimisations you can do before being forced to make the code unclear with low-level ones.

Good luck!

This topic is closed to new replies.
