
# Fast Percentage


## Recommended Posts

This is a really simple question. I'm trying to get an accurate percentage without using floating point. This is how I would calculate a percentage with floats:

`fcurrent * (100.0 / fmaximum)`

with fcurrent and fmaximum being FLOATs or DOUBLEs. Is this a crappy way to accomplish this? Is there a better way?

Anyway, this is how I've attempted doing it without floats:

`(current * (25600 / maximum)) >> 8`

It's the same deal, but I'm making my own fixed-point system with the lower byte. I'm not a math wiz, so criticism will not harm me. I hate that I have to divide to accomplish this, but I know no other way. Thanks for any advice!

EDIT - I forgot to mention that maximum will be in the range of 0 - 9999. The higher it is, the less accurate my percentage, right? Hmm, would it be safe to use the entire lower 16 bits for the fixed point?

[edited by - Jiia on March 18, 2004 2:59:24 AM]

##### Share on other sites
You really shouldn't be bothering yourself about performance until you measure that the thing you want to optimize is a performance bottleneck. I really can't imagine why percentages would be a bottleneck.

Besides, if you have computed 100.0 / fMaxValue once, then computing the percentage is just a matter of a single FPU operation; it's hard to be faster than that. You can probably do billions of those per second on a modern CPU.

##### Share on other sites
It's more about removing casts to FLOATs than improving speed or performance. My coordinate system is made up of regular 32-bit integers. To get a percentage, it would look like this:

perc = (LONG) FLOAT(current) * (100.0 / FLOAT(maximum))

Which is really ugly. And sometimes the compiler doesn't want to calculate the values in the correct order, so I have to do this as well:

perc = (LONG) FLOAT( FLOAT(current) * (100.0 / FLOAT(maximum)) )

To show it that I want the value calculated as a FLOAT, *then* cast. I really wanted to get rid of all those casts.

By the way, it won't help me to store the 100.0 / maximum calculation. The whole calculation is done for thousands of different values, and never with the same maximum number twice.

##### Share on other sites
Have you tried:

percent = (long)(100.0 * fcurrent / fmaximum)

That way the whole expression is computed in floating point and truncated just once at the end.

Otherwise, use some rounding function.

##### Share on other sites
There's an old trick that's not very accurate, but pretty damn fast.

First, rather than using 100 as your divisor (x*y/100), you pick a power of 2, so that we can use bit shifting rather than division. For the purpose of this demonstration, I'm going to use 1024. Next, I need a lookup table containing 100 values, 0-99. The 0 isn't necessary, but that's a minor detail. The values in this table correspond to percentages of 1024. So, naturally, lookup[50] should yield 512, and lookup[1] would be 10.

Result = Number * Lookup[Percent] >> 10;

Now, as for the reverse, to find what percent one number is of another, you can't get out of the division, but you can drop the multiplication like so:

Result = (Number << 10)/Other;

That will tell you what fraction Number is of Other, but in terms of 1024ths. If you remember that we still have a sorted lookup table, you could binary-search it for your number, though by then we've created a bigger bottleneck than we had originally, so I'd recommend just keeping the number in terms of 1024 and adjusting the math accordingly.

[edited by - inmate2993 on March 18, 2004 2:16:49 PM]

##### Share on other sites
Fair enough that you can make calculating percentages faster with some bit shifting. But if calculating percentages is the actual bottleneck of your engine, then you'd better have an E3 booth, because your game is going to be a thousand times better than Doom 3 or Half-Life 2.

Spending your time making simple calculations like this faster is a total waste of time for any game I can think of. The odds that something like this is a performance bottleneck are about zero. Assuming this is a game optimization, of course...

-me

[edited by - Palidine on March 18, 2004 2:22:49 PM]

##### Share on other sites
unless he is doing a percentage game

##### Share on other sites
In which case he should stop right now hehe.

##### Share on other sites
inmate2993 -> That's a really clever way to get rid of the divide.

Palidine -> I don't care much about how fast it is, I just want to avoid using FLOATs.

I'm not sure what everyone has against optimization. Perhaps I want to allow gamers too poor to afford decent computers to play my game. Is that a big deal? Maybe every clock tick counts.

By the way, once a game is complete, there are going to be billions of calculations per second going on while you're playing. If you want to toss everything into the "no need to optimize" bin, you're going to have a damn slow game. Then even if you profile the game, your level/map loads and anything else that only happens at a user level is going to crawl like mud. But then, most professional games suffer the same fate.

##### Share on other sites
quote:
Original post by Jiia
I'm not sure what everyone has against optimization. Perhaps I want to allow gamers too poor to afford decent computers to play my game. Is that a big deal? Maybe every clock tick counts.

The reason everyone is "against optimization" is that you're worrying about low-level optimizations, when there is generally far more mileage in algorithmic improvements. Quite often you can save many thousands of cycles by researching and using a better algorithm, compared to the odd hundred you might pick up from low-level tweaks.

Consider sorting a list. Suppose you were using a bubble sort: you could try to speed it up by streamlining the compare and the position-swapping code, and perhaps halve the execution time (and probably make the code unreadable at the same time... but that's another story). Or you could choose a better algorithm, for example quicksort, whose comparative improvement will be huge because you carry out far fewer iterations to sort the list, and whose effect is disproportionately noticeable on longer lists.
