# What's the fastest way to bucket-sort pointers?


## Recommended Posts

I have shaders (only 12 of them so far, but that number will grow) and entities.

Every entity holds a pointer to a shader.

I want to bucket the entities into groups by shader.

I used std::map, but this seems slow. Is there a faster way?

##### Share on other sites
You'll need to clarify, I can't understand what you're asking.

##### Share on other sites

[quote name='thedodgeruk']
I have shaders (only 12 of them so far, but that number will grow) and entities. Every entity holds a pointer to a shader. I want to bucket the entities into groups by shader. I used std::map, but this seems slow. Is there a faster way?
[/quote]

I have entities (i.e. models), and they all hold shader pointers so that each one references its particular shader.

I want to sort all the models into separate vectors, for fast access. They need to be sorted by the shader pointers.

##### Share on other sites
And std::map<shader*, std::vector<model*>> (using smart pointers where appropriate) is insufficient?

Is it too slow to populate, to iterate over, to search through?

##### Share on other sites

[quote name='Telastyn']
And std::map<shader*, std::vector<model*>> (using smart pointers where appropriate) is insufficient?

Is it too slow to populate, to iterate over, to search through?
[/quote]

Tried that; it was way too slow.

I had to reconfigure my engine to use enums. Got the speed now, though.

##### Share on other sites
Did you profile your code to see what was slow? What exactly do you mean by "using enums"? How would it gain you speed?

##### Share on other sites
Just sort them all into the one vector. Your "buckets" are then different ranges within that vector.

##### Share on other sites

[quote name='Telastyn' timestamp='1319137120' post='4874788']
And std::map<shader*, std::vector<model*>> (using smart pointers where appropriate) is insufficient?

Is it too slow to populate, to iterate over, to search through?

Tried that; it was way too slow.

I had to reconfigure my engine to use enums. Got the speed now, though.
[/quote]

enums aren't any smaller or easier to hash than pointers. If you're not using pointers and are copying your entire object every time... yeah, that's going to suck.

But since you won't actually tell us anything meaningful... best of luck with that.

##### Share on other sites
Maps can be slow if you don't know how to use them properly, and fast if you do. There are various tricks, like making use of swap and const references, that you need to know to use them efficiently.

Without seeing your code, my experience tells me to assume that you used them poorly, because that assumption is most often correct.

##### Share on other sites

[quote name='thedodgeruk' timestamp='1319155020' post='4874858']
[quote name='Telastyn' timestamp='1319137120' post='4874788']
And std::map<shader*, std::vector<model*>> (using smart pointers where appropriate) is insufficient?

Is it too slow to populate, to iterate over, to search through?

Tried that; it was way too slow.

I had to reconfigure my engine to use enums. Got the speed now, though.
[/quote]

enums aren't any smaller or easier to hash than pointers. If you're not using pointers and are copying your entire object every time... yeah, that's going to suck.

But since you won't actually tell us anything meaningful... best of luck with that.
[/quote]

Erm, I need to sort my entities so that I have fewer state changes on the GPU. So I need to bucket-sort all my entities by shader pointer; when that's done I have one bucket for all entities with the plain-colour shader, another for plain-texture, another for Phong, another for normal-mapping, etc.

I ran an analysis, and with the map it was saying the slowest thing in my engine was iterating through the map once I had collected all my info into the buckets.

##### Share on other sites
Use assembly language and stop screwing around. If you want speed, size, or both, come to the dark side @ asmcommunity.net.
No, we do not support malicious stuff; we are good people who help each other, and welcome novices and experts alike.
I was forced to program in C and C++ all this year, and I learned a few things: MSVC is crap, I like Code::Blocks, and so on.

##### Share on other sites
How were you profiling? Were you profiling a Debug or Release build? If iterating through a 12 element std::map was the most expensive thing in your "engine", then you mustn't be doing a lot of work elsewhere in your program.

Can you show us some code? Maybe you are making a minor mistake that ends up doing unnecessary work.

For small numbers of keys, a map has a lot of constant and hidden* overheads. It is only when the number of keys is large that you see the benefits. I agree with Hodgman, I think a sorted linear contiguous structure like std::vector<> would be much more efficient, and not too hard to code.

\* Hidden overhead includes the cost of cache misses and allocations, which is ignored by big-O analysis.

##### Share on other sites
A map (i.e. a balanced binary tree) of vectors is total overkill. Implementing it in assembly also won't help, as the inefficiency is in the algorithm/data structure, not the implementation.

All you need is one `std::vector` plus `std::sort` (or a custom radix sort if you've got thousands of entities and want that little bit of extra speed).