An R8G8B8 scene has a range of colours that can be represented as a 256x256x256 cube. If each entry in the cube is also R8G8B8, that comes to 48MB, which is clearly totally unacceptable.
We could use a smaller cube with NEAREST sampling, but that would be like downsampling our image from R8G8B8 to R4G4B4 or so before posterising - and we really don't want to do that, because it binds neighbouring colour values together in ways we may not want to bind them: it maps 100 and 101 as if they were both 100, even though 101 might need to map to some other value entirely.
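To make the binding concrete, here's a tiny sketch - the 16x16x16 cube size is just an example I picked. With NEAREST sampling, only the top four bits of an 8-bit channel select the cell, so any two values sharing those bits are forced to posterise identically:

```c
/* Sketch: with a hypothetical 16x16x16 LUT and NEAREST sampling, only the
   top 4 bits of an 8-bit channel pick the cell, so values sharing those
   bits can never be mapped differently. */
static int lut_cell(int channel8) {
    return channel8 >> 4; /* which of the 16 cells this value lands in */
}
```

100 and 101 both land in cell 6, so no table contents can ever separate them - exactly the problem described above.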
So, I think abandoning volume textures might be a good plan. Sadly our colour values aren't normalised, so we can't do something cunning like, say, converting the colour to a 2D spherical coordinate and looking it up in a 2D texture.
We're going to need to actually have a list of poster colours and iterate through them, testing to find the one our input colour is closest to. Eugh. That's pretty simple on a GF6 or something, I guess - just loop in the pixel shader - but I'm interested in a more restricted platform *cough* which can't loop in the pixel shader and has very few instructions to work with anyway.
Let's say I can perform a magic trick which lets me reinterpret_cast<> a block of memory from framebuffer to vertex buffer. What do I get then? The ability to process colours in the vertex shader - if I feed this 'vertex colour' data in stream 1 alongside some simple position data in stream 0. Gouraud interpolation would fuck me up, but I might be able to drop into flat shading mode, in which case colour is taken from the first vertex in the primitive. So I feed in a load of quads *cough* wired up such that each colour comes through as the first vertex of a quad which precisely covers that pixel's position in the framebuffer. Hey, lookie - I've got the ability to process fragments in my vertex shader!
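A sketch of what that stream-0 position data might look like - the struct and function names here are my own invention, not any real API. One quad per pixel, wound so that the vertex carrying the pixel's colour comes first (flat shading then paints the whole quad with it):

```c
#include <stddef.h>

/* Hypothetical layout: one screen-space quad per pixel, 4 vertices each,
   with the first vertex of every quad sitting at the pixel whose colour
   (arriving in stream 1) should flat-shade the primitive. */
typedef struct { float x, y; } Pos;

/* Fill 'out' (must hold w*h*4 vertices) for a w x h framebuffer;
   returns the number of vertices written. */
static size_t build_pixel_quads(Pos *out, int w, int h) {
    size_t n = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            out[n++] = (Pos){ (float)x,     (float)y     }; /* colour source */
            out[n++] = (Pos){ (float)x + 1, (float)y     };
            out[n++] = (Pos){ (float)x + 1, (float)y + 1 };
            out[n++] = (Pos){ (float)x,     (float)y + 1 };
        }
    return n;
}
```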
Once I've got that, it should be pretty much trivial to loop through a list of 16 or so colours and find the closest one (dp3 the difference vector with itself for a squared distance). No branching instructions, but we've got slt/sge, so it should be doable. Write out the selected colour as oD0; the rasterizer generates a single pixel flat-shaded with our oD0 colour, and we're done.
This still leaves me with the problem of finding the colours to quantize to in the first place. Perhaps that should wait until tomorrow.
1. RGB -> HSL
2. Quantize S, Quantize L or set to constant values
3. Quantize H or lookup in a 1D "hue remap" texture
4. HSL -> RGB
[smile]