I have a number of complex sprites made up of very simple parts that don't change often, and redrawing those parts in real time is a big waste of cycles. So I'm trying to push the entire sprite onto a texture and use that texture as the single rendering call for the sprite.
This all works, except that whenever I do the copy, the texture I get back is the sprite on a black background.
That makes sense: I'm clearing the buffer, drawing the entity, then calling glCopyTexImage2D() straight into the texture. Naturally I get whatever's in the gaps, which in this case is my clear color, black.
I obviously can't have big black squares following my sprites around, but I also can't get them into a texture without rendering them somewhere first. Is there a way to render directly to a texture, or to specify that I want the untouched portions of the texture to have zeroed alpha?
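One common approach along these lines (my sketch, not from the thread; `draw_sprite_parts` is a hypothetical stand-in for the existing per-part renderer) is to clear with an alpha of zero, so the untouched texels copy out fully transparent:

```c
/* Sketch: cache a sprite into an RGBA texture whose empty texels
 * are fully transparent. Assumes a bound GL context whose
 * framebuffer actually has alpha bits. */
GLuint cache_sprite(int w, int h)
{
    GLuint tex;

    /* Clear the color buffer, *including alpha*, to zero: the gaps
     * between the parts become (0,0,0,0) instead of opaque black. */
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    draw_sprite_parts();   /* render the simple parts once */

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* GL_RGBA internal format so the alpha channel survives the copy. */
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, w, h, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
```

Note this only works if the framebuffer itself has alpha bits (e.g. an RGBA8 pixel format was requested at context creation); otherwise the copied alpha is always 1.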
Note: I tried a real hack that involved clearing to an ugly purple (1, 0, 1, 1) and running a 'replacement' shader that discarded any fragment that came out purple when drawing the sprite's cached texture, but that left a purple 'border' around everything, and the shader never quite worked correctly anyway.
Copying framebuffer to texture - with alpha
The purple border is probably from anti-aliasing in the image. If you look at the texture in an image editor, more than likely you'll see colors other than the clear color around the edges. The only other issue I can think of would be that you're sampling a lower mip level.
You were absolutely right. I'd made sure my rendering didn't use AA, but the source images did indeed have soft edges. After some quick tweaks to them, it works fairly well. D'OH.
I still don't like running every sprite through the equivalent of a replacement filter every cycle, but it still beats drawing a few hundred quads to make one sprite.
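For what it's worth, once the cached texture carries a real alpha channel, no replacement shader should be needed at all; ordinary alpha blending does the keying for free. A sketch, assuming the fixed-function immediate-mode style the rest of the thread implies (`sprite_tex`, `x`, `y`, `w`, `h` are placeholders):

```c
/* Draw the cached sprite texture as a single textured quad.
 * Standard alpha blending hides the transparent gaps. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, sprite_tex);  /* texture from the copy step */

glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(x,     y);
    glTexCoord2f(1, 0); glVertex2f(x + w, y);
    glTexCoord2f(1, 1); glVertex2f(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2f(x,     y + h);
glEnd();
```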
Wouldn't it make more sense to use render-to-texture instead of rendering to the framebuffer and then copying the bits into a texture? That seems like an unnecessary middle step.
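Render-to-texture with a framebuffer object looks roughly like this (a sketch of the standard FBO path; everything other than the GL calls, including `draw_sprite_parts`, is a hypothetical name of mine):

```c
/* Sketch: render the sprite parts straight into a texture via an
 * FBO, skipping the back-buffer copy entirely. Assumes a GL context
 * with framebuffer-object support. */
GLuint render_sprite_to_texture(int w, int h)
{
    GLuint tex, fbo;

    /* Allocate an empty RGBA texture to render into. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Attach the texture as the FBO's color buffer. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    glViewport(0, 0, w, h);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);  /* transparent clear */
    glClear(GL_COLOR_BUFFER_BIT);
    draw_sprite_parts();

    glBindFramebuffer(GL_FRAMEBUFFER, 0);  /* back to the window */
    glDeleteFramebuffers(1, &fbo);
    return tex;
}
```

An extra advantage here is that the sprite's texture size isn't limited by the window size, which glCopyTexImage2D() is bound by.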