Hi there!
The rendering part of my current project consists of a very, very basic renderer for a 3D scene:
- I feed a list of triangles to a vertex shader working with position, colour and UV.
- The vertex shader uses a constant buffer holding a world matrix and a viewproj matrix.
- The pixel shader uses the VS output, samples the texture, outputs a pixel, and that's it.
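In HLSL terms, I picture the pipeline above roughly like this (a sketch with my own guessed names, buffer packing and semantics, not the actual code):

```hlsl
cbuffer PerObject : register(b0)
{
    float4x4 World;
    float4x4 ViewProj;
};

struct VSIn  { float3 pos : POSITION; float4 col : COLOR; float2 uv : TEXCOORD0; };
struct VSOut { float4 pos : SV_POSITION; float4 col : COLOR; float2 uv : TEXCOORD0; };

VSOut VSMain(VSIn input)
{
    VSOut o;
    o.pos = mul(mul(float4(input.pos, 1.0f), World), ViewProj);
    o.col = input.col;
    o.uv  = input.uv;
    return o;
}

Texture2D    tex  : register(t0);
SamplerState samp : register(s0);

float4 PSMain(VSOut input) : SV_TARGET
{
    // No lighting: just the sampled texel (here modulated by vertex colour,
    // which is an assumption on my part).
    return tex.Sample(samp, input.uv) * input.col;
}
```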
It currently has no lighting effects whatsoever, but it works as intended.
Aside from that, I have a different rendering system for sprites and text used by the GUI, which also works as intended.
The sprite rendering system handles dynamic buffer management and optimization, so the code that sets everything up for a given frame is already quite intricate. Maybe that's what prevents me from seeing a weird bug I have now, for which I need your insights.
See, I was thinking about moving towards a deferred renderer now (and dealing with advanced effects such as lighting once I'm there).
So in preparation for this work, I've set up a custom render target for my 3D scene and render to it instead of the backbuffer. To test this step, I set up a sprite covering the whole screen, using that new render target as its texture input, and it also worked as intended, blitting the same scene to my screen that I had before (yeah, lots of smileys, but this simple stuff got me quite happy already).
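For context, the off-screen target is created along these lines (a sketch with assumed variable names and format, error handling omitted):

```cpp
// The texture needs BOTH bind flags: we render the scene into it,
// then sample it as a texture in the fullscreen pass.
D3D11_TEXTURE2D_DESC td = {};
td.Width            = width;
td.Height           = height;
td.MipLevels        = 1;
td.ArraySize        = 1;
td.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
td.SampleDesc.Count = 1;
td.Usage            = D3D11_USAGE_DEFAULT;
td.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D*          sceneTex = nullptr;
ID3D11RenderTargetView*   sceneRTV = nullptr;  // bound via OMSetRenderTargets
ID3D11ShaderResourceView* sceneSRV = nullptr;  // bound via PSSetShaderResources

device->CreateTexture2D(&td, nullptr, &sceneTex);
device->CreateRenderTargetView(sceneTex, nullptr, &sceneRTV);
device->CreateShaderResourceView(sceneTex, nullptr, &sceneSRV);
```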
Obviously, the final stage of a deferred renderer would benefit from more advanced pixel operations than what my sprite rendering system was designed to do, so my next step in porting my forward renderer to deferred mode was to remove the "blit as sprite" step, set up another renderer class using two new shaders, manually build a quad covering the screen, and use that for rendering instead of one of my 'sprite' objects. It currently uses the dumbest VS possible: it takes a float2 'POSITION' and a float2 UV as input, doesn't transform the input position in any way, and outputs a float4(input.pos, 0.5f, 1.0f) 'SV_POSITION' plus the texture coordinates (left unchanged).
And... from here, nothing works as intended at all.
The output I get is a quad, sure, but transformed in some way, seemingly by something from the viewproj matrix I use for the 3D scene (when *nothing* in the shader code uses it): its orientation changes with the same mouse inputs that would otherwise rotate the 3D view. Its texture is all messed up too... What's worse: out of ideas for the cause of this bug, I tried the same "solid yellow, untransformed" renderer I used for my very first 3D test with this engine (Direct3D 11, btw), and I got the same messed-up output (no solid yellow at all; it still seems to weirdly sample the texture).
I guess that when I find the reason for this bug, I'll laugh at myself for having skipped some very basic initialisation step between rendering the 3D scene to the render target and rendering the final deferred quad, but for the moment I've double- and triple-checked my code and found nothing...
Is it possible that constant buffers (or other resources) set up for one shader could collide with another shader that doesn't use them and mess up the shader input in some strange way? Should I reset the constant buffer slots on the device context to null before rendering with the other shader that doesn't care about them?
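In case it helps the discussion, this is the kind of reset I have in mind between the two passes (a sketch; the slot counts are examples, not my actual layout):

```cpp
ID3D11Buffer*             nullCB[1]  = { nullptr };
ID3D11ShaderResourceView* nullSRV[1] = { nullptr };

// Unbind the scene pass's constant buffers from both stages...
context->VSSetConstantBuffers(0, 1, nullCB);
context->PSSetConstantBuffers(0, 1, nullCB);

// ...and make sure the render target isn't still bound somewhere as an SRV
// before binding it as the quad's texture input.
context->PSSetShaderResources(0, 1, nullSRV);

// Then rebind everything the quad pass needs explicitly: input layout,
// vertex buffer (with the quad's own stride!), topology, shaders, sampler.
// The nuclear debugging option would be context->ClearState() followed by
// rebinding the viewport, render target and all pipeline state from scratch.
```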
Any other ideas?
Thank you all!