I am curious whether there is an easy way to set a texture as the background of the framebuffer that the 3D scene is rendered into.
To be more specific:
I want to take frames from my webcam and draw my 3D scene (which consists of particles in black space) on top of them. I have experimented with framebuffer objects and render-to-texture techniques. At the moment I create a framebuffer object and attach two textures to it: I render my 3D scene into one texture, and I copy the webcam frame into the other. I thought I could do something with glBlitFramebuffer(), but unfortunately it just copies one texture to the other.
I also thought I could somehow work with stencil buffers, because I just need to punch out the black space of my 3D particle scene and draw it onto my webcam frame, but I haven't found any helpful resources on this topic so far.
I would think the easiest (and fastest) way would be to put the background image on a quad and render it with an orthographic camera before rendering the 3D components of the scene in perspective. Unless I am not fully understanding what you are trying to do.
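A minimal sketch of that idea, assuming a GL context, a fullscreen quad VAO, and a trivial shader that samples the webcam texture (all names here — `backgroundShader`, `backgroundTex`, `fullscreenQuadVAO`, `drawParticles` — are hypothetical):

```cpp
// Sketch: draw the background quad first with depth writes off,
// then render the particle scene over it with normal depth testing.
void renderFrame() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // 1) Background: fullscreen quad, no depth writes so it can never occlude the scene.
    glDisable(GL_DEPTH_TEST);
    glDepthMask(GL_FALSE);
    glUseProgram(backgroundShader);           // samples the webcam texture
    glBindTexture(GL_TEXTURE_2D, backgroundTex);
    glBindVertexArray(fullscreenQuadVAO);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // 2) Scene: particles rendered in perspective, depth test back on.
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);
    glUseProgram(particleShader);
    drawParticles();                          // hypothetical scene draw call
}
```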
Wow, blitting the framebuffer before drawing the scene works fine... It's far easier than I thought.
I didn't even try this before because I assumed all the color values would get mixed up once there was already a background in place.
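For reference, the blit described above might look something like this — a sketch, assuming `webcamFBO` has the camera frame attached as its color buffer, and `width`/`height` match the window size (all names hypothetical):

```cpp
// Copy the webcam frame into the default framebuffer's color buffer,
// then draw the 3D scene over it with depth testing as usual.
glBindFramebuffer(GL_READ_FRAMEBUFFER, webcamFBO);   // source: webcam frame FBO
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);           // destination: the screen
glBlitFramebuffer(0, 0, width, height,               // source rectangle
                  0, 0, width, height,               // destination rectangle
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
glBindFramebuffer(GL_FRAMEBUFFER, 0);

glClear(GL_DEPTH_BUFFER_BIT);  // clear depth only, keep the blitted color
drawScene();                   // hypothetical: renders the particles on top
```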
Yes, as long as you keep in mind that both DirectX and OpenGL are rasterizers with a depth buffer, you will be fine.
For example, when you do cubemapping for the background, just write gl_FragDepth = 1.0 in the shader. That way ALL background fragments sit _exactly_ at the far plane and will always be the "furthest away pixel". (Keep the depth test enabled but set glDepthFunc(GL_ALWAYS) so every fragment passes, and leave depth writes on — note that simply disabling GL_DEPTH_TEST would also disable depth writes.)
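A sketch of such a cubemap background fragment shader (the names `texDir` and `skybox` are illustrative):

```glsl
#version 330 core
in vec3 texDir;                 // direction interpolated from the skybox vertices
uniform samplerCube skybox;
out vec4 fragColor;

void main() {
    fragColor = texture(skybox, texDir);
    gl_FragDepth = 1.0;         // push every background fragment to the far plane
}
```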
Just remember though: you can do fullscreen operations before and after your scene, but almost never in between, because that would interfere with the actual geometry.
so, the usual setup is: background -> geometry -> alpha-blended geometry -> postprocessing
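Put together, one frame of that usual setup might be structured like this — a sketch, with the draw functions as placeholders:

```cpp
// Typical per-frame ordering: fullscreen work only before and after the geometry.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

drawBackground();        // fullscreen pass, gl_FragDepth = 1.0 or depth writes off

glEnable(GL_DEPTH_TEST); // opaque geometry with normal depth testing
drawOpaqueGeometry();

glEnable(GL_BLEND);      // translucent geometry last, sorted back to front
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);   // read depth, but don't write it for transparent objects
drawAlphaBlendedGeometry();
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);

runPostprocessing();     // fullscreen passes over the finished scene
```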