I'm wondering how to implement something like Unity's Projectors using OpenGL or DirectX.
I know how to do projective texture mapping, but in the implementations I've seen, the projected texture is passed as an additional texture to the shader - so it's essentially multitexturing. If I were to render a scene like the one in the attached screenshot, I would pass the grass texture and one "blob" texture to my shader and render with those.
Unity, however, supports an arbitrary number of projectors. The scene in the attached screenshot shows a terrain with 16 projectors. But Unity supports far more than that - I could easily use 100 projectors, and all of them would be rendered, correctly projected in perspective onto the terrain, including correct occlusion when a "blob" is projected, for example, across a mountain ridge. The blobs always conform perfectly to the underlying geometry.
So Unity obviously uses a different approach than the one I know, which is passing all the projected textures (and their texture matrices, etc.) to the shader.
How could I implement the same functionality using OpenGL or DirectX?