Thanks, guys. I was afraid of that. And no, I don't think that I'd have any option to stuff a frame counter into an alpha channel or something like that.
A question - do you strictly need it to work on alternating frames? I mean, is the alternation itself required (for something like 3D glasses that use this principle to distinguish between the images for the left and right eye)?
Yes and no. I have more than one application where I'd need something like this. The mentioned 3D glasses support is indeed one of them, but that's not urgent yet. For now, I've thought of something else:
I've got a couple of post-process effects that are quite expensive in terms of GPU load, mainly cases where I sample / measure / compute several aspects of the same effect sequentially and then merge / mix the partial results into one final image. So instead of doing all of this in a single shader pass (I don't use multiple passes anyway, BTW), I thought I could sort of "split" these calculations and spread them over two or more frames. The potential flickering caused by toggling between two (or more) effects would be acceptable, in some cases even desired (think of, for example, a simulated night vision effect).
The idea of using the frame number for this is just to make sure that the different parts of the effect are executed in strictly alternating order, so that one part doesn't run much more often than the other, which would cause weird visual lagging.
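For what it's worth, the scheduling I have in mind would look roughly like this, as a minimal CPU-side Python sketch (the part names are made up for illustration, and the assumption is that the selected index would be handed to the shader, e.g. as a uniform, rather than "run" on the CPU):

```python
# Hypothetical sketch: pick which part of a split effect runs this frame.
# In the real renderer the selected index would be passed to the shader
# (e.g. as an integer uniform), not executed here.

EFFECT_PARTS = ["measure_part", "compose_part"]  # hypothetical part names

def part_for_frame(frame_counter, parts=EFFECT_PARTS):
    """Round-robin over the effect parts so each runs equally often."""
    return parts[frame_counter % len(parts)]

# Over consecutive frames the parts alternate strictly:
schedule = [part_for_frame(f) for f in range(4)]
# → ['measure_part', 'compose_part', 'measure_part', 'compose_part']
```

With a plain round-robin like this, no part can ever run more often than another, which is exactly the property I'm after.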
So, if using frames is not a possible or suitable way of doing such a "split", what other options do I have (besides the most obvious one, distinguishing between even and odd pixels)?
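Just to make clear what I mean by the even/odd-pixel variant, here's a rough sketch of the idea (branch names are invented; per fragment this would be something like `int(gl_FragCoord.x + gl_FragCoord.y) % 2` in GLSL):

```python
# Hypothetical sketch of the "even vs. odd pixels" split: each pixel's
# parity decides which partial effect it receives, instead of the
# frame number deciding for the whole screen.

def part_for_pixel(x, y):
    """Checkerboard split: neighbouring pixels get different parts."""
    return "part_a" if (x + y) % 2 == 0 else "part_b"

row = [part_for_pixel(x, 0) for x in range(4)]
# → ['part_a', 'part_b', 'part_a', 'part_b']
```

The drawback compared to the frame-based split is obvious: each part only ever sees half of the pixels in any given frame, so the partial results are interleaved spatially rather than computed full-screen in turn.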