
[DX9] Execute pixel shader branch on every N-th frame



#1 Meltac   Members   -  Reputation: 283


Posted 11 January 2014 - 07:56 AM

Good day!

 

I have a post-process pixel shader that is executed by the given game engine (X-Ray 1.0, but that's not the topic here). I want to split the shader into two different branches, each of them called alternately on every second frame. The engine passes a timer value to the shader, but that's about all.

 

So is it possible to either measure the current FPS and use the given timer value to calculate when one branch or the other has to be executed, or even to configure the shader somehow so that it conditionally executes a specific function (or branch) only on every N-th frame?




#2 cozzie   Members   -  Reputation: 1445


Posted 11 January 2014 - 10:44 AM

You could define a few techniques, one calling the PS you only need on every n-th frame, and then determine on the CPU/application side when to call which technique.
(Other than techniques, you could also define 2 different PS.)

Depending on several other factors, this might be quicker than branching within one PS.
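
Roughly, in .fx terms it could look something like this (just a sketch to illustrate the idea; the shader functions, technique names and the CPU-side call are placeholders, not anything X-Ray specific):

// Two techniques sharing the same vertex shader; the application picks one per frame.
technique EveryFrame
{
    pass P0
    {
        VertexShader = compile vs_3_0 MainVS();
        PixelShader  = compile ps_3_0 CheapPS();
    }
}

technique EveryNthFrame
{
    pass P0
    {
        VertexShader = compile vs_3_0 MainVS();
        PixelShader  = compile ps_3_0 ExpensivePS();
    }
}

// CPU side (D3DX effect framework), roughly:
// pEffect->SetTechnique((frameCounter % N == 0) ? "EveryNthFrame" : "EveryFrame");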

#3 Meltac   Members   -  Reputation: 283


Posted 11 January 2014 - 02:48 PM

I wouldn't have asked for an HLSL way if I had the option to do this on the application/CPU side.

 

The issue is that I can ONLY access the HLSL pixel (and vertex) shaders, while the game engine (= the application) itself is basically a black box. So I need a way of doing this in HLSL.

 

If I had, for example, a function that gave me the current frame rate from within the pixel shader, I could calculate everything I need purely within the pixel shader code.

 

Any ideas?


Edited by Meltac, 11 January 2014 - 02:49 PM.


#4 Adam_42   Crossbones+   -  Reputation: 2361


Posted 11 January 2014 - 02:59 PM

In that case your best option is to borrow the timer, and assume a specific frame rate (e.g. 60FPS). Code would look something like:

// Compute a "frame number" value, in the 0-1 range, which wraps every 4 frames
float test = frac(elapsedTimeInMilliseconds / ((1000.0 / 60.0) * 4.0));

if (test < 0.5)
{
    // Do Stuff
}
else
{
    // Do other stuff
}
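
If you ever need more than two branches, the same trick generalizes. A rough sketch with the same assumptions (timer in milliseconds, fixed 60 FPS); NUM_BRANCHES and the branch bodies are placeholders:

// Derive a frame index from the timer and pick one of N branches from it.
static const float FRAME_TIME_MS = 1000.0 / 60.0;
static const int   NUM_BRANCHES  = 4;

int branch = (int)fmod(floor(elapsedTimeInMilliseconds / FRAME_TIME_MS), (float)NUM_BRANCHES);

if (branch == 0)
{
    // first part of the work
}
else if (branch == 1)
{
    // second part of the work
}
// ... and so on for the remaining branches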


#5 Meltac   Members   -  Reputation: 283


Posted 11 January 2014 - 03:21 PM

 

In that case your best option is to borrow the timer, and assume a specific frame rate (e.g. 60FPS).

 

Thanks. So I have no way to determine the current FPS from within the shader? The frame rate changes heavily during gameplay (depending on level complexity, the currently applied post effects or particles, the number of light sources, and other GPU-demanding objects to be rendered, and the like). I can't just "assume" some average FPS, because that would be inaccurate in many cases and would then lead to unsynchronized cycling between the two shader branches, which in turn would cause visual delay / lag when one of the branches is executed much more often than the other.



#6 MJP   Moderators   -  Reputation: 10224


Posted 12 January 2014 - 02:04 AM

Using a timer sounds entirely unreliable. Could you maybe hijack an alpha channel somewhere and stuff a frame counter into it? If you could read it, increment it, and write it out then you would be able to determine whether you need to run your technique.
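
As a very rough sketch, and only IF the engine happens to feed the previous frame's output back in as a texture and doesn't touch the alpha channel (the sampler name here is made up), it could look like:

sampler2D s_prevFrame;   // hypothetical: the previous frame's output, if the engine exposes it

float4 main(float2 uv : TEXCOORD0) : COLOR0
{
    // Read the flag left in alpha last frame and flip it; 0 and 1 survive 8-bit quantization.
    float prevFlag = tex2D(s_prevFrame, uv).a;
    float flag     = (prevFlag < 0.5) ? 1.0 : 0.0;

    float3 color;
    if (flag < 0.5)
    {
        color = float3(1, 0, 0);   // ... branch A of the effect (placeholder) ...
    }
    else
    {
        color = float3(0, 0, 1);   // ... branch B of the effect (placeholder) ...
    }

    // Write the flag back out in alpha so the next frame can read it.
    return float4(color, flag);
}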



#7 Tom KQT   Members   -  Reputation: 1503


Posted 13 January 2014 - 03:34 AM

I cannot think of any way to access the frame number within shaders and/or to measure FPS within shaders (without being able to pass these values from the application code).

If you really can modify ONLY the HLSL shaders, then I'm afraid you won't even be able to store any useful information in the alpha channel.

 

A question - do you need it to work on alternating frames? I mean, do you strictly need it (for something like 3D glasses that use this principle to distinguish between the images for the left and right eye)?


Edited by Tom KQT, 13 January 2014 - 06:41 AM.


#8 Meltac   Members   -  Reputation: 283


Posted 13 January 2014 - 09:05 AM

Thanks, guys. I was afraid of that. And no, I don't think that I'd have any option to stuff a frame counter into an alpha channel or something like that.

 


A question - do you need it to work on alternating frames? I mean, do you strictly need it (for something like 3D glasses that use this principle to distinguish between the images for the left and right eye)?

 

Yes and no. I have more than one application where I'd need something like this. The mentioned 3D glasses support is indeed one of them, but that's not urgent yet. For now, I've thought of something else:

 

I've got a couple of post-process effects that are quite expensive in terms of GPU load, mainly ones where I sample / measure / compute several aspects of the same effect sequentially and then merge / mix the partial results into one final image. So instead of doing all of this in a single shader pass (I don't have multiple passes anyway, BTW), I thought I could sort of "split" these calculations and spread them over two or more frames. The potential flickering caused by toggling between two (or more) effects would be acceptable, in some cases even desired (think of, for example, a simulated night vision effect).

 

The idea of using frames for this purpose is just to make sure that the different parts of the effect are executed in alternating order, to avoid one part being executed much more often than the other, which would cause weird visual lagging.

 

So, if using frames is not a possible or suitable way of doing such a "split", what other options do I have (besides the most obvious one, distinguishing between even and odd pixels)?
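
(For clarity, by even/odd pixels I mean something along these lines in ps_3_0 - the output values are just placeholders:)

float4 main(float2 uv : TEXCOORD0, float2 vpos : VPOS) : COLOR0
{
    // Checkerboard: the parity of (x + y) picks the branch per pixel.
    bool checker = fmod(vpos.x + vpos.y, 2.0) < 1.0;

    if (checker)
    {
        return float4(1, 0, 0, 1);   // ... one half of the effect (placeholder) ...
    }
    else
    {
        return float4(0, 0, 1, 1);   // ... the other half (placeholder) ...
    }
}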



#9 Tom KQT   Members   -  Reputation: 1503


Posted 14 January 2014 - 02:03 AM

Alternating at the pixel level (either even/odd lines or a checkerboard pattern) doesn't sound so bad - did you give it a try?

Then there's just the timer, as was already suggested above. The advantage over frame-by-frame alternation is that the flickering would stay at roughly the same speed, independent of the framerate (which could be desirable in your night vision example). The disadvantage is that you would probably have to swap the individual shaders more slowly, to make sure that each shader gets at least one frame to be shown at all.
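
Just to illustrate, a slower timer-driven toggle could look like this (same assumption about the timer name and units as in Adam_42's snippet above; the period is something you'd have to tune):

// Each branch is shown for HALF_PERIOD_MS, so even at low framerates it gets several frames.
static const float HALF_PERIOD_MS = 100.0;   // ~6 frames at 60 FPS, ~2 frames at 20 FPS

bool firstHalf = frac(elapsedTimeInMilliseconds / (2.0 * HALF_PERIOD_MS)) < 0.5;

if (firstHalf)
{
    // ... first part of the effect ...
}
else
{
    // ... second part of the effect ...
}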



#10 Meltac   Members   -  Reputation: 283


Posted 15 January 2014 - 04:19 AM


Alternating at the pixel level (either even/odd lines or a checkerboard pattern) doesn't sound so bad - did you give it a try?

 

Yes I did, and it works technically flawlessly, as expected. However, it makes the image look entirely pixelated, rough and unpolished because the resolution is quartered, so I'd need at least some sort of Gaussian blur to make it look "smooth" again - which would make performance drop drastically and is therefore not an option here (especially because it would result in the opposite of what I'm intending, namely to decrease GPU load).

 

So I think, despite all the downsides, the best option might be to use the timer approach and try to find a well-balanced frequency for toggling between the partial effects, to make sure both of them get executed equally often. I could still provide a fallback using the mentioned checkerboard approach for users with low frame rates, where the timer approach wouldn't look good.

 

Thanks guys, I appreciate your help.


Edited by Meltac, 15 January 2014 - 04:21 AM.




