pseudomarvin

OpenGL GLSL shader crashes instead of taking longer to compute


I assumed that if a shader is computationally expensive, then execution is just slower. But running the following GLSL fragment shader crashes instead:

void main()
{
	float x = 0;
	float y = 0;
	int sum = 0;

	for (float x = 0; x < 10; x += 0.00005)
	{
		for (float y = 0; y < 10; y += 0.00005)	
		{
			sum++;		
		}
	}
	
	fragColor = vec4(1, 1, 1 , 1.0);
}

with an unhandled exception in nvoglv32.dll. Are there any hard limits on the number of steps or the amount of time a shader can take before it is shut down? I was thinking about implementing some time-intensive computation in shaders, where it would take on the order of seconds to compute a frame. Is that possible? Thanks.

Hodgman

Slow shaders (on the order of two seconds per frame) will cause Windows TDR to assume that the GPU has locked up, and it will reboot the graphics driver. Under D3D, this manifests as a "device removed" error code from the Present function. I don't know how GL reports it, but it shouldn't crash.

What GL function are you calling when the crash occurs?
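
If you want to detect the reset from GL instead of just crashing, you can ask for reset notification when you create the context and poll the reset status after each swap. A rough, untested sketch (assuming SDL 2.0.6+ and a GL 4.5 core context loaded with glad; on older GL the equivalent call is glGetGraphicsResetStatusARB from ARB_robustness):

// Untested sketch: detect a driver reset (e.g. after a Windows TDR) instead of crashing.
// Assumes SDL 2.0.6+ and a GL 4.5 core context loaded with glad (any loader works).

#include <SDL.h>
#include <glad/glad.h>
#include <cstdio>

void SetupContextAttributes()
{
    // Call before SDL_GL_CreateContext: ask for a context that reports resets
    // instead of silently dying.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_RESET_NOTIFICATION,
                        SDL_GL_CONTEXT_RESET_LOSE_CONTEXT);
}

bool GpuWasReset()
{
    // GL_NO_ERROR means the context is fine; anything else means the driver
    // was restarted and all GL objects have to be recreated.
    GLenum status = glGetGraphicsResetStatus();
    if (status != GL_NO_ERROR)
    {
        std::printf("Graphics reset detected: 0x%X\n", status);
        return true;
    }
    return false;
}

// Call GpuWasReset() right after SDL_GL_SwapWindow each frame.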

WiredCat

Adding such a small value might cause the crash too; try defining highp precision. IMO a float increment in a shader isn't meant to be that small, so try multiplying it by 10. Also, the loop doesn't actually return anything. If it's only sample code then OK, but you shouldn't calculate sum like that.


Scouting Ninja

I want to point out something that I did not know about shaders when I started, and it always caused crashes for me.

In shaders, for loops get unrolled, so that little bit of code you have there is huge once unrolled. Think about it:

10 / 0.00005 = 200,000 iterations per loop, and 200,000 × 200,000 = 40,000,000,000 iterations in total.

That is, you have more than 40,000,000,000 lines of code in that one shader. I think it would crash even before it's done building.


What Hodgman said. Plus, there is a tendency to write GPU shader code as if it were equivalent to CPU code. Though programmable, the GPU architecture and programming paradigm have subtle differences from the CPU. Crashing aside, think about why you would want to write code like what was posted, keeping in mind that it is executed per fragment: as the resolution of your render target increases, so does the cost of this shader.
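
For a rough sense of scale (assuming, say, a 1920×1080 render target): that is roughly 2 million fragments, and the posted loop body runs 200,000 × 200,000 = 40,000,000,000 times per fragment, so a single frame asks the GPU for something on the order of 10^17 loop iterations.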

pseudomarvin

To clarify, I'm working on IBL for Physically Based Rendering, and I wanted to implement a slow, dumb, brute-force way of solving the integral in the reflectance equation before doing it the optimized way, but I found I couldn't. It was really just a reference to check that I'm getting things right.

@Hodgman It fails on SDL_GL_SwapWindow. Curiously, not after the first frame but always after the second.

@WiredCat I've tried using double instead of float and also increased the size of the increment. It still crashes. It is, of course, just sample code.

@Scouting Ninja Well, it builds OK. As far as I can tell, the crash doesn't seem related to the size of the shader program on the GPU.

// Crashes
for (double x = 0; x < 100; x += 1.0)
{
  for (double y = 0; y < 100; y += 1.0)
  {
    sum += 1.0;
  }
}

// Works
for (int x = 0; x < 10000; x += 1)
{
  for (int y = 0; y < 10000; y += 1)
  {
    sum += 1.0;
  }
}

Thanks guys, it's not necessarily a mystery that I need to solve; I was just curious what the hard limit is and whether there's a way around it.
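
One idea for getting around it (just a rough, untested sketch; the tile sizes and drawFullScreenQuad() are placeholders, and glad is assumed as the loader) would be to split the expensive full-screen pass into small scissored tiles and draw only a few tiles per frame, so no single submission gets anywhere near the ~2-second watchdog limit:

// Untested idea: render the expensive pass in small scissored tiles spread
// over many frames, so no single draw call runs long enough to trip TDR.

#include <SDL.h>
#include <glad/glad.h>

void DrawBruteForcePassTiled(int width, int height)
{
    const int tileSize      = 64;  // pixels per tile edge (placeholder, tune it)
    const int tilesPerFrame = 4;   // tune so one frame stays well under ~2 s

    static int nextTile = 0;
    const int tilesX    = (width  + tileSize - 1) / tileSize;
    const int tilesY    = (height + tileSize - 1) / tileSize;
    const int tileCount = tilesX * tilesY;

    glEnable(GL_SCISSOR_TEST);
    for (int i = 0; i < tilesPerFrame && nextTile < tileCount; ++i, ++nextTile)
    {
        const int tx = nextTile % tilesX;
        const int ty = nextTile / tilesX;
        glScissor(tx * tileSize, ty * tileSize, tileSize, tileSize);

        // Draw the full-screen quad with the brute-force shader here;
        // only the fragments inside the scissored tile actually run.
        // drawFullScreenQuad();   // placeholder
    }
    glDisable(GL_SCISSOR_TEST);
}

After all tiles have been drawn over enough frames, the render target should hold the same result a single big draw would have produced.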

Scouting Ninja
11 hours ago, pseudomarvin said:

Well it builds ok. It seems that the reason is as far as I can tell not related to the size of the shader program on the GPU.

Then I am really surprised that a double crashes and an integer doesn't.

Because a double is more bits than an integer, it would stand to reason that unrolling with the larger double type is what is causing the crash.

11 hours ago, pseudomarvin said:

Thanks guys, it's not necessarily a mystery that I need to solve

If you ever find out what it is, please share. I tried it on my own and it only slowed down; I did not get a crash. Using very high numbers only causes the shader to fail and render pink.


