Simple GLSL Image Processing

18 comments, last by DaveDavis 12 years, 12 months ago

I'm pretty sure you don't need a depth attachment if you don't want one; you should be able to render fine to just a color attachment. Be sure to disable depth testing and depth writes when drawing to the FBO, though.
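Something like this around the render-to-FBO pass (just a rough sketch; fbo_id is whatever name you generated):

glBindFramebuffer(GL_FRAMEBUFFER, fbo_id);   // color-only FBO
glDisable(GL_DEPTH_TEST);                    // nothing to test against without a depth attachment
glDepthMask(GL_FALSE);                       // and don't write depth either
// ... draw ...
glDepthMask(GL_TRUE);                        // restore whatever state you had before
glEnable(GL_DEPTH_TEST);
glBindFramebuffer(GL_FRAMEBUFFER, 0);        // back to the default framebuffer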



Is depth test enabled by default? I thought I would have to explicitly enable it with glEnable(GL_DEPTH_TEST). If I'm not doing that, should I have to worry about it?
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

This is a problem if you don't build mipmaps. If you're not using them, you have to set the min filter to something that doesn't use mipmapping (GL_LINEAR, for example); otherwise the texture is incomplete and sampling it just returns black.

http://www.opengl.org/wiki/Common_Mistakes#Creating_a_Texture


Also this:
glGenFramebuffers(1, &fbo_id);
glBindFramebuffer(GL_FRAMEBUFFER, 1);

You probably meant to bind fbo_id, not "1". It may happen to work (generated names usually start at 1), but it's fragile.
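For reference, a complete color-only setup looks roughly like this (just a sketch; width/height and the tex_id/fbo_id names are placeholders):

GLuint tex_id, fbo_id;

// the color texture the FBO will render into (no mipmaps, so no mipmap min filter)
glGenTextures(1, &tex_id);
glBindTexture(GL_TEXTURE_2D, tex_id);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// the FBO with just that color attachment, no depth
glGenFramebuffers(1, &fbo_id);
glBindFramebuffer(GL_FRAMEBUFFER, fbo_id);   // bind the name we just generated
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex_id, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    printf("FBO is incomplete\n");

glBindFramebuffer(GL_FRAMEBUFFER, 0);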


Is depth test enabled by default? I thought I would have to explicitly enable it with glEnable(GL_DEPTH_TEST). If I'm not doing that, should I have to worry about it?

You're right, it's disabled by default. You should be fine.
My Projects:
Portfolio Map for Android - Free Visual Portfolio Tracker
Electron Flux for Android - Free Puzzle/Logic Game
Ok cool, I figured it out. It was partly the GL_TEXTURE_MIN_FILTER I was using, as you mentioned, and partly the way I was binding/unbinding the textures and the FBO, etc.

This stackoverflow question helped a bit: http://stackoverflow.com/questions/3466736/render-to-fbo-not-working-gdebugger-says-otherwise

And I've updated the gist with the new code: https://gist.github.com/916742

On to implementing my blur shader... :)
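For anyone else hitting this, the two passes now look roughly like this (simplified sketch, not the literal gist code; fbo_id/tex_id are the FBO and its color texture):

// pass 1: render into the FBO (the texture is only attached, not bound for sampling)
glBindFramebuffer(GL_FRAMEBUFFER, fbo_id);
glViewport(0, 0, width, height);
// ... draw the scene ...
glBindFramebuffer(GL_FRAMEBUFFER, 0);

// pass 2: sample the result to the screen
glBindTexture(GL_TEXTURE_2D, tex_id);
glUseProgram(postprocess_program);
// ... draw a fullscreen quad ...
glUseProgram(0);
glBindTexture(GL_TEXTURE_2D, 0);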
Alright, I've tried to implement a blur shader with some Gaussian blur code I found, but no luck.

I've updated the gist at https://gist.github.com/916742, but the main problem is at line 274. The program always fails here with the message "linking with uncompiled shader". I can't seem to find much helpful online about this error, so if anyone has any ideas I'm all ears. I've included the shader source files at the bottom of the gist as well for reference.
I'm still learning and using copy-pasted code from tutorials, so I have no idea if this is your problem, but here's the code I'm using to accomplish something similar:



myShader = glCreateProgramObjectARB();

myFragShader = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
glShaderSourceARB(myFragShader, 1, &myFragShaderSource, 0);
glCompileShaderARB(myFragShader);
glAttachObjectARB(myShader, myFragShader);

glLinkProgramARB(myShader);



I notice that you create your vert and frag shaders using glCreateShader, whereas I use glCreateShaderObject, and then attach that to the main shader program... Maybe this is it?
I'm pretty sure glCreateShader is the right code. I've never even heard of glCreateShaderObject, and I can't find any documentation on it except for a couple blurbs from some powerpoint presentations from like 2002.

I'm going to guess that that's some extension that was around in the very early days of shading languages, but glCreateShader should be the correct way.
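I'd also check GL_COMPILE_STATUS on each shader right after compiling it; "linking with uncompiled shader" usually just means one of the attached shaders quietly failed to compile. Something along these lines (just a sketch, fragShader being whichever shader object you compiled):

GLint compiled = 0;
glCompileShader(fragShader);
glGetShaderiv(fragShader, GL_COMPILE_STATUS, &compiled);
if (compiled != GL_TRUE)
{
    char log[1024];
    glGetShaderInfoLog(fragShader, sizeof(log), NULL, log);
    printf("fragment shader failed to compile:\n%s\n", log);
}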
My Projects:
Portfolio Map for Android - Free Visual Portfolio Tracker
Electron Flux for Android - Free Puzzle/Logic Game

I'm pretty sure glCreateShader is the right code. I've never even heard of glCreateShaderObject, and I can't find any documentation on it except for a couple blurbs from some powerpoint presentations from like 2002.

I'm going to guess that that's some extension that was around in the very early days of shading languages, but glCreateShader should be the correct way.


I think you're right; a quick Google search for glCreateShaderObject doesn't turn up anything useful. I'm still trying to figure out when to use the "*ARB" methods; it seems like you still have to use some of them, but most of the time I'm using the non-ARB methods.

I have a feeling it has something to do with the shader code, since I don't get that message if I change the fragment shader to a simple pass-through frag shader, but the compile log for the gauss2.fs shader doesn't report anything wrong, so I'm kind of lost at this point.


Oh yeah, should have tested... glCreateShaderObject isn't even defined in my environment.

glCreateShaderObjectARB is provided by GLEW I believe, and there's no glCreateShaderARB.

Replacing the call with plain "glCreateShader" produces the exact same result.
Ok, so I've updated my gist with the working code: https://gist.github.com/916742

The main thing to notice is that the fragment shader has changed quite a bit: no more use of the "const" keyword, and no more initializing the offsets and Gaussian kernel weights in their declarations (or even in loops). I'm not sure why I can't do that, but I'm guessing it's because my laptop has an Intel integrated graphics chip that only supports an older GLSL version (array initializers didn't show up until GLSL 1.20). That's the only thing I can guess; I've yet to test the old frag shader code on my desktop (which is NVIDIA).

Anyway, hopefully this helps someone else in the future.
Just updated the gist so that the texture offsets and kernel weights are calculated in the main program and then passed into the shader, since we should only have to calculate those once at the beginning of the program.
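Roughly what that looks like (trimmed-down sketch, not the literal gist code; the kernel size, sigma, and uniform names are just examples, and texture_width is the width of the texture being blurred):

// compute a normalized Gaussian kernel once on the CPU (expf needs <math.h>)
#define KERNEL_SIZE 5
float weights[KERNEL_SIZE], offsets[KERNEL_SIZE];
float sigma = 2.0f, sum = 0.0f;
for (int i = 0; i < KERNEL_SIZE; ++i)
{
    float x = (float)(i - KERNEL_SIZE / 2);
    offsets[i] = x / texture_width;                         // step in texture coordinates
    weights[i] = expf(-(x * x) / (2.0f * sigma * sigma));
    sum += weights[i];
}
for (int i = 0; i < KERNEL_SIZE; ++i)
    weights[i] /= sum;                                      // normalize so brightness is preserved

// upload once; the shader just declares "uniform float weights[5];" and "uniform float offsets[5];"
glUseProgram(blur_program);
glUniform1fv(glGetUniformLocation(blur_program, "weights"), KERNEL_SIZE, weights);
glUniform1fv(glGetUniformLocation(blur_program, "offsets"), KERNEL_SIZE, offsets);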

