About Muftobration

  1. Yes, hardware support generally goes by major version: hardware that supports OpenGL 4.2 will, given updated drivers, support the other OpenGL 4.x versions as well.
  2. Basic shading problem (FFP)

    Thank you, mhagain. I was originally using smooth shading, but I thought things just didn't look quite right, so I switched to flat to see if I could make sense of it. Now that I think about it, I guess this is probably how it's supposed to look.
  3. I've been having some shading trouble. The sphere here is just a unit sphere created and triangulated in modo (attached), with normals also computed in modo. The light is in the upper left of the picture, a few units away. On average, the shading looks right, but why the huge discrepancy between two triangles from the same quad? I've ruled out incorrectly computed normals and back-facing polygons.
  4. OpenGL OpenGL 3 on a 2.1 device

    Ah, that's a shame. Thanks anyway.
  5. I have searched, but nobody seems to be asking this question, nor have I found an answer. Is it possible to run a program containing OpenGL 3 code on a machine that only supports up to 2.1? I would be most interested in some sort of software wrapper that surrounds the program and simulates any calls the hardware cannot handle. I'm doing some OpenGL 3 development on my X201, and its Intel graphics cannot correctly run many of the examples in the OpenGL SuperBible, 5th edition (see the attached screenshot, which is supposed to depict a sphere). These examples work correctly on my GTX 460. If it were possible to get a small ExpressCard graphics adapter that provided OpenGL 3 support, that would be excellent, but I don't think such a device exists (besides the ViDock, which is not practical on the go).
  6. OpenGL Alpha based on depth

    That is a shame, but I appreciate your input. I'm pursuing a more roundabout way of accomplishing my desired effect using a gaussian texture and some camera position calculations. At least now I know I shouldn't pursue the method I was trying in my first post. Thank you for helping me.
  7. OpenGL Alpha based on depth

    Unfortunately, that also does not work.
  8. OpenGL Alpha based on depth

    [quote name='dpadam450' timestamp='1308158626' post='4823694'] Not using shaders sucks. Secondly, calling glGenTextures is supposed to be done once. I dont know if these are snippets or your actual function, but your making a new texture everyframe. I dont know what EYE_LINEAR is going to do in terms of alpha, but I'm assuming you can't generate alpha from depth. You might need to do this in a shader. Are the white quads always facing the user? If so then you can just glColor4f(...,....,....,alpha), where you calculate what the alpha is. [/quote]

    Thank you for the reply. I would certainly prefer to use shaders, but the application for which I am developing this requires that I not use them. Thanks for pointing out the texture generation problem. I also don't really know what EYE_LINEAR does; it's only there because it was in the code I was using as a reference. The white quads are just a test case to see whether I can get alpha to vary based on depth. The real application will only involve lines making up a grid, but the grid can be rotated and translated any which way. I suppose my real question is how I can make texture coordinates depend on depth, and then how I can make alpha depend on a texture.
  9. I've been trying to implement a system in which rendered geometry fades into the background as distance increases. The idea is just like GL_FOG, but fading to the background image (which is not static) instead of to a specific color. I'm having difficulty with my implementation, and I'm not even sure it's the best approach; I'm basing it on the last post of a thread I found. The gist of it is: set up automatic texture coordinate generation and make the generated coordinate depend on depth. Then make a 1D texture of values 0-255 and bind that. Finally, render your geometry with that texture applied, making close geometry opaque and far geometry transparent. I haven't been able to get it to work. In the small test case below, I render a blue quad for the background, then three white squares at different distances in front of that blue quad. The white squares close to the quad should be somewhat blue, but they are still pure white. Does anyone know what I've done wrong?

[code]
glDisable(GL_LIGHTING);

// Blue bar in the background
glColor3f(0.0f, 0.0f, 1.0f);
glBegin(GL_QUADS);
glVertex3f( 3.0f,  1.0f, -15.0f);
glVertex3f(-3.0f,  1.0f, -15.0f);
glVertex3f(-3.0f, -1.0f, -15.0f);
glVertex3f( 3.0f, -1.0f, -15.0f);
glEnd();

glEnable(GL_TEXTURE_1D);
GLuint depthAlphaTexture;                        // Texture reference for the 1D texture produced in the loop below
glGenTextures(1, &depthAlphaTexture);            // Tell OpenGL to make space for the texture
glBindTexture(GL_TEXTURE_1D, depthAlphaTexture); // Make this the currently used texture
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);           // Pixel data is aligned in byte order
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP); // Don't repeat the texture! We want clamping of alpha values.

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

GLubyte alphaTexture[256];
for (int i = 0; i < 256; i++) {
    alphaTexture[i] = 255 - i;
}
glTexImage1D(GL_TEXTURE_1D, 0, 1, 256, 0, GL_ALPHA, GL_UNSIGNED_BYTE, alphaTexture); // Give OpenGL the texture data

glEnable(GL_TEXTURE_GEN_S); // Automatically generate texture coordinates based on depth
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glPushMatrix();
glLoadIdentity();
GLfloat splane[4] = {0.0f, 0.0f, -1.0f/25, 0.0f}; // Make the s texture coordinate depend on depth (z value), not the x value
glTexGenfv(GL_S, GL_EYE_PLANE, splane);
glPopMatrix();

// White squares
glColor3f(1.0f, 1.0f, 1.0f);
glBegin(GL_QUADS); // Farthest
glVertex3f(2.0f,  0.5f, -10.0f);
glVertex3f(1.0f,  0.5f, -10.0f);
glVertex3f(1.0f, -0.5f, -10.0f);
glVertex3f(2.0f, -0.5f, -10.0f);
glEnd();
glBegin(GL_QUADS); // Middle
glVertex3f( 0.333f,  0.333f, -5.0f);
glVertex3f(-0.333f,  0.333f, -5.0f);
glVertex3f(-0.333f, -0.333f, -5.0f);
glVertex3f( 0.333f, -0.333f, -5.0f);
glEnd();
glBegin(GL_QUADS); // Closest
glVertex3f(-0.333f,  0.166f, 0.0f);
glVertex3f(-0.667f,  0.166f, 0.0f);
glVertex3f(-0.667f, -0.166f, 0.0f);
glVertex3f(-0.333f, -0.166f, 0.0f);
glEnd();

glEnable(GL_LIGHTING);
[/code]
  10. I've solved the quadrant issue. I was sampling the texture in the wrong positions:

    sampX = (gl_FragCoord.x + float(x)) / screenDimension;
    sampY = (gl_FragCoord.y + float(y)) / screenDimension;

needed to be changed to

    sampX = float(x) / screenDimension;
    sampY = float(y) / screenDimension;

(x and y are already absolute pixel coordinates derived from gl_FragCoord, so the old code counted the fragment position twice.) Since resolving that issue, I've come up with another. The unsharp mask I'm making should be mostly black with some lighter color around the edges. It's about right, but instead of being mostly black, it's mostly dark gray, and the shade changes as the object moves closer to or farther from the camera.
  11. I'm working on a simple type of SSAO that involves unsharp masking the depth buffer. To verify that I correctly loaded the depth texture (a texture the same size as the screen containing depth values), I created a simple shader that just colors each fragment according to its depth; the two attached images show a test object with that shader off and on. That all seems to be working well, so I moved on to unsharp masking the depth texture (the next step). I wrote a shader to blur the depth texture and display the blurred version, but I'm getting a weird artifact that I can't track down: the render only appears in the lower-left quadrant. Here's the fragment shader code for the straight depth visualization (the second image):

[code]
uniform sampler2D depthValues;

void main()
{
    gl_FragColor = texture2D(depthValues, gl_FragCoord.xy / 1024.0);
}
[/code]

And here's the fragment shader code for the shader that's misbehaving (the third image):

[code]
uniform sampler2D depthValues;

float[25] gauss = float[25](0.0030, 0.0133, 0.0219, 0.0133, 0.0030,
                            0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
                            0.0219, 0.0983, 0.1621, 0.0983, 0.0219,
                            0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
                            0.0030, 0.0133, 0.0219, 0.0133, 0.0030);

const float kernelDimension = 5.0;
const float screenDimension = 1024.0;

void main()
{
    vec4 sum = vec4(0, 0, 0, 0);
    int iter = 0;
    int i = int(gl_FragCoord.x);
    int j = int(gl_FragCoord.y);
    int maxX = i + int(floor(kernelDimension / 2.0));
    int maxY = j + int(floor(kernelDimension / 2.0));
    float sampX;
    float sampY;

    for (int x = i - int(floor(kernelDimension / 2.0)); x < maxX; x++) {
        for (int y = j - int(floor(kernelDimension / 2.0)); y < maxY; y++, iter++) {
            sampX = (gl_FragCoord.x + float(x)) / screenDimension;
            sampY = (gl_FragCoord.y + float(y)) / screenDimension;
            if (sampX >= 0.0 && sampX <= 1.0 && sampY >= 0.0 && sampY <= 1.0) {
                sum += texture2D(depthValues, vec2(sampX, sampY)) * gauss[iter];
            }
        }
    }

    gl_FragColor = texture2D(depthValues, gl_FragCoord.xy / screenDimension) - sum;
}
[/code]

The second shader blurs the depth texture with a 5x5 Gaussian kernel (gauss). If you know why I'm getting these results, please help me solve the issue.