OpenGL Debugging Graphics


I am curious what you have in your bag of tricks for debugging 3D graphics. I know it's a broad question, but I'm hoping there are some really good ideas floating around in the ether that I haven't stumbled upon yet.

 

I see this with beginners a lot. Most of the problems in a beginning OpenGL class come from students putting everything into the scene at once, and then nothing shows up. To me, it is a "black screen of death". Where is my model? There aren't any errors, and no obvious way to figure out what's wrong. You can start changing things at random to see what happens, but that feels like guessing, and debugging by guessing never works out.

 

For example, when I was writing a software renderer from scratch, I was trying to clip in homogeneous coordinates. But I didn't really understand the math, so when I tried out the clip code I got an ArrayIndexOutOfBoundsException (this was in Java). Well, great. I had no idea how to fix that. I ended up setting up four views: top, left, front, and perspective, then drawing the view frustum and hand-clipping all the lines so nothing would crash. Then I could see what was happening. It ended up being a sign error in the near clipping plane. It was easy to fix, but it took two weeks of side work to find it.
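For anyone curious, the inside test in homogeneous clip space looks roughly like this (a minimal sketch using OpenGL-style -w..+w conventions; the names are illustrative, not from my actual renderer):

```cpp
// Minimal sketch of the frustum inside-test in homogeneous clip space.
struct Vec4 { float x, y, z, w; };

// A clip-space vertex is inside the view volume when each coordinate
// lies in [-w, +w]. The near plane is the "z >= -w" comparison: flip
// that one sign and you reject (or accept) the wrong half-space, which
// is exactly the kind of bug that bit me.
bool insideFrustum(const Vec4& v)
{
    return -v.w <= v.x && v.x <= v.w
        && -v.w <= v.y && v.y <= v.w
        && -v.w <= v.z && v.z <= v.w;  // last clause: near and far planes
}
```

With all six comparisons written out side by side like this, a flipped sign is much easier to spot.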

 

What do you do when you load up a scene and nothing is there?


What do you do when you load up a scene and nothing is there?

I ask myself why I was stupid enough to have gotten that far without testing along the way. In brief: "Don't get in that situation."

 

Actually, I just go back to the last change I made (usually not more than one or two routines), undo it to make sure that's where the problem is, and debug from there. I have the luxury of relatively small apps that don't take long to compile. In any case, if you're going to spend more than an hour debugging something, a few minutes of compile time is worth the effort.

 

 

 


I got an ArrayIndexOutOfBoundsException (this was in Java). Well, great. I had no idea how to fix that.

When you have an error as specific as that, you should be able to locate the line of code that's causing the problem. Admittedly, I'm unfamiliar with how one might debug Java, but a good IDE will pinpoint things for you. In a specific case like that, look at the data coming in and determine whether it's right or not. That, at a minimum, divides your code in half: either the problem is before that line of code, or after it.

Now, that's kind of a bare-bones debugging experience. There are much better tools today -- like PIX, Visual Studio Graphics Diagnostics, and Valve's similar upcoming OpenGL tools -- that let you inspect the goings-on of the graphics pipeline in great detail by capturing all your draw commands together with all the data. For example, this topic is the same black-screen-of-death style problem, and the tools make it pretty straightforward to investigate and discover the problem.

 

I imagined that having some kind of tool would be necessary to really get in there and figure out what is going on.  In my example I wrote my own graphical debugging framework so I could play with the problem and figure out what was happening.  Maybe I just haven't put enough time into my code to add  debugging capabilities.  I can think of lots of things that could be done to inspect the pipeline and find problems.


Obviously tools are great, if they work. Fixing the "where's my triangle?" problem can be hard no matter what, and I think that's a reflection of the complexity of the hardware and the APIs.

 

Having a quick loop for recompiling shaders can be really nice for debugging, especially if you can hot-swap them while your program is running. It can make it very easy to quickly isolate problems without having to step through a debugger line by line.


 

and debugging by guessing never works out.

It always works out. Beginning graphics is all about guessing.

My biggest takeaway from when I started was: never program without source control. Once you break something, especially in graphics, it can take hours to track down. Don't have people put everything into the scene at once. Create a box first, apply a texture second, add one more item, then another texture, verifying each time that it works; that way you know exactly which line broke it. Then it's a matter of taking the breaking code and moving it around to see if it works in different areas.


Don't forget the glValidateProgram function; call it right before you issue your draw calls and write the output away somewhere.
It's useful when your shader linked correctly but doesn't produce the expected output (e.g. a black screen of doom).
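A rough sketch of what that can look like (illustrative names; assumes a GLEW-style loader and a current context):

```cpp
#include <GL/glew.h>
#include <cstdio>

// Validate a program against the *current* GL state, so call this after
// binding the VAO, textures, uniforms, etc. -- right before the draw.
void validateProgram(GLuint program)
{
    glValidateProgram(program);

    GLint ok = GL_FALSE;
    glGetProgramiv(program, GL_VALIDATE_STATUS, &ok);
    if (ok != GL_TRUE)
    {
        char log[1024] = {};
        glGetProgramInfoLog(program, sizeof(log), nullptr, log);
        std::fprintf(stderr, "glValidateProgram failed:\n%s\n", log);
    }
}
```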


When I started development with CUDA I discovered NVIDIA Nsight. I've never used it for OpenGL debugging, but it looks very promising :)

 


AMD has CodeXL as well; it's a Visual Studio add-in that lets you debug your OpenGL state (almost) as though you were using the VS debugger. Really helpful. Aside from that, I have a checkError function: if I create a debug build (no optimisations), it calls glGetError and logs the results with a timestamp and a little information about where it was called and what the state of the program was at that point. I sprinkle it liberally around problem areas, run my app for a little while, and parse my logs afterwards. I normally write to a stringstream, and in the destructor of the logging singleton I dump the stringstream to a file, formatted.
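For reference, a bare-bones sketch of that kind of helper (my illustrative version, not the exact code; timestamps and the stringstream plumbing omitted):

```cpp
#include <GL/glew.h>
#include <cstdio>

// Turn the error enum into something readable in the log.
const char* glErrorName(GLenum e)
{
    switch (e)
    {
        case GL_INVALID_ENUM:                  return "GL_INVALID_ENUM";
        case GL_INVALID_VALUE:                 return "GL_INVALID_VALUE";
        case GL_INVALID_OPERATION:             return "GL_INVALID_OPERATION";
        case GL_INVALID_FRAMEBUFFER_OPERATION: return "GL_INVALID_FRAMEBUFFER_OPERATION";
        case GL_OUT_OF_MEMORY:                 return "GL_OUT_OF_MEMORY";
        default:                               return "unknown GL error";
    }
}

// Drain the whole error queue: glGetError returns one flag at a time.
void checkError(const char* where)
{
    for (GLenum e = glGetError(); e != GL_NO_ERROR; e = glGetError())
        std::fprintf(stderr, "[GL] %s at %s\n", glErrorName(e), where);
}

// Usage: checkError("after texture upload");
```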

For geometry problems I absolutely love OpenGL's immediate mode. It's so easy to render a couple of lines or points to show where the object, coordinate system, light source, particle effect, etc. should be. You can also encode states or properties as colors if you are trying to find out why a problem occurs only for a small subset of entities. Being able to render text at a 3D location can also be extremely beneficial.
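Something as simple as this (a legacy-GL sketch, fine for throwaway debugging but not for production code):

```cpp
#include <GL/gl.h>

// Draw an axis-aligned cross at a position, with the color encoding
// whatever state or property you are investigating.
void debugCross(float x, float y, float z, float size,
                float r, float g, float b)
{
    glColor3f(r, g, b);
    glBegin(GL_LINES);
    glVertex3f(x - size, y, z); glVertex3f(x + size, y, z);
    glVertex3f(x, y - size, z); glVertex3f(x, y + size, z);
    glVertex3f(x, y, z - size); glVertex3f(x, y, z + size);
    glEnd();
}
```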

If it's shader related, implement hot-swappable shaders like MJP suggested, and output "debug colors" to circle in on the problem. Having had shader debuggers at work and hot swapping in my private engine, I would take hot swapping over a shader debugger almost every time.

Also, seeing the API calls not as code but as a list like in apitrace or PIX can give a second perspective and move things out of your blind spot.

What Buckeye said (reverting the last change) obviously plays well with lots of small version control commits.



If it's shader related, implement hot-swappable shaders like MJP suggested, and output "debug colors" to circle in on the problem. Having had shader debuggers at work and hot swapping in my private engine, I would take hot swapping over a shader debugger almost every time.

 

This is a new idea to me. Can you give me an example of how debug colors work? It sounds fascinating, but I don't see the benefit.


To address the issues a beginner/student might run into, here is a list of things to do to find silly mistakes:

 

Voted up, mostly because in my career (working on a wide range of platforms) I think I've failed at every one of these.


 


This is a new idea to me. Can you give me an example of how debug colors work?

 

 

Many times when optimizing, I draw the difference between the naive version and the optimized version. This way I can spot silly mistakes in the math. It also works well when replacing exact math with approximations.
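The idea in fragment-shader form (a sketch; the GLSL lives in a C++ string here, and naiveResult/fastResult are placeholders for the two code paths being compared):

```cpp
// Visualize the error between two implementations by amplifying it.
const char* kDiffFragmentShader = R"(
    #version 330 core
    out vec4 fragColor;
    void main()
    {
        vec3 naiveResult = vec3(0.5);     // placeholder: exact path
        vec3 fastResult  = vec3(0.5002);  // placeholder: approximation
        // Scale the difference so sub-visible errors become obvious.
        fragColor = vec4(abs(naiveResult - fastResult) * 100.0, 1.0);
    }
)";
```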


 


This is a new idea to me. Can you give me an example of how debug colors work?

 

 

Let's say that you suspect a certain value is ending up less than 0, even though that shouldn't be the case. You can check the value in the shader and set the output to a bright color if it is. You can also very easily disable large sections of complicated code by hard-coding a pixel shader output color, which can be very helpful for narrowing down the location of your bug.
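A tiny example of what that looks like (a sketch; GLSL in a C++ string, with 'value' standing in for whatever intermediate you suspect):

```cpp
// Flag suspect values with an unmissable color instead of a breakpoint.
const char* kDebugColorShader = R"(
    #version 330 core
    in vec3 worldNormal;
    out vec4 fragColor;
    void main()
    {
        float value = dot(normalize(worldNormal), vec3(0.0, 1.0, 0.0));
        if (value < 0.0)
        {
            fragColor = vec4(1.0, 0.0, 1.0, 1.0); // scream in magenta
            return;
        }
        fragColor = vec4(vec3(value), 1.0);       // normal path
    }
)";
```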


 

 



 

 

Ah, like a visual debug statement.  I hadn't thought of that.  Got it.


As many others have already mentioned, this is a really big topic :) I remember writing a post or two in my journal here on GameDev long ago, so I went digging through the old ones to see if I could find an example or two. I had posted a short description about debugging that you can find here.

 

Overall, for beginners getting their first D3D program up and running with no geometry output, it is usually that the geometry isn't transformed into the view volume properly. That is often an issue with the matrices: either they are concatenated in the wrong order, or they are transposed from what the API expects. It is also fairly common to forget to set the viewport in the rasterizer, which eliminates all pixels from the pipeline.

 

After those, you need to consider all the reasons that the pipeline might cut out your geometry at each stage. Starting from the beginning of the pipeline, here are some examples (a minimal "known-good state" sketch follows the list):

 

Input Assembler: Input layout doesn't match the actual data layout

Vertex Shader: Transformations incorrect, clipping planes incorrect

Hull / Domain Shaders: Pretty much anything can go wrong here...

Geometry Shaders: Usually transformations, or perhaps that your triangle winding is backwards

Rasterizer: Viewports missing, culling modes incorrect

Pixel Shader: It can kill pixels, but that has to be explicitly done in the shader...

Output Merger: Depth test, stencil test, and blending states can all cause things to disappear
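In OpenGL terms (since this is the OpenGL forum), a rough sketch of forcing the most permissive state possible before a draw, so the last few stages above can't silently reject anything:

```cpp
#include <GL/glew.h>

// Force a maximally forgiving pipeline state. If the missing geometry
// shows up after this, re-enable states one by one to find the culprit.
// (Illustrative sketch, not an exhaustive list of states.)
void forgivingState(int width, int height)
{
    glViewport(0, 0, width, height);  // a zero-sized viewport kills all pixels
    glDisable(GL_CULL_FACE);          // rules out winding-order problems
    glDisable(GL_DEPTH_TEST);         // rules out depth rejection
    glDisable(GL_STENCIL_TEST);       // rules out stencil rejection
    glDisable(GL_BLEND);              // rules out blending to invisible
    glDisable(GL_SCISSOR_TEST);       // rules out a stale scissor rectangle
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);  // rules out wireframe surprises
}
```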

 

The key is to be able to make quick, targeted API changes and be able to check the results fast.  If you can do that at runtime, you are way ahead of the game (although beginners probably don't have this capability...).

 

For pixel shader debugging, I also echo the suggestion for diagnostic color outputs.  In this post you can see a parallax occlusion mapping screenshot that uses colors to indicate where certain regions of the virtual volume are located.


I've run into tons of trouble with this exact problem. After spending hours and hours trying to figure out why my "stuff" isn't rendering, I've come up with a comprehensive checklist of things to verify.

If you've never rendered a model or primitive to the screen before using your current API, you want to establish a baseline by trying to do the most basic thing you can: render the simplest model/primitive you can. This is akin to writing your first "hello world" program for graphics. If you can do this, then the rest of graphics programming is simply a matter of adding on additional layers of complexity. The general debugging step then becomes a matter of adding on each subsequent layer of complexity and seeing which one breaks.

At the core, debugging is essentially just a matter of isolating and narrowing the problem down to as few possibilities as possible, then focusing in on each possibility.

 

This is for the C# and XNA API, but you can generalize or translate these points to your own language and API.

Let's start with the comprehensive checklist for primitive rendering (triangle lists, triangle strips, line lists, line strips):
1. Base case: Can you render a triangle to the screen without doing anything fancy? (A minimal version of this base case is sketched right after this checklist.)
    No:

       -Are you setting vertex positions for the three corners of the triangle? Are they different from each other? Is it rendering a triangle which should be visible in the current view?

       -Are you actually calling the "DrawPrimitive()" method, or equivalent in your API?
       -Are you using vertex colors which contrast against the background color?
       -Are you correctly applying a shader? Is the shader the correct shader? Have all shader settings been set correctly before you call the draw call?
       -Are you using a valid view and projection matrix which would actually let you view the triangle?
       -Are you using a world matrix which is transforming the triangle off screen? (You shouldn't even need a world matrix yet)
      -Are you using the right primitive type in your DrawPrimitives call? (triangle list vs triangle strip, etc)
2. Indexed vertices: Are you using an index buffer to specify the vertex drawing order?
    Yes:
       -Is the vertex drawing order compatible with your current cull mode? To find out, either toggle your cull mode or change your drawing order.
       -Are you actually creating an index buffer? Are you copying an array of ints into your index buffer to fill it with data? Are the array values correct?
       -If your index buffer is created, are you actually setting the graphics card's active index buffer to your index buffer?
       -Are you using "DrawIndexedPrimitives()" or your API's equivalent draw call? Are you correctly specifying the correct number of primitives to draw?
      -Does the drawing order make sense with regard to the primitive type you're using? ie, the vertex order in a triangle strip is very different from a triangle list.
3. Vertex Data:
   -Are you using a custom vertex declaration? If yes, skip to #4.

   -Are you using a vertex buffer? If yes:
       -You must use a vertex array of some sort, at some point, to populate the vertex buffer. Verify that you're getting an array of vertices in your code. Using your IDE debugger, verify that the vertex data is correct.
      -Are you moving your vertex array data into a vertex buffer? Is the vertex buffer the correct size? Does the vertex buffer have the vertex data from your vertex array?
      -On the graphics card, are you setting the active vertex buffer before drawing? Is there an associated index buffer?
4. Custom Vertex Declarations: Are you using a custom vertex declaration?
  Yes: Then you must be defining your vertex in a struct.
    -Does your vertex declaration include position information? If not, how are you computing the vertex position in your shader?
     -Does your vertex declaration include every field you want to use?
   -Are you creating a Vertex Declaration correctly?
       -Are your vertex elements being defined in the same order as they are in the struct fields? This is one of the few times declaration variable order really matters because it's specifying the order they appear in the struct memory block.
       -Are you correctly calculating the BYTE size of each variable in the vertex? Are you correctly calculating the field offset in bytes?

       -Are you correctly specifying the vertex element usage?

       -Are you correctly using the right usage index for the vertex element?
       -Are you specifying the correct total byte size for your custom vertex declaration?
 -Is your code correctly using the custom vertex data? ie, putting position information into a position variable.

5. High Level Shader Language (HLSL): Are you using a shader other than "BasicEffect"?
      -Are you actually loading the content for the shader and storing it in an "Effect" data structure?
      -Are you correctly initializing the effect?
      -Are you setting a "Current Technique" in your render call to one which exists in the shader?
      -Does the technique which you use include a vertex shader and a pixel shader? Are they supported by your API and graphics card?
      -Does the vertex shader require any global variables to be set (e.g. camera position, world matrices, textures)? Are they being set to valid data?
       -Does the vertex shader output valid data which the pixel shader can use?
       -Does the pixel shader actually output color information?
       -Does your vertex shader math and logic check out correctly? (If you don't know or aren't sure, it's time to use a shader debugger).

6. Shader debuggers:

    I'm using Visual Studio 2010, so I can't use the built-in shader debugger from VS2012. I have to use external tools. Here are the ones I've tried and my thoughts on them:
    NVidia FX Composer: It sucks. It is unstable and crashes frequently, has a high learning curve, and can't attach a shader debugger to an executable file (your game). You can't push custom vertex data into a shader and see how the shader handles it. This program is mostly useful for creating shaders for existing models.
   ATI's GPU PerfStudio: It doesn't work with DirectX 9.0, so if you're using XNA, you're out of luck. Sorry, ATI doesn't care enough. It's also a bit confusing to setup and get running.
    Microsoft PIX: It's a mediocre debugger, but it's the best one I've found. It is included in the DirectX SDK. The most useful feature is being able to attach to an EXE and capture a frame by pressing F12. You can then view every single method call used to draw that frame, along with the method parameters. This tool also lets you view every single resource (DX surfaces, vertex buffers, index buffers, rasterizer settings, etc.) on the graphics card, along with that resource's data. This is the best way to see if your vertex data and index buffer data are legit. You can also debug an individual pixel. This lets you step through your shader code (HLSL or ASM) line by line and see what the actual variable values are being set to. It's an okay debugger, but it doesn't have any IntelliSense or let you mouse over a variable to see its value like the Visual Studio IDE debugger does. This is the debugger I currently use to debug my shaders. The debugging workflow is a bit cumbersome, since you have to rebuild your project, start a new experiment, take a snapshot, find the frame, find the data object you want to see, and step through the shader debugger to the variable you're interested in (~2 minutes). Here are a few "nice to know" notes on PIX:
  -If you're looking at the contents of a vertex buffer:

      -Each block is 32 bits, or 4 bytes in size. Keep this in mind if you're using a custom vertex declaration to pack data into a 4 byte block (such as with Color data).

      -0xFF is displayed as a funky value: -1.#Q0
     -Each 4-byte block is displayed in the order it appears in your custom vertex declaration. Each vertex data block is your vertex declaration size / 4. (ie, 36 bytes = 36 / 4 = 9 blocks per vertex)
      -The total size of the buffer is the blocks per vertex multiplied by the number of vertices you have (i.e., 9 * 3 = 27 4-byte blocks)
      -Usage: If your vertex declaration byte offsets are off by a byte or more, you should expect to see funky data in the buffer.
  -Vertex Declaration should always match the vertex declaration in your custom vertex declaration struct.

-By selecting the actual draw call in the events list and then looking at the mesh, you can see the vertex information as it appears pre-vertex shader (object space), post-vertex shader (homogeneous clip space), and in the viewport (screen space). If the vertex data doesn't look right in any of these steps, you should know where to start debugging.
   *Special note: If you're creating geometries on the graphics card within your shader, you won't see much of value in the pre-vertex shader.
-The debugger includes a shader assembly-language debugger. It's nice to have, but not very useful.
-The shader compiler will remove any code which isn't used in the final output of a vertex. This is extra annoying when you're trying to set values to a variable and debug them.
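As promised in item 1, here is roughly what the base case looks like stripped to the bone. This is a sketch in raw OpenGL rather than XNA (illustrative, and it assumes a current context, an initialized loader, and a trivial pass-through shader program), but the idea translates directly:

```cpp
#include <GL/glew.h>

// Draw one hard-coded triangle with a known-good shader program.
// Assumes: a current GL context, glewInit() done, and 'program' is a
// trivial shader (pass-through position, solid contrasting color).
void drawBaseCaseTriangle(GLuint program)
{
    // Three corners, already in clip space: no matrices to get wrong.
    static const float verts[] = {
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f,
    };

    GLuint vao = 0, vbo = 0;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
    glEnableVertexAttribArray(0);

    glUseProgram(program);
    glDrawArrays(GL_TRIANGLES, 0, 3);  // the draw call itself: easy to forget
}
```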


Model Debugging:

The same principles from the primitive rendering apply, except you have to verify that you've correctly loaded the model data into memory and are calling the right method to render a model.

One handy tip which may help you on your project: write down each step it takes to add and render a new model within your project (i.e., your project's content creation pipeline and workflow). It's easy to accidentally skip a step as you're creating new assets and end up wasting time trying to isolate the problem to that missed step. An ounce of prevention is worth a pound of cure, right?


* Wrap glGetError in a function that can print the error enum and a custom string and call it before and after tricky code, e.g.: CheckGLErrors( "DrawShadows begin" ) and CheckGLErrors( "DrawShadows end" )

* Reverse triangle winding order

* Output debug colors in a shader

* Use a frame debugger (eg. Crytek's RenderDoc, PIX or Xcode Instruments)

* Give your state objects debug labels
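On that last point, a quick sketch of debug labels (core in OpenGL 4.3 via KHR_debug; the object names here are made up). Frame debuggers and the GL debug output then report these names instead of bare integer ids:

```cpp
#include <GL/glew.h>

// Hypothetical helper: label the objects you create at startup so
// captures and error messages refer to them by name (-1 length means
// the label string is null-terminated).
void labelObjects(GLuint albedoTex, GLuint terrainVbo, GLuint shadowFbo)
{
    glObjectLabel(GL_TEXTURE,     albedoTex,  -1, "gbuffer albedo");
    glObjectLabel(GL_BUFFER,      terrainVbo, -1, "terrain vertices");
    glObjectLabel(GL_FRAMEBUFFER, shadowFbo,  -1, "shadow map FBO");
}
```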


For some kinds of errors, an intercepting tool can be pretty useful. In the past I have used both glIntercept (https://code.google.com/p/glintercept/) and gDEBugger (http://developer.amd.com/tools-and-sdks/archive/amd-gdebugger/).

 

I also have some builds of my applications which, at every frame, check the modification date of the shader files and, if they are newer than the ones that were loaded (I store a timestamp in my Shader class), load them again.

 

This way I can have the application on one screen and the shaders open in Notepad++ on the other, and every time I save in Notepad++ I see the results in the application. For tuning shaders this is awesome. It makes the application somewhat less stable, and of course it is not efficient, so I would never leave this in a production release...
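The per-frame check is tiny; a bare-bones sketch of the idea (C++17 std::filesystem; Shader and reload() are illustrative stand-ins for my actual classes):

```cpp
#include <filesystem>
#include <string>
#include <system_error>

struct Shader
{
    std::string path;
    std::filesystem::file_time_type loadedStamp{};
    void reload() { /* recompile + relink; keep the old program on failure */ }
};

// Call once per frame for each live shader.
void hotReloadIfChanged(Shader& s)
{
    std::error_code ec;
    const auto onDisk = std::filesystem::last_write_time(s.path, ec);
    if (!ec && onDisk > s.loadedStamp)
    {
        s.reload();
        s.loadedStamp = onDisk;
    }
}
```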

 

EDIT:

Oh, I also have a macro for checking glGetError. It is a macro called __GL which is blank in Release mode; in Debug it calls a function that calls glGetError and asserts that the error is GL_NO_ERROR. When I am desperate enough I just fill the sections I suspect with calls to that macro.
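Roughly how such a macro can look (my sketch, not the actual code; note that identifiers starting with a double underscore are technically reserved in C++, so something like GL_CHECK is a safer name):

```cpp
#include <GL/glew.h>
#include <cassert>

#ifdef NDEBUG
    #define GL_CHECK() ((void)0)  // compiles away entirely in Release
#else
    #define GL_CHECK() assert(glGetError() == GL_NO_ERROR)
#endif

// Usage:
//   glBindTexture(GL_TEXTURE_2D, tex); GL_CHECK();
```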

 

And finally, modern OpenGL comes with the debug context and debug output (the ARB_debug_output extension, promoted to core as KHR_debug in OpenGL 4.3), which is a godsend. More info in: http://www.altdevblogaday.com/2011/06/23/improving-opengl-error-messages/
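Minimal setup looks something like this (a sketch; create the context with the debug flag for best results):

```cpp
#include <GL/glew.h>
#include <cstdio>

// Called by the driver with human-readable messages for errors,
// performance warnings, deprecated usage, and so on.
void GLAPIENTRY onGlMessage(GLenum source, GLenum type, GLuint id,
                            GLenum severity, GLsizei length,
                            const GLchar* message, const void* userParam)
{
    std::fprintf(stderr, "[GL debug] %s\n", message);
}

void enableDebugOutput()
{
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); // break inside the offending call
    glDebugMessageCallback(onGlMessage, nullptr);
}
```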
