Strange rendering behavior on different cards.


Wow, that's strange; I didn't read the docs carefully enough. If I can't find the faulty one, it will be better to remove them all completely.


I just want to point out that glGetError actually causes significant performance overhead, because it can trigger pipeline flushes. In my case I found it advantageous to wrap every GL call behind a separate define and only enable that functionality periodically as a check, or when something went wrong.
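For what it's worth, a minimal sketch of that pattern (the GL_DEBUG_CHECKS define and GL_CHECK macro names here are made up for illustration, not from any library):

#include <GL/gl.h>
#include <cstdio>

#ifdef GL_DEBUG_CHECKS
// Debug builds: run glGetError() after every wrapped call and report failures.
#define GL_CHECK(call)                                              \
    do {                                                            \
        call;                                                       \
        GLenum err = glGetError();                                  \
        if (err != GL_NO_ERROR)                                     \
            fprintf(stderr, "GL error 0x%04X after %s (%s:%d)\n",   \
                    err, #call, __FILE__, __LINE__);                \
    } while (0)
#else
// Normal builds: no glGetError(), so no extra stalls.
#define GL_CHECK(call) call
#endif

// Usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex));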

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

Hello again, everyone.


I believe you are now initializing the OpenGL context with SDL's default values, whatever they are (i.e. your SDL_GL_SetAttribute() calls have no effect).

Well, it seems that's the case.

But what are these defaults?

I've placed


SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 4 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );

before window creation again, and that's causing errors to appear.

When I remove these lines, everything works OK; the logger reports that the system starts with OpenGL 4.3 on NVIDIA driver v319.32.

My card supports OpenGL 4.3, that's for sure, and the driver supports that card, so is the issue within SDL2 then?
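For reference, this is roughly the init order I'm using, assuming SDL2 (the window title and size are placeholders):

#include <SDL2/SDL.h>

int main(int argc, char* argv[]) {
    SDL_Init(SDL_INIT_VIDEO);

    // Attributes must be set before the window and context are created.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                        SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window* win = SDL_CreateWindow("test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    // ... render loop ...

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}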

Have you tried requesting lower versions? 3.3 or 3.1, etc.?

It certainly looks like you're doing everything correctly; it's how we do it in our projects.

"Ars longa, vita brevis, occasio praeceps, experimentum periculosum, iudicium difficile"

"Life is short, [the] craft long, opportunity fleeting, experiment treacherous, judgement difficult."

Yes, I tried, and unfortunately I ended up with the same issues. On the GF630, every GL version request fails with 'invalid enumerant'/'invalid operation'.

Without a version request it initializes GL 4.3 (the maximum this GPU is capable of) and everything seems to be fine.

Maybe it's something like this exact driver misbehaving with this exact GPU model?

I have the same driver on a laptop with a GF740, and everything is fine there with the SDL_GL_SetAttribute( SDL_GL_CONTEXT* ) calls.

I'll try another driver next week and post the results here.

Well, I've tried specifying the OpenGL version on the GF630 with the new driver [sorry, it took more than a week to get to this point]:


Message: (1397464013393): Running with OpenGL 4.4.0 NVIDIA 331.67

... and now I'm getting the same strange error strings, just like before. I don't know what to blame now:

the Linux OS (it works great on 2 of 3 PCs),

SDL2 (weird things on the inside?),

my own code (the more obvious suspect, but I'm doing the standard init routine, and again, it works on 2 of 3 PCs),

or even the NVIDIA card/drivers?

Maybe I can go the hardcore route with some #ifdefs and platform-dependent code, using WinAPI for Win32 and X11 window init for *nix systems. That would cut the SDL2 part out entirely.

But is it worth it?
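If I went that route, I imagine the X11 side would look roughly like this (a sketch only, assuming GL/glx.h pulls in the ARB create-context tokens via glxext.h; FBConfig selection and error handling omitted):

#include <X11/Xlib.h>
#include <GL/glx.h>

typedef GLXContext (*glXCreateContextAttribsARBProc)(
    Display*, GLXFBConfig, GLXContext, Bool, const int*);

GLXContext createContext(Display* dpy, GLXFBConfig fbc) {
    // Request a specific version explicitly, like SDL_GL_SetAttribute does.
    const int attribs[] = {
        GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 3,
        GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
        None
    };
    glXCreateContextAttribsARBProc glXCreateContextAttribsARB =
        (glXCreateContextAttribsARBProc)glXGetProcAddress(
            (const GLubyte*)"glXCreateContextAttribsARB");
    return glXCreateContextAttribsARB(dpy, fbc, nullptr, True, attribs);
}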

Which version of the GF630 do you have? Only the Kepler version supports GL 4.3; the others only support up to 4.2 (according to NVIDIA's specs). I don't think much will come from specifying it with the new driver, as I suspect it's a hardware limitation. So with that, I'd say try requesting 4.0-4.2 instead of 4.3 and see if it works then.

If you suspect SDL, then you can try switching to GLFW and see if the problem persists. :)
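Something like this could also work as a fallback, walking down a list of versions until SDL accepts one (a rough sketch; the version list and window parameters are just examples):

#include <SDL2/SDL.h>

SDL_GLContext createBestContext(SDL_Window** outWin) {
    const int versions[][2] = { {4,3}, {4,2}, {4,1}, {4,0}, {3,3} };
    for (const auto& v : versions) {
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, v[0]);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, v[1]);
        SDL_Window* win = SDL_CreateWindow("test",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            800, 600, SDL_WINDOW_OPENGL);
        if (!win) continue;
        SDL_GLContext ctx = SDL_GL_CreateContext(win);
        if (ctx) { *outWin = win; return ctx; }  // first version that works
        SDL_DestroyWindow(win);
    }
    *outWin = nullptr;
    return nullptr;
}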

It seems the hardware is OK: by default a 4.4 context is initialized (that is, when the version requests are commented out of the code), and my app gets these numbers from the OpenGL version string.

And I was trying to get a 3.3 core context in the first place.
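That's how I read those numbers, by the way; something along these lines (assuming headers that define GL_MAJOR_VERSION, i.e. GL 3.0+):

#include <GL/gl.h>
#include <cstdio>

void printGLVersion() {
    // The version string is what the logger above is printing.
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));

    // GL 3.0+ also exposes the numbers directly.
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    printf("Context version: %d.%d\n", major, minor);
}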

Well, migrating to GLFW is still possible, but I find its input system a bit tricky, plus it seems that GLFW limits itself to 60 FPS.

I'll give it another try in order to find the guilty one :)

You can disable VSync in GLFW with glfwSwapInterval(0);

:)
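In context, assuming window is your GLFWwindow* (a two-line sketch):

glfwMakeContextCurrent(window);  // the context must be current first
glfwSwapInterval(0);             // 0 = vsync off, 1 = sync to refresh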
