
OpenGL: Using SDL for an OpenGL context is slower than GLFW


c_olin    197
I've been cleaning up my code a bit, making sure that switching graphics APIs later wouldn't be a huge deal. So I decided to switch from GLFW to SDL for context creation and input.

Unfortunately, I experienced a severe performance degradation with SDL: FPS dropped from 180-200 (GLFW) to 40-70 (SDL), and the framerate is choppy and less consistent. I'm not too familiar with OpenGL context management (hence my using GLFW and SDL), and I found little information comparing SDL and GLFW performance, so I decided to ask here.

Here is the GLFW code:
[code]

if (!glfwOpenWindow(width, height, 8, 8, 8, 8, bytesPerPixel * 8, 0,
                    fullscreen ? GLFW_FULLSCREEN : GLFW_WINDOW)) {
    throw Exception("Failed to create window.");
}

glfwDisable(GLFW_MOUSE_CURSOR);
glfwSetWindowTitle(getName().c_str());

glfwSetKeyCallback(keyCallback);
glfwSetMousePosCallback(mousePosCallback);
glfwSetMouseButtonCallback(mouseButtonCallback);

setMouseOrigin(Vector<2, int>(width / 2, height / 2));

if (verticleSync) {
    glfwSwapInterval(1);
} else {
    glfwSwapInterval(0);
}
[/code]



And the SDL code:
[code]

int videoFlags;
const SDL_VideoInfo *videoInfo = SDL_GetVideoInfo();

if (!videoInfo) {
    String error(SDL_GetError());
    SDL_Quit();

    throw Exception("Failed: " + error);
}

videoFlags = SDL_OPENGL;
videoFlags |= SDL_GL_DOUBLEBUFFER;
videoFlags |= SDL_HWPALETTE;

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 0);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

if (videoInfo->hw_available) {
    videoFlags |= SDL_HWSURFACE;
} else {
    videoFlags |= SDL_SWSURFACE;
}

if (videoInfo->blit_hw) {
    videoFlags |= SDL_HWACCEL;
}

if (!SDL_SetVideoMode(width, height, bytesPerPixel * 8, videoFlags)) {
    String error(SDL_GetError());
    SDL_Quit();

    throw Exception("Failed: " + error);
}

SDL_ShowCursor(0);
[/code]



Note that my renderer uses deferred shading and therefore heavily relies on frame buffer objects. I am currently developing on Windows 7. Any help is appreciated.

Yours3!f    1532
Are you relying on the libraries' own 2D drawing stuff?

If so, the performance penalty might be because SDL's internal 2D drawing doesn't rely on OpenGL.

rip-off    10976
Can you post a minimal program reproducing this behaviour?

Your SDL code is quite unusual; when dealing with OpenGL you generally just pass SDL_OPENGL (possibly with SDL_FULLSCREEN) to SDL_SetVideoMode(). Most of the other flags you are setting probably shouldn't be used.

zacaj    667
My engine isn't using FBOs yet, but when I did a quick speed test before starting I found SDL to be MUCH faster than GLFW.

Ninjaboi    163
You could try taking a look at the SDL event handling in your code. When I first started using SDL, I made the mistake of not specifying which events I actually wanted the program to handle; instead it processed every event: keyboard, mouse, OS-specific, joypad, etc. Once I limited the code to only the events I wanted, resource usage dropped dramatically.

c_olin    197
Thanks for the quick replies everyone.

[quote name='Yours3!f' timestamp='1306695891' post='4817204']
are you relying on the libraries' own 2D drawing stuff?

if so performance penalty might be because SDL's internal 2D drawing doesn't rely on OpenGL.
[/quote]

No.

[quote name='rip-off' timestamp='1306697697' post='4817214']
Can you post a minimal program reproducing this behaviour?
[/quote]

That is quite impossible as my engine is fairly complex.

[quote name='rip-off' timestamp='1306697697' post='4817214']
Can you post a minimal program reproducing this behaviour?

Your SDL code is quite unusual, when dealing with OpenGL you generally just pass SDL_OPENGL (possibly with SDL_FULLSCREEN) to SDL_SetVideoMode(). Most of the other flags you are setting probably shouldn't be used.
[/quote]

I tried getting rid of everything else and just calling SDL_SetVideoMode with SDL_OPENGL and saw no observable difference. At least the code is considerably shorter.

[quote name='Ninjaboi' timestamp='1306706219' post='4817254']
You could try taking a look at your SDL event snippet of your code. When I first started using SDL, I made the mistake of not specifying all the events that I wanted to have actually be taken for the program. Instead, it checked all events, keyboard, mouse, OS specific, joypad, etc. Optimizing the code for only the events I wanted, it reduced resource usage dramatically.
[/quote]

This might be on the right track. I added an event filter and got a smoother framerate, but it is still hovering around 70, which is concerning. Here is my event code. It just translates keyboard and mouse events into internally represented events and pushes them onto the queue.

[code]

int filter(SDL_Event* event) {
    switch (event->type) {
        case SDL_KEYDOWN:
        case SDL_KEYUP:
        case SDL_MOUSEMOTION:
        case SDL_MOUSEBUTTONDOWN:
        case SDL_MOUSEBUTTONUP:
        case SDL_QUIT:
            return 1;
        default:
            return 0;
    }
}

SDL_SetEventFilter((SDL_EventFilter)&filter); // Called on window creation.


void SDLWindow::swapBuffers() {
    SDL_GL_SwapBuffers();

    SDL_Event event;
    Vector<2, int> mouseOrigin = getMouseOrigin();

    while (SDL_PollEvent(&event)) {
        switch (event.type) {
            case SDL_KEYDOWN:
                pushInputEvent(InputEvent(InputEventType::KeyDown, mouseOrigin, (Key::Enum)(int)event.key.keysym.sym));
                break;
            case SDL_KEYUP:
                pushInputEvent(InputEvent(InputEventType::KeyUp, mouseOrigin, (Key::Enum)(int)event.key.keysym.sym));
                break;
            case SDL_MOUSEMOTION:
                pushInputEvent(InputEvent(InputEventType::MouseMove, Vector<2, int>((int)event.motion.x, (int)event.motion.y)));
                break;
            case SDL_MOUSEBUTTONDOWN:
                pushInputEvent(InputEvent(InputEventType::KeyDown, mouseOrigin, (Key::Enum)(int)event.button.button));
                break;
            case SDL_MOUSEBUTTONUP:
                pushInputEvent(InputEvent(InputEventType::KeyUp, mouseOrigin, (Key::Enum)(int)event.button.button));
                break;
            case SDL_QUIT:
                break;
            default:
                break;
        }
    }
}
[/code]


Ninjaboi    163
Not quite sure if this will work, but you could try removing the default case in that event switch statement. If I recall, that was the resource-hungry part of my code in one of my first SDL projects. My theory was: "I told it to check these events, but if none of the events I've specified has occurred, then default must be getting called for every single possible event excluding the ones I've already specified in my cases." Sorry if that doesn't make much sense; I'm in a room of very loud people, and it's hard to think when your head is pounding :cool:.

Tell us if that works ( cross your fingers! ).

EDIT: You might also check for your games refresh timer/loop. See if you have already limited your frame rate ( or just the amount of times your main loop is refreshed ). I usually keep mine at 20 for small 2D games, and 60 for everything else.

c_olin    197
Thanks for the reply.

[quote name='Ninjaboi' timestamp='1306713773' post='4817285']
Not quite sure if this will work, but you could try removing the default case in that event switch statement. If I recall, that was the resource-hungry checker in my code in one of my first projects using SDL. I believe the theory I had behind it was "I told it to check these events, but I'm guessing that if none of the events that I've specified are the event that has occurred, then default must be being called for every single event that's possible excluding the ones that I've already specified in my cases.". Sorry if that doesn't make much sense, I'm in a room of very loud people, and it's hard to think when your head is pounding :cool:.

Tell us if that works ( cross your fingers! ).
[/quote]

I tried this and it had no effect. Which makes sense because with my event filter in place I am only receiving 0-3 events per frame.

[quote name='Ninjaboi' timestamp='1306713773' post='4817285']
EDIT: You might also check for your games refresh timer/loop. See if you have already limited your frame rate ( or just the amount of times your main loop is refreshed ). I usually keep mine at 20 for small 2D games, and 60 for everything else.
[/quote]

My main loop should not have anything to do with it, since I do not have this problem when using GLFW.

mhagain    13430
With that kind of framerate drop, you're doing something wrong.

First thing is - as always with framerates in that kind of region - check for vsync.

I note that you're asking for a 16-bit depth buffer. Double check what you actually get (SDL_GL_GetAttribute) and also double-check that you're not getting stencil as well. It's common enough (not widespread but I've seen it happen a few times) for OpenGL context creation to give you stencil even if you didn't ask for it (or asked for 0 bits), and if so, you should be clearing stencil at the same time as you clear depth. That will only account for a ~10% to ~20% perf drop, but it's still significant enough.
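A minimal sketch of that check against the SDL 1.2 API (to be called after SDL_SetVideoMode() succeeds; the function name is mine):

```cpp
#include <SDL/SDL.h>
#include <cstdio>

// Ask the context what you actually got, rather than trusting what you
// requested with SDL_GL_SetAttribute before SDL_SetVideoMode.
void printActualPixelFormat() {
    int depth = 0, stencil = 0;
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth);
    SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &stencil);
    std::printf("depth: %d bits, stencil: %d bits\n", depth, stencil);
}
```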

If you're doing an SDL_Delay at the end of each frame, stop doing it now. SDL's timer is quite coarse, and SDL_Delay guarantees a minimum sleep time, not a maximum or an exact one. You may be sleeping for a lot longer than you think you are.

Any reason for the SDL_HWPALETTE? You're not trying to use OpenGL in color index mode, are you? Take it out back and shoot it; you might be getting some weird pixel format that's dropping you to software emulation. While you're at it, strip your startup flags to the bare minimum: start with what was suggested above and only add what you actually need to support your program.

So start with that, see how you get on, and report back. ;)

c_olin    197
Thanks for the reply. The SDL init code is a copy-paste from the old NeHe tutorials. I did try removing all flags except SDL_OPENGL and got the same results. Vsync is turned off. I don't use the stencil buffer, but I don't really care whether there is one or not. In GLFW I tried 8 bits per channel, a 24-bit depth buffer, and a 24-bit stencil buffer and it worked great. I tried the exact same options in SDL and got the same frame-rate drop and stuttering:

[code]

SDL_SetEventFilter((SDL_EventFilter)&filter);

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 24);

SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 0);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

if (!SDL_SetVideoMode(width, height, bytesPerPixel * 8, SDL_OPENGL)) {
    String error(SDL_GetError());
    SDL_Quit();

    throw Exception("Failed: " + error);
}

SDL_ShowCursor(0);
[/code]

I have tried messing with the values and commenting out lines, and nothing seems to affect my results. My OpenGL code does not generate any OpenGL errors, and I have run it through gDEbugger and removed all deprecated, redundant, and erroneous calls. I really have no idea how I could be getting such dramatically different results. I am considering digging through SDL and GLFW to see how the native code differs between the two.


rip-off    10976
You don't need to post your entire engine. Just the basic OpenGL/SDL initialisation code, the event code and a drawing loop that does nothing. Enough to demonstrate the problem, nothing more.

c_olin    197
[quote name='rip-off' timestamp='1306763172' post='4817498']
You don't need to post your entire engine. Just the basic OpenGL/SDL initialisation code, the event code and a drawing loop that does nothing. Enough to demonstrate the problem, nothing more.
[/quote]



I see. I have shown all of my SDL code, and my OpenGL code is fairly minimal (just VBOs, FBOs, textures, and shaders). However, I'm not sure the frame-rate difference would be apparent with no drawing. In fact, I'm pretty sure it would not be observable: my main menu GUI is not considerably slower with SDL; the slowdown only appears when the in-game scene, which is rather complex, is rendered.

Thanks for the reply; I appreciate the help. I'm considering sticking with GLFW for a while, since graphics API independence is not super important right now; perhaps as my engine matures this issue will disappear, as it seems the slowdown may not be directly related to SDL.


