FonzTech

OpenGL glCreateShader crashes application


Hi all! I'm following this tutorial:
http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/
 
I'm using Code::Blocks with SDL 2.0.4. I've correctly setup all the includes and libs (and GLEW statically). These are the linker arguments:
-lmingw32
-lglew32s
-lOpenGL32
-lSDL2
-lSDL2_image
-lSDL2_mixer
 
MinGW compiler and libraries are x86_64. I have initialized SDL (everything), GLEW, setup the OpenGL context and made it current. Also glewInit does not give any error.
These are the SDL settings I use:
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_SetHint(SDL_HINT_RENDER_OPENGL_SHADERS, "1");
All seems to work, until glCreateShader is called.
 
It compiles without errors, except for one strange warning:

Warning: corrupt .drectve at end of def file


Searching around the internet, I found that the compiler throws this warning because I'm using the GLEW build made for VC++, not for MinGW. I downloaded the libraries from the official website... So is the problem the GLEW libraries, or something else?

Thanks in advance for any help!


Searching around the internet, I found that the compiler throws that warning because I'm using the GLEW made for VC++, not for MinGW. I downloaded the libraries from the official website... So the problem are the GLEW libraries? Or something else?

 

I would definitely worry about that.

 

Another thing you could check is whether the glCreateShader function pointer is NULL. What sort of hardware are you running on? You could also call glGetError just before it to confirm nothing is amiss.

 

You could try building GLEW for MinGW yourself; this thread might shed some light: http://stackoverflow.com/questions/6005076/building-glew-on-windows-with-mingw — it's about Windows, so it should have some useful info.
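In code, the two checks suggested above might look roughly like this (a sketch, assuming GLEW with a current GL context; with GLEW, glCreateShader resolves to a function pointer, so it can be compared against null before you call it):

```cpp
#include <cstdio>
#include <GL/glew.h>

// Sketch: run this right after glewInit(), with a current GL context.
void checkShaderEntryPoint()
{
    // Flush any pending GL error so later checks are meaningful.
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        std::printf("Pending GL error before check: 0x%x\n", err);

    // With GLEW, glCreateShader is a loaded function pointer; if the
    // loader could not resolve it, calling it crashes at address 0.
    if (glCreateShader == nullptr)
        std::printf("glCreateShader was not loaded!\n");
    else
        std::printf("glCreateShader is available.\n");
}
```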

Edited by Nanoha


You need to compile GLEW yourself for MinGW. I've been through this myself.
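As a rough sketch of what that build might look like (hypothetical steps, assuming an MSYS shell and a GLEW source release whose Makefile supports a MinGW `SYSTEM` target; check the README in your GLEW tree, since targets vary between releases):

```shell
# Hypothetical build steps -- directory name and makefile targets
# may differ for your GLEW release; consult its README first.
cd glew-1.13.0              # your unpacked GLEW source directory
make SYSTEM=mingw           # builds the static lib and DLL for MinGW
make install GLEW_DEST=/mingw   # optional: copy headers and libs
```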

Something else worth checking is whether OpenGL errors come up after GLEW initialization (not the return value from glewInit(), but the OpenGL error itself).
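A sketch of that check (assuming GLEW; note that glewInit with glewExperimental on a core profile is known to leave a spurious GL_INVALID_ENUM in the error queue, so drain it in a loop rather than reading a single error):

```cpp
#include <cstdio>
#include <GL/glew.h>

// Call immediately after glewInit() succeeds.
void drainPostGlewErrors()
{
    // glewInit with glewExperimental on a core context often leaves a
    // harmless GL_INVALID_ENUM; loop until the error queue is empty.
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        std::printf("GL error after glewInit: 0x%x\n", err);
}
```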

Edited by WoopsASword


I have downloaded Visual Studio 2015, recompiled SDL2main.lib for VS2015 x64, and everything is set up correctly.

 

Another thing you could check is if glCreateShader is NULL

 

 

Yes, printing it results in a null pointer -.-"

 

Visual Studio also reports an exception at address 0x0000000000000000 (null). So glCreateShader does not exist?

 

Anyway, I'm running on a sh**ty laptop with an i3-M330 and a Radeon 5650.

 

This is the version of my driver: 15.20.1062.1003-150728a1-187449C

 

I've read on the AMD website that it does support modern OpenGL...

 

Any advice for this?

Edited by ReDevilGames


A common mistake is to call glewInit before creating your context; worth checking that you haven't done this.
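As a minimal sketch of the required ordering (error handling mostly elided; this assumes SDL2 plus GLEW, as in the code above):

```cpp
#include <GL/glew.h>
#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_VIDEO);

    // 1. Window created with the OpenGL flag.
    SDL_Window* win = SDL_CreateWindow("test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);

    // 2. Context created and made current BEFORE glewInit.
    SDL_GLContext ctx = SDL_GL_CreateContext(win);
    SDL_GL_MakeCurrent(win, ctx);

    // 3. Only now is it safe to load the GL entry points.
    glewExperimental = GL_TRUE;
    if (glewInit() != GLEW_OK)
        return 1;

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```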


This is how I initialize all the things:

// Init SDL Library
SDL_Init(SDL_INIT_EVERYTHING);

// Setup SDL for OpenGL
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
SDL_GL_SetAttribute(SDL_GL_SHARE_WITH_CURRENT_CONTEXT, 1);
SDL_SetHint(SDL_HINT_RENDER_OPENGL_SHADERS, "1");

// Create window and renderer
window = SDL_CreateWindow("Game Engine", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 1280, 720, SDL_WINDOW_SHOWN | SDL_WINDOW_OPENGL);
renderer = SDL_CreateRenderer(window, 0, SDL_RENDERER_ACCELERATED);

// Create the OpenGL context
context = SDL_GL_CreateContext(window);
SDL_GL_MakeCurrent(window, context);

// Init GLEW
glewExperimental = GL_TRUE;
glewInit();

What's wrong? Why can't I call glCreateShader?

Edited by ReDevilGames

...
SDL_GL_SetAttribute(SDL_GL_SHARE_WITH_CURRENT_CONTEXT, "1");
...

Hmmm... how about you create a minimal example with your real code that we could compile?


https://wiki.libsdl.org/SDL_GL_SetAttribute

 

 

You should use SDL_GL_GetAttribute() to check the values after creating the OpenGL context, since the values obtained can differ from the requested ones.

 

The probability is that you're not getting a hardware-accelerated GL context, i.e. you're getting one that uses Microsoft's software GL 1.1 implementation. As the quote above shows, just because you ask SDL for something, there's no guarantee you're going to get it. So you need to work through your SDL_GL_SetAttribute calls to find which one is bumping you down to a software context. To me, that 16-bit display mode looks like a possible candidate.
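A sketch of how you might verify what you actually got (assuming SDL2 and GLEW already initialized, with the context current): query the attributes back, and also print the GL strings, since a renderer string like "GDI Generic" with version "1.1.0" is the telltale sign of Microsoft's software fallback.

```cpp
#include <cstdio>
#include <GL/glew.h>
#include <SDL.h>

// Call after SDL_GL_CreateContext + SDL_GL_MakeCurrent + glewInit.
void reportContext()
{
    int major = 0, minor = 0, accel = 0;
    SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &major);
    SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &minor);
    SDL_GL_GetAttribute(SDL_GL_ACCELERATED_VISUAL, &accel);
    std::printf("Context: %d.%d, accelerated: %d\n", major, minor, accel);

    // "GDI Generic" / version "1.1.0" means the software fallback.
    std::printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    std::printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
}
```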


Hmmm... how about you create a minimal example with your real code that we could compile?

That was a copy-paste error because I messed up this post... Just look in the history... ;)
 

The probability is that you're not getting a hardware-accelerated GL context [...] To me that 16-bit display mode looks a possible candidate.

I now tried setting SDL_GL_ACCELERATED_VISUAL to 1 (by default it should use hardware if possible) and the buffer size to 32, but it still crashes...

I'll try to check the values...

 

EDIT: I tried, but the values I get are the same as what I set before initializing the context...

Honestly, I don't know why I'm getting this error...

Edited by ReDevilGames
