
OpenGL 3+ and SDL


proanim    455
Is there a way to use SDL, GLFW and GLEW together? It seems that all three conflict in some way. I am looking to initialize at least OpenGL 3.3 with GLFW, load OpenGL extensions with GLEW, and handle input and the message pump with SDL. It also seems that if GLFW doesn't create a window, SDL window creation fails, and if GLFW does create a window, then SDL also creates its own window, and none of the GL functions like 'glClearColor' have any effect whatsoever.

Aldacron    4544
[quote name='proanim' timestamp='1355797440' post='5011887']
Is there a way to use SDL, GLFW and GLEW together? It seems that all three conflict in some way. I am looking to initialize at least OpenGL 3.3 with GLFW, load OpenGL extensions with GLEW, and handle input and the message pump with SDL.[/quote]

I don't understand why you would want to do this. GLFW handles input events just fine; you gain nothing by attempting to mix in SDL. I strongly recommend you either drop SDL and use GLFW exclusively, or get [url="http://www.libsdl.org/hg.php"]the latest SDL2 snapshot[/url], compile it, and use it instead. SDL2 supports OpenGL 3+ context creation.
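For what it's worth, a GLFW-only setup covers both the context and the input pump. A minimal sketch (assuming the GLFW 3.x API; error handling trimmed):

[source lang="cpp"]#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit())
        return 1;

    // request an OpenGL 3.3 core profile context
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow* window = glfwCreateWindow(800, 600, "Demo", NULL, NULL);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window))
    {
        glfwPollEvents();   // GLFW handles the message pump and input itself
        glfwSwapBuffers(window);
    }

    glfwTerminate();
    return 0;
}[/source]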

proanim    455
Ok, so I decided to use SDL2 and I have created a window with an OpenGL context, everything nice and easy, but I can't get GLEW to initialize when using SDL2. I need GLEW for shaders, since there is no shader support in SDL2 itself - no glCreateShader and such. Any help?

ic0de    1012
[quote name='proanim' timestamp='1355844662' post='5012084']
Ok, so I decided to use SDL2 and I have created a window with an OpenGL context, everything nice and easy, but I can't get GLEW to initialize when using SDL2. I need GLEW for shaders, since there is no shader support in SDL2 itself - no glCreateShader and such. Any help?
[/quote]

When are you calling glewInit()? I'm using SDL2, I call it after initializing my window and OpenGL context, and it works just fine.
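For reference, the order that works for me looks roughly like this (a minimal sketch; the glewExperimental line is an assumption on my part - it is commonly needed so GLEW picks up core-profile entry points):

[source lang="cpp"]// 1. window, 2. context, 3. GLEW - in that order
SDL_Window* win = SDL_CreateWindow("Test",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
    800, 600, SDL_WINDOW_OPENGL);
SDL_GLContext ctx = SDL_GL_CreateContext(win); // glewInit() needs a current context

glewExperimental = GL_TRUE; // commonly needed with core profiles
GLenum err = glewInit();
if (err != GLEW_OK)
    fprintf(stderr, "GLEW error: %s\n", (const char*)glewGetErrorString(err));[/source]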

proanim    455
Ok, no, it doesn't crash when I call glewInit() after I have set up the window and OpenGL context, but I still can't get shaders to work. This really pisses me off big time; whenever I want to test something, it fails at the shaders. I wanted to test SDL2 with this http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/ tutorial. It seems to work fine on its own, but when I use its shader parts 'shader.hpp' and 'shader.cpp' to make the shaders work, it fails with a 'vector out of bounds' error. If I comment out the part that reports errors, it still doesn't work. I don't have any idea how to fix this, any help?

Aldacron    4544
A "vector out of bounds" error has to do with a std::vector being accessed with an invalid index. In the code from the tutorial you linked, the only use I see of std::vector is in creating a buffer for error messages from the shaders (a rather odd thing to do IMO). So my guess is that your shader is failing, the error path is being executed and there is a mistake with the buffer. Did you copy shader.cpp by hand, drop it in your project, or copy/paste?

Whatever the case, all anyone can do is guess without seeing *your* offending code and the exact error messages. So in the future, when you have errors like this, please include more information so people can help you more easily.
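For illustration, a guarded version of that error path avoids ever indexing an empty vector (a sketch, not the tutorial's exact code; 'shaderID' stands in for its VertexShaderID):

[source lang="cpp"]GLint status = GL_FALSE;
GLint logLength = 0;
glGetShaderiv(shaderID, GL_COMPILE_STATUS, &status);
glGetShaderiv(shaderID, GL_INFO_LOG_LENGTH, &logLength);
if (status == GL_FALSE && logLength > 1)
{
    // logLength includes the terminating null, so the vector is never empty here
    std::vector<char> log(logLength);
    glGetShaderInfoLog(shaderID, logLength, NULL, &log[0]);
    fprintf(stderr, "Shader compile error: %s\n", &log[0]);
}[/source]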

proanim    455
I have copy/pasted the shader.hpp and shader.cpp files from the tutorial, and the vertex and fragment shaders are in the same directory as the executable.

My code looks like this:

base.h
[source lang="cpp"]// main header file
#ifndef _BASE_H_
#define _BASE_H_

// preprocessor directives
#define WIN32_LEAN_AND_MEAN

// included headers
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>
#include <string>
#include <vector>
#include <iostream>
#include <fstream>
#include <algorithm>

// OpenGL headers
#include "GL/glew.h"
#include "GL/glm/glm.hpp"
#include "GL/glm/gtc/matrix_transform.hpp"
#include <GL/gl.h>

// SDL headers
#include "SDL/SDL.h"

// project specific headers
#include "shader.hpp"

// included libraries
#pragma comment(lib, "glew32.lib")
#pragma comment(lib, "opengl32.lib")
#pragma comment(lib, "SDL.lib")
#pragma comment(lib, "SDLmain.lib")

// namespace
using namespace glm; // must be before std namespace
using namespace std;

// function prototypes

//global variables
SDL_Window *MainWin;
SDL_GLContext MainContext;

SDL_Event event; // the event structure that will be used
bool quit = false; // make sure the program waits for a quit

GLuint VBO; // vertex buffer object
GLuint IBO; // index buffer object

#endif[/source]
main.cpp

[source lang="cpp"]#include "base.h"

int main(int argc, char* argv[])
{
// init SDL subsystems
if(SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) == -1)
return 1;

// setup OpenGL version
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

//setup stencil buffer
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);

// setup depth buffer
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

// create main window
MainWin = SDL_CreateWindow("TestApp",
SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
800, 600, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);

MainContext = SDL_GL_CreateContext(MainWin); // attach OpenGL context to window

// init GLEW
if (glewInit() != GLEW_OK)
{
fprintf(stderr, "Failed to initialize GLEW\n");
return -1;
}

SDL_GL_SetSwapInterval(0); // vsync

// background color
// dark blue background
glClearColor(0.0f, 0.0f, 0.3f, 0.0f);

GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);

// create and compile our GLSL program from the shaders
GLuint programID = LoadShaders("SimpleVertexShader.vertexshader", "SimpleFragmentShader.fragmentshader");

static const GLfloat g_vertex_buffer_data[] = {
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
};

GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

// while the user hasn't quit
while(quit == false)
{
// clear the screen
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// use shader
glUseProgram(programID);

// 1st attribute buffer : vertices
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
3, // size
GL_FLOAT, // type
GL_FALSE, // normalized?
0, // stride
(void*)0 // array buffer offset
);

// Draw the triangle !
glDrawArrays(GL_TRIANGLES, 0, 3); // From index 0 to 3 -> 1 triangle

glDisableVertexAttribArray(0);

SDL_GL_SwapWindow(MainWin);

// while there's an event to handle
while(SDL_PollEvent(&event))
{
// if the user has Xed out the window
if(event.type == SDL_QUIT)
{
// quit the program
quit = true;
}
}
}

// Close and destroy the window
SDL_DestroyWindow(MainWin);
// quit SDL
SDL_Quit();

return 0;
}[/source]
Any ideas?

BitMaster    8651
My suggestion would be to use the debugger to step into the call to LoadShaders and then follow the code flow line by line.

That said, I highly suspect your working directory is simply not what you expect it to be. But the bug in the shader error handling must be fixed anyway, so work on that first.
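A cheap way to test the working-directory theory is to verify the file open before reading (a sketch; 'path' stands for whatever LoadShaders receives):

[source lang="cpp"]std::ifstream file(path);
if (!file.is_open())
{
    // if this triggers, the shader files are not where the process is looking;
    // a debugger often starts the exe with the project dir, not the exe dir,
    // as the working directory
    fprintf(stderr, "Cannot open '%s' - check the working directory\n", path);
    return 0;
}[/source]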

proanim    455
When debugging, this line crashes:

glGetShaderInfoLog(VertexShaderID, InfoLogLength, NULL, &VertexShaderErrorMessage[0]);

Now I just tried to compile the actual tutorial, and I am fairly sure I could run it before, but now there is the same crash no matter what. And since there is a warning about conflicting libraries - when I switch to the multi-threaded DLL runtime instead (the multi-threaded debug DLL option) in the project configuration, I get a failure at compile time with:

shader.obj : error LNK2019: unresolved external symbol __imp___CrtDbgReportW referenced in function "public: char & __thiscall std::vector<char,class std::allocator<char> >::operator[](unsigned int)" (??A?$vector@DV?$allocator@D@std@@@std@@QAEAADI@Z)

How should you compile this thing so it works?

BitMaster    8651
If you get errors about "conflicting libraries" (you failed to post an accurate error message again, by the way), I assume that means something along the lines of MSVCRT? If so, it's no wonder things blow up right and left.

Everything you link together statically must use the same runtime (/MT, /MTd, /MD or /MDd). There are other ways to cause problems here, but with MSVC, the choice of the right runtime for all libraries is the most common issue.

If switching to /MDd, as you tried, causes problems at link time, have you tried a "Rebuild All" of the project in question? What exactly are you linking to, and with which runtimes was it compiled?
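To make that concrete, consistent settings look roughly like this on the compiler command line (hypothetical commands; in the IDE this is the C/C++ -> Code Generation -> Runtime Library setting):

[source lang="cpp"]// Debug: build *everything* with /MDd (Multi-threaded Debug DLL)
//   cl /MDd /Zi /c main.cpp shader.cpp
// Release: build *everything* with /MD (Multi-threaded DLL)
//   cl /MD /O2 /c main.cpp shader.cpp
// Mixing them (e.g. /MTd objects against /MD libraries) produces exactly the
// kind of LNK2019 / conflicting-runtime errors shown above.[/source]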

BitMaster    8651
Again, have you tried rebuilding it all? Have you changed the predefined Debug build of MSVC to use /MD? That is probably not a good idea as the only change; use the predefined Release build instead.

proanim    455
With a release build (/MD is the default), the same libs and everything else unchanged, it compiles and runs without the crash and without the 'vector out of bounds' error. But the shaders, which are placed in the same directory as the exe (so the file path should be correct), still fail?

proanim    455
Ok, I found my old project file in which I got the tutorials and shaders to work. It turns out I set up the project exactly the same, except that I used GLFW instead of SDL at the time. This is very irritating, since I don't know why it only works with that project. Is there something in SDL, or somewhere else, that can cause these problems with the shaders and everything from the above posts?

EDIT: the project that works uses /MD and still works in debug without errors or crashes, unlike the one where I used SDL.

