OpenGL program doesn't start on other people's computers.



#1 Syerjchep   Members   -  Reputation: 180

Posted 26 March 2014 - 08:03 PM

So even though I can run my program nicely, it seems few others can.

Here's some init code, with irrelevant stuff snipped out between lines:

if( !glfwInit() )
    log("Failed to initialize GLFW\n");

// Code for prefs and such goes here.

glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
if(oldopengl)
{
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
}
else
{
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 4);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 0);
}
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

if(fullscreen)
{
    // Open a window and create its OpenGL context
    if( !glfwOpenWindow( screenx, screeny, 0,0,0,0, 32,0, GLFW_FULLSCREEN ) )
    {
        log("Failed to open GLFW window. If you have an older (Intel) GPU, they are not 4.0 compatible.\n");
        glfwTerminate();
    }
}
else
{
    // Open a window and create its OpenGL context
    if( !glfwOpenWindow( screenx, screeny, 0,0,0,0, 32,0, GLFW_WINDOW ) )
    {
        log("Failed to open GLFW window. If you have an older (Intel) GPU, they are not 4.0 compatible.\n");
        glfwTerminate();
    }
}

// Initialize GLEW
glewExperimental = true; // Needed for core profile
if (glewInit() != GLEW_OK)
    log("Failed to initialize GLEW\n");

Excuse the lack of formatting, I can't add tabs easily here.

Also, I actually took out a lot of other code in between lines, just leaving the OpenGL init code.

 

I can run it fine. I have an older but higher-tier desktop with Windows 7, a six-core processor, 8 GB of RAM, and a GTX 560 that can run it well.

I also have a Windows 8 laptop with integrated graphics that can run it as well, albeit with lower settings to maintain framerates.

 

Yet it seems that a majority of the people I give the application to simply can't run it, as in it crashes immediately.

Log files often come back like this:

0   : Program started.
0.00   : 4 quotes read.
0.00   : Reading prefs file.
0.05   : Failed to open GLFW window. If you have an older (Intel) GPU, they are not 4.0 compatible.

0   : Failed to initialize GLEW

0   : OpenAL: No Error
0   : Networking initalized.
0   : Initalized.
0   : Error: Missing GL version
0   : Using glew version: 1.7.0
0   : 
0   : 
0   : 
0   : 
0   : Opengl version used: 13107512.2007382186

In cases like the above, OpenGL (specifically the GLFW window and context creation) doesn't even initialize.

Other times the logs look like this:

0   : Program started.
0.00   : 4 quotes read.
0.00   : Reading prefs file.
0.38   : OpenAL: No Error
0.38   : Networking initalized.
0.38   : Initalized.
0.38   : Using glew version: 1.7.0
0.38   : AMD Radeon HD 6800 Series
0.38   : ATI Technologies Inc.
0.39   : 3.2.12618 Core Profile Context 13.251.0.0
0.39   : 4.30
0.39   : Opengl version used: 3.2
0.39   : Compiling shader: StandardShadingVertex.glsl

It seems to load okay, but loading the shaders crashes the program. 

(I'll post shader loading code in a second.)

 

 

I realize I haven't provided a ton of information, so I'll answer questions as needed and find the shader loading code to post.

Lastly, I can offer the program itself if anyone wants to try running it.

Here:

http://syerjchep.org/release.zip

If you can even get to the black screen with white text on it, then it works for you.

 




#2 Chris_F   Members   -  Reputation: 2227


Posted 26 March 2014 - 08:43 PM

Your code shows the use of GLFW (the old version, by the way; GLFW3 is a better choice), and yet the program you included comes with SDL. That seems a bit odd. I can't say why the first example failed; it obviously failed to create a valid context, which is why GLEW didn't initialize. As for the second example, it's likely one or more of your shaders contain GLSL that is invalid. Nvidia's GLSL compiler is implemented by way of Cg, and it will allow some code which is only valid in Cg to compile as GLSL. AMD's compiler is much more strict.
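
Given that diagnosis, one way to harden the init code from the first post is to stop as soon as context creation fails, rather than carrying on after glfwTerminate() and letting GLEW and every later GL call run without a context (which would explain the "Missing GL version" and garbage version numbers in the first log). A rough sketch only, using the same GLFW 2.x calls and the OP's log() helper; openWindowWithFallback is a made-up name:

#include <GL/glfw.h>

// Sketch: try a 4.0 core context first, fall back to 3.2 core, and report failure
// to the caller instead of continuing without a context.
bool openWindowWithFallback(int screenx, int screeny, bool fullscreen)
{
    const int versions[2][2] = { {4, 0}, {3, 2} };
    const int mode = fullscreen ? GLFW_FULLSCREEN : GLFW_WINDOW;

    for (int i = 0; i < 2; ++i)
    {
        // Set the hints before every attempt; GLFW 2.x does not guarantee they
        // persist across glfwOpenWindow calls.
        glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
        glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, versions[i][0]);
        glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, versions[i][1]);
        glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        if (glfwOpenWindow(screenx, screeny, 0, 0, 0, 0, 32, 0, mode))
            return true;   // got a context; safe to call glewInit() now

        log("Failed to open GLFW window with the requested GL version.\n");
    }

    glfwTerminate();
    return false;          // caller should abort rather than touch GL without a context
}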



#3 TheChubu   Crossbones+   -  Reputation: 4070


Posted 26 March 2014 - 08:55 PM


Nvidia's GLSL compiler is implemented by way of Cg
I doubt that's the case anymore. Recent GLSL has no Cg equivalent.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#4 Syerjchep   Members   -  Reputation: 180


Posted 26 March 2014 - 09:01 PM

Your code shows the use of GLFW (the old version, by the way; GLFW3 is a better choice), and yet the program you included comes with SDL. That seems a bit odd. I can't say why the first example failed; it obviously failed to create a valid context, which is why GLEW didn't initialize. As for the second example, it's likely one or more of your shaders contain GLSL that is invalid. Nvidia's GLSL compiler is implemented by way of Cg, and it will allow some code which is only valid in Cg to compile as GLSL. AMD's compiler is much more strict.

SDL_Net is used for the program's networking. It is not used for anything concerning graphics, though I believe I still have to link to all the standard SDL graphical libraries.

As for shaders, yes, they're GLSL. But I kind of thought that was just the term (OpenGL Shading Language) for shaders used with OpenGL, so I don't know what you mean by Cg. The shaders are in the zip with the program, obviously, but if requested I could just post them here.

 

 


Nvidia's GLSL compiler is implemented by way of Cg
I doubt that's the case anymore. Recent GLSL has no Cg equivalent.

 

Once again, even though I know how to program in OpenGL and write GLSL shaders, I don't really know what you're talking about.

 

 

Edit: This is the code I'm currently using to load shaders; it's from a tutorial some of you might recognize:

GLuint LoadShaders(const char * vertex_file_path,const char * fragment_file_path)
{
    // Create the shaders
    GLuint VertexShaderID = glCreateShader(GL_VERTEX_SHADER);
    GLuint FragmentShaderID = glCreateShader(GL_FRAGMENT_SHADER);
 
    // Read the Vertex Shader code from the file
    std::string VertexShaderCode;
    std::ifstream VertexShaderStream(vertex_file_path, std::ios::in);
    if(VertexShaderStream.is_open())
    {
        std::string Line = "";
        while(getline(VertexShaderStream, Line))
            VertexShaderCode += "\n" + Line;
        VertexShaderStream.close();
    }
 
    // Read the Fragment Shader code from the file
    std::string FragmentShaderCode;
    std::ifstream FragmentShaderStream(fragment_file_path, std::ios::in);
    if(FragmentShaderStream.is_open()){
        std::string Line = "";
        while(getline(FragmentShaderStream, Line))
            FragmentShaderCode += "\n" + Line;
        FragmentShaderStream.close();
    }
 
    GLint Result = GL_FALSE;
    int InfoLogLength;
 
    // Compile Vertex Shader
    log("Compiling shader: "+string(vertex_file_path));
    char const * VertexSourcePointer = VertexShaderCode.c_str();
    glShaderSource(VertexShaderID, 1, &VertexSourcePointer , NULL);
    glCompileShader(VertexShaderID);
 
    // Check Vertex Shader
    glGetShaderiv(VertexShaderID, GL_COMPILE_STATUS, &Result);
    glGetShaderiv(VertexShaderID, GL_INFO_LOG_LENGTH, &InfoLogLength);
    std::vector<char> VertexShaderErrorMessage(InfoLogLength);
    glGetShaderInfoLog(VertexShaderID, InfoLogLength, NULL, &VertexShaderErrorMessage[0]);
    log(&VertexShaderErrorMessage[0]);
 
    // Compile Fragment Shader
    log("Compiling shader: "+string(fragment_file_path));
    char const * FragmentSourcePointer = FragmentShaderCode.c_str();
    glShaderSource(FragmentShaderID, 1, &FragmentSourcePointer , NULL);
    glCompileShader(FragmentShaderID);
 
    // Check Fragment Shader
    glGetShaderiv(FragmentShaderID, GL_COMPILE_STATUS, &Result);
    glGetShaderiv(FragmentShaderID, GL_INFO_LOG_LENGTH, &InfoLogLength);
    std::vector<char> FragmentShaderErrorMessage(InfoLogLength);
    glGetShaderInfoLog(FragmentShaderID, InfoLogLength, NULL, &FragmentShaderErrorMessage[0]);
    log(&FragmentShaderErrorMessage[0]);
 
    // Link the program
    log("Linking program.");
    GLuint ProgramID = glCreateProgram();
    glAttachShader(ProgramID, VertexShaderID);
    glAttachShader(ProgramID, FragmentShaderID);
    glLinkProgram(ProgramID);
 
    // Check the program
    glGetProgramiv(ProgramID, GL_LINK_STATUS, &Result);
    glGetProgramiv(ProgramID, GL_INFO_LOG_LENGTH, &InfoLogLength);
    std::vector<char> ProgramErrorMessage( max(InfoLogLength, int(1)) );
    glGetProgramInfoLog(ProgramID, InfoLogLength, NULL, &ProgramErrorMessage[0]);
    log(&ProgramErrorMessage[0]);
 
    glDeleteShader(VertexShaderID);
    glDeleteShader(FragmentShaderID);
 
    return ProgramID;
}

Edited by Syerjchep, 26 March 2014 - 09:05 PM.


#5 Chris_F   Members   -  Reputation: 2227


Posted 26 March 2014 - 09:49 PM

 


Nvidia's GLSL compiler is implemented by way of Cg
I doubt that's the case anymore. Recent GLSL has no Cg equivalent.

 

 

In any case, I know for a fact that Nvidia's GLSL compiler will accept some code which does not conform to the Khronos spec. This is one of the pitfalls of developing GLSL shaders on a workstation with an Nvidia graphics card. Luckily there has been a recent development called glslang, which is an official GLSL reference compiler from Khronos. If glslangValidator compiles your code without error and AMD or Nvidia fails, then it's their fault, not yours.


Edited by Chris_F, 26 March 2014 - 10:58 PM.


#6 Kaptein   Prime Members   -  Reputation: 2057


Posted 26 March 2014 - 10:47 PM

Like Chris_F said, Nvidia's compiler is a lot more lenient (and has more features), while AMD's is standards-compliant.

You need to check for GLSL compilation errors. These are things you can fix on your friend's computer in a heartbeat, because you should be getting the output as you compile each shader (if it fails).

 

I have made an implementation here:

https://github.com/fwsGonzo/library/blob/master/library/opengl/shader.cpp

 

It's probably not perfect, but it will tell you what went wrong.

 

What you are looking for is:

glGetShaderiv(shader_v, GL_COMPILE_STATUS, &status);

and

glGetProgramiv(this->shader, GL_LINK_STATUS, &status);

respectively.

 

Also, don't forget to LIBERALLY check for general OpenGL errors:

if (OpenGL::checkError())

....
 
It's important to have a general idea (best guess) of where things went wrong, since it's unlikely your friends will figure that out for you. :)
 
Also, it might be worth mentioning that unless you absolutely need 32-bit depth testing, you could go for D24S8 (24-bit depth, 8-bit stencil), since that is the most common format.
Whether or not it's the fastest, I have no idea.
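
For reference, in the glfwOpenWindow call from the first post the two arguments after the colour bits are the depth bits and stencil bits, so requesting D24S8 instead of a 32-bit depth buffer would look like this (sketch only, reusing the OP's screenx/screeny and log()):

// Same call as in the first post, but asking for a 24-bit depth buffer and an
// 8-bit stencil buffer (D24S8) instead of 32-bit depth with no stencil.
if (!glfwOpenWindow(screenx, screeny, 0, 0, 0, 0, 24, 8, GLFW_WINDOW))
{
    log("Failed to open GLFW window.\n");
    glfwTerminate();
}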


#7 Chris_F   Members   -  Reputation: 2227


Posted 26 March 2014 - 10:51 PM

If it's available to you, you might also want to log errors from GL_ARB_debug_output/GL_KHR_debug.
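
For anyone unfamiliar with it, a minimal sketch of hooking that up through GLEW follows. It assumes a driver and a GLEW build new enough to expose KHR_debug (the OP's logs show GLEW 1.7.0, which predates it); onGLDebugMessage and enableGLDebugOutput are made-up names, and log() is the OP's helper:

#include <GL/glew.h>
#include <string>

// Forward every driver-generated debug message to the log.
static void GLAPIENTRY onGLDebugMessage(GLenum source, GLenum type, GLuint id,
                                        GLenum severity, GLsizei length,
                                        const GLchar* message, const void* userParam)
{
    log("GL debug: " + std::string(message, length));
}

void enableGLDebugOutput()
{
    if (GLEW_KHR_debug)   // only present on drivers exposing KHR_debug / GL 4.3
    {
        glEnable(GL_DEBUG_OUTPUT);
        glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report at the offending call
        glDebugMessageCallback(onGLDebugMessage, NULL);
    }
}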



#8 Syerjchep   Members   -  Reputation: 180


Posted 26 March 2014 - 11:20 PM

Well, I've implemented plenty of glGetError calls, but now I need to upload the new version of the program and find someone with an AMD card to test it.

Also, is it bad if my vertex shader is version 330 core but my fragment shader is version 150 and they're run with each other?

#9 mhagain   Crossbones+   -  Reputation: 7821


Posted 26 March 2014 - 11:55 PM

Works fine for me.  Checking the stderr log I see the following:

0	: Program started.
0.00	: 4 quotes read.
0.00	: Could not find prefs file, creating with defaults.
0.27	: OpenAL: No Error
0.28	: Networking initalized.
0.28	: Initalized.
0.28	: Using glew version: 1.7.0
0.28	: Intel(R) HD Graphics 4000
0.28	: Intel
0.28	: 4.0.0 - Build 9.18.10.3071
0.28	: 4.00 - Build 9.18.10.3071
0.29	: Opengl version used: 4.0
0.29	: Compiling shader: shaders/StandardShadingVertex.glsl
0.33	: No errors.

This was on a laptop with NVIDIA Optimus but note that it chose to use the Intel graphics, not the NV.  That's my first diagnosis and seems consistent with what you report in your first post:

0.38   : AMD Radeon HD 6800 Series
0.38   : ATI Technologies Inc.
0.39   : 3.2.12618 Core Profile Context 13.251.0.0
0.39   : 4.30
0.39   : Opengl version used: 3.2

A Radeon 6800 shouldn't be using GL 3.2, so my bet is that this is likewise a laptop, this time with AMD switchable graphics and a hacked-up driver (e.g. see http://leshcatlabs.net/2013/11/11/amd-enables-legacy-switchable-graphics-support/ for a discussion of some of the horrible things you had to do if you wanted to update a driver using this technology).

 

Later on today I can test this on a standalone Radeon and we'll see what the results are.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#10 mhagain   Crossbones+   -  Reputation: 7821


Posted 27 March 2014 - 03:49 AM

OK, here's AMD:

4.74	: AMD Radeon HD 6450
4.74	: ATI Technologies Inc.
4.74	: 4.0.12430 Core Profile Context 13.152.1.8000
4.74	: 4.30
4.74	: Opengl version used: 4.0
4.80	: Compiling shader: shaders/StandardShadingVertex.glsl
terminate called after throwing an instance of 'std::logic_error'
  what():  basic_string::_S_construct null not valid

This was accompanied by a nice friendly "the application has requested the Runtime to terminate it in an unusual way" error.  Hopefully that helps you a bit more with troubleshooting.




#11 Syerjchep   Members   -  Reputation: 180


Posted 27 March 2014 - 07:08 AM

Well, I looked up the error and all I got was "a string was constructed from a NULL char".
Basically I've edited my shader loading code to look like this:
GLuint LoadShaders(const char * vertex_file_path,const char * fragment_file_path)
{
    try
    {
        checkForGLError();
        cerr<<"Before shaders...\n";
        // Create the shaders
        GLuint VertexShaderID = glCreateShader(GL_VERTEX_SHADER);
        GLuint FragmentShaderID = glCreateShader(GL_FRAGMENT_SHADER);

        cerr<<"Shaders allocated.\n";
        checkForGLError();

        // Read the Vertex Shader code from the file
        std::string VertexShaderCode;
        std::ifstream VertexShaderStream(vertex_file_path, std::ios::in);
        if(VertexShaderStream.is_open())
        {
            std::string Line = "";
            while(getline(VertexShaderStream, Line))
                VertexShaderCode += "\n" + Line;
            VertexShaderStream.close();
        }

        cerr<<"Vertex shader length: "<<VertexShaderCode.length()<<"\n";
        checkForGLError();

        // Read the Fragment Shader code from the file
        std::string FragmentShaderCode;
        std::ifstream FragmentShaderStream(fragment_file_path, std::ios::in);
        if(FragmentShaderStream.is_open()){
            std::string Line = "";
            while(getline(FragmentShaderStream, Line))
                FragmentShaderCode += "\n" + Line;
            FragmentShaderStream.close();
        }

        cerr<<"Fragment shader length: "<<FragmentShaderCode.length()<<"\n";
        checkForGLError();

        GLint Result = GL_FALSE;
        int InfoLogLength;

        // Compile Vertex Shader
        log("Compiling shader: "+string(vertex_file_path));
        char const * VertexSourcePointer = VertexShaderCode.c_str();
        cerr<<"v1\n";
        glShaderSource(VertexShaderID, 1, &VertexSourcePointer , NULL);
        checkForGLError();
        cerr<<"v2\n";
        glCompileShader(VertexShaderID);
        cerr<<"v3\n";
        checkForGLError();

        // Check Vertex Shader
        glGetShaderiv(VertexShaderID, GL_COMPILE_STATUS, &Result);
        glGetShaderiv(VertexShaderID, GL_INFO_LOG_LENGTH, &InfoLogLength);
        std::vector<char> VertexShaderErrorMessage(InfoLogLength);
        glGetShaderInfoLog(VertexShaderID, InfoLogLength, NULL, &VertexShaderErrorMessage[0]);
        log(&VertexShaderErrorMessage[0]);

        checkForGLError();

        // Compile Fragment Shader
        log("Compiling shader: "+string(fragment_file_path));
        char const * FragmentSourcePointer = FragmentShaderCode.c_str();
        glShaderSource(FragmentShaderID, 1, &FragmentSourcePointer , NULL);
        glCompileShader(FragmentShaderID);

        checkForGLError();

        // Check Fragment Shader
        glGetShaderiv(FragmentShaderID, GL_COMPILE_STATUS, &Result);
        glGetShaderiv(FragmentShaderID, GL_INFO_LOG_LENGTH, &InfoLogLength);
        std::vector<char> FragmentShaderErrorMessage(InfoLogLength);
        glGetShaderInfoLog(FragmentShaderID, InfoLogLength, NULL, &FragmentShaderErrorMessage[0]);
        log(&FragmentShaderErrorMessage[0]);

        checkForGLError();

        // Link the program
        log("Linking program.");
        GLuint ProgramID = glCreateProgram();
        glAttachShader(ProgramID, VertexShaderID);
        glAttachShader(ProgramID, FragmentShaderID);
        glLinkProgram(ProgramID);

        checkForGLError();

        // Check the program
        glGetProgramiv(ProgramID, GL_LINK_STATUS, &Result);
        glGetProgramiv(ProgramID, GL_INFO_LOG_LENGTH, &InfoLogLength);
        std::vector<char> ProgramErrorMessage( max(InfoLogLength, int(1)) );
        glGetProgramInfoLog(ProgramID, InfoLogLength, NULL, &ProgramErrorMessage[0]);
        log(&ProgramErrorMessage[0]);

        checkForGLError();

        glDeleteShader(VertexShaderID);
        glDeleteShader(FragmentShaderID);

        return ProgramID;
    }
    catch(int exc)
    {
        log("Making shader threw exception: "+ istr(exc));
    }
}
Where checkForGLError is defined as:
void checkForGLError()
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
    {
        // Build the message with istr(); "OpenGL error: " + err would only offset the string literal.
        log("OpenGL error: " + istr(err));
        cerr << "OpenGL error: " << err << endl;
    }
}
Basically if an exception is thrown I'll know.
I know the lengths of the strings before the code attempts to compile them.
I know which step in compiling it gets to (v1, v2, v3).
And if any GL errors are raised I'll hopefully see them.
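
(One caveat with the exception handling above, flagged as an aside: catch(int exc) only catches thrown ints, so a std::logic_error like the one in the crash log from post #10 will sail straight past it to std::terminate. A sketch of a catch-all wrapper, reusing the OP's LoadShaders and log; LoadShadersChecked is a made-up name:)

#include <GL/glew.h>
#include <stdexcept>
#include <string>

// Catching const std::exception& also covers std::logic_error, which catch(int) misses.
GLuint LoadShadersChecked(const char* vertex_file_path, const char* fragment_file_path)
{
    try
    {
        return LoadShaders(vertex_file_path, fragment_file_path);
    }
    catch (const std::exception& e)
    {
        log(std::string("Making shader threw exception: ") + e.what());
        return 0;
    }
}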



If at your convenience you or someone else could redownload the program, run it, and post stderr.txt (it'll have more information than logfile.txt), that'd be great.

#12 mhagain   Crossbones+   -  Reputation: 7821


Posted 27 March 2014 - 08:20 AM

Okeydokey, here's the latest stderr:

0	: Program started.
0.00	: 4 quotes read.
0.00	: Could not find prefs file, creating with defaults.
0.31	: r loading model --- b3 out of scope --- 
OpenGL error: 1280
0.38	: OpenAL: No Error
0.38	: Networking initalized.
0.38	: Initalized.
0.38	: Using glew version: 1.7.0
0.38	: AMD Radeon HD 6450
0.38	: ATI Technologies Inc.
0.38	: 4.0.12430 Core Profile Context 13.152.1.8000
0.38	: 4.30
0.38	: Opengl version used: 4.0
0.38	: r loading model --- b3 out of scope --- 
OpenGL error: 1280
Before shaders...
Shaders allocated.
Vertex shader length: 2230
Fragment shader length: 7481
0.38	: Compiling shader: shaders/StandardShadingVertex.glsl
v1
v2
v3
terminate called after throwing an instance of 'std::logic_error'
  what():  basic_string::_S_construct null not valid

InfoLogLength is 0 perhaps?
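
That would match the symptom: with a zero-length info log, std::vector<char> VertexShaderErrorMessage(InfoLogLength) is empty, &VertexShaderErrorMessage[0] is not a valid pointer, and log() ends up constructing a std::string from it. A sketch of that block with just a length guard added (same variable names as the OP's code):

// Check Vertex Shader -- only fetch and log the info log when there is one.
glGetShaderiv(VertexShaderID, GL_COMPILE_STATUS, &Result);
glGetShaderiv(VertexShaderID, GL_INFO_LOG_LENGTH, &InfoLogLength);
if (InfoLogLength > 1)   // reported length includes the null terminator; 0 means no log
{
    std::vector<char> VertexShaderErrorMessage(InfoLogLength);
    glGetShaderInfoLog(VertexShaderID, InfoLogLength, NULL, &VertexShaderErrorMessage[0]);
    log(&VertexShaderErrorMessage[0]);
}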


Edited by mhagain, 27 March 2014 - 08:23 AM.



#13 Syerjchep   Members   -  Reputation: 180


Posted 27 March 2014 - 08:41 AM

Okeydokey, here's the latest stderr:

0	: Program started.
0.00	: 4 quotes read.
0.00	: Could not find prefs file, creating with defaults.
0.31	: r loading model --- b3 out of scope --- 
OpenGL error: 1280
0.38	: OpenAL: No Error
0.38	: Networking initalized.
0.38	: Initalized.
0.38	: Using glew version: 1.7.0
0.38	: AMD Radeon HD 6450
0.38	: ATI Technologies Inc.
0.38	: 4.0.12430 Core Profile Context 13.152.1.8000
0.38	: 4.30
0.38	: Opengl version used: 4.0
0.38	: r loading model --- b3 out of scope --- 
OpenGL error: 1280
Before shaders...
Shaders allocated.
Vertex shader length: 2230
Fragment shader length: 7481
0.38	: Compiling shader: shaders/StandardShadingVertex.glsl
v1
v2
v3
terminate called after throwing an instance of 'std::logic_error'
  what():  basic_string::_S_construct null not valid
InfoLogLength is 0 perhaps?

This is actually extremely helpful.
Well, I uploaded a new version.
This time there's a new setting in settings.txt.
Change LOG_SHADERS to 0, then try launching it.

#14 mhagain   Crossbones+   -  Reputation: 7821


Posted 27 March 2014 - 08:45 AM

OK, first run, default settings:

0	: Program started.
0.00	: 4 quotes read.
0.00	: Could not find prefs file, creating with defaults.
0.30	: del --- c2 out of scope --- 
OpenGL error: 1280
0.33	: OpenAL: No Error
0.33	: Networking initalized.
0.33	: Initalized.
0.33	: Using glew version: 1.7.0
0.33	: AMD Radeon HD 6450
0.33	: ATI Technologies Inc.
0.33	: 4.0.12430 Core Profile Context 13.152.1.8000
0.33	: 4.30
0.33	: Opengl version used: 4.0
0.33	: del --- c2 out of scope --- 
OpenGL error: 1280
Before shaders...
Shaders allocated.
Vertex shader length: 2230
Fragment shader length: 7481
0.33	: Compiling shader: shaders/StandardShadingVertex.glsl
v1
v2
v3
0.37	: Vertex Error Message Size: 0,0
terminate called after throwing an instance of 'std::logic_error'
  what():  basic_string::_S_construct null not valid

Second run, LOG_SHADERS 0 .................................. IT WORKS! :)

0.33	: Vertex Error Message Size: 0,0
0.33	: Compiling shader: shaders/StandardShadingFragment.glsl
0.34	: Linking program.
0.36	: 
8.38	: UDP opened, address resolved.
8.88	: UDP opened, address resolved.
9.38	: UDP opened, address resolved.
16.49	: Window closed!
16.49	: OpenAL: No Error
16.64	: 0 frames ran.
16.64	: 16.6419 current time.
16.64	: 16.643 seconds elapsed.
16.64	: 0 fps average.
0	: Ended program.

Edited by mhagain, 27 March 2014 - 08:47 AM.



#15 Syerjchep   Members   -  Reputation: 180


Posted 27 March 2014 - 09:12 AM

OK, first run, default settings:

0	: Program started.
0.00	: 4 quotes read.
0.00	: Could not find prefs file, creating with defaults.
0.30	: del --- c2 out of scope --- 
OpenGL error: 1280
0.33	: OpenAL: No Error
0.33	: Networking initalized.
0.33	: Initalized.
0.33	: Using glew version: 1.7.0
0.33	: AMD Radeon HD 6450
0.33	: ATI Technologies Inc.
0.33	: 4.0.12430 Core Profile Context 13.152.1.8000
0.33	: 4.30
0.33	: Opengl version used: 4.0
0.33	: del --- c2 out of scope --- 
OpenGL error: 1280
Before shaders...
Shaders allocated.
Vertex shader length: 2230
Fragment shader length: 7481
0.33	: Compiling shader: shaders/StandardShadingVertex.glsl
v1
v2
v3
0.37	: Vertex Error Message Size: 0,0
terminate called after throwing an instance of 'std::logic_error'
  what():  basic_string::_S_construct null not valid
Second run, LOG_SHADERS 0 .................................. IT WORKS! :)
0.33	: Vertex Error Message Size: 0,0
0.33	: Compiling shader: shaders/StandardShadingFragment.glsl
0.34	: Linking program.
0.36	: 
8.38	: UDP opened, address resolved.
8.88	: UDP opened, address resolved.
9.38	: UDP opened, address resolved.
16.49	: Window closed!
16.49	: OpenAL: No Error
16.64	: 0 frames ran.
16.64	: 16.6419 current time.
16.64	: 16.643 seconds elapsed.
16.64	: 0 fps average.
0	: Ended program.

Wow man, thanks a ton!
Can't believe all this time it had nothing to do with OpenGL at all lol

#16 Glass_Knife   Moderators   -  Reputation: 4087


Posted 27 March 2014 - 09:57 AM


Wow man, thanks a ton!
Can't believe all this time it had nothing to do with OpenGL at all lol

 

You know, the hardest lesson I've learned while programming (and am still learning) is to put your ego aside and always assume you did something wrong. I can't tell you how many times I've blamed a library/platform/driver/programmer for some problem, only to find it in my own code later, where I was doing something silly.

 

Programming is hard.  :-)


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming



