OpenGL Can't draw a triangle? No error either.

Hi, I've recently started learning OpenGL and I've run into a problem. I'm following this tutorial: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/

My problem is that my code doesn't draw the triangle! I've slightly altered the tutorial code, but the whole thing SHOULD be working. My code is:


#include <iostream>
#include <GL/glew.h>
#include <GL/glfw.h>

int main() {
    // Initialize GLFW
    if (!glfwInit()) {
        std::cout << "Failed to initialize GLFW" << std::endl;
        return 1;
    }

    if (!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 0, 0, GLFW_WINDOW)) {
        std::cout << "Failed to create window" << std::endl;
        glfwTerminate();
        return 1;
    }

    // Adds support for experimental or pre-release graphics drivers
    glewExperimental = GL_TRUE;

    // Initialize GLEW
    if (glewInit() != GLEW_OK) {
        std::cout << "Failed to initialize GLEW" << std::endl;
        return 1;
    }

    glfwSetWindowTitle("OpenGL Hacks");

    GLuint vertexArrayID;
    glGenVertexArrays(1, &vertexArrayID);
    glBindVertexArray(vertexArrayID);

    static const GLfloat vertexBufferData[] = {
        -1.0f, -1.0f, 0.0f,
        1.0f, -1.0f, 0.0f,
        0.0f, 1.0f, 0.0f
    };

    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexBufferData), vertexBufferData, GL_STATIC_DRAW);

    glfwEnable(GLFW_STICKY_KEYS);
    while (glfwGetWindowParam(GLFW_OPENED)) {
        if (glfwGetKey(GLFW_KEY_ESC)) {
            break;
        }

        glClearColor(0.0f, 1.0f, 1.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
        
        glDrawArrays(GL_TRIANGLES, 0, 3);
        
        glDisableVertexAttribArray(0);

        glfwSwapBuffers();
    }

    glfwTerminate();
    return 0;
}

I'm using GLFW 2.7.7 and GLEW 1.9.0, and my OS is Arch Linux. However, I'm running Arch Linux in VirtualBox, and obviously you can't install real graphics drivers in a virtual machine, so maybe that's the problem? I don't know, but I tried compiling on Windows 7 using MinGW and it still doesn't draw the triangle, so I don't think that's the problem.

Also, in the first tutorial, Opening a window, I didn't use the code:


glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 3);
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

This is because otherwise my program would crash (probably due to there being no real graphics driver in the virtualized OS).

Any help would be appreciated. Thanks.

EDIT: Also, when compiling the code on Windows 7, I see the window open, but no triangle is drawn and the program crashes instantly after opening. Yes, I've compiled GLEW and GLFW from source. Possibly something in the code is making my program crash on Windows; the program doesn't crash on Linux, however.

Try calling glGetString with GL_VERSION (and print it) to find out what versions of OpenGL context are being created on each platform.

If you don't get at least a certain version (I don't remember which off-hand), some of those function pointers will be zero, hence a crash.
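
For illustration, here is a minimal sketch of that check, assuming glewInit() has already succeeded on the current context as in the code above (the reportContext helper name is just illustrative):

#include <iostream>
#include <GL/glew.h>

void reportContext() {
    // glGetString returns NULL if no context is current.
    const GLubyte *version = glGetString(GL_VERSION);
    std::cout << "GL_VERSION: "
              << (version ? reinterpret_cast<const char*>(version) : "none")
              << std::endl;

    // glGenVertexArrays needs GL 3.0 or ARB_vertex_array_object; on an
    // older context that function pointer may simply be zero.
    if (!GLEW_VERSION_3_0 && !GLEW_ARB_vertex_array_object) {
        std::cout << "Vertex array objects are not available on this context" << std::endl;
    }
}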

Also, everything in the loop except the clear and the draw should be moved above the loop (and the glDisableVertexAttribArray call removed).

Lastly, try moving on and using shaders; depending on what version of OpenGL context you're actually getting, it may not show anything without a proper shader program.

When I do


std::cout << glGetString(GL_VERSION) << std::endl;

I get 4.2.0 on Windows, and on my actual development OS, Arch Linux, I get 2.1 Mesa 9.0.1. My code now looks like this:


#include <iostream>
#include <GL/glew.h>
#include <GL/glfw.h>

int main() {
    // Initialize GLFW
    if (!glfwInit()) {
        std::cout << "Failed to initialize GLFW" << std::endl;
        return 1;
    }

    /*glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);*/

    if (!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 0, 0, GLFW_WINDOW)) {
        std::cout << "Failed to create window" << std::endl;
        glfwTerminate();
        return 1;
    }

    // Adds support for experimental or pre-release graphics drivers
    glewExperimental = GL_TRUE;

    // Initialize GLEW
    if (glewInit() != GLEW_OK) {
        std::cout << "Failed to initialize GLEW" << std::endl;
        return 1;
    }

    glfwSetWindowTitle("OpenGL Hacks");

    std::cout << glGetString(GL_VERSION) << std::endl;

    GLuint vertexArrayID;
    glGenVertexArrays(1, &vertexArrayID);
    glBindVertexArray(vertexArrayID);

    static const GLfloat vertexBufferData[] = {
        -1.0f, -1.0f, 0.0f,
        1.0f, -1.0f, 0.0f,
        0.0f, 1.0f, 0.0f
    };

    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexBufferData), vertexBufferData, GL_STATIC_DRAW);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);

    glfwEnable(GLFW_STICKY_KEYS);
    while (glfwGetWindowParam(GLFW_OPENED)) {
        if (glfwGetKey(GLFW_KEY_ESC)) {
            break;
        }

        glClearColor(0.0f, 1.0f, 1.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glDrawArrays(GL_TRIANGLES, 0, 3);
        
        //glDisableVertexAttribArray(0);

        glfwSwapBuffers();
    }

    glfwTerminate();
    return 0;
}

Fortunately, the triangle is now drawn, but the program closes instantly! This is on Windows. On my Linux OS, however, the program doesn't draw the triangle, but it doesn't close instantly either (it stays open until you press ESC), as it should.

Thanks for the help.

That most recent code compiles and runs perfectly for me (Ubuntu with AMD drivers, GL 4.2).

Maybe the escape key check should go after the buffer swap call (the documentation says that call polls events [the keyboard]); it might be system dependent?

Otherwise, it's probably a GLFW bug. It might be worth trying freeglut (or something else, like the native Win32 API [not as hard/annoying as X11]) to make sure it's not a GL problem (unlikely at this point).

Weird, the problem seems to be that building my project with cmake and make somehow causes my program to close instantly on Windows... If I just run "g++ main.cpp -lglew32 -lglfw -lopengl32" the program runs just fine! This is my CMakeLists.txt:


cmake_minimum_required(VERSION 2.8)
project(test)

add_definitions(-DFLAGS)
set(CMAKE_CXX_FLAGS ${FLAGS})

find_package(OpenGL REQUIRED)

set(CMAKE_MODULE_PATH cmake_scripts/)
find_package(GLFW REQUIRED)
include_directories(${GLFW_INCLUDE_DIR})

find_package(GLEW REQUIRED)
include_directories(${GLEW_INCLUDE_PATH})

add_executable(test main.cpp)
target_link_libraries(test ${GLEW_LIBRARY} ${GLFW_LIBRARY} ${OPENGL_LIBRARY})

Now, I got my FindGLEW.cmake from here http://code.google.com/p/assembly3d/source/browse/tools/viewer/cmake_scripts/FindGLEW.cmake and my FindGLFW.cmake from here http://code.google.com/p/assembly3d/source/browse/tools/viewer/cmake_scripts/FindGLFW.cmake

I don't know if there's a fault with them (since I'm new to using cmake), but now I know that using them somehow causes my program to close immediately (note that it doesn't crash, it just closes immediately). The program draws the triangle and everything just fine. I'm using version 4.7.2 of MinGW. This is all on Windows 7, not Linux.

On Linux, the program runs fine, but it still doesn't draw the triangle that I see when running on Windows. I don't understand; is it because GL_VERSION reports back 2.1 Mesa 9.0.1 rather than 4.2.0 like on Windows (probably because of using VirtualBox and not being able to install a real graphics driver)? Even using the command "g++ main.cpp -lglfw -lGL -lGLEW" doesn't fix the problem of the triangle not drawing.
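
Since the thread title mentions getting no error, it may also be worth polling glGetError explicitly around the draw call, which would show whether the 2.1 Mesa context is silently rejecting any of these calls. A small debugging sketch (not code from the original posts; the checkGL helper name is just an illustration):

#include <cstdio>
#include <GL/glew.h>

// Drain and print any pending GL errors; glGetError returns GL_NO_ERROR
// once the error queue is empty.
static void checkGL(const char *where) {
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError()) {
        std::fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
    }
}

// Usage inside the render loop above:
//     glDrawArrays(GL_TRIANGLES, 0, 3);
//     checkGL("glDrawArrays");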

Basically now, my questions are:

1. How come building with cmake/make on Windows causes my program to instantly close, and how can I fix it? The workaround is to just use g++ manually, but I really want to use cmake/make.

2. Why doesn't the triangle draw on my virtualized Linux OS when it does on my main OS, Windows 7, and how can I fix it?

EDIT: Just in case it matters: on my Linux OS I'm using a tiling window manager, i3-wm, which you can find here: http://i3wm.org/ Using this tiling window manager has caused me some problems when developing a game before. When I used SFML 2.0, I would run my game from the terminal and then move it into floating mode instead of tiled mode so that it would work as a normal window instead of being tiled together with other windows. This would cause my SFML game to rescale its images and somehow stretch into the unused space. You can see it in the screenshot below:

[Screenshot: the rescaled SFML pong game]

and this is what happens when I move the terminal around over the pong game:

[Screenshot: the pong game after moving the terminal around]

Now, this is just an idea; I don't know if it's actually the reason the triangle isn't rendering on my Linux machine, but I'm just throwing it out there. (The art is from Game From Scratch by Serapth, I think; I was too lazy to make my own art :P)

So I recently decided to go ahead and try using shaders on Linux. Unfortunately, it still doesn't draw, and when I look in the terminal I get these errors:


Compiling shader: SimpleVertexShader.vertexshader
0:2(14): preprocessor error: syntax error, unexpected IDENTIFIER, expecting NEWLINE

Compiling shader: SimpleFragmentShader.fragmentshader
0:2(14): preprocessor error: syntax error, unexpected IDENTIFIER, expecting NEWLINE

Linking program
error: linking with uncompiled shadererror: linking with uncompiled shader

This is the SimpleVertexShader.vertexshader file


#version 420 core

layout(location = 0) in vec3 vertexPosition_modelspace;

void main() {
    gl_Position.xyz = vertexPosition_modelspace;
    gl_Position.w = 1.0;
}

and this is the SimpleFragmentShader.fragmentshader


#version 420 core

out vec3 color;

void main() {
    color = vec3(1,0,0);
}

And this is the LoadShaders.cpp


#include <vector>
#include <string>
#include <fstream>
#include <algorithm>
#include <cstdio>
#include <GL/glew.h>

GLuint loadShaders(const char *vertexFilePath, const char *fragmentFilePath) {
    // Create the shaders
    GLuint vertexShaderID = glCreateShader(GL_VERTEX_SHADER);
    GLuint fragmentShaderID = glCreateShader(GL_FRAGMENT_SHADER);

    // Read the Vertex Shader code from the file
    std::string vertexShaderCode;
    std::ifstream vertexShaderStream(vertexFilePath, std::ios::in);
    if (vertexShaderStream.is_open()) {
        std::string line = "";
        while(getline(vertexShaderStream, line))
            vertexShaderCode += "\n" + line;
        vertexShaderStream.close();
    }

    // Read the Fragment Shader code from the file
    std::string fragmentShaderCode;
    std::ifstream fragmentShaderStream(fragmentFilePath, std::ios::in);
    if (fragmentShaderStream.is_open()) {
        std::string line = "";
        while (getline(fragmentShaderStream, line))
            fragmentShaderCode += "\n" + line;
        fragmentShaderStream.close();
    }

    GLint result = GL_FALSE;
    int infoLogLength;

    // Compile Vertex Shader
    printf("Compiling shader: %s\n", vertexFilePath);
    char const *vertexSourcePointer = vertexShaderCode.c_str();
    glShaderSource(vertexShaderID, 1, &vertexSourcePointer, NULL);
    glCompileShader(vertexShaderID);

    // Check Vertex Shader
    glGetShaderiv(vertexShaderID, GL_COMPILE_STATUS, &result);
    glGetShaderiv(vertexShaderID, GL_INFO_LOG_LENGTH, &infoLogLength);
    std::vector<char> vertexShaderErrorMessage(infoLogLength);
    glGetShaderInfoLog(vertexShaderID, infoLogLength, NULL, &vertexShaderErrorMessage[0]);
    printf("%s\n", &vertexShaderErrorMessage[0]);

    // Compile Fragment Shader
    printf("Compiling shader: %s\n", fragmentFilePath);
    char const *fragmentSourcePointer = fragmentShaderCode.c_str();
    glShaderSource(fragmentShaderID, 1, &fragmentSourcePointer, NULL);
    glCompileShader(fragmentShaderID);

    // Check Fragment Shader
    glGetShaderiv(fragmentShaderID, GL_COMPILE_STATUS, &result);
    glGetShaderiv(fragmentShaderID, GL_INFO_LOG_LENGTH, &infoLogLength);
    std::vector<char> fragmentShaderErrorMessage(infoLogLength);
    glGetShaderInfoLog(fragmentShaderID, infoLogLength, NULL, &fragmentShaderErrorMessage[0]);
    printf("%s\n", &fragmentShaderErrorMessage[0]);

    // Link the program
    printf("Linking program\n");
    GLuint programID = glCreateProgram();
    glAttachShader(programID, vertexShaderID);
    glAttachShader(programID, fragmentShaderID);
    glLinkProgram(programID);

    // Check the program
    glGetProgramiv(programID, GL_LINK_STATUS, &result);
    glGetProgramiv(programID, GL_INFO_LOG_LENGTH, &infoLogLength);
    std::vector<char> programErrorMessage(infoLogLength + 1);
    glGetProgramInfoLog(programID, infoLogLength, NULL, &programErrorMessage[0]);
    printf("%s\n", &programErrorMessage[0]);

    glDeleteShader(vertexShaderID);
    glDeleteShader(fragmentShaderID);

    return programID;
}

Lastly, this is the main.cpp:


#include <iostream>
#include <GL/glew.h>
#include <GL/glfw.h>
#include "shaders/LoadShader.cpp"

int main() {
    // Initialize GLFW
    if (!glfwInit()) {
        std::cout << "Failed to initialize GLFW" << std::endl;
        return 1;
    }

    /*glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);*/

    if (!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 0, 0, GLFW_WINDOW)) {
        std::cout << "Failed to create window" << std::endl;
        glfwTerminate();
        return 1;
    }

    // Adds support for experimental or pre-release graphics drivers
    glewExperimental = GL_TRUE;

    // Initialize GLEW
    if (glewInit() != GLEW_OK) {
        std::cout << "Failed to initialize GLEW" << std::endl;
        return 1;
    }

    glfwSetWindowTitle("OpenGL Hacks");

    GLuint vertexArrayID;
    glGenVertexArrays(1, &vertexArrayID);
    glBindVertexArray(vertexArrayID);

    static const GLfloat vertexBufferData[] = {
        -1.0f, -1.0f, 0.0f,
        1.0f, -1.0f, 0.0f,
        0.0f, 1.0f, 0.0f
    };

    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexBufferData), vertexBufferData, GL_STATIC_DRAW);

    GLuint programID = loadShaders("SimpleVertexShader.vertexshader", "SimpleFragmentShader.fragmentshader");

    glfwEnable(GLFW_STICKY_KEYS);

    while (glfwGetWindowParam(GLFW_OPENED)) {
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glUseProgram(programID);

        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

        glDrawArrays(GL_TRIANGLES, 0, 3);

        glDisableVertexAttribArray(0);
        glUseProgram(0);

        glfwSwapBuffers();

        if (glfwGetKey(GLFW_KEY_ESC)) {
            break;
        }
    }

    glfwTerminate();
    return 0;
}

How do I fix this problem? There's no problem on Windows, but there is on the Linux side...

My other questions still stand:

1. How come building with cmake/make on Windows causes my program to instantly close, and how can I fix it? The workaround is to just use g++ manually, but I really want to use cmake/make.

2. Why doesn't the triangle draw on my virtualized Linux OS when it does on my main OS, Windows 7, and how can I fix it?

3. And now this question, how do I fix the "0:2(14): preprocessor error: syntax error, unexpected IDENTIFIER, expecting NEWLINE" problem?

Thanks.

1) Referring to the post http://www.gamedev.net/topic/638573-opengl-cant-draw-a-triangle-no-error-either/?view=findpost&p=5030781: you aren't using shaders, so you need to use the old-style vertex specification functions glEnableClientState and glVertexPointer:




	static const GLfloat vertexBufferData[] =
	{ -1.0f, -1.0f, 0.0f, 1.0f, -1.0f, 0.0f, 0.0f, 1.0f, 0.0f };

	GLuint vertexBuffer;
	glGenBuffers(1, &vertexBuffer);
	glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
	glBufferData(GL_ARRAY_BUFFER, sizeof(vertexBufferData), vertexBufferData, GL_STATIC_DRAW);

//	glEnableVertexAttribArray(0);
	glEnableClientState(GL_VERTEX_ARRAY);
	glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
//	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*) 0);
	glVertexPointer(3, GL_FLOAT, 12, 0);

	glfwEnable(GLFW_STICKY_KEYS);
	while (glfwGetWindowParam(GLFW_OPENED))
	{
		if (glfwGetKey(GLFW_KEY_ESC))
		{
			break;
		}

		glClearColor(0.0f, 1.0f, 1.0f, 1.0f);
		glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

		glDrawArrays(GL_TRIANGLES, 0, 3);

2) To your last post: Mesa doesn't support GLSL 4.20, which you're using. Try GLSL 1.20 (or 1.30, but I'm not sure whether that is implemented in your version of Mesa).

It works! Using glEnableClientState and glVertexPointer draws a triangle on both OSes. But how come glEnableVertexAttribArray and glVertexAttribPointer work on Windows but not on Linux? The tutorial even uses that code without shaders and still draws the triangle.

Also, I tried changing the GLSL version to 1.20 (even tried 1.30) but it still gives me that error. What I did was put "#version 120 core" instead of "#version 420 core" in both the fragment and vertex shader files. How can I fix this? (Am I even changing the GLSL version by doing this? Sorry, new to this.)

Also, wouldn't it be preferable to use glEnableVertexAttribArray and glVertexAttribPointer? And, going a bit off-topic, is the tutorial teaching the fixed-function pipeline or the programmable one?

EDIT: I changed it to "#version 120" instead of "#version 120 core" and it now produces new errors.



Compiling shader: SimpleVertexShader.vertexshader
0:4(1): error: syntax error, unexpected IDENTIFIER

Compiling shader: SimpleFragmentShader.fragmentshader
0:4(15): error: `out' qualifier in declaration of `
Linking program
error: linking with uncompiled shadererror: linking with uncompiled shader

I think the SimpleVertexShader error is referring to the "layout" code and the SimpleFragmentShader error is referring to the "out" code.

You're right. in/out variables have been available since GLSL 1.30; in earlier versions you have to use the "attribute" keyword (for vertex shader inputs), the "varying" keyword (for variables passed from the vertex shader to the fragment shader), and gl_FragColor (or gl_FragData[]) for the fragment shader's output.

"layout" is avialable since even newer version (I don'r remember which exactly). You have to specify indices from C++ using glBindAttribLocation.

Could you somehow show an example? I'm new to this.

Also, what are the differences between glEnableVertexAttribArray and glEnableClientState, and glVertexAttribPointer and glVertexPointer?
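
For reference, here is a minimal sketch of the GLSL 1.20 rewrite described in the reply above, reusing the attribute name from the earlier shaders (the shader strings and variable names are illustrative, not files from the thread):

#include <GL/glew.h>

// GLSL 1.20 versions of the two shaders: "attribute" replaces the
// "layout(location = 0) in" input, and gl_FragColor replaces the
// "out vec3 color" output.
const char *vertexSrc120 =
    "#version 120\n"
    "attribute vec3 vertexPosition_modelspace;\n"
    "void main() {\n"
    "    gl_Position = vec4(vertexPosition_modelspace, 1.0);\n"
    "}\n";

const char *fragmentSrc120 =
    "#version 120\n"
    "void main() {\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";

// Without "layout(location = ...)", the attribute index used by
// glVertexAttribPointer(0, ...) has to be bound before linking:
//     glBindAttribLocation(programID, 0, "vertexPosition_modelspace");
//     glLinkProgram(programID);

With the location bound this way, the existing glEnableVertexAttribArray(0) / glVertexAttribPointer(0, ...) calls should keep working on a GLSL 1.20-only context.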

This topic is closed to new replies.
