Issue when passing normals to shaders

Started by
11 comments, last by Jack Shannon 11 years, 4 months ago
It seems it doesn't matter whether my normals are being passed at the moment :( I am unable to pass any 'varying' variable from the vertex shader to the fragment shader. I have simplified the problem as much as possible, down to a single triangle with no normals:

main.h
[source lang="cpp"]#ifndef MAIN_H
#define MAIN_H

#include <iostream>
#include <GL/glew.h>
#include <GL/glfw.h>
#include <stdlib.h>
#include <cml/cml.h>
#include "Shader.h"

typedef cml::vector3f vec3f;

#endif[/source]

main.cpp
[source lang="cpp"]#include "main.h"

Shader* myShader;

void initGL()
{
    glClearColor(0, 0, 0, 1);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(50, 640.0 / 480.0, 1, 1000);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_DEPTH_TEST);
    myShader = new Shader("shader.vert", "shader.frag");
}

void display()
{
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glUseProgram(myShader->program);
    glBegin(GL_TRIANGLES);
        glVertex3f(0, 1, -4);
        glVertex3f(-1, -1, -4);
        glVertex3f(1, -1, -4);
    glEnd();
}

int main()
{
    int running = GL_TRUE;

    if (!glfwInit()) {
        exit(EXIT_FAILURE);
    }

    if (!glfwOpenWindow(640, 480, 0, 0, 0, 0, 0, 0, GLFW_WINDOW)) {
        glfwTerminate();
        exit(EXIT_FAILURE);
    }

    GLenum err = glewInit();
    if (GLEW_OK != err) {
        std::cout << "Error: " << glewGetErrorString(err) << std::endl;
    }

    initGL();

    while (running) {
        display();
        glfwSwapBuffers();

        running = !glfwGetKey(GLFW_KEY_ESC) &&
                  glfwGetWindowParam(GLFW_OPENED);
    }
    glfwTerminate();
    exit(EXIT_SUCCESS);
}[/source]
Shader.h
[source lang="cpp"]#ifndef SHADER_H
#define SHADER_H

#include "main.h"
#include <vector>
#include <string>
#include <fstream>

class Shader
{
    void loadFile(const char* fileName, std::string& str);
    GLuint load(std::string& shaderSource, GLuint shaderType);

public:
    GLuint vs;
    GLuint fs;
    GLuint program;

    Shader(const char* vs, const char* fs);
    ~Shader();
};

#endif[/source]
Shader.cpp
[source lang="cpp"]#include "Shader.h"

void Shader::loadFile(const char* fileName, std::string& str)
{
    std::ifstream in(fileName);
    if (!in.is_open()) {
        std::cout << "File " << fileName << " cannot be opened\n";
        return;
    }
    char line[300];
    while (in.getline(line, 300)) {
        str += line;
        str += '\n';
    }
}

GLuint Shader::load(std::string& shaderSource, GLuint shaderType)
{
    GLuint id = glCreateShader(shaderType);
    const char* csource = shaderSource.c_str();
    glShaderSource(id, 1, &csource, NULL);
    glCompileShader(id);
    char error[1000];
    glGetShaderInfoLog(id, 1000, NULL, error);
    std::cout << "Compile status: \n" << error << std::endl;
    return id;
}

Shader::Shader(const char* vn, const char* fn)
{
    std::string source;

    loadFile(vn, source);
    vs = load(source, GL_VERTEX_SHADER);

    source = "";

    loadFile(fn, source);
    vs = load(source, GL_FRAGMENT_SHADER);

    program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);
    glUseProgram(program);
}

Shader::~Shader()
{
    glDetachShader(program, vs);
    glDetachShader(program, fs);
    glDeleteShader(vs);
    glDeleteShader(fs);
    glDeleteProgram(program);
}[/source]
shader.vert
[source lang="cpp"]#version 120
varying vec4 color;

void main()
{
    color = vec4(0.0, 1.0, 0.0, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}[/source]
shader.frag
[source lang="cpp"]#version 120
varying vec4 color;

void main()
{
    //gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // Triangle is red

    gl_FragColor = vec4(color); // Triangle is white. ('color' not being passed from vertex shader???)
}[/source]
The commented code in shader.frag shows what is happening: the constant red works, but the varying renders white. Why isn't the varying being passed?
Thank you, and sorry for asking the wrong question before.
You're using the GLFW library; what GL context does it create, a compatibility context or a forward-compatible one? If it's a forward-compatible context, a GLSL 1.20 shader might not work, although if that were the case I think you would be getting more errors than just the varying problem, so I may be wrong. From the GLFW site FAQ:

2.16 - Can any of the parameters to glfwOpenWindow be zero?
Yes. In fact, all parameters except the window mode can be zero, i.e. this is perfectly legal:
glfwOpenWindow(0, 0, 0, 0, 0, 0, 0, 0, GLFW_WINDOW);
Any parameter that is zero gets its desired value chosen by GLFW. Then, all parameters except the window mode are matched as closely as possible to what is available on the system. However, only the following parameters and hints are required to match exactly:
- The window mode (i.e. the last parameter to glfwOpenWindow)
- The GLFW_STEREO hint
- The GLFW_OPENGL_PROFILE hint, if set to a non-zero value
- The GLFW_OPENGL_FORWARD_COMPAT hint
To find out the actual properties of the window and OpenGL context, use the glfwGetWindowParam function after the window has been opened.
To see what you get on your machine using only default values, you can use the defaults test in the GLFW source distribution.
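Following the FAQ's suggestion, a minimal sketch of querying the context you actually got (this uses the GLFW 2.x glfwGetWindowParam API, matching the code above; parameter names are from GLFW 2.7, so adjust if your version differs):

[source lang="cpp"]#include <GL/glfw.h>
#include <cstdio>

// Call after glfwOpenWindow() has succeeded.
void printContextInfo()
{
    // These window parameters report the context GLFW actually created,
    // which may differ from what the zero/default hints requested.
    int major = glfwGetWindowParam(GLFW_OPENGL_VERSION_MAJOR);
    int minor = glfwGetWindowParam(GLFW_OPENGL_VERSION_MINOR);
    int fwd   = glfwGetWindowParam(GLFW_OPENGL_FORWARD_COMPAT);
    int prof  = glfwGetWindowParam(GLFW_OPENGL_PROFILE);

    std::printf("GL %d.%d, forward-compatible: %s, profile: %s\n",
                major, minor,
                fwd ? "yes" : "no",
                prof == GLFW_OPENGL_CORE_PROFILE   ? "core" :
                prof == GLFW_OPENGL_COMPAT_PROFILE ? "compatibility" : "default");
}[/source]

If this reports a forward-compatible or core context, the fixed-function built-ins (gl_ModelViewProjectionMatrix, gl_Vertex) used in shader.vert would not be available.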
NumberXaero, thank you so much for looking into all this for me.

I've now taken the time to completely rewrite and relearn how to implement everything, and I'm using attributes instead. Most importantly, I've learnt how to properly debug. I'm up and running; apologies for wasting your time!
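[Editor's note: for readers landing on this thread with the same symptom, one likely culprit in the original Shader.cpp is that the constructor assigns the compiled fragment shader to vs instead of fs, so fs is never initialised and an invalid handle gets attached. This is just a reading of the code posted above, not a confirmed fix; a sketch of the corrected constructor, with a link-status check added, since glGetShaderInfoLog alone misses link errors:]

[source lang="cpp"]Shader::Shader(const char* vn, const char* fn)
{
    std::string source;

    loadFile(vn, source);
    vs = load(source, GL_VERTEX_SHADER);

    source.clear();

    loadFile(fn, source);
    fs = load(source, GL_FRAGMENT_SHADER);   // was: vs = load(...), leaving fs uninitialised

    program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    // Also check the link status; a program that fails to link silently
    // falls back to fixed-function output (e.g. a white triangle).
    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked != GL_TRUE) {
        char log[1000];
        glGetProgramInfoLog(program, sizeof(log), NULL, log);
        std::cout << "Link error:\n" << log << std::endl;
    }

    glUseProgram(program);
}[/source]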
