[OGL] Textures are not displaying properly


Hi Everyone, I've been tearing my hair out over this issue.

I have been having a persistent issue for the past couple of days: a texture loads in, but it keeps showing up white. I ran the program in gDEBugger and it clearly shows that the texture is loading correctly, so the issue must be on the usage side.

My Object initialization code (for buffers and vertex attribute pointers) is:


	glBindVertexArray(_VAO);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _EBO);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLuint) * _indices.size(), &_indices[0], GL_STATIC_DRAW);

	glBindBuffer(GL_ARRAY_BUFFER, _VBO);
	glBufferData(GL_ARRAY_BUFFER, sizeof(GLVertex) * _vertices.size(), &_vertices[0], GL_STATIC_DRAW);

	glEnableVertexAttribArray(0);
	glEnableVertexAttribArray(1);
	glEnableVertexAttribArray(2);
	glEnableVertexAttribArray(3);

	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(GLVertex), (void*)offsetof(GLVertex, position));
	glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(GLVertex), (void*)offsetof(GLVertex, uv));
	glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, sizeof(GLVertex), (void*)offsetof(GLVertex, normal));
	glVertexAttribPointer(3, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(GLVertex), (void*)offsetof(GLVertex, color));

	glDisableVertexAttribArray(0);
	glDisableVertexAttribArray(1);
	glDisableVertexAttribArray(2);
	glDisableVertexAttribArray(3);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
	glBindVertexArray(0);

My Object rendering code is as follows (it uses an EBO):


	VPloc = shader.getUniformLocation("cameraMatrix");
	glUniformMatrix4fv(VPloc, 1, GL_FALSE, &camera.getMatrix()[0][0]);

	Mloc = shader.getUniformLocation("modelMatrix");
	glUniformMatrix4fv(Mloc, 1, GL_FALSE, &_modelMatrix[0][0]);

	glBindVertexArray(_VAO);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _EBO);
	glBindBuffer(GL_ARRAY_BUFFER, _VBO);
	glActiveTexture(GL_TEXTURE0 + _texture->id);
	glUniform1i(shader.getUniformLocation("sampler"), _texture->id);

	glEnableVertexAttribArray(0);
	glEnableVertexAttribArray(1);
	glEnableVertexAttribArray(2);
	glEnableVertexAttribArray(3);

	glBindTexture(GL_TEXTURE_2D, _texture->id);
	glDrawElements(GL_TRIANGLES, _indices.size(), GL_UNSIGNED_INT, (void*)0);

	glBindTexture(GL_TEXTURE_2D, 0);
	_modelMatrix = glm::mat4(1.0f);

	glDisableVertexAttribArray(0);
	glDisableVertexAttribArray(1);
	glDisableVertexAttribArray(2);
	glDisableVertexAttribArray(3);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
	glBindVertexArray(0);

While the actual render loop looks like this:


	glEnable(GL_MULTISAMPLE);
	glEnable(GL_LINE_SMOOTH);
	glEnable(GL_POLYGON_SMOOTH);
	if (_glWireframe) {
		glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
	}
	else {
		glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
	}

	
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	//Diffuse draw
	_diffuseShader.use();

	//_testSprite->scale(0.5f);
	_testSprite->draw(_diffuseShader, _gameCamera);
	_diffuseShader.unuse();

	_window->SwapBuffers();
	glDisable(GL_MULTISAMPLE);
	glDisable(GL_LINE_SMOOTH);
	glDisable(GL_POLYGON_SMOOTH);

I've been able to resolve the issue in the past, but as soon as I change a few things in my Mesh code, everything breaks. I can't figure out what the issue is.


Silly me, I forgot to post my shader code.

Vertex


#version 330 core

in vec3 vertexPosition;
in vec2 vertexUV;

out vec2 fragmentUV;

uniform mat4 modelMatrix;
uniform mat4 cameraMatrix;

void main() {
	gl_Position = cameraMatrix * modelMatrix * vec4(vertexPosition, 1.0f);
	fragmentUV = vertexUV;
}

Fragment


#version 330 core

in vec2 fragmentUV;
out vec4 color;

uniform sampler2D sampler;

void main() {
	color = texture(sampler, fragmentUV);
}

It's not working because the attrib arrays are disabled in your VAO. Those glDisableVertexAttribArray calls are not needed when using VAOs and should be removed.
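To make that concrete, here's a minimal sketch of what the one-time setup can look like without the disables, reusing the same _VAO/_VBO/_EBO, GLVertex, _vertices and _indices names from the code above:


	glBindVertexArray(_VAO);

	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _EBO);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLuint) * _indices.size(), _indices.data(), GL_STATIC_DRAW);

	glBindBuffer(GL_ARRAY_BUFFER, _VBO);
	glBufferData(GL_ARRAY_BUFFER, sizeof(GLVertex) * _vertices.size(), _vertices.data(), GL_STATIC_DRAW);

	//The VAO records the enables, the attribute formats and the EBO binding
	glEnableVertexAttribArray(0);
	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(GLVertex), (void*)offsetof(GLVertex, position));
	//...attributes 1 through 3 exactly as before...

	//Unbind the VAO first, and never call glDisableVertexAttribArray here,
	//otherwise the VAO remembers the attributes as disabled
	glBindVertexArray(0);
	glBindBuffer(GL_ARRAY_BUFFER, 0);

After that, the draw only needs:


	glBindVertexArray(_VAO);
	glDrawElements(GL_TRIANGLES, _indices.size(), GL_UNSIGNED_INT, (void*)0);
	glBindVertexArray(0);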


I don't see anything wrong with the vertex attributes, but yeah, it's usually unnecessary to disable them. You can just set up everything in your VAO once; its buffer and attribute state will be remembered the next time you bind the VAO.

One problem could be your glUniform call: you're passing it the wrong value, since the texture id shouldn't be used there. That may not be the actual problem if the texture id happens to be a sane number for the glActiveTexture call, but I'd suggest fixing it regardless.


GLuint texture_id = ...;
int index = 0;
glActiveTexture(GL_TEXTURE0 + index);
glUniform1i(shader.getUniformLocation("sampler"), index);
glBindTexture(GL_TEXTURE_2D, texture_id);
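Applied to the render code above (a sketch using the same member names, and assuming texture unit 0 is free for this draw), that would look like:


	int unit = 0;                                             //texture unit index, NOT the texture object id
	glActiveTexture(GL_TEXTURE0 + unit);                      //select unit 0
	glBindTexture(GL_TEXTURE_2D, _texture->id);               //attach the texture object to unit 0
	glUniform1i(shader.getUniformLocation("sampler"), unit);  //tell the sampler to read from unit 0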



Afaik, _texture->id should be passing a valid GLuint texture id location, but I'll give this a shot and see what happens

Edit: Nope, still doesn't work, and I got rid of the vertex attrib array disables (leftovers from testing plain VBO rendering).

And you've made sure the shaders compile fine? Does glGetUniformLocation return a valid location? Did you bind the attribute locations correctly? Btw, AMD's CodeXL is the new gDEBugger.
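For reference, roughly what those checks can look like; vertexShader and program here stand in for whatever handles your shader class keeps internally:


	GLint status = GL_FALSE;
	glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &status);
	if (status != GL_TRUE) {
		char log[1024];
		glGetShaderInfoLog(vertexShader, sizeof(log), nullptr, log);
		printf("[SHADER] compile error: %s\n", log);
	}

	//Attribute locations must be bound BEFORE linking,
	//or use layout(location = N) qualifiers in the GLSL instead
	glBindAttribLocation(program, 0, "vertexPosition");
	glBindAttribLocation(program, 1, "vertexUV");
	glLinkProgram(program);

	glGetProgramiv(program, GL_LINK_STATUS, &status);
	if (status != GL_TRUE) {
		char log[1024];
		glGetProgramInfoLog(program, sizeof(log), nullptr, log);
		printf("[SHADER] link error: %s\n", log);
	}

	//-1 means the name is misspelled or the uniform was optimized out
	GLint samplerLoc = glGetUniformLocation(program, "sampler");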


The application would've crashed if the shaders had failed to compile. As far as I know, the attribute locations should be valid, since attempting to use an unused texture index causes the mesh to turn black.

Also, thanks for letting me know about CodeXL, I'm going to give it a go now.

Afaik, _texture->id should be passing a valid GLuint texture id location, but I'll give this a shot and see what happens

If so, then this is wrong, because glBindTexture expects the texture object id itself, not a location:


glBindTexture(GL_TEXTURE_2D, _texture->id);


My test texture passes a value of 1. I'll take a look at my texture creation code and see if something's amiss.

Ok, here's my texture loading code, but I doubt there's much wrong with it, as the texture loads just fine according to both CodeXL and gDEBugger:


	FREE_IMAGE_FORMAT fif = FIF_DDS;
	FIBITMAP* dib(0);
	BYTE* bits(0);
	GLTexture* texture = new GLTexture();

	dib = FreeImage_Load(fif, path.c_str());
	if (!dib) {
		return texture;
	}

	bits = FreeImage_GetBits(dib);
	texture->width = FreeImage_GetWidth(dib);
	texture->height = FreeImage_GetHeight(dib);
	
	glGenTextures(1, &texture->id);
	glBindTexture(GL_TEXTURE_2D, texture->id); //bind texture

	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->width, texture->height, 0, GL_BGRA, GL_UNSIGNED_BYTE, bits);

	if (_filterMode == "linear") {
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
	}
	if (_filterMode == "nearest") {
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
	}
	glGenerateMipmap(GL_TEXTURE_2D);

	glBindTexture(GL_TEXTURE_2D, 0);

	FreeImage_Unload(dib);
	printf("[TEXTURE] Loaded %s\n", path.c_str());
	return texture;
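One more thing worth ruling out, though this is an assumption on my part rather than something CodeXL confirmed: FreeImage doesn't guarantee the loaded bitmap is 32 bits per pixel, while the glTexImage2D call above assumes tightly packed BGRA data. Converting before the upload makes that assumption explicit:


	//FreeImage_ConvertTo32Bits always yields 8-bit BGRA; it returns a new
	//bitmap, so the original must be unloaded separately
	FIBITMAP* dib32 = FreeImage_ConvertTo32Bits(dib);
	FreeImage_Unload(dib);
	dib = dib32;

	bits = FreeImage_GetBits(dib);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->width, texture->height, 0, GL_BGRA, GL_UNSIGNED_BYTE, bits);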

This topic is closed to new replies.
