Drawing Text using glTexImage2D + FreeType : GL_INVALID_ENUM error

3 comments, last by tmason 9 years, 5 months ago

Hello,

So, fresh off of my last post I am running into an issue with using code that normally works. Essentially I am working off of this example online: http://en.wikibooks.org/wiki/OpenGL_Programming/Modern_OpenGL_Tutorial_Text_Rendering_01

And here is the code snippet that is giving me the GL_INVALID_ENUM error. The actual error occurs on the glTexImage2D() call.


glDisable(GL_CULL_FACE);
glUseProgram(TextProgramShaderID);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, tex);
glUniform1i(TextureUniform, 0);
glBindVertexArray(TextureVertexArrayObjectID);
const char *p;
glBindBuffer(GL_ARRAY_BUFFER, TextureVertexBufferObjectID);
for (p = text; *p; p++) {
	if (FT_Load_Char(face, *p, FT_LOAD_RENDER))
		continue;
	glTexImage2D(
		GL_TEXTURE_2D,
		0,
		GL_ALPHA,
		gFont->bitmap.width,
		gFont->bitmap.rows,
		0,
		GL_ALPHA,
		GL_UNSIGNED_BYTE,
		gFont->bitmap.buffer
		);
	float x2 = x + gFont->bitmap_left * sx;
	float y2 = -y - gFont->bitmap_top * sy;
	float w = gFont->bitmap.width * sx;
	float h = gFont->bitmap.rows * sy;
	GLfloat box[4][4] = {
		{ x2, -y2, 0, 0 },
		{ x2 + w, -y2, 1, 0 },
		{ x2, -y2 - h, 0, 1 },
		{ x2 + w, -y2 - h, 1, 1 },
	};
	glBufferData(GL_ARRAY_BUFFER, sizeof box, box, GL_DYNAMIC_DRAW);
	glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
	x += (gFont->advance.x >> 6) * sx;
	y += (gFont->advance.y >> 6) * sy;

}
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindVertexArray(0);
glBindTexture(GL_TEXTURE_2D, 0);
glUseProgram(PrimaryProgramShaderID);
glEnable(GL_CULL_FACE);

I am not sure what is happening here to cause the error; I stepped through the code, each of the uniforms has a value associated with the bound shader, and there is data coming into the supplied arguments.

Any ideas?

Thank you.


It should be GL_LUMINANCE_ALPHA instead of GL_ALPHA; FreeType outputs a luminance value plus an alpha value per pixel.

Also, keep in mind that GL_ALPHA and GL_LUMINANCE_ALPHA were removed starting with core OpenGL 3.1, so you might want to use a different format and edit your shader to copy the value into the corresponding color channel (explained on this Stack Overflow page). You can also just create your own bitmap buffer with 4 bytes per pixel and copy the luminance value into the RGB channels directly.
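The "4 bytes per pixel" suggestion can be sketched as a small helper, independent of any GL calls. This is a hypothetical function (the name `glyph_to_rgba` and its signature are not from the thread); it assumes the glyph bitmap uses one coverage byte per pixel, as FT_LOAD_RENDER produces by default:

```c
#include <stddef.h>

/* Hypothetical helper: expand a 1-byte-per-pixel glyph bitmap (as produced
 * by FT_LOAD_RENDER in FT_PIXEL_MODE_GRAY) into a 4-bytes-per-pixel RGBA
 * buffer, replicating the coverage value into R, G, B and A.
 * `dst` must have room for width * rows * 4 bytes. */
static void glyph_to_rgba(const unsigned char *src, unsigned char *dst,
                          size_t width, size_t rows)
{
    for (size_t i = 0; i < width * rows; ++i) {
        unsigned char v = src[i];
        dst[i * 4 + 0] = v; /* R */
        dst[i * 4 + 1] = v; /* G */
        dst[i * 4 + 2] = v; /* B */
        dst[i * 4 + 3] = v; /* A */
    }
}
```

The expanded buffer can then be uploaded with GL_RGBA as both internal format and format, which is valid in a core profile.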


Hmmm, I am a bit of a novice, but I think I can follow along with your first suggestion.

So, change GL_ALPHA to GL_RED and then, in my shader code, use the "R" value for my color fragment coming from the texture sampler?

Hope that makes sense.

Thank you for your fast reply.

Actually, you should only have one value from FreeType, so GL_RED should be enough; sorry, my mistake.

Yes, edit your shader so that the R value of your color contains your alpha value.
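That shader edit might look like the following fragment-shader sketch. The variable and uniform names (`Texture`, `TexCoord`, `TextColor`) are placeholders, not taken from the original code:

```glsl
#version 330 core
in vec2 TexCoord;
out vec4 FragColor;
uniform sampler2D Texture;   // the GL_RED glyph texture
uniform vec4 TextColor;      // assumed uniform holding the text colour

void main()
{
    // The glyph coverage lives in the red channel; use it as alpha.
    float coverage = texture(Texture, TexCoord).r;
    FragColor = vec4(TextColor.rgb, TextColor.a * coverage);
}
```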

Or use a swizzle mask as explained here; in your case it would be:


// after binding the texture
GLint swizzleMask[] = { GL_RED, GL_RED, GL_RED, GL_RED };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzleMask);

With this you don't need to change your shader.
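Putting the pieces together, the upload inside the original loop would become something like this sketch (`gFont` and `tex` are the poster's variables; the `glPixelStorei` call is an assumption worth noting, since glyph rows are tightly packed and may not meet GL's default 4-byte row alignment):

```c
glBindTexture(GL_TEXTURE_2D, tex);

/* Glyph rows are tightly packed; relax the default 4-byte unpack alignment. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

/* Replicate the single red channel into R, G, B and A via the swizzle mask. */
GLint swizzleMask[] = { GL_RED, GL_RED, GL_RED, GL_RED };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzleMask);

/* Core-profile-friendly single-channel upload. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED,
             gFont->bitmap.width, gFont->bitmap.rows, 0,
             GL_RED, GL_UNSIGNED_BYTE, gFont->bitmap.buffer);
```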

Thank you, I will try this today.

