halogen64

Corona - Loading a texture

I asked about some sprites in an earlier post, but maybe someone can tell me what I am doing wrong here; that would probably fix my earlier problem too. The problem is that instead of an image, I just get a white square:
#include <stdio.h>
#include <stdlib.h>
#include <GL/glut.h>

#include <corona.h>

#pragma comment(lib, "corona.lib")

corona::Image* image;
GLuint texture;

void display(void) {
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glLoadIdentity();
	
	glBindTexture(GL_TEXTURE_2D, texture);
	glBegin(GL_QUADS);
		glTexCoord2f(0.0f, 0.0f); glVertex2f (100, 100);
		glTexCoord2f(1.0f, 0.0f); glVertex2f (164, 100);
		glTexCoord2f(1.0f, 1.0f); glVertex2f (164, 164);
		glTexCoord2f(0.0f, 1.0f); glVertex2f (100, 164);
	glEnd();
	glutSwapBuffers();
}

void keyboard(unsigned char key, int x, int y) {
	if (key == 27) exit(0);
}

void init() {
	glEnable(GL_TEXTURE_2D);
	image = corona::OpenImage("spaceship.png", corona::PF_R8G8B8A8, corona::FF_PNG);
	glGenTextures(1, &texture);
	glBindTexture(GL_TEXTURE_2D, texture);
	glTexImage2D(GL_TEXTURE_2D, 0, 3, image->getWidth(), image->getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, image->getPixels());

}

void reshape(int width, int height) {
	glViewport(0, 0, width, height);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluOrtho2D(0, width, 0, height);
	glMatrixMode(GL_MODELVIEW);
}

int main(int argc, char* argv[]) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
	glutInitWindowSize(640, 480);
	glutCreateWindow("DEMO");
	glutKeyboardFunc(keyboard);
	glutDisplayFunc(display);
	glutReshapeFunc(reshape);
	init();
	glutMainLoop();
	return EXIT_SUCCESS;
}


[Edited by - halogen64 on March 6, 2008 10:20:14 AM]

This is a very common mistake in OpenGL. The default minification filter is GL_NEAREST_MIPMAP_LINEAR, i.e., a mipmapping filter. It requires all of the texture's mipmap levels to be defined; otherwise it's as if texture mapping were disabled for the texture unit this texture object is bound to.

You're only defining the base mipmap level (level 0). You can manually define each mipmap level using glTexImage2D (there are specific requirements for mipmap levels, see the Mipmapping subsection in section 3.8.8 in the OpenGL Specs). You can automatically generate mipmaps by setting the GL_GENERATE_MIPMAP texture parameter to GL_TRUE (see the Automatic Mipmap Generation subsection right after the Mipmapping subsection in 3.8.8).

You can also manually generate the mipmap levels using glGenerateMipmapEXT from the GL_EXT_framebuffer_object (FBO) extension.

There are also the GLU mipmap-building functions, but I don't recommend using those.

That's just what jumped out at me on a quick pass over your code; there could be other problems that have a similar effect.

From what I gathered, I tried this. The first texture loads (without proper alpha support) and the second texture is completely white. Can anyone help me out?


#include <stdio.h>
#include <stdlib.h>
#include <GL/glut.h>

#include <corona.h>

#pragma comment(lib, "corona.lib")

GLuint texture;
GLuint background;

void display(void) {
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glLoadIdentity();
	glEnable(GL_TEXTURE_2D);

	glBindTexture(GL_TEXTURE_2D, background);
	glBegin(GL_QUADS);
		glTexCoord2f(1.0f, 1.0f); glVertex2f (80, 0);
		glTexCoord2f(0.0f, 1.0f); glVertex2f (560, 0);
		glTexCoord2f(0.0f, 0.0f); glVertex2f (560, 4800);
		glTexCoord2f(1.0f, 0.0f); glVertex2f (80, 4800);
	glEnd();

	glBindTexture(GL_TEXTURE_2D, texture);
	glBegin(GL_QUADS);
		glTexCoord2f(1.0f, 1.0f); glVertex2f (100, 100);
		glTexCoord2f(0.0f, 1.0f); glVertex2f (164, 100);
		glTexCoord2f(0.0f, 0.0f); glVertex2f (164, 164);
		glTexCoord2f(1.0f, 0.0f); glVertex2f (100, 164);
	glEnd();
	glutSwapBuffers();
}

void keyboard(unsigned char key, int x, int y) {
	if (key == 27) exit(0);
}

void init() {
	corona::Image* image = corona::OpenImage("spaceship.tga", corona::PF_R8G8B8A8, corona::FF_TGA);
	glGenTextures(1, &texture);
	glBindTexture(GL_TEXTURE_2D, texture);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->getWidth(), image->getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, image->getPixels());

	corona::Image* bg = corona::OpenImage("level1.tga", corona::PF_R8G8B8A8, corona::FF_TGA);
	glGenTextures(1, &background);
	glBindTexture(GL_TEXTURE_2D, background);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bg->getWidth(), bg->getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, bg->getPixels());
}

void reshape(int width, int height) {
	glViewport(0, 0, width, height);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluOrtho2D(0, width, 0, height);
	glMatrixMode(GL_MODELVIEW);
}

int main(int argc, char* argv[]) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
	glutInitWindowSize(640, 480);
	glutCreateWindow("DEMO");
	glutKeyboardFunc(keyboard);
	glutDisplayFunc(display);
	glutReshapeFunc(reshape);
	init();
	glutMainLoop();
	return EXIT_SUCCESS;
}


Quote:
Original post by halogen64
From what I gathered, I tried this. The first texture loads (without proper alpha support) and the second texture is completely white. Can anyone help me out?

*** Source Snippet Removed ***
For "alpha support" you will need to enable blending and/or alpha testing (see Chapter 6 and Chapter 10, respectively, in the Red Book for more information).

Your code looks okay, as far as I can tell (I've never used Corona). At first I thought you were only setting the minification filter for the first texture object, but after refreshing I see it set for both, so that's probably not the problem. Still, keep in mind that texture parameters are per-texture-object state, so you do need to set them for each texture object (as your code now does).

What video card do you have? It may not support OpenGL 2.0 or the GL_ARB_texture_non_power_of_two extension, in which case you will need to make sure your images have power-of-two dimensions (not just multiples of two).

You should also check for OpenGL errors with glGetError. See Chapter 14 of the Red Book for more information on that.

