private void drawFrame(GL10 gl, int w, int h) {
    gl.glViewport(0, 0, w, h);

    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    gl.glClearColor(0.5f, 0.5f, 0.5f, 1);
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    gl.glTranslatef(0, 0, -1000.0f);
    gl.glScalef(0.5f, 0.5f, 0.5f);
    gl.glColor4f(0.7f, 0.7f, 0.7f, 1.0f);

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
    gl.glEnable(GL10.GL_CULL_FACE);
    gl.glShadeModel(GL10.GL_SMOOTH);
    gl.glEnable(GL10.GL_DEPTH_TEST);

    float[] squareVertices = {
        // FRONT
        -0.5f, -0.5f,  0.5f,
         0.5f, -0.5f,  0.5f,
        -0.5f,  0.5f,  0.5f,
         0.5f,  0.5f,  0.5f,
        // BACK
        -0.5f, -0.5f, -0.5f,
        -0.5f,  0.5f, -0.5f,
         0.5f, -0.5f, -0.5f,
         0.5f,  0.5f, -0.5f,
        // LEFT
        -0.5f, -0.5f,  0.5f,
        -0.5f,  0.5f,  0.5f,
        -0.5f, -0.5f, -0.5f,
        -0.5f,  0.5f, -0.5f,
        // RIGHT
         0.5f, -0.5f, -0.5f,
         0.5f,  0.5f, -0.5f,
         0.5f, -0.5f,  0.5f,
         0.5f,  0.5f,  0.5f,
        // TOP
        -0.5f,  0.5f,  0.5f,
         0.5f,  0.5f,  0.5f,
        -0.5f,  0.5f, -0.5f,
         0.5f,  0.5f, -0.5f,
        // BOTTOM
        -0.5f, -0.5f,  0.5f,
        -0.5f, -0.5f, -0.5f,
         0.5f, -0.5f,  0.5f,
         0.5f, -0.5f, -0.5f,
    };

    // Allocate vertex data
    FloatBuffer vertices = BufferUtil.newFloatBuffer(72);
    vertices.put(squareVertices);
    vertices.position(0); // rewind so GL reads from the start of the buffer
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertices);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);

    gl.glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
    //gl.glDrawArrays(GL10.GL_LINE_STRIP, 0, 1);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 4, 4);
    gl.glColor4f(0.0f, 1.0f, 0.0f, 1.0f);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 8, 4);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 12, 4);
    gl.glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 16, 4);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 20, 4);
}
Cube in OpenGL ES not rendering
Hi guys, I've used OpenGL before, but now I'm trying OpenGL ES and it's been a while. I've been messing with this forever and can't figure out where I'm going wrong. No matter what I change the glTranslatef z value to, everything ends up in the same place. Not only that, I sometimes get a square and sometimes some weird triangle. Any help would be greatly appreciated.
I'm trying to draw a cube, but at this point I'd be happy with a square or a triangle, just to figure out what I'm doing wrong.
You have the identity matrix in your projection matrix. You need to set up a view frustum using a function such as glFrustum (or glFrustumf in OpenGL ES).
First, you set the projection matrix to the identity matrix. This is the default projection matrix, and with it the clip space extends from -1 to 1 along all axes. So for the cube to be visible, it needs to fall within those limits after the modelview transformation.
Your modelview transformation consists of a translation by -1000 on the Z axis and a scaling by 0.5. Applied to a cube that extends from -0.5 to 0.5 on all axes (that's your cube), the post-modelview cube extends from -0.25 to 0.25 on the X and Y axes, and from -1000.25 to -999.75 on the Z axis. Those Z values are well outside the view volume, so the cube is not rendered.
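You can check that arithmetic directly. A tiny sketch in plain Java (no GL context needed) that applies the same transform — scale by 0.5, then translate Z by -1000 — to the cube's corner coordinates and tests them against the default [-1, 1] clip volume:

```java
public class ClipCheck {
    // Applies the modelview transform from the post to a Z coordinate:
    // glScalef(0.5, 0.5, 0.5) followed by glTranslatef(0, 0, -1000)
    // means the vertex is scaled first, then translated.
    public static float transformZ(float z) {
        return z * 0.5f - 1000.0f;
    }

    // With an identity projection matrix, visible Z lies in [-1, 1].
    public static boolean insideDefaultClipVolume(float z) {
        return z >= -1.0f && z <= 1.0f;
    }
}
```

Both cube faces (z = ±0.5) map to roughly z = -1000, far outside the clip volume, which is why nothing shows up no matter how the X/Y values are tweaked.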
What you need to correct depends on what you want. But to get anything at all at this point, reduce the translation so the cube falls within the view volume; removing it completely is one option. Then you should start thinking about a proper projection matrix, unless you actually want the default identity matrix.
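If you do want a perspective projection, the glFrustumf parameters can be derived from a vertical field of view the same way GLU's gluPerspective does it. A sketch (the 90° fov and 1/2000 near/far values are illustrative assumptions, chosen so a cube at z = -1000 would fit inside the frustum):

```java
public class FrustumSketch {
    // Computes {left, right, bottom, top, near, far} for glFrustumf from
    // a vertical field of view, mirroring gluPerspective's math:
    // top = near * tan(fovy / 2), right = top * aspect.
    public static float[] perspectiveFrustum(float fovyDegrees, float aspect,
                                             float near, float far) {
        float top = near * (float) Math.tan(Math.toRadians(fovyDegrees / 2.0));
        float right = top * aspect;
        return new float[] { -right, right, -top, top, near, far };
    }
}
```

In the renderer you would then do something like: set glMatrixMode(GL10.GL_PROJECTION), glLoadIdentity(), and pass the six values to gl.glFrustumf(f[0], f[1], f[2], f[3], f[4], f[5]) — with aspect computed as (float) w / h from the viewport size.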
You are calling
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
but you never call glColorPointer. Calling glColor4f is useless while glEnableClientState(GL10.GL_COLOR_ARRAY) is in effect.
To clarify V-man's statement a little: enabling GL_COLOR_ARRAY tells OpenGL to retrieve all colours from the array you specified with glColorPointer, and it is likely to crash if you never call glColorPointer, since it is then reading from some random area of memory. If you want to set colours manually, don't enable GL_COLOR_ARRAY; you can then use glColor4f to set the colour for all following vertices.
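If you do want to keep GL_COLOR_ARRAY enabled, you need a colour buffer to point it at. A sketch of a hypothetical helper (plain java.nio, so it works with any GL10 binding) that builds a direct FloatBuffer with one RGBA colour per vertex, ready to hand to glColorPointer:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class ColorBufferSketch {
    // Builds a direct FloatBuffer holding the same RGBA colour repeated
    // once per vertex, suitable for:
    //   gl.glColorPointer(4, GL10.GL_FLOAT, 0, colors);
    public static FloatBuffer makeColorBuffer(float[] rgba, int vertexCount) {
        // 4 bytes per float; GL requires a direct buffer in native byte order
        ByteBuffer bb = ByteBuffer.allocateDirect(vertexCount * rgba.length * 4);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer colors = bb.asFloatBuffer();
        for (int i = 0; i < vertexCount; i++) {
            colors.put(rgba);
        }
        colors.position(0); // rewind before handing the buffer to GL
        return colors;
    }
}
```

For the cube above you would build one with vertexCount = 24 (four vertices per face, six faces) before the draw calls — or, as the replies suggest, simply not enable GL_COLOR_ARRAY and keep the glColor4f calls.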
This topic is closed to new replies.