Drawing code not working

Started by
18 comments, last by Asem 13 years, 4 months ago
I think the question is simple: my code doesn't draw anything on screen. I see a completely black screen, and glGetError doesn't report any error.

The vertices, normals and texture coordinates arrays have data, and the attribute bindings don't raise any error.

There are no errors, but nothing is drawn.

Do I have to do some initialization?
What are you actually trying to do?
What do you expect from the code?
What do you want to be drawn?
Can you draw stuff at all? (Don't expect us to be "up to date" with your knowledge and project, and don't expect us to dig through your posting history just to understand your project).
I'm trying to draw a CUBE.

I have exported a model from Blender to Wavefront .OBJ, loaded it in Java into arrays, and passed them to native code. The problem isn't what I'm trying to draw: even with a model that parses correctly into vertices, normals and texture coordinates I see the same thing, a black screen.

Methods Order:

1. Java_com_company_tests_LoaderRenderer_initRendering

Used to:
a ) Create program.
b ) Select correct vertex and fragment shader.
c ) Get attribute locations for attributes needed (I always need vertices coordinates and indices). Here I see if I will need normal and texture coordinate attributes.

2. Java_com_company_tests_LoaderRenderer_renderFrame

Here I draw the cube.
a ) I retrieve vertices, normals, texture coordinates and indices parameters.
b ) Enable vertices, normals and texture coordinate attributes.
c ) Draw model with glDrawElements.
d ) Disable attributes.
e ) JNI cleaning code.

Other functions are called from these functions.
Java_com_company_tests_LoaderRenderer_updateRendering is called when the screen size changes.

I use jfloatArray vertices, jfloatArray normals, jfloatArray texCoord and jintArray indices as the arrays to draw the model; the cube's vertices are in these arrays. I use these same arrays to draw many models.

I took a working example that reads vertices, normals, texture coordinates and indices from a header file, and modified it to work with many models, not only with the one model defined inside the header file.

If I use another model, it still draws nothing.

I haven't managed to draw anything yet.

Do you need more details?

My project is an Android application with native code. All OpenGL ES 2.0 stuff is here.
What's this?

env->ReleaseFloatArrayElements(vertices, vertPos, 0);

Are you deleting the vertices? If you are, then nothing will draw, according to your code. Try commenting this line and the one below it out.
I'm releasing the pointer to the array elements. This is necessary in JNI: mode 0 copies any changes back to the Java array and frees the native buffer.

I've tried to remove those lines and the application crashes.

Thank you for your help.
Oops, I don't know anything about coding for Android yet, so I'll stick to OpenGL, since that's most likely where the problem is.

Try a simple triangle with an orthographic projection. If that works, then the problem may be with the vertices. Or, even simpler, draw with points instead.

Another thing, about your vertex shader:

static const char* meshVertexShaderNoNormalTexCoor =
    "attribute vec4 vertexPosition;"
    "void main() {"
    "    gl_Position = vertexPosition;"
    "}";


This is fine, but it would really only show up for 2D stuff. You may need to create a modelview and a projection matrix:

gl_Position = ProjectionMatrix * ModelViewMatrix * vertexPosition;

where ProjectionMatrix and ModelViewMatrix have to be shader uniforms. So it may not be that the cube isn't drawn at all, but that the transformations are not right, at least for a 3D object.

Use the GLM math library: http://glm.g-truc.net/
It's a good math library.

This is a shader from the OpenGL ES 2.0 quick reference card, here: http://www.khronos.org/opengles/2_X/

VERTEX SHADER

uniform mat4 mvp_matrix;    // model-view-projection matrix
uniform mat3 normal_matrix; // normal matrix
uniform vec3 ec_light_dir;  // light direction in eye coords
attribute vec4 a_vertex;    // vertex position
attribute vec3 a_normal;    // vertex normal
attribute vec2 a_texcoord;  // texture coordinates
varying float v_diffuse;
varying vec2 v_texcoord;

void main(void)
{
    // put vertex normal into eye coords
    vec3 ec_normal = normalize(normal_matrix * a_normal);
    // emit diffuse scale factor, texcoord, and position
    v_diffuse = max(dot(ec_light_dir, ec_normal), 0.0);
    v_texcoord = a_texcoord;
    gl_Position = mvp_matrix * a_vertex;
}

FRAGMENT SHADER

precision mediump float;
uniform sampler2D t_reflectance;
uniform vec4 i_ambient;
varying float v_diffuse;
varying vec2 v_texcoord;

void main(void)
{
    vec4 color = texture2D(t_reflectance, v_texcoord);
    gl_FragColor = color * (vec4(v_diffuse) + i_ambient);
}
I've changed vertexShader and fragmentShader with these:

static const char* cubeMeshVertexShader =
    "attribute vec4 vertexPosition;"
    "attribute vec4 vertexNormal;"
    "attribute vec2 vertexTexCoord;"
    "varying vec2 texCoord;"
    "varying vec4 normal;"
    "void main() {"
    "    gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * vertexPosition;"
    "    normal = vertexNormal;"
    "    texCoord = vertexTexCoord;"
    "}";

static const char* cubeMeshFragmentShader =
    "precision mediump float;"
    "varying vec2 texCoord;"
    "varying vec4 normal;"
    "uniform sampler2D texSampler2D;"
    "void main() {"
    "    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);"
    "}";


And I see nothing.

[Edited by - VansFannel on December 8, 2010 12:52:11 PM]
Did you set the matrices themselves? You define a matrix in C++, then pass that matrix to the shader, so you need to make the matrices uniforms and feed the data into them:

uniform mat4 gl_ProjectionMatrix;
uniform mat4 gl_ModelViewMatrix;

The usual gl_ProjectionMatrix and gl_ModelViewMatrix don't exist in OpenGL ES 2.0's shading language; those fixed-function built-ins were dropped, so you have to declare and upload your own matrices. (Note that names starting with gl_ are reserved in GLSL ES, so give your own uniforms different names.)

Use the GLM math library, or some other math library, to construct your matrices. Android already ships one for this on the Java side (android.opengl.Matrix).

This is example code from GLM. It's the same thing, except that only one matrix is passed to the shader, as a

uniform mat4 MVP;

matrix, so you could do the same; it's no different from splitting up the matrix work.

You should really check out GLM. This is the link to example code showing how the math library works:
http://glm.g-truc.net/code.html

// glm::vec3, glm::vec4, glm::ivec4, glm::mat4
#include <glm/glm.hpp>
// glm::perspective
#include <glm/gtc/matrix_projection.hpp>
// glm::translate, glm::rotate, glm::scale
#include <glm/gtc/matrix_transform.hpp>
// glm::value_ptr
#include <glm/gtc/type_ptr.hpp>

{
    glm::mat4 Projection = glm::perspective(45.0f, 4.0f / 3.0f, 0.1f, 100.f);
    glm::mat4 ViewTranslate = glm::translate(
        glm::mat4(1.0f),
        glm::vec3(0.0f, 0.0f, -Translate));
    glm::mat4 ViewRotateX = glm::rotate(
        ViewTranslate,
        Rotate.y, glm::vec3(-1.0f, 0.0f, 0.0f));
    glm::mat4 View = glm::rotate(
        ViewRotateX,
        Rotate.x, glm::vec3(0.0f, 1.0f, 0.0f));
    glm::mat4 Model = glm::scale(
        glm::mat4(1.0f),
        glm::vec3(0.5f));
    glm::mat4 ModelView = View * Model;

    glUniformMatrix4fv(LocationModelView, 1, GL_FALSE, glm::value_ptr(ModelView));
    glUniformMatrix4fv(LocationProjection, 1, GL_FALSE, glm::value_ptr(Projection));
}
You are right. The problem is with my ProjectionMatrix and ModelViewMatrix.

I don't know how to set them up, but I'm sure the problem is with them.

Thank you very much for your time and help.

If you need more information, tell me.
If you look at my previous post, the code at the bottom shows how to set up the projection and modelview matrices using the GLM math library, and there's the website to download it. Once you have your matrices set up, pass them to the shader as uniforms.

Give it a try, and if nothing shows up, post again.

This topic is closed to new replies.
