How to NOT map vertices to screen

EvilNando
[url="https://github.com/jckarter/hello-gl-ch3/blob/master/hello-gl.c"]https://github.com/jckarter/hello-gl-ch3/blob/master/hello-gl.c[/url]
According to the example I've been following, this structure is used for creating the vertices:


[code]
/*
 * Data used to seed our vertex array and element array buffers:
 */
static const GLfloat g_vertex_buffer_data[] = {
    -1.0f, -1.0f, 0.0f, 1.0f,
     1.0f, -1.0f, 0.0f, 1.0f,
    -1.0f,  1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, 0.0f, 1.0f
};
static const GLushort g_element_buffer_data[] = { 0, 1, 2, 3 };
[/code]



Where do these vertices get assigned to screen space? I would like to get an actual quad of 1 pixel instead of one that covers my whole current GL surface.

dpadam450
Map -1 to 1 onto 0 to the window width/height:

-1 becomes 0, 1 becomes the window width.

width, height = 640 x 480
-1.0 * (640/2.0) = -320, then -320 + (640/2.0) = 0, so -1 becomes 0
1.0 * (640/2.0) = 320, then 320 + (640/2.0) = 640, so 1 becomes the window width


Follow that and you're fine. Or, if you're in 2D, you can use glOrtho(0, windowWidth, 0, windowHeight, -1, 1); and then your g_vertex_buffer just holds pixel locations instead of -1 to 1 values.
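
A minimal sketch of that mapping, assuming a 640x480 window (the function names here are just for illustration, they aren't part of hello-gl.c):

[code]
#include <stdio.h>

/* -1..1 -> 0..size: the mapping described above. */
static float ndc_to_pixel(float ndc, float size)
{
    return (ndc * (size / 2.0f)) + (size / 2.0f);
}

/* 0..size -> -1..1: the inverse, so a quad can be specified in pixels
 * and still end up in the -1..1 range the pipeline expects. */
static float pixel_to_ndc(float pixel, float size)
{
    return (pixel / size) * 2.0f - 1.0f;
}

int main(void)
{
    /* A quad covering exactly one pixel at (100, 100) in a 640x480 window. */
    float x0 = pixel_to_ndc(100.0f, 640.0f), x1 = pixel_to_ndc(101.0f, 640.0f);
    float y0 = pixel_to_ndc(100.0f, 480.0f), y1 = pixel_to_ndc(101.0f, 480.0f);

    printf("x: %f .. %f   y: %f .. %f\n", x0, x1, y0, y1);
    printf("-1 -> %.1f px, 1 -> %.1f px\n",
           ndc_to_pixel(-1.0f, 640.0f), ndc_to_pixel(1.0f, 640.0f));
    return 0;
}
[/code]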

Brother Bob
The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthogonal projection matrices. Just multiply your vertices by that matrix in your vertex program.
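
A minimal sketch of that approach, using the matrix layout documented for glOrtho. The uniform name "projection", the glOrtho-style parameters, and the attribute name "position" are assumptions for illustration, not necessarily what hello-gl.c uses:

[code]
#include <GL/gl.h>
#include <string.h>

/* Fill a column-major 4x4 matrix with the values documented for
 * glOrtho(left, right, bottom, top, near, far). */
static void ortho_matrix(GLfloat m[16],
                         GLfloat l, GLfloat r,
                         GLfloat b, GLfloat t,
                         GLfloat n, GLfloat f)
{
    memset(m, 0, 16 * sizeof(GLfloat));
    m[0]  =  2.0f / (r - l);
    m[5]  =  2.0f / (t - b);
    m[10] = -2.0f / (f - n);
    m[12] = -(r + l) / (r - l);
    m[13] = -(t + b) / (t - b);
    m[14] = -(f + n) / (f - n);
    m[15] =  1.0f;
}

/* Vertex shader that applies the projection to each incoming vertex. */
static const char *vertex_shader_source =
    "#version 110\n"
    "uniform mat4 projection;\n"
    "attribute vec4 position;\n"
    "void main() { gl_Position = projection * position; }\n";

/* At setup, after linking the program, something like:
 *
 *   GLfloat proj[16];
 *   ortho_matrix(proj, 0.0f, 640.0f, 0.0f, 480.0f, -1.0f, 1.0f);
 *   GLint loc = glGetUniformLocation(program, "projection");
 *   glUniformMatrix4fv(loc, 1, GL_FALSE, proj);
 *
 * After that, g_vertex_buffer_data can hold pixel coordinates directly. */
[/code]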

EvilNando
[quote name='Brother Bob' timestamp='1306245752' post='4815119']
The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthogonal projection matrices. Just multiply your vertices by that matrix in your vertex program.
[/quote]

Then do I have to create the matrix at the shader level, or is there a way to pass a custom data type to the shader as an attribute?

Sorry, I'm very new at this. xD
