# How to NOT map vertices to screen

## Recommended Posts

[url="https://github.com/jckarter/hello-gl-ch3/blob/master/hello-gl.c"]https://github.com/jckarter/hello-gl-ch3/blob/master/hello-gl.c[/url]
According to the example I've been following, this structure is used to create the vertices:

[code]
/*
 * Data used to seed our vertex array and element array buffers:
 */
static const GLfloat g_vertex_buffer_data[] = {
    -1.0f, -1.0f, 0.0f, 1.0f,
     1.0f, -1.0f, 0.0f, 1.0f,
    -1.0f,  1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, 0.0f, 1.0f
};
static const GLushort g_element_buffer_data[] = { 0, 1, 2, 3 };
[/code]

Where do these vertices get mapped to screen space? I would like to get a quad of exactly one pixel, instead of one that fills whatever my current GL surface is.

##### Share on other sites
Map -1 to 1 onto 0 to window width/height:

-1 becomes 0, 1 becomes window width.

For width, height = 640 x 480:
-1.0 * (640/2.0) = -320; -320 + (640/2.0) = 0, so -1 becomes 0.
1.0 * (640/2.0) = 320; 320 + (640/2.0) = 640, so 1 becomes window width.

Follow that and you're fine. Or, if in 2D, you can use glOrtho(0, windowWidth, 0, windowHeight, -1, 1); and then your g_vertex_buffer just holds pixel locations instead of -1 to 1 values.
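The mapping above is just the viewport transform applied per axis; here's a minimal sketch in C (the function name ndc_to_pixel is mine, not part of OpenGL):

[code]
#include <assert.h>

/* Map a normalized-device-coordinate value in [-1, 1] to a pixel
 * coordinate in [0, size]:  pixel = ndc * (size/2) + (size/2). */
static float ndc_to_pixel(float ndc, float size)
{
    return ndc * (size / 2.0f) + (size / 2.0f);
}
[/code]

With size = 640, an ndc of -1 lands on pixel 0 and an ndc of 1 lands on pixel 640, matching the worked numbers above.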

##### Share on other sites
Any idea what the equivalent of glOrtho is in OpenGL 3.2?

##### Share on other sites
Brother Bob    10344
The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthographic projection matrices. Just multiply your vertices by that matrix in your vertex program.
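For reference, the matrix glOrtho loads can be built by hand; a sketch in C, column-major as OpenGL expects (ortho and transform are my own helper names):

[code]
#include <assert.h>
#include <math.h>

/* Build the column-major 4x4 matrix that glOrtho(l, r, b, t, n, f)
 * would produce, per the OpenGL reference pages. */
static void ortho(float m[16], float l, float r, float b, float t,
                  float n, float f)
{
    m[0] = 2.0f / (r - l); m[4] = 0.0f;           m[8]  = 0.0f;            m[12] = -(r + l) / (r - l);
    m[1] = 0.0f;           m[5] = 2.0f / (t - b); m[9]  = 0.0f;            m[13] = -(t + b) / (t - b);
    m[2] = 0.0f;           m[6] = 0.0f;           m[10] = -2.0f / (f - n); m[14] = -(f + n) / (f - n);
    m[3] = 0.0f;           m[7] = 0.0f;           m[11] = 0.0f;            m[15] = 1.0f;
}

/* out = m * v, with m stored column-major. */
static void transform(float out[4], const float m[16], const float v[4])
{
    for (int i = 0; i < 4; ++i)
        out[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
}
[/code]

With ortho(m, 0, 640, 0, 480, -1, 1), the pixel position (640, 480) transforms to clip coordinates (1, 1) and (0, 0) to (-1, -1), which is exactly what glOrtho gave you in the fixed pipeline.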

##### Share on other sites
[quote name='Brother Bob' timestamp='1306245752' post='4815119']
The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthographic projection matrices. Just multiply your vertices by that matrix in your vertex program.
[/quote]

Then do I have to create the matrix at the shader level? Or is there a way to pass a custom data type to the shader as an attribute?

Sorry, I'm very noob at this xD

##### Share on other sites
Brother Bob    10344
You use the glUniform family of functions and a uniform variable in your shader; for a 4x4 matrix, the one you want is glUniformMatrix4fv.
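In the vertex shader that looks like this (a sketch; the uniform name "projection" is just an example):

[code]
#version 150

uniform mat4 projection;  /* uploaded from C with glUniformMatrix4fv */
in vec4 position;

void main()
{
    gl_Position = projection * position;
}
[/code]

On the C side, look the uniform up once with glGetUniformLocation(program, "projection"), then, with the program bound, upload your matrix with glUniformMatrix4fv(location, 1, GL_FALSE, matrix).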