How to NOT map vertices to screen


Recommended Posts

https://github.com/jckarter/hello-gl-ch3/blob/master/hello-gl.c
According to the example I've been following, this structure is used for creating the vertices:

/*
 * Data used to seed our vertex array and element array buffers:
 */
static const GLfloat g_vertex_buffer_data[] = {
    -1.0f, -1.0f, 0.0f, 1.0f,
     1.0f, -1.0f, 0.0f, 1.0f,
    -1.0f,  1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, 0.0f, 1.0f
};
static const GLushort g_element_buffer_data[] = { 0, 1, 2, 3 };

Where do these vertices get assigned to screen space? I would like to get a quad of one pixel instead of one that covers whatever my current GL surface is.

Map the -1 to 1 range onto 0 to window width/height:

-1 becomes 0, 1 becomes the window width.

With width x height = 640 x 480:
-1.0 * (640/2.0) = -320, then -320 + (640/2.0) = 0, so -1 becomes 0.
1.0 * (640/2.0) = 320, then 320 + (640/2.0) = 640, so 1 becomes the window width.

Follow that and you're fine. Or, if you're working in 2D, you can use glOrtho(0, window_width, 0, window_height, -1, 1), and then your g_vertex_buffer just holds pixel locations instead of -1 to 1 values.
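To make that concrete, here is a minimal C sketch of the same mapping (this is the arithmetic the viewport transform applies for you; the function names and the 640 width are placeholders for illustration, not part of the example above):

#include <stdio.h>

/* Map a normalized device coordinate (-1..1) to a pixel coordinate (0..size).
 * This mirrors what the viewport transform does after the vertex shader. */
static float ndc_to_pixel(float ndc, float size)
{
    return ndc * (size / 2.0f) + (size / 2.0f);
}

/* Inverse mapping: a pixel coordinate back to -1..1, handy if you want to
 * place quad corners at exact pixel positions in g_vertex_buffer_data. */
static float pixel_to_ndc(float pixel, float size)
{
    return pixel / (size / 2.0f) - 1.0f;
}

int main(void)
{
    /* Assumed window width, matching the 640 x 480 example above. */
    const float width = 640.0f;

    printf("-1 -> %.1f\n", ndc_to_pixel(-1.0f, width));      /* 0.0   */
    printf(" 1 -> %.1f\n", ndc_to_pixel( 1.0f, width));      /* 640.0 */
    printf("320px -> %.3f\n", pixel_to_ndc(320.0f, width));  /* 0.000 (center) */
    return 0;
}

The inverse function is what you would use to work out the -1 to 1 values for a quad that covers exactly one pixel.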

The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthogonal projection matrices. Just multiply your vertices by that matrix in your vertex program.
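For instance, here is a rough sketch of building the matrix that glOrtho(left, right, bottom, top, near, far) documents and handing it to a vertex shader as a uniform. This is not taken from the example above; the ortho() helper, the p_matrix uniform name, and upload_projection() are made up for illustration, and it assumes a GLEW/GLSL 1.10-style setup like the hello-gl tutorial.

#include <GL/glew.h>  /* GLEW, as in the hello-gl example; use whatever loader you already have */

/* Build, in column-major order, the matrix that glOrtho(l, r, b, t, n, f)
 * documents, so it can be uploaded to a shader instead of relying on the
 * fixed-function matrix stack. */
static void ortho(GLfloat *m, float l, float r, float b, float t, float n, float f)
{
    m[0] = 2.0f/(r-l); m[4] = 0.0f;       m[8]  = 0.0f;        m[12] = -(r+l)/(r-l);
    m[1] = 0.0f;       m[5] = 2.0f/(t-b); m[9]  = 0.0f;        m[13] = -(t+b)/(t-b);
    m[2] = 0.0f;       m[6] = 0.0f;       m[10] = -2.0f/(f-n); m[14] = -(f+n)/(f-n);
    m[3] = 0.0f;       m[7] = 0.0f;       m[11] = 0.0f;        m[15] = 1.0f;
}

/* The matching vertex shader: positions arrive in pixels and the uniform
 * maps them into clip space. */
static const char *vertex_shader_source =
    "#version 110\n"
    "uniform mat4 p_matrix;\n"
    "attribute vec4 position;\n"
    "void main() {\n"
    "    gl_Position = p_matrix * position;\n"
    "}\n";

/* Call once after linking the program (and again if the window is resized). */
void upload_projection(GLuint program, int window_width, int window_height)
{
    GLfloat m[16];
    ortho(m, 0.0f, (float)window_width, 0.0f, (float)window_height, -1.0f, 1.0f);

    GLint loc = glGetUniformLocation(program, "p_matrix");
    glUseProgram(program);
    glUniformMatrix4fv(loc, 1, GL_FALSE, m);  /* GL_FALSE: already column-major */
}

With that in place, the positions in g_vertex_buffer_data can be given in pixels, as the previous reply suggested.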


Quote:
The matrix generated by glOrtho can be found in any decent documentation or reference manual. Your favorite linear algebra library should also contain functions to generate orthogonal projection matrices. Just multiply your vertices by that matrix in your vertex program.


Then do I have to create the matrix at the shader level, or is there a way to pass a custom data type to the shader as an attribute?

Sorry, I'm very noob at this xD
