# OpenGL ES - Pixel perfect image rendering?


## Recommended Posts

I'm making a kind of 3D music program, mostly for entertainment value, using C# and OpenGL ES, targeting a Windows Mobile touchscreen smartphone (HTC Touch Diamond/Pro). I'm calling into Windows Media Player from my program, and everything is more or less working. I come from pre-existing 3D engines, so I'm wondering what the best way is to draw a pixel-precise image to the screen in OpenGL ES - i.e. if I have a 64x64 bitmap and place it at 0, 0, I expect it to cover exactly 0, 0 to 63, 63 and nothing more. That way I can detect when the image has been touched - thus a bitmap button.

I tried using GDI, but it flickered terribly alongside the OpenGL rendering, even after placing the draw call after the OpenGL rendering was complete. It seems I will have to go with pure OpenGL for this task. Does anyone have any pointers for me? Thanks. [Edited by - AndyCR512 on December 1, 2008 9:19:05 AM]

##### Share on other sites
Thanks, though I tried that and couldn't get it to work. I think it's the .NET wrapper I'm using - it exposes things in a rather odd way (there's not even a glVertex3f function, for example - it has to be done with glVertexPointer and glDrawArrays), so I think I just need to figure out how the API would like me to do this. I got the wrapper here: http://www.koushikdutta.com/2008/08/net-compact-framework-wrapper-for.html

##### Share on other sites
Quote:
 Original post by AndyCR512: I think it's the .NET wrapper I'm using - it exposes things in a rather odd way (there's not even a glVertex3f function, for example - it has to be done with glVertexPointer and glDrawArrays)
It is exposing things properly. Immediate mode (glVertex, etc.) is deprecated in OpenGL, and was intentionally never added to OpenGL ES.
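To make the difference concrete, here is a minimal illustration (not from the thread) of what a few glVertex3f calls become in ES: the same geometry is placed in a client-side array and submitted in one call.

```c
/* A triangle that would have been three glVertex3f calls in desktop GL,
   expressed as a client-side array for OpenGL ES 1.x. */
static const float triangle[] = {
    0.0f, 0.0f, 0.0f,  /* was glVertex3f(0, 0, 0) */
    1.0f, 0.0f, 0.0f,  /* was glVertex3f(1, 0, 0) */
    0.0f, 1.0f, 0.0f,  /* was glVertex3f(0, 1, 0) */
};
/* Submitted roughly as:
     glEnableClientState(GL_VERTEX_ARRAY);
     glVertexPointer(3, GL_FLOAT, 0, triangle);
     glDrawArrays(GL_TRIANGLES, 0, 3);            */
```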

As for rendering GUI elements, you need to set up an orthographic projection matrix. Regular OpenGL includes a utility function for this, glOrtho(), but this was also removed from OpenGL ES, so you will have to set up your own. Something like this should do the trick:
    void setOrtho(float width, float height) {
        // Column-major matrix equivalent to glOrtho(0, width, 0, height, -1, 1)
        GLfloat m[16] = {
            2.0f / width, 0.0f,          0.0f,  0.0f,
            0.0f,         2.0f / height, 0.0f,  0.0f,
            0.0f,         0.0f,          -1.0f, 0.0f,
            -1.0f,        -1.0f,         0.0f,  1.0f
        };
        glMatrixMode(GL_PROJECTION);
        glLoadMatrixf(m);
        glMatrixMode(GL_MODELVIEW);
    }

This sets up an orthographic projection with the x axis aligned with the horizontal screen axis, from 0 to width, and the y-axis aligned with the vertical screen axis, from 0 to height. The only issue is that the y-axis may be inverted with respect to the screen's native coordinate system.
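If a top-left origin is preferred (matching the touchscreen's coordinate system), the flip can be baked into the matrix itself. A minimal sketch, with the y scale and translation negated so it is equivalent to glOrtho(0, width, height, 0, -1, 1) - the helper name is hypothetical, not from the thread:

```c
#include <string.h>

/* Builds a column-major matrix equivalent to
   glOrtho(0, width, height, 0, -1, 1): pixel (0,0) is the top-left
   corner of the screen, y increases downwards. */
static void buildOrtho2D(float *m, float width, float height) {
    memset(m, 0, 16 * sizeof(float));
    m[0]  =  2.0f / width;   /* x: pixel 0..width  -> NDC -1..1 */
    m[5]  = -2.0f / height;  /* y: pixel 0..height -> NDC 1..-1 (flipped) */
    m[10] = -1.0f;           /* z, as glOrtho with near = -1, far = 1 */
    m[12] = -1.0f;           /* translate x */
    m[13] =  1.0f;           /* translate y (flipped) */
    m[15] =  1.0f;
}
```

Load it with glLoadMatrixf() while GL_PROJECTION is the current matrix mode, and a vertex at (0, 0) then lands on the top-left pixel.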

##### Share on other sites
Thank you for your help. The wrapper provides a glOrthox function for some reason, so I should be able to use that.

I still can't get it to work. I'll just keep trying. I feel so dumb being stuck on something as simple as drawing a bitmap. I think I spent too much time in high-level engines...

##### Share on other sites
Quote:
 Original post by AndyCR512: I still can't get it to work. I'll just keep trying. I feel so dumb being stuck on something as simple as drawing a bitmap. I think I spent too much time in high-level engines...
OK, so time for a quick check list:
• Have you created and loaded the texture?
• Is the texture set to use GL_NEAREST filtering?
• Is mipmapping disabled?
• Is the texture bound?
• Is texturing enabled?
• Does your quad have texture coordinates?
• Are the vertices of your quad in the right order (counter-clockwise)?

If you can answer yes to all of those, it should be working ;)
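If every box is ticked, the geometry itself is worth a look. A sketch (the helper and names are mine, not the poster's code) of a pixel-aligned quad with matching texture coordinates, laid out for GL_TRIANGLE_STRIP, assuming a glOrtho(0, width, 0, height)-style projection like the one shown earlier:

```c
/* Fills vertex (4 * 3 floats) and texture-coordinate (4 * 2 floats)
   arrays for a w-by-h bitmap placed at pixel position (x, y). */
static void makeQuad(float *v, float *t, float x, float y, float w, float h) {
    v[0] = x;     v[1]  = y;     v[2]  = 0.0f;
    v[3] = x;     v[4]  = y + h; v[5]  = 0.0f;
    v[6] = x + w; v[7]  = y;     v[8]  = 0.0f;
    v[9] = x + w; v[10] = y + h; v[11] = 0.0f;

    t[0] = 0.0f; t[1] = 0.0f;
    t[2] = 0.0f; t[3] = 1.0f;
    t[4] = 1.0f; t[5] = 0.0f;
    t[6] = 1.0f; t[7] = 1.0f;
}
/* Drawn roughly as:
     glVertexPointer(3, GL_FLOAT, 0, v);
     glTexCoordPointer(2, GL_FLOAT, 0, t);
     glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   */
```

Note that which strip order counts as front-facing depends on whether the projection flips the y-axis, so if nothing shows up, try swapping the winding or disabling face culling while debugging.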

##### Share on other sites
Thanks. I get a perfect texture-mapped square when I try displaying in 3D. When I try displaying it on the screen so that it renders like a plain 2D bitmap, I get a muddy mess - presumably an extreme close-up of the center of the texture. I'll try disabling mipmapping and using GL_NEAREST.

I have a feeling it may have to do with how I'm submitting the vertex array - I use this:

gl.VertexPointer(3, gl.GL_FLOAT, 0, (IntPtr)trianglePointer);

I thought you're supposed to use integers to describe 2D triangles. However, the wrapper does not provide a gl.GL_INT constant.

##### Share on other sites
Quote:
 Original post by AndyCR512: I have a feeling it may have to do with how I'm submitting the vertex array - I use this: gl.VertexPointer(3, gl.GL_FLOAT, 0, (IntPtr)trianglePointer); I thought you're supposed to use integers to describe 2D triangles. However, the wrapper does not provide a gl.GL_INT constant.
That won't work. As you realised, you need a GL_INT constant to use an IntPtr, so since that is not available, specify your coordinates in floats instead.

##### Share on other sites
Thanks. I don't think that's passing an int - that pointer points to a float. The author of the wrapper used floats in his example, yet used IntPtr. Not at all sure why, but it seems to work.

I got it to kind of work by using smaller numbers in the vertices - .05 instead of 1, for instance. However, that does not correspond to one texel for one pixel on the screen. I don't know at all what these floats correspond to. Is it 0 for nothing and 1 to fill up the viewport? I'm just not sure.

At least it's displaying now, though scaled.

##### Share on other sites
Quote:
 Original post by AndyCR512: I got it to kind of work by using smaller numbers in the vertices - .05 instead of 1, for instance. However, that does not correspond to one texel for one pixel on the screen. I don't know at all what these floats correspond to. Is it 0 for nothing and 1 to fill up the viewport? I'm just not sure.
It only corresponds to one texel per pixel if you have specified a projection matrix equivalent to glOrtho(0, width, 0, height), as in my earlier matrix function.
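The mapping behind this is worth spelling out: without any projection matrix, vertex coordinates are interpreted directly as normalized device coordinates, where only the range -1 to 1 is visible - which is why small values like 0.05 happened to stay on screen. Under glOrtho(0, width, 0, height), a pixel coordinate x instead maps to 2*x/width - 1 in NDC, so vertices can be given directly in pixels. A tiny illustration (the function is mine, for demonstration only):

```c
/* Maps a pixel x coordinate to normalized device coordinates, as a
   glOrtho(0, width, ...) projection would. */
static float pixelToNdcX(float x, float width) {
    return 2.0f * x / width - 1.0f;
}
```

For example, on a 480-pixel-wide screen, pixel 0 maps to -1, pixel 240 to 0, and pixel 480 to 1.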
