DirectX 11 and shifting vertices by a half-pixel in 2D rendering.

3 comments, last by Eric F. 8 years, 4 months ago

Hey all.

I read in the MS article about the differences between DX 9 and DX 10/11 that you don't need to add half a pixel when placing your polygons any more. I was glad to see this gone, but I'm having a super hard time aligning everything properly.

For some background, here is how I define the world, view and projection matrices: (Pardon my use of Delphi ;) )


  D3DXMatrixIdentity( World );  // no world transform; vertex coordinates are given directly in pixel space
  D3DXMatrixLookAtLH( View, D3DXVector3.Create(0, 0, -10), D3DXVector3.Create(0, 0, 0), D3DXVector3.Create(0, 1, 0) );  // camera at z = -10 looking at the origin
  D3DXMatrixOrthoOffCenterLH( Projection, 0, DisplaySize.Width, DisplaySize.Height, 0, 1, 100 ); // top-left origin, 1 unit = 1 pixel; updated when display size changes.

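For reference, here's a minimal sketch (plain Pascal, no D3DX types; the helper names are mine) of the mapping that this off-center ortho projection sets up: pixel-space x in [0, Width] goes linearly to clip-space [-1, +1], with y flipped so the origin is the top-left corner.

  function PixelToClipX(X, ViewportWidth: Single): Single;
  begin
    Result := (X / ViewportWidth) * 2 - 1;   // 0 -> -1, Width -> +1
  end;

  function PixelToClipY(Y, ViewportHeight: Single): Single;
  begin
    Result := 1 - (Y / ViewportHeight) * 2;  // 0 -> +1, Height -> -1
  end;

So with this projection, one unit in vertex coordinates is exactly one pixel, and pixel centres fall on half-integer coordinates.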

All other objects are set up so that I can draw lines using line lists, draw images using quads, and so on. Everything works, except that I seem to have an alignment problem.

My problem is as follows. Say, for a viewport of 640 x 480 pixels, I need to provide the vertex coordinates 1-based instead of 0-based. That is, if I want to draw a quad that fills the whole viewport, I need to set it from 1, 1 to 640, 480 instead of 0, 0 to 639, 479.

Am I doing something wrong? Do I really have to offset every vector I pass to the API by 1?

I don't want to start fiddling with the projection to shift it, or kludge my renderer with code that might break on some platform combo.

I could post more code, but I'm not sure what's relevant, and after working on this for five-plus days, the buffer mapping and vertex assignment are pretty straightforward. I also use generic shaders that don't do any extra processing, based on the Rastertek tutorials.

Any suggestions?

Thanks!


"My problem is as follows. Say, for a viewport of 640 x 480 pixels, I need to provide the vertex coordinates 1-based instead of 0-based. That is, if I want to draw a quad that fills the whole viewport, I need to set it from 1, 1 to 640, 480 instead of 0, 0 to 639, 479."

I'm not sure that I understand the requirement to use 1-based coordinates. It seems that you may be needlessly making things complicated for yourself.

Using 0-based coordinates, you provide them in e.g. the {0,0} to {640,480} range and everything works.


I'm not sure where you are getting those numbers from. A (0, 0) to (639, 479) quad cannot possibly fill a (640, 480) viewport - that would be like a (0, 0) to (0, 0) quad filling a (1, 1) viewport. I'm also not sure why you're trying to add one when the offset specifically mentions half a pixel.

In DirectX 9 your quad was (-0.5, -0.5) to (639.5, 479.5), while in DirectX 11 it's (0, 0) to (640, 480). Unless you were dealing with the half-pixel offset in your shaders, in which case you only have to edit your shaders.
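To make that concrete, here's a minimal sketch (TQuadRect and FullscreenQuad are hypothetical names, not from any API) of a full-screen quad under both conventions, given the pixel-space ortho projection from the first post:

  type
    TQuadRect = record
      Left, Top, Right, Bottom: Single;
    end;

  function FullscreenQuad(Width, Height: Single; D3D9HalfPixel: Boolean): TQuadRect;
  begin
    // D3D11: pixel edges sit on the integers, so the quad is simply (0, 0)..(Width, Height).
    Result.Left   := 0;
    Result.Top    := 0;
    Result.Right  := Width;
    Result.Bottom := Height;
    if D3D9HalfPixel then
    begin
      // D3D9 mapped pixel centres to integer coordinates, so everything
      // shifts back by half a pixel: (-0.5, -0.5)..(Width - 0.5, Height - 0.5).
      Result.Left   := Result.Left   - 0.5;
      Result.Top    := Result.Top    - 0.5;
      Result.Right  := Result.Right  - 0.5;
      Result.Bottom := Result.Bottom - 0.5;
    end;
  end;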

You should put quad vertices at the corners of pixels -- so with a viewport of size (width,height) a quad that completely covers the origin corner pixel should go from float2(0,0)/float2(width,height) to float2(1,1)/float2(width,height).

In your code, you might say that this sprite begins at pixel #0 and ends at pixel #0, but to make the quad actually cover pixel #0 (whose centre point is at 0.5/width), it needs corners at 0/width and 1/width.
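As a minimal sketch of that rule in the pixel-space coordinates your projection sets up (the procedure name is mine):

  // A sprite covering pixels FirstPixel..LastPixel (inclusive) needs quad
  // edges at FirstPixel and LastPixel + 1: pixel centres sit at +0.5,
  // pixel edges sit on the integers.
  procedure PixelSpanToQuadEdges(FirstPixel, LastPixel: Integer;
    out EdgeMin, EdgeMax: Single);
  begin
    EdgeMin := FirstPixel;      // left/top edge of the first pixel
    EdgeMax := LastPixel + 1;   // right/bottom edge of the last pixel
  end;

So a sprite covering pixel #0 alone gets edges 0 and 1, and one covering pixels 0..639 gets edges 0 and 640 -- which is exactly the (0, 0) to (640, 480) full-screen quad above.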

After reading the replies, I went back to my code and had a good look over it, taking in what you guys said. It turns out I wasn't following the proper rasterization rules, which was causing all manner of problems in my code.

This bit from MSDN might help someone having the same problem:

"Non-antialiased line rendering rules are exactly the same as those for GDI lines."

https://msdn.microsoft.com/en-us/library/windows/desktop/dd145027(v=vs.85).aspx
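For anyone hitting the same wall with line lists: here's a minimal sketch of what that rule means in practice, assuming the GDI convention that the last pixel of a line is not drawn (the helper name is mine):

  // Endpoints for a horizontal line meant to cover pixels X0..X1 inclusive
  // on row Y. Line vertices go through pixel centres (+0.5), and the end
  // vertex is pushed one pixel past the last pixel we want lit, since the
  // endpoint pixel itself is excluded.
  procedure HorizontalLineEndpoints(X0, X1, Y: Integer;
    out StartX, EndX, LineY: Single);
  begin
    StartX := X0 + 0.5;        // centre of the first pixel
    EndX   := X1 + 1 + 0.5;    // centre of the pixel after the last one
    LineY  := Y + 0.5;         // centre of the row
  end;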

Thanks!

This topic is closed to new replies.
