Options for a 2D tiling engine in DirectX

Hi all, I'm making a 2D, top-down game with tiled terrain, and I'm having trouble getting the sprites to draw perfectly adjacent to one another. No matter what I do, enabling texture filtering causes the tiles to stop lining up with each other quite right. Right now, I'm using ID3DXSprite in object space.

So: how does one write a 2D tiling engine in Direct3D? I see these options, and haven't gotten any of the first three to work:

1. Use ID3DXSprite with pre-transformed coordinates, i.e., drawing in screen coordinates. This works fine, except when you want to run the game at a different resolution, since you then have to decide manually how to scale the sprites' sizes and positions.
2. Use ID3DXSprite in object space, but without texture filtering. This works OK, but looks very ugly whenever the sprites have to be scaled.
3. Use ID3DXSprite in object space with texture filtering. This is the problem I have now: the sprites can't be drawn to exact pixel positions, and there's space in between them.
4. Abandon ID3DXSprite and write my own system using quads...? Would I even be able to draw tiles perfectly flush with texture filtering enabled?

For anyone who's written a game with tiles: which option did you choose, and how well did it work for you?
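For context, option 3 (what I'm doing now) boils down to something like the sketch below. The tile size, texture, and map dimensions are just placeholders, and the view/projection transforms are assumed to be set on the device elsewhere.

#include <d3dx9.h>

// Object-space ID3DXSprite tile loop (option 3). With bilinear filtering
// enabled, tiles drawn this way can show seams at their shared edges.
void DrawTileLayer(ID3DXSprite* sprite, IDirect3DTexture9* tileTex,
                   int mapW, int mapH, float tileSize)
{
    sprite->Begin(D3DXSPRITE_ALPHABLEND | D3DXSPRITE_OBJECTSPACE);
    for (int y = 0; y < mapH; ++y)
    {
        for (int x = 0; x < mapW; ++x)
        {
            D3DXVECTOR3 pos(x * tileSize, y * tileSize, 0.0f);
            sprite->Draw(tileTex, NULL, NULL, &pos, D3DCOLOR_XRGB(255, 255, 255));
        }
    }
    sprite->End();
}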
I've never done object-space rendering, but if your problem lies only in texture sampling, first make sure your texture coords are being calculated correctly (they should be inflated by half a texel for all four verts of a quad).

Since you are using D3DXSprite, I'm assuming it probably takes care of this for you. However, if you run out of things to try, you can make a quick test using vertex buffers and FVFs (something like the sketch below). Instead of inflating the tex coords, you could also try deflating them in relation to the current scale (to grab fewer edge pixels as you zoom out).
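A rough sketch of that kind of quick test, drawing one pre-transformed quad with explicit UVs (DrawPrimitiveUP instead of a proper vertex buffer, just to keep the test short). The 'inset' amount is illustrative: half a texel would be 0.5f, positive values deflate the UVs inward and negative values inflate them outward.

#include <d3d9.h>

struct TileVertex
{
    float x, y, z, rhw;   // pre-transformed position
    float u, v;           // texture coordinates
};
#define TILE_FVF (D3DFVF_XYZRHW | D3DFVF_TEX1)

void DrawTileQuad(IDirect3DDevice9* dev, IDirect3DTexture9* tex,
                  float px, float py, float size,
                  float texW, float texH, float inset)
{
    const float du = inset / texW;   // e.g. 0.5f / texW for half a texel
    const float dv = inset / texH;

    // The -0.5f on the positions is the usual D3D9 pixel-to-texel alignment offset.
    TileVertex quad[4] =
    {
        { px        - 0.5f, py        - 0.5f, 0.0f, 1.0f, 0.0f + du, 0.0f + dv },
        { px + size - 0.5f, py        - 0.5f, 0.0f, 1.0f, 1.0f - du, 0.0f + dv },
        { px        - 0.5f, py + size - 0.5f, 0.0f, 1.0f, 0.0f + du, 1.0f - dv },
        { px + size - 0.5f, py + size - 0.5f, 0.0f, 1.0f, 1.0f - du, 1.0f - dv },
    };

    dev->SetTexture(0, tex);
    dev->SetFVF(TILE_FVF);
    dev->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(TileVertex));
}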

The second thing that could work is padding your textures at the texel boundary by repeating the edge pixel(s), probably at least 2 pixels around all edges. This will make the filter grab those repeated pixels, averaging them with the real edge pixels, and since they are the same, there will be no differently colored edge.
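A rough sketch of what I mean by that padding, done on a raw 32-bit pixel buffer (the ARGB layout, the destination stride, and the 2-pixel pad are just assumptions for the example):

#include <cstdint>
#include <algorithm>

// Copies a tile into a destination buffer with 'pad' extra pixels on every
// side, filling the border by clamping back to the nearest edge pixel.
// 'dst' points at the top-left of the padded block; 'dstStride' is the
// destination row width in pixels (e.g. the atlas width).
void CopyTileWithPadding(const uint32_t* src, int srcW, int srcH,
                         uint32_t* dst, int dstStride, int pad)
{
    const int outW = srcW + 2 * pad;
    const int outH = srcH + 2 * pad;

    for (int y = 0; y < outH; ++y)
    {
        const int sy = std::min(std::max(y - pad, 0), srcH - 1);
        for (int x = 0; x < outW; ++x)
        {
            const int sx = std::min(std::max(x - pad, 0), srcW - 1);
            dst[y * dstStride + x] = src[sy * srcW + sx];
        }
    }
}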
ValMan is on the money with the padding. Just copy the borders out an extra pixel in your textures. I'd recommend doing it in code, if possible, to avoid the manual hassle. It would be a good time to turn them into a texture atlas, too... :]

The reason it appears comes down to how filtering works. The GPU samples past the region you define for the texture so that the outer texels have something to filter against, too. The result? Well, you've seen it: you end up filtering against a texel you don't want. Adding the padding means that, instead of filtering against that "evil" texel, it filters against a copy of itself, resulting in the same color.

Which option did I choose, you ask? Well, I first generate a big fat atlas texture and throw everything in there, leaving a 1-pixel border. Once I find the place to stick a texture chunk into the atlas, I copy it over, then fill in the border. So far it has been flawless. Not only do I no longer have to worry about the filtering issues, I also get much better performance (because I can render the whole scene in a single draw call), don't have to pad manually, and don't have to care what size a texture is before it hits the atlas.
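When it comes time to draw, the source rect for a tile just skips past the duplicated border, something along these lines (the AtlasEntry fields and the 1-pixel border are illustrative, not code from NetGore):

#include <windows.h>

// Where a tile's padded block sits inside the atlas.
struct AtlasEntry
{
    int x, y;   // top-left of the padded block
    int w, h;   // size of the original image, without the border
};

// Source rect for ID3DXSprite::Draw(): skip the border so only the real
// image is sampled, while the duplicated pixels still protect the edges
// when the filter reaches outward.
RECT GetAtlasSourceRect(const AtlasEntry& e, int border)
{
    RECT r;
    r.left   = e.x + border;
    r.top    = e.y + border;
    r.right  = r.left + e.w;
    r.bottom = r.top  + e.h;
    return r;
}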

I also made sure that, unless specified otherwise, all the positions in my Draw() calls are rounded, which helps reduce the chance that it will even try to filter. It can be quite noticeable when your image switches between filtered and unfiltered as it moves, going from "blurry" to "normal".
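The rounding itself is nothing fancy; something like this before the position goes into Draw() (just a sketch, floorf(x + 0.5f) is plain round-to-nearest):

#include <cmath>
#include <d3dx9.h>

// Snap a draw position to the nearest whole pixel so the sprite isn't
// filtered just because it sits between pixels.
D3DXVECTOR3 SnapToPixel(float x, float y)
{
    return D3DXVECTOR3(floorf(x + 0.5f), floorf(y + 0.5f), 0.0f);
}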
NetGore - Open source multiplayer RPG engine
