
2D in D3D - Problems with multiple frames of sprite animation in a texture



Hello all. These are wonderful forums... I'm new here and it looks like a good place to get some help. I'm wondering if anyone can help me with a 2D problem I'm having in D3D.

I'm trying to convert a DirectDraw project to D3D to take advantage of newer features (primarily alpha blending). I need to draw sprites, basically, and nothing else at this point, and I'm using the ID3DXSprite interface to draw them. All of it is working fine EXCEPT for the following problem.

D3D loads textures at power-of-two resolutions. Most of my resources (bitmaps) are not powers of two, which by itself is not a problem, since D3D converts them automatically. The trouble is that my animation resources store multiple frames in a grid: the number of columns is the number of animation frames and the number of rows is the number of different animations, so each animation runs across one row. Some of my frame counts are not powers of two (e.g. an animation with 9 frames). When such a strip is stretched to a 2^n texture, the frames no longer have a uniform width (e.g. with 512 pixels total, 9 frames means 512/9 pixels per frame, so some frames come out a pixel wider than others). Because I draw the same number of pixels for every frame, the "extra" pixel, when there is one, gets treated as part of the next frame; each frame "slips" a little into the next, and when animated it looks like an old movie reel that's slipping.

It's late and I'm hoping this is comprehensible. The basic problem, again: I don't see a good way to put a non-power-of-two frame count into a 2^n texture, because the frames end up with different widths, which breaks drawing/animating them. Can anyone help? Restricting myself to power-of-two frame counts is not workable for me.

This is all a home-brew self-learning project, so I realize these are most likely not industry best practices, but I would appreciate any help anyone can give. Thanks!
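To make the slipping concrete, here is a small sketch of the arithmetic (helper names are hypothetical, plain integer math): when a 9-frame strip is stretched to a 512-pixel-wide texture, the frame boundaries land on floor(512*i/9), so the frames are not all the same width, and drawing with one fixed width drifts off the true boundaries.

```cpp
#include <cassert>

// Left edge of frame i after a strip of `frames` equal frames has been
// stretched to `texWidth` pixels: the boundary lands on
// floor(texWidth * i / frames).
int frameBoundary(int texWidth, int frames, int i) {
    return texWidth * i / frames;  // integer division = floor for positive values
}

// Actual width of frame i after stretching: the gap between two boundaries.
int frameWidth(int texWidth, int frames, int i) {
    return frameBoundary(texWidth, frames, i + 1)
         - frameBoundary(texWidth, frames, i);
}
```

With texWidth = 512 and frames = 9, frame 0 comes out 56 pixels wide but frame 1 comes out 57; drawing every frame at a fixed 512/9 = 56 pixels means frame 8 is read 7 pixels left of where it really starts, which is exactly the movie-reel slip described above.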

You're simply going to have to readjust your engine handling/thinking. Yes, keep your textures at a power of 2 (if you don't, DX will convert them up to the next power of 2 — bad!), but DrawSprite can draw a region of any size or shape that is not a power of 2. Just keep an array recording where the "sprites" sit inside the texture.
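A minimal sketch of that bookkeeping, assuming the frames keep their original size inside the (possibly larger) power-of-two texture, so the right/bottom padding is simply never referenced; the frame dimensions and function name here are hypothetical. In D3D9, ID3DXSprite::Draw accepts a source RECT, so a rectangle computation like this is about all the extra state you need:

```cpp
#include <cassert>

// A source rectangle inside the sheet texture (right/bottom exclusive,
// matching the Win32 RECT convention that ID3DXSprite::Draw expects).
struct SrcRect { int left, top, right, bottom; };

// Frames laid out in a grid: `col` selects the animation frame and
// `row` selects the animation. frameW/frameH are the ORIGINAL frame
// dimensions, unaffected by any padding added to reach a power of two.
SrcRect frameRect(int frameW, int frameH, int row, int col) {
    SrcRect r;
    r.left   = col * frameW;
    r.top    = row * frameH;
    r.right  = r.left + frameW;
    r.bottom = r.top  + frameH;
    return r;
}
```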

Well, if you want to be able to load odd-sized animation strips then you don't need to change your files at all.

Load your texture strip as a surface instead of as a texture. In memory, cut out equal slices as textures and stuff them all into a collection of some sort (e.g., an STL vector for C++ or an ArrayList for C#). Surfaces do not have any silly power-of-2 restrictions.

For instance, if your texture is 32x288 and contains 9 frames, you would take 9 32x32 slices at the appropriate positions and add 9 textures to your game. If they are vertical or horizontal strips across the board, the logic will be very easy. Simply use the width or height to determine the slice size.

Set it up so that you would access the frame at index 8 of the animation as SomeSprite[8].

Make sure to get rid of each surface when you are done with it.
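The slicing above can be sketched as layout math (the actual pixel copy would be done per slice, e.g. with D3DXLoadSurfaceFromSurface; the function name here is hypothetical):

```cpp
#include <cassert>
#include <vector>

// For a vertical strip of `frames` equally sized frames with total
// height `stripHeight`, compute the top edge of each slice. Each slice
// then becomes its own small texture.
std::vector<int> sliceTops(int stripHeight, int frames) {
    std::vector<int> tops;
    int frameH = stripHeight / frames;  // e.g. 288 / 9 = 32
    for (int i = 0; i < frames; ++i)
        tops.push_back(i * frameH);
    return tops;
}
```

For the 32x288, 9-frame example, this yields tops 0, 32, ..., 256, each starting a 32x32 slice.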

...

Now, the good news. There is a tutorial on this very site which describes what I'm talking about and contains code for it.

The tutorial is here in the DirectX section. http://www.gamedev.net/reference/articles/article1608.asp. The article contains code for loading sprites as surfaces and taking smaller textures out of them if they are not power of 2 or are bigger than the maximum allowed by the graphics card.

But you should be able to write code for this in an hour or so, which is less time than it would likely take you to change your entire file format.

...

Now, this will only solve the problem with those sprites that are really animation strips. If they are actually non-power-of-2, then you just have to deal with the empty space.


[edited by - The Frugal Gourmet on April 6, 2004 5:22:38 PM]

When you create your texture, make sure you aren't filtering it (D3DX_FILTER_NONE). This should prevent D3DX from automatically resizing it to a power of 2, as long as the video card supports non-power-of-2 textures. I use the ID3DXSprite interface with multiple frames in each texture and it works fine.

quote:
Original post by eFoDay
...as long as the video card supports non power of 2 textures...

Many don't. From my own experience, I'd even go so far as to say the majority of cards don't.

eFoDay: Since I might want to work as a programmer someday, I might use this as a demo, so I need it to work with all cards. (Drat.) Thanks though.

Frugal: Sounds like it'll work. I'll get to work on it right away. Thank you very much.

Everyone else: Thanks for your help!

Oh, one more thing...

eFoDay: I just tried D3DX_FILTER_NONE and it worked on my video card! Awesome. For now I can concentrate on my game instead of the graphics engine. Thanks!

quote:
Original post by glassJAw
Many don't. From my own experience, I'd even go so far as to say the majority of cards don't.


The GeForce 2, 3, and 4 and the Radeon 8500 do. I think all the new ones support that texture cap; I don't see why they would stop supporting it.

quote:
Original post by The Frugal Gourmet
Now, this will only solve the problem with those sprites that are really animation strips. If they are actually non-power-of-2, then you just have to deal with the empty space.


To handle the sprites that aren't a power of 2, you might create the texture in an alpha format and use alpha testing to make the padded parts of the texture, where your image was not rendered, transparent.

Don't quote me on this one, but I remember reading that even if a video card does support non-power-of-2 textures, it has to perform additional processing when rendering them, causing a performance hit. At the very least you may want to look into what kind of performance loss you would see, especially on older cards. Listing a Radeon 9700 or above as a prerequisite to run a 2D game might be a bit much, especially since you could handle the conversion yourself when you load the texture from an image.

Share this post


Link to post
Share on other sites
Hi, there are many ways to load and blit a bitmap that isn't a power of 2. One of them:

1. Split your bitmap into small chunks, say 128x128, declare an array to hold those chunks, then load and draw them.

Pseudo code:

CTexture m_Textures[6];

for (short i = 0; i < 6; i++)
    LoadTexture(m_Textures[i]);

for (short i = 0; i < 6; i++)
    DrawTexture(m_Textures[i]);

for (short i = 0; i < 6; i++)
    FreeTexture(m_Textures[i]);

2. Get away from D3DX and load your textures as surfaces, like in DirectDraw.


quote:
Original post by eFoDay
quote:
Original post by glassJAw
Many don't. From my own experience, I'd even go so far as to say the majority of cards don't.


The GeForce 2, 3, and 4 and the Radeon 8500 do. I think all the new ones support that texture cap; I don't see why they would stop supporting it.

My GeForce 2 doesn't, and two of my friends' GeForce 4s don't.
