large alpha sprites problem [SOLVED]

Started by streamer
10 comments, last by id_gamedev 16 years, 10 months ago
Hello all! I have a big problem. When I use large sprites (e.g. a 750x226 texture) with a lot of transparent pixels on them (say 50%), something weird happens. Every time an animation (a sequence of textures) starts, Task Manager shows available memory going down. After a few hours of playing, the computer hangs and resets itself. There is no memory leak in my code; textures are loaded just once into system memory. With a texture that has alpha but smaller dimensions, or with any texture that has no alpha (I use the PNG file format), everything works fine. The memory "eating" only occurs when a texture is large and has a lot of transparent pixels on it. The graphics card is an ATI Radeon Xpress 200, which can load textures up to 2048x2048. I know I should use power-of-two texture dimensions, but that shouldn't be the problem here. Any ideas? Thanks in advance. [Edited by - streamer on May 23, 2007 4:15:20 AM]
I can't think of any reason why this should happen, but do realise that 'task manager' is really not a good way of measuring memory leaks or system performance.

Get your hands on the AMD/ATI profiling tools and PIX from the SDK and do some analysis of your application. If the driver/GPU is burning memory it should appear in more detailed form in these tools.

Couldn't find it on the AMD website, but are the 'xpress 200' cards using a UMA? Do they have any dedicated VRAM? If not, you'll find that regular system memory will be under even more demand...

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

I've never used PIX but I'll try it.
The ATI Xpress 200 is an integrated graphics card with 64MB of VRAM. I think the card doesn't have its own memory; instead it borrows from system memory. Anyway, dxdiag shows it has an Internal DAC at 400MHz.

Actually, I noticed that my problem description was not very accurate. The problem occurs not while displaying large sprites, but rather when a lot of alpha-blended sprites are on screen and some of them are larger than a quarter of the screen.
An animation eats some amount of memory; then, if the program stops the animation to do some other things and later starts it again, it eats another portion of memory.
If the animation runs constantly, the memory loss occurs just once, during the first loop of the animation.
At first glance I thought the problem started when sprites were at the same Z coordinate, but now I'm not so sure.
And it is not only a problem with sprites. I changed some of them to indexed quads, but the problem remains.

Thanks for the reply.
I have the same card, and I haven't encountered such a problem. I have seen performance issues with large non-power-of-two textures, but not massive memory leaks. Just because the card can load non-power-of-two textures doesn't mean it's fast.

Are you sure there are no leaks in your code? It could be a driver issue too; I updated mine from the default ones the card shipped with, and it seems to be a lot faster now.
I'm 100% sure there are no memory leaks in my program.
For example:
iAnimCounter++;
DrawSprite(100, 100, iAnimCounter);
shouldn't cause memory leaks.
DrawSprite just blits one quad to the screen, nothing more. I tried to debug the game, and the memory loss occurs when DrawSprite is called, but not on every frame; rather, roughly every 10th or 11th frame. For an animation of 40 frames, memory leaks about 4 times. After the first loop, memory doesn't leak unless I stop the animation, wait a few minutes, and start it again. But I was using the old drivers that shipped with my card, so I'll try to update them.
I tried changing the driver. It is not a driver problem, because the memory loss persists.
Then I tried the program on several computers (AMD and Intel, with either ATI or nVidia graphics cards), but the problem remained.
Then I tried some older programs I wrote, and I saw the same problem there too. It is not a sprite problem; whether I use sprites or quads, memory leaks. Small textures leak memory in small portions, big textures in big portions.
Now I suspect the DirectX version is the problem. I'm using the DirectX SDK from December 2005. Could that be the problem? I didn't find anything on the Microsoft site. Do any of the newer releases mention a bug like that?
Is that possible?
So I updated my "old" DirectX SDK to the newest one. Same old story.
In short, the problem is not caused by:
- old drivers
- an old DirectX SDK
- graphics card issues
- CPU issues

If I exclude all possible causes, the only thing that remains is my program. So the problem is in the program.
There is another strange thing.
If I purposely make the program lose the device (alt+tab), then after recreating the device the program gets back the "lost" megabytes.
So it is not actually a memory leak caused by non-deleted pointers.
My program loads everything into system memory, because I didn't want to recreate so many textures. The only things that remain in managed memory are a few fonts with vertex buffers.
Those I recreate if the device is lost.
But then why is this happening?
I'm getting desperate.
Are you 100% sure you're actually losing memory? When I run my app it APPEARS to be using more and more memory (for a while) as each sprite/texture is used for the first time. It's almost as if the load texture function merely prepares/caches the texture for use but the memory increase isn't seen until you actually use that texture. Don't know if this is your problem or not; does your memory problem continue indefinitely or does it stabilize after running for a little while?
Quote:Original post by MasterWorks
Are you 100% sure you're actually losing memory? When I run my app it APPEARS to be using more and more memory (for a while) as each sprite/texture is used for the first time. It's almost as if the load texture function merely prepares/caches the texture for use but the memory increase isn't seen until you actually use that texture. Don't know if this is your problem or not; does your memory problem continue indefinitely or does it stabilize after running for a little while?


Yes, it goes on indefinitely until the system runs out of memory.
But I solved the problem. It was in my texture loading routine: D3DPOOL_MANAGED caused the memory loss, but when I changed it to D3DPOOL_DEFAULT everything works fine.
When I think about it more, it actually seems like a perfectly normal thing to happen. My graphics card borrows 64MB from system memory. I'm loading about 200MB of textures into the managed pool, and of course when I start some bigger animation that steps over 64MB, it eats memory. When the animation is finished, the game runs normally using the old textures, but when a new animation starts, it's the same story: the GPU doesn't have enough memory for the new animation, so it evicts some old textures, puts in the new ones, and eats another portion of memory.
Phew, I knew it was some stupid kind of error.
It may be somewhere in your code that you are adding to a list and then not freeing it. It only shows up when the texture is big because that's when you allocate more memory. Look through your code and you will see.

This topic is closed to new replies.
