About Gilzu

  1. Quote: Original post by Machaira
     Quote: Original post by Mike.Popoloski
     Or you could stop using an outdated and deprecated API and switch to the XNA Framework. [smile]
     Fixed that for you. [grin]

     The project runs on Windows XP Embedded, which, as far as I know, has no XNA support. And even if it did, the graphics card doesn't support DX10.
  2. Following my exhausting attempts to find a memory leak, which was caused by TextureLoader.FromFile(), I had to find an alternative. Using Texture.FromFile() made debugging nearly impossible and the runtime hundreds of percent slower, so I decided to write my own Texture.FromBitmap() method. Bugless this time. It took me about 2-3 days, so I'm posting it for the benefit of other annoyed programmers. It's just as fast as TextureLoader.FromFile(); it can't load TGAs, but it can handle BMP, PNG, PCX, and JPG. Enjoy.

     // Load the bitmap from disk
     Bitmap bitfile = new Bitmap(Image.FromFile(sTextureName));
     Rectangle sourceRect = new Rectangle(0, 0, bitfile.Width, bitfile.Height);

     // Round the texture size up to a power of 2 for wider gfx card support
     int l_Width, l_Height;
     for (l_Width = 2; l_Width < bitfile.Width; l_Width *= 2) ;
     for (l_Height = 2; l_Height < bitfile.Height; l_Height *= 2) ;

     // Lock the bitmap data
     BitmapData l_BitmapData = bitfile.LockBits(
         new Rectangle(0, 0, bitfile.Width, bitfile.Height),
         ImageLockMode.ReadWrite,
         PixelFormat.Format32bppArgb);

     // Create the texture and lock its top-level surface
     Format l_TextureFormat = Format.A8R8G8B8;
     tex = new Texture(Graphics.D3DDevice, l_Width, l_Height, 1,
         Usage.None, l_TextureFormat, Pool.Managed);
     uint[,] l_TextureData = (uint[,])tex.LockRectangle(
         typeof(uint), 0, LockFlags.None,
         new int[] {
             tex.GetSurfaceLevel(0).Description.Height,
             tex.GetSurfaceLevel(0).Description.Width });

     // Copy pixels from the bitmap to the texture
     unsafe
     {
         const int l_pixel = 4; // bytes per pixel (32bpp ARGB)
         for (int l_h = 0; l_h < sourceRect.Height; l_h++)
         {
             for (int l_w = 0; l_w < sourceRect.Width; l_w++)
             {
                 byte* l_pointer = (byte*)(void*)l_BitmapData.Scan0
                     + (l_BitmapData.Stride * l_h) + (l_w * l_pixel);
                 l_TextureData[l_h, l_w] = (uint)Color.FromArgb(
                     l_pointer[3], l_pointer[2], l_pointer[1], l_pointer[0]).ToArgb();
             }
         }
     }

     // Unlock the bitmap and the texture, then release the bitmap
     bitfile.UnlockBits(l_BitmapData);
     tex.UnlockRectangle(0);
     bitfile.Dispose();
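     As an aside, the two empty-bodied for-loops in the snippet round each dimension up to the next power of two. A minimal standalone sketch of that rounding logic (the helper name NextPowerOfTwo is mine, not part of the post):

     ```csharp
     using System;

     class PowerOfTwoDemo
     {
         // Round n up to the next power of two, with a minimum of 2,
         // mirroring the for-loops in the texture snippet.
         static int NextPowerOfTwo(int n)
         {
             int p = 2;
             while (p < n) p *= 2;
             return p;
         }

         static void Main()
         {
             // Typical texture dimensions and their rounded sizes
             Console.WriteLine(NextPowerOfTwo(640)); // 1024
             Console.WriteLine(NextPowerOfTwo(480)); // 512
             Console.WriteLine(NextPowerOfTwo(256)); // 256
         }
     }
     ```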
  3. The chronicles of my ever-continuing effort to resolve this problem:
     13. The Reference Rasterizer won't work on Windows Embedded.
     14. Same with PIX.
     15. Software surfaces give the same result.
     16. Tried allocating a new Texture and then releasing it - WORKED OUT JUST FINE!
     17. Tried allocating a new Texture, rendering a Bitmap onto it, and then releasing it, to rule out other memory problems - WORKS.

     It appears that TextureLoader.FromFile() does load the texture, but renders it (oh, the pun in that) impossible to Dispose(). Does anyone have a good snippet to bypass TextureLoader.FromFile()? I've tried using the Bitmap class, but it is horrendously slow in comparison to TextureLoader.FromFile().
  4. Hi all, I'm having this weird memory leak. In my application (C# / .NET 2005, Managed DirectX 9.0c, 2D with Direct3D using the Sprite object) I have a selection screen that launches whichever option the user chose; when it's done, the user returns to the main selection screen. All was fine until I moved my app to its target machine, a Windows XP Embedded box (which is essentially Windows XP Pro), where it suddenly started throwing strange low memory / out of virtual memory errors. The thing is, on my own computer everything works smoothly. I tried ruling out the following:
     1. I use the latest build of Windows XP Embedded (2007) and the DirectX components.
     2. I drew the available memory on my viewport: on my own computer, once I return to the main selection screen (thus releasing/disposing all of the textures) memory goes back to normal, but with the *same* code the XPe machine shows a memory leak, i.e. +20 MB of textures that were not freed.
     3. I tried to rule out other memory leaks by not using textures at all; everything works just fine, so it must be this problem.
     4. For every texture, after leaving for the selection screen I .Dispose() the texture and the Sprite object I use to display it, and then set them to null.
     5. I tried the dbmon.exe utility from the DX SDK, and it says there's no memory leak; probably this memory is released after the program quits, when there are no references I might have missed.
     6. After I quit the app on the XPe machine, memory does free up...
     7. I tried calling GC.Collect after quitting the module and returning to the selection screen.
     8. I tried GC.WaitForPendingFinalizers after GC.Collect.
     9. I also tried putting Thread.Sleep(1000) before GC.Collect.
     10. In a desperate attempt, I tried disposing the D3DDevice and recreating it. On my computer it worked just fine; on the target XPe machine... well, you know, memory ran out after a couple of rounds.
     11. Calling .Reset() on the device doesn't work either.
     12. EvictManagedResources() - same effect.

     I'm running out of ideas and I really want to get this to work. Your kind help will be most appreciated,
     -Gil
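     For what it's worth, when finalizable wrappers like MDX textures are involved, the usual sequence is collect, wait for finalizers, then collect again, because the first pass only queues objects with pending finalizers rather than reclaiming them. A minimal self-contained sketch of that pattern (the Resource class here is a stand-in I made up, not an MDX type):

     ```csharp
     using System;

     class Resource : IDisposable
     {
         public static int Finalized;

         public void Dispose()
         {
             // Deterministic cleanup; suppress the finalizer so the
             // GC doesn't have to run it later.
             GC.SuppressFinalize(this);
         }

         ~Resource() { Finalized++; }
     }

     class GcDemo
     {
         static void Create()
         {
             // One object disposed properly, one leaked to the finalizer queue
             new Resource().Dispose();
             new Resource();
         }

         static void Main()
         {
             Create();

             // First pass queues the finalizable object, the wait drains
             // the finalizer thread, and the second pass reclaims memory.
             GC.Collect();
             GC.WaitForPendingFinalizers();
             GC.Collect();

             // Only the undisposed object had its finalizer run
             Console.WriteLine(Resource.Finalized);
         }
     }
     ```

     Note this only helps with managed-side pressure; if the native Direct3D surface itself is never released (as post 3 above suggests), no amount of collecting will get it back.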
