K A Z E

2 Problems: Resizing the Back Buffer & IDirect3DDevice9::CreateTexture

1) I was playing around with the SetViewport function, calling it with the parameters {0, 0, ClientWidth, ClientHeight} from my SetupView function, which runs whenever the window is resized (the same pattern I use in OpenGL). I created the window maximized, and when I restore it to its normal 640x480 size, the viewport ends up smaller than the client area of the window.

After looking through the D3D docs for a while, I realized this is because the back buffer of my D3DDevice is still 1024x712, the size of the client area of the maximized window. At first I thought that was no big deal: if I wanted multiple viewports, I would just set their width and height to a fraction of the back buffer size instead of the actual client area. But then I noticed that when I shrink the window, or worse, enlarge it beyond the client area it had when the device (and thus the back buffer) was created, the whole scene gets scrunched or stretched to fit the new client area. Yikes. Not pretty. Especially if I create the window NOT maximized and then maximize it.

After looking through the docs some more, I found a way to fix it: the Reset function of the IDirect3DDevice9 object. Whoops, another problem: if I do that, I have to re-create all my textures and such, and that wouldn't be good if I had a lot of them. So what can I do about this? Is there any way to resize the back buffer OTHER than resetting?

2) I posted this in For Beginners a few days ago but never got an answer. I'm trying to make an image loader class that loads images manually and creates textures for them, instead of using D3DXCreateTextureFromFile, because I want to keep information about the images (width, height, etc.). I copied the loading code from my OpenGL image loader and then set about writing a GenerateTexture function. The problem is that it fails when I call IDirect3DDevice9::CreateTexture. Here's the code from the function:
void D3DImage::GenerateTexture(LPDIRECT3DDEVICE9 D3DDevice, UINT MipmapLevels)
{
	DeleteTexture(); // If, for whatever reason, this function has already been
	                 // called and a texture exists, release it before generating
	                 // this one
	// Width (UINT), Height (UINT), bAlpha (bool), and Texture (LPDIRECT3DTEXTURE9)
	// are member variables
	if(FAILED(D3DDevice->CreateTexture(Width, Height, MipmapLevels, 0,
	                                   bAlpha ? D3DFMT_A8R8G8B8 : D3DFMT_R8G8B8,
	                                   D3DPOOL_MANAGED, &Texture, NULL)))
	{blah blah...;}
...
}


The image is 512x512, 24-bit, and I'm only trying to generate one mip level. The function returns D3DERR_INVALIDCALL. So what am I doing wrong here? Why is it failing? Any help would be appreciated. Thanks in advance, and sorry for the post being so long; I didn't intend it to be. =)

EDIT: Solved problem 2. Apparently it was failing because of the D3DFMT_R8G8B8 format. It has to be D3DFMT_X8R8G8B8 instead.

[Edited by - K A Z E on September 28, 2006 3:46:26 AM]
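For problem 1: as far as I can tell, Reset really is the intended way to resize the back buffer in D3D9, but the recreation cost is smaller than it sounds. Only resources created in D3DPOOL_DEFAULT have to be released before Reset and rebuilt afterwards; textures in D3DPOOL_MANAGED (like the ones GenerateTexture above creates) survive a Reset untouched. Here's a minimal sketch of a resize handler under those assumptions; g_Device, g_pp, and SetupView are hypothetical names for the device, the D3DPRESENT_PARAMETERS used at creation time, and the existing view-setup function.

#include <d3d9.h>

// Hypothetical globals, standing in for however the app stores these:
extern IDirect3DDevice9*     g_Device; // the device
extern D3DPRESENT_PARAMETERS g_pp;     // params used at CreateDevice time
void SetupView(int Width, int Height); // re-applies viewport/projection

// Call this from the WM_SIZE handler.
void OnResize(int ClientWidth, int ClientHeight)
{
	if(ClientWidth == 0 || ClientHeight == 0)
		return; // minimized; nothing to do

	// 1) Release everything created in D3DPOOL_DEFAULT here
	//    (render targets, dynamic buffers, ...). Managed-pool
	//    textures do NOT need to be released or recreated.

	// 2) Reset with the new back buffer size.
	g_pp.BackBufferWidth  = ClientWidth;
	g_pp.BackBufferHeight = ClientHeight;
	if(FAILED(g_Device->Reset(&g_pp)))
		return; // device may be lost; try again later

	// 3) Recreate the D3DPOOL_DEFAULT resources, then fix up the
	//    viewport so it matches the new back buffer size.
	SetupView(ClientWidth, ClientHeight);
}

One caveat: Reset also restores default device state, so render states, transforms, and the viewport all have to be re-applied afterwards; that's why SetupView is called at the end.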
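For problem 2 (the EDIT above): the D3DERR_INVALIDCALL is consistent with the format being unsupported. Very little hardware exposes a packed 24-bit D3DFMT_R8G8B8 texture format, so 24-bit image data still goes into a 32-bit D3DFMT_X8R8G8B8 texture, where the X8 byte is just padding. If you want to verify support before calling CreateTexture, IDirect3D9::CheckDeviceFormat can do it. A minimal sketch, assuming g_D3D (a hypothetical name) is the IDirect3D9* the device was created from and the display mode is D3DFMT_X8R8G8B8:

#include <d3d9.h>

extern IDirect3D9* g_D3D; // hypothetical: the interface from Direct3DCreate9

bool IsTextureFormatOk(D3DFORMAT Format)
{
	return SUCCEEDED(g_D3D->CheckDeviceFormat(
		D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
		D3DFMT_X8R8G8B8,   // current display-mode format (assumed here)
		0,                 // no special usage
		D3DRTYPE_TEXTURE,
		Format));          // the format we want to create
}

// Usage in GenerateTexture: pick the 32-bit format, checking it first.
// D3DFORMAT Fmt = bAlpha ? D3DFMT_A8R8G8B8 : D3DFMT_X8R8G8B8;
// if(!IsTextureFormatOk(Fmt)) { /* fall back or fail gracefully */ }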
