Hawkblood

DX11: How do I detect the available video memory?


Recommended Posts

Well, that's the question. I HAD a method in DX9, but I can't seem to find my example. I don't even think it would work with DX11 anyway.
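
(For reference, the static query DXGI offers is DXGI_ADAPTER_DESC, which reports each adapter's memory totals. The following is a minimal sketch only; it assumes dxgi.lib is linked, and the numbers are per-adapter totals rather than a live free-memory figure.)

// Sketch: list each adapter's dedicated and shared memory totals via DXGI.
#include <dxgi.h>
#include <cstdio>

void PrintAdapterMemory()
{
	IDXGIFactory* factory = nullptr;
	if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
		return;

	IDXGIAdapter* adapter = nullptr;
	for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
	{
		DXGI_ADAPTER_DESC desc;
		adapter->GetDesc(&desc);
		// DedicatedVideoMemory is the card's own memory; SharedSystemMemory is
		// system RAM the adapter can borrow. Neither is "free right now".
		wprintf(L"%u: %ls  dedicated=%llu MB  shared=%llu MB\n",
			i, desc.Description,
			(unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)),
			(unsigned long long)(desc.SharedSystemMemory / (1024 * 1024)));
		adapter->Release();
	}
	factory->Release();
}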


That's a place to start, but I would like to have the available amount as the program runs, while creating and destroying textures. I do need what you showed me, but I need more.
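
(If a live figure is what's wanted, newer DXGI (1.4, Windows 10 and up) exposes IDXGIAdapter3::QueryVideoMemoryInfo, which reports a current budget and usage at run time. A sketch, assuming the runtime and adapter support DXGI 1.4:)

// Sketch: live memory budget/usage query via DXGI 1.4 (Windows 10+).
// Assumes 'adapter' is a valid IDXGIAdapter* and dxgi.lib is linked.
#include <dxgi1_4.h>

bool QueryLocalVideoMemory(IDXGIAdapter* adapter, UINT64& budget, UINT64& usage)
{
	IDXGIAdapter3* adapter3 = nullptr;
	if (FAILED(adapter->QueryInterface(__uuidof(IDXGIAdapter3), (void**)&adapter3)))
		return false; // older runtime: the interface is not available

	DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
	HRESULT hr = adapter3->QueryVideoMemoryInfo(
		0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
	adapter3->Release();
	if (FAILED(hr))
		return false;

	budget = info.Budget;       // how much the OS currently lets this app use
	usage  = info.CurrentUsage; // how much the app is using right now
	return true;
}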


I compared the memory info with what the DX Caps program was showing me. It turns out my program is using the on-board video card and not my NVIDIA card. I have tried to change which adapter it uses, but it keeps crashing at the EnumOutputs() call.


I do need what you showed me, but I need more.

I think you need to include what else you are looking for or need.


The on-the-fly amount of available video memory.

 

NEW: my laptop has two video cards available: the on-board one and the NVIDIA card, which is set as the one to use for all games (and it seems to be used for them, except by my program).

I need to learn how to tell the program that I want to use the NVIDIA card instead of the on-board one. My attempts so far are failing at the EnumOutputs() call. I need more info... STUDY!
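
(For completeness, the usual way to hand a specific adapter to D3D11 is to pass it to D3D11CreateDevice with D3D_DRIVER_TYPE_UNKNOWN; that driver type is required whenever an explicit adapter pointer is supplied. A sketch, assuming d3d11.lib and dxgi.lib are linked:)

// Sketch: create the D3D11 device on an explicitly chosen adapter.
#include <d3d11.h>
#include <dxgi.h>

ID3D11Device* CreateDeviceOnAdapter(IDXGIFactory* factory, UINT adapterIndex)
{
	IDXGIAdapter* adapter = nullptr;
	if (FAILED(factory->EnumAdapters(adapterIndex, &adapter)))
		return nullptr;

	ID3D11Device* device = nullptr;
	D3D_FEATURE_LEVEL obtained;
	HRESULT hr = D3D11CreateDevice(
		adapter,                 // explicit adapter instead of the default
		D3D_DRIVER_TYPE_UNKNOWN, // required when an adapter pointer is passed
		nullptr,                 // no software rasterizer module
		0,                       // no creation flags
		nullptr, 0,              // default feature level list
		D3D11_SDK_VERSION,
		&device, &obtained,
		nullptr);                // immediate context not needed for this sketch

	adapter->Release();
	return SUCCEEDED(hr) ? device : nullptr;
}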


I suppose I could just keep track of the memory usage myself and not exceed it. The reason I wanted to do it on-the-fly is that I would scale down images as needed to keep usage away from the limit.
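
(A tiny sketch of that bookkeeping idea, with made-up names purely for illustration: accumulate an estimated size when a texture is created, subtract it on release, and check against a budget before loading a full-resolution image.)

// Illustrative bookkeeping sketch (hypothetical names, not a real API):
// track an estimate of texture bytes so loads can be scaled down near a budget.
#include <cstddef>

class TextureBudget
{
public:
	explicit TextureBudget(size_t budgetBytes) : budget(budgetBytes), used(0) {}

	// Rough size of a mip-less 32-bit RGBA texture; real usage depends on
	// format, mip chain, and driver padding, so treat this as an estimate.
	static size_t EstimateRGBA8(size_t width, size_t height)
	{
		return width * height * 4;
	}

	bool CanFit(size_t bytes) const { return used + bytes <= budget; }
	void OnCreate(size_t bytes)     { used += bytes; }
	void OnDestroy(size_t bytes)    { used = (bytes > used) ? 0 : used - bytes; }

private:
	size_t budget;
	size_t used;
};

Before creating a texture, a loader would check CanFit(TextureBudget::EstimateRGBA8(w, h)) and fall back to a half-size image when it would not fit.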

 

Hodgeman, do you know of an easy-to-read example of selecting the second video card? I know it can be done because the DX samples can do it. The problem with the DX samples is that they are VERY hard to follow. Here is a code snippet of the problem area:

	result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
	if(FAILED(result))
	{
		MessageBox(hWnd, "create factory.", "Error", MB_OK);
		return false;
	}

	// List every adapter DXGI can see; descriptions go into testString for display.
	UINT WhichAdapter = 1;
	UINT q = 0;
	//vector <IDXGIAdapter*> vAdapter;
	while (factory->EnumAdapters(q, &adapter) != DXGI_ERROR_NOT_FOUND){
		DXGI_ADAPTER_DESC pDesc;
		adapter->GetDesc(&pDesc);
		testString += *vString(q) + " ";
		// Description is a fixed 128-wchar array; copy up to the terminator.
		for (int z = 0; z < 128; z++){
			if (pDesc.Description[z] != '\0')
				testString += pDesc.Description[z];
			else break;
		}
		testString += "\n";
		q++;
	}



	// Use the factory to create an adapter for the primary graphics interface (video card).
	result = factory->EnumAdapters(0, &adapter);
	if(FAILED(result))
	{
		MessageBox(hWnd, "enum adapters.", "Error", MB_OK);
		return false;
	}

	// Enumerate the primary adapter output (monitor).
	result = adapter->EnumOutputs(0, &adapterOutput);
	if(FAILED(result))
	{
		MessageBox(hWnd, "enum outputs.", "Error", MB_OK);
		return false;
	}


I can view the available video adapters by way of the string I display once D3D is running: 0 is the on-board video and 1 is the NVIDIA card. If I put a 1 in the EnumAdapters(1, &adapter) line, it fails at EnumOutputs. I'm stumped...

 

It fails because the NVIDIA card doesn't have any outputs. That's how those dual-GPU setups work: the final output has to go through the integrated card no matter what.
Both NVIDIA and AMD provide ways to force using the dedicated card; the easiest one is adding this to your program:
extern "C"
{
	__declspec(dllexport) DWORD NvOptimusEnablement = 1;
	__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
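
(If the outputs still need to be enumerated by hand, for display modes and the like, a small guard helps, since a discrete laptop GPU can legitimately report zero outputs. A sketch:)

// Sketch: enumerate outputs defensively; a discrete laptop GPU may report
// zero outputs because the displays hang off the integrated adapter.
#include <dxgi.h>

IDXGIOutput* FindFirstOutput(IDXGIFactory* factory)
{
	IDXGIAdapter* adapter = nullptr;
	for (UINT a = 0; factory->EnumAdapters(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a)
	{
		IDXGIOutput* output = nullptr;
		if (adapter->EnumOutputs(0, &output) != DXGI_ERROR_NOT_FOUND)
		{
			adapter->Release();
			return output; // caller releases
		}
		adapter->Release();
	}
	return nullptr; // no adapter reported an attached output
}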


DUDE! If I could upvote you more than once, it would be +100..... maybe more.

 

That worked. But it shows I have 3 of them and it doesn't show the on-board one... I don't think that's too big a problem unless someone actually has multiple video cards. Since I don't, I don't know if this keeps the others from showing. Do you know if that's a problem?
