lonewolff

DX11 Odd Media Foundation bug


Hi Guys,

I am currently trying to render a video frame to a quad in DirectX 11 using Microsoft Media Foundation, but I am having a problem with 'TransferVideoFrame'.

m_spMediaEngine->TransferVideoFrame(texture, rect, rcTarget, &m_bkgColor);

The call keeps returning 'invalid parameter' (E_INVALIDARG).

Some googling turned up this: https://stackoverflow.com/questions/15711054/how-do-i-grab-frames-from-a-video-stream-on-windows-8-modern-apps

It explains that there was a bug in the NVIDIA drivers back in 2013; other searches say this bug has since been fixed.

The very odd thing is, if I run the application through the Visual Studio Graphics Debugger, the video renders perfectly to the quad with no errors, which would suggest that my code is sound.

I can't follow the link's suggestion of creating the device at feature level 9_x, as this is an extension I am creating for GameMaker 2 and I don't have access to device creation (only the device handle, which is fine most of the time).

I am now downloading VS 2017 to see if it behaves better with more recent SDKs (as I am currently using VS 2015).

Thanks in advance :)

 

 


It was a long shot. Just tried under VS 2017 and I'm getting the same behaviour.

'Invalid parameter' when the application is run, and no video output (only sound); yet it works perfectly in the VS Graphics Analyzer, displaying both video and sound.

 


Did you try the debug layer for D3D and DXGI? The VS Graphics Debugger is very intrusive; it can change the behaviour in many cases and silence a bug...

 

 


Hi galop1n,

The problem here is that, since this is an extension I am creating for GameMaker, I can't enable the debug layer (device creation is handled in the internals of GM).

Having said that, I am about to add this to a small DX11 framework I created a while back (which does have the debug layer enabled), so hopefully I can get a better snapshot of what is going wrong.

Will keep you posted on what happens. :)

15 minutes ago, lonewolff said:

The problem here is that being an extension for GameMaker that I am creating, I can't enable the debug layer (as the device creation is handled in the internals of GM). [...]

You can always open the DirectX Control Panel (dxcpl.exe) and force the debug layer on, based on the executable path.


Really? I did not know that. I'll take a look.

 

[edit]

Exception thrown at 0x7522B802 in video.exe: Microsoft C++ exception: _com_error at memory location 0x0019E4B4.

 

[edit2]

Getting closer. It appears to be the way I am creating the texture.

Exception thrown at 0x7522B802 in OOT Engine Q3.exe: Microsoft C++ exception: _com_error at memory location 0x00A5DB2C.
D3D11 ERROR: ID3D11DeviceContext::CreateVideoProcessorOutputView: Resource must have bind flag D3D11_BIND_RENDER_TARGET - was created with 8! [ STATE_CREATION ERROR #3145933: CREATEVIDEOPROCESSOROUTPUTVIEW_INVALIDBIND]

I'll adjust some things and let you know.

 

[edit3]

Nailed it!

I was previously creating the texture as a dynamic texture, thinking ahead about having to copy the video frame back. But that wasn't needed: it has to be D3D11_USAGE_DEFAULT and be bound as a render target.

Code for future reference, just in case I fall into the same trap down the track.

	// Create the destination texture for TransferVideoFrame
	// (default usage, render target + shader resource; note that
	// D3D11_USAGE_DEFAULT textures must have CPUAccessFlags = 0)
	D3D11_TEXTURE2D_DESC textureDesc = { 0 };
	textureDesc.Width = 720;
	textureDesc.Height = 576;
	textureDesc.MipLevels = 1;
	textureDesc.ArraySize = 1;
	textureDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
	textureDesc.SampleDesc.Count = 1;
	textureDesc.Usage = D3D11_USAGE_DEFAULT;
	textureDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
	textureDesc.CPUAccessFlags = 0;
	textureDesc.MiscFlags = 0;

Thanks for the assistance guys :)

Edited by lonewolff

