fireup6

[D3D9] Converting ARGB surface to NV12 surface



I'm trying to use hardware encoding on an NVIDIA card using NVENC. I've successfully hooked DirectX and intercepted the EndScene call, so I can get my hands on the backbuffer, which is in ARGB format.

The problem is that NVENC only supports resources in the NV12 format. This isn't mentioned in the official SDK documentation, but it is clear from various PowerPoint slides and from the SDK sample source code itself. In the SDK source, the surface used as the NVENC input is created like so:

    IDirectXVideoProcessorService *directx_services;
    DXVA2CreateVideoService(d3ddev, IID_PPV_ARGS(&directx_services));
    // Create the NV12 surface through DXVA2 rather than through the D3D9 device itself.
    directx_services->CreateSurface(desc.Width, desc.Height, 0, (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2'),
                                    D3DPOOL_DEFAULT, 0, DXVA2_VideoProcessorRenderTarget, &NV12_surface[0], NULL);

I wasn't even aware that D3D9 supported the NV12 format, so I'm already not sure what's going on here. Nevertheless, the surface is created successfully and can be mapped in NVENC.

So the problem now is how to convert the ARGB backbuffer data into NV12. This has to be done on the GPU, without resorting to copying this data to system memory.

The strategy I'm thinking of is: copy the backbuffer to its own texture, change render targets to my NV12 target, set a pixel shader, and draw the saved backbuffer texture onto the NV12 render target (can this even be done?), so that in my pixel shader I can compute the YUV components from the sampled RGB. Then I'd have a filled NV12 surface to use as my input to the encoder.
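For reference, this is roughly the per-pixel math I imagine the shader doing - just a sketch, assuming the integer-approximated BT.601 limited-range formulas (the encoder might want BT.709 or full range instead), written as scalar C++ for illustration rather than actual shader code:

    // Illustrative only: BT.601 RGB -> YUV, integer approximation.
    // A pixel shader would do the equivalent in floating point per fragment,
    // and NV12 keeps full-resolution Y but only one U/V pair per 2x2 block.
    struct Yuv { int y, u, v; };

    static Yuv RgbToYuv601(int r, int g, int b)   // r, g, b in [0, 255]
    {
        Yuv out;
        out.y = (( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16;
        out.u = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128;
        out.v = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
        return out;
    }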

It sounds "somewhat" correct, but I'm confused as to how the pixel shader would work across the two different formats? What the pixel shader would think is one pixel on the NV12 render target, would actually be 4 Y components, at least  for 66% of the data?

Any pointers in the right direction would be appreciated. Thanks!

In general, pixel shaders don't have to be aware of the output format - the GPU handles the conversion from the PS's floating-point outputs to the target texture format. That said, I've not used NV12 and don't even know whether you can use it as a render target.

Another option to try would be the StretchRect function, which is a fairly high level request to the driver for pixel data to be copied and format-converted.
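Something along these lines - untested sketch, assuming 'nv12_surface' is the DXVA2-created surface and 'd3ddev' the hooked device from your snippet:

    IDirect3DSurface9 *backbuffer = NULL;
    d3ddev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backbuffer);

    // Ask the driver to copy and format-convert the whole backbuffer into NV12.
    HRESULT hr = d3ddev->StretchRect(backbuffer, NULL, nv12_surface, NULL, D3DTEXF_NONE);
    // D3DERR_INVALIDCALL means the driver refuses this source/target format pair.
    backbuffer->Release();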

As for D3D9 supporting NV12 - it doesn't... But if you pass "invalid" D3DFORMAT values to D3D, it just passes them on to the driver! Every driver takes advantage of this to implement "extensions" to D3D :)
There are a lot of magic values, often FOURCC codes, that different drivers will accept.
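For example, apps usually create surfaces in these driver-extension formats with CreateOffscreenPlainSurface rather than CreateTexture - untested sketch, reusing 'd3ddev' and 'desc' from your snippet, and whether it succeeds depends entirely on the driver:

    IDirect3DSurface9 *nv12_plain = NULL;
    // The runtime forwards unrecognised FOURCC formats straight to the driver.
    HRESULT hr = d3ddev->CreateOffscreenPlainSurface(
        desc.Width, desc.Height,
        (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2'),
        D3DPOOL_DEFAULT, &nv12_plain, NULL);
    // Failure just means this particular driver doesn't expose NV12 this way.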


Unfortunately StretchRect returns D3DERR_INVALIDCALL. The MSDN documentation for StretchRect says:

  • Stretching supports color-space conversion from YUV to high-precision RGBA only. Since color conversion support is not supported by software emulation, use IDirect3D9::CheckDeviceFormatConversion to test the hardware for color conversion support.

I tried CheckDeviceFormatConversion between D3DFMT_A8R8G8B8 and (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2') and got D3DERR_NOTAVAILABLE, so it seems this isn't something that can be done simply by using StretchRect, as great as that would have been.
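For reference, the query looks roughly like this (assuming 'd3d9' is the parent IDirect3D9 interface the device was created from):

    // Per the MSDN note above, StretchRect only converts YUV -> RGB,
    // so the RGB -> NV12 direction is expected to be unavailable.
    HRESULT hr = d3d9->CheckDeviceFormatConversion(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A8R8G8B8,                             // source: backbuffer format
        (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2'));  // target: NV12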



Hm, I tried using IDirectXVideoProcessorService::CreateSurface() to create an A8R8G8B8 surface instead of an NV12 surface, to see if I could SetRenderTarget to that. Creation of the surface succeeds, but setting it as a render target fails. So it seems that regardless of the format of the surface I create with that method, I can't render to it.

I'd like to instead use the straightforward d3device->CreateTexture() method to create a texture and then get its surface. I can create an A8R8G8B8 texture/surface this way with no problem and render to it. But if I try creating a texture/surface with (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2'), it fails.

So with IDirectXVideoProcessorService::CreateSurface() I can create surfaces in the formats I need, but I can't set them as render targets. With d3device->CreateTexture() I can't create the formats I need, but the formats I can create, I can render to.

Can anyone explain why I can create NV12 surfaces with IDirectXVideoProcessorService but not with d3device->CreateTexture()? Is there any other way to create NV12 surfaces?
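One thing I might also check (just a guess on my part, assuming 'd3d9' is the parent IDirect3D9 interface): whether the driver reports NV12 as usable with render-target usage at all, independently of surface creation:

    HRESULT hr = d3d9->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                              // adapter/display format
        D3DUSAGE_RENDERTARGET,                        // the capability in question
        D3DRTYPE_SURFACE,
        (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2'));
    // D3DERR_NOTAVAILABLE would suggest the DXVA2 surface is only a valid target
    // for the video processor, not for the 3D pipeline's SetRenderTarget.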

