Hello, I'm trying to capture each frame of the renderer in YUV420P format for encoding. I don't really understand how to do this yet. Right now I'm getting the data in RGB format like this:
// initialize
d3d->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET, D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, &backBuffer, NULL);
backBuffer->GetSurfaceLevel(0, &backBufferSurface);
d3d->CreateOffscreenPlainSurface(width, height, D3DFMT_X8R8G8B8, D3DPOOL_SYSTEMMEM, &tempSurface, NULL);
// in render loop
d3d->GetRenderTargetData(backBufferSurface, tempSurface); // copy render target to system memory each frame
D3DLOCKED_RECT lr;
tempSurface->LockRect(&lr, NULL, D3DLOCK_READONLY);
// Do processing with lr.pBits
tempSurface->UnlockRect();
From what I understand from reading other threads, I can either:
- convert lr.pBits to Y, U, and V planes on the CPU, or
- use a pixel shader to convert the RGB texture to a YUV texture, then LockRect that to get pBits already in YUV
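For the CPU route, I think the conversion would be the usual BT.601 integer approximation, roughly like the sketch below. The function name and layout are my own guess, and I haven't validated it against an actual encoder; `pitch` is meant to be lr.Pitch in bytes, since the locked surface rows may be padded.

```cpp
#include <cstdint>

// Convert X8R8G8B8 pixels (as read from lr.pBits) to planar YUV420P.
// BT.601 "video range" integer approximation: Y in [16,235], U/V centered at 128.
// Assumes width and height are even (required for 2x2 chroma subsampling).
void RGBToYUV420P(const uint8_t* rgb, int width, int height, int pitch,
                  uint8_t* yPlane, uint8_t* uPlane, uint8_t* vPlane)
{
    for (int j = 0; j < height; ++j) {
        const uint8_t* row = rgb + j * pitch;
        for (int i = 0; i < width; ++i) {
            // X8R8G8B8 is stored little-endian, so memory order is B, G, R, X.
            int b = row[i * 4 + 0];
            int g = row[i * 4 + 1];
            int r = row[i * 4 + 2];

            yPlane[j * width + i] =
                (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);

            // Chroma is subsampled 2x2: one U/V sample per 2x2 block
            // (top-left pixel only; averaging the block would be nicer).
            if ((j % 2 == 0) && (i % 2 == 0)) {
                int idx = (j / 2) * (width / 2) + (i / 2);
                uPlane[idx] =
                    (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                vPlane[idx] =
                    (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
            }
        }
    }
}
```

I'd call this once per frame between LockRect and UnlockRect, writing into three preallocated buffers of size width*height, (width/2)*(height/2), and (width/2)*(height/2).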
I'm really not sure how to implement either of these methods. Could someone please enlighten me on how I would go about doing either of them, or suggest a better method (perhaps using DXVA)?