Play video using DX 9.0 in fullscreen

Started by
6 comments, last by Wyster 14 years, 1 month ago
I'm looking for a good way to play video files in my game, in full screen. I have my own graphics engine, but I have no good idea how to play video in fullscreen. I've looked around a lot on the internet, but I only find outdated info; it seems almost impossible to play video with DX 9.0 in fullscreen. I have managed to play an AVI file using DirectShow and the EVR, and it works great in windowed mode (I can even play the video fullscreen if the graphics engine is set to windowed mode). However, if I put both the graphics engine AND the video in fullscreen, nothing gets rendered. Or should I use something else that will work in parallel with my graphics engine?

Btw, I read that "true" fullscreen is no longer used by most applications and that you instead fake it by resizing the window to the screen size. I tried this but ran into a problem: there's a maximum size you can resize a window to (the screen size), and that's a few pixels too small (the borders of the window are still slightly visible at the sides). Any way to fix this?

//Tobias
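(Not from the thread, just a sketch of the usual fix for the visible-border problem: the border pixels come from the window's frame styles, so strip them before resizing. `hwnd` is assumed to be the engine's window handle.)

```cpp
// Sketch: switch an existing window to borderless "fake" fullscreen.
// Removing the caption/border styles first means the client area can
// cover the whole screen with no frame pixels showing.
void MakeBorderlessFullscreen(HWND hwnd)
{
    // Drop the decorated style in favour of a plain popup window.
    SetWindowLong(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);

    // Resize so the client area exactly matches the primary display.
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);
    SetWindowPos(hwnd, HWND_TOP, 0, 0, w, h,
                 SWP_FRAMECHANGED | SWP_NOOWNERZORDER);
}
```

With `WS_POPUP` there is no frame at all, so a window sized to exactly `SM_CXSCREEN` x `SM_CYSCREEN` fills the screen edge to edge.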
I find that if you get the frame data with DirectShow, you can copy it into a texture and then render that on anything you want. The hard part is getting at the frame data; everything else is easy, especially if you have written a graphics engine in Direct3D.
Hi Tobias,
What you should probably consider is using a custom allocator / presenter and VMR9. There are some standard DirectShow samples that should get you going. This will get the video coming into your engine as a dynamic texture you can then map to anything you like.

I found it all a bit obtuse to be honest, but managed in the end to integrate it into my own engine. And - assuming you have a PC with average sort of spec - playing full screen shouldn't be a problem.

Tim
Thanks for the tip about VMR9; it seems like exactly what I need.

However, while implementing this I've run into a major problem. I've done all the necessary steps (99% sure, anyway); the only thing I haven't done is create an array of surfaces. Instead, in the GetSurface() function I simply provide the backbuffer through Direct3DDevice->getRenderTarget().

Now, basically, what happens when I start this is that I get an annoying window asking me whether I want to use ffdshow or not. If no, the video gets rendered in a separate window; if yes, I get a grey window with nothing rendered. After some debugging I found that it's because PresentImage() is never called by the VMR. I don't have the faintest idea why.

Here's the relevant code:

Allocator class:

#include "Allocator.h"
#include "Graphics.h"

Allocator::Allocator(Graphics* gDevice)
{
	mGraphics = gDevice;
	mRefCount = 1;
}

Allocator::~Allocator()
{
	mSurfAllocNotify->Release();
	mSurfAllocNotify = NULL;
}

HRESULT STDMETHODCALLTYPE Allocator::InitializeDevice(DWORD_PTR dwUserID, VMR9AllocationInfo *lpAllocInfo, DWORD *lpNumBuffers)
{
	return S_OK;
}

HRESULT STDMETHODCALLTYPE Allocator::TerminateDevice(DWORD_PTR dwID)
{
	return S_OK;
}

HRESULT STDMETHODCALLTYPE Allocator::GetSurface(DWORD_PTR dwUserID, DWORD SurfaceIndex, DWORD SurfaceFlags, IDirect3DSurface9 **surface)
{
	if( surface == NULL )
		return E_POINTER;
	if(mGraphics->getRenderTarget(0, surface))
		return S_OK;
	else
		return E_FAIL;
}

HRESULT STDMETHODCALLTYPE Allocator::AdviseNotify(IVMRSurfaceAllocatorNotify9 *pIVMRSurfAllocNotify)
{
	if( pIVMRSurfAllocNotify == NULL)
		return E_POINTER;
	else
		mSurfAllocNotify = pIVMRSurfAllocNotify;
	return mSurfAllocNotify->SetD3DDevice(mGraphics->getDevice(), mGraphics->getDirect3D()->GetAdapterMonitor(D3DADAPTER_DEFAULT));
}

HRESULT STDMETHODCALLTYPE Allocator::StartPresenting(DWORD_PTR dwUserID)
{
	return S_OK;
}

HRESULT STDMETHODCALLTYPE Allocator::StopPresenting(DWORD_PTR dwUserID)
{
	return S_OK;
}

HRESULT STDMETHODCALLTYPE Allocator::PresentImage(DWORD_PTR dwUserID, VMR9PresentationInfo *presInfo)
{
	mGraphics->beginRender(true);
	mGraphics->renderGUI();
	mGraphics->endRender();
	return S_OK;
}

HRESULT STDMETHODCALLTYPE Allocator::QueryInterface(REFIID riid, void** ppvObject)
{
	HRESULT hr = E_NOINTERFACE;
	if(ppvObject == NULL)
		hr = E_POINTER;
	else if(riid == IID_IVMRSurfaceAllocator9)
	{
		*ppvObject = static_cast<IVMRSurfaceAllocator9*>( this );
		AddRef();
		hr = S_OK;
	}
	else if(riid == IID_IVMRImagePresenter9)
	{
		*ppvObject = static_cast<IVMRImagePresenter9*>( this );
		AddRef();
		hr = S_OK;
	}
	else if(riid == IID_IUnknown)
	{
		*ppvObject = static_cast<IUnknown*>(static_cast<IVMRSurfaceAllocator9*>( this ));
		AddRef();
		hr = S_OK;
	}
	return hr;
}

ULONG STDMETHODCALLTYPE Allocator::AddRef()
{
	return InterlockedIncrement(&mRefCount);
}

ULONG STDMETHODCALLTYPE Allocator::Release()
{
	ULONG ret = InterlockedDecrement(&mRefCount);
	if( ret == 0 )
		delete this;
	return ret;
}


Class implementing the filter graph:
#include "VideoPlayer.h"

VideoPlayer::VideoPlayer(Graphics* gDevice)
{
	mSurfAlloc		= new Allocator(gDevice);
	mGraph			= NULL;
	mMediaControl	= NULL;
	mMediaEvent		= NULL;
	mFilter			= NULL;
	mIsPlaying		= false;
}

VideoPlayer::~VideoPlayer()
{
	mGraph->Release();
	mMediaControl->Release();
	mMediaEvent->Release();
	mFilter->Release();
	mGraph			= NULL;
	mMediaControl	= NULL;
	mMediaEvent		= NULL;
	mFilter			= NULL;
	CoUninitialize();
}

bool VideoPlayer::init()
{
	IVMRFilterConfig9*				filterConfig = NULL;
	IVMRSurfaceAllocatorNotify9*	surfAllocNotify = NULL;

	HRESULT hr = CoInitialize(NULL);
	if(SUCCEEDED(hr))
		hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void**)&mGraph);
	else
	{
		MessageBox(0, L"CoCreateInstance FilterGraph Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = CoCreateInstance(CLSID_VideoMixingRenderer9, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, (void**)&mFilter);
	else
	{
		MessageBox(0, L"CoCreateInstance VMR9 Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mFilter->QueryInterface(IID_IVMRFilterConfig9, reinterpret_cast<void**>(&filterConfig));
	else
	{
		MessageBox(0, L"QueryInterface FilterConfig9 Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = filterConfig->SetRenderingMode( VMRMode_Renderless );
	else
	{
		MessageBox(0, L"SetRenderingMode failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mFilter->QueryInterface(IID_IVMRSurfaceAllocatorNotify9, reinterpret_cast<void**>(&surfAllocNotify));
	else
	{
		MessageBox(0, L"QueryInterface SurfaceAllocNotify Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = surfAllocNotify->AdviseSurfaceAllocator(0, mSurfAlloc);
	else
	{
		MessageBox(0, L"AdviseSurfaceAllocator Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mSurfAlloc->AdviseNotify(surfAllocNotify);
	else
	{
		MessageBox(0, L"AdviseNotify Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mGraph->AddFilter(mFilter, L"Video Mixing Renderer 9");
	else
	{
		MessageBox(0, L"AddFilter Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mGraph->QueryInterface(IID_IMediaControl, reinterpret_cast<void**>(&mMediaControl));
	else
	{
		MessageBox(0, L"QueryInterface MediaControl", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(SUCCEEDED(hr))
		hr = mGraph->QueryInterface(IID_IMediaEvent, reinterpret_cast<void**>(&mMediaEvent));
	else
	{
		MessageBox(0, L"QueryInterface MediaEvent", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	return true;
}

bool VideoPlayer::playFile(LPWSTR filename)
{
	HRESULT hr = mGraph->RenderFile(filename, NULL);
	if(SUCCEEDED(hr))
		hr = mMediaControl->Run();
	else
	{
		MessageBox(0, L"RenderFile Failed", L"Videoplayer error", MB_OK | MB_ICONERROR);
		return false;
	}
	if(FAILED(hr))
		return false;
	mIsPlaying = true;
	return true;
}

bool VideoPlayer::isPlaying()
{
	HRESULT hr = S_OK;
	LONG evCode, lparam1, lparam2;
	while(SUCCEEDED(hr))
		hr = mMediaEvent->GetEvent(&evCode, &lparam1, &lparam2, 0);
	mMediaEvent->FreeEventParams(evCode, lparam1, lparam2);
	if(evCode == EC_COMPLETE || evCode == EC_USERABORT)
		stop();
	return mIsPlaying;
}

void VideoPlayer::stop()
{
	FILTER_STATE fs;
	mMediaControl->GetState(5, (OAFilterState*)&fs);
	if(fs == State_Running)
	{
		mMediaControl->Pause();
		mMediaControl->StopWhenReady();
		mIsPlaying = false;
	}
}


Any help would be much appreciated.
I'm not sure exactly how your allocator class is being created (in particular, it's not clear what your "Graphics" class is), but from an initial quick look at your code, is your PresentImage allocator method doing the right thing? What does the renderGUI function do?

What PresentImage needs to do is take the internal texture rendered by the video stream and render it to a texture you can then use elsewhere in your engine. As far as I understand it, ideally nothing should be rendered to the screen during PresentImage - it's just a routine to access the current video texture.
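(Sketch only, not the poster's code: what a "copy to texture" PresentImage might look like. `mDevice` and `mTexSurface` are assumed members — the D3D device, and the level-0 surface of a render-target texture created elsewhere with D3DUSAGE_RENDERTARGET.)

```cpp
// Copy the frame the VMR just decoded into a private texture, instead
// of drawing to the screen here. The engine's normal render loop can
// then draw that texture whenever it likes.
HRESULT STDMETHODCALLTYPE Allocator::PresentImage(DWORD_PTR dwUserID,
                                                  VMR9PresentationInfo* presInfo)
{
    if (presInfo == NULL || presInfo->lpSurf == NULL)
        return E_POINTER;

    // lpSurf is the surface handed out by GetSurface, now holding the
    // current video frame; StretchRect moves it into our texture.
    return mDevice->StretchRect(presInfo->lpSurf, NULL,
                                mTexSurface, NULL, D3DTEXF_NONE);
}
```

This keeps presentation entirely in the engine's hands, which is what makes the video easy to composite with everything else.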

Again, I'd urge you to look at the sample, particularly at how it implements the PresentImage method.

I hope that helps! Good luck,
Tim

Unfortunately that is irrelevant to the problem: PresentImage is never even called by the VMR. Other functions like AddRef(), QueryInterface() and InitializeDevice() are called correctly, but for some reason the VMR stops calling after that.

Btw, I'm working on the same team as Tobias, in case you're wondering.
OK, fair point...

Next thing... I notice your allocator is not creating a surface in the InitializeDevice routine. I believe you should be creating a private surface there, using a call like:
lpAllocInfo->dwFlags = VMR9AllocFlag_OffscreenSurface;
m_lpIVMRSurfAllocNotify->AllocateSurfaceHelper(lpAllocInfo, lpNumBuffers, &surface);

This is the surface you return in your GetSurface routine.
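(A sketch of how that suggestion could look in the poster's InitializeDevice, assuming `mSurfaces` is a member `IDirect3DSurface9*` array and `mSurfAllocNotify` is the pointer stored in AdviseNotify, as in the code above.)

```cpp
// Sketch: let the VMR's helper allocate the surfaces it decodes into.
// Without this, the VMR has nowhere to put frames, which could explain
// why PresentImage is never reached.
HRESULT STDMETHODCALLTYPE Allocator::InitializeDevice(DWORD_PTR dwUserID,
                                                      VMR9AllocationInfo* lpAllocInfo,
                                                      DWORD* lpNumBuffers)
{
    if (lpAllocInfo == NULL || lpNumBuffers == NULL)
        return E_POINTER;

    // Request plain offscreen surfaces; the helper fills mSurfaces and
    // may lower *lpNumBuffers to what it actually allocated.
    lpAllocInfo->dwFlags |= VMR9AllocFlag_OffscreenSurface;
    return mSurfAllocNotify->AllocateSurfaceHelper(lpAllocInfo,
                                                   lpNumBuffers,
                                                   mSurfaces);
}
```

GetSurface then just returns `mSurfaces[SurfaceIndex]` (with an AddRef), and TerminateDevice releases them.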

I could be wrong, but not creating a surface might be the reason you're not getting a call to PresentImage.

Once you do get calls to PresentImage it's in there that you'll need to blit from this new private surface into your own texture to be rendered later.

HTH, Tim
Actually, I'm trying to avoid creating a new surface (this could be what's causing the error, but I kind of want confirmation before I try something completely new).
In the GetSurface() function I simply supply the current backbuffer; since that one is already created, I shouldn't have to create another surface. The video is supposed to be the only thing rendered to the screen, so using textures or new surfaces seems like overkill.

This topic is closed to new replies.
