DirectShow Questions
Hey all
I'm trying to make a webcam application. Originally I thought I should grab individual frames from the camera, compress them, and then send them. But doing that produces far too much data: a 160x120 JPEG-compressed image is about 5KB, which works out to roughly 2FPS if I'm sending over a typical internet connection.
I thought I should probably be using MJPEG, since that's exactly what it's designed for. But how do I read the data it produces? I'd like to add an MJPEG compressor to my graph, somehow read its output, and send it through a socket to be read at the other end.
Is there a way to render to a memory buffer or something in DirectShow? I know I can render to a preview window, a file, or to a NullRenderer, but I can't find any documentation on rendering to a buffer. Is it possible?
Cheers,
Steve
EDIT: This should maybe be in the DirectX forum. Mods feel free to move it.
[Edited by - Evil Steve on September 13, 2004 1:01:47 PM]
The data is already delivered to you in a buffer; you need to write two filters, a socket sink and a socket source.
I see. I assume I'll have to write these filters, they don't already exist in DirectShow? I can't find any reference to them in the docs anyway. I'm having a look around the MSDN for information about writing a custom filter at the moment.
Cheers,
Steve
Why don't you use the AVIFile API?
It has compression, can capture too, and is low-level enough
to be very flexible, but high-level enough to be implemented
in a few minutes.
Have a look in your MSDN library.
Quote:Original post by Martin Foerster
Why don't you use the AVIFile API?
It has compression, can capture too, and is low-level enough
to be very flexible, but high-level enough to be implemented
in a few minutes.
Have a look in your MSDN library.
I had a look at it before, but I couldn't find any functions to read from a capture device, only from a file.
Quote:Original post by Evil Steve
I see. I assume I'll have to write these filters, they don't already exist in DirectShow? I can't find any reference to them in the docs anyway. I'm having a look around the MSDN for information about writing a custom filter at the moment.
Sorry, my grammar was horrendous; yes, you need to implement these two filters yourself, or find generic ones already written by someone else.
Ok, thanks.
Does anyone have any links to tutorials for this stuff? The MSDN is confusing at best, and I'm not entirely sure about all the COM stuff. I understand that I need to register a class factory with COM, and that factory should create my filters - but I'm not sure how to register my stuff with COM, or if I need the 7 or 8 Co*() methods in my DLL...
I've managed to get something working with the help of the code from the base classes (but I'm writing my own classes, not using them).
I can't get COM to realise that my class factories are actually in the DLL. I've tried using CoRegisterClassObject(), but that doesn't seem to work. I don't really want to have to add my filters to the registry. Anyway, it seems to work if I sidestep COM's initialisation and just do IBaseFilter* pFilter = new CSocketSink(). Which I'm sure is horrible, but it's working for now.
The problem I'm having at the moment is that I don't know exactly what I should have for my media type. The camera gives a media type of MEDIATYPE_Video and MEDIASUBTYPE_RGB24, with the format set to video, and the VIDEOINFOHEADER set up accordingly.
Whenever I call RenderStream(), I get error code 0x80040217 (VFW_E_CANNOT_CONNECT). If I put breakpoints everywhere, I can see CEnumMediaTypes::Next() being called 11 times. I return only one media type, which is exactly the same as the one that's returned by the camera.
Anyone know what's wrong here?
Cheers,
Steve
This topic is closed to new replies.