Archived

This topic is now archived and is closed to further replies.

Barn Door

Modes which don't support hardware

Recommended Posts

Hi,

It says in the doc that you need to query an adapter for the modes (resolution, bit format) it supports. Then, if you want to use one of these modes with hardware acceleration, you need to query to see whether the mode supports hardware acceleration.

Now, assuming I haven't misinterpreted this, why would some display modes of an adapter not support hardware acceleration? What are they good for?

Thanks,

BD.

[edited by - Barn Door on September 2, 2002 9:17:06 PM]

More like the feature required is not as good or powerful.

Kinda like 1280 x 1024 on a 4 MB card may not support hardware acceleration for anything.

It's kinda weird, actually, why some cards support high resolutions with no acceleration. Some features like z-buffering and stencil buffers may not run at certain resolutions.

Its my duty, to please that booty ! - John Shaft

Hi,

Thanks for your reply.

quote:

More like the feature required is not as good or powerful.




Hmmm, for the return value of CheckDeviceType it says...

'If the device can be used on this adapter, D3D_OK is returned.'

Otherwise it's just errors, i.e. yes or no.

BD.

quote:

It's kinda weird, actually, why some cards support high resolutions with no acceleration. Some features like z-buffering and stencil buffers may not run at certain resolutions.



Because a resolution of 1024x768 needs more memory than, say, 640x480 (because there are more pixels to put information in) to obtain hardware acceleration. You need memory for the back buffer, the front buffer, and the z-buffer, and if your card has 4 MB, then all that memory might add up to more than 4 MB at 1024x768, but less than 4 MB at 640x480 (therefore allowing hardware acceleration).

Hope that made sense (I'm not too sure myself on this).




"We call em 'natural disasters' but 'he' (or she?) calls them memory leaks!!"
Al
** MY HQ**

[edited by - alfmga on September 3, 2002 11:50:35 AM]

Yeah, the reasons have been mentioned - mainly memory consumption. Also, cards like the older Voodoos only support rendering in 16-bit screen modes, for example, although your desktop can run in 24- or 32-bit mode without problems.

It's loads of wacky stuff like this that you have to query for, basically.

- JQ
Full Speed Games. Coming soon.

Cheers for the response.

I'm still confused though.

When checking to see if the adapter supports a given pixel format using CheckDeviceType, one passes in two formats: one for the back buffer and one for the display.

Why would the back buffer ever be of a different format to the display?

BD.



It's really simple.

1024x768 has 786,432 pixels on screen.
640x480 has 307,200 pixels on screen.

So your application would use up more memory filling 786,432 pixels than it would filling 307,200. Maybe that helped?

and
quote:

Why would the back buffer ever be of a different format to the display?



I don't think it ever would; I can't see a reason for it. You can, however, make a back buffer with an alpha value in it. Say your display mode is X8R8G8B8 (32-bit, no alpha); then you can have a back buffer that is 32-bit with 8-bit alpha, i.e. A8R8G8B8.

I can't seem to find a reason for having alpha in the back buffer, but it's there for you to use.
...enjoy
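For what it's worth, the call for that alpha-back-buffer case would look something like this (a sketch against the DirectX 8 SDK, not compiled here; `pD3D` is assumed to be an `IDirect3D8*` obtained from `Direct3DCreate8`):

```cpp
// Ask whether a HAL device can drive an X8R8G8B8 display while
// rendering to an A8R8G8B8 (alpha-carrying) back buffer, full screen.
HRESULT hr = pD3D->CheckDeviceType(
    D3DADAPTER_DEFAULT,  // adapter ordinal
    D3DDEVTYPE_HAL,      // hardware-accelerated device type
    D3DFMT_X8R8G8B8,     // display (front buffer) format
    D3DFMT_A8R8G8B8,     // back buffer format, with 8-bit alpha
    FALSE);              // FALSE = full-screen
// SUCCEEDED(hr) means the combination can be hardware accelerated.
```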






"We call em 'natural disasters' but 'he' (or she?) calls them memory leaks!!"
Al
**MY HQ**

I think my original post might have been a little misleading because I didn't know what I was talking about.

I don't think it makes sense that some modes are not valid for hardware acceleration just because they have a higher resolution and thus take more memory.

The type of call to CheckDeviceType in question is only passing in a pixel format, to see if the pixel format is valid.

My default video card can work with D3DFMT_R5G6B5 at screen dimensions of, say, 1600x1200. That's 1,920,000 16-bit pixels.

There are other 16-bit modes in the DirectX doc, e.g. D3DFMT_X1R5G5B5, but even at a resolution of 320x200, which is only 64,000 16-bit pixels, it doesn't work.

So, in the second case, far less memory, yet it doesn't work whilst the other does.

I don't see what the format is being validated against in a call to CheckDeviceType, and thus why it and any modes which support it might be rejected.

It doesn't seem to make sense that it's a simple case of "does this card support this format", because in the SDK sample, the display modes of the adapters are enumerated and an array of pixel formats is built up. These formats come from the card! So why would you then need to check them in a call to CheckDeviceType?

So maybe one is just checking if the format off the card is compatible with a DirectX HAL device? However, I copied a format name out of the DirectX doc which I knew my card didn't support, ran it by CheckDeviceType, and it returned an error saying that it wasn't supported!

BD.



[edited by - Barn Door on September 3, 2002 7:46:17 PM]

quote:

There are other 16-bit modes in the DirectX doc, e.g. D3DFMT_X1R5G5B5, but even at a resolution of 320x200, which is only 64,000 16-bit pixels, it doesn't work.


X1R5G5B5 is really a 15-bit mode, and the reason it doesn't work is probably because that format is not supported by your card, whereas R5G6B5 is. So if a format is completely unsupported, then CheckDeviceType will fail because the card doesn't even have the format you're testing for. Check the box your card came in; it should have a list of formats/resolutions that *it* supports. Take note of _it_, because what your gfx card supports is not necessarily what can be used. In my case, my gfx card supports modes up to 2048x1536, but I can only use modes up to 1024x768 because I have a crappy monitor. BLAST!!

quote:

These formats come from the card! So why would you then need to check them in a call to CheckDeviceType ?



To make sure that the *specific* format (+resolution) can use hardware acceleration or not.



"For us it's a natural disaster....For him/she it's a damn memory leak..."
Al
**MY HQ**

quote:
Original post by Barn Door
I don't see what the format is being validated against in a call to CheckDeviceType, and thus why it and any modes which support it might be rejected.

It doesn't seem to make sense that it's a simple case of "does this card support this format", because in the SDK sample, the display modes of the adapters are enumerated and an array of pixel formats is built up. These formats come from the card! So why would you then need to check them in a call to CheckDeviceType?



I'll take a wild shot in the dark with this... perhaps when the display modes are enumerated, ALL the modes are returned, even the ones that can't use hardware acceleration, for reasons mentioned above.
Also, some cards just don't support all the modes, and X1R5G5B5 might have been one of them....
But I'm probably wrong.



Get busy livin' or get busy dyin'... - Shawshank Redemption

[edited by - Syrillix on September 3, 2002 8:01:52 PM]

Thanks for your reply.

quote:

To make sure that the *specific* format (+resolution) can use hardware acceleration or not.



But the call to CheckDeviceType doesn't involve any mention of resolution! It only mentions a pixel format in the 3rd and 4th parameters.

Anyone think that this has got nothing to do with resolution, and nothing to do even with modes, and that actually the doc is WRONG?

Maybe, instead of...

'If required, the application checks for the presence of hardware acceleration in each enumerated display mode by calling IDirect3D8::CheckDeviceType,'

it should be...

'If required, the application checks for the presence of hardware acceleration in each enumerated FORMAT by calling IDirect3D8::CheckDeviceType,'

So CheckDeviceType just works out whether the given format is compatible with the card AND the device type.
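On that reading, the usual enumeration pattern would look something like this (a sketch assuming the DirectX 8 SDK headers; error handling and duplicate-format filtering omitted):

```cpp
// Enumerate every display mode the adapter reports, then ask whether a
// HAL device can render in each mode's format.
IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
UINT count = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT);
for (UINT i = 0; i < count; ++i) {
    D3DDISPLAYMODE mode;
    d3d->EnumAdapterModes(D3DADAPTER_DEFAULT, i, &mode);
    // Passing the mode's format as both display and back buffer format
    // asks: can a HAL device render in this format at all? Note that
    // only mode.Format matters here, not mode.Width or mode.Height.
    HRESULT hr = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      mode.Format, mode.Format, FALSE);
    // SUCCEEDED(hr): this format is usable with hardware acceleration.
}
d3d->Release();
```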

BD.

[edited by - Barn Door on September 3, 2002 8:30:09 PM]

First, this is from the SDK docs:

quote:

IDirect3D8::CheckDeviceType
Verifies whether or not a certain device type can be used on this adapter and expect hardware acceleration using the given formats.



I've underlined all the necessary bits.

quote:

But the call to CheckDeviceType doesn't involve any mention of resolution! It only mentions a pixel format in the 3rd and 4th parameters.



I put "+resolution" in brackets because I'm not sure about it. I would assume that DirectX knows which resolution the computer is in *after* you create the IDirect3D interface. But again, I'm not sure of this; just trying to help out.

quote:

'If required, the application checks for the presence of hardware acceleration in each enumerated display mode by calling IDirect3D8::CheckDeviceType,'

it should be...

'If required, the application checks for the presence of hardware acceleration in each enumerated FORMAT by calling IDirect3D8::CheckDeviceType,'



I guess you're right. Send it in to MS. But this is what is usually done. I myself don't check each format individually, because it would just require extra code in my for loop, whereas checking each display mode doesn't require any extra code, and it doesn't make a difference (to the best of my knowledge, that is).




"For us it's a natural disaster....For him/she it's a damn memory leak..."
Al
**MY HQ**
