DirectX 8, And Older VooDoo Cards

I just got the DirectX 8 SDK from Microsoft. I installed it and tried out some of the sample apps. Everything runs, but the D3D programs won't detect that my Voodoo card has ANY 3D acceleration, and they fall back to software rendering, ignoring my card.

My system:
True Intel 166MHz Pentium processor
32MB RAM
Original Voodoo 3D accelerator, with the most up-to-date drivers (that I can find)

I've also been using the DX 6.1 SDK up until now, and it used the card fine. (I'd get approximately 60 fps on each of the samples; now I can only get approximately 0.25 fps.) Any help would be appreciated!

Rook

I believe it's because your drivers do not support DirectX 8. You will have to contact your card manufacturer for drivers that do.

The problem with that is that my card is so old I can't get any drivers newer than the ones 3Dfx released... and they just went bust.

The only other alternative I can see (if you are right) is that I need a new 3D Card.

If so, do you have any suggestions? (I was planning for an upgrade anyway)

Rook

I'm in exactly the same boat, actually. I have an Orchid Righteous and have looked everywhere for drivers, to no avail. I think a new 3D card will probably be the only option, but I don't intend to spend the money.

There is, apparently, the possibility of writing a driver for DirectX 8 that can use your card's hardware (there is something about it in the documentation, but I can't remember much about it), but this would probably be quite difficult.

I'd like to see a list of cards which have drivers that work with DX8.

Hmmm,

My Voodoo2 (with the DirectX 7 driver) is working alright with DX8, er, I think. I have the MW4 demo, and it runs the first time, but on subsequent runs it says something to the effect that I should get updated drivers. So I mess about with its registry settings so it thinks it's starting up for the first time again... and I'm ready for another game.

My Righteous 3D II has been living off 3Dfx's reference drivers since Righteous/Micronics disappeared into that Diamond place. And now, with 3Dfx itself on the cliff edge (if not already over it), it looks like there'll be no new Voodoo drivers.

My V2's days of compatibility (and usefulness) are at an end, I fear (never mind your original Voodoo).

Ro_Akira

TwoFlower:

Yeah, a list of cards with compatible drivers would be useful. I just wonder if anyone would be willing to put one together...

Ro_Akira:

I guess it's time to open up my wallet and buy a new card that is DX8 compatible. Oh well...

I guess I'll stick with trying to figure out the DX8 2D implementation and work from there. (Heh... by the time I figure it out, I'll have the new card.)

Thanks for all the replies

Rook

Ro_Akira:

Are you sure it's DX8 compatible? Does MW4 (whatever that is) actually use the new DX8 stuff? I can run DX7 stuff using the old HAL fine, but DX8 only gives me a reference driver to use, which is waaaaaaaaaaaay slow.

Rook:

Truth be told, I'd buy a whole new computer. Sadly, when I open my wallet, I find it empty. Awww...


TwoFlower:

Well, the requirements state that it needs DX8. As far as I know (and I could be wrong), once an app is compiled to use a version of DX, it needs at least that version (or, even if a fallback is possible, I doubt the guys who made MW4 would bother with the extra effort).
On a side note, it's just the DX8 runtime I have, BTW; I'm still using the DX7 SDK, partly because it takes ages to download the DX8 SDK.

Ro_Akira

Seeing as we're all Voodoo people here, I thought maybe you could help with this (from another post of mine):

I've determined that my video cards use this format in 16-bit:

Card          First byte / Second byte
S3 VirgeGX    XGG BBBBB  / RRRRR GGG
Voodoo2       GGG BBBBB  / RRRRR GGG

Now, I believe these are BGR cards (the S3 seems to be, in 24-bit anyway, and somebody mentioned in another post of mine that the Banshee is BGR, and it's a derivative of the V2?). But I thought BGR would be something like:

BBBBB GGG/GGG RRRRR.

Am I missing something here? Is there anywhere you can get this kind of information on video cards (I've tried manufacturer websites), preferably online?

Ro_Akira

I would say BGR is in that order, but it depends on how you look at the bytes. I seem to remember it being RGB on my card, but it's been a long time since I played with that stuff. Why do you need to know, anyway? Is it just curiosity, or are you going to use it in an implementation? Because I wouldn't hard-code my program to use any specific setup; I'd get it from the DirectX surface caps. They give a mask of which bits refer to red, green and blue, IIRC.
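
Something like this (just a rough sketch, DirectDraw 7 style since that's what I know; 'surf' is assumed to be a valid, already-created surface) will pull the masks straight out of the surface instead of guessing:

#include <windows.h>
#include <ddraw.h>
#include <stdio.h>

// Sketch only: ask the surface which bits hold R, G and B instead of hard-coding them.
void DumpPixelFormat(IDirectDrawSurface7 *surf)
{
    DDPIXELFORMAT pf;
    ZeroMemory(&pf, sizeof(pf));
    pf.dwSize = sizeof(pf);

    if (SUCCEEDED(surf->GetPixelFormat(&pf)) && (pf.dwFlags & DDPF_RGB))
    {
        printf("bits per pixel: %lu\n", pf.dwRGBBitCount);
        printf("red mask   : 0x%08lX\n", pf.dwRBitMask);   // 0xF800 on a 5-6-5 card
        printf("green mask : 0x%08lX\n", pf.dwGBitMask);   // 0x07E0
        printf("blue mask  : 0x%08lX\n", pf.dwBBitMask);   // 0x001F
    }
}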

I want this info because I want to know which way I should order the bits in my own graphics format.

for (Y = 0; Y < sizey; Y++)
{
    for (X = 0; X < sizex; X++)
    {
        // Currently RGB -- lpd is a byte pointer into the locked surface

        // First byte
        *lpd = 0;
        lpd++;
        byte_offset++;

        // Second byte
        *lpd = 224;
        lpd++;
        byte_offset++;
    }
    lpd += dyoffset;   // dyoffset: presumably the surface pitch minus the bytes written this row
}


This test code gives pure red. (I've also tested pure blue and pure green.) This is where I'm getting the GGG BBBBB/RRRRR GGG format from.

Ro_Akira

You shouldn't assume that this format will remain the same, particularly if you want it to work on more than one type of graphics card. Ideally you should check the surface format and reformat your images when they are loaded.
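
To flesh that idea out a bit, here's a rough sketch of converting at load time (MaskToShiftBits and PackPixel are made-up helper names, not DirectX calls): turn each channel mask from the surface's DDPIXELFORMAT into a shift and a width, then repack your 8-bits-per-channel source pixels through those, whatever the card's layout turns out to be.

#include <windows.h>
#include <ddraw.h>

// Hypothetical helper: find where a channel's mask starts and how many bits it covers.
static void MaskToShiftBits(DWORD mask, int *shift, int *bits)
{
    *shift = 0;
    *bits  = 0;
    while (mask && !(mask & 1)) { mask >>= 1; (*shift)++; }
    while (mask & 1)            { mask >>= 1; (*bits)++;  }
}

// Hypothetical helper: repack an 8-8-8 colour into whatever 16-bit layout the
// surface reports (5-6-5, 5-5-5, BGR ordering -- it all falls out of the masks).
static WORD PackPixel(BYTE r, BYTE g, BYTE b, const DDPIXELFORMAT *pf)
{
    int rs, rb, gs, gb, bs, bb;
    MaskToShiftBits(pf->dwRBitMask, &rs, &rb);
    MaskToShiftBits(pf->dwGBitMask, &gs, &gb);
    MaskToShiftBits(pf->dwBBitMask, &bs, &bb);

    return (WORD)(((r >> (8 - rb)) << rs) |
                  ((g >> (8 - gb)) << gs) |
                  ((b >> (8 - bb)) << bs));
}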

My EnumerateDirectDrawDevices() function reports the positions of the R, G and B bits on my Voodoo2 as:

R   G   B
11  5   0

This flies in the face of the GGG BBBBB/RRRRR GGG result, obviously. But then how does the code above do what it does? (See the diagram below.)

This is showing what's being put on the surface (those binary '1's make up the 224). Note how they seem to match the GGG BBBBB/RRRRR GGG 'theory':

             byte 1                      byte 2
bit   15 14 13 12 11 10  9  8  |  7  6  5  4  3  2  1  0
bin    0  0  0  0  0  0  0  0  |  1  1  1  1  1  0  0  0

Ro_Akira

224 is 1110 0000 in binary. Also, you are writing it to the second byte, which, I think, is where the reds live (Intel is byte-reversed).
Try writing 31 (0001 1111) and see if it comes out blue.
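
A quick way to see the byte reversal for yourself (assuming an Intel/little-endian machine, which is what we're all on here):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    WORD pixel = 0xF800;            // full red in 5-6-5: RRRRR GGGGGG BBBBB
    BYTE *p = (BYTE *)&pixel;

    // On a little-endian (Intel) CPU this prints "00 F8": the low byte
    // (GGG BBBBB) sits first in memory and the RRRRR GGG byte comes second,
    // so byte-at-a-time writes hit the red byte second.
    printf("%02X %02X\n", p[0], p[1]);
    return 0;
}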

Aha! You're right. 1110 0000 does make 224.

(Just to clear this up: Intel bytes are LSByte first (in, say, a 16-bit value), and the bits go MSBit first (from left to right)? Also, I'm writing a byte at a time (instead of WORDs), though this shouldn't matter...)

Writing 31 to the first byte does make it blue. And writing 248 (which is 1111 1000) to the second makes it red. I'm building up the picture now of...

XXXB BBBB/RRRRR XXX, where the X's are presumed to be the remaining greens... ugh! That looks familiar! I'm not drinking enough coffee or something...

This all leads me to this...
Is it true that when people talk about RGB (or presumably RRRRR GGG/GGG BBBBB), it in fact, because of Intel's byte ordering, ends up as GGG BBBBB/RRRRR GGG in, for example, a file?

Eh?

P.S. Thanks for this help!

Ro_Akira

Well, if you store it in a word it would be like this

RRRRRGGGGGGBBBBB

but because you are writing one byte at a time, you write like this

GGGBBBBB RRRRRGGG

That's because when you write a word to memory, the CPU automatically stores it the 'wrong' way round.

Stick to writing words (or, better still, DWORDs), as it takes the same time to write 32 bits as it does to write 8 bits (as long as the data is aligned).
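
For example, the fill loop from earlier could look something like this with whole WORDs (just a sketch: 'ddsd' is assumed to be the DDSURFACEDESC2 that Lock() filled in, and sizex/sizey are the same dimensions as before):

#include <windows.h>
#include <ddraw.h>

// Sketch: fill a locked 16-bit (5-6-5) surface with pure red, one WORD per pixel.
// The CPU takes care of the byte order for you.
void FillRed565(const DDSURFACEDESC2 *ddsd, int sizex, int sizey)
{
    for (int y = 0; y < sizey; y++)
    {
        WORD *row = (WORD *)((BYTE *)ddsd->lpSurface + y * ddsd->lPitch);
        for (int x = 0; x < sizex; x++)
            row[x] = 0xF800;        // RRRRR = 11111, G = 0, B = 0
    }
}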

Ah, 'excellent' (to quote a certain nuclear power plant owner).

Then it's that RRRRR GGGGGG BBBBB order I should be writing to my file, then (I assume that's how most 16-bit images are stored).

Thanks for clearing that whole sorry mess up!
Ro_Akira
