
software pixel shaders


Uthman    485
I've never done DX/D3D programming with pixel shaders, but I have a general idea of how they work. I ask this question without regard to speed or realtime ability: have any of you ever written (or seen) a pixel shading engine in software? I find it easier to use technologies when I know how they work from the inside out.
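Since the question is about how pixel shading works from the inside, here is a minimal sketch of the core idea in C++: the "shader" is just a plain function a renderer calls once per pixel with interpolated inputs. All names here are invented for illustration; a real engine would add a triangle rasterizer, perspective-correct interpolation, and texture sampling.

```cpp
// A minimal sketch, not any real engine's API: the "pixel shader" is a
// plain C++ function evaluated once per pixel over a full-screen pass.
#include <cstdint>
#include <vector>

struct Color { float r, g, b; };

// The "shader": u and v play the role of interpolated texture coordinates.
static Color checkerShader(float u, float v) {
    bool on = (static_cast<int>(u * 8.0f) + static_cast<int>(v * 8.0f)) & 1;
    float c = on ? 1.0f : 0.2f;
    return { c, c, c * 0.5f };
}

int main() {
    const int W = 256, H = 256;
    std::vector<uint32_t> framebuffer(W * H);

    // The "pipeline": walk every pixel, run the shader, pack to X8R8G8B8.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Color c = checkerShader(x / float(W), y / float(H));
            uint32_t r = static_cast<uint32_t>(c.r * 255.0f);
            uint32_t g = static_cast<uint32_t>(c.g * 255.0f);
            uint32_t b = static_cast<uint32_t>(c.b * 255.0f);
            framebuffer[y * W + x] = (r << 16) | (g << 8) | b;
        }
    }
    return 0;
}
```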

davepermen    1047
on the other hand, it renders faster than a gfFX 5200! :D at least on this P4 2.8 GHz with HT and an 800 MHz FSB :D

If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

davepermen.net

Guest Anonymous Poster
not really... a 2.4 GHz P4 with an 800 MHz FSB will easily blow any geforce or ati card out of the water

aboeing    180
I find this difficult to believe. Why would anyone buy a graphics card if software is faster? davepermen: could you provide some performance comparisons for identical scenes?

billybob    134
of course, barely anyone has a P4 2.4, and that wouldn't leave much CPU for everything else a game needs. i wouldn't imagine physics would fit alongside that well.

vember    122
quote:
Original post by Anonymous Poster
not really... a 2.4 GHz P4 with an 800 MHz FSB will easily blow any geforce or ati card out of the water


maybe it would put on a glorious show of hardware bravado, but it won't be able to match the performance of a proper card under the same conditions.

Guest Anonymous Poster
Note that davepermen only mentioned the FX 5200, not the 5900.
If you do software rendering on the CPU, you'll need to do your physics and AI on the GPU.

aboeing    180
well, it's usually cheaper just to buy two CPUs than a new graphics card, plus that gives you extra headroom for other apps as well..
anyway, this has made me rethink a project that i WAS going to start, which focused on GPU hardware acceleration; but if optimized code can achieve similar performance now, then i probably won't bother...
i would really like to see a comparison, because on my system the software shader is significantly slower than hardware. i suppose it depends on the shader's complexity..
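For what it's worth, the software side of such a comparison can be put in numbers with a raw fill-rate loop. A self-contained sketch follows; the trivial shade() stand-in, the resolution, and the frame count are all arbitrary assumptions, and the hardware side would need an equivalent scene rendered through D3D:

```cpp
// Times a per-pixel function over a full frame and reports fps and
// megapixels/s. shade() is a placeholder for a real software shader.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <vector>

static inline uint32_t shade(int x, int y) {
    // Trivial stand-in "pixel shader"; any per-pixel function goes here.
    return static_cast<uint32_t>((x * x + y * y) & 0xFF) * 0x010101u;
}

int main() {
    const int W = 1024, H = 768, frames = 100;
    std::vector<uint32_t> fb(W * H);

    auto t0 = std::chrono::steady_clock::now();
    for (int f = 0; f < frames; ++f)
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x)
                fb[y * W + x] = shade(x, y);
    auto t1 = std::chrono::steady_clock::now();

    double s = std::chrono::duration<double>(t1 - t0).count();
    std::printf("%.1f fps, %.1f Mpixels/s\n",
                frames / s, frames * double(W) * H / s / 1e6);
    return 0;
}
```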

davepermen    1047
actually, what i stated was a (real..) joke. the gfFX 5200 is so utterly slow at ps2.0 it hurts. the softwire-accelerated sw-shader runs faster on a P4 2.8 GHz than similar stuff runs on a gfFX 5200. i haven't made a real comparison, as i would never buy such a piece of.. hw..

of course, the P4 cannot beat a real gpu. the radeon9700pro i have is about 10 to 20 times faster at ps2.0 than the gfFX 5200.

and yes, doing it on a gpu, even where it's slower, means you free up cpu power for other tasks. and that's a good thing.


sjelkjd    171
quote:
Original post by davepermen
actually, what i stated was a (real..) joke.

Sometimes it's hard to tell, because of your blind hatred towards the geforce fx (and nvidia).

davepermen    1047
hehe :D

shall i buy one now to test how true it is? at least i have smooth fps in sw-shader, which i can see gfFX 5200 users don't have in similar scenes :D

i don't have a blind hatred towards the gfFX. i can clearly see its good points, and i was an nvidiot myself for a long time. the problem is they really messed up this time, both as a company and with the gfFX as a product. there's about no one left supporting them except possibly carmack.

i'm not happy about this. well, i AM happy that ati got a good chance for once, too; they worked hard for it. the gfFX is great for all dx8-style stuff. for dx9 it doesn't work well, and it lacks a lot of features which i would like to use. on top of that, the performance for the money is simply too low. and i don't want a product i cannot rely on to act the way it should, which we currently can't with nvidia-driver-supported gfFX cards.


sjelkjd    171
quote:
Original post by davepermen
shall i buy one now to test how true it is?


Naw, don't waste your money. The 5200 is sloooooow.

But is the rest of the line really that bad? I thought the gfFX supported more instructions and higher instruction-count programs than the 9800 pro. I don't see how it's not dx9 compatible.

noVum    170
It currently doesn't support float textures and will never support multiple render targets in DX.
Also, the 9800 Pro has an unlimited instruction count (the 9700 doesn't have that feature).
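For concreteness, both of those capabilities can be probed through stock Direct3D 9 calls. A minimal sketch, assuming only the standard d3d9 headers; the printout wording is made up, everything else is the documented API:

```cpp
// Probes the DX9 features under discussion: rendering to float textures
// and the number of simultaneous render targets. Standard D3D9 API only.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Can the HAL device render into a 128-bit float texture?
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A32B32G32R32F);
    std::printf("float render-target textures: %s\n",
                SUCCEEDED(hr) ? "yes" : "no");

    // How many simultaneous render targets (MRT) does the driver expose?
    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                     &caps))) {
        std::printf("simultaneous render targets: %lu\n",
                    static_cast<unsigned long>(caps.NumSimultaneousRTs));
    }
    d3d->Release();
    return 0;
}
```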

davepermen    1047
yes, the gfFX is rather useless. all of them have only "half" performance in ps2.0 and equivalents, they have many "bugs, a.k.a. driver optimisations, a.k.a. cheats", tons of problems, and a lack of good general floating-point support anyway.

if you want the details, the beyond3d.com forums have analyzed at length how the gfFX works internally, and why that is bad for performance and quality in dx9 (and in opengl the way it's meant to be used, too).

the new benches the gfFX shows up in confirm that 3dmark03 was RIGHT about its bad performance. i think, after all the cheating scandals that followed, that statement says enough. 3dmark03 showed correct dx9 performance comparisons between radeons and geforces.

and if you like to code opengl the way it's meant to be coded, or dx9, don't even consider one. their support is rather bad in both cases, and in opengl your only way out is to escape to GlideFX, a.k.a. tons of nvidia proprietary extensions. the gfFX can outperform all other cards that way, but only that way. and it's a stupid way.


a nice one: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

gives some ideas.


C0D1F1ED    452
quote:
Original post by noVum
Isn't Nick working for nVidia now? I can remember a post on Flipcode where he said something like that...

I could have had an internship if there weren't so many visa issues and I didn't have re-examinations... But I'm working for another company now :-)

No davepermen, it's not ATI ;-)

sjelkjd    171
I looked on beyond3d, but there's no r350 vs gfFX comparison that I could find. Perhaps you could post a link?

You're right that the gfFX seems to be slower. But where is it not dx9 compatible? Let me repeat: what functionality is lacking?

davepermen    1047
it lacks full floating-point support for all sorts of texture formats. it lacks rendering to several targets at the same time.

you can process floating-point pixel shaders, but you cannot have "random" floating-point input and output. you can have that on ati cards.

on the opengl side it's very visible: ati supports sampling float data from 1D, 2D, 3D, CUBE, and RECT textures; nvidia only supports sampling from RECT, and only in their own NV_fragment_program.

they lack generality/genericity.
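A minimal sketch of that asymmetry in OpenGL terms, assuming the 2003-era GL_ATI_texture_float and GL_NV_float_buffer extensions are present; the enum values below come from those extension specs, and the helper names are invented:

```cpp
#include <GL/gl.h>

#ifndef GL_RGBA_FLOAT32_ATI
#define GL_RGBA_FLOAT32_ATI     0x8814  /* from GL_ATI_texture_float */
#endif
#ifndef GL_FLOAT_RGBA32_NV
#define GL_FLOAT_RGBA32_NV      0x888B  /* from GL_NV_float_buffer */
#endif
#ifndef GL_TEXTURE_RECTANGLE_NV
#define GL_TEXTURE_RECTANGLE_NV 0x84F5
#endif

// On ATI: a float texture can use an ordinary target such as GL_TEXTURE_2D
// (likewise 1D, 3D, CUBE, RECT).
void createFloatTexture2D_ATI(GLuint tex, int w, int h) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI,
                 w, h, 0, GL_RGBA, GL_FLOAT, 0);
}

// On NV30: float textures are restricted to the RECT target, and sampling
// them requires NV_fragment_program shaders.
void createFloatTextureRect_NV(GLuint tex, int w, int h) {
    glBindTexture(GL_TEXTURE_RECTANGLE_NV, tex);
    glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_FLOAT_RGBA32_NV,
                 w, h, 0, GL_RGBA, GL_FLOAT, 0);
}
```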

which company, nick? am i allowed to know? (you can mail me personally, if you want :D)

glad, btw, that you are not with nvidia now, as they are in quite a bad position currently.. and i don't want you to get NDAed by such a company :D we need you! :D


_Madman_    100
Let's talk about the F-buffer that actually WORKS, and the way it's implemented!!!
HAHAHA

And yeah, the GFFX is hella slow. I REALLY WASN'T ABLE TO PLAY CMR3 (already finished it) with all the bells and whistles, as it was CPU BOUND.

And forget about the FX5900, as it runs at an unbearable speed in DX9 titles: only about 100 fps.

And Radeons are slower anyway when you do a lot of texture lookups.

The point is the FX is not slow, but code must be reorganized to use fewer registers. PERIOD!!!!

______________________________
Madman

davepermen    1047
madman, you know that nobody cares about you, as you don't bother informing yourself beyond the box with gfFX written on it and the little marketing fuzz printed on it..

and your writing style is terrible, so nobody can take you seriously.

as i've said several times now: register usage is ONE big thing, but by far not all of it.





davepermen    1047
sorry, i went off...:D





_Madman_    100
I have no marketing $hit on my box, though you definitely do. I'm a bit annoyed by the painful FX register usage, but the FX isn't slow, especially when you write programs in the right style; and even if you don't, the FX isn't slow, as the fps skyrocket anyway, within about ±2 fps of a Radeon. Yeah, right now Radeons are better if you code the way you're used to, but there is more flexibility on the FX. The situation is much like Intel vs. AMD.

