
INVERSED

The Perils of Hardware Shaders


So, I've just finished reading that article on pixel and vertex shaders, and I've decided that they are really neat. I would like to implement them in my engine, but I'm wondering what pitfalls I might run into using such a system. For instance, I was thinking that if my engine wanted to use up to, say, 6 lights, the vertex shader would have to run 6 lights' worth of code every frame, even if only four were active. So then you would have to write a different shader for each potential number of lights the game might support and switch shaders based on how many are active. I guess, in theory, this is what happens in D3D or OGL as you enable lights, even if you haven't set the parameters for that light. Has anyone else pondered some of the pitfalls of shaders? If so, I'm curious as to what said pitfalls are, and what workarounds you've come up with.

The most obvious pitfall is that there are still a lot of people out there using cards that don't support shaders.

I wouldn't worry about simple cases like multiple lights. You can put loops in your shaders and something like Cg or Sh will just unroll them. In Cg I think you would just use some macro like NUM_LIGHTS, and compile the shader multiple times with different values for NUM_LIGHTS.
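Roughly, the compile-time version looks like this (an untested HLSL/Cg-style sketch; the constant names are made up, and NUM_LIGHTS comes from whatever #define you pass to the compiler):

// Compiled once per light count, e.g. with NUM_LIGHTS defined as 1..6.
float4x4 WorldViewProj;
float3   LightDir[NUM_LIGHTS];     // directional lights, already in model space
float3   LightColor[NUM_LIGHTS];

struct VS_OUT { float4 pos : POSITION; float3 diffuse : COLOR0; };

VS_OUT main(float4 pos : POSITION, float3 normal : NORMAL)
{
    VS_OUT o;
    o.pos = mul(pos, WorldViewProj);
    float3 sum = 0;
    for (int i = 0; i < NUM_LIGHTS; i++)       // unrolled by the compiler
        sum += LightColor[i] * max(0, dot(normal, -LightDir[i]));
    o.diffuse = sum;
    return o;
}

You end up with one compiled shader per light count and just pick the right one at draw time.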

I don't know if you'd call this a pitfall, but lack of debugging support is a pain. The usual workflow is
1. tweak shader
2. tweak shader
3. black screen / no object drawn
4. curse; bang head on desk
5. rewrite the shader to dump every temporary you use to the output color until you find one that's wrong (see the sketch below).
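For step 5, the throwaway debug shader usually looks something like this (an untested HLSL-style sketch; which temporary you dump is up to you):

// Replace the real final color with the intermediate you want to inspect.
float4 main(float3 normal : TEXCOORD0) : COLOR
{
    float3 suspect = normalize(normal);        // the temporary under suspicion
    return float4(suspect * 0.5 + 0.5, 1.0);   // remap [-1,1] to [0,1] so it's visible
}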

But in my experience the benefit of programmable shading is so great you can ignore all the potential disadvantages.

"Math is hard" -Barbie

quote:
Original post by INVERSED
For instance, I was thinking that if my engine wanted to use up to, say, 6 lights, the vertex shader would have to run 6 lights' worth of code every frame, even if only four were active. So then you would have to write a different shader for each potential number of lights the game might support and switch shaders based on how many are active.


Or you could make a shader that can handle the maximum number of lights it has to deal with and then put in 0 for the light color/intensity for the others.

quote:
Original post by Pragma
I don't know if you'd call this a pitfall, but lack of debugging support is a pain.


Not with DirectX/HLSL. You can debug shaders using DX9 and VS.NET.

quote:
Original post by superpig
quote:
Original post by Pragma
I don't know if you'd call this a pitfall, but lack of debugging support is a pain.


Not with DirectX/HLSL. You can debug shaders using DX9 and VS.NET.


Unless you're using Win2K, it seems...

quote:
Original post by superpig
quote:
Original post by Pragma
I don't know if you'd call this a pitfall, but lack of debugging support is a pain.


Not with DirectX/HLSL. You can debug shaders using DX9 and VS.NET.

And Windows XP Professional. MS have done some polls regarding the need for Win2K support, but I'm not sure what the results were.

Muhammad Haggag,
Optimize
Bitwise account: MHaggag -

I thought nvidia had a shader debugger that could debug both vertex and pixel shaders. As for programming your shader to use the maximum number of lights and setting the others to black, I thought of that, but it seems inefficient to calculate the lighting per vertex for a bunch of lights that are black, and at the moment I only know shader assembly, so there is no conditional checking. Can Cg and other HLSL systems compile down to shader asm? Finally, as far as lack of hardware support goes, I thought about that too, but supposedly vertex shaders can be done reasonably in software, and if you don't have pixel shaders, then you just don't have per-pixel lighting.

[edited by - Inversed on April 4, 2004 5:40:50 PM]

"I thought nvidia had a shader debugger that could debug both vertex and pixel shaders."

Yes.

For multiple lights, NV40 has the ability to do real loops inside the shaders; that was what the demo ngill saw was about. You can download a few GDC papers about NV40 and ps/vs 3.0 on nvidia's developer page.

If I ever get around to coding some shaders, I would use RenderMonkey (from ATI). Quite a few known game companies have posted their comments on it on the ATI web site. It supports shaders only and nothing else, but it comes with some demo models, textures and, best of all, dozens of demo shaders for all shader versions, though mainly for pixel shader 2.0.
It is so cool because when you change the shader code, you click a button and have the result on screen in an instant. No more black screen! Instead: syntax highlighting and a text window that catches the shader-compiler messages.
The data is organized in a nice tree structure with icons, so you'll find everything quickly.
STOP BANGING YOUR HEAD! USE RENDERMONKEY!!
...no, I am not from ATI, it's just... this program rox.

Hehe, I thought GLSlang was more or less a feature for previews in raytracing software? I recently read an article about shaders and it said it would be too slow for games, yet powerful and almost reaching the quality of commercial raytracing software.

quote:
Original post by NerdIII
Hehe, I thought GLSlang was more or less a feature for previews in raytracing software? I recently read an article about shaders and it said it would be too slow for games, yet powerful and almost reaching the quality of commercial raytracing software.

Huh?

GLSL has nothing to do with raytracing. You might be confusing it with the RenderMan shading language.

GLSL is a vendor-independent high-level shading language for OpenGL, just as HLSL is for Direct3D. It's also a replacement for Cg. It can do the same things as those two, i.e. you can use it to program vertex and pixel shaders without resorting to ASM. GLSL has no 'performance'; it's just a language spec. It compiles down to ASM targeted at the GPU in the user's system at runtime. The bottleneck is the speed of the GPU executing vertex and pixel shaders, nothing more, just as with Cg, HLSL, or hand-written ASM.

There are several ways to solve the permutation problem.

For lower targets, e.g. vs_1_1, you would resort to making a set of shaders, one for each combination that you need. This would be done with a set of macro tricks.

For vs_2_0, a host of new options become available: you can perform static branching and basic looping. However, on some video cards static branching may be no different than using a permutation of the shader.

In ps_3_0, branching and looping are available, but don't expect them to be free; it's quite likely that the hardware will have to execute both sides of any conditional.
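For the vs_2_0-style middle case, the shader can loop over a light count the app puts in a constant, so one compiled shader covers any number of active lights. A rough, untested HLSL sketch (all the constant names are made up):

// One shader for 1..MAX_LIGHTS lights; the app sets NumActiveLights per draw call.
#define MAX_LIGHTS 6
int      NumActiveLights;
float3   LightDir[MAX_LIGHTS];
float3   LightColor[MAX_LIGHTS];
float4x4 WorldViewProj;

struct VS_OUT { float4 pos : POSITION; float3 diffuse : COLOR0; };

VS_OUT main(float4 pos : POSITION, float3 normal : NORMAL)
{
    VS_OUT o;
    o.pos = mul(pos, WorldViewProj);
    float3 sum = 0;
    for (int i = 0; i < NumActiveLights; i++)   // static loop, count from a constant
        sum += LightColor[i] * max(0, dot(normal, -LightDir[i]));
    o.diffuse = sum;
    return o;
}

Whether this beats simply compiling one permutation per light count depends on the hardware, as noted above.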

No, I did not confuse the two. I read an article comparing D3D, OpenGL and RenderMan, and it sounded to me as if GLSlang was so high-level that it would silently emulate in software anything that was not supported in hardware. But now that you say it is just like HLSL, I see it differently.

The biggest issue I find with using shaders is that it cuts your market short very quickly. As soon as you use shaders you're pushing for an audience made up of hardcore gamers who have the hardware to run the new games.
Average people who don't play games (other than the occasional FreeCell round) can get by on 600MHz computers with old GeForce cards. We're reaching a point now where slower computers are no longer obsolete, because they can still handle the necessary software (word processing, internet). The general public isn't pushed to update their hardware. I wonder if Longhorn or the next OS will require more from the hardware and push people to upgrade.
Low system requirements, I think, are key to getting sales, especially when you're a freelance game developer.
~Wave

So build it in thoughtfully: use it to make things better, but don't make it a requirement. Shaders can be used for a number of effects that really are optional. If you implement it right, you should actually broaden your sales base: those who do have newer hardware will appreciate it being used and the game looking/playing better, while those with older hardware will still be able to enjoy the game. It's all part of considering scalability when designing your engine.

I plan to scale my engine by making dx8+ cards able to do diffuse & specular bump mapping, whereas < dx8 cards will do lightmaps & shadows only, with vertex lighting.

Guest Anonymous Poster
Sorry for the shameless plug, but if you are interested in using RenderMonkey 1.5, there is a book coming out in the next 2-3 weeks which implements several shading techniques and exclusively uses RM. The 1.5 version will also be included on the CD-ROM.

http://www.course.com/catalog/product.cfm?isbn=1-59200-092-4

One of the big problems I see with shaders and "current generation" video cards is that the techniques for getting nice visuals on a card without vertex/pixel shaders are completely different from the approach you can take with shaders. And there isn't an easy fallback. At least with multiple textures, for example, you can fall back to multipass rendering.

Thus, in regard to Wavewash's comments, the independent game dev is in a bit of a quandary. He can target a low-end card, say a GeForce 2, and hope for a bigger audience, but at the same time the people with "high-end" cards (read: something with shaders) are laughing at the graphics... that is, unless the independent game dev has the time to write good code for both generations of cards, which he probably doesn't.

I think a partial solution might be to provide "good enough" graphics for the old-generation cards, and then focus on getting the most out of current and next-gen cards.

>>waiting for either ATI or 3DLabs to make a big
>>glowing 'download version 1.5' button so it can finally grab a
>>version with GLSlang support

I was at the 3Dlabs OpenGL Shading Language Master Class event in Munich last week, and there they used RenderMonkey with GLSL support to show us all this fancy stuff. I'm not sure what version it was, but IIRC it was 1.5 (though it could have been a special "for 3Dlabs only" version, I don't know).

DJSnow
---
this post is manually created and therefore legally valid without a signature

I think the two biggest pitfalls are (1) learning a shader language (not an easy task if you've never used ASM) and (2) they keep getting better! You can do things with ps3.0 that aren't even physically possible with earlier versions. I like to see graphics hardware advance as much as the next guy, but it took what, a year before ps1.4 was completely obsolete? Most people don't have the cash to buy bleeding-edge video cards every other week.

I'm still a fan of doing things in software. If your code is tight, you don't even need hardware acceleration.

That was a half-joke.

Sometimes you just have to make the cut at some level. The art path isn't backwards compatible. You don't have to be on the bleeding edge of hardware, because different games have different rendering styles. There is a need for cel shading, for super-realistic shading, for a combination of 2D and 3D shading, etc. It's not as if everyone is going to go super-realistic with their games all of a sudden. It wouldn't fit the style of some games.
