INVERSED

Simple Glow Effect


So, I was playing Fable last night, and I noticed the glow effect being used everywhere. I thought it was really neat, so I decided to see how it was done. A quick search of this forum gave me a large list of topics discussing glow effects. I spent most of the day working on a non-shader version of the effect, and the results are these:

http://img.photobucket.com/albums/v453/SeraphicArtist/Glow.jpg
http://img.photobucket.com/albums/v453/SeraphicArtist/Glow2.jpg

I'm very happy with the results so far, and I'm curious to hear from people who have done this with shaders. Shaders still scare me because I don't like the idea of having to rewrite different shaders for different generations of graphics hardware. I guess if I'm going to be a "graphics programmer" I'd better learn to deal with it. Does the shader version of such an effect really give that much of a speed increase? What benefits does a shader version of this effect offer over the non-shader version?

[Edited by - INVERSED on October 12, 2004 4:18:01 AM]

If you are using Direct3D, shaders will work on all hardware of that DirectX generation (i.e. DX9 for the newest shaders). If you are using OpenGL, stick to ARB_vertex_program and ARB_fragment_program, and your shaders will work on any hardware that supports OpenGL 1.5 or later.
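
Checking for those two extensions at startup is only a few lines. A rough C++ sketch against a current GL context (the helper names are mine, not from anyone's code here):

```cpp
#include <cstring>
#include <GL/gl.h>

// True if the named extension appears in the GL_EXTENSIONS string.
// (A plain strstr can match a prefix of a longer name, but it's fine for a sketch.)
static bool hasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != nullptr && std::strstr(all, name) != nullptr;
}

static bool arbProgramsSupported()
{
    return hasExtension("GL_ARB_vertex_program") &&
           hasExtension("GL_ARB_fragment_program");
}
```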

To simplify things and work at a higher level, you could use a shading language such as nVidia's Cg or Direct3D's HLSL (a C-like language, but for the graphics card!).

I've implemented the said "glow effect", and the shader version gave me quite a bit of flexibility, because I could play with settings at a per-"pixel" level, e.g. the convolution blur factor. It was particularly easy because I used nVidia's Cg (and yes, I do have an ATI Radeon 9500 Pro). One catch is that shaders are hell to debug (in OpenGL, which is what I use), but I believe the newest DX SDK includes a shader debugger of some sort.
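
The host-side hookup for Cg isn't much code either. A minimal sketch of loading a Cg fragment program from C++ with the Cg runtime over OpenGL ("glow.cg" and the "main" entry point are placeholders, not files from this thread):

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>

static CGcontext cgCtx;
static CGprofile fragProfile;
static CGprogram glowProgram;

void initGlowShader()
{
    cgCtx = cgCreateContext();

    // Ask the runtime for the best fragment profile the card supports
    // (e.g. arbfp1 on a Radeon 9500, fp30 on a GeForce FX).
    fragProfile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(fragProfile);

    glowProgram = cgCreateProgramFromFile(cgCtx, CG_SOURCE, "glow.cg",
                                          fragProfile, "main", nullptr);
    cgGLLoadProgram(glowProgram);
}

void bindGlowShader()
{
    cgGLEnableProfile(fragProfile);
    cgGLBindProgram(glowProgram);
}
```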

Anyway, worth checking out :)

Cheers,
Danu

PS. my glow effect screenie!

Which non-shader method did you use? It's impossible to say whether shaders would give you a speed increase without knowing how you're doing it at the moment [smile]

For the most part, shaders aren't a problem. If you write for the lower versions (like ps.1.1), it'll run on pretty much any card that supports pixel shaders. The issue of rewriting shaders only arises if you want to take advantage of shortcuts available in later versions, and even then you'd only be writing one shader per version (and you'd skip some: one 3.0 shader, one 2.0 shader, and a 1.1 shader is probably enough).
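
Picking the path at startup is just a caps check. A rough D3D9 sketch, assuming you already have a device (the numeric tags returned are arbitrary):

```cpp
#include <d3d9.h>

// Returns which pixel shader path to load: 30, 20, 11, or 0 for no shaders.
DWORD choosePixelShaderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) return 30; // ps_3_0 shader
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) return 20; // ps_2_0 shader
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) return 11; // ps_1_1 shader
    return 0; // fixed-function fallback (e.g. the non-shader glow)
}
```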

The technique I used is based on the NeHe radial blur tutorial. I rendered the scene to the framebuffer, then re-rendered it to a 256 x 256 pbuffer using a glow texture. Then I sampled that pbuffer into another, even smaller pbuffer (64 x 64) about 25 times, and finally drew the small texture over the whole screen. On my Radeon 7500 it runs at about 60 fps, and on my 9500 at about 90 fps (in windowed mode, though). The entire mech is only about 1500 polys, so those aren't the best framerates for that little geometry, but I need to try a shader version for comparison. Still, in a situation where you don't have much geometry and no shaders available, it's respectable.
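
In outline, the passes look something like this. It's only a sketch: bindPBuffer, drawFullscreenQuad, blurOffset and the buffer names are stand-ins for whatever pbuffer and quad helpers you already have, not real functions from my code.

```cpp
#include <GL/gl.h>

// Hypothetical helpers provided elsewhere by the framework.
extern void renderScene();
extern void renderGlowSources();
extern void bindPBuffer(int which);
extern void bindBackBuffer();
extern void drawFullscreenQuad(GLuint tex, float offset, float alpha);
extern float blurOffset(int tap);
extern const int glowBuffer256, blurBuffer64;
extern const GLuint glowTexture256, blurTexture64;

void renderGlowFrame()
{
    // 1. Normal scene into the back buffer.
    renderScene();

    // 2. Glow sources only, into the 256x256 pbuffer (using the glow texture).
    bindPBuffer(glowBuffer256);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderGlowSources();

    // 3. "Blur" by resampling the 256x256 result into the 64x64 pbuffer many
    //    times with small texture-coordinate offsets, blended additively.
    bindPBuffer(blurBuffer64);
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    for (int tap = 0; tap < 25; ++tap)
        drawFullscreenQuad(glowTexture256, blurOffset(tap), 1.0f / 25.0f);

    // 4. Composite the blurred 64x64 texture additively over the whole screen.
    bindBackBuffer();
    glBlendFunc(GL_ONE, GL_ONE);
    drawFullscreenQuad(blurTexture64, 0.0f, 1.0f);
    glDisable(GL_BLEND);
}
```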

So, I'm trying out this shader stuff. I just downloaded a ton of stuff for Cg. I picked it because it's high level and API-agnostic. I've always been a little skeptical of Cg, though, because I'm an ATI fan and I don't like the idea that NVIDIA might not optimize for ATI as much as it could. But since it's the only cross-platform, cross-API high-level shader language I know of, I guess it will have to do.

Cg is the same language as HLSL, which ATI has no problem supporting. AFAIK, the language is not keyed towards GeForces in particular.

Quote:
Original post by Sneftel
Cg is the same language as HLSL, which ATI has no problem supporting. AFAIK, the language is not keyed towards GeForces in particular.


I was under the impression that Cg generates faster shader code when the target is nVidia hardware.

I also find it hard to believe that a high-level shading language for OpenGL created by nVidia won't optimize favourably for nVidia's own hardware. :)

The language is hardware-independent, but the tools may well not be [wink]

The technique you used sounds a bit like the non-shader approach in GPU Gems... might be worth a look if you've got it (and probably worth a look even if you haven't). I reckon there's a trick with multitexturing that would let you condense your 25 passes into something more like 5-6...
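
Roughly what I have in mind, as a sketch: bind the same glow texture on four texture units with slightly shifted coordinates, so each quad does four blur taps instead of one (the offsets and quad setup here are illustrative, not tuned).

```cpp
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

// Assumes the ARB_multitexture entry points are already resolved
// (via wglGetProcAddress or an extension loader).
void drawFourTapBlurQuad(GLuint glowTexture, float spread)
{
    const float offset[4][2] = {
        { -spread, -spread }, {  spread, -spread },
        { -spread,  spread }, {  spread,  spread },
    };

    for (int unit = 0; unit < 4; ++unit) {
        glActiveTextureARB(GL_TEXTURE0_ARB + unit);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, glowTexture);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD); // accumulate the units
    }

    // Full-screen quad in normalized device coordinates (identity matrices assumed).
    const float corner[4][2] = { { 0, 0 }, { 1, 0 }, { 1, 1 }, { 0, 1 } };
    glBegin(GL_QUADS);
    for (int v = 0; v < 4; ++v) {
        for (int unit = 0; unit < 4; ++unit)
            glMultiTexCoord2fARB(GL_TEXTURE0_ARB + unit,
                                 corner[v][0] + offset[unit][0],
                                 corner[v][1] + offset[unit][1]);
        glVertex2f(corner[v][0] * 2.0f - 1.0f, corner[v][1] * 2.0f - 1.0f);
    }
    glEnd();
}
```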

Hi,

Cg and HLSL are not the same. Even though MS and nVidia collaborated, nVidia added some features, so some Cg constructs may not compile directly as HLSL, but that's easy to fix.

Cg is a good language, but the (unsurprising) lack of attention to ATI cards makes me uneasy with it. Anyway, nVidia has almost stopped Cg development and embraced the FX framework from DX9 (dropping their own FX framework, CgFX, in the process).

Inversed, when you use the FX framework, the FX file may contain many versions (techniques) of the effect, and the framework selects a technique that works with the hardware available in the computer. So the hard job is providing enough versions to get the best visual quality on as much hardware as possible. You don't need to do that, of course; it's optional. But I'll grant you that the owner of a GeForce 6800 or Radeon X800 will be displeased if the effect isn't what they expected.
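
With D3DX the selection is basically one call. A minimal sketch, assuming a "glow.fx" file whose techniques are ordered best-first (the file name is just a placeholder):

```cpp
#include <d3dx9.h>

// Load the effect and let D3DX pick the first technique that validates
// on the current hardware.
ID3DXEffect* loadGlowEffect(IDirect3DDevice9* device)
{
    ID3DXEffect* effect = nullptr;
    if (FAILED(D3DXCreateEffectFromFile(device, "glow.fx",
                                        nullptr, nullptr, 0, nullptr,
                                        &effect, nullptr)))
        return nullptr;

    D3DXHANDLE technique = nullptr;
    if (SUCCEEDED(effect->FindNextValidTechnique(nullptr, &technique)) && technique)
        effect->SetTechnique(technique);

    return effect;
}
```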

Anyway, my glow, for VS and PS 1.1
http://www.spritekin.com/warscale/screen4.jpg

Luck!
Guimo




<rant>
Is it just me, or has graphics programming become one big pain in the ass since the introduction of shaders? I mean, it's always been a pain in the ass, but I remember when hardware TnL was the big thing and your choices were use it or don't use it. With shaders you have to write like five versions of an effect. I guess it's my fault for trying to do the sensible thing and not choosing sides in the graphics API debate. I just like flexibility. ... Maybe I should just be an audio programmer or something :)
</rant>

Quote:
Original post by INVERSED
<rant>
Is it just me, or has graphics programming become one big pain in the ass since the introduction of shaders? I mean, it's always been a pain in the ass, but I remember when hardware TnL was the big thing and your choices were use it or don't use it. With shaders you have to write like five versions of an effect. I guess it's my fault for trying to do the sensible thing and not choosing sides in the graphics API debate. I just like flexibility. ... Maybe I should just be an audio programmer or something :)
</rant>
In a few years, technology should level out somewhat and you won't have to support widely varying shader capabilities. The current state of affairs is extremely transitional, as developers and manufacturers come to a consensus on what cards need to do and how they need to do it.
