Question about the evolution of lighting


How do you see the future of real-time lighting? Do you think we will continue to use lightmaps, or will we move towards per-pixel lighting and real-time shadows for the entire scene?

Currently, lightmaps are really just a stop-gap solution for the fact that we can't do global illumination techniques (radiosity, etc.) in realtime.

I think that is the real nirvana for lighting in graphics... realtime per-pixel global illumination. Pursuing this path eliminates the need for the hacks currently used to make shadowing work.

A realtime adaptive radiosity / raytracing hybrid combined with 4 dimensional BRDFs.

We'll have to wait for the GeForce 100 or so...

Here you go, some radiosity tutorials for beginners:

Radiosity in English. A good introduction to the main concepts.
Radiosity in English part 2.
Radiosity lighting. Slightly sloppy writing, but informative, and it comes with source code.
Hugo Elias' radiosity tutorial. A very easy to understand, excellent introduction.

About realtime radiosity: even per vertex, it would be almost impossible on any non-trivial scene. The problem is the form factor calculations: if you want accurate quality on a moderately complex scene (10k-50k tris), it takes hours (or even days) to compute a radiosity solution. There is some research being done on realtime progressive radiosity, though. It precalculates everything, but as soon as the user moves an object, the radiosity is updated in realtime. This update, however, still takes several seconds, so it isn't really usable in a game yet.
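
To make the cost concrete, here is a minimal C++ sketch of progressive-refinement ("shooting") radiosity. The Patch structure, formFactor() and solveProgressiveRadiosity() are illustrative names, not from any particular engine, the energy is monochrome for brevity, and the form factor has no visibility term. In a real solver almost all of the time goes into the form factor / visibility estimation, which is exactly the bottleneck described above.

#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

struct Patch {
    Vec3  center, normal;    // normal is assumed to be unit length
    float area;
    float emission;          // non-zero for light sources (monochrome for brevity)
    float reflectance;       // diffuse albedo
    float radiosity;         // converged result accumulates here
    float unshot;            // energy not yet distributed to the rest of the scene
};

// Point-to-point form factor approximation from patch 'from' to patch 'to'.
// A production solver adds a visibility term here (ray cast or hemicube),
// and that visibility estimation is the truly expensive part.
static float formFactor(const Patch& from, const Patch& to)
{
    Vec3 d = { to.center.x - from.center.x,
               to.center.y - from.center.y,
               to.center.z - from.center.z };
    float r2  = d.x * d.x + d.y * d.y + d.z * d.z;
    float len = std::sqrt(r2);
    float cosF =  (from.normal.x * d.x + from.normal.y * d.y + from.normal.z * d.z) / len;
    float cosT = -(to.normal.x * d.x + to.normal.y * d.y + to.normal.z * d.z) / len;
    if (cosF <= 0.0f || cosT <= 0.0f) return 0.0f;          // patches face away
    return (cosF * cosT * to.area) / (3.14159265f * r2 + to.area);
}

// Progressive refinement: repeatedly pick the patch with the most undistributed
// energy and "shoot" it at every other patch.
void solveProgressiveRadiosity(std::vector<Patch>& patches, int iterations)
{
    for (Patch& p : patches) { p.radiosity = p.emission; p.unshot = p.emission; }

    for (int i = 0; i < iterations; ++i) {
        std::size_t shooter = 0;
        for (std::size_t j = 1; j < patches.size(); ++j)
            if (patches[j].unshot * patches[j].area >
                patches[shooter].unshot * patches[shooter].area)
                shooter = j;

        Patch& s = patches[shooter];
        for (std::size_t j = 0; j < patches.size(); ++j) {
            if (j == shooter) continue;
            // Energy received by patch j (reciprocity: F_ji = F_ij * A_i / A_j).
            float delta = patches[j].reflectance * s.unshot *
                          formFactor(s, patches[j]) * s.area / patches[j].area;
            patches[j].radiosity += delta;
            patches[j].unshot    += delta;
        }
        s.unshot = 0.0f;    // this patch's energy has now been distributed
    }
}

Each shot touches every patch in the scene, so even this simplified version is quadratic in the patch count, before any visibility testing is added.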

/ Yann

[Edit] added a link ... and fixed it

[edited by - Yann L on June 27, 2002 4:29:25 PM]

In my engine, which doesn't support lightmaps (because I don't know how to do them :D ), I light my scene using this algorithm:


Draw the scene with ambient light only;

for (each light)
{
    if (the light is a point light)
    {
        draw its stencil shadow volume;
    }

    draw the parts of the scene lit by this light and add them to the color buffer using additive blending;
}


It works pretty well, but I have only tested it with trivial scenes.
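
For anyone curious, here is roughly what that loop looks like in OpenGL 1.x. The helpers drawSceneAmbient(), drawShadowVolume() and drawSceneLit() are hypothetical stand-ins for engine code, and the stencil handling is the common z-pass volume approach; depth test and face culling are assumed to be enabled already.

#include <GL/gl.h>
#include <vector>

struct Light { bool isPointLight; /* position, colour, ... */ };

void drawSceneAmbient();                    // ambient-only pass, fills the depth buffer
void drawShadowVolume(const Light& light);  // extruded volume geometry for one light
void drawSceneLit(const Light& light);      // geometry lit by this single light

void renderFrame(const std::vector<Light>& lights)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    // 1. Ambient pass lays down the depth buffer and the base colour.
    drawSceneAmbient();

    glDepthMask(GL_FALSE);          // depth is already written, keep it read-only
    glDepthFunc(GL_LEQUAL);

    for (const Light& light : lights) {
        if (light.isPointLight) {
            // 2. Mark shadowed pixels in the stencil buffer (z-pass method):
            //    front faces of the volume increment, back faces decrement.
            glClear(GL_STENCIL_BUFFER_BIT);
            glEnable(GL_STENCIL_TEST);
            glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
            glStencilFunc(GL_ALWAYS, 0, ~0u);

            glCullFace(GL_BACK);
            glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
            drawShadowVolume(light);

            glCullFace(GL_FRONT);
            glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
            drawShadowVolume(light);

            glCullFace(GL_BACK);
            glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

            // Only pixels with stencil == 0 lie outside the shadow volume.
            glStencilFunc(GL_EQUAL, 0, ~0u);
            glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        }

        // 3. Add this light's contribution on top of what is already there.
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE);
        drawSceneLit(light);
        glDisable(GL_BLEND);
        glDisable(GL_STENCIL_TEST);
    }

    glDepthMask(GL_TRUE);
}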

[edited by - Alload on June 27, 2002 5:32:07 PM]

I think the best lighting solution achievable today is:

_ using lightmaps for static geometry
_ using stencil shadow volumes for dynamic geometry

Don't you think so?

quote:
Original post by python_regious
No I don't. Dynamic per-pixel lights with realtime shadows, please.


Are you serious? This means no global illumination at all.

And what realtime shadow technique would you use? I don't think stencil shadow volumes are fast enough for big scenes, are they?

Moreover, which solution do you prefer?

Solution 1:

render scene with ambient color;

for each light
{
    render the shadows;
    render the illuminated part of the scene and blend it with the color buffer;
}


Solution 2:

render scene with ambient color;

for each light
{
    render the shadows;
}

render the illuminated scene and blend it with the color buffer;


Solution 1 tends to be more realistic, since each light's shadows then only block that light's own contribution, but it is much slower than Solution 2.

quote:
How do you see the future of real time lighting?

This is not a thread about our current limitations. Just pointing that out to some people.

------------
aud.vze.com - The Audacious Engine <-- It's not much, yet. But it's mine... my own... my preciousssss...
MSN: nmaster42@hotmail.com, AIM: LockePick42, ICQ: 74128155

Okay, okay, I'll stop asking questions about current implementations of lighting.


How do you see the future of shadows? Stencil shadow volumes could benefit a lot from improvements in fill rate.

There is no future for shadows.

A truly accurate lighting system like radiosity doesn't need hacks like stenciling to produce shadows. I think I am too much of a purist for my own good sometimes... I'm one of those "simulate, not emulate" type developers. Yann is very right, though... realtime global illumination is still a good way off for less-than-trivial scenes. (And please don't think I am preaching that this is the only solution... it's just what I have taken a shine to. *groans at terrible pun*)

It really depends on how near a future you are talking about. Stencil shadows are not only problematic with regard to fill rate; it will also become more expensive to calculate the shadow volumes as geometry becomes increasingly complex... and with the way vertex-throughput levels are rising with each release of new hardware, this problem is not that far away.
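
For illustration, the per-frame CPU cost referred to here is mostly silhouette extraction. Here is a minimal sketch, assuming a hypothetical Mesh/Edge representation with precomputed face adjacency (none of these names come from the thread); the loop runs every frame for every light and every caster, so the cost grows directly with triangle count.

#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Edge {
    Vec3 v0, v1;           // edge endpoints
    Vec3 normalA, normalB; // normals of the two triangles sharing this edge
    Vec3 centerA, centerB; // a point on each of those triangles
};

struct Mesh { std::vector<Edge> edges; };

// An edge lies on the silhouette when exactly one of its two triangles faces
// the light. The quads extruded from these edges (plus caps) form the shadow
// volume that is later rendered into the stencil buffer.
std::vector<Edge> findSilhouetteEdges(const Mesh& mesh, const Vec3& lightPos)
{
    std::vector<Edge> silhouette;
    for (const Edge& e : mesh.edges) {
        bool facesA = dot(e.normalA, sub(lightPos, e.centerA)) > 0.0f;
        bool facesB = dot(e.normalB, sub(lightPos, e.centerB)) > 0.0f;
        if (facesA != facesB)
            silhouette.push_back(e);
    }
    return silhouette;
}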

Then other shadowing techniques can be used. For instance, indexed shadow mapping isn't geometry-limited, but it will suck up fill rate, and the artifacts are pretty bad on occasion. OK, I'll shut up now.

Death of one is a tragedy, death of a million is just a statistic.

As I see it, the trend in lighting is definitely towards global illumination. There is no known shadowing or per-pixel lighting technique that could even remotely achieve the quality of a radiosity solution.

For instance, look at this image or this one.

Those are not photographs; they are both 3D scenes rendered with Lightscape, a professional progressive radiosity package. Now name one standard per-pixel lighting technique that could achieve similar results (other than very high resolution lightmaps). Short answer: there is none. Global illumination is the way to go, at least in the long run.

This is how I see the near future: shadow volumes will die. They are totally inefficient with complex geometry. They will be replaced by shadow maps, which scale far better to very high polycount scenes. BRDFs and HDRI will also be important; future 3D chips will support them directly in hardware (which requires 4D textures; some SGI hardware already supports them). The step to hardware-assisted global illumination will, however, be very difficult, since the basic concepts are totally different from the current 3D engine/GPU design philosophy.
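
To illustrate why shadow maps scale with resolution rather than with geometry, here is a minimal sketch of the core depth comparison, written as plain C++ rather than any particular GPU API (DepthMap and inShadow are illustrative names, not from the thread). The cost is one depth render from the light plus one lookup per shaded pixel, regardless of how many triangles cast the shadow.

#include <vector>

struct Vec3 { float x, y, z; };

struct DepthMap {
    int width, height;
    std::vector<float> depth;   // distance to the nearest occluder per texel
    float at(int x, int y) const { return depth[y * width + x]; }
};

// Pass 1 (not shown): render the scene from the light's point of view and
// store only depth into the map.

// Pass 2: for every shaded pixel, project its position into the light's view
// and compare against the stored occluder depth.
bool inShadow(const DepthMap& map, const Vec3& lightSpacePos, float bias = 0.005f)
{
    // lightSpacePos.x / .y are assumed to already be in [0, 1] map space,
    // and .z is the pixel's depth as seen from the light.
    int x = static_cast<int>(lightSpacePos.x * (map.width  - 1));
    int y = static_cast<int>(lightSpacePos.y * (map.height - 1));
    if (x < 0 || y < 0 || x >= map.width || y >= map.height)
        return false;                          // outside the light's frustum
    // The bias works around depth precision ("shadow acne") artifacts.
    return lightSpacePos.z - bias > map.at(x, y);
}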

/ Yann

quote:

BRDFs? HDRI? What are these?


BRDF = bidirectional reflectance distribution function. A way to parametrize surface reflection properties; BRDFs would replace most of today's pixel shaders. They are already doable in realtime, but only on a very limited scale (since the 4D texture has to be approximated through two sets of 2D textures). Look here for an example.
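
For illustration, a BRDF is 4-dimensional because, at a fixed surface point, it takes two angles for the incoming direction and two for the outgoing one. Here is a minimal sketch where a simple analytic Lambert + Phong lobe stands in for a measured BRDF (the function and constants are purely illustrative); sampling it over all four angles is what would fill the 4D texture mentioned above.

#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 fromAngles(float theta, float phi)   // spherical angles -> unit vector,
{                                                // with the surface normal at (0, 0, 1)
    return { std::sin(theta) * std::cos(phi),
             std::sin(theta) * std::sin(phi),
             std::cos(theta) };
}

// BRDF(theta_in, phi_in, theta_out, phi_out): four parameters = 4D.
float brdf(float thetaIn, float phiIn, float thetaOut, float phiOut)
{
    const float kDiffuse  = 0.6f;     // illustrative material constants
    const float kSpecular = 0.4f;
    const float shininess = 32.0f;
    const float pi        = 3.14159265f;

    Vec3 in  = fromAngles(thetaIn,  phiIn);   // direction towards the light
    Vec3 out = fromAngles(thetaOut, phiOut);  // direction towards the viewer

    // Mirror the incoming direction about the normal (0, 0, 1) for the lobe.
    Vec3 refl = { -in.x, -in.y, in.z };
    float cosLobe = refl.x * out.x + refl.y * out.y + refl.z * out.z;
    if (cosLobe < 0.0f) cosLobe = 0.0f;

    return kDiffuse / pi + kSpecular * std::pow(cosLobe, shininess);
}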

HDRI = High dynamic range image. A special form of environmental light mapping.

quote:

So how long before we get the quality from the second picture in realtime?


How long until we can visualize it in realtime? Simple: zero. It can be done on current hardware. The project we developed for our last customer (an architecture walkthrough) was of similar quality on a GF4. The problem is that the lighting is 100% precalculated radiosity. It is displayed using very high resolution lightmaps (approx. 300 MB of lightmap textures for one scene). It's not interactive.

/ Yann

[edited by - Yann L on July 2, 2002 6:10:03 PM]

quote:
Original post by Yann L
It's not interactive.


You mean the lighting, not the walkthrough itself, right? A lot of the techniques out there that approximate global illumination look quite good. Check out this site for some good papers on realtime BRDFs. I don't fully understand them yet (I've just skimmed over them), but the results are pretty damn good.

quote:

You mean the lighting, not the walkthrough itself, right?


I mean the lighting. The walkthrough is fully realtime at 50 fps, but the radiosity doesn't adapt in realtime if you, for example, move a piece of furniture. The only thing we did was precalculate the radiosity solutions for different positions of, e.g., a door. If you open or close it in realtime, the system interpolates between the preset solutions. It looks very nice, but it's not real interactivity.
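
For illustration, the interpolation between preset solutions could look roughly like this minimal CPU sketch; Lightmap and blendSolutions are hypothetical names, and a real engine would do the blend on the GPU (e.g. with multitexturing) rather than per texel on the CPU.

#include <cstddef>
#include <cstdint>
#include <vector>

struct Lightmap {
    int width, height;
    std::vector<std::uint8_t> rgb;   // width * height * 3 bytes, one baked solution
};

// Blend two baked radiosity solutions: t = 0 -> door fully closed, t = 1 -> fully open.
Lightmap blendSolutions(const Lightmap& closed, const Lightmap& open, float t)
{
    Lightmap result = closed;        // same dimensions as the inputs
    for (std::size_t i = 0; i < result.rgb.size(); ++i) {
        float c = static_cast<float>(closed.rgb[i]);
        float o = static_cast<float>(open.rgb[i]);
        result.rgb[i] = static_cast<std::uint8_t>(c + (o - c) * t + 0.5f);
    }
    return result;
}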
