Radiosity
The albedo was the issue. I had an outer space scene, but I never tried it. Anyway, thanks for the input.
(storing the results elsewhere and updating all lrec values at one time before the next iteration).
In this case a bit of damping helps (receiver->lrec = accumulated light * 0.98).
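That damped, write-all-at-once update can be sketched like this (the struct and function names are hypothetical, just following the lrec naming above):

```cpp
#include <vector>

// Hypothetical patch struct; "lrec" follows the naming used above.
struct Lumel {
    float lrec = 0.0f;        // light received, read by other patches next pass
    float accumulated = 0.0f; // light gathered during the current iteration
};

// Write all gathered values back at once, slightly damped (factor 0.98),
// so simultaneous updates settle into fluctuations instead of exploding.
void applyDamping(std::vector<Lumel>& lumels, float damping = 0.98f) {
    for (Lumel& l : lumels) {
        l.lrec = l.accumulated * damping;
        l.accumulated = 0.0f; // reset for the next gather pass
    }
}
```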
After porting to the GPU it's unavoidable that many values update at the same time, but I never got explosions,
just fluctuations when there are heavy changes in the scene.
The highest albedo value I use is 0.9.
When you ported your radiosity on the gpu, did you use some sort of hierarchy or just brute force?
In my case (rendering a view from the lumel), I started with brute force. Later I moved to an iterative lattice distribution, pretty much exactly matching Hugo Elias' description of that approach: render every fourth lumel, then find the lumels in between and, if they're close enough, lerp them (otherwise render them), then repeat on the subset in between those. The one addition was a "dominant plane" check, where I dot the lumel's normal against the 6 cardinal planes and classify the lumel by the best-fitting plane; so far that's been sufficient.
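The dominant-plane check is just an argmax over six dot products; a minimal sketch, with hypothetical names:

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// Classify a (normalized) lumel normal by the best-fitting of the six
// cardinal planes: the axis direction with the largest dot product wins.
// Returns an index 0..5 into the axes table below.
int dominantPlane(const Vec3& n) {
    const std::array<Vec3, 6> axes = {{
        { 1, 0, 0}, {-1, 0, 0},
        { 0, 1, 0}, { 0,-1, 0},
        { 0, 0, 1}, { 0, 0,-1},
    }};
    int best = 0;
    float bestDot = -2.0f; // any real dot product beats this
    for (int i = 0; i < 6; ++i) {
        float d = n.x * axes[i].x + n.y * axes[i].y + n.z * axes[i].z;
        if (d > bestDot) { bestDot = d; best = i; }
    }
    return best;
}
```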
For CPU/GPU-side real-time, I cluster "emitters" by dominant plane and distance, then calculate a limited number of form factors (usually 3) for each lumel against those clusters. I then create a vertex buffer for the clusters. On the CPU I just do a simple directional light, but on the GPU I create a point cloud of multiple samples for each cluster (random distribution) and use transform feedback to calculate the results at the end of the frame (shadow maps already exist at this point). I then propagate that to the lumels and apply the data back to the lightmap, averaging the sample values. Quite fast; sending the updated lightmap to the GPU is slower than everything else combined.
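The CPU fallback (a simple directional light per cluster, weighted by the precomputed form factor) might look roughly like this; the names and signature are my own illustration, not the poster's actual code:

```cpp
struct Vec3 { float x, y, z; };

// CPU path: each cluster acts as a directional light, and its contribution
// is scaled by the lumel's precomputed form factor toward that cluster.
float clusterContribution(const Vec3& lumelNormal, const Vec3& dirToCluster,
                          float clusterIntensity, float formFactor) {
    float nDotL = lumelNormal.x * dirToCluster.x
                + lumelNormal.y * dirToCluster.y
                + lumelNormal.z * dirToCluster.z;
    if (nDotL < 0.0f) nDotL = 0.0f; // cluster is behind the surface
    return clusterIntensity * formFactor * nDotL;
}
```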
Rendering environment maps for each texel may be a good idea for performance, but for accuracy you need high resolution.
The number of pixels encodes both the distance and the area of an object, so it makes a big difference whether a small object with high emission
ends up covering 1 or 2 pixels in the environment map, resulting in banding or noise. I found 256 x 256 env maps still not good enough for my needs.
For sure that's not an issue if you handle light sources separately.
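The quantization problem can be sketched numerically. Assuming a 90-degree env-map face and a spherical emitter (all numbers illustrative), rounding the projected footprint to whole texels makes a small bright object's gathered energy jump in coarse steps as its distance changes:

```cpp
#include <cmath>

// Roughly how many texels a sphere of radius r at distance d covers on an
// N x N env-map face spanning 90 degrees: angular footprint over per-texel
// angle, squared, rounded. A small emitter flipping between 1 and 3 texels
// changes its gathered energy by whole multiples -> banding or noise.
int texelsCovered(float r, float d, int faceRes) {
    float angular  = 2.0f * std::atan(r / d);         // angular diameter (rad)
    float perTexel = (3.14159265f / 2.0f) / faceRes;  // radians per texel
    float across   = angular / perTexel;              // texels across footprint
    int n = static_cast<int>(std::round(across * across));
    return n < 1 ? 1 : n; // it still occupies at least one texel
}
```

With a 256-texel face, the same unit-radius emitter covers 3 texels at distance 200 but only 1 at distance 300, so its contribution drops by a factor of three across that range instead of smoothly.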
I've never seen banding at 128x128 hemicube faces or lower. It does appear to be a problem for some, but I'd imagine it has more to do with a faulty lumel vs world unit ratio determination and misguided attempts at irradiance caching (do it in surface space, not world space) than the actual process of rendering from a lumel. The other villain to the approach is specular. GPU hardware texture filtering creates far more undesirable artifacts as far as I've seen.
Almost all radiosity research and study focuses on diffuse transfer. The original theory of rendering from the lumel also assumed a purely diffuse environment.