Can someone be kind enough to explain this GDC slide deck to me:
http://fileadmin.cs.lth.se/cs/Education/EDAN35/lectures/L10b-Nikolay_DRTV.pdf
I'd just like a reasonably detailed description of what the system essentially is, and what their "probes" are.
From what I gathered, they use spherical harmonics to encode irradiance in the scene, and dynamic objects sample nearby probes. There also seems to be something like voxel cone tracing relative to the camera, which computes an X+1 / Y-1 shift so that they don't have to render every probe outside the player's view?
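To show what I think the probe part means, here's a toy sketch I put together (I'm not claiming this is how the actual system works, and all the names in it are my own): each probe stores a few spherical-harmonic (SH) coefficients describing incoming light, and a dynamic object blends the coefficients of the 8 probes around it, then evaluates the result in the direction of its surface normal.

```python
import numpy as np

# Band-0/1 SH basis constants (4 coefficients, the common "2-band" setup).
C0 = 0.282095          # Y_0^0 constant term
C1 = 0.488603          # Y_1^{-1,0,1} linear terms

def eval_sh4(coeffs, n):
    """Evaluate 4-coefficient SH lighting in unit direction n."""
    x, y, z = n
    basis = np.array([C0, C1 * y, C1 * z, C1 * x])
    return float(np.dot(coeffs, basis))

def sample_probes(grid, pos):
    """Trilinearly blend SH coefficients from the 8 surrounding probes.
    grid: probe array of shape (X, Y, Z, 4); pos: position in grid units."""
    i0 = np.floor(pos).astype(int)
    f = pos - i0                       # fractional part = blend weights
    result = np.zeros(4)
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                result += w * grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return result

# Toy setup: a 2x2x2 probe grid where every probe holds a flat
# "ambient only" signal (constant term 1, no directional terms).
grid = np.zeros((2, 2, 2, 4))
grid[..., 0] = 1.0
sh = sample_probes(grid, np.array([0.5, 0.5, 0.5]))   # object mid-cell
irradiance = eval_sh4(sh, (0.0, 0.0, 1.0))            # normal facing up
```

The appeal, as far as I can tell, is that an object never has to look at the whole scene at runtime: it just blends a handful of tiny coefficient sets from its nearest probes, which is very cheap.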
Please note, I am not a game programmer, nor do I want to be one. I am just a consumer trying to gain a better understanding so that I can keep up with Current Gen vs Next Gen, etc.