OK, so we're having problems with our current mirror reflection implementation.
At the moment we do it very simply: each frame, we compute the reflection vectors given the viewpoint and some predefined sample points on the mirror surface (position and normal).
Then, using a least-squares fit, we find the point with the minimum total distance to all of these reflection rays. This becomes our virtual viewpoint (with the right orientation).
After that, we render offscreen to a texture with the OpenGL camera placed at the virtual viewpoint.
And finally we apply the rendered texture to the mirror surface.
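For reference, the reflection and least-squares steps above can be sketched like this (plain Python with illustrative names; a production version would run on the GPU or use a proper linear-algebra library, this is just the math):

```python
import math

def reflect(d, n):
    # Reflect unit direction d about unit normal n: r = d - 2 (d.n) n
    k = 2.0 * sum(di * ni for di, ni in zip(d, n))
    return tuple(di - k * ni for di, ni in zip(d, n))

def solve3(A, b):
    # Solve the 3x3 linear system A x = b by Cramer's rule
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det3(A)
    out = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for i in range(3):
            Ac[i][col] = b[i]
        out.append(det3(Ac) / D)
    return tuple(out)

def virtual_viewpoint(view, points, normals):
    # Each sample gives a reflected ray with origin p_i and unit direction
    # r_i = reflect(normalize(p_i - view), n_i).  The point x minimizing
    # sum_i |(I - r_i r_i^T)(x - p_i)|^2 solves A x = b with
    #   A = sum_i (I - r_i r_i^T),  b = sum_i (I - r_i r_i^T) p_i.
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for p, n in zip(points, normals):
        d = [pi - vi for pi, vi in zip(p, view)]
        inv = 1.0 / math.sqrt(sum(c * c for c in d))
        d = [c * inv for c in d]
        r = reflect(d, n)
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - r[i] * r[j]
                A[i][j] += m
                b[i] += m * p[j]
    return solve3(A, b)
```

For a flat mirror this recovers the exact mirror image of the viewpoint; for a convex mirror the reflected rays no longer meet in a single point, which is precisely where the accuracy error of the single-virtual-viewpoint approach comes from.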
So far this has always been fine, but we now have much stricter accuracy constraints.
What are our best options given that:
- the scene is dynamic: the mirror and parts of the scene can change continuously from frame to frame
- we have about 3k points (with normals) per mirror, computed offline in a CAD program (such as CATIA)
- all the mirrors are always convex and perfectly spherical in shape, though with different vertical and horizontal radii of curvature
- a scene can have up to 10 mirrors
- it must be fast enough for VR (HTC Vive), targeting desktop machines with the fastest GPUs only
Looking around, some papers talk about deriving a caustic surface offline, but I don't know whether that suits my case.
Another paper used acceleration structures to intersect the reflection vectors with the scene and then adjust the corresponding texture coordinates. This looks the most accurate, but also very heavy computationally.
Other than that, I couldn't find anything up-to-date or exhaustive around, so can you help me?
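To make that cost concrete: the core per-sample work in the intersection-based approach is a ray–scene query, which bottoms out in ray–triangle tests. Below is a plain-Python sketch of the classic Möller–Trumbore ray/triangle intersection purely as an illustration; a real implementation would run this on the GPU against a BVH rather than brute-force over triangles:

```python
def ray_triangle(orig, direc, v0, v1, v2, eps=1e-9):
    # Moller-Trumbore ray/triangle intersection.
    # Returns the ray parameter t of the hit point, or None on a miss.
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direc, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:            # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv     # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direc, qvec) * inv    # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv
    return t if t > eps else None
```

With ~3k samples per mirror, up to 10 mirrors, and a 90 Hz headset, you can budget roughly how many of these tests per frame the acceleration structure has to absorb before deciding whether the extra accuracy is worth the cost.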
Thanks in advance
Hi guys, I'm starting to learn game development, with the aim of moving into VR development.
Right now I can't afford any VR equipment, so until I can, I want to learn whatever principles will carry over to VR once I get it.
I've started learning the basics of C# and have played around a little with Unity.
My head is a bit of a mess right now, and since I'm teaching myself all of this, I want to be as efficient as possible. I'd like to build a syllabus to follow, with milestones, so I know what I've learned and what I still need to learn.
Once I know my route, I'll polish the plan in more detail.
What I need is information on what exactly I can and should learn right now for VR development, and in what order; recommended resources would be much appreciated.
Thanks a lot in advance.
I was wondering if anyone here has experience with VR development for Unity. Having previous Unity experience, I'd prefer to stick with Unity but am open to other engines. It's something I've been interested in, and I'm wondering which community/technology is the easiest to use.
An Oculus Rift costs around $400, so I'd rather not invest money in that if it doesn't have a big community and support behind it. Another option I was looking at was Google Cardboard for Unity. The guy here has a pretty good starter tutorial on VR dev for Android, and the headset is only ~$20, with the Moga controller another $20 or so. This is definitely the most economical option, but I'd rather not go down that path if no one uses this technology or if VR on Android is crappy or something.
The VR community doesn't seem like it's that big, so I'm having a hard time getting a feel for what's popular and what direction the technology is going.
Chris "Crispy" Pusczak, CEO and Creative Director of SymbioVR, discusses virtual reality and the peripherals in VR that help with a deeper level of immersion.
PPTX Slides: Download
Hi, I was considering this startup, http://adshir.com/, for investment, and I'd like a bit of feedback on what the developer community thinks about the technology.
So far, what they have is a demo that runs in real time on a tablet at over 60 FPS, locally on the integrated GPU of an i7. They have a 20,000-triangle dinosaur that looks impressive, better than anything I've seen on a mobile device, with reflections and shadows that look very close to the real world. They achieved this with a new algorithm for a rendering technique called path tracing / ray tracing, which is very demanding and has so far been used mostly for static images.
From what I've checked around, there is no real option for real-time ray tracing (60 FPS on consumer devices). Imagination Technologies was supposed to release a chip that supports real-time ray tracing, but I couldn't find a product on the market, or even confirmation that the technology is finished, since the last demo I found ran on a PC. The other one is OTOY with their Brigade engine, which is still not released and, if I understand correctly, is more of a cloud solution than a hardware solution.
Would there be sizable interest in the developer community in such a product as a plug-in for existing game engines? How important is ray tracing to the future of high-end real-time graphics?