Image reverse rendering

1 comment, last by udvat 13 years, 8 months ago
From a 2D image I estimated a color for each pixel using some technique. I also built a 3D model (mesh) of the scene in the image. Now I want to render the 3D model with OpenGL to check whether my estimated colors truly match the original image colors.

I don't want to texture-map my image onto the model. Rather, I want to render the model using my estimated color information, so I need a color for each vertex of the mesh. But how can I link my per-pixel color information to the vertices of the mesh?
The best way would be to texture map your model.

For another method, you could render your model to some kind of G-buffer, as deferred rendering techniques do, and match the 3D position of each pixel with the 3D positions of your vertices. When you find a match, you use the image coordinates to sample your color-data image and assign the sampled color to the matched vertex. This technique requires an additional rendering pass, and the lookup is very expensive computationally.
Would you please elaborate on the following lines?

Quote:Original post by Razvan1024
For another method, you could render your model to some kind of G-buffer, as deferred rendering techniques do, and match the 3D position of each pixel with the 3D positions of your vertices.


This topic is closed to new replies.
