

thurber

Member Since 28 Feb 2013
Offline Last Active Jul 07 2013 04:24 AM

Posts I've Made

In Topic: Rendering to a texture with vertices at UV coordinates

23 April 2013 - 06:29 AM

Hi,

 

I thought I'd respond to this thread in case it helps anyone solve the same issue that I was encountering. I finally worked out that if I rendered in wireframe mode, the edge pixels would always get drawn. When combined with solid rendering, this covers the entire UV area, so the texture looks correct when sampled.
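
To make that concrete, here's a rough sketch of the bake pass in D3D9: the mesh is drawn into the render target once with solid fill and once in wireframe. The device, render target, and mesh names are just placeholders for my own setup, so treat this as an outline rather than the exact code:

```cpp
#include <d3d9.h>
#include <d3dx9.h>

// Sketch of the two-pass bake: the solid pass covers triangle interiors in UV
// space, and a second wireframe pass forces the edge pixels to be rasterized
// so border texels aren't left empty. 'device', 'bakeTarget' and 'mesh' are
// placeholder names.
void RenderBakePass(IDirect3DDevice9* device, IDirect3DSurface9* bakeTarget, ID3DXMesh* mesh)
{
    device->SetRenderTarget(0, bakeTarget);
    device->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_ARGB(0, 0, 0, 0), 1.0f, 0);

    device->BeginScene();

    // Pass 1: solid rendering fills the interior of each UV-space triangle.
    device->SetRenderState(D3DRS_FILLMODE, D3DFILL_SOLID);
    mesh->DrawSubset(0);

    // Pass 2: wireframe rendering guarantees the edge pixels get drawn.
    device->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
    mesh->DrawSubset(0);

    // Restore the default fill mode for normal rendering.
    device->SetRenderState(D3DRS_FILLMODE, D3DFILL_SOLID);

    device->EndScene();
}
```

The vertex shader bound during these draws just outputs the UV coordinates as the clip-space position; nothing changes between the two passes except the fill mode.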

 

In case you're interested, my aim was to bake ambient occlusion fields to textures. I've written this up in a blog post here: http://blog.mattdev.com/baked-ambient-occlusion-fields-in-cloud-racer/.

 

Thanks again for the useful discussion, it helped set me on the right path to work this out. Finally, some pictures...

 

Here is just solid rendering:

[Image: rasterization_1-1024x582.png]

Here is the wireframe rendering:

[Image: rasterization_2-1024x582.png]

When combined, you get this:

[Image: rasterization_3-1024x582.png]


In Topic: Rendering to a texture with vertices at UV coordinates

03 March 2013 - 08:42 AM

Thanks for the replies, Hodgman & Jason Z!

 

"Are you rendering this texture in D3D9, and then just viewing the results in Softimage?"

 

Yes. I've also got the results rendering in my game (and looking the same), but I found the texture coordinate view in Softimage useful - and obviously it's safer to use their rendering code, in case something is wrong in another part of my setup :).

 

Your comments about differences in rasterization versus texture sampling have given me some ideas. I'll do some more research into that, and post again once I've got some results.
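
For reference, the usual D3D9 gotcha in this area is that pixel centres sit at integer coordinates while texel centres sit at half-texel offsets, so mapping UVs straight to clip space lands everything half a texel away from where sampling expects it. A rough sketch of the corrected mapping (the helper name is just illustrative):

```cpp
#include <d3dx9.h>

// Illustrative helper: convert a UV coordinate into a D3D9 clip-space
// position for baking into a texture of the given size. The 1/width and
// 1/height terms apply the half-pixel offset that aligns rasterized pixels
// with texel centres under D3D9's conventions.
D3DXVECTOR4 UvToClipSpace(float u, float v, float texWidth, float texHeight)
{
    float x = u * 2.0f - 1.0f - 1.0f / texWidth;   // shift left by half a pixel
    float y = 1.0f - v * 2.0f + 1.0f / texHeight;  // clip-space Y points up, so flip and shift
    return D3DXVECTOR4(x, y, 0.0f, 1.0f);
}
```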

 

Cheers

