Painting textures in world

Hey all, I'm starting to write an application that lets the user paint decals and similar marks onto textures in a world. I need the user to be able to actually write and save the texture data, as opposed to just drawing decals over the original texture. In addition, the mesh shares one giant texture. My problem is that I'm not sure how to go about this: the worlds can be arbitrarily meshed (i.e. they could be smooth surfaces, or rocky/bumpy ones).

My two ideas were:

1) Raytrace in the world and write texels out, i.e.:

    foreach (texel in stamp_texture)
        raytrace into the scene
        get the u,v of the hit point
        write the texel to u,v in the texture

However, this approach is probably too slow to be feasible, as that is a lot of raytracing if the stamp is 256x256 or bigger (a rough sketch of this loop is at the end of the post).

2) Use some sort of projective texturing. I'm not sure of the math though - I imagine I would have to transform the hit triangles into projection space and then loop through and apply the texels again, but I can't see all the steps needed to do this.

So I was hoping someone smart could help get me on the right track =) Thanks for your help!
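
To make idea 1 a bit more concrete, this is roughly the shape of the loop I have in mind. It's just a sketch: every type here is a placeholder, and the raycast callback stands in for whatever intersection routine I end up with.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

// Placeholder pixel/image types, just to make the loop concrete.
struct Pixel { uint8_t r, g, b, a; };

struct Image
{
    int width = 0, height = 0;
    std::vector<Pixel> data; // row-major, width*height pixels
    Pixel Get(int x, int y) const      { return data[y * width + x]; }
    void  Set(int x, int y, Pixel p)   { data[y * width + x] = p; }
};

// Raycast callback: given a stamp texel (x, y), cast the corresponding ray
// into the world and return the hit point's texture coordinate in outU/outV.
// Returns false if the ray misses. This is the expensive part.
using RaycastTexel = std::function<bool(int x, int y, float& outU, float& outV)>;

void StampByRaytracing(const Image& stamp, Image& worldTexture,
                       const RaycastTexel& raycast)
{
    for (int y = 0; y < stamp.height; ++y)
    {
        for (int x = 0; x < stamp.width; ++x)
        {
            float u, v;
            if (!raycast(x, y, u, v))   // one ray per stamp texel
                continue;

            // Write the stamp texel at the hit u,v in the shared world texture.
            int tx = int(u * (worldTexture.width  - 1));
            int ty = int(v * (worldTexture.height - 1));
            worldTexture.Set(tx, ty, stamp.Get(x, y));
        }
    }
}
```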

I'm not sure I can be considered "someone smart", but here's an idea:

You could send a ray from the mouse pointer in the direction the camera is pointing and see whether it hits a mesh/triangle. Next, you calculate the barycentric coordinates of the exact point on the triangle where the ray hits. Then you interpolate between the triangle's UV coordinates to find where on the 2D texture the ray lands, and draw a dot on the texture there.

The D3DX library has a handy function for ray-triangle intersections, D3DXIntersectTri(), and one for saving textures back to a file, D3DXSaveTextureToFile().
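
Something like this, untested, assuming each of your vertices stores a position and a UV (the Vertex struct here is just an example layout):

```cpp
#include <d3dx9.h>

// Example vertex layout -- adjust to whatever your mesh actually stores.
struct Vertex
{
    D3DXVECTOR3 pos; // object-space position
    D3DXVECTOR2 uv;  // texture coordinate
};

// Returns true and writes the texture-space hit coordinate to outUV if the
// pick ray (rayPos, rayDir) intersects the triangle v0-v1-v2.
bool PickTexel(const Vertex& v0, const Vertex& v1, const Vertex& v2,
               const D3DXVECTOR3& rayPos, const D3DXVECTOR3& rayDir,
               D3DXVECTOR2* outUV)
{
    float bu, bv, dist; // barycentric weights and distance along the ray
    if (!D3DXIntersectTri(&v0.pos, &v1.pos, &v2.pos,
                          &rayPos, &rayDir, &bu, &bv, &dist))
        return false;

    // D3DXIntersectTri reports the hit as p = v0 + bu*(v1-v0) + bv*(v2-v0),
    // so the same weights interpolate the texture coordinates.
    *outUV = v0.uv + bu * (v1.uv - v0.uv) + bv * (v2.uv - v0.uv);
    return true;
}

// Once you've written your pixels into the texture, it can be saved with e.g.:
//   D3DXSaveTextureToFile("painted.png", D3DXIFF_PNG, pTexture, NULL);
```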

I was once trying to cook up a demo that did this entirely in hardware, but I kind of lost interest halfway through - I'm still convinced it can easily be done in hardware. Basically, you start out with some kind of shadow-mapping approach with a projective texture (you don't just do the lighting computations for the pixels in the light, you also project a texture from that viewpoint) - but that is only a preview: if you want to bake that bit into the actual texture, you need to set up render-to-texture and project the right texels into texture space to render to them. I recommend you read up on shadow mapping and try to truly understand it - the render-to-texture in texture space is just a small extension to that, really.

Edit:
More formally, my idea was to render the mesh using its texture coordinates as vertex positions, ignoring the transformation matrices for that pass, while doing the "shadowing" in camera space.
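
A rough, untested sketch of the per-vertex math that bake pass would do; the names here (brushViewProj and so on) are just placeholders, and in practice this would live in a vertex shader with the mesh's shared texture bound as the render target:

```cpp
#include <d3dx9.h>

// Output of the "bake in texture space" vertex stage.
struct BakeVertexOut
{
    D3DXVECTOR4 clipPos;  // position in the render target (the mesh's texture)
    D3DXVECTOR4 brushUV;  // projective coordinate into the stamp/brush texture
};

BakeVertexOut BakeVertex(const D3DXVECTOR3& worldPos,      // vertex position in world space
                         const D3DXVECTOR2& uv,             // vertex texture coordinate
                         const D3DXMATRIX&  brushViewProj)  // view*proj of the brush "projector"
{
    BakeVertexOut out;

    // Unwrap: place the vertex in the render target according to its UVs,
    // mapping [0,1] texture space to [-1,1] clip space (Y flipped for D3D).
    out.clipPos = D3DXVECTOR4(uv.x * 2.0f - 1.0f,
                              1.0f - uv.y * 2.0f,
                              0.0f, 1.0f);

    // Project the world-space position through the brush's camera, exactly
    // like a shadow-map lookup; the pixel stage divides by w and samples the
    // stamp texture with the result.
    D3DXVECTOR4 wp(worldPos.x, worldPos.y, worldPos.z, 1.0f);
    D3DXVec4Transform(&out.brushUV, &wp, &brushViewProj);
    return out;
}
```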
