How do I read a CubeMap on the CPU side? (Or how does the GPU texCUBE function read a pixel from a CubeMap?)

2 replies to this topic

#1 tpastor   Members

Posted 14 April 2012 - 07:37 AM


I need to extract a single pixel from a CubeMap (I already have the 6 face textures extracted) using the normal vector. How can I achieve this?

I want something very similar to what texCUBE does on the GPU, but on the CPU.

I am using XNA and C#, but an example in any language may help.


#2 mrhyperpenguin   Members

Posted 14 April 2012 - 09:27 AM

The component of the normal vector with the largest magnitude tells you which face it intersects. Next, divide the normal vector by the absolute value of that largest component. Then scale and shift the other two components into the usual range for UVs ([0, 1] on each axis).

For example, take the vector <-3, -1, 2>.

The component with the largest magnitude is -3, so the vector intersects the negative x face. Then

<-3, -1, 2> / |-3| = <-1, -1/3, 2/3>.

Finally, scale the remaining two components from [-1, 1] to [0, 1]: <-1/3, 2/3> * 0.5 + 0.5 = <1/3, 5/6>. The same procedure applies to each of the other faces.
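Since you're on XNA/C#, here is a minimal sketch of the idea above. It assumes you have pulled each face into a Color[] with Texture2D.GetData, that the six arrays are ordered like CubeMapFace (PositiveX, NegativeX, PositiveY, NegativeY, PositiveZ, NegativeZ), and that the faces use the usual Direct3D u/v orientation. Which two components become u and v (and their signs) differs per face under that convention, so the numbers won't match the simplified example above exactly; you may need to tweak the signs to match how your faces were extracted.

using System;
using Microsoft.Xna.Framework;

static class CpuCubeSampler
{
    // faces[i] is one extracted face (size x size pixels, row-major),
    // ordered PositiveX, NegativeX, PositiveY, NegativeY, PositiveZ, NegativeZ.
    public static Color Sample(Color[][] faces, int size, Vector3 dir)
    {
        float ax = Math.Abs(dir.X), ay = Math.Abs(dir.Y), az = Math.Abs(dir.Z);
        int face;
        float ma, u, v; // ma = magnitude of the major axis

        if (ax >= ay && ax >= az)       // +X or -X face
        {
            ma = ax;
            face = dir.X > 0 ? 0 : 1;
            u = dir.X > 0 ? -dir.Z : dir.Z;
            v = -dir.Y;
        }
        else if (ay >= az)              // +Y or -Y face
        {
            ma = ay;
            face = dir.Y > 0 ? 2 : 3;
            u = dir.X;
            v = dir.Y > 0 ? dir.Z : -dir.Z;
        }
        else                            // +Z or -Z face
        {
            ma = az;
            face = dir.Z > 0 ? 4 : 5;
            u = dir.Z > 0 ? dir.X : -dir.X;
            v = -dir.Y;
        }

        // Divide by the major axis, then remap [-1, 1] -> [0, 1].
        float s = 0.5f * (u / ma + 1.0f);
        float t = 0.5f * (v / ma + 1.0f);

        // Nearest-neighbour texel lookup (texCUBE would also filter).
        int x = Math.Min(size - 1, (int)(s * size));
        int y = Math.Min(size - 1, (int)(t * size));
        return faces[face][y * size + x];
    }
}

You would fill faces[] by calling GetData on each extracted face texture and then call Sample(faces, size, normal). Note that texCUBE also does bilinear filtering across neighbouring texels (and across face edges at the seams), which this sketch doesn't attempt.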

Victor

#3 tpastor   Members

Posted 14 April 2012 - 09:45 AM

Thanks a lot!



