new fast ray tracing algorithm

Started by
50 comments, last by janos 19 years, 6 months ago
If you are interested in using this algorithm for any purpose, please contact me via this e-mail: sinnlabs@list.ru. Full documentation is here.

Tracing the rays for fun and profit: The ways of spherical ray tracing
by (--Sinn-Labs--)
-----------------------------------------------------------------------
Intro:
-----------------------------------------------------------------------
Authors: Alex "krio" V. Kossiakov, student of Moscow State University, Faculty of Computational Mathematics and Cybernetics
-----------------------------------------------------------------------
Copyrights: © Alex V. Kossiakov, all rights reserved. If you read this document, then everything in it is copyrighted and protected by law. You may freely distribute this document only without any modifications. The original copyrighted document in Russian MUST be distributed with this document, attached after the "[EOF]" marker. The attached document contains the main concept of the algorithm, so you may not use it, or any part of it, commercially without my permission.
-----------------------------------------------------------------------
About Sinn Labs: Born in the depths of Moscow school 1134, migrated to Moscow State University. With this document we hope to raise some funding for our next topic (which might blow you away). That is why we had to copyright this "technology".
-----------------------------------------------------------------------
About the topic: Ray tracing has always been a very expensive technique in computer graphics. In fact, everybody thought it could not be matched with the traditional graphics pipeline.
But that is all in the past, and this document should explain why.
-----------------------------------------------------------------------
Special thanks: Kevin R. Harris for his OpenGL samples (www.codesampler.com)
-----------------------------------------------------------------------
Let's go!

Spherical coordinates represent a 3D point with three parameters: two angles, alpha and beta, and the distance r to the point. Nothing really new here, because we all remember how to generate a sphere:

x = r * cos(b) * cos(a)
y = r * cos(b) * sin(a)
z = r * sin(b)

What if we take alpha and beta to be proportional to our x and y on the screen (alpha = x*psi, beta = y*phi)? Yes! Lines will still be lines, so triangles will still be triangles! What does this mean? It means we can forget about distances for a while. It looks like a spherical projection: we take a part of that sphere as our screen, with the eye at the origin. Notice that we get a perspective-correct image.

But: for each frame we would have to calculate millions of arctangents to know each vertex's alpha and beta. No problem. Let's just use the standard algorithm to get a perspective-nice image: instead of arctangents we simply divide x and y by r (let's call it "ortho" projection).

Well, OK. But let's look at what a plane looks like in spherical coordinates:

Ax + By + Cz + D = 0

means

r * (A * cos(b) * cos(a) + B * cos(b) * sin(a) + C * sin(b)) + D = 0
r * k + D = 0
r = -D / k

Good, but we must calculate lots of sines and cosines, and that would take a lot of time... If alpha and beta depend on the screen's x and y, then we can simply precalculate them and put them in an array of, say, 640x480x3 elements: cos(b)*cos(a); cos(b)*sin(a); sin(b).

And how should we calculate A, B, C and D?
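The idea above can be sketched in a few lines. This is purely illustrative Python: the function names and the 60°x45° field of view are my own assumptions, not from the post. Precompute the per-pixel direction table cos(b)cos(a), cos(b)sin(a), sin(b) once; after that, the distance r to any plane Ax + By + Cz + D = 0 at a pixel is just one dot product and one division.

```python
import math

WIDTH, HEIGHT = 640, 480
FOV_X, FOV_Y = math.radians(60), math.radians(45)

def build_direction_table():
    """Precompute, once, the unit direction for every pixel.
    alpha/beta are linear in screen x/y, as the text assumes."""
    table = []
    for y in range(HEIGHT):
        beta = (y / (HEIGHT - 1) - 0.5) * FOV_Y
        row = []
        for x in range(WIDTH):
            alpha = (x / (WIDTH - 1) - 0.5) * FOV_X
            row.append((math.cos(beta) * math.cos(alpha),
                        math.cos(beta) * math.sin(alpha),
                        math.sin(beta)))
        table.append(row)
    return table

def plane_distance(plane, direction):
    """Distance from the eye (at the origin) to the plane
    Ax + By + Cz + D = 0 along the pixel's ray: r = -D / k,
    where k is the dot product of (A, B, C) with the direction."""
    A, B, C, D = plane
    dx, dy, dz = direction
    k = A * dx + B * dy + C * dz
    return None if abs(k) < 1e-12 else -D / k
```

The table costs memory (three floats per pixel) but is built once per resolution, not per frame, which is exactly the trade the post proposes.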
Say we have all three vertices of a triangle, M1(x1,y1,z1), M2(x2,y2,z2), M3(x3,y3,z3):

A = y1 (z2 - z3) + y2 (z3 - z1) + y3 (z1 - z2)
B = z1 (x2 - x3) + z2 (x3 - x1) + z3 (x1 - x2)
C = x1 (y2 - y3) + x2 (y3 - y1) + x3 (y1 - y2)
-D = x1 (y2 z3 - y3 z2) + x2 (y3 z1 - y1 z3) + x3 (y1 z2 - y2 z1)

So we can calculate the distance from the eye to any pixel on any triangle fast enough. Now, how do we use all this? How do we draw our scene? Personally, I was shocked when I realized that classical ray tracing brute-forces every pixel on the screen: at every pixel we search for the triangles that intersect that ray, and then look for the triangle that is nearest. What can we do instead of this brute force? Let's try it this way.

Per frame, before rendering: for each triangle, calculate all the data above AND where that triangle should actually be rendered on the screen: for each y in the triangle's bounding box, precalculate the xmin and xmax where that scanline intersects our triangle. Just basic rasterization!

When rendering: for each triangle; for each y in its bounding box; for each x from xmin to xmax: check that the pixel is on the screen; check whether the depth buffer at that x and y holds the ID of our triangle. If so, we have an intersection and can do our per-pixel calculations.

That is all for primary rays. What about the secondary ones? We can do exactly the same, but first we have to recalculate the spherical coordinates with the origin at the point the secondary ray starts from. Oh, and normals are calculated by ortho-projecting the {A,B,C} vector. And the reflection angle is just a simple linear combination of the primary ray's angles and the normal's angles.
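The two per-frame steps just described can be sketched as follows, again in illustrative Python with hypothetical names: build the plane coefficients from the three vertices using the formulas quoted above, and precompute per-scanline [xmin, xmax] bounds (the "basic rasterization" pre-pass).

```python
def plane_from_triangle(m1, m2, m3):
    """Plane coefficients A, B, C, D through three points, using
    exactly the expansion quoted in the text."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = m1, m2, m3
    A = y1 * (z2 - z3) + y2 * (z3 - z1) + y3 * (z1 - z2)
    B = z1 * (x2 - x3) + z2 * (x3 - x1) + z3 * (x1 - x2)
    C = x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)
    D = -(x1 * (y2 * z3 - y3 * z2)
          + x2 * (y3 * z1 - y1 * z3)
          + x3 * (y1 * z2 - y2 * z1))
    return A, B, C, D

def scanline_bounds(p1, p2, p3):
    """Per-scanline [xmin, xmax] for a triangle given in integer
    screen (angle-grid) coordinates: walk each edge and record the
    extremes per row. A crude stand-in for a real scanline fill."""
    ymin = min(p1[1], p2[1], p3[1])
    ymax = max(p1[1], p2[1], p3[1])
    bounds = {y: [float('inf'), float('-inf')] for y in range(ymin, ymax + 1)}
    for (x0, y0), (x1, y1) in ((p1, p2), (p2, p3), (p3, p1)):
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            bounds[y][0] = min(bounds[y][0], x)
            bounds[y][1] = max(bounds[y][1], x)
    return bounds
```

The render loop then only visits pixels inside some triangle's bounds, which is where the claimed win over per-pixel brute force comes from.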
Just picture a line through these two 2D angle points: what we need is a third point on that line that is as far from the normal's angles as the ray's angles are, arranged in one of two orders: ray's - normal's - reflection's angles; or: reflection's - normal's - ray's angles. Refraction is calculated in a similar way. Just draw a 2D projection of the angles as dots on paper and see it for yourself.

What about spheres? Nothing special. Only the fact that we should calculate r this way: -r^2 = alpha^2 + beta^2 - sphere_radius^2. Normals should be represented as angles and calculated with the ortho projection; luckily we can convert our spherical coordinates to standard ones with a few multiplies. From this point on, reflections and refractions are calculated the same way as for triangles.

How to code it on the GPU? There are actually a number of ways, and I'm not sure yet which one is best. Now just imagine how fast it would work if Nvidia/ATI used it directly in the rasterization process to pass the data (each pixel's distance or 3D location) to the pixel shaders... Until that happens, you can try passing that data in a texture, or three, depending on what depth quality you are aiming at. There are lots of other things for them to think about, and that's their job.

This really does work faster than traditional ray tracing. I did a lot of test coding and got amazing results, but we are not distributing the sources, simply because I'm not the best coder around and I had a very tight time limit.
-----------------------------------------------------------------------
Outro: Everything that is imaginable IS possible.
-----------------------------------------------------------------------
[EOF]

[ATTACHED ORIGINAL COPYRIGHTED DOCUMENT]

Spherical ray tracing.
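The angle-space reflection rule above reduces, per component, to mirroring the ray's angles about the normal's angles. A sketch (the name is mine; note this is the author's construction, not the usual Cartesian reflection r = d - 2(d·n)n, and the post offers no proof the two agree):

```python
def reflect_angles(ray, normal):
    """Reflected direction in (alpha, beta) angle space, as the text
    describes it: a third point on the line through the ray's and the
    normal's 2D angles, as far from the normal's angles as the ray's.
    Per component that is simply 2*normal - ray."""
    ra, rb = ray
    na, nb = normal
    return (2 * na - ra, 2 * nb - rb)
```

One sanity property the construction does have: reflecting twice about the same normal returns the original ray.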
The standard ray-tracing algorithm, or the method of tracing rays, was proposed by Arthur Appel back in 1968 and extended by the general recursion algorithm developed by Whitted in 1980. Essentially it comes down to this: for each pixel on the screen, in Cartesian coordinates, a ray is cast and intersections with the triangles' planes are searched for; then for each intersection one checks whether the ray actually hits the triangle. The first hit along the parametrically defined ray gives the pixel whose color must then be computed. Other variations of this algorithm are similarly computationally expensive. Pixel shaders, which are also used, are fast enough, and all recent video cards are built specifically for them; but it is impossible to get proper access from a shader to the rest of the scene geometry. For standard triangles the method works, but not for curved geometry such as spheres. The visual quality achievable with pixel shaders cannot compare even with traditional ray tracing.

The essence of the proposed spherical ray tracing: for each primitive, the needed spherical coordinates are computed once per frame. This can be done either with the standard perspective transform for ordinary Cartesian coordinates, or through the formulas converting Cartesian coordinates to spherical. In the second case we obtain a perspective-correct projection. As a result, the angular components of the spherical coordinates in the plane depend linearly on the coordinates on our screen. No proof is given here that all the relevant properties are preserved under this analogy, because there are many ways to prove it, it is beyond the scope of this text, and besides, it has been verified experimentally. Then standard rasterization is performed, for example a scan-line algorithm, but in the angular components of the spherical coordinates.
After that we can loop over every primitive and trace only the rays that hit those primitives: we iterate over each scanline from the minimum to the maximum, and within each line from the previously computed minimum column to the maximum one. For each such hit the distance is computed and compared against the depth buffer. After that the hit is considered valid, and texture coordinates, secondary and other rays, and other properties are computed in order to determine the color of that pixel. In this way we almost completely get rid of the brute force inherent in ray tracing, while keeping all of its capabilities and adding many further optimizations.

The main distinctive features of the new algorithm, which can be applied separately in a ray-tracing implementation:
1. using spherical coordinates for tracing the primary ray.
2. using spherical coordinates for tracing secondary rays and all other rays.
3. precomputing which pixels of the working plane belong to which primitives (pre-rasterization) by moving to the working plane, either from spherical coordinates or from Cartesian ones.
4. using the pre-rasterization for per-primitive tracing.
5. computing the angles of the reflected and refracted rays as a linear combination of the 2D angles of the previous ray and of the normal.
6. pre-rasterizing the primitives in the coordinates of secondary and other rays in order to trace them.

Many of these features can be implemented, or already are implemented, in other, non-ray-tracing, rendering methods. For example, spherical coordinates can be used in the implementation of standard shaders.

The main features of the new algorithm make it possible to:
1. not compute rays parametrically.
2. use the angular components of the spherical coordinates as the pixel's coordinates on the screen.
3. consequently, apply standard rasterization and buffering algorithms to ray tracing.
4.
consequently trace not every pixel on the screen, but only the pixels that fall inside a triangle or another primitive.
5. as a result, draw by looping over all the primitives in the scene, over the pixels that fall inside each primitive, while knowing for every pixel in a primitive its real coordinates.
6. apply to each pixel on the screen only those computations that are needed for its secondary rays, texture coordinates, and so on. All other computations can be done once per frame for each primitive as part of its rasterization, if the rasterization properties have changed since the previous frame.
7. compute secondary rays, shadow rays, and so on in spherical coordinates as well.
8. perform the transformations without converting from spherical to Cartesian coordinates and back.

Abstract. This algorithm is a major step toward real-time rendering with high image quality. On the CPU a substantial speedup of ray tracing can be obtained. If this method can be implemented on the GPU, very good results could be achieved, but that may require a reorganization of the pipeline.

[END OF ATTACHED DOCUMENT, EVERYTHING ABOVE THIS LINE IN THE ATTACHED DOCUMENT IS COPYRIGHTED, ALEX V. KOSSIAKOV: ©2004 Aleksei V. Kosyakov]
Why did you post gibberish instead of English, French, Spanish, Japanese, [insert language that my browser actually reads and I can translate if necessary]?


So far I'm not impressed. For primary rays it's probably nice, but then you might as well just rasterize. For secondary rays I see no advantage whatsoever, and it certainly wouldn't be able to compete with a decent space-partitioning method.
The problem is, for primary rays it's like rasterization, and for secondary rays there's no advantage. Also, how can using spherical coordinates give any significant benefit over using projected coordinates on the screen plane? It's faster to project each triangle onto the screen plane before rendering and then rasterize the usual way. I don't really see a difference between the two methods, except that yours uses angles while standard rasterization uses coordinates on the screen plane; and coordinates on the screen plane also define directions, just like angles, except that it's a lot faster to project onto the screen than to compute angles. In summary: angles are evil :\, and they are avoided for a good reason.
Quote:Original post by Alpha_ProgDes
Why did you post gibberish instead of English, French, Spanish, Japanese, [insert language that my browser actually reads and I can translate if necessary]?

What gibberish? You simply don't have a Cyrillic font installed.
You don't understand me. Thanks to spherical coordinates I know the angles of the secondary rays, and then WITH THE SAME ALGORITHM I trace intersections with planes (floor, wall, etc.). All parts of the primary-ray algorithm are reused when tracing secondary rays: the rasterization, and angles as "screen" coordinates. The advantage of spherical coordinates is the simplicity of the algorithm for finding the angles of secondary rays (refractions, reflections). We also get rid of the parametric ray specification.
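The re-centering step krio describes (recompute the spherical coordinates with the origin moved to the hit point) might look like this. Python and the function name are my own; the formulas simply invert the x = r cos(b)cos(a), y = r cos(b)sin(a), z = r sin(b) parametrization from the original post:

```python
import math

def angles_from(origin, point):
    """(alpha, beta) of `point` as seen from `origin`: the
    'recalculate the spherical coordinates with a zero at the point
    where the second ray comes from' step."""
    dx = point[0] - origin[0]
    dy = point[1] - origin[1]
    dz = point[2] - origin[2]
    alpha = math.atan2(dy, dx)                  # y/x = tan(alpha)
    beta = math.atan2(dz, math.hypot(dx, dy))   # z/(r*cos(beta)) = tan(beta)
    return alpha, beta
```

Note this is exactly the arctangent work the original post tries to avoid for primary rays; for secondary rays it has to be done per hit point.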

But the best thing here is the use of ray tracing in/after the rasterization process. The advantage of this algorithm over shaders is access to the geometry, textures, or any other scene data at any moment of tracing, AND we can "shade" true spheres and other non-linear surfaces.

Especially for you, Dmitry: spherical coordinates make computing the angles of secondary rays simple. We get rid of the parametric specification of rays, and rasterization is used for the secondary coordinates as well, only now for a single point.


p.s. If you don't like angles - don't use them.
So the main idea is using rasterisation for first-hit optimisation of ray tracing? There's nothing wrong with that, but the idea is old as the hills. And if you're doing anything interesting, like global illumination (why would you ray trace otherwise?) first hit is a small part of performance anyway.
"That is all for primary rays.. What's up with the secondary? We can do just exactly the same, but at first we only have to recalculate the spherical coordinates with a zero at the point where the second ray comes from."
What is actually being said here is that we can do this with the "ortho" projection (basic rasterization with standard perspective projection) and then work with it as with a spherical projection. Everything speeds up again compared with parametric brute forcing, including global illumination, reflections and refractions.
Quote:Original post by krio
"That is all for primary rays.. What's up with the secondary? We can do just exactly the same, but at first we only have to recalculate the spherical coordinates with a zero at the point where the second ray comes from."
What is actually being said here is that we can do this with the "ortho" projection (basic rasterization with standard perspective projection) and then work with it as with a spherical projection. Everything speeds up again compared with parametric brute forcing, including global illumination, reflections and refractions.

Ortho projection isn't perspective projection, standard or otherwise.
Quote:
With this document we hope to raise some funding for our next topic (which might blow you away). That is why we had to copyright this "technology".

Erm. Copyright doesn't stop me from using your algorithm.
CoV
Quote:Original post by krio
You don't understand me. Thanks to spherical coordinates I know the angles of the secondary rays, and then WITH THE SAME ALGORITHM I trace intersections with planes (floor, wall, etc.). All parts of the primary-ray algorithm are reused when tracing secondary rays: the rasterization, and angles as "screen" coordinates. The advantage of spherical coordinates is the simplicity of the algorithm for finding the angles of secondary rays (refractions, reflections). We also get rid of the parametric ray specification.

Refractions and reflections are as easy as it gets in Cartesian coordinates. Also, don't you need a complete re-rasterization for every secondary ray? They don't have the convenience of all originating from the same point.

Quote:
But the best thing here is the use of ray tracing in/after the rasterization process. The advantage of this algorithm over shaders is access to the geometry, textures, or any other scene data at any moment of tracing, AND we can "shade" true spheres and other non-linear surfaces.

Now you've really lost me.
Quote:p.s. If you don't like angles - don't use them.
i k

This topic is closed to new replies.
