ITboy_Lemon

Member
  • Content Count

    10
  • Joined

  • Last visited

Community Reputation

224 Neutral

About ITboy_Lemon

  • Rank
    Member
  1. Maybe I should get that book too. Thanks for the great material about PBR.

     Thought I might give an update on this. I started reading PBR: From Theory to Implementation. As Radikalism said, pbrt is their code example. It seems they're building more of a non-real-time "film" renderer. Some of the picture examples in the book are so good I can't tell they aren't real even when looking carefully, although they may have purposefully used materials that render well; other materials might not look as real. But one picture I'm thinking of said it took the computer 34 hours to render using pbrt. That's a lot slower than 60 fps. :-)

     You got me really interested in this subject and I've been focused on it for a couple of days now. After looking around a bit more, I think I may have found a better book for further study. It covers Cook-Torrance and the material on the second page of the link you originally gave (which I've taken the time to read over the past couple of days). The book is "Real-Time Rendering", and the latest edition appears to be the third, written back in 2008, almost a decade ago. I would think it's time for an update after a decade of advancements in rendering, but the "advanced" section of the book seems to be only now making it into modern games, and it covers the PBR building blocks like Cook-Torrance, Lambert, and Fresnel. I think I've had this book on my shelf since it was first published in 2008; I know I've had it for a long time. It was one of the better resources when I was learning Blinn-Phong, but back then I went directly to the chapters covering Blinn-Phong in each book I worked from rather than reading cover to cover. I've now started reading it cover to cover so I can get a more complete understanding of the whole field of real-time rendering, and specifically the PBR algorithms like Cook-Torrance.

     It looks like it may be a better book for learning how to do this for game engines than "PBR: From Theory to Implementation", although I only started reading that and abandoned it once it looked like it was aimed at off-line rendering.

     Yes, Real-Time Rendering is an awesome book. As far as I know, most real-time algorithms are hacks and tricks approximating the physically accurate way. So if you want to know why such algorithms work the way they do, learning something like ray tracing from off-line rendering is very helpful!
  2. Maybe I should get that book too. Thanks for the great material about PBR.
  3. Physically based doesn't mean physically accurate, just that it is based on real physics (unlike, e.g., Phong shading, which was based on intuition, or the standard implementation of Blinn-Phong, which is based on a solid theoretical foundation but blatantly violates conservation of energy). However, yes, comparing any PBR technique to the real world lets us measure how correct it is. Or, in games we often compare our PBR renderers to existing film-quality PBR renderers :lol:

     No. Rendering covers the whole system (lighting, geometry, shadows, shading, post processing, etc.) whereas shading is just the interaction between lights and materials. PBS is a small part of PBR. I haven't read your two articles yet, but just to be clear, PBS/PBR are not nouns; they are not a specific thing. PBS/PBR are adjectives that can describe endless things. Two different games might both say "We do PBR!", but their algorithms/shaders/code are probably completely different from each other, because what they mean is "our renderer uses some algorithms that are based on approximations of physics".

     As for combining direct vs indirect lighting: all lighting is additive. Indirect lighting is just another type of light source, like spot, point, and directional lights. In the real world there is a lot of indirect lighting going on. When you reproduce a real-world material using PBS but only light it with direct lights (point, spot, etc.), it tends to look very wrong, because that lighting situation is unrealistic. So once you have PBS, figuring out how to do more and more PBR (such as indirect lighting) becomes important. There is no one standard implementation of PBS, but the most common techniques at the moment are covered in the Unreal 4 rendering course notes by Brian Karis.

     Thank you for your reply. From it I take away two important points: 1. PBS is not a specific algorithm. 2. In the real world, lighting contains both direct and indirect components, so if you want to make your graphics more realistic, you should implement as much indirect lighting as you can.
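     The "all lighting is additive" point above can be sketched in a few lines. This is a minimal illustration of my own (the struct and function names are made up, not from either article): each direct light contributes a Lambert term, and the indirect contribution, wherever it comes from (constant ambient, environment map, GI), is simply added on top.

     ```cpp
     #include <cmath>
     #include <cstdio>

     struct Vec3 { float x, y, z; };

     static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
     static Vec3 add(Vec3 a, Vec3 b)  { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
     static Vec3 mul(Vec3 a, Vec3 b)  { return {a.x*b.x, a.y*b.y, a.z*b.z}; }
     static Vec3 scale(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }

     // Lambert diffuse contribution from one direct light.
     Vec3 directLambert(Vec3 albedo, Vec3 normal, Vec3 lightDir, Vec3 lightColor) {
         float ndotl = std::fmax(dot(normal, lightDir), 0.0f);
         return scale(mul(albedo, lightColor), ndotl);
     }

     int main() {
         Vec3 albedo = {0.5f, 0.5f, 0.5f};
         Vec3 normal = {0.0f, 1.0f, 0.0f};
         // One direct light straight overhead...
         Vec3 direct   = directLambert(albedo, normal, {0, 1, 0}, {1, 1, 1});
         // ...plus an indirect term (e.g. fetched from an environment map).
         Vec3 indirect = mul(albedo, Vec3{0.2f, 0.2f, 0.2f});
         // Additive combination: direct + indirect.
         Vec3 final = add(direct, indirect);
         std::printf("%.2f %.2f %.2f\n", final.x, final.y, final.z); // prints 0.60 0.60 0.60
         return 0;
     }
     ```

     In a real renderer the indirect term is far more involved (prefiltered environment maps, light probes, etc.), but the combination step stays this simple sum.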
  4. Thank you for your reply. What you describe is ray tracing, which, as far as I know, is not real-time, so I can't use it in a game engine. What I really want to know about here is physically based shading, like the lighting calculations in Unreal and Unity. http://blog.selfshadow.com/publications/s2014-shading-course/ is the link I'm talking about. I'm sorry if I've mixed up physically based shading and physically based rendering, but as I understand it, PBS is the approximate path toward PBR. Am I right?
  5. Hi guys,

     Recently I've been doing some research on physically based shading (PBS/PBR). I found two articles with implementation details:

     (1) http://www.codinglabs.net/article_physically_based_rendering.aspx
     (2) "Advanced Lighting and Shading with Direct3D 9" from the book ShaderX2: Introductions and Tutorials with DirectX 9 (sorry, I could not find this article on the internet).

     In (1), an environment map is used for the lighting calculation, while (2) uses lights that we define (point lights, directional lights, and so on).

     My questions are:

     (1) What is the difference between the two methods?
     (2) In my understanding, (1) does indirect lighting and (2) does direct lighting. Is that right?
     (3) If so, do I need both of them in my lighting calculation, and how do I combine them in the final result?

     If anyone knows about this, please teach me. Thanks very much!
  6. ITboy_Lemon

    How to rotate the texture coordinate in the HLSL

    Thank you for your patience. Here is my new Flare.fx, and it works now, but I still have some questions.

    //-------------------------------------------------------------------------------------
    // declaration : Copyright (c) 2013, XJ. All rights reserved.
    // brief       : This effect file makes a flare effect.
    // author      : XJ
    // file        : Flare.fx
    // date        : 2013/10/27
    //-------------------------------------------------------------------------------------
    // global variables
    uniform extern float4x4 gWVP;
    uniform extern float    gRotationFlare;
    uniform extern float    gRotationBlend;
    uniform extern texture  gTexFlare;
    uniform extern texture  gTexBlend;

    // vertex shader output
    struct OutputVS
    {
        float4 posH     : POSITION0;
        float2 texFlare : TEXCOORD0;
        float2 texBlend : TEXCOORD1;
    };

    // samplers
    sampler BlendTex = sampler_state
    {
        Texture   = <gTexBlend>;
        MinFilter = LINEAR;
        MagFilter = LINEAR;
        MipFilter = LINEAR;
        AddressU  = WRAP;
        AddressV  = WRAP;
    };
    sampler FlareTex = sampler_state
    {
        Texture   = <gTexFlare>;
        MinFilter = LINEAR;
        MagFilter = LINEAR;
        MipFilter = LINEAR;
        AddressU  = WRAP;
        AddressV  = WRAP;
    };

    // vertex shader
    OutputVS FlareVS(float3 posL : POSITION0, float2 tex : TEXCOORD0)
    {
        OutputVS outVS = (OutputVS)0;

        // Transform the local position to homogeneous clip space.
        outVS.posH = mul(float4(posL, 1.0f), gWVP);

        // Translate so the texture center (0.5, 0.5) is at the origin.
        outVS.texFlare = tex - 0.5f;
        outVS.texBlend = tex - 0.5f;

        // Rotate about the origin.
        float cFlare = cos(gRotationFlare);
        float sFlare = sin(gRotationFlare);
        float cBlend = cos(gRotationBlend);
        float sBlend = sin(gRotationBlend);
        outVS.texFlare = mul(outVS.texFlare, float2x2(cFlare, -sFlare, sFlare, cFlare));
        outVS.texBlend = mul(outVS.texBlend, float2x2(cBlend, -sBlend, sBlend, cBlend));

        // Translate back so the center is at (0.5, 0.5) again.
        outVS.texFlare += 0.5f;
        outVS.texBlend += 0.5f;

        return outVS;
    }

    // pixel shader
    float4 FlarePS(float2 texFlare : TEXCOORD0, float2 texBlend : TEXCOORD1) : COLOR
    {
        // Fetch the texels.
        float3 flare = tex2D(FlareTex, texFlare).rgb;
        float3 blend = tex2D(BlendTex, texBlend).rgb;

        // Calculate the final color.
        float3 color = blend * flare;
        return float4(color, 1.0f);
    }

    // technique
    technique FlareTech
    {
        pass P0
        {
            vertexShader = compile vs_2_0 FlareVS();
            pixelShader  = compile ps_2_0 FlarePS();
        }
    }

    Here are my questions:
    1. I did not use the 3x3 matrix; is there something wrong, or a pitfall, with the 2x2?
    2. I still use the same approach as in the first Flare.fx: I translate by -0.5, rotate the texture coordinates, and at the end translate back by +0.5. I did not transform into the rotation coordinate system as you mentioned; is there something wrong with that?

    Thank you again!!!
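    On question 1 above: a 2x2 matrix can only rotate about the origin, which is exactly why the -0.5/+0.5 translations are needed around it; a 3x3 homogeneous matrix can fold both translations and the rotation into a single transform. Here is a small C++ sketch of my own (not from the book) checking that the two formulations agree:

    ```cpp
    #include <cmath>
    #include <cstdio>

    // Rotate a UV coordinate about the texture center (0.5, 0.5) two ways:
    // (a) translate, 2x2 rotate, translate back (as in the shader above);
    // (b) one 3x3 homogeneous matrix equal to T(0.5) * R(a) * T(-0.5).
    // Both agree, so the 2x2 form is fine as long as the explicit
    // translations are kept around it.
    struct UV { float u, v; };

    UV rotate2x2(UV t, float a) {
        float c = std::cos(a), s = std::sin(a);
        float u = t.u - 0.5f, v = t.v - 0.5f;     // translate center to origin
        UV r = { u * c - v * s, u * s + v * c };  // rotate about the origin
        r.u += 0.5f; r.v += 0.5f;                 // translate back
        return r;
    }

    UV rotate3x3(UV t, float a) {
        float c = std::cos(a), s = std::sin(a);
        // Row-major 3x3: the product T(0.5,0.5) * R(a) * T(-0.5,-0.5).
        float m[3][3] = {
            { c, -s, 0.5f - 0.5f * c + 0.5f * s },
            { s,  c, 0.5f - 0.5f * s - 0.5f * c },
            { 0,  0, 1 },
        };
        return { m[0][0]*t.u + m[0][1]*t.v + m[0][2],
                 m[1][0]*t.u + m[1][1]*t.v + m[1][2] };
    }

    int main() {
        float a = 3.14159265f / 2.0f;  // 90 degrees
        UV p  = { 1.0f, 0.5f };
        UV r2 = rotate2x2(p, a);       // (0.5, 1.0)
        UV r3 = rotate3x3(p, a);       // same
        std::printf("2x2: %.2f %.2f  3x3: %.2f %.2f\n", r2.u, r2.v, r3.u, r3.v);
        return 0;
    }
    ```

    So there is no correctness pitfall with the 2x2; the 3x3 just saves the two vector adds per coordinate.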
  7. Nice article, I will follow your advice.
  8. ITboy_Lemon

    How to rotate the texture coordinate in the HLSL

    I get your idea. Here is my second Flare.fx:

    //-------------------------------------------------------------------------------------
    // declaration : Copyright (c) 2013, XJ. All rights reserved.
    // brief       : This effect file makes a flare effect.
    // author      : XJ
    // file        : Flare.fx
    // date        : 2013/10/27
    //-------------------------------------------------------------------------------------
    // global variables
    uniform extern float4x4 gWVP;
    uniform extern float    gRotationFlare;
    uniform extern float    gRotationBlend;
    uniform extern texture  gTexFlare;
    uniform extern texture  gTexBlend;

    // vertex shader output
    struct OutputVS
    {
        float4 posH     : POSITION0;
        float2 texFlare : TEXCOORD0;
        float2 texBlend : TEXCOORD1;
    };

    // samplers
    sampler BlendTex = sampler_state
    {
        Texture   = <gTexBlend>;
        MinFilter = LINEAR;
        MagFilter = LINEAR;
        MipFilter = LINEAR;
    };
    sampler FlareTex = sampler_state
    {
        Texture   = <gTexFlare>;
        MinFilter = LINEAR;
        MagFilter = LINEAR;
        MipFilter = LINEAR;
        AddressU  = CLAMP;
        AddressV  = CLAMP;
    };

    // vertex shader
    OutputVS FlareVS(float3 posL : POSITION0, float2 tex : TEXCOORD0)
    {
        OutputVS outVS = (OutputVS)0;

        // Transform the local position to homogeneous clip space.
        outVS.posH = mul(float4(posL, 1.0f), gWVP);

        outVS.texFlare = tex;
        outVS.texBlend = tex;
        outVS.texFlare.x -= 0.5;
        outVS.texBlend.x -= 0.5;
        outVS.texFlare.y = -outVS.texFlare.y;
        outVS.texFlare.y += 0.5f;
        outVS.texBlend.y = -outVS.texBlend.y;
        outVS.texBlend.y += 0.5f;

        outVS.texFlare.x = outVS.texFlare.x*cos(gRotationFlare) - outVS.texFlare.y*sin(gRotationFlare);
        outVS.texFlare.y = outVS.texFlare.x*sin(gRotationFlare) + outVS.texFlare.y*cos(gRotationFlare);
        outVS.texBlend.x = outVS.texBlend.x*cos(gRotationBlend) - outVS.texBlend.y*sin(gRotationBlend);
        outVS.texBlend.y = outVS.texBlend.x*sin(gRotationBlend) - outVS.texBlend.y*cos(gRotationBlend);

        outVS.texFlare.x += 0.5f;
        outVS.texBlend.x += 0.5f;
        outVS.texFlare.y -= 0.5f;
        outVS.texFlare.y = -outVS.texFlare.y;
        outVS.texBlend.y -= 0.5f;
        outVS.texBlend.y = -outVS.texBlend.y;

        return outVS;
    }

    // pixel shader
    float4 FlarePS(float2 texFlare : TEXCOORD0, float2 texBlend : TEXCOORD1) : COLOR
    {
        // Fetch the texels.
        float3 blend = tex2D(FlareTex, texFlare).rgb;
        float3 flare = tex2D(BlendTex, texBlend).rgb;

        // Calculate the final color.
        float3 color = blend * flare;
        return float4(color, 1.0f);
    }

    // technique
    technique FlareTech
    {
        pass P0
        {
            vertexShader = compile vs_2_0 FlareVS();
            pixelShader  = compile ps_2_0 FlarePS();
        }
    }

    But I get the same result. Here is the question from the demo:

    "In this chapter's demo directory there are two textures, as shown in the left and middle images of Figure 11.20. Use multi-texturing to combine them together to produce the image at the right in Figure 11.20. In addition, animate the flare by rotating it as a function of time (rotate the color and grayscale textures at different rates). Display the resultant texture on each face of a cube. (Hint: The center in texture coordinates is not the origin; it is (1/2, 1/2), and thus the rotation will be off because the rotation equations rotate about (0, 0). Therefore, you will need to first translate the texture coordinates so that the center is at the origin, apply the rotation transformation, and then translate back so that the center is back at (1/2, 1/2) for texturing.)"

    I followed the advice of the exercise, but I do not understand why I get the same result as before. Can you help me?
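    One pitfall worth checking in any rotation written out component by component (a general observation, not a confirmed diagnosis of the shader above): if x is overwritten before y is computed, the y equation reads the already-rotated x instead of the original one. Capturing the originals in temporaries avoids this. A small C++ sketch of the two patterns:

    ```cpp
    #include <cmath>
    #include <cstdio>

    struct Vec2 { float x, y; };

    // Buggy pattern: p.y is computed from the already-updated p.x.
    Vec2 rotateInPlace(Vec2 p, float a) {
        float c = std::cos(a), s = std::sin(a);
        p.x = p.x * c - p.y * s;
        p.y = p.x * s + p.y * c;  // p.x is already the rotated value here!
        return p;
    }

    // Correct pattern: read the originals, then write both components.
    Vec2 rotateWithTemps(Vec2 p, float a) {
        float c = std::cos(a), s = std::sin(a);
        float x0 = p.x, y0 = p.y;
        p.x = x0 * c - y0 * s;
        p.y = x0 * s + y0 * c;
        return p;
    }

    int main() {
        float a = 3.14159265f / 2.0f;  // 90 degrees
        Vec2 good = rotateWithTemps({1.0f, 0.0f}, a);  // (0, 1) as expected
        Vec2 bad  = rotateInPlace({1.0f, 0.0f}, a);    // collapses to ~(0, 0)
        std::printf("good: %.2f %.2f  bad: %.2f %.2f\n", good.x, good.y, bad.x, bad.y);
        return 0;
    }
    ```

    The in-place version isn't a rotation at all, which is one way code that "follows the hint" can still come out looking unchanged or distorted.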
  9. These days I have been reading "Introduction to 3D Game Programming with DirectX 9.0c: A Shader Approach". While doing the exercises in Chapter 11 (Texturing), I came across a problem. I want to rotate the texture coordinates to animate the texture in the Flare.fx file. Here is my code:

     //-------------------------------------------------------------------------------------
     // declaration : Copyright (c) 2013, XJ. All rights reserved.
     // brief       : This effect file makes a flare effect.
     // author      : XJ
     // file        : Flare.fx
     // date        : 2013/10/27
     //-------------------------------------------------------------------------------------
     // global variables
     uniform extern float4x4 gWVP;
     uniform extern float    gRotationFlare;
     uniform extern float    gRotationBlend;
     uniform extern texture  gTexFlare;
     uniform extern texture  gTexBlend;

     // vertex shader output
     struct OutputVS
     {
         float4 posH          : POSITION0;
         float2 tex           : TEXCOORD0;
         float4 rotationFlare : COLOR0;
         float4 rotationBlend : COLOR1;
     };

     // samplers
     sampler BlendTex = sampler_state
     {
         Texture   = <gTexBlend>;
         MinFilter = LINEAR;
         MagFilter = LINEAR;
         MipFilter = LINEAR;
     };
     sampler FlareTex = sampler_state
     {
         Texture   = <gTexFlare>;
         MinFilter = LINEAR;
         MagFilter = LINEAR;
         MipFilter = LINEAR;
     };

     // vertex shader
     OutputVS FlareVS(float3 posL : POSITION0, float2 tex : TEXCOORD0)
     {
         OutputVS outVS = (OutputVS)0;

         // Transform the local position to homogeneous clip space.
         outVS.posH = mul(float4(posL, 1.0f), gWVP);
         outVS.tex  = tex;

         // Compute the 2x2 rotation matrices and pass them as COLOR interpolants.
         float cosf = cos(gRotationFlare);
         float sinf = sin(gRotationFlare);
         outVS.rotationFlare = float4(cosf, -sinf, sinf, cosf);
         cosf = cos(gRotationBlend);
         sinf = sin(gRotationBlend);
         outVS.rotationBlend = float4(cosf, -sinf, sinf, cosf);

         return outVS;
     }

     // pixel shader
     float4 FlarePS(float2 tex : TEXCOORD0, float4 rotationFlare : COLOR0, float4 rotationBlend : COLOR1) : COLOR
     {
         float2 flare_tex = tex;
         float2 blend_tex = tex;

         // Translate the texture coordinates.
         flare_tex -= 0.5f;
         blend_tex -= 0.5f;

         // Rotate the texture coordinates.
         flare_tex = mul(flare_tex, float2x2(rotationFlare));
         blend_tex = mul(blend_tex, float2x2(rotationBlend));

         // Translate the texture coordinates back.
         flare_tex += 0.5f;
         blend_tex += 0.5f;

         // Fetch the texel colors, using black outside [0, 1].
         float3 flare = (float3)0;
         if (flare_tex.x < 0 || flare_tex.x > 1 || flare.y < 0 || flare.y > 1)
             flare = float3(0.0f, 0.0f, 0.0f);
         else
             flare = tex2D(FlareTex, flare_tex).rgb;

         float3 blend = (float3)0;
         if (blend_tex.x < 0 || blend_tex.x > 1 || blend_tex.y < 0 || blend_tex.y > 1)
             blend = float3(0.0f, 0.0f, 0.0f);
         else
             blend = tex2D(BlendTex, blend_tex).rgb;

         return float4(flare * blend, 1.0f);
     }

     // technique
     technique FlareTech
     {
         pass P0
         {
             vertexShader = compile vs_2_0 FlareVS();
             pixelShader  = compile ps_2_0 FlarePS();
         }
     }

     This file does the multi-texturing and rotates the texture coordinates on the cube object. Here is the problem: sometimes the texture is stretched instead of rotated, like this: [attachment=18574:QQ??20131028144534.jpg]

     But the effect I actually want is this: [attachment=18575:QQ??20131028142939.jpg], just the rotated multi-textured texture, not a stretched one.

     So, is there anybody who can save me?
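     A possible cause of the stretching (my assumption, not confirmed in the thread): in D3D9, values passed through COLOR interpolators are commonly clamped to [0, 1] and stored at low precision, so the negative -sin entry of the rotation matrix arrives in the pixel shader as 0 and the "rotation" degenerates into a shear/scale. TEXCOORD interpolants are not clamped, so passing the matrix via TEXCOORD (or rotating in the vertex shader) sidesteps this. A quick C++ sketch of what [0, 1] clamping does to the matrix:

     ```cpp
     #include <cmath>
     #include <cstdio>
     #include <algorithm>

     // Sketch (assumption, not a confirmed diagnosis): if a rotation matrix
     // (c, -s, s, c) is sent through an interpolator clamped to [0, 1], every
     // negative entry arrives as 0, so the matrix is no longer a rotation.
     static float clamp01(float v) { return std::min(1.0f, std::max(0.0f, v)); }

     int main() {
         float a = 3.14159265f / 4.0f;  // 45 degrees
         float c = std::cos(a), s = std::sin(a);

         // What the pixel shader should receive...
         float intended[4] = { c, -s, s, c };
         // ...vs. what survives a [0, 1]-clamped COLOR interpolator.
         float received[4];
         for (int i = 0; i < 4; ++i) received[i] = clamp01(intended[i]);

         std::printf("intended: %5.2f %5.2f %5.2f %5.2f\n",
                     intended[0], intended[1], intended[2], intended[3]);
         std::printf("received: %5.2f %5.2f %5.2f %5.2f\n",
                     received[0], received[1], received[2], received[3]);
         // received is (0.71, 0.00, 0.71, 0.71): a shear, not a rotation.
         return 0;
     }
     ```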
  10. ITboy_Lemon

    DirectX about the matrix

    Thank you for your help. I have already solved this problem.
  11. ITboy_Lemon

    DirectX about the matrix

    Hello everyone. I am new here. I have a problem understanding the view matrix.

    I draw a polyline with DrawPrimitive(D3DPT_LINESTRIP, ...) in the x-y plane, and I put the eye at (0.0f, 0.0f, -10.0f), but I cannot see the image. When I do not set the view matrix and projection matrix, the image is displayed. I do not know why.

    This is my vertex buffer:

    void Exercise::buildVB()
    {
        // get the Direct3D device
        IDirect3DDevice9* _pDevice = m_pD3d->getDevice();

        // create the vertex buffer
        HR(_pDevice->CreateVertexBuffer(6 * sizeof(VertexPos), D3DUSAGE_WRITEONLY, 0,
            D3DPOOL_MANAGED, &m_pVB, 0));

        // lock the vertex buffer
        VertexPos* _pData = NULL;
        HR(m_pVB->Lock(0, 0, (void**)&_pData, 0));

        // set the vertices
        /*_pData[0] = VertexPos(0.0f, 0.0f, 0.0f);
        _pData[1] = VertexPos(1.0f, 0.0f, -3.0f);
        _pData[2] = VertexPos(3.0f, 0.0f, -2.0f);
        _pData[3] = VertexPos(2.0f, 0.0f, -1.0f);
        _pData[4] = VertexPos(2.5f, 0.0f, 1.0f);
        _pData[5] = VertexPos(1.0f, 0.0f, 3.0f);*/
        _pData[0] = VertexPos(0.0f, 0.0f, 0.0f);
        _pData[1] = VertexPos(0.1f, -0.3f, 0.0f);
        _pData[2] = VertexPos(0.3f, -0.2f, 0.0f);
        _pData[3] = VertexPos(0.2f, -0.1f, 0.0f);
        _pData[4] = VertexPos(0.25f, 0.1f, 0.0f);
        _pData[5] = VertexPos(0.1f, 0.3f, 0.0f);

        // unlock the vertex buffer
        HR(m_pVB->Unlock());
    }

    This is the draw method:

    void Exercise::draw()
    {
        // get the Direct3D device
        IDirect3DDevice9* _pDevice = m_pD3d->getDevice();

        // set the stream source
        HR(_pDevice->SetStreamSource(0, m_pVB, 0, sizeof(VertexPos)));

        // set the index buffer
        //HR(_pDevice->SetIndices(m_pIB));

        // set the vertex declaration
        HR(_pDevice->SetVertexDeclaration(VertexPos::_vertexDecl));

        // set the world matrix
        D3DXMATRIX _worldM;
        HR(_pDevice->SetTransform(D3DTS_WORLD, D3DXMatrixIdentity(&_worldM)));

        // set the view matrix
        HR(_pDevice->SetTransform(D3DTS_VIEW, &m_ViewM));

        // set the projection matrix
        HR(_pDevice->SetTransform(D3DTS_PROJECTION, &m_ProjM));

        // set the render state
        HR(_pDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME));

        // draw the primitive
        HR(_pDevice->DrawPrimitive(D3DPT_LINESTRIP, 0, 6));
    }

    Here is my code for the view and projection matrices:

    void Exercise::buildViewM()
    {
        D3DXVECTOR3 _pos(0.0f, 0.0f, -10.0f);
        D3DXVECTOR3 _target(0.0f, 0.0f, 0.0f);
        D3DXVECTOR3 _up(0.0f, 1.0f, 0.0f);
        D3DXMatrixLookAtLH(&m_ViewM, &_pos, &_target, &_up);
    }

    void Exercise::buildProjM()
    {
        float _width  = m_pD3d->getD3dParam().BackBufferWidth;
        float _height = m_pD3d->getD3dParam().BackBufferHeight;
        D3DXMatrixPerspectiveFovLH(&m_ProjM, D3DX_PI, _width / _height, 1.0f, 1000.0f);
    }

    Is there any problem with my code??? I appreciate your help.
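    A likely culprit worth checking (my reading of the code above, not a confirmed answer from the thread): D3DXMatrixPerspectiveFovLH takes the full vertical field of view, and buildProjM passes D3DX_PI, i.e. 180 degrees. The projection's y scale is cot(fovY/2), and cot(pi/2) = 0, so the projection collapses everything; a typical value like D3DX_PI/4 behaves normally. A small C++ check of that scale factor:

    ```cpp
    #include <cmath>
    #include <cstdio>

    // The y-scale term that D3DXMatrixPerspectiveFovLH places on the
    // projection diagonal: yScale = cot(fovY / 2) = 1 / tan(fovY / 2).
    // With fovY = pi (180 degrees) this is ~0, so all geometry collapses;
    // a typical fovY like pi/4 gives a sensible scale.
    static float yScale(float fovY) { return 1.0f / std::tan(fovY * 0.5f); }

    int main() {
        const float pi = 3.14159265358979f;
        std::printf("fovY = pi:   yScale = %g\n", yScale(pi));      // ~0
        std::printf("fovY = pi/4: yScale = %g\n", yScale(pi / 4));  // ~2.414
        return 0;
    }
    ```

    If that is the issue, changing the fov argument to something like D3DX_PI/4 should make the polyline visible with the view and projection matrices set.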
  12. ITboy_Lemon

    Managing Decoupling

    This is awesome!!! It is my first time commenting on GameDev, just because of this article. The four points are very useful. I will read every article in this series. Thanks for sharing. Adrian