parroteye

Member
  • Content Count: 19
  • Joined
  • Last visited

Community Reputation: 220 Neutral

About parroteye
  • Rank: Member

Personal Information
  • Interests: Programming

  1. Hi! In a dispatch with multiple groups, suppose the following: Thread Group A: Thread 1 in the group reads, modifies and then writes UAV[x,y] (note: no other thread in group A touches UAV[x,y]). Thread Group B: Thread 28 in the group also ends up reading, modifying and then writing UAV[x,y] (note: no other thread in group B touches UAV[x,y]). What happens? Do I get race conditions between groups? If so, what would be an efficient strategy to avoid this? I assume they do; in that case, do atomic operations (i.e. Interlocked etc.) work across multiple groups on UAVs? Also, what performance penalty might I incur?
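Yes, that is a race: a plain read-modify-write on the same UAV address from two threads is unsynchronized whether they are in the same group or not. The Interlocked* intrinsics are atomic across the entire dispatch (all groups), not just within one group, so they are the usual fix; the cost is that contended atomics on the same address serialize, which only matters if collisions are frequent. A minimal sketch, assuming a uint UAV (Interlocked ops on UAVs require 32-bit integer formats):

```hlsl
// Sketch: dispatch-wide atomic increment of a shared counter texture.
RWTexture2D<uint> Accum : register(u0);

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    // Unsafe: Accum[id.xy] = Accum[id.xy] + 1;  // races with ANY other thread,
    //                                           // in this group or any other.

    // Safe: the add is performed atomically by the hardware, across all groups.
    uint original;
    InterlockedAdd(Accum[id.xy], 1, original);
}
```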
  2. I already have a baseline renderer; I just wanted to learn the "CS way of thinking" while implementing something. I mean, I guess I know how to map an existing pixel shader to a compute shader... I think... but I would really like to get into the right mindset to leverage compute. Yes, I mean the Microsoft one; I was googling for samples and ended up on their GitHub page. It looks great, but a bit too advanced for me at the moment. As I wrote above, I think I know how to translate a pixel shader into compute; what I don't know is how to take advantage of compute and stop thinking "1 thread == 1 pixel". I keep reading about how compute allows for new and interesting things, and that's what I would like to start looking at. I guess with a blur I could try sharing samples? Do you have any resources that explain in a noob-friendly way (this is my first real approach to graphics programming) how compute helps with that? I have read posts mentioning access patterns and the like; is there something you suggest reading about this? Any basic guidelines? Thanks! Jacques
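The blur-with-shared-samples idea is exactly the classic first step past "1 thread == 1 pixel": the group cooperatively loads a tile of the image into groupshared memory once, synchronizes, and then every thread takes its filter taps from that fast on-chip cache instead of issuing redundant texture reads. A sketch of a horizontal pass (group size and radius are arbitrary choices of mine):

```hlsl
// Sketch: sample sharing via groupshared memory for a horizontal box blur.
#define GROUP  64
#define RADIUS 4

Texture2D<float4>   Input  : register(t0);
RWTexture2D<float4> Output : register(u0);

// One tile of pixels plus an apron of RADIUS pixels on each side.
groupshared float4 gCache[GROUP + 2 * RADIUS];

[numthreads(GROUP, 1, 1)]
void HorizontalBlurCS(uint3 gtid : SV_GroupThreadID, uint3 dtid : SV_DispatchThreadID)
{
    // Each thread loads its own pixel; the first RADIUS threads also load the aprons.
    gCache[gtid.x + RADIUS] = Input[dtid.xy];
    if (gtid.x < RADIUS)
    {
        gCache[gtid.x] = Input[uint2(max((int)dtid.x - RADIUS, 0), dtid.y)];
        // Out-of-range reads on a Texture2D return 0 in D3D, so no clamp needed here.
        gCache[gtid.x + RADIUS + GROUP] = Input[uint2(dtid.x + GROUP, dtid.y)];
    }
    GroupMemoryBarrierWithGroupSync(); // wait until the whole tile is cached

    // Every tap is now a cheap groupshared read instead of a texture fetch.
    float4 sum = 0;
    for (int i = -RADIUS; i <= RADIUS; ++i)
        sum += gCache[gtid.x + RADIUS + i];
    Output[dtid.xy] = sum / (2 * RADIUS + 1);
}
```

The pixel-shader version of this blur would fetch each texel up to (2 * RADIUS + 1) times across neighboring pixels; here each texel is fetched from memory roughly once per tile.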
  3. Hi everyone! I am at a point where I know what a compute shader is, but still can't get my head completely around it. I feel like I need to get my hands dirty, but if I look at stuff like the MiniEngine my brain hurts a bit... do you have a path of useful things to implement in compute shaders that will gradually give me a better understanding and a more solid grasp of CS? Thank you!! Jacques
  4. Hi! This is probably a stupid question, but I am trying to read back a staging texture on the CPU. The format is 16-bit float; however, to my limited knowledge there is no standard 16-bit float in C++, so how can I read the values in a meaningful way? The mapping procedure works; I tried with 32-bit float and I read those fine. Thank you very much! Jacques
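Those 16-bit values are IEEE 754 half-precision floats, so they can be widened to float on the CPU either with DirectXMath's XMConvertHalfToFloat or by hand. A minimal sketch of the manual conversion:

```cpp
#include <cstdint>
#include <cmath>

// Sketch: IEEE 754 half (1 sign, 5 exponent, 10 mantissa bits) -> float.
float HalfToFloat(uint16_t h)
{
    uint32_t sign = (h >> 15) & 0x1;
    uint32_t exp  = (h >> 10) & 0x1F;
    uint32_t man  =  h        & 0x3FF;

    float result;
    if (exp == 0)        // zero / subnormal: man * 2^-24
        result = std::ldexp((float)man, -24);
    else if (exp == 31)  // infinity / NaN
        result = man ? NAN : INFINITY;
    else                 // normal: (1 + man/1024) * 2^(exp-15), bias 15
        result = std::ldexp((float)(man | 0x400), (int)exp - 25);
    return sign ? -result : result;
}
```

When reading the mapped staging texture, step through each row by the mapped RowPitch in bytes, reinterpret the row as uint16_t, and convert each channel.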
  5. Hey all! I am starting to implement dynamic cubemaps in my renderer. I get the top-level mip by rendering 6 faces into 6 separate RTs, and then merge them into a cubemap. All of this is fine. I now want to filter the mips with a BRDF convolution a la UE4, but I can't find how to render to a specific mip of the cubemap. Do I need to render to different cubemaps and then merge them into a mip chain? How can I do this? I am not finding info online. To be clear, I am not asking about the convolution technique per se, just how to render to separate mips in D3D, or how to merge separate targets into a mip chain. Thank you very much, Jacques
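For what it's worth, in D3D11 this doesn't require separate cubemaps: a render target view can address a single (face, mip) pair of one cube texture through the Texture2DArray fields. A sketch, with variable names of my own, assuming the cube texture was created with a full mip chain, ArraySize = 6, D3D11_RESOURCE_MISC_TEXTURECUBE and D3D11_BIND_RENDER_TARGET:

```cpp
// Sketch: one RTV per (face, mip) of a single cubemap texture.
D3D11_RENDER_TARGET_VIEW_DESC rtv = {};
rtv.Format        = DXGI_FORMAT_R16G16B16A16_FLOAT;       // match the texture
rtv.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
rtv.Texture2DArray.MipSlice        = mip;   // which mip level to render into
rtv.Texture2DArray.FirstArraySlice = face;  // which cube face (0..5)
rtv.Texture2DArray.ArraySize       = 1;
device->CreateRenderTargetView(cubeTex, &rtv, &faceMipRTV[face][mip]);
```

Bind that RTV, set the viewport to the mip's dimensions, and draw the convolution pass while sampling mip 0 of the same cubemap through an SRV restricted to that mip (so the resource isn't bound for read and write at the same subresource).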
  6. It was indeed that! After posting I googled a bit, only to find that comp is not the number of components of the output data, like I thought it was!
  7. Hi! Up to now I have only been using DDS; I am now trying to load some JPG and PNG textures to test something, but I have had mixed luck with stb. All I do to load is: stbi_load(path, &w, &h, &c, STBI_rgb_alpha); Some textures load fine, while others load with a strange stripey pattern that is not visible in the source image. I tried STBI_rgb for the ones that loaded strangely, but the result is even more wrong in most cases, as I would actually expect. What am I doing wrong? The texture resource is created in all cases with DXGI_FORMAT_R8G8B8A8_UNORM. Thanks, Jacques
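A stripey or diagonally skewed image after upload is the classic symptom of a row-pitch mismatch: stbi_load returns tightly packed rows (w * 4 bytes with STBI_rgb_alpha), while the GPU-side destination usually pads each row, so copying the whole buffer with one memcpy shifts every row slightly. It would explain why only some textures break, since textures whose width happens to match the padded pitch look fine. A sketch of a pitch-aware copy (names are mine):

```cpp
#include <cstring>
#include <cstdint>
#include <cstddef>

// Sketch: copy tightly packed RGBA8 pixels (stbi_load output) into a
// destination whose rows are padded to dstPitch bytes, e.g. the RowPitch
// of a mapped D3D11 texture. One memcpy per row, never one big memcpy.
void CopyPixelsPitched(uint8_t* dst, size_t dstPitch,
                       const uint8_t* src, int w, int h)
{
    const size_t srcPitch = (size_t)w * 4; // RGBA8, tightly packed
    for (int y = 0; y < h; ++y)
        std::memcpy(dst + y * dstPitch, src + y * srcPitch, srcPitch);
}
```

Equivalently, if you create the texture with initial data, set D3D11_SUBRESOURCE_DATA::SysMemPitch = w * 4 explicitly rather than leaving it uninitialized.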
  8. Ha, still that flipping causing me problems. For debugging I output the sampled result of envBRDF and used it as the sphere colour, and immediately noticed that the green bit (which should be at grazing angles) was at the center of the sphere. With the last-mip fix it does look better than before; it still feels a bit too shiny, but it is clearly more plausible. Thank you all!
  9. The top one is the non-metal one :\

     "The light source is the surrounding environment - your current reflections are the specular component of that lighting. You also need the diffuse reflections from the environment. As a super quick hack, sample the bottom mip level (1x1 pixel) of your cubemap in the normal direction and use that as the diffuse component. This is wrong, but closer than nothing. In the long run, you can use these same techniques to properly pre-convolve the environment against your diffuse BRDF. Or you can store those results in SH, etc., instead of another cubemap."

     I tried that, and it still looks pretty shiny :( ... I wrongly generated only up to the 16x16 mip, but still. [Top one is metalness 0, bottom one is metalness 1]
  10. HA! Never mind: it looks like the envBRDF generated by the code in the course notes has the Y axis flipped compared to the one shown, and I had forgotten that I added a sign flip to make it match. It now looks more like expected (with a stronger response at grazing angles and overall less energy from reflections on non-metals). However, it's still quite strong... maybe now it's because of the balancing MJP mentioned. So this is now metal == 1 vs metal == 0. Does this look plausibly alright to you? (Again, no analytical lights here.) To me it still looks incredibly metallic :( What should I balance here? There is no other source of light but the reflections.
  11. Although, this to me looks kinda plausible for an NdotV (sorry if it doesn't... I am just starting in the field :P). This is the sphere with NdotV, and this is what I get with IBL only (no analytical lights) at metalness 0.
  12. Thanks for the reply and suggestions; I will definitely make sure the diffuse matches the specular bit :) I will try to check precise values, but certainly, especially on the back of the object (w.r.t. the light), the intensity of the reflections is way higher than 4%, and given the code I posted above this is to be expected, right? I mean, the specular colour/F0 scales only the first channel of the envBRDF texture, and the Y channel is only scaled by the result from the cubemap, regardless of the material's metalness. In that case, even if non-metal meant 0, I would always get something, since metalness is not taken into account when generating the envBRDF texture. I wonder if the above is all fine and the Y channel is supposed to be very small except at grazing angles, in which case I might have some bugs in computing those angles (and therefore pick up high values when I shouldn't). That sounds plausible, right? I am not sure how, but I wonder if I have some errors in computing the view vector. To validate that I was not doing something odd, I literally copy-pasted the code that generates the envBRDF texture from the UE presentation. As for how I apply the reflections, I do it as a post-process pass, blending on top of the already lit scene, using the formula I posted above. Thanks
  13. Hey all! I am looking into making my reflections ok, so I started looking at the UE4 solution for pre-convolved reflections. I think I get the gist of it and I will spend more time on the original paper, but I have a doubt... Looking at the code, this line confuses me, in particular the red part (from their paper):

      float3 PrefilteredColor = PrefilterEnvMap( Roughness, R );
      float2 EnvBRDF = IntegrateBRDF( Roughness, NoV );
      return PrefilteredColor * ( SpecularColor * EnvBRDF.x + EnvBRDF.y );

      So EnvBRDF is computed regardless of the receiving material's metalness, which I guess is fine. The problem I have is that since the red part (EnvBRDF.y) is not scaled by the metalness/specular colour, I always get reflections (quite strong, based on my first results) even on a material with 0 metalness... is that correct? If I look at the back of a non-metal object, where there is no light from the analytical source, it looks like metal... can somebody shed some light on this for me? Thank you, Jacques
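For intuition, the combine can be reduced to scalars. EnvBRDF.y is the F0-independent part of the integrated Fresnel response, so some specular survives even at metalness 0; the point is that away from grazing angles it should only be a few percent, while near grazing both channels push the total toward 1 for any F0. A scalar sketch with made-up (but plausible head-on, low-roughness) lookup values:

```cpp
// Scalar sketch of the UE4 split-sum combine, to see how much specular a
// non-metal keeps. Env holds the two channels of the EnvBRDF lookup texture.
struct Env { float x, y; };

float SpecularIBL(float prefilteredColor, float specularColor /* F0 */, Env envBRDF)
{
    // Same shape as: PrefilteredColor * (SpecularColor * EnvBRDF.x + EnvBRDF.y)
    return prefilteredColor * (specularColor * envBRDF.x + envBRDF.y);
}
```

With head-on values like (0.9, 0.02), a dielectric F0 of 0.04 gives roughly 5-6% of the prefiltered colour, which reads as a subtle sheen, not chrome. So if the back of a non-metal object looks mirror-like, the likely suspects are the NdotV used for the lookup or the lookup texture itself, not the formula.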
  14. Apply shadow map on scene

      Thank you guys! I was missing the final space transform to get the correct range!
  15. Hey all, I sorted out my matrices (previous topic), but I am having problems applying the shadows to the scene. I compute the shadow map as:

      float4 shadowpos = mul(float4(inputpos, 1.0), modelMatrix);
      shadowpos = mul(shadowpos, shadowVPMatrix);
      return shadowpos;

      and it renders a plausible shadow map. Now in my lighting pass I am doing:

      float shadow = 1.0;
      float4 lightspacePos = mul(float4(worldpos, 1.0), shadowVPMatrix);
      float shadowSample = shadowmap.Sample(sampler, lightspacePos.xy).r;
      if (compareforshadow(lightspacePos.z, shadowSample))
      {
          shadow = 0.0;
      }

      The matrix shadowVPMatrix is exactly the same as in the shadow map pass according to Nsight, and worldpos is reconstructed correctly (I validated it against a world position stored in the gbuffer). I also tried dividing lightspacePos.xyz by its w, but it's still wrong. What is wrong? How can I go about investigating this? Again, I am very sorry if this is a stupid question; these are really my first steps in this field :( Thank you!! Jacques
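The "final space transform" mentioned in the post above, sketched as plain scalar code: after the shadowVPMatrix multiply the position is in clip space, so before it can be used as a texture coordinate it needs the perspective divide and the [-1,1] NDC to [0,1] UV remap, with Y flipped because D3D texture coordinates grow downward while NDC Y grows upward:

```cpp
// Sketch: clip-space position -> shadow-map UV (D3D conventions).
struct Float2 { float x, y; };

Float2 ClipToShadowUV(float clipX, float clipY, float clipW)
{
    float ndcX = clipX / clipW;             // perspective divide
    float ndcY = clipY / clipW;
    return { ndcX * 0.5f + 0.5f,            // [-1,1] -> [0,1]
             -ndcY * 0.5f + 0.5f };         // flip Y for texture space
}
```

The depth comparison then uses lightspacePos.z / lightspacePos.w against the stored depth (for an orthographic shadow projection the divide is a no-op, since w is 1).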