
Anden

Everything posted by Anden

  1. I'm in the process of moving my old Direct3D 11 graphics code to Vulkan, and figured I might as well get gamma correctness right from the beginning, but it's not quite right: the resulting image is too bright. What's being done as of now is simply to render objects with an albedo map to a G-buffer and then blit that to the swapchain. Many things can influence the final result, and since that result contradicts my intuition, there must be something I don't understand; I hope somebody can verify what is right and explain what is wrong.

     Previously (which actually looked fine) I was using the first surface format and color space reported when creating the swapchain, which in my case meant a UNORM format. For lack of other options, Vulkan seems to force the color space to be sRGB, and since the specification indicates that the color space defines how the presentation engine interprets the data, I figured I might as well use an sRGB format to be correct, because I don't do any manual conversions.

     This change made my image too bright, so my thought was that maybe the textures had been loaded wrong the whole time. For DDS textures the library provides the format, for which I use the Vulkan equivalent. For PNG textures I have to specify the format myself, which is RGBA8_UNORM. I was surprised that the textures were not sRGB, but I double-checked by opening the images in Visual Studio and RenderDoc.

     My G-buffer is RGBA8_UNORM.

     All in all, I render linear textures to a linear G-buffer, followed by blitting to an sRGB swapchain with an sRGB color space. If I understand correctly, the hardware should do sRGB conversions when needed, most notably when blitting the G-buffer. To me, this seems like it should work. The funny thing is that changing the G-buffer to sRGB produces the same result.

     What about shaders? If I sample an sRGB texture (which I don't right now), will the data be linear in the shader? If I write from the fragment shader, will there be any conversion depending on the render target format, or does it just write the data as-is, assuming that I have correctly converted it myself?
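     For reference, the swapchain selection I'm describing boils down to something like this (a minimal sketch, not my actual code; the helper name and fallback policy are made up):

```cpp
#include <vector>
#include <vulkan/vulkan.h>

// Prefer an sRGB format/color-space pair so that fixed-function hardware
// performs the linear -> sRGB encode on write; with a UNORM format the
// bits are presented as-is and the shader would have to encode manually.
VkSurfaceFormatKHR pickSurfaceFormat(VkPhysicalDevice gpu, VkSurfaceKHR surface) {
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, nullptr);
    std::vector<VkSurfaceFormatKHR> formats(count);
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, formats.data());

    for (const VkSurfaceFormatKHR& f : formats) {
        if (f.format == VK_FORMAT_B8G8R8A8_SRGB &&
            f.colorSpace == VK_COLOR_SPACE_SRGB_NONLINEAR_KHR) {
            return f; // hardware does the sRGB encode on write
        }
    }
    return formats[0]; // fall back to the first advertised pair (the old behavior)
}
```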
  2. Nice to see that my assumptions seem to be correct. The question was probably a bit ambiguous; the Direct3D version was likely correct, and "previously" just refers to what I went with when getting Vulkan up and running, not to the Direct3D version. I'm aware of what sRGB implies; I just wanted to know the particular rules for conversions in Vulkan.

     I have also tested performing a fullscreen pass that simply outputs a gradient, and comparing it to reference pictures of correct gamma indicates that something is indeed wrong. At the end of the day I'm none the wiser and have probably overlooked something. I will continue searching for the issue and check back here in case somebody has chipped in with more suggestions, or perhaps sources for the conversion rules (in the spec I only found conversion rules regarding blitting, but nothing concerning shaders).

     "Why would you expect it to be sRGB when you are not uploading the texture as such? Is RGBA8_UNORM an sRGB format?"

     The format in use is just a remnant from when I was getting things up and running. I expected the texture to be sRGB because many images are stored with that encoding, but was surprised that it was not, and could conclude that the texture was not the problem. I included that statement in the question for completeness, as to what I considered when trying to figure out what might be erroneous.
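     For anyone running the same gradient test: a small standalone sketch of the standard sRGB transfer functions (the piecewise curve from the specification) to compare against. For example, linear 0.5 should encode to roughly 0.735:

```cpp
#include <cmath>

// Encode: linear -> sRGB, input in [0,1].
float linearToSrgb(float x) {
    return (x <= 0.0031308f) ? 12.92f * x
                             : 1.055f * std::pow(x, 1.0f / 2.4f) - 0.055f;
}

// Decode: sRGB -> linear, input in [0,1].
float srgbToLinear(float s) {
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}
```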
  3. I recently added a sky to my application, and while testing I simply wrote red pixels wherever the sky shader was invoked. When doing this I noticed that a few pixels inside solid geometry turned red, and they vary depending on the camera angle. The way I render the sky is to draw a fullscreen triangle at the far plane and use the depth buffer for early-z rejection, only shading fragments where the far plane is visible (i.e. where no geometry has been rendered). This led me to believe that the depth buffer still contained its clear value at those texels.

     It turns out that the G-buffer has holes where no fragment has been generated, and these holes seem to appear where two triangle edges meet. This is the final image. It's quite dark, but you're looking at a wall with two windows and a slanted roof above it (see the image below for a clearer view of where the roof begins). To the right is a pillar. Notice the few pixels in a row where the wall meets the roof between the two windows (you might have to view the full images to see it clearly). [attachment=29004:finalimage.png]

     Here are the normals of the G-buffer. Notice that the same erroneous texels are grey, which is the clear value (I clear to 0.5 because in [0,1] that corresponds to the zero vector in [-1,1]). [attachment=29005:gbuffernormals.png]

     I understand that a fragment is generated by the rasterizer if the center of the pixel is contained in the triangle, so it seems that the center is inside neither of the triangles. Could this happen if, due to precision problems, the triangle edges are very close to the center of the pixel but not quite overlapping it? Note that several pixels in a row is a very rare case; usually it's a few single pixels scattered all over the image at places where triangle edges meet.

     Has anyone else stumbled upon similar issues? How could these holes be fixed? If I have a dark indoor environment and a bright sky, it's a very annoying artifact when a few bright pixels appear and disappear at random locations as the camera moves.
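     To illustrate the pixel-center rule I'm referring to, here's a toy coverage test along the lines of what the rasterizer computes (a simplified sketch; real hardware additionally applies a top-left fill rule so a center lying exactly on a shared edge is claimed by exactly one triangle, whereas this >= test would count it for both):

```cpp
#include <cstdio>

struct Vec2 { float x, y; };

// Signed area term: which side of edge a->b the point p lies on.
static float edgeFn(Vec2 a, Vec2 b, Vec2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// True if pixel center p is inside (or on the boundary of) CCW triangle abc.
static bool covers(Vec2 a, Vec2 b, Vec2 c, Vec2 p) {
    return edgeFn(a, b, p) >= 0.0f &&
           edgeFn(b, c, p) >= 0.0f &&
           edgeFn(c, a, p) >= 0.0f;
}

int main() {
    // Two triangles sharing the exact edge (0,0)-(4,4): a center strictly
    // off that edge lands in exactly one of them, so no hole can appear.
    // If the shared edge's endpoints differ even slightly between the two
    // triangles, a center can fall outside both -- which is the hole.
    Vec2 a{0, 0}, b{4, 0}, c{4, 4}, d{0, 4};
    Vec2 center{2.5f, 1.5f};
    std::printf("tri abc: %d, tri acd: %d\n",
                covers(a, b, c, center), covers(a, c, d, center));
    return 0;
}
```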
  4. I'm not quite sure how to verify that. The mesh comes from an OBJ file, so the positions ought to be defined once and then indexed. Since I can't make an index buffer for every attribute, I have to duplicate some data, but that's all I do. There's no modification of the positions; they are just copied.

     Another thing I observed is that the holes don't seem to appear on the edge between the two adjacent triangles within a quad, but rather on the edge of the quad itself. Not sure if that helps, though.

     I tried scaling up another mesh to see if the same issues appeared there (because it seems there's always a somewhat large triangle involved), but I couldn't see anything similar. An online viewer of the mesh showed the very same kind of holes that I see in my application, so I'm really starting to think it's the mesh itself that's faulty. If the mesh is bad, then it could very well be that positions are duplicated in the file and not quite equal. I'll have to look into this in more detail.

     Edit: I started looking inside my OBJ file and found that about half of the vertex positions were duplicated, but I couldn't find any vertices that were the same apart from small rounding differences. Nevertheless, I found it odd that positions were duplicated at all (after all, the OBJ file uses indexing, so there really is no reason for any attribute to be duplicated in the file), so I began googling for things related to the Unreal editor and duplicated attributes. I found http://irrlicht.sourceforge.net/forum/viewtopic.php?f=2&t=31090 which describes the issue I'm having. At first I thought you meant that the holes would appear at the T-junction itself, but the last post of the linked topic made me understand that entire edges may go "out of sync", because the vertex at the T-junction may not lie exactly on the larger edge. It also explains why I don't see holes on the edges inside quads.

     [attachment=29021:finalimage2.png] [attachment=29022:finalimage2wireframe.png]

     In case somebody else stumbles upon this in the future, these images might help. Consider the wall between the two windows. The holes do not appear at the T-junction itself; rather, because the vertex at the T-junction might not lie exactly on the roof triangle's edge, the wall triangle's edge is not perfectly in sync with the roof triangle, causing holes.

     Now the question is: how would one prevent this? Is it perhaps due to bad triangulation in UnrealEd (which is what I used to create the mesh)?
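     If it had turned out to be near-duplicate positions rather than a real T-junction, one repair would be a post-load weld pass like the sketch below (hypothetical code; the function name and tolerance are made up, and it cannot fix a vertex that genuinely sits in the middle of a longer edge):

```cpp
#include <cmath>
#include <cstdint>
#include <map>
#include <tuple>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };

// Snap positions to a grid of size `cell` and merge everything that lands in
// the same cell, so triangles that should share an edge end up referencing
// bit-identical coordinates. Returns an old-index -> new-index remapping.
std::vector<uint32_t> weldPositions(std::vector<Vec3>& positions, float cell = 1e-4f) {
    std::map<std::tuple<long, long, long>, uint32_t> grid;
    std::vector<Vec3> unique;
    std::vector<uint32_t> remap(positions.size());

    for (size_t i = 0; i < positions.size(); ++i) {
        const Vec3& p = positions[i];
        auto key = std::make_tuple(std::lround(p.x / cell),
                                   std::lround(p.y / cell),
                                   std::lround(p.z / cell));
        auto it = grid.find(key);
        if (it == grid.end()) {
            uint32_t newIndex = static_cast<uint32_t>(unique.size());
            grid.emplace(key, newIndex);
            remap[i] = newIndex;
            unique.push_back(p);
        } else {
            remap[i] = it->second;
        }
    }
    positions = std::move(unique);
    return remap; // apply to the index buffer: index = remap[index]
}
```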
  5. I took a look at the mesh in wireframe mode, and while these pixels lie on a triangle edge, there's no vertex there.