bodigy

Member
  • Content Count

    12
  • Joined

  • Last visited

Community Reputation

124 Neutral

About bodigy

  • Rank
    Member

Personal Information

  • Interests
    Design
    DevOps
    Programming
  1. Great news!! I replaced SDL completely with GLFW, and now this problem is gone! Both my borderless fullscreen mode and my normal windowed mode work perfectly now. I honestly don't even want to know why it happened anymore - I'm just glad it's over! GLFW seems much more lightweight and better suited to my needs. Thank you all for the suggestions!
  2. I tried this - both `SDL_SetWindowGammaRamp` and `SDL_SetWindowBrightness`, with a test value of 2.0. The effect is exactly the opposite of what I needed/expected. If my game is running in windowed mode, the entire screen becomes brighter - everything on screen, including other windows, the taskbar, etc. But in fullscreen it stays the same (too dark), so it ends up even darker relative to everything else. `SDL_SetWindowGammaRamp` seems to have no effect in fullscreen.
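For reference, the ramp that `SDL_SetWindowGammaRamp` consumes is just a 256-entry lookup table per channel, and SDL's own `SDL_CalculateGammaRamp` builds one from a single gamma value. A minimal sketch of that computation (the function name and rounding details here are illustrative, not SDL's exact code):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Build a 256-entry gamma ramp of the shape SDL_SetWindowGammaRamp expects.
// gamma > 1.0 brightens mid-tones, gamma < 1.0 darkens them.
void calculateGammaRamp(float gamma, uint16_t ramp[256])
{
    for (int i = 0; i < 256; ++i)
    {
        // Normalize to [0,1], apply the inverse-gamma curve, expand to 16 bits.
        float v = std::pow(i / 255.0f, 1.0f / gamma);
        ramp[i] = (uint16_t)(v * 65535.0f + 0.5f);
    }
}
```

With gamma = 2.0 (the test value mentioned above), mid-tones are pushed well above the midpoint, which matches the "everything gets brighter" effect seen in windowed mode.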
  3. Never heard of PresentMon - great tool, thanks! I ran it, and "PresentMode" is indeed different when running fullscreen with no overlays: windowed, it says "Composed: Flip"; fullscreen, "Hardware Composed: Independent Flip". I'll look at the results in more detail when I get home, but I think we're getting closer to an answer! (Next question will be: why the brightness change, and how to fix it :)) Good advice! But it also leaves a lurking feeling that... you know... "I still have an annoying, unresolved issue - how can I build on buggy stuff like this?"
  4. Thank you, sounds interesting, I'll try it out when I get home! Right, but currently I also use SDL for input handling, so switching to GLFW will take some work. That's why I don't want to do it (for now) if I don't have to.
  5. Correct. If I find out that it's caused by SDL, I'll definitely look for an alternative. But until then I wouldn't want to waste time on that - I'd rather develop my game.

Sorry, I didn't think my pipeline details were relevant for this. See below.

Any window size that doesn't cover the entire screen works fine, borderless or not. This morning I tried removing every mention of sRGB from my code; the colors/brightness got messed up as expected, but the windowed vs. fullscreen inconsistency remained - in windowed it was too bright, and in fullscreen it "looked correct". Still inconsistent, so sRGB doesn't really seem to be the problem here. I suspect some weird Windows or NVIDIA feature, but I can't figure out what.

The setup is pretty basic. Say my screen is 4k and I render at 1080p. I have two 1080p FBOs, each with a single sRGB color attachment. I render (quad) sprites to the first one, then do a full-screen post-FX pass into the second one - just some chromatic aberration (I know, I know) and film grain, but I don't think that's relevant. Then I blit the 1080p buffer to the default framebuffer (4k). I don't need exclusive fullscreen mode, hence this solution - which, as far as I know, games from Valve use as well. In earlier versions I rendered directly to the default FBO instead of using glBlitFramebuffer, and I still had the same problem.

Apologies, I thought I provided enough about OS, GPU, etc. But you had some good questions already, so thank you!
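The final blit step described above boils down to choosing a destination rectangle for `glBlitFramebuffer`. A minimal sketch of a centered, aspect-preserving mapping from the render resolution to the window size (the `Rect` struct and function name are illustrative, not from the post):

```cpp
#include <algorithm>
#include <cassert>

struct Rect { int x0, y0, x1, y1; };

// Compute the destination rectangle for glBlitFramebuffer when scaling a
// fixed render resolution (e.g. 1920x1080) to the default framebuffer size
// (e.g. 3840x2160), preserving aspect ratio and centering the result.
Rect destBlitRect(int srcW, int srcH, int dstW, int dstH)
{
    // Scale factor that fits the source inside the destination.
    double scale = std::min((double)dstW / srcW, (double)dstH / srcH);
    int w = (int)(srcW * scale + 0.5);
    int h = (int)(srcH * scale + 0.5);
    int x = (dstW - w) / 2; // pillarbox offset
    int y = (dstH - h) / 2; // letterbox offset
    return Rect{ x, y, x + w, y + h };
}
```

For a 1080p source on a 4k screen the aspect ratios match, so the rectangle simply covers the whole framebuffer (a clean 2x upscale).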
  6. Hi gamedevs! I've been googling my issue for days now, and nothing comes up that would solve it. People have similar problems, but this one seems unique to me?! To get to the point, here's what I have:

  • custom C++ engine, doing the baby steps for now - I can render 2D sprites with transparency properly
  • OpenGL (4.4 core) with SDL2 and GLEW
  • 2 sRGB framebuffers: I render to the 1st, then post-effects into the 2nd, then blit to the default FBO
  • Windows 10 desktop PC, 4k screen (well, two of them, different types)
  • GeForce GTX 1050 Ti, latest drivers
  • SDL_GL_SetAttribute(SDL_GL_FRAMEBUFFER_SRGB_CAPABLE, 1); and glEnable(GL_FRAMEBUFFER_SRGB);

So, the problem: in windowed mode, the gamma is fine. Whenever I go "full screen" (not even exclusive mode, just borderless, i.e. SDL_WINDOW_FULLSCREEN_DESKTOP), the screen suddenly becomes too dark. Here's the fun part: if I then press the volume up/down multimedia key, the gamma becomes correct while the Win10 volume overlay is showing!! But the moment the overlay is gone - boom, darker screen again. While it's "switching", I can see pink-ish, green-ish artifacts for about 1/10 of a second. Switching back to windowed, it's fine again. Screenshots taken while fullscreen look fine.

Now, I have no idea what causes this - is it SDL, OpenGL, Windows, the GPU driver, or my monitor (both of them do this)? I'm hoping someone can help figure this out. I'll be eternally grateful, since this is making me pull my hair out. Disclaimer: I'm not that experienced in this area (graphics programming, GPUs); my greatest achievement was a deferred renderer playground (it used WinAPI, not SDL). Help me please!!

Adam B

EDIT: Forgot to mention that I've checked many settings already. In NVIDIA Control Panel I don't see anything suspicious; under "Adjust desktop color settings" it's "Use NVIDIA settings" - nothing that would indicate any kind of overrides. Windows 10's Game Mode ON/OFF doesn't make a difference. Game Bar is OFF.
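For context on the sRGB setup above: with `GL_FRAMEBUFFER_SRGB` enabled, linear shader output is encoded with the sRGB transfer function on write, and if that encode is lost somewhere along the presentation path, mid-tones appear markedly darker - the same kind of darkening described here. The standard sRGB transfer functions, as a self-contained sketch for reference:

```cpp
#include <cassert>
#include <cmath>

// Exact sRGB transfer functions (IEC 61966-2-1). A linear value of 0.5
// encodes to roughly 0.735, which is why displaying linear data without
// the sRGB encode looks noticeably too dark.
float linearToSrgb(float c)
{
    return (c <= 0.0031308f) ? 12.92f * c
                             : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

float srgbToLinear(float c)
{
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}
```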
  7. This is now resolved. The problem wasn't with NVMeshMender, of course... :) I had several mistakes all working together against me. Hope someone finds my GL_INT_2_10_10_10_REV pack function useful - I personally couldn't find anything via Google!
  8. Here are the tangent vector artifacts:  
  9. Partial success - I managed to come up with the below, which works with normals :) But now I see issues with tangents, probably because of some other issue I have.

```cpp
U32 packNormalizedFloat_2_10_10_10_REV(float x, float y, float z, float w)
{
    const U32 xs = x < 0;
    const U32 ys = y < 0;
    const U32 zs = z < 0;
    const U32 ws = w < 0;
    return U32
    (
        ws << 31 | ((U32)(w       + (ws << 1)) &   1) << 30 |
        zs << 29 | ((U32)(z * 511 + (zs << 9)) & 511) << 20 |
        ys << 19 | ((U32)(y * 511 + (ys << 9)) & 511) << 10 |
        xs <<  9 | ((U32)(x * 511 + (xs << 9)) & 511)
    );
}
```
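A quick way to sanity-check a packer like this is to round-trip it through a matching unpack. A self-contained sketch (the `unpackComponent10` helper and the typedefs are illustrative additions, not from the post):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

typedef uint32_t U32;
typedef int32_t  I32;

// Pack four floats in [-1,1] into GL_INT_2_10_10_10_REV layout, assembling
// sign and magnitude separately so no negative signed value is ever shifted.
U32 packNormalizedFloat_2_10_10_10_REV(float x, float y, float z, float w)
{
    const U32 xs = x < 0;
    const U32 ys = y < 0;
    const U32 zs = z < 0;
    const U32 ws = w < 0;
    return U32
    (
        ws << 31 | ((U32)(w       + (ws << 1)) &   1) << 30 |
        zs << 29 | ((U32)(z * 511 + (zs << 9)) & 511) << 20 |
        ys << 19 | ((U32)(y * 511 + (ys << 9)) & 511) << 10 |
        xs <<  9 | ((U32)(x * 511 + (xs << 9)) & 511)
    );
}

// Unpack one 10-bit signed component back to float (testing helper).
float unpackComponent10(U32 packed, int shift)
{
    I32 v = (I32)((packed >> shift) & 0x3FF); // extract 10 bits
    if (v & 0x200) v -= 1024;                 // sign-extend
    return v / 511.0f;
}
```

Round-tripping a few values through pack/unpack and checking they come back within 1-2 LSBs of quantization error is a cheap way to catch exactly the kind of sign-handling bug discussed in this thread.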
  10. Wow, thanks, your answer made me realize something! I used to have similar code, which I replaced with the MeshMender, so I tried reverting (telling the mender not to recalculate normals) - and to my surprise it had the same artifacts! I then realized there's something else I changed at the same time... the GL_INT_2_10_10_10_REV pack logic. With my old version it works fine - unfortunately I wrote it a long time ago, and I now realize it relies on undefined behavior. :( http://stackoverflow.com/questions/3784996/why-does-left-shift-operation-invoke-undefined-behaviour-when-the-left-side-oper

```cpp
return U32
(
    (((I32)(w) & 3) << 30) |
    (((I32)(z * 511.0f) & 1023) << 20) |
    (((I32)(y * 511.0f) & 1023) << 10) |
    ( (I32)(x * 511.0f) & 1023)
);
```

Now I'm trying to come up with something that isn't undefined... Stay tuned :) Or if anybody has some code, I'd be grateful!
  11. This shouldn't have anything to do with UVs, if I'm not mistaken. These are ONLY the "raw" view-space normal vectors in the screenshots; they come straight from the vertex attribute array. Just to be sure, I tried the same mesh with all UVs removed, and I'm seeing the same thing. :( ... :)
  12. I'm working on a mesh converter (from OBJ to a custom binary format). I've been trying to get tangent vector calculation to work properly, but there were some issues with my code, so I switched to NVMeshMender (though without the "d3d9" dependency - I changed the source slightly to use glm).

Problem is, it's still not working, not even with the tangents out of the picture. Whether I instruct NVMeshMender to (re)calculate the normals or not, it introduces artifacts in the raw normals which I did not have before.

I load the OBJ file line by line, collect unique vertices (those with equal position, normal, and UV indices), then run MeshMender on this data. I store the normals and tangents packed in GL_INT_2_10_10_10_REV format. The strange thing is that only a couple of normals are messed up, and it seems to be random.

Here's a screenshot (note it might seem too dark, but that's because I forgot to turn off the tone mapping pass). The Iron Man model does not seem to have artifacts, but nearly every other model does.

By posting this I'm hoping that somebody will recognize this kind of artifact and can point me in the right direction. Any help is greatly appreciated!

EDIT: When I also export the normals from Blender in the OBJ, the results are slightly different but the artifacts are still present (e.g. the monkey's eyes have more artifacts horizontally)...
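The "collect unique vertices" step described above is commonly done with a map keyed on the OBJ index triple: face corners sharing the same (position, UV, normal) indices collapse into one output vertex, and an index buffer references them. A minimal sketch under that assumption (all names here are illustrative):

```cpp
#include <cassert>
#include <map>
#include <tuple>
#include <vector>

// One OBJ face corner: (position index, uv index, normal index).
typedef std::tuple<int, int, int> IndexTriple;

// Deduplicate face corners: identical triples map to one output vertex.
// Returns the index buffer; uniqueOut receives one triple per unique vertex.
std::vector<unsigned> buildIndexBuffer(const std::vector<IndexTriple>& corners,
                                       std::vector<IndexTriple>& uniqueOut)
{
    std::map<IndexTriple, unsigned> seen;
    std::vector<unsigned> indices;
    indices.reserve(corners.size());
    for (const IndexTriple& c : corners)
    {
        auto it = seen.find(c);
        if (it == seen.end())
        {
            // First time we see this triple: emit a new vertex.
            it = seen.emplace(c, (unsigned)uniqueOut.size()).first;
            uniqueOut.push_back(c);
        }
        indices.push_back(it->second);
    }
    return indices;
}
```

In a real converter the triples would be resolved to actual positions/UVs/normals before handing the vertex stream to MeshMender; the dedup logic itself stays the same.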