LHLaurini
Member

  • Content Count: 102
  • Community Reputation: 675 Good
  • Rank: Member
  1. Hi.

     I ran some tests with a simpler scene. First, some code (this is a very simplified version of it):

         UpdateBuffer(Mesh, &Object->WorldMatrix, Material);

         glBindBuffer(GL_UNIFORM_BUFFER, <UBO>);
         char* Ptr = reinterpret_cast<char*>(glMapBuffer(GL_UNIFORM_BUFFER, GL_WRITE_ONLY));
         if (Ptr)
         {
             std::copy(<Buffer source>, <Buffer source> + <Buffer size>, Ptr);
             glUnmapBuffer(GL_UNIFORM_BUFFER);
         }

         glBindVertexArray(<Object's vertex array>);
         glDrawArrays(GL_TRIANGLES, <First vertex>, <Vertex count>);

     The code above is executed once per object. The results change if I use glBufferSubData instead of glMapBuffer, so here's a "table":

         UBO update method | 1 object | 2 objects¹ | 3 objects¹
         ------------------+----------+------------+-----------
         glMapBuffer       | OK       | OK         | WRONG (1)
         glBufferSubData   | OK       | WRONG (2)  | WRONG (3)

         ¹ It didn't matter whether the objects used the same VBO or not.

     What I mean by "wrong" depends on the combination.

     For (1), the first object rendered fine, the second one disappeared, and the third one was rendered with the wrong VBO but with the right world matrix and material.

     For (2) and (3), only the last object was rendered, with the right world matrix and material but with the wrong VBO.

     Okay, we're getting closer. The UBO update method really matters. Let's try something.

         UBO update method | Run until the UBO update for the 2nd object | Run until the UBO update for the 3rd object
         ------------------+----------------------------------------------+---------------------------------------------
         glMapBuffer       | OK                                           | Same as (1) above
         glBufferSubData   | Same as (2) above                            | Same as (3) above

     Hmm, interesting. So glMapBuffer is erasing the second object after it's already been drawn, and glBufferSubData is erasing everything.

     Now, as you may have guessed, I have no f***ing idea what the f*** is happening. I feel like even if I manage to solve this issue, an even worse one will appear, which is why I'm officially giving up support for Intel Windows OpenGL drivers. It's just not worth it; I've spent two weeks trying to solve this. It's not a big deal anyway, since Intel GPU users on Windows can still use Direct3D 10.1.

     So thanks to everyone who tried to help me, and sorry for wasting your time by not being able to solve this issue. Best regards.
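     P.S. For reference, the glBufferSubData variant from the table is the same update point with the map/unmap replaced by a single call (rough sketch; <UBO>, <Buffer source> and <Buffer size> are the same placeholders as above):

         glBindBuffer(GL_UNIFORM_BUFFER, <UBO>);
         // Write the whole buffer in one call instead of mapping it.
         glBufferSubData(GL_UNIFORM_BUFFER, 0, <Buffer size>, <Buffer source>);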
  2. Hi again. Sorry for the late reply, and thank you to everyone who's trying to help me.

     I had to google a bit to remember what that meant. No, I don't have any immediate-mode code.

     I'm not checking for errors through glGetError yet (out of laziness), but I'm using GL_KHR_debug, which reports even performance warnings.

     That's tricky. The Ubuntu driver only supports 3.2 and 3.3 when using the core context. The greatest non-core version available is 3.0, so the best I could do would be a 3.0 context with the 3.1 extensions loaded. I'll try that if nothing else works.

     That's true. The Windows driver doesn't support geometry shaders, for example. But my engine checks whether all needed extensions are present, so unless I'm missing an extension (which I don't think I am), that's not it either. I'm going to take a look at it anyway.

     My engine always does that.

     I don't know what GetLastError has to do with anything, but I'll take a look at that.

     I don't think so. According to the reference, "No other errors are recorded until glGetError is called, the error code is returned, and the flag is reset to GL_NO_ERROR." Also, ApiTrace always lists all errors and warnings.

     If I had a "proper Windows gaming PC" I wouldn't be using a laptop. Decent computers are pretty expensive where I live, so it's not that simple. Remember: just because you can afford a good computer doesn't mean everybody can. :)

     Yeah, that's something I should do now. Even though ApiTrace verifies errors after each call, it's better to do it on my own.

     My engine is designed not to assume any kind of default value.

     THIS. For some reason, I hadn't even thought about using a simpler scene. Doing this will probably make finding the problem much easier.

     Again, thank you to everyone who replied. Have a nice day.
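     P.S. For anyone who hasn't used GL_KHR_debug before, the setup is roughly this (a rough sketch, not my actual engine code; it assumes the extension or GL 4.3+ is available and that a loader such as GLEW or glad is already initialized, and the callback body is just illustration):

         #include <cstdio>

         // APIENTRY comes from the GL headers / loader (stdcall on Windows).
         static void APIENTRY DebugCallback(GLenum source, GLenum type, GLuint id,
                                            GLenum severity, GLsizei length,
                                            const GLchar* message, const void* userParam)
         {
             // Log every message the driver reports, including performance warnings.
             std::fprintf(stderr, "GL debug: %s\n", message);
         }

         // Enable debug output and register the callback.
         glEnable(GL_DEBUG_OUTPUT);
         glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report errors on the offending call
         glDebugMessageCallback(DebugCallback, nullptr);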
  3. Hello everyone!

     I've recently been porting my Direct3D engine to OpenGL so it can run on platforms other than Windows, but it didn't work as intended. After a lot of debugging (using ApiTrace) I decided to try running it on Ubuntu, and guess what? It worked exactly like I wanted it to. Earlier this afternoon I tried to use llvmpipe (a software rasterizer) on Windows and it ran just like it did on Ubuntu.

     Most objects seem to be rendered at the same place (maybe it's (0,0,0)?) and all of them have the same color, so I'm thinking it may be a problem with the uniform buffers, but ApiTrace doesn't give any related error and the values seem OK.

     This is how it looks with llvmpipe: [attachment=33261:Should.png]

     This is how it looks without it: [attachment=33262:Windows.png]

     I don't know if ApiTrace .trace files are machine independent, but I'm including one in case someone has time to take a look at it: [attachment=33267:Test.zip]

     Now, there's something very important that I should mention. My laptop has Intel integrated graphics, which is known to have very crappy OpenGL drivers on Windows (supports 3.1), but the Mesa driver on Ubuntu is much better (supports 3.3 core). Also, on Windows it runs in an OpenGL 3.1 context, but on Ubuntu and when using llvmpipe it runs in an OpenGL 3.3 core context. I don't know if that could make a difference.

     Does anyone have any idea what it could be? Could it be a driver bug? Could the Mesa drivers be allowing something that shouldn't be allowed?

     Thank you for reading. Best regards.
  4. LHLaurini

    Skinning an OpenGL window

    Like cgrant said, there's no such thing as an OpenGL window. Windows are managed by APIs like GLFW and GLUT, which usually only support the most commonly used features, such as setting a window's title, position and size.

    Also, "skinning" is not very clear. What do you want to do to your window? Do you want to remove the system title bar and use one of your own? Do you want a custom-shaped window?

    Depending on the library you're using, you may be able to do what you want, but it's very likely that you'll need different code for each OS.
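    For example, removing the system title bar is just a window hint in GLFW (a minimal sketch, assuming GLFW 3.2 or later; drawing your own title bar/frame is then up to you):

        #include <GLFW/glfw3.h>

        int main()
        {
            if (!glfwInit())
                return -1;

            // Ask for a window without the system title bar and border.
            glfwWindowHint(GLFW_DECORATED, GLFW_FALSE);

            GLFWwindow* window = glfwCreateWindow(800, 600, "Custom-skinned window", nullptr, nullptr);
            if (!window)
            {
                glfwTerminate();
                return -1;
            }
            glfwMakeContextCurrent(window);

            while (!glfwWindowShouldClose(window))
            {
                // Draw your own title bar / frame here with OpenGL.
                glfwSwapBuffers(window);
                glfwPollEvents();
            }

            glfwTerminate();
            return 0;
        }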
  5. LHLaurini

    Copy Non-MSAA texture to MSAA texture

    Okay... If I were you I would use a shader, even if you just need a 1x1 rectangle, because a staging texture may hit performance too hard. This is just a suggestion, because I've run out of other ideas for what you could do instead.
  6. LHLaurini

    Copy Non-MSAA texture to MSAA texture

    Okay, so could you tell us what exactly you need to achieve and why you need to "draw some pixels of the rendertarget texture with different color" manually? That would help us a lot.
  7. LHLaurini

    Copy Non-MSAA texture to MSAA texture

    What about:

    - Render stuff to an MS texture
    - Resolve to a staging non-MS texture
    - Copy the staging non-MS texture to the swap chain

    But I wouldn't do that. Staging textures are very inefficient, since they need to be kept in system memory (RAM).
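    For reference, the resolve-and-copy part in D3D11 is just two calls (a rough sketch; pMSAATexture, pResolvedTexture and pBackBuffer are placeholder names, the format must match your render target, and the resolved texture is assumed to be a default-usage texture rather than a staging one):

        // Resolve the multisampled render target into a single-sample texture.
        context->ResolveSubresource(pResolvedTexture, 0,          // destination, subresource
                                    pMSAATexture, 0,              // source, subresource
                                    DXGI_FORMAT_R8G8B8A8_UNORM);

        // Copy the resolved texture into the swap chain's back buffer.
        context->CopyResource(pBackBuffer, pResolvedTexture);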
  8. LHLaurini

    How to compile shaders in DX11

    > Ok, nevermind, I found the problem, forgot to set the input layout.

    I'm probably writing this too late, but are you using the debug layer? It's usually very easy to see what's wrong when you enable it, and it sounds like you have it off. If you were doing something wrong with the pipeline (and you were), you would've received a warning in the Output window of Visual Studio.
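    Enabling it is just one flag at device creation (a minimal sketch; in a real project you'd normally only pass the flag in debug builds, and you need d3d11.lib linked):

        #include <d3d11.h>

        UINT flags = 0;
        #ifdef _DEBUG
        flags |= D3D11_CREATE_DEVICE_DEBUG;   // validation messages show up in the Output window
        #endif

        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL featureLevel;

        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, flags,
            nullptr, 0,                        // default feature levels
            D3D11_SDK_VERSION, &device, &featureLevel, &context);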
  9. LHLaurini

    How to compile shaders in DX11

      Actually, the feature is still present in Visual Studio 2015. To use it, you just need to specify a file name in the "Header File Name" option under "Output Files"; it will then output both .cso and .h files. If you don't want the .cso files anymore, just erase the contents of the "Object File Name" option, which is also under "Output Files".

    But I tried it, and it output a 113 KB, 3,207-line header file for a simple shader that just had

        return pow(Texture.Sample(Sampler, input.uv), 1.0f / 2.2f);

    so if you try to compile a bunch of shaders using this method you'll probably only end up increasing compilation time, memory consumption (as Pink Horror pointed out) and the executable's file size, without any significant advantages. But it's still a valid way of loading shaders.

    I've uploaded the compiled shader header if you want to take a look: [attachment=30969:PSHDR.zip]

    It's called PSHDR.hlsl, but I had disabled HDR previously, so there's only gamma correction.
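    Using the generated header then looks roughly like this (a rough sketch; the header and array names are assumptions, since the array name depends on your entry point or on fxc's /Vn switch, so check the generated file, and device is assumed to be a valid ID3D11Device*):

        #include <d3d11.h>
        #include "PSHDR.h"   // generated header; assumed to define a BYTE array named g_main

        // Create the pixel shader directly from the bytecode embedded in the executable.
        ID3D11PixelShader* pixelShader = nullptr;
        HRESULT hr = device->CreatePixelShader(g_main, sizeof(g_main), nullptr, &pixelShader);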
  10. LHLaurini

    How to compile shaders in DX11

    > When you say effects 11, do you mean this? https://github.com/Microsoft/FX11 So, I can just include it in my VS project and compile the effects code from the FL's book without changes? Thanks for the reply!

    I've never used that library, so I can't say for sure, but I took a look and it seems the answer is yes. There seem to be some differences from the original Effects 11 (take a look at this page), but not many.

    Still, I recommend only using effects for educational purposes. Throw them away when you start working on something serious.
  11. LHLaurini

    How to compile shaders in DX11

    If you are using Visual Studio, yes. But there's one little detail: you need to specify the profile you want the shader to compile to (what kind of shader and what version of HLSL). That's pretty easy: just go to the properties of the HLSL file (right-click the file in the Solution Explorer and click Properties) and set it up. If you need any help and can't find it on Google or MSDN, just ask.

    EDIT: Take a look at this. Look for "Shader type" and "Shader model".

    EDIT2: Also, you either need to name your main function "main" or you must set the "Entrypoint Name" parameter to whatever you're using (like PSTextured), so the compiler knows where your code starts.
  12. LHLaurini

    How to compile shaders in DX11

    Hi! First: the DirectX SDK is kinda deprecated, so there's no need for you to download it. That said, I do recommend downloading it, because it has many samples and offline docs; just don't use it for compilation.

    There are basically three ways to compile HLSL shaders: D3DX, D3DCompile and fxc.exe.

    - D3DX is deprecated and not included in the Windows SDK. I don't recommend this one.
    - D3DCompile compiles shaders at runtime, which I also don't recommend (unless you really need runtime compilation) because it CAN make initialization slower. It also isn't the easiest method.
    - fxc.exe is a compile-time shader compiler that outputs .cso (or whatever) files, which you can simply read and pass directly to ID3D11Device::Create*****Shader. This is the easiest method if you are using Visual Studio, because it compiles them automatically for you (see the sketch below).

    I recommend reading this page on MSDN (http://blogs.msdn.com/b/chuckw/archive/2012/05/07/hlsl-fxc-and-d3dcompile.aspx) if you wish to know more. BTW, effects are deprecated and somewhat slower than plain HLSL shaders. Good luck.

    PS: _void_'s answer is also pretty good. +1 for the snippet.
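    Loading a precompiled .cso at startup is just a file read (a minimal sketch; the file name is a placeholder and device is assumed to be a valid ID3D11Device*):

        #include <d3d11.h>
        #include <fstream>
        #include <vector>

        // Read the bytecode that fxc.exe produced at build time.
        std::ifstream file("VertexShader.cso", std::ios::binary);
        std::vector<char> bytecode((std::istreambuf_iterator<char>(file)),
                                   std::istreambuf_iterator<char>());

        // Pass the bytecode straight to the device; no runtime compilation involved.
        ID3D11VertexShader* vertexShader = nullptr;
        device->CreateVertexShader(bytecode.data(), bytecode.size(), nullptr, &vertexShader);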
  13. LHLaurini

    Directly access XMMATRIX ellements

    Glad to hear you managed to make it work. :)
  14. LHLaurini

    Directly access XMMATRIX ellements

    What you need to do is store your XMMATRIX into an XMFLOAT4X4. For that you can use XMStoreFloat4x4 (https://msdn.microsoft.com/en-us/library/windows/desktop/microsoft.directx_sdk.storing.xmstorefloat4x4(v=vs.85).aspx). Then you just use the XMFLOAT4X4 the way you were trying to use the XMMATRIX and boom, done. It's not very obvious for beginners and is a pain in the A for everyone. That's one of the reasons why I moved to GLM.

    This is for REF_Cracker: disabling SIMD is not a solution and will probably cause bigger problems. Also, why do you assume the OP is using Visual Studio? There are lots of compilers and IDEs; you can't assume such a thing.
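    A minimal sketch of what that looks like (the variable names and the rotation matrix are just for illustration):

        #include <DirectXMath.h>
        using namespace DirectX;

        XMMATRIX world = XMMatrixRotationY(1.0f);   // any XMMATRIX you already have

        // Copy the SIMD matrix into a plain struct whose members can be read directly.
        XMFLOAT4X4 worldF;
        XMStoreFloat4x4(&worldF, world);

        float a = worldF._11;       // access elements by row/column name...
        float b = worldF.m[1][2];   // ...or through the 2D array member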