
ravarcade

Member
  1. I double checked ... and the validation layer was not working. It was not reporting any errors, even ones added deliberately, like a missing vkDestroyBuffer before vkFreeMemory. I tested it in my app and in SaschaWillems' simple triangle example (after changing line 32 to #define ENABLE_VALIDATION true). Also, I'm sure the validation layer was working in my app some time ago; I have no idea why it stopped working. Maybe it broke after I updated the Vulkan SDK from 1.0.54.0 to 1.1.82.1. Anyway, today I updated the SDK to 1.1.101.0 and the validation layer works again. So now I get a nice error report when I use an unsupported VkFormat.
  2. @Andrey OGL_D3D: You're right. VK_FORMAT_R8G8B8_UNORM is not supported. For this format, vkGetPhysicalDeviceFormatProperties returns all bits in the VkFormatProperties struct as 0. Now I know how to check it. Thanks for the help.
  3. @Andrey OGL_D3D: A simple Vulkan project? Ha ha. OK, here is SaschaWillems' simple triangle project: https://github.com/SaschaWillems/Vulkan/blob/master/examples/triangle/triangle.cpp Only 3 modifications are needed:

     1. Line 42: change the color type from float[3] to uint32_t:

     ```cpp
     uint32_t color; // float color[3];
     ```

     2. Lines 370-375: store the color as uint32_t, not as float[3]:

     ```cpp
     std::vector<Vertex> vertexBuffer = {
         { {  1.0f,  1.0f, 0.0f }, 0x000000ff /*{ 1.0f, 0.0f, 0.0f }*/ },
         { { -1.0f,  1.0f, 0.0f }, 0x0000ff00 /*{ 0.0f, 1.0f, 0.0f }*/ },
         { {  0.0f, -1.0f, 0.0f }, 0x00ff0000 /*{ 0.0f, 0.0f, 1.0f }*/ }
     };
     ```

     3. Line 946:

     ```cpp
     vertexInputAttributs[1].format = VK_FORMAT_R8G8B8_UNORM; // VK_FORMAT_R32G32B32_SFLOAT;
     ```

     On line 996 there is a call to vkCreateGraphicsPipelines. It will crash inside amdvlk64.dll with a "div by zero" if you are using the latest AMD drivers. If you change the format on line 946 from VK_FORMAT_R8G8B8_UNORM to VK_FORMAT_R8G8B8A8_UNORM, everything works. If you change it to VK_FORMAT_R8G8_UNORM or VK_FORMAT_R8_UNORM it works too, but you lose the blue and green color on the triangle. The same problem exists for VK_FORMAT_R16G16B16_UNORM, VK_FORMAT_R16G16B16_SNORM, and VK_FORMAT_R8G8B8_SNORM; in other words, all formats with 3 RGB components. There is no problem for R, RG, or RGBA. I did not test 16-bit float formats. 32-bit RGB (VK_FORMAT_R32G32B32_SFLOAT) works.
  4. I get a very strange error in my small Vulkan project: a div-by-zero error deep inside amdvlk64.dll (ver. 25.20.15031.1000) after a call to vkCreateGraphicsPipelines. In short, when I try to pass attribute data with VkFormat = VK_FORMAT_R8G8B8_UNORM or VK_FORMAT_R16G16B16_UNORM, the app crashes without any useful info.

     Solution: in every VkPipelineVertexInputStateCreateInfo struct passed to vkCreateGraphicsPipelines, replace VK_FORMAT_R8G8B8_UNORM with VK_FORMAT_R8G8B8A8_UNORM (or VK_FORMAT_R16G16B16_UNORM with VK_FORMAT_R16G16B16A16_UNORM).

     To make things clear:
     - there is no problem with NVIDIA gfx cards
     - the vertex data stride is 16 bytes (vertex: 3x4 bytes, offset = 0; color: 3x1 bytes, offset = 12; 1 byte wasted at the end)
     - when I changed R8G8B8_UNORM to R8G8B8A8_UNORM as a parameter for vkCreateGraphicsPipelines, everything started working (no change in shader programs or data)
     - when I changed R8G8B8_UNORM to R8G8_UNORM or R8_UNORM the program also worked (with missing blue or green color on screen, as expected)
     - same crash when I use R16G16B16_UNORM (with vertex data stride = 20 bytes and 2 bytes wasted)
     - the Vulkan validation layer was enabled

     I am posting this here because I wasted 2 days finding the source of the problem, and it may save time for somebody else. I know it is not a big problem; an R8G8B8 attribute input gives no benefit and can be safely replaced with R8G8B8A8.

     Questions:
     1. Is VK_FORMAT_R8G8B8_UNORM or VK_FORMAT_R16G16B16_UNORM forbidden as a vertex attribute format only on AMD gfx cards? Is it a driver bug?
     2. Is there a way in the Vulkan API to query which attribute formats the driver supports?
  5. Because I'm writing a mod for a game, and everything I do happens inside an intercepted SwapBuffers call. I do not load any libs.
  6. Found the solution! The problem was in the init of glActiveTexture:

     ```cpp
     PFNGLACTIVETEXTUREARBPROC pglActiveTexture = NULL;
     #define glActiveTexture pglActiveTexture;
     pglActiveTexture = (PFNGLACTIVETEXTUREPROC) wglGetProcAddress("glActiveTextureARB");
     ```

     For some reason, when I replaced glActiveTexture(GL_TEXTURE1); with pglActiveTexture(GL_TEXTURE1); it started working.
  7. Hi, I'm trying to pass 2 textures to a fragment shader, but only one texture is available in the shader. I found that the problem is in glActiveTexture: after a call with any argument, like glActiveTexture(GL_TEXTURE1) or glActiveTexture(GL_TEXTURE2), I always end up in texture unit 0. Here is part of the code:

     ```cpp
     glActiveTexture(GL_TEXTURE0);
     // debug
     GLenum e = glGetError();               // e = 0
     GLint nn;
     glGetIntegerv(GL_ACTIVE_TEXTURE, &nn); // nn = GL_TEXTURE0
     glBindTexture(GL_TEXTURE_2D, leftEyeTextureHandle);

     glActiveTexture(GL_TEXTURE1);          // !!! <- it does not work
     // debug
     e = glGetError();                      // e = 0
     glGetIntegerv(GL_ACTIVE_TEXTURE, &nn); // nn = GL_TEXTURE0 !!!!!!
     glBindTexture(GL_TEXTURE_2D, rightEyeTextureHandle);
     ```

     I have already checked:
     - GL_EXTENSIONS for GL_ARB_multitexture
     - a "clean" multitexturing example program, which works on the same computer (GL_ACTIVE_TEXTURE gives GL_TEXTURE1 after glActiveTexture(GL_TEXTURE1))
     - glActiveTextureARB
     - GL_MAX_TEXTURE_IMAGE_UNITS = 20
     - GL_MAX_TEXTURE_UNITS = 4
     - tests on other computers (ATI / NVIDIA gfx)

     The problem is somewhere in the OpenGL state, but I can't figure out where. Is there any "switch" to turn off multitexturing?

     Fragment program:

     ```glsl
     #version 150
     uniform sampler2D left;
     uniform sampler2D right;
     in vec2 UV;
     out vec3 color;

     void main() {
         if (mod(trunc(gl_FragCoord.x), 2.0) < 0.5) {
             color = texture(left, UV).rgb;  // .rgb: texture() returns vec4
         } else {
             color = texture(right, UV).rgb;
         }
     }
     ```