
dpadam450

Member

  • Content count: 3278
  • Joined
  • Last visited

Community Reputation: 2357 Excellent

About dpadam450

  • Rank: Contributor

Personal Information
  1. OpenGL Packing char in float texture

    That's it, thanks.
  2. Sampling a floating-point texture where the alpha channel holds 4 bytes of data packed into the float. I don't know how to cast the raw memory so it is treated as an integer and I can perform bit-shifting operations (see the packing sketch after this list).

         int rgbValue = int(textureSample.w); // 4 bytes of data packed as a color
         // Algorithm might not be correct, and endianness might need switching.
         vec3 extractedData = vec3((rgbValue >> 24) & 0xFF,
                                   (rgbValue >> 16) & 0xFF,
                                   (rgbValue >>  8) & 0xFF);
         extractedData /= 255.0;
  3. OpenGL GLSL Light structure

    I must have been smoking crack. It all works now with no changes. I think I was editing a fragment shader that wasn't the one paired with my vertex shader, so it was bad testing.
  4. OpenGL GLSL Light structure

    That's correct. Using it in the fragment shader seems to not affect anything. I copy/pasted the structure declarations; they are identical.
  5. I have the code below in both my vertex and fragment shaders; however, when I call glGetUniformLocation for "Lights[0].diffuse" or "Lights[0].attenuation", it returns -1. It will only give me a valid uniform location if I actually use the diffuse/attenuation variables in the VERTEX shader. Because I use position in the vertex shader, that one always returns a valid uniform location. I've read that I can share uniforms across both the vertex and fragment stages, but I'm confused about what this is even compiling to if that is the case (see the uniform-query sketch after this list).

         #define NUM_LIGHTS 2
         struct Light
         {
             vec3 position;
             vec3 diffuse;
             float attenuation;
         };
         uniform Light Lights[NUM_LIGHTS];
  6. I saw a video about Star Citizen that says they use the same rig/animations for both the first- and third-person views (as opposed to floating arms). I haven't found any details on how they did it, though. My first concern would be a skinny character vs. a fat character, where in first person you would have to warp the shoulders and crunch them closer to the center of the body so they fit in the frame. I'm very curious how Star Citizen (or any other game) has implemented this. Most games seem to just chop off the third-person model somewhere above the legs in first person so that you can see your legs, and then have just the arms floating by the camera.
  7. FINALLY upgrading my engine to OpenGL 4. I was having some trouble, so I started with a stripped-down application and was wondering if VAOs are required, because I have a sample working, but if I remove the VAO it doesn't seem to want to draw my triangle (see the VAO sketch after this list).
  8. Optimization SIMD 256-bit

    Sadly, the 256-bit SIMD was the same speed as my normal matrix multiplication. The 128-bit SIMD was 2 ms faster.
  9. Optimization SIMD 256-bit

    Right, that was on that website; run-time detection was what I was looking for.
  10. Optimization SIMD 256-bit

    NVM found a solution. https://software.intel.com/en-us/articles/introduction-to-intel-advanced-vector-extensions
  11. I've been starting to optimize my code in any way possible. I saw that some CPUs actually have 256-bit SIMD, but I was wondering if there is a way to detect this and fall back to 128-bit on an unsupported CPU, or how else to deal with this (see the detection sketch after this list).
  12. IBL Diffuse wrong color

    I can support HDR in my engine, but at the asset level I don't support HDR textures. So yes, at some point I need to support loading my skies specifically as HDR (other than potentially an emissive channel, I can't think of any other game asset that would need it). HDR and PBR are completely separate things: PBR is about how light interacts and bounces; HDR is about capturing a proper range of light/photons. So no, HDR is not a requirement of PBR.
  13. IBL Diffuse wrong color

    The skybox is my cubemap for IBL for my PBR shader.
  14. https://tse3.mm.bing.net/th?id=OIP.fCJwPm8kAFQIdWH9AZ7lWQEsDI&pid=15.1&P=0&w=251&h=168 First, I'm using standard RGB8 textures for my skybox, not HDR. If I use a skybox with an image like the one above, downsampling simply gives me a blue tint down my mipmap chain, so on rougher PBR surfaces everything just looks blue-tinted when it shouldn't. I was thinking maybe this is because I'm using LDR textures: with HDR, the brighter pixels would spread further and keep the downsampled levels brighter. I wouldn't think the clouds are bright enough to pull the downsamples toward white even as HDR, but maybe that is wrong. Even on a very cloudy day with minimal sun and a blue sky, I would think that even with HDR it would just downsample to blue, but it depends on how many bright pixels are clustered and how high their range is (see the averaging sketch after this list).
  15. When you get more complex animations driven by input changes (think of something like FIFA), you may be blending several animations, and that is easier to deal with on the CPU, especially since you will need the final bone transforms for physics-related things. However, putting baked animations into a buffer/texture can be good for something like crowds, where they have a few idle or walk animations. You can then offset into the buffer and have all your instances playing a random section of an idle loop (see the crowd-offset sketch after this list).
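
For item 2, a minimal sketch of the CPU side of the scheme, assuming the four bytes are packed into the float with memcpy so the float's bit pattern is reinterpreted rather than numerically converted; the byte order chosen here is an assumption and, as that post notes, endianness may need switching. On the GPU side, GLSL 3.30+ provides floatBitsToInt to recover the integer bits, which is the cast the post is asking about.

    #include <cstdint>
    #include <cstring>

    // Pack four 8-bit values into the bit pattern of a float.
    // The float is only a bit container; its numeric value is meaningless.
    float PackBytesToFloat(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
    {
        uint32_t bits = (uint32_t(r) << 24) | (uint32_t(g) << 16) |
                        (uint32_t(b) << 8)  |  uint32_t(a);
        float packed;
        std::memcpy(&packed, &bits, sizeof(packed)); // bit cast, not a value cast
        return packed;
    }

One caveat: some bit patterns form NaNs, which hardware is not obliged to preserve through texture storage and filtering, so an integer texture format is the safer container for packed bytes.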
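
For item 5, a sketch of querying those uniforms from the application side, assuming a linked program object and an already-initialized GL loader; the names are taken from the post. glGetUniformLocation returns -1 for any uniform that is inactive after linking, and GLSL linkers are allowed to optimize away uniforms that no stage actually reads, so -1 for an unused member is expected behavior rather than an error. Uniform locations belong to the whole program, not to a single stage, which is why using a member in either stage is enough to keep it alive.

    #include <GL/glew.h> // or whichever loader the engine uses
    #include <cstdio>

    void QueryLightUniforms(GLuint program)
    {
        const char* names[] = {
            "Lights[0].position",
            "Lights[0].diffuse",
            "Lights[0].attenuation",
        };
        for (const char* name : names)
        {
            GLint loc = glGetUniformLocation(program, name);
            if (loc == -1)
                std::printf("%s is inactive (optimized out at link time)\n", name);
            else
                std::printf("%s -> location %d\n", name, loc);
        }
    }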
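
For item 7, a sketch of the minimal setup, assuming a core-profile context: in core profile a VAO is effectively required, because glVertexAttribPointer and the draw calls read and write the state of the currently bound vertex array object, and drawing with none bound is an error. The buffer contents and attribute index are illustrative.

    // Assumes a core-profile GL 3.3+/4.x context and a loader already set up.
    GLuint vao = 0, vbo = 0;
    float triangle[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);            // attribute state below is recorded in this VAO

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(triangle), triangle, GL_STATIC_DRAW);

    glEnableVertexAttribArray(0);      // attribute 0: vec2 position (illustrative)
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, nullptr);

    // Per frame: rebind the VAO and draw; with no VAO bound,
    // a core-profile context refuses to draw.
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);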
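
For item 11, a sketch of run-time detection with a 128-bit fallback, assuming GCC or Clang, where __builtin_cpu_supports("avx") checks both the CPUID feature bit and OS support for the 256-bit registers; MSVC would need a manual __cpuid/_xgetbv check instead. The two kernels are hypothetical stand-ins for the real matrix code.

    // Hypothetical kernels standing in for the real matrix routines.
    void MultiplySSE(const float* a, const float* b, float* out); // 128-bit path
    void MultiplyAVX(const float* a, const float* b, float* out); // 256-bit path

    using MultiplyFn = void (*)(const float*, const float*, float*);

    // Pick the widest supported path once at startup.
    MultiplyFn SelectMultiply()
    {
        if (__builtin_cpu_supports("avx"))
            return MultiplyAVX;
        return MultiplySSE; // fallback for CPUs without AVX
    }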
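
For item 14, a tiny worked example of the averaging argument, assuming a plain box filter and made-up pixel values: in RGB8 the sun texel is clamped to 1.0 and gets outvoted by its blue neighbors, while an HDR sun (here an assumed radiance of 50) keeps the averaged mip texel bright.

    #include <cstdio>

    int main()
    {
        // One sun texel averaged with three sky texels (one channel shown;
        // the same applies per channel). All values are illustrative.
        float sky    = 0.2f;  // the non-blue component of a sky texel
        float ldrSun = 1.0f;  // RGB8 clamps the sun to 1.0
        float hdrSun = 50.0f; // assumed HDR sun radiance

        float ldrAvg = (ldrSun + 3.0f * sky) / 4.0f; // 0.40  -> the sky color wins
        float hdrAvg = (hdrSun + 3.0f * sky) / 4.0f; // 12.65 -> still very bright

        std::printf("LDR mip texel: %.2f, HDR mip texel: %.2f\n", ldrAvg, hdrAvg);
    }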
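
For item 15, a sketch of the per-instance offset idea for crowds, assuming the baked animation is addressed by a time value: each instance is given a random phase once, and each frame its sample time is the global clock plus that phase, wrapped to the clip length. All names are illustrative.

    #include <cmath>
    #include <cstdlib>
    #include <vector>

    struct CrowdInstance
    {
        float phase; // random offset into the idle loop, in seconds
    };

    // Scatter the instances through the loop so the crowd doesn't
    // animate in lockstep.
    void InitCrowd(std::vector<CrowdInstance>& crowd, float clipLength)
    {
        for (CrowdInstance& inst : crowd)
            inst.phase = clipLength * (std::rand() / float(RAND_MAX));
    }

    // Per frame: the time used to index into the baked animation buffer/texture.
    float SampleTime(const CrowdInstance& inst, float globalTime, float clipLength)
    {
        return std::fmod(globalTime + inst.phase, clipLength);
    }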