luorax

Shader Storage Buffer driving me nuts


Hey there!

 

I've got an issue that I've already wasted more hours on than it deserves, and I haven't been able to solve it, so I'm asking for help.

 

I've got a persistently mapped buffer object, bound through a named shader storage buffer binding point, which contains the parameters for all my terrain instances. Originally I used a statically sized UBO holding the maximum possible terrain count, and did the same for other types of game objects (materials, light sources, entities, etc.). While that worked fine on my desktop configuration, the arrays were too large for the GPU in my laptop, for example, so I decided to try a different route. It's been working fine for everything else, but the terrain part of the code is buggy.

 

It's highly WIP and I'm experimenting with a lot of different ideas - you may suggest things not related to my problem, but please keep this in mind!

 

The C-side structure looks like this:

    struct TerrainData
    {
        glm::mat4 m_model;
        glm::mat4 m_normal;
        GLint m_layerCount;
        GLfloat m_tileBias;
        GLfloat m_tileScale;
        GLfloat m_pad;
        GLuint m_layer0;
        GLuint m_layer1;
        GLuint m_layer2;
        GLuint m_layer3;
        GLuint m_layer4;
        GLuint m_layer5;
        GLuint m_layer6;
        GLuint m_layer7;
        GLuint m_layer8;
        GLuint m_layer9;
        GLuint m_layer10;
        GLuint m_layer11;
        GLuint m_layer12;
        GLuint m_layer13;
        GLuint m_layer14;
        GLuint m_layer15;
        GLuint m_layer16;
        GLuint m_layer17;
        GLuint m_layer18;
        GLuint m_layer19;
        GLuint m_layer20;
        GLuint m_layer21;
        GLuint m_layer22;
        GLuint m_layer23;
        GLuint m_layer24;
        GLuint m_layer25;
        GLuint m_layer26;
        GLuint m_layer27;
        GLuint m_layer28;
        GLuint m_layer29;
        GLuint m_layer30;
        GLuint m_layer31;
    };

Some pretty basic data: model/normal matrices, triplanar parameters, material indices, etc.

 

The way I access it in GLSL is like this:

//////////////////////////////////////////////////
//  Terrain parameters
struct TerrainParameters
{
    mat4 mModel;
    mat4 mNormal;
    int iLayerCount;
    float fTileBias;
    float fTileScale;
    uvec4 uLayers[8];
};

layout (std140, binding=STORAGE_BINDING_ACTOR) buffer TerrainData
{
    TerrainParameters terrainParameters[];
};

#define _terrain terrainParameters[terrainId-1]

//////////////////////////////////////////////////
//  Model matrix
mat4 getModelMatrix(uint terrainId)
{
    return _terrain.mModel;
}

//////////////////////////////////////////////////
//  Normal matrix
mat4 getNormalMatrix(uint terrainId)
{
    return _terrain.mNormal;
}

//////////////////////////////////////////////////
//  Tile bias/scale
float getTileBias(uint terrainId)
{
    //return 0.5;
    return _terrain.fTileBias;
}

float getTileScale(uint terrainId)
{
    //return 0.00025;
    return _terrain.fTileScale;
}

//////////////////////////////////////////////////
//  Layers
int getLayerCount(uint terrainId)
{
    //return 2;
    return _terrain.iLayerCount;
}

uint getLayerId(uint terrainId, uint index)
{
    //return index+1;
    
    if (index>31)
        return 0;

    return _terrain.uLayers[index/4][index%4];
}
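Under std140, the offsets the GLSL struct above should receive can be worked out by hand; the following is a small sketch of the relevant rules (mat4 behaves like vec4[4] with 16-byte alignment, scalars align to 4 bytes, uvec4 arrays align to 16 with a 16-byte element stride, and the struct's array stride rounds up to a multiple of 16):

```python
# Minimal std140 offset calculator for TerrainParameters (a sketch;
# only the rules needed for this particular struct are implemented).

def align(offset, alignment):
    return (offset + alignment - 1) // alignment * alignment

offset = 0
layout = {}

for name in ("mModel", "mNormal"):          # mat4: align 16, size 64
    offset = align(offset, 16)
    layout[name] = offset
    offset += 64

for name in ("iLayerCount", "fTileBias", "fTileScale"):  # scalars: align 4
    offset = align(offset, 4)
    layout[name] = offset
    offset += 4

offset = align(offset, 16)                  # uvec4[8]: align 16, stride 16
layout["uLayers"] = offset
offset += 8 * 16

stride = align(offset, 16)                  # per-element stride in the SSBO

print(layout)  # uLayers lands at 144, matching the padded C struct
print(stride)  # 272 bytes per TerrainParameters element
```

This predicts uLayers at offset 144 and a 272-byte stride - exactly what the C struct provides - which is why the layout itself looks correct on paper.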

Now here is what's wrong: if I comment out all the layer getters and use the hard-coded magic numbers currently in the comments, then the model and normal matrices read back are the proper ones. If I try to access any of the layer attributes (tile bias/scale, layer count, material indices), then while I get back the appropriate value, my model and normal matrices are screwed up. I tested this by using a "magic model matrix" in the vertex shader - the vertices appear and the materials are okay, but the normals are wrong.

 

Does anyone have any idea what could be wrong? I've already tried multiple things (unrolling the array in the struct, using std430 - just to name fixes for the most common mistakes people make), but none of them helped, and I'm out of ideas.

 

P.S.: Here are two screenshots, just to give you a rough idea of what things look like. I just threw together a bunch of random assets (height map, blend maps, materials, etc.) I could find on the internet, but for development they work just fine. (The triplanar mapping also looks like crap, but that's the topic of a different thread.)

  • Without magic numbers: all the normals are wrong (bottom-right quad). Specular lighting is also screwed up - only a huge bright circle appears, as can be seen on the first quad.
  • With magic numbers: correct normals and correct specular lighting, but everything is driven by magic numbers.

 

[EDIT]
So, I've tested it on my laptop too, and it works perfectly there, on a GT 635M GPU. My AMD card is an R7 260X, and it's bugged there. Hmmpf.
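One way to pin down whether the AMD driver disagrees about the layout is to ask it for the offsets it actually assigned, via the program interface query API (core since GL 4.3). This is only a diagnostic sketch: it assumes a linked program object named `prog`, and the resource-name string is an assumption (blocks without an instance name are usually queried without the block-name prefix).

```c
/* Query the offset/stride the driver actually chose for a buffer
   variable, to compare against the std140 prediction. */
const GLuint idx = glGetProgramResourceIndex(
    prog, GL_BUFFER_VARIABLE, "terrainParameters[0].uLayers[0]");

if (idx != GL_INVALID_INDEX)
{
    const GLenum props[]   = { GL_OFFSET, GL_TOP_LEVEL_ARRAY_STRIDE };
    GLint        values[2] = { 0 };
    glGetProgramResourceiv(prog, GL_BUFFER_VARIABLE, idx,
                           2, props, 2, NULL, values);
    /* std140 predicts offset 144 and a top-level array stride of 272;
       anything else means the driver laid the block out differently. */
    printf("uLayers[0] offset = %d, stride = %d\n", values[0], values[1]);
}
```

If the AMD driver reports different numbers than the NVIDIA one for the same shader, that would confirm a driver-side layout disagreement rather than a bug in the application code.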

Edited by luorax


Edit: Never mind - it looks like you have the padding in your C struct, so it should line up properly.

 

I'm no expert, but I thought the variables in your struct needed to be aligned on four-component (16-byte) boundaries. Try this as the struct in your shader:

 

struct TerrainParameters
{
    mat4 mModel;
    mat4 mNormal;
    int iLayerCount;
    float fTileBias;
    float fTileScale;
    float padding;     // <-- added so that the next member is aligned properly
    uvec4 uLayers[8];
};

 

 

cheers, 

 

Bob

Edited by Scourage

