
A question about HLSL global variable



I use HLSL to implement matrix palette skinning. In the SDK sample, the global matrix array is defined like this:

// Matrix Pallette
static const int MAX_MATRICES = 26;
float4x3 mWorldMatrixArray[MAX_MATRICES] : WORLDMATRIXARRAY;

Now I want to size this array according to the number of GPU constant registers, so newer GPUs can use a bigger matrix palette and improve FPS. But how?

You can't really change your matrix palette size dynamically based on the number of vertex shader constants. Usually, your data is pre-crunched for a fixed maximum number of matrices. But that's not your main question.

To my knowledge, you can't do it in HLSL with automatic constant assignment. There are alternatives, though.

1. You can manually assign registers to all your constants. All your 'static' constants go at the beginning and you set the last one to be your matrix palette.
float4   variable1 : register(c0);
float4   variable2 : register(c1);
...
float4x3 palette   : register(cN);

2. You can compile a variation of your shader at load time according to the space you want to allow for the palette. You only have to pass a #define with the number of matrix entries you want to allow when compiling the shader.
D3DXMACRO defines[2] =
{
    { "MAX_MATRICES", "64" },
    { NULL, NULL }
};
D3DXCompileShader( ..., ..., defines, ..., ... );
// In your HLSL, you'll have a macro called MAX_MATRICES defined to 64.