Posted 12 July 2012 - 03:57 PM
Just for the OP's info, a Texture2DArray in HLSL (on SM4.0+ hardware, with the appropriate API support) is not an array of separate Texture2D objects like you are using. The API defines Texture2DArray as a distinct texture type with an array-size parameter specifying how many textures the array holds, in much the same way that 3D textures and cubemaps are distinct texture types. E.g. in D3D11 it is specified via the ArraySize member of the D3D11_TEXTURE2D_DESC struct used when creating the texture. If your API (XNA in this case) doesn't expose this type of texture then you can't use one.
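To illustrate, here's a minimal sketch of what the HLSL side looks like (the resource names, register slots, and semantics are made up for the example): a single Texture2DArray is declared, and the third component of the sampling coordinate selects the slice.

```hlsl
// Hypothetical names and register slots, purely illustrative.
Texture2DArray gDiffuseArray  : register(t0);
SamplerState   gLinearSampler : register(s0);

float4 PSMain(float2 uv : TEXCOORD0, float slice : TEXCOORD1) : SV_Target
{
    // The z component of the float3 coordinate picks which
    // slice of the array to sample; x and y are the usual UVs.
    return gDiffuseArray.Sample(gLinearSampler, float3(uv, slice));
}
```

On the C++/D3D11 side the slice count lives in the texture description rather than the shader (e.g. setting ArraySize in the D3D11_TEXTURE2D_DESC passed to CreateTexture2D), which is why the API has to support this texture type for the shader code to be usable.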
It would probably compile fine, as the shader compiler is a separate program: all it knows about is the shader source code, the type of shader (vs, ps, etc.) and the shader profile to target (2_0, 3_0, etc.). If the OP had set things up appropriately then there's no reason for the shader compiler to choke on it.
It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.