Using strings in HLSL

Started by
1 comment, last by BattleMetalChris 12 years, 1 month ago
I'm making a shader that will draw a line of text to the screen using a bitmap containing all the characters needed (this is mainly because I can't get DrawText() to work at all.)

The way I'm intending to implement it is this: I've used a little tool I've written to encode each character's bounding-box coordinates in UV space into the pixels of the top few lines of the bitmap. I then pass the bitmap and my line of text into the shader; for each character in the text string, the geometry shader draws a quad and uses the ASCII value of the character to look up that character's UV coordinates in the texture.

Now that I've come to write the .fx file, I've come up against a problem I'd expected to be trivial: how can I pass a string into the shader and then access it?

The HLSL reference says that there is a string type, but that 'There are no operations or states that accept strings, but effects can query string parameters and annotations.' Does this mean I can pass one in but can't read it? Or define one within the shader but not pass one in? The (untested) code I have to pass the string into the shader is:
m_textStringVar->SetRawValue((void*)text.data(), 0, text.length() * sizeof(char));
where 'text' is a std::string and m_textStringVar is an ID3D10EffectStringVariable*.
You wouldn't want to use any "string" type; you'd want to pass your string to your geometry shader as an array of indices, with each index corresponding to the value of each character in the string. Then you could use SV_PrimitiveID to index into that array to get the index of the character you want to render, and then use that index to look up into another array containing the UV coordinates for that character. The arrays that you pass to the shader could be either in a constant buffer or in a typed buffer.
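A rough sketch of what that shader-side lookup might look like, with hypothetical names (TextBuffer, charIndex, charUV, VS_OUT, GS_OUT are all assumptions, and the code is untested):

```hlsl
// Hypothetical constant buffer: one character index per string position,
// and one UV bounding box (uMin, vMin, uMax, vMax) per glyph.
cbuffer TextBuffer
{
    uint4  charIndex[16];   // up to 64 characters, 4 packed per 16-byte register
    float4 charUV[96];      // one entry per printable ASCII glyph
};

// SV_PrimitiveID tells the geometry shader which character this quad is for.
[maxvertexcount(4)]
void GS(point VS_OUT input[1], uint primID : SV_PrimitiveID,
        inout TriangleStream<GS_OUT> stream)
{
    uint idx  = charIndex[primID / 4][primID % 4]; // this character's code
    float4 uv = charUV[idx - 32];                  // assuming ' ' (32) is glyph 0
    // ... emit the four quad vertices using uv.xy and uv.zw ...
}
```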
Given that 32 bits seems to be the smallest integer data type HLSL can use, would it be prudent to pack a string of 64 chars into 16 ints, pass them like that, and then use bit-shifting to unpack them in the shader?

What about packing into four int[4]s (I know HLSL packs everything internally into registers of four 32-bit values), or would that be exactly the same as passing 16 ints?

This topic is closed to new replies.
