[.net] [XNA] VertexBuffer.SetData with bytes

ShawMishrak    146
First off, does VertexBuffer.SetData&lt;T&gt; "officially" support byte arrays on Windows and Xbox 360? For instance, making a call like "vb.SetData&lt;byte&gt;(someByteArray)", where someByteArray is a byte[] containing raw vertex data? I'm running into a situation where this call is not working. On Windows, everything works fine: the byte array is pushed to the graphics card and rendering is performed properly. On Xbox 360, however, the model simply does not draw, and there are no thrown exceptions or debug output. (Unless there is a way to enable Direct3D debug output on the Xbox?) The code is 100% copy-and-paste between the Windows project and the Xbox project. The relevant code is as follows:
// Read the raw vertex data (3 vertices * 20 bytes each) straight from disk.
using (System.IO.FileStream modelStream = new System.IO.FileStream(
    "test1.model", System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.Read))
{
    byte[] mdl = new byte[modelStream.Length];
    modelStream.Read(mdl, 0, (int)modelStream.Length);

    // offsetInBytes, data, startIndex, elementCount (3 * 20 bytes), vertexStride, options
    vb.SetData<byte>(0, mdl, 0, 3 * 20, 20, SetDataOptions.None);
}
(I also tried the basic SetData(mdl) call, but that likewise works on Windows and not on Xbox.) test1.model is a binary file that contains 3 vertices of 20 bytes each (5 floats * 4 bytes/float), laid out as follows (C++ struct layout):
struct
{
    float x, y, z;
    float u, v;
};
I'm not seeing any fundamental reason why this should not work. The vertex declaration is set up as follows:
decl = new VertexDeclaration(graphics.GraphicsDevice, new VertexElement[] {
    new VertexElement(0, 0, VertexElementFormat.Vector3, VertexElementMethod.Default, VertexElementUsage.Position, 0),
    new VertexElement(0, 12, VertexElementFormat.Vector2, VertexElementMethod.Default, VertexElementUsage.TextureCoordinate, 0) });
I know the "proper" way to do things now is to use XNA's mesh formats, but I am porting my C++ engine to XNA and would really like to keep my model formats without writing converters that unpack a byte array into a C# vertex struct, the way the tutorials show SetData&lt;T&gt; being used. I'm starting to wonder if this is simply unsupported usage on Xbox, especially since I have not had a problem with it on Windows.

joelmartinez    338
I don't know whether it's supported or not... but I have to question why you're trying to avoid the content pipeline.

I know it's probably a bit of a pain to get the functionality up and running, but in the long run you're opening up a great point of extensibility. You could relatively easily write a custom Content Importer to take your custom files and turn them into proper XNA models.
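
Something like the following would get you started. This is only a rough sketch: the class name, the ".model" extension, and the choice to import straight into a byte[] are mine for illustration, but ContentImporter&lt;T&gt; and ContentImporterContext are the pipeline's real types.

using Microsoft.Xna.Framework.Content.Pipeline;

[ContentImporter(".model")]
public class RawModelImporter : ContentImporter<byte[]>
{
    // Illustrative only: passes the raw bytes through unparsed. A real importer
    // would parse them into proper content types here.
    public override byte[] Import(string filename, ContentImporterContext context)
    {
        return System.IO.File.ReadAllBytes(filename);
    }
}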

ShawMishrak    146
I agree that a custom importer would work for this particular situation, but I'm very curious as to *why* this is not working. If I instead use a C# struct of floats, populate it with the floating-point values decoded from the file, and call something like "vb.SetData&lt;MyVertex&gt;(verts)", everything works as expected. It just seems to be doing some kind of internal work that is messing around with the data.
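
For reference, here is roughly what that working path looks like. MyVertex, its field names, and the reader loop are my own sketch, laid out to match the vertex declaration above:

using Microsoft.Xna.Framework;

// 20-byte vertex matching the C++ layout: three position floats, two UV floats.
struct MyVertex
{
    public Vector3 Position;
    public Vector2 TexCoord;
}

MyVertex[] verts = new MyVertex[3];
using (System.IO.BinaryReader reader = new System.IO.BinaryReader(
    System.IO.File.OpenRead("test1.model")))
{
    for (int i = 0; i < verts.Length; i++)
    {
        // BinaryReader decodes the little-endian file data into native floats.
        verts[i].Position = new Vector3(reader.ReadSingle(), reader.ReadSingle(), reader.ReadSingle());
        verts[i].TexCoord = new Vector2(reader.ReadSingle(), reader.ReadSingle());
    }
}
vb.SetData<MyVertex>(verts);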

joelmartinez    338
Quote:
Original post by ShawMishrak
... but I'm very curious as to *why* this is not working ...

... It just seems to be doing some kind of internal work that is messing around with the data.
You could always just fire up .NET Reflector against the x86 and Xbox 360 versions of the XNA assemblies and see if they're doing something different :-P

ShawMishrak    146
Well, I finally figured out what the issue is: I forgot that the Xbox 360 uses PowerPC processors.

The short answer is that either the XNA vertex buffer wrapper internally interprets the byte stream as floating-point numbers, or the ATI graphics processor itself is big-endian (which would make sense if they wanted to match the GPU to the big-endian PowerPC cores). Either way, the byte stream was being interpreted as big-endian floating-point numbers, not little-endian as on Windows (Intel). Flipping the bytes in groups of 4 fixed the problem.

For those of you unfamiliar with this concept, the in-memory byte sequence of the floating-point number 1.0f (IEEE 754 bit pattern 0x3F800000) depends on the platform:

Little-endian (Intel): 00 00 80 3F
Big-endian (PowerPC): 3F 80 00 00 (note the reversed byte order)

Hopefully someone will be able to benefit from this and avoid my stupid mistake in the future. If you're doing byte-level manipulations, keep in mind the endianness of your platform! :)
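
For what it's worth, the fix amounts to something like this. It's my own sketch (FlipFloats is just a name I picked), and it assumes the stream consists entirely of 4-byte floats, as my vertex format does:

// Reverse each 4-byte group in place, turning the little-endian floats
// from the file into big-endian order for the Xbox 360.
static void FlipFloats(byte[] data)
{
    for (int i = 0; i + 3 < data.Length; i += 4)
    {
        byte tmp = data[i];
        data[i] = data[i + 3];
        data[i + 3] = tmp;

        tmp = data[i + 1];
        data[i + 1] = data[i + 2];
        data[i + 2] = tmp;
    }
}

// On Xbox 360, flip before uploading:
FlipFloats(mdl);
vb.SetData<byte>(0, mdl, 0, 3 * 20, 20, SetDataOptions.None);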


Joel,

That's a good idea; unfortunately, I don't believe the Xbox assemblies are inspectable. The Xbox reference assemblies installed as part of Game Studio Express appear to be little more than what their name implies: references. Viewing them with ildasm, all of the methods just return immediately. They're most likely only there for compilation purposes.

ShawMishrak    146
As far as I know, no. Conversion routines wouldn't be hard to write, though: you just take however many bytes are in your data type and mirror them.

However, unless you're sharing binary data between Windows and Xbox *and* not using the XNA content pipeline, it should not be an issue.
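
A generic version of that, as a sketch (SwapEndian is my own name, not a library routine):

// Mirror the bytes of every element in place; elementSize would be
// 4 for float/int, 8 for double, and so on.
static void SwapEndian(byte[] data, int elementSize)
{
    for (int offset = 0; offset + elementSize <= data.Length; offset += elementSize)
    {
        System.Array.Reverse(data, offset, elementSize);
    }
}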
