Chr1sen

Direct3D UYVY texture.


Hello! I am making a custom video player that plays video using UYVY textures.

Now I've made sure of a few things (fairly new GPU card):

1) my GPU supports non-power-of-two textures
2) my GPU supports UYVY textures
3) the frame data and width/height are correct. I've verified this in many different ways; for example, doing the conversion into RGB32 manually on the CPU and using that as the texture works fine (roughly the approach sketched below).
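
For reference, the manual fallback mentioned in point 3 looks roughly like this. This is a minimal sketch, assuming 8-bit packed UYVY (U0 Y0 V0 Y1, i.e. two pixels per four bytes) and BT.601 coefficients; the class/method names and the BGRA output order are placeholders, not my player's actual code.

using System;

static class UyvyCpuFallback
{
	// Converts one tightly packed UYVY frame into 32-bit BGRA (X8R8G8B8-style
	// byte order), suitable for uploading as a plain RGB32 texture.
	public static byte[] ToRgb32(byte[] uyvy, int width, int height)
	{
		var rgb = new byte[width * height * 4];

		for (int i = 0, o = 0; i < uyvy.Length; i += 4, o += 8)
		{
			// One UYVY macropixel: two pixels sharing the same U/V pair.
			float u  = uyvy[i]     - 128;
			float y0 = uyvy[i + 1];
			float v  = uyvy[i + 2] - 128;
			float y1 = uyvy[i + 3];

			WritePixel(rgb, o,     y0, u, v);
			WritePixel(rgb, o + 4, y1, u, v);
		}

		return rgb;
	}

	private static void WritePixel(byte[] rgb, int offset, float y, float u, float v)
	{
		// BT.601 YUV -> RGB (full-range approximation, kept simple on purpose).
		var r = y + 1.402f * v;
		var g = y - 0.344f * u - 0.714f * v;
		var b = y + 1.772f * u;

		rgb[offset]     = Clamp(b);
		rgb[offset + 1] = Clamp(g);
		rgb[offset + 2] = Clamp(r);
		rgb[offset + 3] = 255; // unused X/alpha byte
	}

	private static byte Clamp(float value)
	{
		return (byte)Math.Max(0f, Math.Min(255f, value));
	}
}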

Bottom line is:
sometimes my GPU can't handle UYVY textures.

For example, some video resolutions work fine, such as:
1024x720
720x528 (or thereabouts)

At other video resolutions, the texture output is complete gibberish: just a mess that reminds me of zigzag lines.
Are there some limitations I'm not aware of? Or has anyone dealt with this?


On another note: I am using Direct3D9. Can anyone suggest some keywords to google so that I can upload YUV data into a texture and take over the "sampling" and YUV-to-RGB conversion myself in a shader?

As far as I've searched,
you can have only one texture active at a shader stage with DX9, which means I would have to upload all of the YUV data into one texture at once.

So, for example, a UYVY frame will take width*height + width*height/2 + width*height/2 bytes, and I am not sure whether GPUs even support textures that big when we're talking about full-HD movies.
I am also not sure how I should do the sampling myself.
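
For scale, here is a rough byte count for one full-HD frame, assuming 8-bit samples (the 1920x1080 figures and the helper name are just an example):

	// Rough data size of one 1920x1080 4:2:2 frame at 8 bits per sample.
	static int FrameSizeInBytes(int width = 1920, int height = 1080)
	{
		int luma   = width * height;      // 2,073,600 bytes of Y
		int chroma = width * height / 2;  // 1,036,800 bytes each of U and V
		return luma + 2 * chroma;         // 4,147,200 bytes, roughly 4 MB per frame
	}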

Hodgman

When you copy data into your texture, do you take the pitch into account? D3D tells you the pitch when you lock/map the texture.
 
e.g. a 19x3 texture might look like this, where in between the rows (0/1/2), there's some padding (P):
    pitch
|----------------------|
    width
|-----------------|
0000000000000000000PPPPP
1111111111111111111PPPPP
2222222222222222222PPPPP

"you can have only one texture active at shader stage"
No, there are many sampler slots per shader stage that you can bind textures to.
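
For example, a minimal SlimDX-style sketch of a three-plane setup (splitting YUV into separate Y/U/V textures, and the Format.L8 / Usage.Dynamic / Pool.Default choices, are assumptions on my part, not something taken from your code):

	// One 8-bit texture per plane, each bound to its own sampler slot.
	// In the pixel shader these show up as sampler registers s0, s1 and s2.
	var yTexture = new Texture(device, width,     height, 1, Usage.Dynamic, Format.L8, Pool.Default);
	var uTexture = new Texture(device, width / 2, height, 1, Usage.Dynamic, Format.L8, Pool.Default); // 4:2:2 chroma
	var vTexture = new Texture(device, width / 2, height, 1, Usage.Dynamic, Format.L8, Pool.Default);

	device.SetTexture(0, yTexture);
	device.SetTexture(1, uTexture);
	device.SetTexture(2, vTexture);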

Chr1sen

Thanks. I got everything working now.

It's a shame that I really didn't see people using the "Pitch" property before. I guess they are mostly dealing with textures whose pitch is the same as the source width.
 

		/// <summary>
		/// Copies a tightly packed frame into the texture, one scanline at a time,
		/// respecting the destination pitch.
		/// </summary>
		/// <param name="texture">destination texture</param>
		/// <param name="contentToCopy">source buffer</param>
		/// <param name="sourcePitch">source stride in bytes (assumed no larger than the destination pitch)</param>
		private static void CopyToTexture(Texture texture, byte[] contentToCopy, int sourcePitch)
		{
			var data = texture.LockRectangle(0, LockFlags.Discard);
			var y = 0;

			for (var scanLineStart = 0; scanLineStart < contentToCopy.Length; scanLineStart += sourcePitch, y++)
			{
				// Write one source row, then jump to the start of the next
				// destination row, which may be padded out to data.Pitch bytes.
				data.Data.WriteRange(contentToCopy, scanLineStart, sourcePitch);
				data.Data.Seek((y + 1) * data.Pitch, SeekOrigin.Begin);
			}

			texture.UnlockRectangle(0);
		}

 


And if anyone is interested, I ended up using this for my YUV->RGB shader:
http://subversion.assembla.com/svn/AvP/branches/workbranch/trunk/shaders/fmvPixel.psh

Though I am thinking about writing my own shader that uses only a single sampler2D, so I could pass packed YUV formats directly to the GPU. I am not sure where to begin, though. Is that even a good idea? The texture is definitely going to be huge.


