Texture2D.GetData() returns completely transparent data (MonoGame)

7 comments, last by phil_t 8 years, 4 months ago

I'm trying to use GetData() on a Texture2D, as such:


        // Shared scratch buffer, reused across calls to avoid reallocating per texture.
        private static uint[] SINGLE_THREADED_TEXTURE_BUFFER = new uint[MaxResampleBufferSize];

        /// <summary>
        /// Manually replaces the specified colors in a texture with transparent black,
        /// thereby masking them.
        /// </summary>
        /// <param name="Texture">The texture on which to apply the mask.</param>
        /// <param name="ColorsFrom">The colors to mask away.</param>
        public static void ManualTextureMaskSingleThreaded(Texture2D Texture, uint[] ColorsFrom)
        {
            var ColorTo = Color.TransparentBlack.PackedValue;

            var size = Texture.Width * Texture.Height;
            uint[] buffer = SINGLE_THREADED_TEXTURE_BUFFER;

            // Read back exactly as many elements as the texture holds; the
            // parameterless GetData(buffer) overload expects the array length
            // to match the texture's pixel count.
            Texture.GetData(buffer, 0, size);

            var didChange = false;

            for (int i = 0; i < size; i++)
            {
                if (ColorsFrom.Contains(buffer[i]))
                {
                    didChange = true;
                    buffer[i] = ColorTo;
                }
            }

            if (didChange)
            {
                Texture.SetData(buffer, 0, size);
            }
        }

But it always returns an array of 0s. Why? :\


What format is the texture in?

Does MaxResampleBufferSize match the size of the texture?

MaxResampleBufferSize is 1024 * 768. I tried changing it to the size of the texture, but it doesn't seem to have made a difference. As far as I can tell, the format is Color.
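To rule out the replacement loop itself, it can be exercised on a plain array with no GPU involved. The sketch below uses a hypothetical 2x2 pixel buffer with made-up packed color values, and swaps the linear `ColorsFrom.Contains` scan for a `HashSet` lookup (an assumption on my part, not the thread's code; it just makes each per-pixel test O(1)):

```csharp
using System;
using System.Collections.Generic;

class MaskSketch
{
    // Replaces every occurrence of the given packed colors with transparent
    // black (packed value 0). Returns true if anything changed.
    static bool MaskColors(uint[] pixels, int count, uint[] colorsFrom)
    {
        // HashSet lookup avoids rescanning colorsFrom for every pixel.
        var mask = new HashSet<uint>(colorsFrom);
        bool didChange = false;
        for (int i = 0; i < count; i++)
        {
            if (mask.Contains(pixels[i]))
            {
                pixels[i] = 0u; // Color.TransparentBlack.PackedValue == 0
                didChange = true;
            }
        }
        return didChange;
    }

    static void Main()
    {
        // Hypothetical 2x2 texture: two pixels match the mask color, two don't.
        uint[] pixels = { 0xFFFF00FF, 0xFF0000FF, 0xFFFF00FF, 0xFF00FF00 };
        bool changed = MaskColors(pixels, pixels.Length, new uint[] { 0xFFFF00FF });
        Console.WriteLine(changed);   // True
        Console.WriteLine(pixels[0]); // 0
    }
}
```

If this prints the expected values, the loop is sound and the zeros must be coming from GetData itself.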

Here's how I'm loading it:


        /// <summary>
        /// Gets a Texture2D instance from the FileManager.
        /// </summary>
        /// <param name="AssetID">The FileID/InstanceID of the texture to get.</param>
        /// <returns>A Texture2D instance.</returns>
        public static Texture2D GetTexture(ulong AssetID)
        {
            using (Stream Data = GrabItem(AssetID))
            using (MemoryStream PNGStream = new MemoryStream())
            {
                if (IsBMP(Data))
                {
                    // Dispose the intermediate bitmap once it's been re-encoded as PNG.
                    using (Bitmap BMap = new Bitmap(Data))
                        BMap.Save(PNGStream, System.Drawing.Imaging.ImageFormat.Png);
                }
                else
                {
                    using (Paloma.TargaImage TGA = new Paloma.TargaImage(Data))
                        TGA.Image.Save(PNGStream, System.Drawing.Imaging.ImageFormat.Png);
                }

                PNGStream.Seek(0, SeekOrigin.Begin);
                return Texture2D.FromStream(m_Game.GraphicsDevice, PNGStream);
            }
        }

        /// <summary>
        /// Checks if the supplied data is a BMP.
        /// </summary>
        /// <param name="Data">The data as a Stream.</param>
        /// <returns>True if data was BMP, false otherwise.</returns>
        private static bool IsBMP(Stream Data)
        {
            using (BinaryReader Reader = new BinaryReader(Data, Encoding.UTF8, true))
            {
                byte[] data = Reader.ReadBytes(2);
                byte[] magic = new byte[] { (byte)'B', (byte)'M' };

                // Rewind so the decoder that consumes the stream next
                // starts at the beginning, not two bytes in.
                Data.Seek(0, SeekOrigin.Begin);

                return data.SequenceEqual(magic);
            }
        }
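One thing worth noting: reading the two magic bytes advances the stream, so the position needs rewinding before the stream is handed to `Bitmap` or `TargaImage`. A minimal, MonoGame-free sketch of the check with the rewind included (class and variable names here are mine, for illustration):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text;

class BmpCheckSketch
{
    // Returns true if the stream starts with the "BM" magic bytes, and
    // rewinds the stream so the next consumer sees the file from the start.
    static bool IsBMP(Stream data)
    {
        // leaveOpen: true so disposing the reader doesn't close the stream.
        using (var reader = new BinaryReader(data, Encoding.UTF8, true))
        {
            byte[] head = reader.ReadBytes(2);
            data.Seek(0, SeekOrigin.Begin); // rewind for the decoder
            return head.SequenceEqual(new byte[] { (byte)'B', (byte)'M' });
        }
    }

    static void Main()
    {
        var bmp = new MemoryStream(new byte[] { (byte)'B', (byte)'M', 0x00, 0x01 });
        Console.WriteLine(IsBMP(bmp));   // True
        Console.WriteLine(bmp.Position); // 0 -- rewound, ready to decode

        var tga = new MemoryStream(new byte[] { 0x00, 0x02, 0x0A });
        Console.WriteLine(IsBMP(tga));   // False
    }
}
```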

What does the Format property of the texture return?

And you've verified the texture is not black, i.e. it displays ok when you render it?

What happens if you pass Color as the type for GetData, instead of uint?

Texture.Format is Color.

Passing Color as the type for GetData doesn't change anything; I get a bunch of colors with (0, 0, 0, 0) :(

And yes, they render correctly.

Ok, then this may be a bug with MonoGame or something. Searching around the web, it seems to be a common problem (or was in the past) on some GPUs.

At any rate, retrieving the texture data from the GPU is expensive. Is there any reason you have to do this masking at runtime instead of in a pre-processor at compile time?

Thanks! :)

The problem is, I'm working with old data that uses color masks rather than alpha. It seems the most reliable way to make masking work on all GPUs, then, is to write an extension that runs before the game starts and converts all the BMPs to PNGs with alpha.

However, this is slightly worrying, because the reason I was looking into this problem again (I had originally ignored it for the time being) was that I was trying to set a RenderTarget2D on the GraphicsDevice, and it was failing with an InvalidOperationException. I wonder if this is also a MonoGame bug...

It seems the solution was closer than I thought it would be. All I had to do was call Bitmap.MakeTransparent() on my bitmaps :)
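For anyone landing here later, `Bitmap.MakeTransparent(Color)` turns every pixel of the given color fully transparent, and the PNG encoder preserves that alpha, so the resulting stream can go straight to `Texture2D.FromStream`. A standalone sketch (assuming System.Drawing is available; the 2x1 bitmap and magenta key color are made up for illustration):

```csharp
using System;
using System.Drawing;          // System.Drawing / System.Drawing.Common
using System.Drawing.Imaging;
using System.IO;

class MakeTransparentSketch
{
    static void Main()
    {
        // Hypothetical 2x1 bitmap: one magenta (mask) pixel, one opaque red pixel.
        using (var bmap = new Bitmap(2, 1, PixelFormat.Format32bppArgb))
        {
            bmap.SetPixel(0, 0, Color.Magenta);
            bmap.SetPixel(1, 0, Color.Red);

            // Makes every pixel matching the key color fully transparent.
            bmap.MakeTransparent(Color.Magenta);

            Console.WriteLine(bmap.GetPixel(0, 0).A); // 0   -- masked away
            Console.WriteLine(bmap.GetPixel(1, 0).A); // 255 -- untouched

            // Re-encode as PNG; alpha survives, so the stream is ready for
            // Texture2D.FromStream with no manual GetData/SetData masking.
            using (var pngStream = new MemoryStream())
            {
                bmap.Save(pngStream, ImageFormat.Png);
                pngStream.Seek(0, SeekOrigin.Begin);
            }
        }
    }
}
```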

If that's all you're trying to do, there's a "transparent color" property on textures you have included in your content project. Just set it to the required color. (At least textures in XNA content projects had that; I'm not sure if MonoGame still requires XNA content projects or if they have their own content processor support now.)
