Scale vertices to pixels


Sorry if the title is a bit unclear, but it's the best I could come up with ^^

What I mean is: I want to use quads to display the UI of my game instead of drawing bitmaps through Direct2D as I do currently.

So I have my UI textures, which are of course a fixed size, e.g. 64×64 px.

Now I want to have a screen-space quad that is scaled so the texture pixels are always 1:1 with the screen pixels, to avoid scaling artifacts.

What would be the best or easiest way to do this?

1. Precompute the quads, e.g. when setting the screen resolution.

2. Have one quad defined somewhere, then scale it in the render method according to the texture size.

3. Same as 2, but do the scaling in the shader.

4. Some other method I didn't think of?

Create an orthographic projection matrix with the width and height set to your screen width and height in pixels.
https://msdn.microsoft.com/en-us/library/windows/desktop/bb205347(v=vs.85).aspx
Then you can specify vertex positions in pixels.
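
In SharpDX terms (which the code further down uses) that might look roughly like this; it's only a sketch, with screenWidth/screenHeight standing in for whatever your back-buffer size is:

// Sketch: map pixel coordinates (0,0)..(screenWidth,screenHeight), y pointing down,
// onto clip space, so vertex positions can be specified directly in pixels.
var projection = Matrix.OrthoOffCenterLH(0f, screenWidth, screenHeight, 0f, 0f, 1f);

// HLSL packs cbuffer matrices column-major by default, so the matrix is usually
// transposed before being written to the constant buffer.
var transform = Matrix.Transpose(projection);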

As for precomputing or scaling, I would first of all wrap that behavior in a 2D drawing interface that hides your implementation:

class Canvas2D
{
public:
    void StartFrame();
    void DrawImage(float x, float y, float w, float h, Texture2D& texture);
    void DrawImage(float x, float y, float w, float h, Texture2D& texture, float srcX, float srcY, float srcW, float srcH);
    void EndFrame();
    //... any other useful drawing methods
};
I would probably then implement option #3 since it would be the simplest to create; a rough sketch of that approach follows below. If your GUI becomes really complex, #3 will incur a lot of draw calls and could potentially be slow. You could then change the implementation to #1 and batch subsequent draw calls that use the same image together. But odds are #3 would work just fine.
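
One possible shape for option #3, written as a SharpDX-style C# sketch to match the code below (QuadConstants and _unitQuad are made-up names, not from any posted code), is to keep a single shared unit quad and send the destination rectangle per draw, letting the vertex shader scale it:

// Hypothetical per-draw constants: pixel-space projection plus destination rectangle.
struct QuadConstants
{
    public Matrix Projection;  // orthographic pixel-space projection (transposed for HLSL)
    public Vector4 DestRect;   // x, y, width, height in pixels
}

public void DrawImage(float x, float y, float w, float h, ShaderResourceView texture)
{
    _constants.DestRect = new Vector4(x, y, w, h);
    _context.UpdateSubresource(ref _constants, _cBuffer);
    _context.PixelShader.SetShaderResource(0, texture);

    // _unitQuad is a VertexBufferBinding for a 0..1 quad created once; the vertex
    // shader would compute pixelPos = DestRect.xy + unitPos * DestRect.zw
    // before applying Projection.
    _context.InputAssembler.SetVertexBuffers(0, _unitQuad);
    _context.DrawIndexed(6, 0, 0);
}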

So, my current implementation looks like this:

I've got a Quad class which contains a vertex buffer, an index buffer, and an SRV for the texture, and is initialized based on the desired position and size:


using System;
using SharpDX;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using Buffer = SharpDX.Direct3D11.Buffer;
using Device = SharpDX.Direct3D11.Device;

namespace DX11Renderer.Graphics
{
    public class Quad : IDisposable
    {
        public Buffer VertexBuffer;
        public Buffer IndexBuffer;

        public ShaderResourceView Texture;

        public InputLayout Layout;

        public Quad(Device device, byte[] signature, Vector2 position, Vector2 size)
        {
            var vertices = new[]
            {
                new Vector4(position, 0, 1),
                new Vector4(position.X + size.X, position.Y, 0, 1),
                new Vector4(position + size, 0, 1),
                new Vector4(position.X, position.Y + size.Y, 0, 1)
            };

            // One uv per vertex, in the same corner order as the positions above
            // (top-left, top-right, bottom-right, bottom-left).
            var uvs = new[]
            {
                new Vector2(0, 0),
                new Vector2(1, 0),
                new Vector2(1, 1),
                new Vector2(0, 1)
            };

            var indices = new[]
            {0, 1, 2, 2, 3, 0};

            // Interleave one float4 position and one float2 uv per vertex: 24 bytes per vertex.
            var vertexStream = new DataStream((Utilities.SizeOf<Vector4>() + Utilities.SizeOf<Vector2>()) * 4, true, true);
            var indexStream = new DataStream(sizeof(int) * 6, true, true);

            for (int i = 0; i < 4; i++)
            {
                vertexStream.Write(vertices[i]);
                vertexStream.Write(uvs[i]);
            }

            for (int i = 0; i < 6; i++)
            {
                indexStream.Write(indices[i]);
            }

            VertexBuffer = new Buffer(device, vertexStream, new BufferDescription(Utilities.SizeOf<Vector4>()*4 + Utilities.SizeOf<Vector2>() * 4, 
                ResourceUsage.Default, BindFlags.VertexBuffer, CpuAccessFlags.None, ResourceOptionFlags.None, 0));
            IndexBuffer = new Buffer(device, indexStream, new BufferDescription(sizeof(int) * 6, 
                ResourceUsage.Default, BindFlags.IndexBuffer, CpuAccessFlags.None, ResourceOptionFlags.None, 0));

            // Matches the interleaved layout above: TEXCOORD starts at byte offset 16, after the float4 position.
            Layout = new InputLayout(device, signature, new[]
            {
                new InputElement("POSITION", 0, Format.R32G32B32A32_Float, 0, 0),
                new InputElement("TEXCOORD", 0, Format.R32G32_Float, 16, 0)
            });
        }

        public void Dispose()
        {
            VertexBuffer.Dispose();
            IndexBuffer.Dispose();
            Layout.Dispose();
        }
    }
}

I create a Quad e.g. in my UIElementButton class like this:


_buttonQuad = new Quad(GameServices.GetService<RendererD3D11>().D3DDevice,
                       GameServices.GetService<RendererD3D11>().Shaders["QuadShader"].Signature,
                       Vector2.Zero,
                       new Vector2(256, 64));
_buttonQuad.Texture = ShaderResourceView.FromFile(GameServices.GetService<RendererD3D11>().D3DDevice, @"Content\Interface\Menu\Button.dds");

The quad is then sent to the DrawQuad method in the QuadRenderer class:


public void DrawQuad(Quad quad, Shader shader, Device d3DDevice)
{
    _context.VertexShader.SetConstantBuffer(0, _cBuffer);
    _context.UpdateSubresource(ref _transform, _cBuffer);
    _context.InputAssembler.PrimitiveTopology = PrimitiveTopology.TriangleList;
    // 24-byte stride: one float4 position plus one float2 uv per vertex.
    _context.InputAssembler.SetVertexBuffers(0, new VertexBufferBinding(quad.VertexBuffer, 24, 0));
    _context.InputAssembler.SetIndexBuffer(quad.IndexBuffer, Format.R32_UInt, 0);
    _context.InputAssembler.InputLayout = quad.Layout;
    _context.VertexShader.Set(shader.VertexShader);
    _context.PixelShader.Set(shader.PixelShader);
    _context.PixelShader.SetSampler(0, _sampler);
    _context.PixelShader.SetShaderResource(0, quad.Texture);
    _context.OutputMerger.SetDepthStencilState(_state);

    _context.DrawIndexed(6, 0, 0);
}

FYI, the sampler state is AnisotropicWrap and the depth-stencil state is Disabled.
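
For reference, those two states would presumably be created along these lines in SharpDX (a sketch only; the actual creation code isn't shown here, and d3DDevice stands in for whatever device reference is used):

// Assumed equivalent of the "AnisotropicWrap" sampler mentioned above.
var samplerDesc = SamplerStateDescription.Default();
samplerDesc.Filter = Filter.Anisotropic;
samplerDesc.AddressU = TextureAddressMode.Wrap;
samplerDesc.AddressV = TextureAddressMode.Wrap;
samplerDesc.AddressW = TextureAddressMode.Wrap;
samplerDesc.MaximumAnisotropy = 16;
_sampler = new SamplerState(d3DDevice, samplerDesc);

// Assumed equivalent of the disabled depth-stencil state.
var depthDesc = DepthStencilStateDescription.Default();
depthDesc.IsDepthEnabled = false;
depthDesc.DepthWriteMask = DepthWriteMask.Zero;
_state = new DepthStencilState(d3DDevice, depthDesc);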

The data is then rendered on the GPU using this simple shader:



cbuffer QuadBuffer: register(b0)
{
	float4x4 projection;
}

Texture2D Texture;

SamplerState textureSampler;

struct VSQuadIn
{
	float4 position: POSITION;
	float2 texcoord: TEXCOORD;
};

struct VSQuadOut
{
    float4 position : SV_Position;
    float2 texcoord : TEXCOORD;
};
 
VSQuadOut VertexShaderFunction(VSQuadIn In)
{
	VSQuadOut Out;
	Out.texcoord = In.texcoord;
	Out.position = mul(In.position, projection);
	return Out;
}

float4 PixelShaderFunction(VSQuadOut input) : SV_TARGET
{
	float4 textureColor = Texture.Sample(textureSampler, input.texcoord);
	textureColor.a = 1;

	return textureColor;
}

technique fsQuad
{
	pass Pass1
	{
		VertexShader = compile vs_4_0 VertexShaderFunction();
		PixelShader = compile ps_4_0 PixelShaderFunction();
	}
};

-------

But when I run the game, the quads don't render, and when I take a look with the graphics debugger, nothing shows up in the Input Assembler and Vertex Shader stages at all.

Actually, when I look at them, the position and texture coordinate values in the vertex shader stage are completely bogus: either 0, NaN, or impossibly small, with exponents down to E-47. And some vertices don't show anything in the Input Buffers window of the graphics debugger at all.

D'oh, finally realized I completely derped and forgot to reset the position on the vertex and index streams before writing to the buffers on quad creation ^^
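
In other words, the fix is just to rewind both DataStreams before handing them to the Buffer constructors in the Quad constructor:

// Writing advanced each stream's Position to the end of the data, and the Buffer
// constructor reads from the current position, hence the garbage vertex values.
// Rewinding both streams before creating the buffers fixes it:
vertexStream.Position = 0;
indexStream.Position = 0;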

What would be the best or easiest way to do this?

Create your source textures to accommodate the largest, most widely available screen resolution (1920x1080 is a common one) and scale the textures down. No artifacts!

