

Member Since 21 Dec 2005
Offline Last Active May 29 2012 04:21 AM

Topics I've Started

Instancing my terrain patches

25 May 2012 - 05:02 AM

Hello again! I'm currently rendering my terrain as a bunch of patches. There are different LOD levels of the patch, and each LOD level has its own index buffer. No vertex buffer is used; the planar position of each vertex is calculated from SV_VertexID, and the height is then sampled from a texture. Below is the shader for this:

// Constant buffers //

cbuffer Initial : register(b0) {
  int NumCells;
  float SliceSize;
};

cbuffer EveryFrame : register(b1) {
  matrix World;
  matrix View;
  matrix Projection;
};

// Input/Output structures //

struct AppToVertex {
  uint VertexID : SV_VERTEXID;
};

struct VertexToPixel {
  float4 Position : SV_POSITION;
};

// Shader resources //

Texture2D HeightMap : register(t0);

// Samplers //

SamplerState HeightMapSampler : register(s0) {
  AddressU = CLAMP;
  AddressV = CLAMP;
};

// Helper function prototypes //

float2 GetPosition(uint vertexID);
float GetHeight(float2 position);

// Vertex Shader //

VertexToPixel VS(in AppToVertex input) {
  VertexToPixel output = (VertexToPixel)0;

  // Calculate the location of this vertex
  float2 p = GetPosition(input.VertexID);

  // Calculate the height of this vertex
  float h = GetHeight(p);

  // Apply transform
  output.Position = float4(p.x, h, p.y, 1);
  output.Position = mul(output.Position, World);
  output.Position = mul(output.Position, View);
  output.Position = mul(output.Position, Projection);

  return output;
}

// Helper method for calculating the position of a specific vertex //

float2 GetPosition(uint vertexID) {
  int numVerts = NumCells + 1;

  float x = (float)(vertexID % numVerts) * (SliceSize / NumCells);
  float z = (float)(vertexID / numVerts) * (SliceSize / NumCells);

  return float2(x, z);
}

// Helper method for sampling the height map //

float GetHeight(float2 position) {
  // Calculate the texture coordinate (position runs from 0 to SliceSize)
  float u = position.x / SliceSize;
  float v = position.y / SliceSize;

  // Sample the height map (SampleLevel, since Sample needs gradients
  // that aren't available in a vertex shader)
  return HeightMap.SampleLevel(HeightMapSampler, float2(u, v), 0).r;
}

Now, I would like to instance patches that currently have the same LOD level and the same materials (if that's even necessary; better if it's not).
I can have a structured buffer that contains data about each patch (patch X location, patch Z location, materials, etc.) and use SV_InstanceID to fetch it. But my concern is the height map. I tried using a simple array, like so:
Texture2D HeightMap[8];
But you can't dynamically index an array using SV_InstanceID like you can with a structured buffer. So does anyone know what I can do instead? Putting the height map in the structured buffer would be cool, but you can't really do that, can you? Is there any other way, like a texture buffer (is this what tbuffer is for?) or something like that?
Or maybe I can use a separate structured buffer like this:
StructuredBuffer<Texture2D> HeightMaps;
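To make the idea concrete, here is a sketch of the per-instance fetch I have in mind, with a Texture2DArray standing in for the array of height maps (a slice index read from a structured buffer can be used there, unlike with a plain Texture2D array). PatchData and all its field names are made up:

```hlsl
// Sketch only: per-patch data fetched with SV_InstanceID, heights bundled
// into a Texture2DArray so each patch can pick its own slice dynamically.
struct PatchData {
  float2 Offset;      // patch X/Z location in the world
  uint   HeightSlice; // slice of HeightMaps holding this patch's heights
  uint   MaterialID;
};

StructuredBuffer<PatchData> Patches : register(t1);
Texture2DArray<float> HeightMaps : register(t0);
SamplerState HeightMapSampler : register(s0);

VertexToPixel VS(uint vertexID : SV_VertexID, uint instanceID : SV_InstanceID) {
  VertexToPixel output = (VertexToPixel)0;

  PatchData patch = Patches[instanceID];

  // Local position within the patch, computed as before
  float2 p = GetPosition(vertexID);

  // Height from this patch's own slice of the array
  float2 uv = p / SliceSize;
  float h = HeightMaps.SampleLevel(HeightMapSampler, float3(uv, patch.HeightSlice), 0);

  // World position = local position + per-patch offset
  float4 worldPos = float4(p.x + patch.Offset.x, h, p.y + patch.Offset.y, 1);
  output.Position = mul(mul(mul(worldPos, World), View), Projection);
  return output;
}
```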

Any ideas are welcome!

Thanks in advance!

[D3D11] HLSL, StructuredBuffer and System Values

03 May 2012 - 04:16 AM

Hi! I'm trying to create my own SpriteBatch class. I actually made one a while back, but the implementation was lousy. I've seen some code around here that used a structured buffer, something like this:
StructuredBuffer<Sprite> SpriteData;

struct Sprite {
  float2 Position;
  float2 Size;
  uint TextureIndex; // HLSL has no byte type
};

Now, I'm not looking for help implementing this; I'm just curious whether anybody knows any good guides, examples or tutorials on structured buffers. I get the structured-buffer part, having one Sprite struct instance per sprite. But what vertices would you send to the graphics card? I mean, if all the data you need is in the structured buffer, what would the vertices contain? They're obviously needed, else there would be nothing to process in the vertex shader. And how do you make use of the system-generated values, such as SV_InstanceID and SV_VertexID? In the samples I saw, SV_InstanceID (I think) was used to fetch the right Sprite instance from the structured buffer, but I can't really see how that works.
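My best guess at how those samples work is something like the following sketch: no vertex buffer is bound at all, the draw call is DrawInstanced(4, spriteCount, 0, 0), and the two system values do all the addressing. Everything here beyond the Sprite struct is my own invention:

```hlsl
// Sketch: SV_VertexID selects the corner of the quad (0..3),
// SV_InstanceID selects which Sprite to read from the structured buffer.
StructuredBuffer<Sprite> SpriteData : register(t0);

float4 VS(uint vertexID : SV_VertexID,
          uint instanceID : SV_InstanceID) : SV_Position {
  Sprite sprite = SpriteData[instanceID];

  // Expand 0..3 into the corners of a unit quad, assuming a
  // triangle-strip topology: (0,0) (1,0) (0,1) (1,1)
  float2 corner = float2(vertexID & 1, vertexID >> 1);

  // Place the corner; the transform to clip space is omitted here
  float2 pos = sprite.Position + corner * sprite.Size;
  return float4(pos, 0, 1);
}
```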

Again: I'm not asking anyone here to help implement a SpriteBatch class/shader, only whether there are any good guides or similar. Haven't found much searching around...

Thanks in advance!

DirectX 11 - Changing settings after initialization

22 April 2012 - 12:50 PM

Hello there!

I have a small question regarding changing device settings (such as multisampling, refresh rate, etc) after the device has been initialized.

This is what I know: changing fullscreen mode requires a single call:
swapChain.IsFullscreen = !swapChain.IsFullscreen;

Changing screen resolution is fairly simple too; it's covered on the SlimDX tutorial page. This is what I did:
// Sets the forms size
window.Form.ClientSize = new Size(width, height);

// Dispose the current render target and create a new one
renderTarget.Dispose();
swapChain.ResizeBuffers(1, 0, 0, Format.R8G8B8A8_UNorm, SwapChainFlags.AllowModeSwitch);
using (var resource = Resource.FromSwapChain<Texture2D>(swapChain, 0))
    renderTarget = new RenderTargetView(device, resource);

// Set the new target
device.ImmediateContext.OutputMerger.SetTargets(renderTarget);

Now to the problem. If I want to change any other settings (settings in the SwapChainDescription structure), such as multisampling or refresh rate, what do I need to do? Do I need to recreate the device and swap chain objects? If that is the case, what happens to resources created or loaded with the device, such as effects, textures, vertex and index buffers, etc.? Do these need to be recreated as well?

Thanks in advance!

Vertices gets distorted when camera is far away in the world

19 January 2012 - 11:23 AM

Hello there! I began working on a project a few weeks ago, where I wanted to implement "infinite" terrain, constantly streaming data from the hard drive. I have a free-view camera, and the code for updating the view matrix looks like this:

public void Update(Clock clock) {

   var rotation = Matrix.RotationYawPitchRoll(yaw, pitch, 0);
   right = Vector3.TransformCoordinate(Vector3.UnitX, rotation);
   up = Vector3.TransformCoordinate(Vector3.UnitY, rotation);
   forward = Vector3.TransformCoordinate(Vector3.UnitZ, rotation);

   view = Matrix.LookAtLH(position, position + forward, up);
}

This code works perfectly when the camera is near the origin (0, 0, 0). But as soon as I get further away, at positions such as (5000, 10, 5000), the rotation of the camera starts acting a bit odd. The vertices look distorted and sort of move around a bit when you rotate, and sometimes the camera takes big rotation jumps without reason. Hard to explain, and even harder to show with pictures. Hope you understand!

Is there something I have overlooked, or is this "supposed" to happen at these kinds of distances from the origin?

Not sure if it matters, but I'm working with DirectX 11 in SlimDX. Thanks in advance!


EDIT 2 :
Title is supposed to be "Vertices gets distorted when camera is far away in the world"

DirectX 11 Custom Model Format

02 November 2011 - 01:59 PM

The time has come where I need to import models into my game (or application). Since I'm using DirectX 11, there is no built-in support for that, and the .X file format is old, so I want to create my own model format. I don't think loading my own custom format would be very hard, but I can't find any help on how to export my models to my custom format. I'm using 3ds Max 2010, and I know you can either create a MAXScript or create a plugin with C++, but C++ isn't my main language; C# is. So that would be pretty hard. Does anybody know any good tutorials or guides on how to create your own plugin for 3ds Max? Haven't found anything myself.

Another solution, which would be much better for me, would be creating a program that reads the native 3ds Max format (.max) and then exports it. I could use C# for that. But then the question remains: how do I read the native format? Is there any documentation on how the file format is laid out? Any guides?

Thanks in advance!