So, I decided to check out the new VS Express 2013 for Windows, which required me to upgrade to Windows 8.1. I installed VS, created a new app from the DirectX template, and debugged it to see the spinning cube.
Now it's the next day, and my question is: how do I start Visual Studio? It's not on my Start screen. Searching for VS Express just brings up web pages. Do I have to hunt for the .exe in Windows Explorer? I've been a hobbyist Windows programmer for 25 years, but... I no longer know how to use my computer...
I haven't had this problem since I was 13 years old using a Commodore 64. I am very sorry.
Ok, I've got this issue solved. The river sizes for each edge are part of the input to the vertex shader. In the hull shader's patch constant function I determine which texture to choose and whether it should be flipped, and then in the domain shader I interpolate the coordinates for sampling the (u, 0.0f) row of the texture, which represents the river crossing.
Another issue, of course, is getting the right mip level at the patch corners. I cheated on this and simply used a value of zero, which is correct for this particular scheme only.
I think I have the issue nailed down. I just have to fix it in a way that is not overly complicated in the shader and not too memory hogging overall. I want to keep my instance data small so I can build huge, huge maps.
Here is the issue with displacement mapping when your patch is designed to take a whole texture. Generally when we design tiling textures, we design each edge to flow seamlessly into the opposite edge; for example, the left edge is made to continue into the right edge. Surface patches, however, lie on top of each other at the edge, NOT adjacent to one another! This problem would exist even with square patches in my case.
No matter what kind of texture you use, even a high precision one, it is vital that shared patch edges sample the same exact data. Close is not good enough in this case, as the hardware seems to like to accentuate these cracks and even from far away they look like blinking Christmas tree lights.
The fact that I'm using square textures on a triangle patch complicates things, but there is one thing in my favor. All of the complex river interactions happen in the interior of the textures. At the patch edges, the only thing that ever happens is that one of three river sizes (or none) will cross there, possibly flipped. So I could sample one of four 1D textures there, or even use certain of my simpler textures at uv = (u, 0). I just have to devise a scheme for the shader to know which river size is needed and whether its coordinate should be inverted.
EDIT: I should add that these issues wouldn't exist if your texture border weren't co-mingling with your patch border. I could layer a rocky texture over the whole terrain without cracking. This only comes up if you are trying to build large textures out of pieces made to fit together at patch borders.
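The scheme I'm leaning toward is to pack each edge's river size and flip flag into a few bits of per-instance data, which keeps the instances small. Here is a CPU-side sketch of the packing; the 3-bits-per-edge layout is just an illustration, not final:

```csharp
using System;

public static class EdgeRiverCode
{
    // Per triangle edge: 2 bits of river size (0 = no river, 1-3 = sizes)
    // plus 1 flip bit, so all three edges fit in 9 bits of instance data.
    public static int Pack(int[] sizes, bool[] flips)
    {
        int code = 0;
        for (int e = 0; e < 3; ++e)
            code |= ((sizes[e] & 0x3) | (flips[e] ? 0x4 : 0)) << (3 * e);
        return code;
    }

    // The shader would do the same shifts and masks to decode; mirrored here in C#.
    public static int SizeOf(int code, int edge) { return (code >> (3 * edge)) & 0x3; }
    public static bool FlipOf(int code, int edge) { return ((code >> (3 * edge)) & 0x4) != 0; }
}
```

The hull shader patch constant function can decode the same bits to pick the 1D edge texture and flip the u coordinate.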
Ok, so I figured it out. Not only did I have to transpose the cotangent frame, but I also had two texture arrays, one for the height map and one for the normal map, and had failed to specify which register each should go in. I decided to use my normal texture as a color texture to see what it would look like, and was shocked to see my height map instead! The conditional statement in the pixel shader worked, sort of, because it somehow forced the compiler to use the right texture. Setting the t registers to their appropriate slots fixed most of my problems, since I had been sampling the wrong texture.
The only problem left is that the conditional statement, which I suppose I don't really need, still looks bad on my Nvidia GPU.
If anybody knows what automatic process is mucking this up, I would love to hear about it. In any case, it appears that my normal mapping code is correct now.
Ok, here is a screenshot of some of the new developments. The most difficult part of this by FAR was drawing text. I did everything I could think of to avoid DX 11.1, but everything else had some reason why I couldn't use it. Anyway, it's working now. A cool thing I added is the ability to pick a hex with the mouse hover (see the little green hex cursor). That made debugging the river system a lot easier: if I saw a problem, I could tell which hex it was within a sea of hexes, and also print hex-specific debug info on the selected hex. It works very well, too; in an oblique view I can select a hex that is miles and miles (or whatever distance units you prefer) away.
The river system supports 3 sizes of river and currently just runs from the watersheds to the tiles that have no outlet. Right now, I'm just painting the rivers blue to make sure it's working, but later I will use these textures to make displacement and normal maps, and also possibly flow maps so that I can make the water surface flow down the river.
So the reason I needed to figure out this Voronoi scheme is that, now that I have rivers mostly working, I need to throw in the ocean and lakes. For now I just need to paint them blue like the rivers to make sure it all works as a system; then I will start making it fancy.
Ok, I found the bug related to my line drawing, and it is related to this topic. My new constant buffer had the projectionView matrix and a translation vector. I had forgotten to set the constant buffer in my line drawing component, so even though I set new shaders for the new component, the old constant buffer was still bound and in use.
I was very confused at first, because I assumed my new constant buffer was being updated: the projectionView matrix was rotating things just fine, but my translation vector was not working. It turns out I was using the projectionView matrix from my other constant buffer, which was still set! Since the translation vector did not exist in the old buffer, it simply read as zero.
So it turns out that you can set a new shader, but everything else attached to the pipeline stays attached and is tracked separately. If you think about it, it needs to be that way.
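You can picture it with a toy model (this is just the concept, not real SharpDX): each shader stage tracks its bound shader and its constant-buffer slots separately, so swapping the shader leaves the slots untouched:

```csharp
using System;
using System.Collections.Generic;

// Toy model of one pipeline stage: the bound shader and the stage's
// constant-buffer slots are tracked separately, so setting a new shader
// does NOT clear the slots.
public class ToyShaderStage
{
    public string Shader;
    public readonly Dictionary<int, string> ConstantBuffers = new Dictionary<int, string>();

    public void SetShader(string shader) { Shader = shader; }
    public void SetConstantBuffer(int slot, string buffer) { ConstantBuffers[slot] = buffer; }
}
```

Bind a buffer, switch shaders, and slot 0 still holds the old buffer: exactly the "rotation works, translation is zero" symptom I hit.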
Ok, after trying the ref device, the only error reported was related to a hull shader that shouldn't have been there. Getting rid of the unneeded shaders fixed the crash. Naturally, the line drawing I was trying to do was way off the mark, but it's the first time I looked at it. Nothing is ever right the first run.
What I take from this is that, when you are drawing objects in a component-like fashion where other things are also drawn, you should set all unused shader stages to null in case some other component has set them.
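In SharpDX terms, that looks something like the following, run at the start of each component's draw. This is a sketch, not a drop-in; the method and parameter names are mine, and you'd pass whatever stages your component actually uses:

```csharp
using SharpDX.Direct3D11;

public static class PipelineReset
{
    // Bind this component's shaders and explicitly null the stages it
    // doesn't use, so state set by other components can't leak in.
    public static void Apply(DeviceContext context, VertexShader vs, PixelShader ps)
    {
        context.VertexShader.Set(vs);
        context.PixelShader.Set(ps);
        context.HullShader.Set(null);      // no tessellation for this pass
        context.DomainShader.Set(null);
        context.GeometryShader.Set(null);
    }
}
```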
Oh, I'm so silly. I forgot to include the SharpDX Toolkit DLLs in my project, and when I did, it gave me Texture2D.Load and also a (non-static) Texture2D.Save, in any format including DDS! The SharpDX Toolkit actually uses the DirectXTex and DirectXTK libraries for such things. Well, that was easy.
Ah yes, I'm getting ready to test out the river system. I have a series of textures for normal/displacement that I made programmatically for each kind of river junction such as:
There are three river sizes that can run along the hex vertices according to certain overly complicated rules. This time around I am also trying to support rivers going into and out of lakes, which will be interesting. There won't just be one ocean level for all the water; there will be higher-altitude bodies of water, particle-effect waterfalls, etc.
That's all theory for now! First I have to load my textures to see if any of this is feasible!
Back in the DX9 days, before texture arrays, loading a texture was as simple as Texture2D.FromFile(). In DX11, things have become vastly more complicated. I've been googling this issue and finding contradictory information indicating that certain convenient helper functions have gone legacy. I looked at the way the Rastertek multitexturing tutorial does it, and I am totally confused by it. It seems to load bitmap files directly into shader resource views and makes an array of shader resource views, and that's the texture array... In any case, I can't find the same functions in the latest version of SharpDX. I've seen some examples of texture loading with WIC, but nothing for texture arrays.
My question is, what would be the proper way as of today, 5/28/2013, to load a series of png files into a single texture array with automatically generated mipmaps? Is there a simple way to do that or are we talking about a thousand lines of code before we can pass this to a shader?
Here is my solution:
using System.Collections.Generic;
using SharpDX;
using SharpDX.Direct3D11;
using SharpDX.WIC;

public static class TextureUtilities
{
    public static ShaderResourceView LoadTextureArray(string[] names)
    {
        var factory = new ImagingFactory();
        var bitmaps = new List<BitmapSource>();
        foreach (string name in names)
            bitmaps.Add(LoadBitmap(factory, name));

        int width = bitmaps[0].Size.Width;
        int height = bitmaps[0].Size.Height;
        int mipLevels = CountMips(width, height);

        // RenderTarget binding plus the GenerateMipMaps flag are required
        // so the context can auto-generate the mip chain after we upload level 0.
        var texArray = new Texture2D(Game.GraphicsDevice, new Texture2DDescription()
        {
            Width = width,
            Height = height,
            ArraySize = names.Length,
            BindFlags = BindFlags.ShaderResource | BindFlags.RenderTarget,
            Usage = ResourceUsage.Default,
            CpuAccessFlags = CpuAccessFlags.None,
            Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
            MipLevels = mipLevels,
            OptionFlags = ResourceOptionFlags.GenerateMipMaps,
            SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
        });

        int stride = width * 4;
        for (int i = 0; i < bitmaps.Count; ++i)
        {
            using (var buffer = new DataStream(height * stride, true, true))
            {
                // Decode the WIC pixels into the top mip of array slice i.
                bitmaps[i].CopyPixels(stride, buffer);
                var box = new DataBox(buffer.DataPointer, stride, height * stride);
                Game.GraphicsDevice.ImmediateContext.UpdateSubresource(box, texArray,
                    Resource.CalculateSubResourceIndex(0, i, mipLevels));
            }
        }

        // Return the view, since that's what gets bound to the shader stage
        // and it's needed to kick off mipmap generation.
        var view = new ShaderResourceView(Game.GraphicsDevice, texArray);
        Game.GraphicsDevice.ImmediateContext.GenerateMips(view);
        return view;
    }

    public static BitmapSource LoadBitmap(ImagingFactory factory, string filename)
    {
        var bitmapDecoder = new BitmapDecoder(factory, filename, DecodeOptions.CacheOnDemand);
        var result = new FormatConverter(factory);
        result.Initialize(bitmapDecoder.GetFrame(0),
            SharpDX.WIC.PixelFormat.Format32bppPRGBA,
            BitmapDitherType.None, null, 0.0, BitmapPaletteType.Custom);
        return result;
    }

    private static int CountMips(int width, int height)
    {
        // lifted from SharpDX Toolkit
        int mipLevels = 1;
        while (height > 1 || width > 1)
        {
            if (height > 1)
                height >>= 1;
            if (width > 1)
                width >>= 1;
            ++mipLevels;
        }
        return mipLevels;
    }
}
You can use Perlin noise for literally EVERYTHING. There is no natural phenomenon you can simulate that won't benefit from this kind of noise input. Terrain maps, mold growth, pimple production, creature mannerisms, river networks, galaxy formation, you name it. Perlin noise simulates nature on many, many levels.
EDIT: I forgot to mention social aggression in bonobos, and many other things.
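If anyone wants to play along at home, here is a minimal 2D sketch in the classic Ken Perlin style: hashed lattice gradients, quintic fade, bilinear blend. The fixed seed is my choice so runs are repeatable; this is an illustration, not my terrain code:

```csharp
using System;

public static class Perlin
{
    static readonly int[] perm = new int[512];

    static Perlin()
    {
        // Shuffle 0..255 with a fixed seed so the noise is repeatable,
        // then duplicate the table to avoid index wrapping.
        var p = new int[256];
        for (int i = 0; i < 256; ++i) p[i] = i;
        var rng = new Random(1234);
        for (int i = 255; i > 0; --i)
        {
            int j = rng.Next(i + 1);
            int t = p[i]; p[i] = p[j]; p[j] = t;
        }
        for (int i = 0; i < 512; ++i) perm[i] = p[i & 255];
    }

    // Quintic fade curve: zero first and second derivatives at 0 and 1.
    static double Fade(double t) { return t * t * t * (t * (t * 6 - 15) + 10); }
    static double Lerp(double a, double b, double t) { return a + t * (b - a); }

    static double Grad(int hash, double x, double y)
    {
        // Pick one of four diagonal gradient directions from the hash.
        switch (hash & 3)
        {
            case 0: return x + y;
            case 1: return -x + y;
            case 2: return x - y;
            default: return -x - y;
        }
    }

    public static double Noise(double x, double y)
    {
        int xi = (int)Math.Floor(x) & 255;
        int yi = (int)Math.Floor(y) & 255;
        double xf = x - Math.Floor(x);
        double yf = y - Math.Floor(y);
        double u = Fade(xf), v = Fade(yf);

        // Hash the four cell corners and blend their gradient contributions.
        int aa = perm[perm[xi] + yi];
        int ab = perm[perm[xi] + yi + 1];
        int ba = perm[perm[xi + 1] + yi];
        int bb = perm[perm[xi + 1] + yi + 1];

        return Lerp(Lerp(Grad(aa, xf, yf), Grad(ba, xf - 1, yf), u),
                    Lerp(Grad(ab, xf, yf - 1), Grad(bb, xf - 1, yf - 1), u), v);
    }
}
```

Sum a few octaves at doubled frequency and halved amplitude and you get the fractal look that makes terrain (and everything else) convincing.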
Nice job sticking to it - so where is the obligatory screen shot showing us the low and high res triangles all at once???
Alrighty then. I'm actually still futzing with the LOD settings, but I do have an interesting discovery. Here is the new code. I tried culling the back-facing triangle patches, but in this case the surface can protrude too much and the gaps are visible. I also tried greatly reducing the tessellation in that case, but the popping was very noticeable.
Here is the interesting part. I originally chose 'fractional_even' arbitrarily. Switching to 'fractional_odd' can completely change the look of the scene. First is the view from central Taffy Mountain looking west with fractional even partitioning:
Now the view from the same vantage with fractional odd partitioning:
At least in this particular case, I would say that fractional odd has far fewer undesirable artifacts! It's an interesting consequence.