
# Waaayoff

Member Since 11 Sep 2009
Offline Last Active Apr 24 2016 11:35 AM

### #5103839 The death of D3D9

Posted by on 23 October 2013 - 01:07 PM

Because Windows XP still runs on 39% of PCs.

Edit: Also I think some find it easier not to have to deal with shaders at first. They do tend to complicate things a bit.

### #5102576 Directx11 animations

Posted by on 19 October 2013 - 01:45 AM

I wrote a simple explanation with code here a while back. Take a look at it and feel free to PM me with any questions.

### #5097911 Gerstner wave function [HLSL]

Posted by on 30 September 2013 - 01:31 PM

Thanks for the help!

The code helped a lot, although it doesn't work with a plane for some reason. Still, I get the idea and can now write the rest myself.

I appreciate the help!

The code works by displacing vertices, and a plane only has four. You need to use a grid.
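For illustration, a minimal C++ sketch of building such a grid (the vertex type and function name are made up for this example, not from the original code):

```cpp
#include <vector>

// Minimal vertex type for illustration only.
struct Vertex { float x, y, z; };

// Build a flat (n+1) x (n+1) grid of vertices on the XZ plane,
// spanning `size` units and centred at the origin. The wave shader
// then displaces these vertices per frame on the GPU.
std::vector<Vertex> BuildGrid(int n, float size)
{
    std::vector<Vertex> verts;
    verts.reserve((n + 1) * (n + 1));
    float step = size / n;
    float half = size / 2.0f;
    for (int z = 0; z <= n; ++z)
        for (int x = 0; x <= n; ++x)
            verts.push_back({ x * step - half, 0.0f, z * step - half });
    return verts;
}
```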

Anytime.

### #5097753 Gerstner wave function [HLSL]

Posted by on 30 September 2013 - 01:01 AM

Here's some code I wrote a while back. It's only test code for a single wave; you will want to combine a number of waves at different frequencies to obtain a more realistic look.

```hlsl
cbuffer World
{
    float3 SunDirection;
};

cbuffer Instance
{
    float4x4 WorldViewProj;
};

cbuffer PerFrame
{
    float DeltaT;
};

Texture2D DiffuseTexture;
SamplerState Sampler;

struct VS_IN
{
    float4 position : POSITION;
    float2 uv       : TEXCOORD0;
};

struct VS_OUT
{
    float4 position : SV_POSITION;
    float2 uv       : TEXCOORD0;
};

// Vertex shader (entry point name assumed; the original post omitted it)
VS_OUT VSMain(VS_IN input)
{
    VS_OUT output = (VS_OUT)0;

    // Gerstner wave
    float A = 5;              // amplitude
    float L = 50;             // wavelength
    float w = 2 * 3.1416 / L; // angular frequency
    float Q = 0.5;            // steepness

    float3 P0 = input.position.xyz;
    float2 D = float2(0, 1);  // wave direction
    float dotD = dot(P0.xz, D);
    float C = cos(w * dotD + DeltaT / 100);
    float S = sin(w * dotD + DeltaT / 100);

    float3 P = float3(P0.x + Q * A * C * D.x, A * S, P0.z + Q * A * C * D.y);

    output.position = mul(WorldViewProj, float4(P, 1));
    output.uv = input.uv;

    return output;
}

// Pixel shader (entry point name assumed)
float4 PSMain(VS_OUT input) : SV_Target
{
    return float4(DiffuseTexture.Sample(Sampler, input.uv).rgb, 1);
}
```
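The single-wave displacement above generalises by summing the contribution of each wave. A CPU-side C++ sketch of the same formula, useful for testing parameters before porting to the shader (the `Wave` struct and parameter values are made up for illustration):

```cpp
#include <cmath>
#include <vector>

// One wave's parameters, mirroring the constants in the shader above.
struct Wave {
    float amplitude;   // A
    float wavelength;  // L
    float steepness;   // Q
    float dirX, dirZ;  // normalised direction D
};

// Sum the Gerstner displacement of several waves at grid point (x, z)
// at time t. Same formula as the shader, applied once per wave.
void GerstnerSum(const std::vector<Wave>& waves, float x, float z, float t,
                 float& outX, float& outY, float& outZ)
{
    outX = x; outY = 0.0f; outZ = z;
    for (const Wave& wv : waves) {
        float w = 2.0f * 3.14159265f / wv.wavelength; // angular frequency
        float d = x * wv.dirX + z * wv.dirZ;          // dot(P.xz, D)
        float c = std::cos(w * d + t);
        float s = std::sin(w * d + t);
        outX += wv.steepness * wv.amplitude * c * wv.dirX;
        outY += wv.amplitude * s;
        outZ += wv.steepness * wv.amplitude * c * wv.dirZ;
    }
}
```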

And some literature:

http://http.developer.nvidia.com/GPUGems/gpugems_ch01.html

http://hyperphysics.phy-astr.gsu.edu/hbase/waves/watwav2.html

http://www.gamedev.net/blog/715/entry-2249487-ocean-rendering/

http://graphics.ucsd.edu/courses/rendering/2005/jdewall/tessendorf.pdf

### #5089195 Managers, which pattern should I use

Posted by on 26 August 2013 - 09:34 AM

...

```cpp
Globals::getInstance()->textureManager()->getTexture("baby.png");
Globals::getInstance()->inputManagers[PlayerId]->isButtonDown('A');
```

Not the best practice, but accessing things easily from everywhere is nice too. Also, I can't keep all my other managers singleton-free.

That is just ugly. It reminds me of the OGRE engine and how much I hated its managers. As said before, there is nothing wrong with passing pointers or localizing functionality.
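The "passing pointers" alternative can look like this; a minimal sketch where `TextureManager` and `Sprite` are placeholder names, not real engine code:

```cpp
#include <string>
#include <unordered_map>

// Hypothetical manager; in the singleton version this would be reached
// through Globals::getInstance()->textureManager().
class TextureManager {
public:
    int GetTexture(const std::string& name) {
        // Dummy behaviour: hand out one id per unique name.
        auto it = cache_.find(name);
        if (it != cache_.end()) return it->second;
        int id = static_cast<int>(cache_.size());
        cache_[name] = id;
        return id;
    }
private:
    std::unordered_map<std::string, int> cache_;
};

// Instead of reaching into a global, the dependency is handed
// to whoever needs it, which makes the dependency explicit and testable.
class Sprite {
public:
    explicit Sprite(TextureManager& textures) : textures_(textures) {}
    int Load(const std::string& file) { return textures_.GetTexture(file); }
private:
    TextureManager& textures_;
};
```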

### #5088370 Is SSAO view dependent?

Posted by on 23 August 2013 - 07:27 AM

OK, so I took another look at the code that uses position directly instead of reconstruction and found a bug. Fixed it and got the following result:

Which looks correct, right? If so, it's the damn reconstruction code again. It just won't work!!

### #5078741 Per-Pixel Point Light

Posted by on 18 July 2013 - 11:52 AM

Output the normals from the pixel shader as colour. Do they look correct?

### #5075694 Precomputed Atmospheric Scattering - Irradiance table?

Posted by on 06 July 2013 - 05:43 AM

I'm trying to implement Bruneton's algorithm, but I'm not sure the irradiance results are correct. The output texture is black but the values aren't zero. If I multiply them by 100, I get the following:

Which closely resembles the image found in this implementation by Stefan Sperlhofer. So my question is: is the table supposed to have such low values? Because the Transmittance and Inscatter tables don't.

Also, in the algorithm the single irradiance is multiplied by zero and discarded... Why calculate it in the first place?

### #5074106 a = b = new c?

Posted by on 30 June 2013 - 04:09 AM

Also, I didn't realize it was that unclear to read...

Then why did you create this thread? ;-)

You weren't sure how it works, that's IMHO a good proof that it isn't clear to read (at least for you and as it's your code...).

Yes, but that's like saying I shouldn't use smart pointers because they're not readable for me since I don't know how they work. I get what you guys are saying about readability, but I just thought this was a simple assignment operation that was common knowledge for non-beginners (unlike me).
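For reference, the chained form in the thread title evaluates right to left; a minimal sketch (the `Widget` type is hypothetical, standing in for "c"):

```cpp
// Hypothetical type standing in for "c" from the thread title.
struct Widget { int value = 7; };

// `a = b = new Widget` is parsed as `a = (b = new Widget)`:
// assignment is right-associative, so b receives the new pointer first,
// then a is assigned the result of that expression (the value of b).
// Both end up pointing at the same single object.
Widget* ChainedAssign(Widget*& a, Widget*& b)
{
    a = b = new Widget;
    return a;
}
```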

### #5073525 Terrain lighting seams?

Posted by on 28 June 2013 - 02:19 AM

I was thinking maybe it's because my normals are being interpolated when sampling the chunk normal map in the pixel shader?
Sounds reasonable. If you're using REPEAT texture access you're set up for surprises. The real point is, you don't assemble chunks to make a big terrain; we slice big terrains into chunks instead. All boundary samples must use inter-chunk fetches. Seamless chunks hide the problem at an extreme authoring cost and limitation; don't do that.

I already use neighbour chunk vertices for border cases, if that's what you mean.

Is this seam there if you just go by normals, no normal maps?

What do your texture coordinates look like?

(Your normal map isn't in a texture atlas, is it?)

Yes, I just tested and the normals look fine when passed with the vertex. I guess it is a problem with the texture sampling... Any idea how to fix it? Or rather, why not just pass the normals instead of using normal maps? Does bump mapping even work when the texture resolution is the same as the vertex resolution?

And no, I'm not using an atlas; a separate texture per chunk.

If you are using bezier patches, make sure you are joining your patches with at least C1 continuity.

I don't even know what that means.

### #4991567 Effect/Shader system design :: variable/cbuffer system

Posted by on 18 October 2012 - 03:52 PM

Hello,

After much thought I approached the problem like this (note that this is largely based on the Hieroglyph 3 engine):

I have a ShaderParameter abstract class that every variable type (vector, matrix, texture, etc.) inherits from. The derived classes (one for each type) contain the data. You can call it to get/set the data and get the type. This class also has an Update ID (more on this in a second).

Next I have a class that the user interacts with to set shader parameters. The variables are created when a new shader is compiled and stored in an std::map (or a hash map if you want).

Now I divide my parameters into three 'types', just like the D3D API: constant buffers, textures and sampler states. A shader class has three vectors that contain wrappers around the D3D parameters. The constant buffer wrapper has a pointer to the actual buffer and a list of variable descriptions that belong in that buffer. The description struct looks like this:

```cpp
struct VariableDesc
{
    ShaderParameter* pParameter; // pointer to the variable
    unsigned int UpdateID;
    unsigned int Offset;         // offset into the buffer
};
```

Now the UpdateID variable is used to check whether the buffer needs to be updated. Whenever the user sets a new value for a variable, the Update ID that I talked about before is incremented. When you bind a constant buffer, you loop through its variables, and if one of the VariableDescs' UpdateIDs doesn't match the corresponding ShaderParameter's, you re-map the buffer's contents. This is a simple optimization.
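A stripped-down sketch of that dirty check (class and member names are simplified from the description above, not the real engine code):

```cpp
#include <vector>

// Parameter value with a counter bumped on every Set().
struct ShaderParameter {
    float value = 0.0f;
    unsigned int updateId = 0;
    void Set(float v) { value = v; ++updateId; }
};

// Per-buffer record: which parameter, the id at the last upload, its offset.
struct VariableDesc {
    ShaderParameter* pParameter;
    unsigned int updateId; // id at the time of the last upload
    unsigned int offset;   // offset into the buffer (unused in this sketch)
};

// Returns true if any variable changed since the last upload,
// i.e. the constant buffer must be re-mapped.
bool BufferNeedsUpdate(std::vector<VariableDesc>& vars)
{
    bool dirty = false;
    for (VariableDesc& v : vars) {
        if (v.updateId != v.pParameter->updateId) {
            v.updateId = v.pParameter->updateId; // mark as uploaded
            dirty = true;
        }
    }
    return dirty;
}
```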

Next up you have a RenderEffect. This is a simple POD struct that contains shaders and render states. It is passed to the Pipeline class, which is responsible for binding everything. The RenderEffect doesn't really need to know about the details, so it has no functions.

Finally, you have a ShaderStage. This one is an intermediary between the Shader class and the Pipeline class; it exists to take care of redundant API calls. Since the Shader class has a list of wrappers in contiguous memory and not the actual D3D pointers, the ShaderStage collects the D3D data and binds all shader resources in four calls: one for the shader, one for the cbuffers, one for the textures and one for the samplers.

I can't post code now but if you want i will when i get home. (Be warned though, there's a LOT of it since we're talking about the core of a rendering engine)

Posted by on 24 September 2012 - 02:35 PM

I see that you are not doing any error checking. I realize you're a beginner, but it can help a lot. Most DirectX functions return an error value if they fail; you should check for that.

I believe you forgot the texture file extension in this line:

```cpp
D3DXCreateTextureFromFile(d3ddev, L"ParticleSmokeCloud64x64", &d3dtex);
```

### #4980094 D3D11: CORRUPTION: ID3D11DeviceContext::ClearRenderTargetView: First paramete...

Posted by on 14 September 2012 - 09:52 AM

Insert a breakpoint at the ClearRenderTargetView function line and run it. Does g_pRenderTargetView point to NULL?

If so, the render target wasn't created properly. Also make sure you're setting the render target beforehand.

### #4974190 Terrain normals messed up, don't know why?

Posted by on 28 August 2012 - 11:53 AM

Oh. My. God. Kill. Me. Now.

You were right. I forgot about some debugging code I left in my renderer where I bind the shader parameters only once. Changed that and everything worked. I feel so stupid right now.

Anyway, thank you!!!!!!!!

### #4953944 Chunked LOD vertex data?

Posted by on 29 June 2012 - 06:56 AM

I don't know if it's relevant, but I'm implementing the Rendering Very Large, Very Detailed Terrains version.
