
MJP

Member Since 29 Mar 2007

#5056466 Ugly lines on screen.

Posted by MJP on 24 April 2013 - 03:27 PM

Make sure that the size of your back buffer/swap chain is the same as the size of your window's client area.
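Roughly something like this (just a sketch assuming D3D11/DXGI; "hwnd" and "device" are placeholders, not from the original post). For D3D9 the same idea applies to BackBufferWidth/BackBufferHeight in D3DPRESENT_PARAMETERS.

    RECT clientRect;
    GetClientRect(hwnd, &clientRect);
    UINT width  = clientRect.right - clientRect.left;
    UINT height = clientRect.bottom - clientRect.top;

    DXGI_SWAP_CHAIN_DESC scDesc = {};
    scDesc.BufferDesc.Width  = width;    // match the client area, not the full window size
    scDesc.BufferDesc.Height = height;
    scDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    scDesc.SampleDesc.Count  = 1;
    scDesc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    scDesc.BufferCount       = 2;
    scDesc.OutputWindow      = hwnd;
    scDesc.Windowed          = TRUE;
    scDesc.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;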




#5056284 where can i find shader assemble docs

Posted by MJP on 24 April 2013 - 12:41 AM

Vertex shader instructions

 

Pixel shader instructions

 

'dcl_color' could be dcl_usage input, dcl_usage output, or dcl_semantics, depending on the context and whether it's used in a vertex shader or a pixel shader.




#5054737 Normal Mapping Upside Down

Posted by MJP on 18 April 2013 - 06:06 PM

Tangent/bitangent vectors need to point in the direction that your U and V texture coordinates are increasing, so it depends on how you set up the texture coordinates of your quad vertices.
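For reference, here's a rough sketch (not from the original post; p0..p2 and uv0..uv2 are assumed triangle positions and UVs) showing how the tangent basis depends on the direction the UVs increase:

    struct V3 { float x, y, z; };
    struct V2 { float x, y; };

    V3 edge1 = { p1.x - p0.x, p1.y - p0.y, p1.z - p0.z };
    V3 edge2 = { p2.x - p0.x, p2.y - p0.y, p2.z - p0.z };
    V2 dUV1  = { uv1.x - uv0.x, uv1.y - uv0.y };
    V2 dUV2  = { uv2.x - uv0.x, uv2.y - uv0.y };

    float r = 1.0f / (dUV1.x * dUV2.y - dUV2.x * dUV1.y);

    // Tangent points in the direction of increasing U, bitangent in increasing V
    V3 tangent   = { (edge1.x * dUV2.y - edge2.x * dUV1.y) * r,
                     (edge1.y * dUV2.y - edge2.y * dUV1.y) * r,
                     (edge1.z * dUV2.y - edge2.z * dUV1.y) * r };
    V3 bitangent = { (edge2.x * dUV1.x - edge1.x * dUV2.x) * r,
                     (edge2.y * dUV1.x - edge1.y * dUV2.x) * r,
                     (edge2.z * dUV1.x - edge1.z * dUV2.x) * r };

    // Flipping the quad's V coordinates flips the sign of the bitangent,
    // which is what makes a normal map appear upside down.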




#5053678 DirectX ToolKit

Posted by MJP on 15 April 2013 - 08:30 PM

There are a few samples on the CodePlex page, have you had a look at those?




#5053565 Is real time rendering book, third edition, still good?

Posted by MJP on 15 April 2013 - 02:39 PM

Just buy it, you won't regret it.




#5053368 Light halos, lens flares, volumetric stuff

Posted by MJP on 15 April 2013 - 01:48 AM

Lens flares and lenticular halos don't really have anything to do with volumetric lighting; they're phenomena that result from light refracting and reflecting inside of a lens enclosure. Producing a physically-plausible result in a game would require some attempt at simulating the path of light through the various lenses, for instance by using the ray-tracing approach taken by this paper. Just about all games crudely simulate these effects using screen-space blur kernels combined with sprites controlled by occlusion queries.
 

Volumetrics is mostly concerned with the scattering and absorption of light as it travels through participating media. Most games don't come anywhere close to simulating this, since it's complex and expensive. I don't think you will get very far with pre-computing anything, since you typically want to simulate the fog so that it moves about the level. You also usually want to attenuate the density with noise, to produce more realistic-looking cloud shapes.




#5053041 Point Sprite Vertex Size

Posted by MJP on 14 April 2013 - 12:22 AM

What does a value being in the emitter properties have to do with using shaders?

I'm going to be blunt here: fixed function is a waste of time. All available hardware supports shaders, and uses shaders under the hood to implement the fixed-function feature set from DX9 and earlier. There's absolutely no reason to learn it, and there's no reason to use it in any new project. Anything you can do in fixed-function can be done in shaders, and probably more efficiently since you can tailor it to your exact needs.

FVF codes are outdated cruft from the pre-DX9 era. If you're set on using DX9 then you should at least use vertex declarations. They completely replace FVF codes, and offer functionality that can't be used with FVF codes (for instance, the aforementioned PSIZE). In your case you would add an additional float to your struct for storing the point size, and then you would specify a D3DVERTEXELEMENT9 with D3DDECLUSAGE_PSIZE.
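Something along these lines (the exact struct layout and offsets are just an assumption for illustration):

    struct PointSpriteVertex
    {
        float x, y, z;   // position
        float size;      // per-vertex point size (PSIZE)
        DWORD color;     // diffuse color
    };

    D3DVERTEXELEMENT9 elements[] =
    {
        { 0,  0, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
        { 0, 12, D3DDECLTYPE_FLOAT1,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_PSIZE,    0 },
        { 0, 16, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
        D3DDECL_END()
    };

    IDirect3DVertexDeclaration9* decl = NULL;
    device->CreateVertexDeclaration(elements, &decl);
    device->SetVertexDeclaration(decl);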




#5053040 Generic Shader for DirectX 11?

Posted by MJP on 14 April 2013 - 12:11 AM

You can't really have a single generic vertex shader, since your vertex buffer(s) need to provide every element expected by the vertex shader. So if the vertex shader expects positions, normals, and texture coordinates, you need to provide all of those, otherwise your call to create the input layout will fail. So you either need to use different shaders, or you need to provide "dummy" vertex data for any element that isn't actually present in your vertex data.

Anyway I would highly recommend that you check out BasicEffect from DirectXTK. It should do exactly what you want, and the library also provides a few other effects if you want to go a little more advanced. All of the source code is available as well, so you can see how they do it.
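Basic usage looks roughly like this (just a sketch; check the DirectXTK documentation for the details):

    std::unique_ptr<DirectX::BasicEffect> effect(new DirectX::BasicEffect(device));
    effect->SetWorld(world);
    effect->SetView(view);
    effect->SetProjection(projection);
    effect->SetTextureEnabled(true);
    effect->SetTexture(textureSRV);

    // Create an input layout that matches the effect's vertex shader
    void const* shaderByteCode;
    size_t byteCodeLength;
    effect->GetVertexShaderBytecode(&shaderByteCode, &byteCodeLength);
    device->CreateInputLayout(DirectX::VertexPositionNormalTexture::InputElements,
                              DirectX::VertexPositionNormalTexture::InputElementCount,
                              shaderByteCode, byteCodeLength, &inputLayout);

    // At draw time
    effect->Apply(context);
    context->IASetInputLayout(inputLayout);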




#5052695 Point Sprite Vertex Size

Posted by MJP on 13 April 2013 - 12:21 AM

Why? There's no performance advantage to using fixed-function, since it's just going to be emulated in a shader (which may well be slower than a vertex shader that you've authored).

Anyway if you really don't want to use a vertex shader, I think you can use PSIZE as a vertex element.




#5052611 Point Sprite Vertex Size

Posted by MJP on 12 April 2013 - 05:33 PM

You can output a float from your vertex shader with the PSIZE semantic. This will override what you specify for D3DRS_POINTSIZE.




#5052524 Binding shader resources

Posted by MJP on 12 April 2013 - 12:16 PM

Any resources stay bound to the context until you overwrite them, or call ClearState. If I were you, I would use the VS 2012 graphics debugger (or PIX, or Nsight, or AMD GPU PerfStudio) to capture a frame so that you can see when and where the textures are getting bound and unbound. This should let you track down where your bug is.
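For instance, if you want to explicitly unbind a texture you can overwrite the slot with NULL (a small sketch, not from the original post):

    // Overwrite slot 0 with a null SRV to unbind whatever texture was there
    ID3D11ShaderResourceView* nullSRV[1] = { NULL };
    context->PSSetShaderResources(0, 1, nullSRV);

    // Or reset every binding on the context at once
    context->ClearState();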




#5052271 Separate DirectCompute context?

Posted by MJP on 11 April 2013 - 05:08 PM

You can't have multiple immediate contexts per device, you would have to create multiple devices (you can do this in a single process). You can definitely share resource data between two devices, you just have to handle the synchronization yourself using the DXGI sync primitives. Basically you create a resource on one device and specify the D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX flag, and then you pass the shared handle to OpenSharedResource to get an interface to the same resource on the other device. Then you use IDXGIKeyedMutex to synchronize any access to the shared resource.
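The rough sequence looks like this (error handling omitted, sizes and bind flags are just placeholders):

    // Device A: create the shared texture with the keyed mutex flag
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = 1024;
    desc.Height = 1024;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX;

    ID3D11Texture2D* texA = NULL;
    deviceA->CreateTexture2D(&desc, NULL, &texA);

    // Get the shared handle through the DXGI resource interface
    IDXGIResource* dxgiResource = NULL;
    texA->QueryInterface(__uuidof(IDXGIResource), (void**)&dxgiResource);
    HANDLE sharedHandle = NULL;
    dxgiResource->GetSharedHandle(&sharedHandle);
    dxgiResource->Release();

    // Device B: open an interface to the same resource
    ID3D11Texture2D* texB = NULL;
    deviceB->OpenSharedResource(sharedHandle, __uuidof(ID3D11Texture2D), (void**)&texB);

    // Both devices need to acquire the keyed mutex before touching the resource
    IDXGIKeyedMutex* mutexB = NULL;
    texB->QueryInterface(__uuidof(IDXGIKeyedMutex), (void**)&mutexB);
    mutexB->AcquireSync(0, INFINITE);
    // ... use texB on device B ...
    mutexB->ReleaseSync(0);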

I'm not 100% sure if this will give you the behavior you want (totally independent command streams with no implicit synchronization or dependencies), but I *think* it should work.




#5052269 Building Shaders

Posted by MJP on 11 April 2013 - 05:04 PM

We have a complex in-house system that's used for building and processing content. It will export meshes and other data from Maya files, process vertex data, compile shaders, pack archives, and do anything else that's necessary to generate efficient, runtime-ready data that can be loaded on the target platform. In our content system materials will define which permutations they can have, and will also specify which shader code to use. Then when we process a mesh we will look at the materials assigned to it, figure out which permutations it needs, and compile the necessary shaders.




#5051687 Does the HLSL compiler in Visual Studio 2012 remove unreferenced constant buf...

Posted by MJP on 09 April 2013 - 07:26 PM

The compiler will absolutely strip unused resources from the compiled shader. You can verify this very easily by checking the resulting assembly dumped from fxc.exe.

Constant buffers are either stripped entirely, or left intact. The compiler won't selectively strip out parts of a constant buffer that aren't used, because doing so would change the layout of the buffer.
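You can see both behaviors by dumping the assembly listing and looking at the dcl statements at the top, for example (entry point and file names are placeholders):

    fxc /T ps_5_0 /E PSMain MyShader.hlsl /Fc MyShader.asm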




#5051434 Visual Studio 2012 graphical debugger and offscreen buffers

Posted by MJP on 09 April 2013 - 02:38 AM

The easiest way to do it is to pull up the Graphics Event List, and find the draw call(s) where you draw to that render target. When you select the draw call, the render target will pop up in the graphics experiment view.

 

For finding draw calls it helps to wrap related sections in events using ID3DUserDefinedAnnotation*. That way you can quickly see where your shadow pass is, or your SSAO pass, or whatever you're looking for. The arrows at the top of the event list will jump to the previous or next draw call, which is also handy.

 

*The old way to do this is to use D3DPERF_BeginEvent and D3DPERF_EndEvent, which were used for PIX. These still work with the VS 2012 debugger, but they require linking to d3d9.lib.
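Grabbing the annotation interface from the immediate context looks something like this (just a sketch; it requires the D3D11.1 runtime, so the QueryInterface can fail on older systems):

    ID3DUserDefinedAnnotation* annotation = NULL;
    context->QueryInterface(__uuidof(ID3DUserDefinedAnnotation), (void**)&annotation);

    annotation->BeginEvent(L"Shadow Pass");
    // ... shadow map draw calls ...
    annotation->EndEvent();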





