
DX11 - Preparing for Release Mode


Hi guys,

 

a few weeks ago my computer went down completely with a Blue Screen of Death showing a WHEA error (Windows Hardware Error Architecture, that's bad), but luckily I kept a backup of everything.

 

As I'm writing this, I'm porting my project to the new PC. The repair shop managed to create a complete backup, but they switched two hard drives, so D:\ became F:\, which messed up some stuff.

 

Now, To the POINT!

 

I re-installed Visual Studio, but somehow it doesn't support the debug version of the C++ runtime, which I find weird (notes obtained from external sources), so now I'm trying to make my project work in release mode, which is quite a pain in the ass...

 

I managed to make it run, but at a certain point (I've since found that point, by the way) it crashes the NVIDIA kernel driver, which isn't exactly what was supposed to happen, unless I was drunk while writing my application, but I don't remember so.

 

The crash seems to occur when I send some shader resources to the pixel shader, like so:

devcon->PSSetShaderResources(3, 1, &random);

The resource is created in a class called C3DEngineObject, which has a member called PostProcess, which in turn has a function called Render.

 

The Render function has a parameter called random, as seen above. The weird thing is that in debug mode (on the dead PC) it worked fine! Perfectly, actually!

 

So why does it fail now? Does it have something to do with release mode?

 

So my real question is: how can sending a resource to the pixel shader cause a graphics driver crash?

 

Thanks! 

 

PS. If you need more, please say so.



I re-installed Visual Studio, but somehow it doesn't support the debug version of the C++ runtime, which I find weird (notes obtained from external sources), so now I'm trying to make my project work in release mode, which is quite a pain in the ass...

 

So your project can't compile and run in debug mode right now?


Do you mean it doesn't compile in debug, or that device creation fails when using the debug DX runtimes? If you mean the latter and you're using VS2010, you need to install the remote VS2012 debugging tools (free), and it will work again.


It compiles fine, but when I run it (in debug mode), this message appears (from memory): "The DLL MSVCP100d.dll could not be found."

 

PS. I'm using VS 2012


looks like the debug version has dependencies on a dll that's no longer installed. a google search should tell you what package it's from (an earlier version of VS, some MS runtime of some sort, etc). then all you have to do is get the dll.

 

long ago i learned that games can behave differently in debug and release mode. stuff that works in debug doesn't work in release (string pooling in my case, a long time ago). so i don't use debug at all anymore, haven't for years. i start a new project, set it to release mode, set my compiler options, and then code away with no worries.
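to illustrate the kind of difference i mean, here's a minimal sketch of the classic uninitialized-variable case (hypothetical code, not from the project above):

#include <cstdio>

int Sum(const int* values, int count)
{
	int total; // BUG: never initialized
	for (int i = 0; i < count; ++i)
		total += values[i];
	return total;
}

int main()
{
	int data[3] = { 1, 2, 3 };
	// debug: MSVC fills locals with a 0xCC pattern and /RTC1 flags the bug;
	// release: 'total' starts as whatever garbage is on the stack, so this
	// may print 6 by pure luck, print nonsense, or crash somewhere downstream.
	std::printf("%d\n", Sum(data, 3));
	return 0;
}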

 

when i need hardcore debugging i use trace variables and hi-rez timers. i used to use trace files for debugging graphics when you can't display stuff on the screen easily, but i haven't even needed that in years either. i think the last time i used trace files was when i was writing my own 3d p-correct t-mapped poly engine between the time that MS bought Brender (when i was about to license a copy) and the time they released it as DirectX v1.0 (with no retained mode!).
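for reference, a minimal sketch of the hi-rez timer idea using win32's QueryPerformanceCounter (the usage lines at the bottom are hypothetical):

#include <windows.h>
#include <cstdio>

double NowSeconds()
{
	static LARGE_INTEGER freq = { 0 };
	if (freq.QuadPart == 0)
		QueryPerformanceFrequency(&freq);

	LARGE_INTEGER now;
	QueryPerformanceCounter(&now);
	return (double)now.QuadPart / (double)freq.QuadPart;
}

// usage: bracket a suspect call and log to the debugger output window.
// double t0 = NowSeconds();
// devcon->PSSetShaderResources(3, 1, &random);
// char buf[128];
// sprintf_s(buf, "PSSetShaderResources took %.6f s\n", NowSeconds() - t0);
// OutputDebugStringA(buf);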

 

if you can live without the features of debug mode, you may want to just switch to release mode and be done with it. in the long run, debug mode is irrelevant to the final version of the game; it's just a crutch to help get you there.


Working in release mode may be a bit difficult, especially if you're working on an experimental or unstable system. Because of the code optimizations, it is almost impossible to debug your code unless you disable those optimizations in your project properties. The best approach is probably to build both a debug and a release version of your application from time to time, to make sure everything works and to immediately spot any problem that might cause bigger problems later on.
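For example, here's one way (an MSVC-specific sketch, not the only option) to turn optimizations off around a single suspect function in a release build instead of switching the whole project:

// SuspectFunction is a hypothetical name; wrap whatever you're investigating.
#pragma optimize("", off)
void SuspectFunction()
{
	// Code under investigation: locals are now watchable and
	// breakpoints land on the lines you expect, even in release.
}
#pragma optimize("", on)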

 

I have encountered similar problems, where it would run completely fine in debug but crash in release mode, 3-4 times many years ago. Sadly I don't remember exactly what it was, but I think mine was more about uninitialized variables than missing libraries or DLLs.

 

@Migi0027: I don't really have any solutions, but a few simple questions come to mind:

 

* Are you sure your device context is a valid pointer? Your device context may be valid in debug mode, but you might have a different setting when creating your device context in release mode.

 

* Is there a possibility of memory leaks? Are you constantly creating graphics resources and possibly not releasing them properly? I would sometimes do this and feel really stupid after realizing how simple the problem was, yet how insanely hard it was to track down. (See the sketch after this list for one way to check.)

 

* If you're a hundred percent sure your code should work, then there may be a bug in your graphics driver. Try updating your graphics driver to the latest version.

 

* Based on your original post, I'm going to assume you've managed to recover the OS and are using that existing OS on your new computer. I normally don't trust my OS after a problem like yours. Maybe some of your libraries were deleted or corrupted. Maybe try a fresh install and see if that works?
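For the memory-leak question, a hedged sketch of one way to check: if the device was created with D3D11_CREATE_DEVICE_DEBUG, you can ask ID3D11Debug to dump every live D3D object to the output window.

#include <d3d11.h>
#include <d3d11sdklayers.h>

void ReportLiveObjects(ID3D11Device* dev)
{
	ID3D11Debug* debug = nullptr;
	if (SUCCEEDED(dev->QueryInterface(__uuidof(ID3D11Debug), (void**)&debug)))
	{
		// Lists every live device child (textures, views, buffers, ...),
		// which makes unreleased resources easy to spot.
		debug->ReportLiveDeviceObjects(D3D11_RLDO_DETAIL);
		debug->Release();
	}
}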


 Norman: I agree with the information you gave, also because I have previous experience where the debugger automatically initialized variables for me, which, as in your case, was very hard to track down.

 

Brent: One very important thing you told me (which I am going to experiment with) is to disable code optimization, just for testing, to track down the issue in a simpler manner.

 

Your Questions:

  1. I'm not sure how a device context can be created differently based on the build mode; please tell me how.
  2. I actually haven't paid much attention to this in my code, so thanks for reminding me!
  3. I don't believe that's the case, but I'll keep it in mind.
  4. I'll keep that in mind, but it will be my last resort.
Edited by Migi0027


Ok, now I've figured out that it doesn't only crash in that function, but also when sending the constant buffers.

 

Is there a difference in how you initialize DirectX in debug and release mode?


When you create your device, D3D11CreateDevice(...) has a parameter that accepts creation flags. I'm not really doing anything special, just adding an extra flag whenever I'm in debug mode.

UINT createFlags = 0;
...

if ( pDesc->bDebug )
	createFlags |= D3D11_CREATE_DEVICE_DEBUG;
...

With that flag, D3D gives me additional debug info whenever something is wrong, along with any warning messages. Try doing that in release mode and see if anything unusual shows up in Visual Studio's output window.
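You can also make the debug layer break into the debugger the moment an error is emitted, so the offending API call is right there on the call stack. A minimal sketch (again assuming the device was created with the debug flag above):

#include <d3d11.h>
#include <d3d11sdklayers.h>

void BreakOnD3DErrors(ID3D11Device* dev)
{
	ID3D11InfoQueue* infoQueue = nullptr;
	if (SUCCEEDED(dev->QueryInterface(__uuidof(ID3D11InfoQueue), (void**)&infoQueue)))
	{
		// Break on the two severities that usually precede a driver crash.
		infoQueue->SetBreakOnSeverity(D3D11_MESSAGE_SEVERITY_CORRUPTION, TRUE);
		infoQueue->SetBreakOnSeverity(D3D11_MESSAGE_SEVERITY_ERROR, TRUE);
		infoQueue->Release();
	}
}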


That flag was removed when porting to the new pc.

 

Do you have any idea how to get the inside information on what went wrong? If possible.


If useful, here's how I initialize DirectX:

#pragma region NativeDirectx
	HRESULT result;
	IDXGIFactory* factory;
	IDXGIAdapter* adapter;
	IDXGIOutput* adapterOutput;
	unsigned int numModes, i, stringLength;
	// Safe defaults in case no display mode below matches sw/sh;
	// otherwise these would be read uninitialized.
	unsigned int numerator = 0, denominator = 1;
	DXGI_MODE_DESC* displayModeList;
	DXGI_ADAPTER_DESC adapterDesc;
	int error;
	DXGI_SWAP_CHAIN_DESC swapChainDesc;
	D3D_FEATURE_LEVEL featureLevel;
	ID3D11Texture2D* backBufferPtr;
	D3D11_TEXTURE2D_DESC depthBufferDesc;
	D3D11_DEPTH_STENCIL_DESC depthStencilDesc;
	D3D11_DEPTH_STENCIL_VIEW_DESC depthStencilViewDesc;
	D3D11_RASTERIZER_DESC rasterDesc;
	D3D11_VIEWPORT viewport;
	float fieldOfView, screenAspect;


	// Store the vsync setting.
	bool m_vsync_enabled = false;

	// Create a DirectX graphics interface factory.
	result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Use the factory to create an adapter for the primary graphics interface (video card).
	result = factory->EnumAdapters(0, &adapter);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Enumerate the primary adapter output (monitor).
	result = adapter->EnumOutputs(0, &adapterOutput);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Get the number of modes that fit the DXGI_FORMAT_R8G8B8A8_UNORM display format for the adapter output (monitor).
	result = adapterOutput->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_ENUM_MODES_INTERLACED, &numModes, NULL);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Create a list to hold all the possible display modes for this monitor/video card combination.
	displayModeList = new DXGI_MODE_DESC[numModes];
	if(!displayModeList)
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Now fill the display mode list structures.
	result = adapterOutput->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_ENUM_MODES_INTERLACED, &numModes, displayModeList);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Now go through all the display modes and find the one that matches the screen width and height.
	// When a match is found store the numerator and denominator of the refresh rate for that monitor.
	for(i=0; i<numModes; i++)
	{
		if(displayModeList[i].Width == (unsigned int)sw)
		{
			if(displayModeList[i].Height == (unsigned int)sh)
			{
				numerator = displayModeList[i].RefreshRate.Numerator;
				denominator = displayModeList[i].RefreshRate.Denominator;
			}
		}
	}

	// Get the adapter (video card) description.
	result = adapter->GetDesc(&adapterDesc);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Store the dedicated video card memory in megabytes.
	m_videoCardMemory = (int)(adapterDesc.DedicatedVideoMemory / 1024 / 1024);

	// Convert the name of the video card to a character array and store it.
	error = wcstombs_s(&stringLength, m_videoCardDescription, 128, adapterDesc.Description, 128);
	if(error != 0)
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Release the display mode list.
	delete [] displayModeList;
	displayModeList = 0;

	// Release the adapter output.
	adapterOutput->Release();
	adapterOutput = 0;

	// Release the adapter.
	adapter->Release();
	adapter = 0;

	// Release the factory.
	factory->Release();
	factory = 0;

	// Initialize the swap chain description.
	ZeroMemory(&swapChainDesc, sizeof(swapChainDesc));

	// Set to a single back buffer.
	swapChainDesc.BufferCount = 1;

	// Set the width and height of the back buffer.
	swapChainDesc.BufferDesc.Width = sw;
	swapChainDesc.BufferDesc.Height = sh;

	// Set regular 32-bit surface for the back buffer.
	swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;

	// Set the refresh rate of the back buffer.
	if(m_vsync_enabled)
	{
		swapChainDesc.BufferDesc.RefreshRate.Numerator = numerator;
		swapChainDesc.BufferDesc.RefreshRate.Denominator = denominator;
	}
	else
	{
		swapChainDesc.BufferDesc.RefreshRate.Numerator = 0;
		swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;
	}

	// Set the usage of the back buffer.
	swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;

	// Set the handle for the window to render to.
	swapChainDesc.OutputWindow = hWnd;

	// Turn multi sampling off.
	swapChainDesc.SampleDesc.Count = 1;
	swapChainDesc.SampleDesc.Quality = 0;

	// Set to full screen or windowed mode.
	if(windowed)
	{
		swapChainDesc.Windowed = true;
	}
	else
	{
		swapChainDesc.Windowed = false;
	}

	// Set the scan line ordering and scaling to unspecified.
	swapChainDesc.BufferDesc.ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_UNSPECIFIED;
	swapChainDesc.BufferDesc.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;

	// Discard the back buffer contents after presenting.
	swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;

	// Don't set the advanced flags.
	swapChainDesc.Flags = 0;

	// Set the feature level to DirectX 11.
	featureLevel = D3D_FEATURE_LEVEL_11_0;

	// Create the swap chain, Direct3D device, and Direct3D device context.
	result = D3D11CreateDeviceAndSwapChain(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0, &featureLevel, 1, 
		D3D11_SDK_VERSION, &swapChainDesc, &swapchain, &dev, NULL, &devcon);
	if(FAILED(result))
	{
		CE_ERROR("Error in creating CESDK", "FATAL ERROR");
	}

	// Describe the depth buffer texture (using the locals declared above).
	ZeroMemory(&depthBufferDesc, sizeof(depthBufferDesc));

	depthBufferDesc.Width = sw;
	depthBufferDesc.Height = sh;
	depthBufferDesc.ArraySize = 1;
	depthBufferDesc.MipLevels = 1;
	depthBufferDesc.SampleDesc.Count = 1;
	depthBufferDesc.SampleDesc.Quality = 0;
	depthBufferDesc.Format = DXGI_FORMAT_D32_FLOAT;
	depthBufferDesc.BindFlags = D3D11_BIND_DEPTH_STENCIL;

	ID3D11Texture2D *pDepthBuffer;
	dev->CreateTexture2D(&depthBufferDesc, NULL, &pDepthBuffer);

	ZeroMemory(&depthStencilViewDesc, sizeof(depthStencilViewDesc));

	depthStencilViewDesc.Format = DXGI_FORMAT_D32_FLOAT;
	depthStencilViewDesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;

	dev->CreateDepthStencilView(pDepthBuffer, &depthStencilViewDesc, &zbuffer);

	if (pDepthBuffer != nullptr)
		pDepthBuffer->Release();

	// get the address of the back buffer
	ID3D11Texture2D *pBackBuffer;
	swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&pBackBuffer);

	// use the back buffer address to create the render target
	dev->CreateRenderTargetView(pBackBuffer, NULL, &backbuffer);

	if (pBackBuffer != nullptr)
		pBackBuffer->Release();

	// set the render target as the back buffer
	devcon->OMSetRenderTargets(1, &backbuffer, zbuffer);


	// Set the viewport
	ZeroMemory(&viewport, sizeof(D3D11_VIEWPORT));

	viewport.TopLeftX = 0;    // set the left to 0
	viewport.TopLeftY = 0;    // set the top to 0
	viewport.Width = sw;    // set the width to the window's width
	viewport.Height = sh;    // set the height to the window's height
	viewport.MinDepth = 0;    // the closest an object can be on the depth buffer is 0.0
	viewport.MaxDepth = 1;    // the farthest an object can be on the depth buffer is 1.0

	devcon->RSSetViewports(1, &viewport);
	#pragma endregion


A quick look at your code suggests it has enough safety checks to catch anything going wrong in the device creation and setup process. I'm afraid I have no more suggestions for your problem.

 

The worst bug I can think of is if something somewhere in your code is doing a buffer overflow and overwriting your graphics device context. I'm not sure if this can be detected with _CrtSetDbgFlag() and _CrtDumpMemoryLeaks(), but with the safety checks you have above I doubt you're making that mistake somewhere in your code.
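For completeness, a minimal sketch of how those CRT debug-heap checks are typically wired up (they only do anything when linking against the debug CRT):

#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

int main()
{
	// Validate the heap on every allocation/free and dump leaks at exit.
	_CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_CHECK_ALWAYS_DF | _CRTDBG_LEAK_CHECK_DF);

	// ... run the application ...

	_CrtDumpMemoryLeaks(); // redundant with _CRTDBG_LEAK_CHECK_DF, shown explicitly
	return 0;
}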

 

I hope someone here can help you out; I'm also eager to learn what exactly the problem with your code is, and its fix.

Edited by BrentChua


Do you think using a copy-pasted DX11 SDK folder could cause this problem?

 

(I didn't download the SDK again... too lazy...)


I'm not sure... I don't know the exact contents of the SDK installer, but I'm guessing that besides the API it installs some additional DX runtimes in your system folder as well. A wrong version of the runtime libraries might be a reason for a crash like that? But then again, that should cause an error message just before your app runs, instead of a weird crash.

 

Is the crash a BSOD?


It even fails when setting the shader. (Though here it doesn't crash; instead it throws an exception: [ Access violation reading location 0xBAADF015. ])

 

devcon->VSSetShader(pVS, 0, 0);

 

And the shader is created and validated with no errors!

 

This is really weird.


Ok, just tweaking, looking and stuff. 

 

And it finally ran! (...in debug mode)

 

But black...

 

So I ran PIX and found that the full-screen quad was being rendered, with color. Looking over the code again, the stencils weren't being restored properly.
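For reference, a general sketch of the save/restore pattern involved (not the exact code from my engine; the post-process state in the middle is whatever your pass swaps in):

ID3D11DepthStencilState* savedState = nullptr;
UINT savedStencilRef = 0;
devcon->OMGetDepthStencilState(&savedState, &savedStencilRef); // adds a reference

// ... render the full-screen quad with the post-process depth/stencil state ...

devcon->OMSetDepthStencilState(savedState, savedStencilRef); // put the old state back
if (savedState)
	savedState->Release(); // balance the reference from OMGet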

 

But I guess the next goal is to make it run in release mode; for now, I'm just going to play around a bit.

 

Just one question: can DirectX 11 with C++ corrupt the system or cause a BSOD, even if the system is healthy?

 

Thanks for your help!


The code itself shouldn't cause a BSOD or other "enjoyable" Windows hangs and crashes...

Any crashes/hangs/BSODs/nuclear disasters/Celine Dion songs/zombie pandemics/etc. caused by a Direct3D application come down to the quality of the video drivers alone; with the introduction of WDDM (since Vista), most problems appear as driver crashes with a reset to the desktop (or, if you're lucky, just a few seconds of hanging without the application crashing)...
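For illustration, a small sketch (reusing the variable names from the initialization code above) of how an application typically observes such a driver reset:

HRESULT hr = swapchain->Present(0, 0);
if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET)
{
	// e.g. DXGI_ERROR_DEVICE_HUNG, DXGI_ERROR_DRIVER_INTERNAL_ERROR, ...
	HRESULT reason = dev->GetDeviceRemovedReason();
	// Log 'reason', then recreate the device and all GPU resources.
}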

Edited by Alessio1989


 


 

 

Phew..

 

Ok, thanks!
