

RythBlade

Member Since 01 Jan 2012

#4933131 [DX11] HLSL 5 setting SamplerState from within the shader isn't working

Posted by RythBlade on 20 April 2012 - 04:05 AM

For all those that are interested - I've just found this post, which discusses a nice way to deal with this situation, and it's what I'm now going to do. The developer automatically sets up a set of SamplerState objects in their engine and binds them in the shaders by default. The shaders share a common include containing the register references for all of these, so every shader automatically has access to a set of standard samplers.

[Edit:] Thought I'd give a quick update - this method works fantastically. All my shaders now have immediate access to any samplers they might need, and it's simplified some parts of my engine code a lot! Works perfectly. A sketch of the idea is below.
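For anyone wanting a concrete starting point, here's a minimal C++ sketch of the idea. The device/context pointers are assumed to exist already, and the particular point/linear sampler pair, the slot numbers and the filter settings are my own choices for illustration, not taken from the linked post:

#include <d3d11.h>

// Engine start-up: create a couple of common samplers once and bind them
// to fixed slots, so every shader can rely on them being there.
// (Sketch only - assumes 'device' and 'context' were created elsewhere;
// the engine would keep these samplers alive and Release them at shutdown.)
void BindCommonSamplers(ID3D11Device* device, ID3D11DeviceContext* context)
{
    D3D11_SAMPLER_DESC desc;
    ZeroMemory(&desc, sizeof(desc));
    desc.Filter = D3D11_FILTER_MIN_MAG_MIP_POINT;
    desc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MaxLOD = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* pointSampler = NULL;
    device->CreateSamplerState(&desc, &pointSampler);

    desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
    ID3D11SamplerState* linearSampler = NULL;
    device->CreateSamplerState(&desc, &linearSampler);

    // Bind both to the pixel shader stage once - slots s0 and s1.
    ID3D11SamplerState* samplers[2] = { pointSampler, linearSampler };
    context->PSSetSamplers(0, 2, samplers);
}

// The common shader include would then just declare the matching registers:
//   SamplerState PointSampler  : register( s0 );
//   SamplerState LinearSampler : register( s1 );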


#4933130 [DX11] HLSL 5 setting SamplerState from within the shader isn't working

Posted by RythBlade on 20 April 2012 - 04:02 AM

No I'm not.....goddamn it! I was hoping for a quick win. Ok, yeah, it looks like you're right - I just stumbled onto this post after reading your reply. It seems my google-fu is still not as strong as I'd hoped.
Thanks very much for the help!


#4933121 [DX11] HLSL 5 setting SamplerState from within the shader isn't working

Posted by RythBlade on 20 April 2012 - 03:35 AM

Hi there,
I'm having some trouble with an HLSL shader I've written for a deferred renderer, which texture-maps a road in my game.
Basically, I define a SamplerState at the top of my shader, which I then use to sample the textures mapped onto the surface.
I've defined it based on the MSDN documentation, so that I shouldn't need to create and set any Samplers from within my C++ code - it can all be done from within the shader.
However, when I run it, the sampler seems to be ignored. The DirectX debug layer gives the following warning:

D3D11: WARNING: ID3D11DeviceContext::Draw: The Pixel Shader unit expects a Sampler to be set at Slot 0, but none is bound. This is perfectly valid, as a NULL Sampler maps to default Sampler state. However, the developer may not want to rely on the defaults.  [ EXECUTION WARNING #352: DEVICE_DRAW_SAMPLER_NOT_SET ]

Here are the relevant code snippets from my shader:
// textures are passed in from my application
Texture2D widthMap  : register( t0 );
Texture2D lengthMap : register( t1 );

// the sampler state defined in shader code so that my application
// doesn't have to
SamplerState MySampler
{
    Filter = MIN_MAG_MIP_POINT;
    AddressU = Wrap;
    AddressV = Wrap;
};

// pixel shader which samples the texture
POut PShader(PIn input)
{
    // ...
    output.colour = float4(lengthMap.Sample(MySampler, float2(input.texCoord.y, 0.0f)));
    // ...
    return output;
}


Ignore the dodgy texture coordinate - the premise for my texture sampling is to "build" the actual texture out of two textures, one defining the u colour and one the v colour. When both the u and v are filled in, I use the colour; otherwise that pixel is set to black. I'm using this to generate the road lines, so that I can define any road line layout I want from two very small textures.

I've dug through a few of the DirectX samples, and I can see them declaring the SamplerState at the top just as I have, and they seem to have no such problems.
I've also tried declaring a SamplerState for each texture I want to sample, setting the "Texture" field of each state to its target texture. I changed it to the current version, as that's how the DirectX samples seem to do it.

This problem is also present everywhere else I sample a texture in my deferred shaders!

I've got no idea what I've missed. I can't see any settings I need to set to tell DirectX to use whatever the shader itself supplies - as far as I was aware, declaring it in my shader should work fine.

I can post more examples of my shader files if needed.

Has anyone got any suggestions?

Thanks very much!!


#4911941 DirectX 11 can't find all of my graphics adapters

Posted by RythBlade on 11 February 2012 - 06:30 AM

No worries, that's why I put in all the detail!
Those are some good tutorials! I'd also have a look through the tutorials available on directtutorial.com; they give some nice descriptions of what everything does, an explanation of the parameters, and why they've done it. You have to pay for some of them, but I did perfectly fine with the free ones.


#4899906 DirectX 11 can't find all of my graphics adapters

Posted by RythBlade on 05 January 2012 - 04:12 AM

No worries! Thought I'd better make sure I document my solution as I know how helpful these boards are!

Update - PIX suffers from the same problem! Make sure you make similar alterations in the NVidia control panel for PIX, as we did above: either add a new profile for the PIX executable or modify the global settings.
Make sure PIX is closed while making these changes - or at least restart it after you've made them.
Note that your application will run fine while running your experiment, but when you attempt to inspect the rendering and debug the pixels afterwards, it will constantly fail, as PIX isn't able to create the feature level 11 devices the way the application did.

I assume this will be the same for all of the DX utilities as I can't see them in the profile list on the NVidia control panel.


#4899622 DirectX 11 can't find all of my graphics adapters

Posted by RythBlade on 04 January 2012 - 10:33 AM

I've found some more information on this problem. Like I said earlier, some machines - especially laptops - have a second low-power graphics chip to minimise power consumption when high performance isn't needed. NVidia's Optimus technology is in charge of deciding which to use for NVidia cards, and that is what is interfering with my ability to create a feature level 11 device.
Optimus is apparently triggered whenever DirectX, DXVA or CUDA calls are made. You can also create an application profile via the NVidia Control Panel, which targets an executable and enables/disables its features according to what you set. Most games register with NVidia and add themselves to their profile database, which is distributed as part of Optimus. The DirectX SDK samples have several of these profiles, which is why they can see and make use of the GeForce card and why I can't.
I'm not sure about Optimus triggering itself when it detects DirectX - if it did, I wouldn't have this problem in the first place. It seems a temperamental utility at present.

So anyway - I've added my own profile to the NVidia control panel to enable the graphics card when my application starts, and reset the global settings back to auto-selecting the graphics adapter (just so I don't forget to switch it back off and then wonder where all my battery is going....), and everything works fine.

I've found a link to the white paper explaining this. Pages 15, 16, 17 are the ones to read on this issue.

Thanks again for your help with this matter!!


#4899568 DirectX 11 can't find all of my graphics adapters

Posted by RythBlade on 04 January 2012 - 06:49 AM

OMG I'VE GOT IT!!!

Right then, I'll explain my fix, which actually poses another question. I'm quite certain it wasn't the drivers - the NVidia updater said I was up-to-date, but nevertheless I forced the install of the latest drivers from their site. I opened up caps viewer and still got the same strange results as last time (see above).

But that started me thinking again. I've been stepping through a DX sample - the car tessellator one - which successfully sets up a DX 11 feature level, in a more roundabout fashion than I do.
I had a look through the NVidia Control Panel application and stumbled onto the graphics adapter settings. There's a combo box that says "preferred graphics processor". It was set to auto-select between the Intel chip and the GeForce card. I set it to always use the GeForce card, and now my setup works perfectly. Caps viewer also now displays what we were expecting under the graphics card. It's a little weird though: the original empty entry is still there, and the Intel one has been replaced by another GeForce folder that displays everything we were expecting.

For everyone's reference, follow these steps to alter the default display adapter in the NVidia Control Panel:
- Open the NVidia Control Panel by right-clicking on an empty area of the desktop and selecting it from the drop-down menu
- If it asks, make sure you're using the advanced view
- Expand 3D Settings in the window on the left and click Manage 3D Settings
- In the main window, select the Global Settings tab
- In the combo box marked Preferred Graphics Processor, expand the box and select your graphics card instead of Auto-Select

So, for my own reference: as there's obviously a way to make DX 11 choose the NVidia card properly (the samples manage it), any ideas on how this might be achieved? It's almost as if the max-feature-level support code is choosing the minimum feature level needed.
I'll investigate further and let you know if I find anything! (One programmatic route is sketched in the edit below.)
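[Edit:] For anyone finding this later - NVidia also documents a programmatic way to request the high-performance GPU, by exporting a special symbol from your executable so that Optimus picks the GeForce card without needing a profile. A minimal sketch (this is the NvOptimusEnablement export; it needs a driver recent enough to support it):

#include <windows.h>

// Exporting this symbol from the .exe tells Optimus-enabled NVidia drivers
// to prefer the high-performance GPU for this application.
extern "C"
{
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}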

Thanks very much for your help with this though - it's been great to get another viewpoint! Let me know if you need any help in the future - I promise I'm not as bad at this as I may seem :P


#4899533 DirectX 11 can't find all of my graphics adapters

Posted by RythBlade on 04 January 2012 - 04:08 AM

I've managed to make a little progress with this, but it's still not working. I've got it to find my NVidia card when enumerating the graphics adapters by altering the following lines in the code I posted earlier:
IDXGIFactory* factory = NULL;
has been changed to:
IDXGIFactory1* factory = NULL;

and
result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
has been changed to:
result = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);

It still won't create my objects at feature level 11, and actually, as the caps viewer showed, if I force it to choose the NVidia adapter it can't create any feature level!!
The device and swap chain creation function always returns -2005270524, which is DXGI_ERROR_UNSUPPORTED (0x887A0004) - the error code for the device not supporting the requested feature level.
The Intel graphics family adapter still only creates devices at feature level 10.1.......
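For context, here's roughly the enumeration-and-create pattern described above, with the IDXGIFactory1 change applied - a sketch only, with error handling trimmed, and the single feature level 11 entry in the list is my own choice:

#include <d3d11.h>
#include <dxgi.h>

// Enumerate adapters through DXGI 1.1 and try each one at feature level 11.
IDXGIFactory1* factory = NULL;
HRESULT result = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);

IDXGIAdapter1* adapter = NULL;
for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
{
    D3D_FEATURE_LEVEL requested = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL achieved;
    ID3D11Device* device = NULL;
    ID3D11DeviceContext* context = NULL;

    // When an explicit adapter is passed, the driver type must be
    // D3D_DRIVER_TYPE_UNKNOWN, or the call fails with E_INVALIDARG.
    result = D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, NULL, 0,
                               &requested, 1, D3D11_SDK_VERSION,
                               &device, &achieved, &context);

    if (SUCCEEDED(result) && achieved == D3D_FEATURE_LEVEL_11_0)
    {
        // this adapter can create a feature level 11 device
        // (real code would keep or Release device/context here)
    }
    adapter->Release();
}
factory->Release();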

