_Flame_

DX11 software mode.

25 posts in this topic

Hi.
Is it possible to run a DX11 application in software mode on Win7? I've tried, but I get an "incorrect parameter" error. I know that I need to pass a module handle as one of the parameters, and I did, but it's not the handle of the module where the window is. Is that OK? Could anyone help me? Maybe someone has an example? Thanks.
For software mode you're supposed to provide a handle to a DLL that implements a D3D11 driver + rasterizer in software. If you don't have one, you can't use it.

There is a built-in software rasterizer called [url="http://msdn.microsoft.com/en-us/library/gg615082.aspx"]WARP[/url]. To use it, you pass D3D_DRIVER_TYPE_WARP when creating the device.
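
In case an example helps, here's a minimal sketch of creating a WARP device (error handling trimmed, variable names are illustrative):

```cpp
// Sketch: create a D3D11 device on the built-in WARP software rasterizer.
#include <d3d11.h>

ID3D11Device*        pDevice  = nullptr;
ID3D11DeviceContext* pContext = nullptr;
D3D_FEATURE_LEVEL    level;

HRESULT hr = D3D11CreateDevice(
    nullptr,               // pAdapter must be NULL for a non-UNKNOWN driver type
    D3D_DRIVER_TYPE_WARP,  // the built-in software rasterizer
    nullptr,               // Software DLL handle: only for D3D_DRIVER_TYPE_SOFTWARE
    0,                     // creation flags
    nullptr, 0,            // default feature level list
    D3D11_SDK_VERSION,     // note: D3D11_SDK_VERSION, not D3D10_SDK_VERSION
    &pDevice, &level, &pContext);
```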
Thanks for the answer. So, if I understand correctly, I need to provide a handle to my main application, but not a handle to the DLL where my engine is? And after that it will work in software mode? OK, I will try. Just another question: what's the difference between software and reference mode? And what do you mean by "built-in software rasterizer, called WARP"? I've heard a little about that mode, and I thought it was a software mode that works somewhat faster than the usual software mode, though I don't know why. But the problem is that only DX10.1 is available on Win7 with that mode, and that's not suitable for me.
[quote name='_Flame_' timestamp='1342070397' post='4958258']
But the problem is that only DX10.1 is available on Win7 with that mode and it's not suitable for me.
[/quote]
I'm not sure where you heard that, but you heard wrong. Full D3D11 is available on Windows 7. Even if you have downlevel hardware, you can still use feature levels to target it with the D3D11 API - http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876%28v=vs.85%29.aspx
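
For example, here's a sketch of targeting downlevel hardware through the D3D11 API (the feature level array is illustrative):

```cpp
// Sketch: request a range of feature levels; the runtime picks the highest
// one the hardware supports and returns it in the 'chosen' out parameter.
#include <d3d11.h>

const D3D_FEATURE_LEVEL levels[] = {
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_1,
};

ID3D11Device*        pDevice  = nullptr;
ID3D11DeviceContext* pContext = nullptr;
D3D_FEATURE_LEVEL    chosen;

HRESULT hr = D3D11CreateDevice(
    nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
    levels, ARRAYSIZE(levels),
    D3D11_SDK_VERSION, &pDevice, &chosen, &pContext);
```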
It's me again, but under a bit different nickname, because I don't have the password here for my original nickname :)

I've read about those limitations here - [url="http://msdn.microsoft.com/en-us/library/windows/desktop/ff476328%28v=vs.85%29.aspx"]http://msdn.microsof...8(v=vs.85).aspx[/url]

[quote]A WARP driver, which is a high-performance software rasterizer. The rasterizer supports feature levels 9_1 through level 10_1 with a high performance software implementation
[b]Note[/b] The WARP driver that Windows 8 includes supports feature levels 9_1 through level 11_1.[/quote]

But anyway, I'm not successful with any of the software modes now. My app is .NET. And I do this: GetModuleHandle("MayApp.vcshost.exe"); this function returns a handle which isn't null, but I again get the error "Incorrect parameter" from the function [font=monospace]D3D11CreateDevice[/font]. What's wrong? Edited by _Flame1_
The handle is expected to be that of a native module (commonly a DLL) which implements and exports the D3D driver entry points (there are a lot of them).

All the stuff you need to implement is listed in the Windows DDK under the graphics drivers section. The basics of implementing a software driver are almost exactly the same as implementing the user-mode portion of a hardware driver, though direct kernel access is not strictly necessary in pure software. In addition, a software driver necessarily needs to implement the actual drawing operations - the whole graphics pipeline - which would otherwise reside on GPU hardware.

I am under the impression that D3D 11.1 will be released on Vista and 7 as well, after '8' launches. This would enable you to use the WARP driver with the 11.0 feature set. Edited by Nik02
Very funny, Nik02. So I need to implement a software driver for that.

[quote]A reference driver, which is a software implementation that supports every Direct3D feature. A reference driver is designed for accuracy rather than speed and as a result is slow but accurate. The rasterizer portion of the driver does make use of special CPU instructions whenever it can, but it is not intended for retail applications; use it only for feature testing, demonstration of functionality, debugging, or verifying bugs in other drivers. This driver is installed by the DirectX SDK. This driver may be referred to as a REF driver, a reference driver or a reference rasterizer.[/quote]

Where can I get that driver? :)
I see that the DX runtime has D3D11Ref.dll. What's that? Edited by _Flame1_
Guys, why are you decreasing my reputation? I just need to use software mode for my application, no more. :) Is that a problem? Edited by _Flame1_
[quote name='_Flame_' timestamp='1342039817' post='4958157']
I know that I need to pass a module handle as one of the parameters, and I did, but it's not the handle of the module where the window is. Is that OK?
[/quote][quote name='MJP' timestamp='1342041509' post='4958168']
For software mode you're supposed to provide a handle to a DLL that implements a D3D11 driver + rasterizer in software. If you don't have one, you can't use it.
[/quote][quote name='_Flame_' timestamp='1342070397' post='4958258']
So, if I understand correctly, I need to provide a handle to my main application, but not a handle to the DLL where my engine is?
I do this: GetModuleHandle("MayApp.vcshost.exe"); this function returns a handle which isn't null, but I again get the error "Incorrect parameter" from the function D3D11CreateDevice. What's wrong?
[/quote][quote name='Nik02' timestamp='1342096354' post='4958374']
The handle is expected to be that of a native module (commonly a dll), which implements and exports the D3D driver entrypoints (there are lot of them).
[/quote][quote name='_Flame1_' timestamp='1342100958' post='4958397']
Very funny Nik02. I need to implement a software driver for that.
[/quote]You asked about your incorrect use of the handle parameter... MJP told you that this parameter is supposed to be used to load a custom D3D driver DLL ([i]and tried to steer you away from this and onto the WARP software driver[/i])... You ignored him and kept trying to pass a handle to your application as if it were a software D3D driver... Nik02 again explained that the handle is for a driver DLL, and that if you want to use it, you must implement this DLL yourself... and then you treat his reply as a joke, and tell him what they both told you already?? I don't understand how you're simultaneously understanding their advice and ignoring their advice!

You can either
* load a software driver DLL (which you can write yourself, in theory...),
* use WARP with its feature level restrictions, or
* use the reference device, although it's supposed to only be used for debugging purposes. Edited by Hodgman
Dear Hodgman, I don't know this topic, and it's easy for me to misunderstand. They said that I need to pass a handle to where the driver is, but I thought that was in my application. OK, it was a mistake, now I've got it. I asked for some example or maybe a small piece of code. The advice about implementing a driver is really a joke for me. :) I didn't ask for that at all. All I need is to run my application in software mode. What should I do for that? I don't need abstract advice, I need practical advice. :)

[quote]
You can either
* load a software driver DLL (which you can write yourself, in theory...),
* use WARP with its feature level restrictions, or
* use the reference device, although it's supposed to only be used for debugging purposes.
[/quote]

1. I can't do the first one.
2. How?
3. How?

What the hell should I do for options 2 and 3? Edited by _Flame1_
For WARP you just specify D3D_DRIVER_TYPE_WARP in your D3D11CreateDeviceAndSwapChain call. That's all documented in the SDK.

As I flagged above, be very, very certain that you do in fact need a software device before jumping in and creating one. You indicated that your understanding was that Windows 7 doesn't support D3D11, but that understanding is wrong. Even if you do need to run a program on downlevel (i.e. D3D9 or 10 class) hardware, you can still do so via feature levels - check the link I provided above.

So it's likely that you don't in fact need a software device at all.
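
To make that concrete, here's a sketch of the call (the swap chain description is abbreviated, and g_hWnd is assumed to be your application's window handle):

```cpp
// Sketch: WARP device and swap chain created together.
#include <d3d11.h>

DXGI_SWAP_CHAIN_DESC sd = {};
sd.BufferCount       = 1;
sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow      = g_hWnd;   // assumed to be a valid HWND
sd.SampleDesc.Count  = 1;
sd.Windowed          = TRUE;

IDXGISwapChain*      pSwapChain = nullptr;
ID3D11Device*        pDevice    = nullptr;
ID3D11DeviceContext* pContext   = nullptr;

HRESULT hr = D3D11CreateDeviceAndSwapChain(
    nullptr, D3D_DRIVER_TYPE_WARP, nullptr, 0,
    nullptr, 0, D3D11_SDK_VERSION,
    &sd, &pSwapChain, &pDevice, nullptr, &pContext);
```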
[quote]For WARP you just specify D3D_DRIVER_TYPE_WARP in your D3D11CreateDeviceAndSwapChain call. That's all documented in the SDK.[/quote]
I use D3D11CreateDevice and that flag doesn't work for me. I've tried all the software flags (ref, WARP and software) and none of them works. I get the error "incorrect parameter".

[quote]As I flagged above, be very very certain that you do in fact need a software device before jumping in and creating one. You indicated that your understanding was that Windows 7 doesn't support D3D11 but that understanding is wrong. Even if you do need to run a program on downlevel (i.e. D3D9 or 10 class) hardware you can still do so via feature levels - check the link I provided above.[/quote]
No, you didn't understand me. I wrote that WARP doesn't have DX11, if I understand MSDN correctly.
What are the exact features present in the 11.0 profile that you can't live without?

Functionality equivalent to tessellation and compute shaders is fairly trivial to implement in software, even though the rest of the drawing would happen in actual GPU hardware (even legacy GPUs). Of course, software is always going to be slower at these, and neither WARP nor your theoretical custom software driver will change that.

And, as I said previously, the newest version of WARP does support the D3D11 feature set. You just have to wait until the 11.1 runtime is released later this year. If you can't wait, the only way is to write your own driver - which couldn't be further from trivial, but is possible for an experienced team of software engineers with a lot of time on their hands. Writing the driver from scratch very likely takes more time than waiting for the new WARP driver, though, no matter how experienced the team is. Edited by Nik02
Nik02, why are you telling me what I need and what I don't need? I've already said what I need. If it's impossible now, then OK, I will cope with it. I don't care about performance at all. The only thing I need is DX11 software mode to run my app without the appropriate hardware, no more.
The thing is, you've already been told that you can run your app without appropriate hardware.

You don't need a software mode for this.
You can still use the D3D11 API.

Just use [url="http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876%28v=vs.85%29.aspx"]feature levels[/url] and it will work. If you must, you can go so far as to specify D3D_FEATURE_LEVEL_9_1 and it will even work on ancient D3D9-class hardware. So unless there are specific features that only D3D11 has and that you must use - which was Nik02's question - you already have an appropriate solution that does not require a software mode.
mhagain, I know that I can use DX11 with profiles. It's not a secret to me. I need a software mode with the full-featured DX11 profile. Again, you advise me on what I don't need. It seems to me that this discussion is useless for me. It's better to close this topic if it keeps going this way. For some reason you just like decreasing reputation instead of offering real help.
We tried to be helpful and offer alternative ways to proceed, but it seems that the effort was wasted.

I know that MJP, mhagain and Hodgman are experienced software developers as well as helpful guys in general; we generally know what we're doing and we try to reply with our best knowledge. Usually, if you hit a dead end, the wise way to proceed is to find alternative ways. We presented several such ways, all of which are viable (though with hugely varying amounts of effort required), but you insist on clawing your way through the thick brick wall that blocks your path.

If we had more accurate information about the scenario you actually want to accomplish, we would be able to offer more accurate help. But, personally, I don't have any incentive left to do that. I'll just concentrate on helping those that appreciate the effort.

Bye,
-N
[quote name='_Flame_' timestamp='1342207571' post='4958880']
I don't care about performance at all. The only thing I need is DX11 software mode to run my app without the appropriate hardware, no more.
[/quote]

D3D_DRIVER_TYPE_REFERENCE is exactly that, as others have already suggested. It supports the full set of D3D11 features.
Flame, do you have the DirectX SDK installed? Debug, Reference [s]and WARP[/s] drivers are [i]only[/i] available from the SDK - since they are supposed to be debug tools. Edited by Bacterius
WARP isn't a debugging tool, it's intended to be deployed. So I'm pretty sure you don't need to install the SDK to use it.
[quote name='MJP' timestamp='1342402740' post='4959413']
WARP isn't a debugging tool, it's intended to be deployed. So I'm pretty sure you don't need to install the SDK to use it.
[/quote]
Indeed [url="http://msdn.microsoft.com/en-us/library/windows/desktop/gg615082%28v=vs.85%29.aspx#how_to_use_warp"]it is part of the D3D runtime[/url] - my mistake. Thanks MJP.
Yes. It's not good if it's necessary to install the DirectX SDK, but it's alright.
OK, here is my code.

[code]D3D_DRIVER_TYPE driver = D3D_DRIVER_TYPE_REFERENCE;
D3D11CreateDevice(pAdapter, driver, 0, 0, 0, 0, D3D11_SDK_VERSION, &pDevice, NULL, NULL );[/code]

Also, I've tried D3D_DRIVER_TYPE_WARP, and I get HRESULT 0x80070057 (The parameter is incorrect.) every time.
Yes, I have the latest DirectX SDK installed.
Maybe you need the other 2 [font=courier new,courier,monospace]out[/font] params to not be [font=courier new,courier,monospace]NULL[/font]?[code]D3D_FEATURE_LEVEL level = (D3D_FEATURE_LEVEL)0;
ID3D11DeviceContext* pContext = 0;
D3D_DRIVER_TYPE driver = D3D_DRIVER_TYPE_REFERENCE;
D3D11CreateDevice(pAdapter, driver, 0, 0, 0, 0, D3D11_SDK_VERSION, &pDevice, &level, &pContext );[/code]
[Edit] Also, did you see this restriction? It implies that [font=courier new,courier,monospace]pAdapter[/font] should be [font=courier new,courier,monospace]NULL[/font] for a reference driver.
[quote]If you set the pAdapter parameter to a non-NULL value, you must also set the DriverType parameter to the D3D_DRIVER_TYPE_UNKNOWN value.[/quote] Edited by Hodgman
Thanks. D3D_DRIVER_TYPE_REFERENCE is working now with a NULL adapter.

But I can't create a swap chain now.
The CreateSwapChain function returns the error 0x887A0001 (DXGI_ERROR_INVALID_CALL).


Also, I can't run my application with WARP.
[code]D3D_DRIVER_TYPE driver = D3D_DRIVER_TYPE_WARP;
D3D11CreateDevice(0, driver, 0, 0, 0, 0, D3D10_SDK_VERSION, &pDevice, NULL, NULL );[/code]
I have an error "The parameter is incorrect" again. Edited by _Flame1_
Any ideas? First I invoke D3D11CreateDevice, and then IDXGIFactory->CreateSwapChain.

My swap chain description is quite simple.

[code]DXGI_SWAP_CHAIN_DESC sd;
ZeroMemory( &sd, sizeof( sd ) );
sd.BufferCount = 1;
sd.BufferDesc.Width = 640;
sd.BufferDesc.Height = 480;
sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
sd.BufferDesc.RefreshRate.Numerator = 60;
sd.BufferDesc.RefreshRate.Denominator = 1;
sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow = g_hWnd;
sd.SampleDesc.Count = 1;
sd.SampleDesc.Quality = 0;
sd.Windowed = TRUE;[/code]

I took it from an MSDN example.
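
One thing I could still try (just a guess on my part, not confirmed here): creating the swap chain from the factory that the device actually belongs to, instead of a separately created factory. A sketch, assuming pDevice and sd are the ones from my code above:

```cpp
// Sketch: obtain the IDXGIFactory that owns the device and use it for the
// swap chain, so the device and swap chain come from the same factory.
#include <d3d11.h>
#include <dxgi.h>

IDXGIDevice*    pDXGIDevice  = nullptr;
IDXGIAdapter*   pDXGIAdapter = nullptr;
IDXGIFactory*   pFactory     = nullptr;
IDXGISwapChain* pSwapChain   = nullptr;

pDevice->QueryInterface(__uuidof(IDXGIDevice), (void**)&pDXGIDevice);
pDXGIDevice->GetAdapter(&pDXGIAdapter);
pDXGIAdapter->GetParent(__uuidof(IDXGIFactory), (void**)&pFactory);

HRESULT hr = pFactory->CreateSwapChain(pDevice, &sd, &pSwapChain);
```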
