


Community Reputation

198 Neutral

About SmellyIrishMan

  1. [SharpDX] Unable to map a texture to staging buffer

    Clearly I had some twisted understanding of what I was trying to do. I finally got D3D debugging working (it had been refusing to work for me for a while) and it clearly said that my destination buffer was the wrong size given the source. Once the debugging information was coming out, it was pretty explicit about what I was doing wrong:
[code]
D3D11 ERROR: ID3D11DeviceContext::ResolveSubresource: The SrcSubresource and DstSubresource dimensions are not equal. SrcSubresource = { width:64, height:64, depth:1 }. DstSubresource = { width:32768, height:1, depth:1 }. [ RESOURCE_MANIPULATION ERROR #290: DEVICE_RESOLVESUBRESOURCE_DESTINATION_INVALID]
D3D11 ERROR: ID3D11DeviceContext::ResolveSubresource: The formats of each Resource are not compatible each other. Source Resource format is (0xa, R16G16B16A16_FLOAT). Destination Resource format is (0, UNKNOWN). [ RESOURCE_MANIPULATION ERROR #292: DEVICE_RESOLVESUBRESOURCE_SOURCE_INVALID]
D3D11 ERROR: ID3D11DeviceContext::ResolveSubresource: Both Resources must be the same Resource type (ie. Texture2D). [ RESOURCE_MANIPULATION ERROR #290: DEVICE_RESOLVESUBRESOURCE_DESTINATION_INVALID]
D3D11 ERROR: ID3D11DeviceContext::ResolveSubresource: Destination Resource must be a D3D11_USAGE_DEFAULT, without multisampling, and without the BIND_DEPTH_STENCIL flag. The Destination Resource has 1 samples. [ RESOURCE_MANIPULATION ERROR #290: DEVICE_RESOLVESUBRESOURCE_DESTINATION_INVALID]
[/code]
    I had just a simple buffer stream. What I should have been using was another Texture2D, set up for staging. Here is what the fix looks like once I've moved over to using a Texture2D.
[code]
Texture2DDescription localBufferDesc = new Texture2DDescription();
localBufferDesc.Width = 64;
localBufferDesc.Height = 64;
localBufferDesc.MipLevels = numOfMips;
localBufferDesc.ArraySize = 1;
localBufferDesc.Format = SharpDX.DXGI.Format.R16G16B16A16_Float;
localBufferDesc.SampleDescription.Count = 1;
localBufferDesc.SampleDescription.Quality = 0;
localBufferDesc.Usage = ResourceUsage.Staging;
localBufferDesc.BindFlags = BindFlags.None;
localBufferDesc.CpuAccessFlags = CpuAccessFlags.Read;
localBufferDesc.OptionFlags = ResourceOptionFlags.None;

SharpDX.Direct3D11.Texture2D localbuffer = new SharpDX.Direct3D11.Texture2D(device, localBufferDesc);
device.Copy(emptyTexture, localbuffer);

DataStream data = new DataStream(8 * 64 * 64, true, true);
DataBox box = ((DeviceContext)device).MapSubresource(localbuffer, 0, 0, MapMode.Read, MapFlags.None, out data);
quickTest = data.ReadHalf4();
[/code]
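For anyone hitting the same size mismatch: R16G16B16A16_FLOAT is 8 bytes per texel, so a 64x64 mip is 8 * 64 * 64 = 32768 bytes, which is exactly the DstSubresource width the debug layer printed above. A quick sketch of the arithmetic (plain Python rather than SharpDX; mip_chain_sizes is a made-up helper just for illustration):

```python
def mip_chain_sizes(width, height, bytes_per_texel, num_mips):
    """Byte size of each mip level, halving each dimension (min 1) per level."""
    sizes = []
    for _ in range(num_mips):
        sizes.append(width * height * bytes_per_texel)
        width = max(1, width // 2)
        height = max(1, height // 2)
    return sizes

# R16G16B16A16_FLOAT is 8 bytes per texel.
print(mip_chain_sizes(64, 64, 8, 3))  # [32768, 8192, 2048]
```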
  2. So I'm trying to create an empty texture with a few mip levels, write to that texture using a compute shader, then read those values back to the application. At the moment I have created the texture and my compute shader is running and filling the mips with solid colours. When I try to map the resource, though, I am not able to read anything.   Here is the basic setup.
[code]
Texture2DDescription textureDesc;
textureDesc.Width = 64;
textureDesc.Height = 64;
textureDesc.MipLevels = 1;
textureDesc.ArraySize = 1;
textureDesc.Format = SharpDX.DXGI.Format.R16G16B16A16_Float;
textureDesc.SampleDescription.Count = 1;
textureDesc.SampleDescription.Quality = 0;
textureDesc.Usage = ResourceUsage.Default;
textureDesc.BindFlags = BindFlags.UnorderedAccess | BindFlags.ShaderResource;
textureDesc.CpuAccessFlags = CpuAccessFlags.None;
textureDesc.OptionFlags = ResourceOptionFlags.None;
SharpDX.Direct3D11.Texture2D emptyTexture = new SharpDX.Direct3D11.Texture2D(device, textureDesc);

UnorderedAccessViewDescription uavDesc = new UnorderedAccessViewDescription();
uavDesc.Format = SharpDX.DXGI.Format.R16G16B16A16_Float;
uavDesc.Dimension = UnorderedAccessViewDimension.Texture2D;
uavDesc.Texture2D.MipSlice = 0;
UnorderedAccessView uavMip0 = new UnorderedAccessView(device, emptyTexture, uavDesc);

computeShader.SetParameterResource("gOutput", uavMip0);
computeShader.SetParameterValue("fillColour", new Vector4(0.1f, 0.2f, 0.3f, 1.0f));
computeShader.Apply();
device.Dispatch(1, 64, 1);

BufferDescription bufferDesc = new BufferDescription();
bufferDesc.Usage = ResourceUsage.Staging;
bufferDesc.BindFlags = BindFlags.None;
bufferDesc.SizeInBytes = 8 * 64 * 64;
bufferDesc.CpuAccessFlags = CpuAccessFlags.Read;
bufferDesc.StructureByteStride = 8;
bufferDesc.OptionFlags = ResourceOptionFlags.None;
SharpDX.Direct3D11.Buffer localbuffer = new SharpDX.Direct3D11.Buffer(device, bufferDesc);

device.Copy(emptyTexture, 0, localbuffer, 0, SharpDX.DXGI.Format.R16G16B16A16_Float);

DataStream data = new DataStream(8 * 64 * 64, true, true);
DataBox box = ((DeviceContext)device).MapSubresource(localbuffer, MapMode.Read, MapFlags.None, out data);
Half4 value = data.ReadHalf4();
[/code]
    So no matter how many values I read at the end, they're always (0, 0, 0, 0). I'm not really sure where the problem lies at the moment.
  3. SampleCmpLevelZero samples return 0 only

    And just to show that RenderDoc was trying to warn me of my downfall, here are some shots with the unset and set sampler. They've even gone and highlighted it in red for me.   I had noticed this earlier in my debugging, but I dismissed it too easily. I guess I didn't trust my tools. Lesson learned.   [attachment=26724:SamplerFixed.png] [attachment=26725:SamplerWarning.png]
  4. SampleCmpLevelZero samples return 0 only

    Amazing what some food can do. So after hours spent on this, I finally have my answer.   Don't create your samplers in the .fx file. That's dependent on the Effects library, and so it won't actually set the parameters that you ask for, just the default values.   So yeah, moving the sampler description into code and setting it there fixed everything right up.   For some similar issues you can look here, at least:   http://www.gamedev.net/topic/625963-samplecmplevelzero-just-returns-black/ http://www.gamedev.net/topic/633909-how-to-use-texture-samplers/
  5. Hello!   It's been a long day of trying to get shadow mapping implemented. It started off pretty well and I'm in a good spot, only this little texture sample is causing me some issues. I'm trying to use SampleCmpLevelZero to get some free PCF, but the function returns 0 no matter what I put into it.   My shadow map looks good, and if I do a simple check then everything works out pretty well. For example, this piece of code comes out with some good output and shows that there is depth information that makes sense in both the local pixel depth and the shadow map.
[code]
float depthFromLightToThisPixel = pIn.ShadowPosH.z;
float depthFromLightToClosestPixel = gShadowMap.Sample(sam, pIn.ShadowPosH.xy).r;
float depthDiff = abs(depthFromLightToThisPixel - depthFromLightToClosestPixel);
return float4(0.9f, 0.9f, 0.9f, 1.0f) * depthDiff;
[/code]
    This code, however, just returns 0. Fully black for all pixels. Even if I force depthFromLightToThisPixel to be 0, 1, or any other value.
[code]
float depthFromLightToThisPixel = pIn.ShadowPosH.z;
float shadow = gShadowMap.SampleCmpLevelZero(ShadowSampler, pIn.ShadowPosH.xy, depthFromLightToThisPixel);
return float4(0.9f, 0.9f, 0.9f, 1.0f) * shadow;
[/code]
    I've come across a few possible issues and I think I've narrowed it down to either:
    1. The sampler is incorrect.
    2. The texture format is incorrect.
    Here's the sampler (BorderColor is just for testing).
[code]
SamplerComparisonState ShadowSampler
{
    Filter = COMPARISON_MIN_MAG_LINEAR_MIP_POINT;
    AddressU = BORDER;
    AddressV = BORDER;
    AddressW = BORDER;
    BorderColor = float4(0.5f, 0.5f, 0.5f, 1.0f);
    ComparisonFunc = LESS_EQUAL;
};
[/code]
    And my textures are set up as follows:
    Texture2D: R24G8_Typeless
    DepthStencilView: D24_UNorm_S8_UInt
    ShaderResourceView: R24_UNorm_X8_Typeless
    Now, R24_UNorm_X8_Typeless should be OK to use with the comparison function, according to the bottom of the page here (https://msdn.microsoft.com/en-us/library/windows/desktop/ff476132%28v=vs.85%29.aspx). When I run the shader through RenderDoc I get a texture format of R24G8_Typeless, but I assume that's OK and the ShaderResourceView is still doing the read in its own format. I've attached a file showing the texture format in RenderDoc.
    Ah, I had a thought before I left that it might be related to mips, since that's the level the compare happens on. I should be generating just a single mip level for the texture. However, if I use SampleLevel in the first block of code, for example:
[code]
depthFromLightToClosestPixel = gShadowMap.SampleLevel(sam, 0, pIn.ShadowPosH.xy).r;
[/code]
    then I just get 1.0 returned. Solid white instead of solid black. Perhaps this is the root cause.
    I'm pretty stumped at the moment as to what's causing this. Hopefully I haven't overlooked anything or left out any information. Thanks for reading! Time for me to eat!
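Worth noting for anyone reading along: SampleCmpLevelZero doesn't return a depth value; it returns the comparison result of each tap (0 or 1), blended by the filter, which is where the "free PCF" comes from. A rough sketch of those semantics (plain Python, not HLSL; sample_cmp is a hypothetical stand-in, assuming a LESS_EQUAL comparison and a 2x2 linear filter footprint):

```python
def sample_cmp(shadow_map, u, v, reference_depth):
    """Approximate SampleCmpLevelZero with LESS_EQUAL and linear filtering:
    each of the four footprint texels contributes 1.0 if the reference
    depth passes the comparison, else 0.0, and the results are blended
    with the bilinear weights (free 2x2 PCF)."""
    h = len(shadow_map)
    w = len(shadow_map[0])
    # Texel-space coordinates of the 2x2 footprint.
    x = u * w - 0.5
    y = v * h - 0.5
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    total = 0.0
    for dy, wy in ((0, 1 - fy), (1, fy)):
        for dx, wx in ((0, 1 - fx), (1, fx)):
            tx = min(max(x0 + dx, 0), w - 1)  # clamp at the edges
            ty = min(max(y0 + dy, 0), h - 1)
            passed = 1.0 if reference_depth <= shadow_map[ty][tx] else 0.0
            total += wx * wy * passed
    return total

depth_map = [[0.4, 0.4], [0.9, 0.9]]
print(sample_cmp(depth_map, 0.5, 0.5, 0.6))  # 0.5: two taps pass, two fail
```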
  6. Yeah, that'll teach me to look at answers at 2am. You're right that it makes sense to have both the format and filters in sRGB.   Thanks for the help!
  7. Thanks very much for the function explanation; it's definitely a little bit confusing.   I might need to look into some more details, since applying just the filter didn't change things, but applying both the format and filter did.
[code]
imageLoadInfo.Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm_SRgb;
imageLoadInfo.Filter = SharpDX.Direct3D11.FilterFlags.SRgb | SharpDX.Direct3D11.FilterFlags.None;
[/code]
    Thanks MJP!
  8. Hey guys,   So I'm looking into gamma/linear correctness and all of that goodness, but I hit a bit of a hurdle tonight. I'm specifically using SharpDX at the moment, but that shouldn't have much of an impact on what I'm trying to accomplish.
[code]
SharpDX.Direct3D11.ImageLoadInformation imageLoadInfo = new SharpDX.Direct3D11.ImageLoadInformation();
imageLoadInfo.BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource;
imageLoadInfo.Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm_SRgb;

SharpDX.Direct3D11.Resource sRGBTexture = SharpDX.Direct3D11.Texture2D.FromFile(device, filepath, imageLoadInfo);
textureView = new SharpDX.Direct3D11.ShaderResourceView(device, sRGBTexture);
[/code]
    The problem is that whether I use R8G8B8A8_UNorm or R8G8B8A8_UNorm_SRgb, the sampler in the shader still reads the same values. If I manually adjust the values in the shader to account for gamma with pow(sampler.sample(), 2.2) then things are back on track again. Perhaps I'm misunderstanding things, but I thought that if I changed the format of the ResourceView then it should automatically apply (or not apply) gamma correction in the texture read.   For instance: https://msdn.microsoft.com/en-us/library/windows/desktop/hh972627(v=vs.85).aspx   Am I mistaken?
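For context on what an _SRgb format actually does on read: the hardware decode is the piecewise sRGB transfer curve, of which pow(x, 2.2) is only a close approximation. A small sketch of the exact curve (plain Python, illustrative only):

```python
def srgb_to_linear(c):
    """Exact sRGB decode (what reading through an R8G8B8A8_UNorm_SRgb
    view applies): linear segment near black, power curve elsewhere."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# The common pow(x, 2.2) shader shortcut is close but not identical:
for c in (0.02, 0.2, 0.5, 0.8):
    print(round(srgb_to_linear(c), 4), round(c ** 2.2, 4))
```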
  9. Hey Gamedev,   I'd like to create a reveal effect that slowly reveals a model in the scene. At the moment I have a very simple pixel shader that reveals the model over time, with a quick sin wave at the edge of the reveal for effect. The result is something like this... Now, there is not a lot of impact with just this, for a number of reasons. One thing that I would like to add is particles at the edge of the reveal. I'm trying to think of a number of ways to execute this, but I could really do with some help, since I'm sure this is something that has been done a million times before. This is basically what I would be aiming for. Please excuse the extremely enviable msPaint skills. I'm developing this for use in Unity, but I'm really new to it and don't really know exactly what's available to me. So if you have something that works specifically for Unity, that is fine, but I'm really looking for some general pointers on how to execute something like this. Any advice, pointers, links, or reference materials are all very much appreciated. Thanks for taking the time to read my post.
  10. How should I do environment reveal/texture blending?

    Cheers Rick, While I'm not a master I have a good grip, both theoretically and practically, on all of the methods you've mentioned. I guess I was just looking for some suggestions as to how I could approach the problem and your post has certainly pointed out some new things for me to think about so thanks for the post, it's very much appreciated!
  11. At the moment I am trying to research possible ways to go from a solid black screen and then slowly reveal/hide the world to the player. This is not a short fade-in at the end of a load; it would take place over the length of a "round", which could be anything from 1-20 minutes long. It would also need to be quite stylistic, allowing for highly creative reveals. Imagine a model transitioning from a wireframe into a fully textured model, only it doesn't just transition from bottom to top; instead, text is slowly written or paint-splatter marks are projected to reveal the texture of the model. I guess it's something along the lines of the multi-texture blending done with terrain, but I'm not positive; maybe masking? Anyway, at the moment I have 2 or 3 different ideas floating around in my head.
    1. UV map the environment and have a regular colour texture, but also a greyscale reveal texture (revealing the colour texture underneath); black areas would be visible immediately and then whiter areas would slowly get revealed as the game progressed. This could allow for all sorts of creative reveals. The problem I currently see with it is that I don't know if there would be discontinuity between different parts of the UV map. It also means a full extra texture for each environment, but I think this is reasonable, and the bit depth of the greyscale can be knocked down a little if need be.
    2. If the camera is static, then the reveal could be done in screen space, but I don't think this would give the most convincing effect overall. It would be good for quick screen transitions but not for a long reveal; perspective would be quite hard to deal with. Although I suppose if the camera was static then I could take the first approach to generate the reveal texture and then take a 2D snapshot of the environment with the reveal texture to generate a screen-space reference, which should save on memory but restricts us to a static camera.
    3. Do a geometry check for distance/intersection and then manipulate the edges of this using some reference image to match the concept style.
    I'm avoiding a straight-up programmatic approach for now, as an artistic approach would provide a much better feel. I'm trying to find material and reference to better illustrate what I'm talking about, but I'm having a hard time doing so. Something like the following, but without the bouncing/animated effects, just the spreading: [url="http://www.youtube.com/watch?v=uM9zNt8BJ-Q&feature=related"]Youtube[/url] Or picture Peter Parker being taken over by Venom, Predator going from visible to invisible, etc. If you have any references, papers, articles, videos, or know the official name of the technique I'm trying to describe, that would be great. I hope that I've done a reasonable job describing what I'm trying to achieve, so if you have any suggestions I would be glad to hear them.
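The first idea boils down to a per-pixel threshold test of the round's progress against the greyscale reveal texture. A minimal sketch of that logic (plain Python rather than shader code; reveal_factor and edge_width are illustrative names, not from any engine):

```python
def reveal_factor(mask_value, t, edge_width=0.1):
    """Reveal progress t sweeps 0..1 over the round; dark mask texels
    appear first, whiter ones later. A soft edge of edge_width gives a
    transition band where an effect (sin wave, paint splatter,
    particles) can be layered."""
    if t <= mask_value:
        return 0.0          # not yet revealed
    if t >= mask_value + edge_width:
        return 1.0          # fully revealed
    return (t - mask_value) / edge_width  # inside the transition band

print(reveal_factor(0.5, 0.4))   # 0.0  (too early for this texel)
print(reveal_factor(0.5, 0.55))  # mid-transition, ~0.5
print(reveal_factor(0.5, 0.7))   # 1.0  (fully revealed)
```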
  12. OpenGL Vertices, Normals and Indices

    Think about a cube also. A cube has 8 vertices, but there need to be 3 normals at each of those vertices (for a total of 24 normals) if you want to get accurate shading results. Have a look at this image for an example of that: [url="http://www.cuboslocos.com/images/OSG_CubeNormals.jpg"]Cube with proper normals.[/url] These additional normals are typically created when the angle between two normals on the same vertex breaks a certain threshold. If you had just one averaged normal at each vertex then you would get [url="http://www.eecs.berkeley.edu/%7Eug/slide/docs/slide/spec/images/normals_shared_gouraud.gif"]this[/url], and that's going to give you some funky shading.
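The threshold test is usually a dot product between the two face normals compared against the cosine of a crease angle. A small sketch (plain Python; should_split_normals is an illustrative helper, assuming unit-length face normals):

```python
import math

def should_split_normals(n1, n2, crease_angle_deg=30.0):
    """Two faces share a vertex: if the angle between their (unit) face
    normals exceeds the crease angle, give the vertex a separate normal
    per face instead of averaging. This is what turns a cube's 8
    positions into 24 position+normal vertices."""
    cos_angle = sum(a * b for a, b in zip(n1, n2))
    return cos_angle < math.cos(math.radians(crease_angle_deg))

# Two cube faces meet at 90 degrees -> split (hard edge):
print(should_split_normals((1, 0, 0), (0, 1, 0)))          # True
# Nearly coplanar faces -> average into one smooth normal:
print(should_split_normals((0, 0, 1), (0.05, 0, 0.9987)))  # False
```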
  13. Particle Effect

    [quote name='vicer1234' timestamp='1307397702' post='4820270'] how to give particle a blast effect 2d. eg: 10 particles and what sort of velocity should i give to make it appear as blast??? [/quote] You need to analyse the effect that you're looking to accomplish and see what type of velocity, positioning, etc. it displays. So taking your example of a blast: normally a blast comes from a very specific point, so the initial positions of the particles will be very close to one another and to the source of the blast. Then let's look at the velocity: unless the blast is from something very specific, an explosion typically has no specific direction and instead just spreads out in all directions, so you should distribute your particles' velocities over a unit circle. Have a look at this for example: [url="http://www.youtube.com/watch?v=_up622Gtsmk"]blast effect[/url] With particles (and most simulation) you need to study your target, figure out its general properties, and then try to match those.
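The advice above can be sketched in a few lines (plain Python; blast_particles and its jitter parameter are illustrative names, not from any particular engine):

```python
import math
import random

def blast_particles(origin, count, speed, jitter=0.2):
    """Spawn particles at (nearly) one point with velocities spread
    evenly around the unit circle. The even angular spread is what
    reads as an omnidirectional 2D blast; a little speed jitter keeps
    it from looking too mechanical."""
    particles = []
    for i in range(count):
        angle = 2.0 * math.pi * i / count                    # even spread
        s = speed * (1.0 + random.uniform(-jitter, jitter))  # vary magnitude
        velocity = (s * math.cos(angle), s * math.sin(angle))
        particles.append({"pos": origin, "vel": velocity})
    return particles

burst = blast_particles((0.0, 0.0), 10, 5.0)
print(len(burst), burst[0]["vel"])
```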
  14. How to converse tangential velocity

    So I got around to messing about with this again today and managed to come up with this:
[code]
originalVelocity = strand.Links[i].PhysicsEntity.linearVelocity;
normalisedVelocity = new Vector3(originalVelocity.X, originalVelocity.Y, originalVelocity.Z);
normalisedVelocity.Normalize();
strand.Links[i].PhysicsEntity.linearVelocity = originalVelocity - (contactNormal * Vector3.Dot(contactNormal, normalisedVelocity));
strand.Links[i].PhysicsEntity.linearVelocity *= 0.9f;
[/code]
    So I'm finding the component of the velocity along the contact normal and removing it, in order to keep just the tangential quantity. The simulation runs a whole lot smoother now anyway, so I'm happy.
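One caveat on the snippet above: because the dot product uses the normalised velocity, it only removes the full normal component when the speed happens to be 1. The textbook projection dots the contact normal with the raw velocity, v_t = v - n * dot(n, v). A tiny sketch (plain Python, illustrative):

```python
def tangential_velocity(v, n):
    """Project velocity v onto the plane with unit normal n:
    v_t = v - n * dot(n, v). This removes the full normal component,
    leaving only the sliding (tangential) part."""
    d = sum(a * b for a, b in zip(n, v))
    return tuple(a - b * d for a, b in zip(v, n))

v = (3.0, -4.0, 0.0)
n = (0.0, 1.0, 0.0)   # contact normal pointing out of the head
print(tangential_velocity(v, n))  # (3.0, 0.0, 0.0)
```

The result is always perpendicular to the contact normal, which is exactly the "slide along the surface" behaviour the strands need.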
  15. How to converse tangential velocity

    Hey, I'm trying to write a hair simulation program at the moment and have got stuck on a little math problem that I'm hoping someone can help me out with. I want the strands of hair to "slide" over the head, so I think I need to keep any tangential velocity of the strand when it collides with the head. At the moment I'm just moving the strand out of the collision and zeroing its velocity, as so:
[code]
p1 = strand.Links[i].Position;
//Move link out of head and zero movement.
p1Dash = p1 + (contactNormal * (contactDepth));
strand.Links[i].PhysicsEntity.centerPosition = p1Dash;
strand.Links[i].PhysicsEntity.linearVelocity = Vector3.Zero;
strand.Links[i].PhysicsEntity.linearMomentum = Vector3.Zero;
strand.Links[i].PhysicsEntity.angularVelocity = Vector3.Zero;
strand.Links[i].PhysicsEntity.angularMomentum = Vector3.Zero;
[/code]
    I figure I need to conserve the tangential velocity somehow, but I'm not sure how to go about this (how do I find the appropriate tangent, what do I cross, etc.). Here are some diagrams to illustrate the problem. http://img522.imageshack.us/img522/556/physicsquestion.png This picture is what I think I need: I need to send the colliding particle along the collision plane's tangent in order to have it move "around" the object. http://img515.imageshack.us/img515/7173/physicsquestionprograme.png These are some pictures of what I have at the moment. In the first slide you can see the bunching of the links as they are stopped solid. The second slide shows some progress, but it's very bumpy as the position of the link is simply pushed out each frame. Finally, the third slide shows the end result of just pushing out the links. Not bad, but it doesn't look good in motion. [Edited by - SmellyIrishMan on April 3, 2010 8:46:00 AM]