

Community Reputation: 167 Neutral

About empodecles

  1. empodecles

    SharpDX and R32_Float Texture Format

    I switched over to a standard RGBA pixel format to test things out and make sure my pixel shader was doing what I envisioned, and it seems to be. I'm finding that the CopyResource to the CPU staging texture is having unexpected results, though, when I map the data into a float[], byte[], or Color[] array... still working on debugging that and seeing if it's something I did wrong. I haven't been able to figure out how to view the copied texture; I get an error when I try to render it or save it to file. But the CopyResource and the Map/Read of the data seem to go pretty quickly. I'm only working with a 512x512 texture at the moment (and that will probably be enough). I have to do a bunch of (pretty complicated...) physics processing on the data afterwards, though up to this stage getting the data "manually" was the bottleneck. The pixel shader seems to generate the data I need very fast! It's getting access to that data on the CPU side that I'm now working through.
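    The unexpected results when mapping a staging texture into a float[] are often a row-pitch problem: the mapped row pitch can be larger than width * sizeof(float), so copying the whole buffer in one block scrambles the rows. A minimal row-by-row sketch, assuming SharpDX's Direct3D11 API (`device`, `stagingTex`, `width`, and `height` are placeholder names, not from the post):

```csharp
// Map the staging copy for CPU reads.
var box = device.ImmediateContext.MapSubresource(
    stagingTex, 0, SharpDX.Direct3D11.MapMode.Read,
    SharpDX.Direct3D11.MapFlags.None);

var data = new float[height * width];
IntPtr rowPtr = box.DataPointer;
for (int y = 0; y < height; y++)
{
    // Copy exactly one row of floats, then advance by RowPitch bytes,
    // which may include padding beyond width * sizeof(float).
    System.Runtime.InteropServices.Marshal.Copy(rowPtr, data, y * width, width);
    rowPtr = IntPtr.Add(rowPtr, box.RowPitch);
}

device.ImmediateContext.UnmapSubresource(stagingTex, 0);
```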
  2. empodecles

    SharpDX and R32_Float Texture Format

    Rendering the texture to the screen for debugging purposes was simple with a pixel format of R8G8B8A8... I was hoping to do it with R32_Float. But your second point was the one I ultimately needed! I need the data from the shaders in a float[] for other calculations. Eventually I will try tackling compute shaders, but I'm on a time crunch and want to stick with what I know right now! Thanks for the pointer on copying the resource for CPU access; I did that before with something else and completely forgot about that method. Cheers, P
  3. Hello, I am creating an R32_Float texture/render target which I am writing to in a shader. For debugging purposes I was hoping to render this texture to the screen, but I can't seem to find any method to do this. I tried creating a Direct2D bitmap, but I get a "Pixel Format Not Supported" exception. The end result is that I want to be able to read from the texture on the CPU side after it has been written on the GPU side. Thanks, P.

     (texture description creation)

     var desc = new SharpDX.Direct3D11.Texture2DDescription()
     {
         ArraySize = 1,
         BindFlags = SharpDX.Direct3D11.BindFlags.RenderTarget | SharpDX.Direct3D11.BindFlags.ShaderResource,
         CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
         Format = SharpDX.DXGI.Format.R32_Float,
         Width = width,
         Height = height,
         MipLevels = 1,
         OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None,
         SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
         Usage = SharpDX.Direct3D11.ResourceUsage.Default
     };
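     A common way to get the texture data back to the CPU is a second texture with Usage = Staging and CpuAccessFlags = Read, filled via CopyResource. A sketch under those assumptions, mirroring the description above (`device` and `renderTex` are placeholder names):

```csharp
// Sketch: a CPU-readable staging copy of the R32_Float render target.
// Staging textures cannot have any bind flags; only CPU access flags.
var stagingDesc = new SharpDX.Direct3D11.Texture2DDescription()
{
    ArraySize = 1,
    BindFlags = SharpDX.Direct3D11.BindFlags.None,
    CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.Read,
    Format = SharpDX.DXGI.Format.R32_Float,
    Width = width,
    Height = height,
    MipLevels = 1,
    OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None,
    SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
    Usage = SharpDX.Direct3D11.ResourceUsage.Staging
};
var staging = new SharpDX.Direct3D11.Texture2D(device, stagingDesc);

// GPU-to-GPU copy; the staging texture can then be mapped for reading.
device.ImmediateContext.CopyResource(renderTex, staging);
```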
  4. Cool thanks! I will give that a shot. :)
  5. I just started learning to use SharpDX instead of XNA. The SharpDX forums have been closed down, so I'm asking over here; hopefully someone will point me in the right direction. I am trying to create a TextureCube at run time based on the current camera position (this would only happen occasionally, when a specific "dirty" flag has been set) to be used for environment mapping on reflective surfaces. It was easy and straightforward to do in XNA. Just to be clear: I don't want to load the texture from a file! The faces will be coming from render targets at run time. Pretty much all the examples I have seen to do with TextureCubes show loading one from a .dds file, and that won't work for me. I have six separate Texture2D objects created (one for each view axis); now I just need to combine them into a TextureCube / Texture2D array so that I can use them in my shader (I already have the shader working when I load the TextureCube from a static .dds file, but that is not what I want). Cheers, P
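     One way to assemble the cube at run time: create a Texture2D with ArraySize = 6 and the TextureCube option flag, then copy each face texture into its array slice. A sketch assuming SharpDX's Direct3D11 API (`device`, `faceSize`, and `faceTextures` are placeholder names; the six face textures must share size and format):

```csharp
// Sketch: building a cube map from six per-face render-target textures.
var cubeDesc = new SharpDX.Direct3D11.Texture2DDescription()
{
    ArraySize = 6,  // one array slice per cube face (+X, -X, +Y, -Y, +Z, -Z)
    BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource,
    CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
    Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
    Width = faceSize,
    Height = faceSize,
    MipLevels = 1,
    OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.TextureCube,
    SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
    Usage = SharpDX.Direct3D11.ResourceUsage.Default
};
var cubeTex = new SharpDX.Direct3D11.Texture2D(device, cubeDesc);

for (int face = 0; face < 6; face++)
{
    // Destination subresource is mip 0 of array slice `face`.
    int dst = SharpDX.Direct3D11.Resource.CalculateSubResourceIndex(0, face, 1);
    device.ImmediateContext.CopySubresourceRegion(
        faceTextures[face], 0, null, cubeTex, dst);
}

// The resulting view can be bound where the .dds-loaded cube was used.
var cubeSrv = new SharpDX.Direct3D11.ShaderResourceView(device, cubeTex);
```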
