Jan K

Community Reputation: 616 (Good)

About Jan K

  1. Like it... it seems to be a very promising remake of the old Game Boy hit Kuru Kuru Kururin. I've already seen a few other remakes on the iPhone, but none was really good, so I am really looking forward to this one. You said something about a beta in the near future? How can I register for it?
  2. Detecting when libRocket GUI handles input

    One idea would be to insert a "background element" into your UI that stretches across the entire context, and attach an event listener to it that detects mouse-move events. If you inject a mouse move into Rocket and your listener on the background element does not receive it, then Rocket handled the event internally for some other element (the same goes for mouse clicks).

    You could then take the events that do reach the background element and feed them back into your engine (mouse moves, clicks, etc.).

    I haven't tried this; I've only just stumbled across Rocket and this particular issue, and I'd be interested in hearing whether it solves the problem, so let me know what you come up with.
  3. Ah, I didn't notice that! Thanks for the info, I'll try it tomorrow. Jan.
  4. Don't I need a D3D9 device for that?
  5. Hi, I'd like to add custom messages to the log that is created when using PIX with a debug context. I thought ID3D11InfoQueue would allow this, but it doesn't seem to work. D3D9 also had BeginEvent / EndEvent calls to group batches visibly in the log; I can't find anything similar in D3D11. Is there anything I'm missing here? Thanks, Jan.
  6. OK, I found it out. D3D does not use the same format as OpenGL for compressed volume textures; instead, each slice of the volume is simply compressed like a normal 2D texture, so no data rearrangement is necessary. The pitches are then as follows (for the DXT5/BC3 or DXT3/BC2 formats, which use 16 bytes per 4x4 block): D3D11_SUBRESOURCE_DATA data; data.SysMemPitch = 16 * (TexWidth / 4); data.SysMemSlicePitch = data.SysMemPitch * (TexHeight / 4); Maybe it will help someone in the future; I didn't find much on this topic on Google so far. Jan.
  7. Yeah, I tried all that; it doesn't work. The problem with 3D textures is that (in OpenGL's VTC layout) they are stored as 4x4x4 blocks, so one "line" would actually be a slab (or volume), and it is not intuitive how the "line" and "slice" are then to be interpreted. And since the D3D documentation doesn't mention SysMemPitch for compressed formats at all (not for 2D textures either), this is extremely confusing. Jan.
  8. Hi there. I am trying to load a 3D texture into D3D that is compressed using the DXT5/BC3 format. I have successfully loaded it into OpenGL, where the NV_vtc extension explicitly defines how to do this (see here: http://www.opengl.org/registry/specs/NV/texture_compression_vtc.txt), so I am positive that my data layout is correct. Since the whole 2D DXT stuff is identical in OpenGL and D3D, I assumed there is no difference with 3D textures either. Now I also want to be able to load it in D3D11, but all I get is a crash in ID3D11Device::CreateTexture3D or corrupted data (depending on whether I start it in debug mode or not). PIX already told me that I cannot use a DXT texture as a render target, so I fixed that, but otherwise it does not seem to mind the data. I assume the problem is that I really don't know how to fill out the D3D11_SUBRESOURCE_DATA structure: I have no idea what to put into SysMemPitch and SysMemSlicePitch when the data is a 3D texture in DXT5 format. Any help would be greatly appreciated. Jan.
  9. Hi, I have an editor to create levels for my 3D engine. I am mostly quite happy with it, but over time more and more functionality has been added, making it harder to maintain and making compile times quite long. So far everything is statically compiled in, the editor actually has to know each object type, and every editing operation has to be added to the editor itself. Now I was thinking that if I redesign the core of the editor, maybe I could make it more into a "sandbox", where the editor itself only provides general functionality, but one can then (dynamically) link modules into it, each of which implements one specific feature (for example, a mode to create objects of a certain type). As far as I know, programs like 3ds Max allow you to do that. Obviously this is a difficult task. I consider myself an experienced programmer, but the scale of such an undertaking is too big to "just try my best". Long story short: I am looking for some literature that might give me tips on how to implement such an editor. What interfaces do I need to expose, where do I draw the line, etc.? Any links, hints, or books are appreciated. If anyone has done such a thing before, I would be happy to hear about it. Thanks, Jan.
  10. Yeah, I have implemented a queue now, where the resources are destroyed with a one-second delay. It works, but it feels hacky.
  11. The thing is that the library that implements the allocating/storing/rendering is not meant to be used for high-performance rendering; its only purpose is to make it quick and easy to get something on screen. It implements a rendering style similar to OpenGL's immediate mode, so the object that encapsulates one rendering batch is destroyed immediately after rendering. Of course, for repeatedly rendering the same data, where speed matters, I wouldn't recreate resources all the time, but in this case it is really meant to be a quick and easy method for rendering, not a high-performance one, and I would like to keep it that way. Anyway, there doesn't seem to be a way to have D3D manage the buffer or tell me when it can be safely released. That's a pity; it somehow feels broken. Thanks, Jan.
  12. Hi, I have stumbled upon an annoying problem: for text rendering and some other things, I use an abstraction that creates vertex and index buffers on the fly, issues the draw call, and then destroys the buffers again. I have now found out that destroying a buffer immediately is a bad idea, because the result will be undefined (namely, I get rendering artifacts). The DirectX documentation specifically mentions this (for example here: http://msdn.microsoft.com/en-us/library/bb173588%28VS.85%29.aspx - "The method will not hold a reference to the interfaces passed in. For that reason, applications should be careful not to release an interface currently in use by the device."). So the question is: HOW do I properly destroy the buffer without worrying about whether it is still in use? I could defer the destruction by putting the buffer pointer into a queue and releasing all queued buffers with a delay of 2-3 frames, simply hoping that by then the GPU is indeed finished rendering with them. But this feels like a very hacky solution that's bound to come back to haunt me sooner rather than later. Is there any better, REAL solution to this problem? Thanks, Jan.
  13. That's great, thank you! So is the documentation incorrect, or did I misunderstand it? http://msdn.microsoft.com/en-us/library/ff476392%28VS.85%29.aspx Under Remarks it says: "Immutable, and depth-stencil resources cannot be used as a destination." That was what I was referring to. Thanks again, Jan.
  14. I usually only want to make an exact copy of a depth texture, so that I can take a snapshot of the current depth buffer and use it as an input texture for later render passes. I am a bit surprised that even an EXACT copy of depth-stencil textures is not supported. The question for me is then: how do I write a shader to copy depth-stencil textures (I am mostly interested in the depth values)? Would I bind the destination texture as a depth-stencil target, then sample the source texture and write the value as "depth"? Or should I bind the destination texture as a color render target and write the sampled depth value as "red" (ignoring depth output entirely)? In the latter case I would need to create a RenderTargetView for the destination texture, but which format would I use then: DXGI_FORMAT_R24_UNORM_X8_TYPELESS or DXGI_FORMAT_D24_UNORM_S8_UINT? This seemingly simple operation (in OpenGL) turns out to be a bit confusing for me. Thanks, Jan.
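For the first variant (binding the destination as a depth-stencil target and writing the sampled value out as depth), a full-screen pass could look roughly like this untested HLSL sketch. It assumes the source depth texture was created with the typeless DXGI_FORMAT_R24G8_TYPELESS format, so that a shader resource view with DXGI_FORMAT_R24_UNORM_X8_TYPELESS can be created over it, and that the pass is drawn with depth test ALWAYS and depth writes enabled:

```hlsl
// Sketch: copies depth by re-rasterizing a full-screen triangle into
// the destination depth-stencil target.
Texture2D<float> SrcDepth : register(t0);

float main(float4 pos : SV_Position) : SV_Depth
{
    // Load reads the 24-bit UNORM depth of the texel under the current
    // pixel; returning it via SV_Depth writes it into the destination.
    return SrcDepth.Load(int3(pos.xy, 0));
}
```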