About KarimIO

Personal Information

  • Role
    Creative Director
    Technical Artist

  1. @Wyrframe By your definition, any graph is a scene graph. That seems technically true, but I don't see how a "gang-boss" relationship is general enough to cover every object in the scene, which is what I understand the term to mean. I have implemented folders and levels, and will implement layers (or tags, or both), but I'm talking about hierarchies/graphs within a scene or level, and I don't think anything (except maybe layers) can be considered a scene graph. I do like the idea of using static/dynamic/movable to dictate usage, but there might still be cases where you don't want a static object to move with its parent, when the parenting was done purely for semantic reasons.

     @Sacaldur Interesting idea for the organizational nodes; I might go with it. EDIT: But isn't that made redundant by flattening static nodes?
  2. @Wyrframe I know the difference between a scene and a graph; I just figured most scene graphs were actually scene trees (I've never seen one that's an actual graph: Unity, Unreal, and I think Godot all use trees). They almost always just act as a transformation hierarchy. Can you point me to any scene graphs in an engine that don't act as such, so I can get what you mean?
  3. So I was debating whether or not to have a scene graph in my engine, because most people use it for organization rather than transformation, and I don't want to waste tons of matrix multiplications just because the level designer wanted to organize their scene. My solution is to have multiple scene graphs: one for organization and one for transformation. Does this make sense? If not, how do you handle scene graphs in your projects? EDIT: Also, if you flatten some of the structure (i.e. keeping the transformation hierarchy for easy editing, but flattening prior to runtime), how do you decide what to flatten?
  4. KarimIO

    STB Image

     It works great here; I've been using it for years.
  5. Hey guys! Three questions about uniform buffers:

     1) Is there a benefit to Vulkan and DirectX requiring you to declare which shader stage receives each constant/uniform buffer? In those APIs, and not in OpenGL, you must set which shader is going to take each buffer. Why is this? To allow more slots?

     2) I'm building a wrapper over these graphics APIs and was wondering how to handle passing parameters. I use my own JSON format to describe material formats and shader formats, in which I can declare which shaders get which uniform buffers. I was thinking of moving to ShaderLab (Unity's shader format) instead, as this would let people jump over easily enough and ease the learning curve. But as far as I can tell, ShaderLab does not support multiple uniform buffers at all, let alone declaring which parameters go where. To work around this, I was just going to send all uniform buffers to all shaders. Is that a big problem?

     3) Do you have any references on how to organize material uniform buffers? I may be optimizing too early, but I've seen people say what a toll this can take.
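On question 3, one common concern is std140 padding: a poorly ordered material block wastes space and splits fields across alignment boundaries. Here is a hedged sketch of computing std140 member offsets; the enum and function names are made up, and the rules shown cover only scalar floats and vectors (arrays and nested structs have extra rules).

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical member kinds with their std140 base alignment and size.
enum class Std140 { Float, Vec2, Vec3, Vec4 };

static size_t alignmentOf(Std140 t) {
    switch (t) {
        case Std140::Float: return 4;
        case Std140::Vec2:  return 8;
        default:            return 16;  // vec3 and vec4 both align to 16 bytes
    }
}

static size_t sizeOf(Std140 t) {
    switch (t) {
        case Std140::Float: return 4;
        case Std140::Vec2:  return 8;
        case Std140::Vec3:  return 12;  // occupies 12 bytes, but aligns to 16
        case Std140::Vec4:  return 16;
    }
    return 0;
}

// Compute each member's byte offset under std140 packing rules.
std::vector<size_t> std140Offsets(const std::vector<Std140>& members) {
    std::vector<size_t> offsets;
    size_t cursor = 0;
    for (Std140 m : members) {
        size_t a = alignmentOf(m);
        cursor = (cursor + a - 1) / a * a;   // round up to the member's alignment
        offsets.push_back(cursor);
        cursor += sizeOf(m);
    }
    return offsets;
}
```

Ordering members from largest to smallest alignment (vec4s first, lone floats last) generally minimizes the padding the layout rules insert.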
  6. Sorry for the late reply, but does this mean I can just make everything RGBA without negative consequences?
  7. Okay, thanks! I've built my own DDS exporter and importer, so I can easily modify it to support this; I was just confused because most of the channel-related flags are about uncompressed formats, according to MSDN. Also, I'll let the question about dxt_rgb stand, for curiosity's sake.
  8. Isn't that just for uncompressed data? Or for all of it?
  9. Hey guys, is there a difference in performance between GL_S3TC_DXT1_RGB and GL_S3TC_DXT1_RGBA? I want to use the latter in all cases if possible, because DDS files have no way to indicate whether there's an alpha bit or not.
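For what it's worth, the two variants are identical in memory: DXT1/BC1 always stores a 4x4 texel block in 8 bytes, and the RGBA variant merely reinterprets one of the two block encodings as 1-bit punch-through alpha, so there is no size difference and generally no performance difference to expect. A small sketch of the size math (`dxt1SizeBytes` is an illustrative name):

```cpp
#include <cassert>

// DXT1 (BC1) stores every 4x4 texel block in 8 bytes, with or without the
// 1-bit punch-through alpha, so the RGB and RGBA variants are the same size.
const unsigned kDxt1BlockBytes = 8;

unsigned dxt1SizeBytes(unsigned width, unsigned height) {
    unsigned blocksX = (width  + 3) / 4;   // round partial blocks up
    unsigned blocksY = (height + 3) / 4;
    return blocksX * blocksY * kDxt1BlockBytes;
}
```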
  10. But those issues can be mitigated by features like parallax cubemapping and SSAO. Not fully, to be sure, but when it's the high-frequency data that matters, I don't see how cramming in more low-frequency data can help all that much. And how would they even be combined? I skimmed through most of this, and it's quite interesting so far. That was my thought process for the most part, but I think many are able to combine lightmaps and SH quite well. Sounds interesting! I'll try this out, then. Do you have any articles about this technique?
  11. I was considering only using those kinds of approaches on smaller areas, if at all (RSM, VoxelGI, and the like would be supported in volumes), because they're so expensive. But I'll check out your link, thanks!
  12. Hey guys, are lightmaps still the best way to handle static diffuse irradiance, or is SH now used for both diffuse and specular irradiance? Also, do any modern games bake direct light into lightmaps, or is all direct lighting handled by shadow maps now? Finally, how is SH usually baked? Thanks!
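As background on the last question, SH probes are typically baked by projecting sampled radiance onto the SH basis functions and summing, weighted by each sample's solid angle. Below is a toy sketch assuming only first-order (4-coefficient) real SH and six axis-aligned samples; a real baker would ray trace many stratified or importance-sampled directions per probe, and every function name here is made up.

```cpp
#include <cassert>
#include <cmath>

// First-order (4-coefficient) real spherical harmonics basis.
void shBasisL1(float x, float y, float z, float out[4]) {
    out[0] = 0.282095f;        // Y_0^0  (constant band)
    out[1] = 0.488603f * y;    // Y_1^-1
    out[2] = 0.488603f * z;    // Y_1^0
    out[3] = 0.488603f * x;    // Y_1^1
}

// Project radiance samples taken along the six axis directions onto L1 SH.
// Each sample is weighted by its share of the sphere's 4*pi solid angle.
void shProjectAxes(const float radiance[6], float coeffs[4]) {
    static const float dirs[6][3] = {
        { 1, 0, 0}, {-1, 0, 0}, {0, 1, 0}, {0, -1, 0}, {0, 0, 1}, {0, 0, -1}};
    const float weight = 4.0f * 3.14159265f / 6.0f;
    for (int c = 0; c < 4; ++c) coeffs[c] = 0.0f;
    for (int i = 0; i < 6; ++i) {
        float b[4];
        shBasisL1(dirs[i][0], dirs[i][1], dirs[i][2], b);
        for (int c = 0; c < 4; ++c) coeffs[c] += radiance[i] * b[c] * weight;
    }
}

// Reconstruct radiance in a direction from the projected coefficients.
float shEval(const float coeffs[4], float x, float y, float z) {
    float b[4];
    shBasisL1(x, y, z, b);
    return coeffs[0]*b[0] + coeffs[1]*b[1] + coeffs[2]*b[2] + coeffs[3]*b[3];
}
```

Projecting a constant environment and evaluating in any direction should return roughly that constant, which is a handy sanity check for a baker.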
  13. Hey guys, so I was wondering how modern terrain and water geometry works, both with and without tessellation. Essentially: 1) Is geoclipmapping still the best CPU tessellation technique? 2) Is geoclipmapping still used with hardware tessellation? 3) Is non-tessellated water just flat? Are there any other (reasonable) ways to simulate it? Do people use geoclipmapping for that too? Thanks!
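One property worth keeping in mind about geometry clipmapping, relevant to question 1: every LOD ring reuses the same fixed-size vertex grid, with the grid spacing (and therefore the covered extent) doubling per level, so the per-frame vertex budget is independent of terrain size. A tiny sketch of that relationship, with assumed parameter names:

```cpp
#include <cassert>

// In geometry clipmapping each LOD ring uses the same (gridSize x gridSize)
// vertex grid, but the spacing between vertices doubles per level, so the
// world-space extent covered by a ring doubles too while the vertex count
// stays constant.
float clipmapExtent(float baseSpacing, int gridSize, int level) {
    float spacing = baseSpacing * float(1 << level);  // doubles each level
    return spacing * float(gridSize - 1);             // world-space width of the grid
}
```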
  14. KarimIO

    Copy Z-Buffer in DirectX

     To clarify for those finding this later (and so someone can correct me if I'm wrong): this is a good solution, but you cannot use it with the default depth buffer in OpenGL, because textures can't be attached to the default framebuffer. This can be solved by using an FBO in the middle and doing something that doesn't output depth, such as post-processing, at the end.
  15. KarimIO

    Copy Z-Buffer in DirectX

     Forgive me if this is marginally off-topic, but in OpenGL, is that done this way?

         glBindFramebuffer(GL_FRAMEBUFFER, fbo);
         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_depthTexture, 0);
         glBindFramebuffer(GL_FRAMEBUFFER, fbo2);
         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_depthTexture, 0);