
About godmodder

  • Rank: Advanced Member
  • Community Reputation: 828 Good


  1. It observes the current position and caches images around that point in concentric circles. I have set a manual threshold so that it caches only, e.g., 5 circles. Every time a significant change in position occurs, the cached area moves with it: some items will already be in the cache, while others have to be loaded or deleted.

     I see what you mean. The sleep() I put in the caching thread crudely limits the maximum amount of work the thread can do in a given time span, and it does indeed seem to help. I have also played with thread priorities a bit, but that does not seem to help. I always thought the main thread would take priority and simply push the caching thread into the background, but apparently that is not how it works. This puzzles me... isn't that the whole point of priorities?

     Only the caching thread reads JPEG files from disk, and reading takes only a fraction of the time compared to decoding. I don't understand why the caching thread doesn't simply yield when set to a lower priority. It is perfectly fine if it caches fewer images this way, but the rendering should never slow down because of it. Surely a mechanism for this exists?

     EDIT: I'm starting to think that thread priority only applies to raw CPU time and that threads compete equally for memory accesses. That sounds counter-intuitive to me, however.
  2. Hi, in my image-based renderer I want to implement a cache that prefetches images from a neighbourhood for rapid retrieval. I have a main render loop which has to run at 90 FPS (VR), and a caching thread which is supposed to run in the background. However, when I change position, new images are loaded by the cache thread, and this often spikes CPU usage to the point where my render thread suffers. This happens even though my render loop is not synchronised with the caching thread: if an image is not in the cache when it is supposed to be rendered, an empty one is returned for the current frame. Nonetheless, I can still observe stuttering when this happens; when I move slowly, it is fine. When I put a sleep(100) inside my caching thread, the stuttering does not occur, but of course the caching is also a lot slower. Any ideas why this happens and how I can solve it? My main theory is that the caching thread performs a lot of memory accesses to load the images, and since the memory bus is shared, other threads suffer as well. But how can I ever get around this?
  3. I guess it could, but rendering with an indirect context would be slow, no?
  4. OK, Samoth. It seems I will have to use Vulkan for this, then. Thank you for your help! If I figure out how to do it, I will post the solution here. EDIT: I just saw that my GTX 970 exposes the required extensions here. Maybe my drivers are more recent?
  5. "The data is generated on the GPU, so getting it to system memory would be too slow." OK, I am confused now. This is possible with VK_external_memory in Vulkan, right?
  6. I want to share texture data between different processes, which for legal reasons I cannot merge into one process. (It's stupid, but out of my control.) I know that in Vulkan I can do this with the VK_external_memory and VK_external_semaphore extensions, but I am not aware of how to do this in OpenGL. If I could interoperate between Vulkan and OpenGL, that would be fine, but I have only found the extension GL_NV_draw_vulkan_image. That means I can communicate from Vulkan to OpenGL, but not the other way around. The problem is that my two processes are heavily dependent on OpenGL, so porting them to Vulkan would be a pain.
  7. Hello, I want to produce a texture in one OpenGL process and consume it (read-only) in another OpenGL process. The texture should remain on the GPU at all times. Ideally, I would want to share the texture directly, but from what I've read this isn't possible. Another option would be to use NV_copy_image to copy the texture from one context to another while keeping it on the GPU. Would this work, or is something like this simply not possible? Thanks for helping me out!
  8. godmodder

    Motherboard-dependent code

    Ok, I was confused because I thought: how does it talk to the BIOS? Calling the BIOS requires 16-bit real mode, while the HAL runs in 32/64-bit protected mode at all times. It turns out they actually wrote a 16-bit emulator in HAL.dll to deal with this :D Though I still wonder whether they also handle the PIC/APIC stuff in the HAL. Are those things integrated into CPUs nowadays?
  9. Hi, lately I have become interested in the architecture of operating systems, so I was reading the book "Windows Internals", and I read some things about the hardware abstraction layer (HAL). I could not find an answer by googling the following question: does an OS contain motherboard-specific code? To clarify: I know that an OS is very much dependent on the CPU architecture, and I can imagine it running drivers for the motherboard chipset as well. But does the HAL really contain motherboard-specific code, even for motherboards that use the same chipset (e.g. Z97)? I would be really grateful if someone could clear this up.
  10. godmodder

    ANL Expression Parsing

    Nice library name ;)
  11. That's so true. These days game companies need a lot more people, and teams are orders of magnitude larger than in the old days. Only a small group of them will be true experts, but a lot of them can just be good. It has been my experience too that you can get a junior job these days without much expert experience. To give an example, none of my junior colleagues had even heard of template meta-programming :P
  12. godmodder

    Calling Functions With Pre-Set Arguments in Modern C++

    Not using STL for this in "modern C++" makes this example a bit contrived imo.
  13. godmodder

    How to Pitch Angry Birds, If It Were an Unknown Indie Game

    If you have a great game, pitching it should not pose a problem. If your game sucks on the other hand, now that's a pitching challenge.
  14. godmodder

    Stopping a negative viral campaign against you?

    Even if it were possible, do you even care to be respected by people who send you death threats over a Kickstarter project?
