radioteeth

Members
  • Content count

    529
  • Joined

  • Last visited

Community Reputation

1778 Excellent

About radioteeth

  • Rank
    Advanced Member

Personal Information

  1. Have you tried outputting csDir as the fragment color to see if it is in fact correct? I just managed to get SSR running in my game project a few weeks ago after some tumultuousness.
  2. These were all over the place when I was a kid, before the internet made everyone forget about them.
  3. I started out modding existing games back in the 90s, which gave me a pretty complete idea of what a game is comprised of.
  4. I'm trying to slap screen-space reflections into my engine, and in theory it should work fine, but I'm having trouble figuring out how to properly generate the ray vectors. I have the fragment surface normals in world-space, and I also have the vectors from camera to fragment in world-space. I can therefore generate the reflection vector itself using:

         reflect(camtofrag, fragnormal)

     However, I am trying to transform that into screen space properly for output to a framebuffer texture, which is then fed into a postfx shader to perform the actual raytracing with itself. My inclination was to do this:

         output.xyz = inverse(transpose(modelview)) * reflect(camtofrag, fragnormal);

     But the problem is that the reflection vectors shift around a lot when the camera rotates. Surfaces only appear to reflect properly if the camera is at a 45 degree angle to them. A shallower angle results in squished reflections at the edge of the surface where the reflected geometry is connected. Conversely, looking straight at the surface (allowing the shader to reflect whatever is on the outside edges, like a mirror), the reflection stretches deep 'into' the reflecting surface.

     Here's a youtube video of the behavior described (it may still be uploading at the moment): https://www.youtube.com/watch?v=G2w169gPro4

     Here's a set of images showing the reflection vector buffer, and it's clear that there's just too much gradation across surfaces, and it moves with the camera's rotation. It looks like there needs to be some kind of inverse projection applied so that it's "flatter" and not producing a fisheye sort of reflection: http://imgur.com/gallery/h9w3X

     I have a linearized depth buffer, so I figured I could calculate the direction of the screen-space reflection ray while rasterizing the geometry that will be reflective in the post-process render, and then in the post-process do the screen-space reflection on top of everything using the reflection normals and linear depth buffer, without having to do any more matrix transforms - just trace lines in XYZ, check against the depth buffer, and behave accordingly. I'm just using the UV coordinate of the fullscreen-quad fragment to sample the reflection vector texture, whose alpha channel contains the linearized depths, and then tracing a line along the reflection vector, checking the depths in the alpha channel along the way. As far as I can tell it would work just fine if my reflection vectors were correct - which they clearly aren't, judging by the 3 screenshots that show how drastically the normals change just by rotating the camera.

     Any ideas off the top of your heads? My goal is to keep this ultra simple and minimal (of course with all the edge fading and artifact mitigation stuff) without storing a bunch of textures to do it. It seems simple enough to just store the reflection vector as generated by the fragment shader of the reflective geometry itself - if I could just transform it properly. As I said, I have the exact right reflection vectors in world-space; I'm just having trouble transforming them into screen space.

     Thanks!
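     For reference, here is a minimal CPU-side sketch (written with GLM so it mirrors the GLSL) of one common way to take a world-space reflection direction into view space: rotate it by the 3x3 part of the view matrix and leave translation out. The function and variable names (viewMatrix, camPosWorld, fragPosWorld) are placeholders of mine, not taken from the shader above.

         #include <glm/glm.hpp>

         // Sketch only: transform a world-space reflection direction into view
         // space by applying just the rotation part of the view matrix.
         // A direction should not pick up the view matrix's translation.
         glm::vec3 viewSpaceReflection(const glm::mat4& viewMatrix,
                                       const glm::vec3& camPosWorld,
                                       const glm::vec3& fragPosWorld,
                                       const glm::vec3& fragNormalWorld)
         {
             glm::vec3 camToFrag = glm::normalize(fragPosWorld - camPosWorld);
             glm::vec3 reflWorld = glm::reflect(camToFrag, glm::normalize(fragNormalWorld));
             return glm::mat3(viewMatrix) * reflWorld; // rotation only, no translation
         }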
  5. They store the pointers in textures, as indexes to nodes.

     Check out how they store octrees here: http://on-demand.gputechconf.com/gtc/2012/presentations/SB134-Voxel-Cone-Tracing-Octree-Real-Time-Illumination.pdf
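     A rough sketch of the idea (not the exact layout from the linked presentation): keep the nodes in a flat pool, and let a "child pointer" be nothing more than an index into that pool. On the GPU the same pool lives in a texture or buffer, and the index addresses a texel instead of a memory location. The struct and field names below are my own.

         #include <cstdint>
         #include <vector>

         // Illustrative octree node: instead of a raw memory pointer, each node
         // stores the index of its first child within a flat node pool.
         struct OctreeNode {
             uint32_t firstChildIndex; // 0 means "no children" in this sketch
             uint32_t payload;         // e.g. packed RGBA brick/color data
         };

         std::vector<OctreeNode> nodePool; // uploaded to a GPU texture/buffer

         // Following a "pointer" is just an indexed fetch:
         //   OctreeNode child = nodePool[node.firstChildIndex + childSlot];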
  6. Use one texture with the R and G channels as your XY velocity, and the B channel as the density?

     As for writing to a specific place on the texture, you're going to want to render-to-texture, and in the pixel shader generate what you want for each pixel of that new texture. In this case I would double-buffer: one texture is what I read from in the shader to generate my output, and the output goes to a second texture of the same size. At the end of the frame I just swap their roles, so that the texture I was just writing (drawing) to is now the input texture and the texture I was reading from is now the destination.
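     A bare-bones sketch of that double-buffer ("ping-pong") setup, assuming an OpenGL context and loader are already in place; the PingPong struct and the drawFullscreenQuad callback are illustrative names of mine, not an existing API.

         // Assumes an OpenGL context and loader (e.g. glad) are already set up.
         #include <glad/glad.h>

         struct PingPong {
             GLuint tex[2];    // two textures of identical size (RG = velocity, B = density)
             GLuint fbo[2];    // one framebuffer per texture
             int    write = 0; // which texture is currently the render target
         };

         // One simulation step: read last frame's texture, render the updated
         // field into the other texture, then swap roles for the next frame.
         // 'drawFullscreenQuad' stands in for your own update-shader pass.
         void step(PingPong& pp, void (*drawFullscreenQuad)())
         {
             const int read = 1 - pp.write;
             glBindFramebuffer(GL_FRAMEBUFFER, pp.fbo[pp.write]); // destination
             glBindTexture(GL_TEXTURE_2D, pp.tex[read]);          // source, sampled in the shader
             drawFullscreenQuad();
             glBindFramebuffer(GL_FRAMEBUFFER, 0);
             pp.write = read; // ping-pong: the output becomes next frame's input
         }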
  7. It took me a little while to figure this one out as well. What's happening is that the other games are showing the actual measured network RTT, which is a separate thing from the game network update RTT.

     The update RTT is naturally going to be as large as 1000 ms divided by the update rate in Hz, and with different update rates on each end it will keep cycling in a sawtooth fashion as the update rates on both sides line up and fall out of alignment, along with network latency on top of that.

     The network RTT is just how long it takes for a packet to travel to and from the other side, outside of the game update packets being sent.
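     A quick back-of-the-envelope illustration of the difference; the 40 ms latency and the 20 Hz / 30 Hz send rates are made-up numbers, not taken from the thread.

         #include <cstdio>

         int main()
         {
             double networkRttMs = 40.0; // what a raw ping would measure
             double localSendHz  = 20.0; // client update rate (made up)
             double remoteSendHz = 30.0; // server update rate (made up)

             // Worst case, a reply waits up to one full send interval on each
             // side before going out, so the update RTT can exceed the network
             // RTT by up to the sum of both intervals.
             double worstUpdateRttMs = networkRttMs
                                     + 1000.0 / localSendHz
                                     + 1000.0 / remoteSendHz;

             std::printf("network RTT ~%.0f ms, update RTT up to ~%.1f ms\n",
                         networkRttMs, worstUpdateRttMs);
             return 0;
         }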
  8. Seems like you're going down the rabbit hole of over-engineering, instead of taking a step back and re-evaluating your entire approach.

     To my mind, the trick is doing the outer, lower-res detail first, and then progressively refining closer to the camera. If you're handling individual 'boxes' then something is wrong.
  9.   this...   Make sure you validate your function pointers!
  10. You can convert a quaternion to axis-angle representation by doing something like this:

          axis.angle = 2 * acos(quat.w) * (180 / 3.14159); // angle of rotation, converted from radians to degrees
          axis.x = quat.x;
          axis.y = quat.y;
          axis.z = quat.z;
          normalize(axis);

      Maybe that will help.
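      A slightly more careful version of the same conversion, as a standalone sketch; the AxisAngle struct, function name, and epsilon are my own and it guards against the near-identity case where the axis degenerates.

          #include <cmath>

          struct AxisAngle {
              float x, y, z;  // rotation axis (unit length)
              float angleRad; // rotation angle in radians
          };

          // Assumes the quaternion (w, x, y, z) is normalized.
          AxisAngle quatToAxisAngle(float w, float x, float y, float z)
          {
              AxisAngle out;
              out.angleRad = 2.0f * std::acos(w);
              // sin(angle/2) scales the axis; near identity it approaches zero
              // and the axis direction becomes meaningless, so fall back to +X.
              float s = std::sqrt(1.0f - w * w);
              if (s < 1e-6f) {
                  out.x = 1.0f; out.y = 0.0f; out.z = 0.0f;
              } else {
                  out.x = x / s; out.y = y / s; out.z = z / s;
              }
              return out;
          }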
  11. The CPU is a logic controller that reads assembly instructions from a program loaded in RAM, which dictate how program memory in RAM is manipulated, and also how input signals are interpreted and output signals are generated. In the case of modern OSes, the output signals are typically generated over a USB connection, which requires that the program tell the CPU how to interact with and accept the USB device, along with device drivers that handle the actual interpretation of hardware signals over USB and generalize them (and responses to them) via an abstraction called an API, or 'Application Programming Interface', which is just a collection of functions that generalize everything that could happen with whatever hardware is communicating via USB.

      However, it's been the same since the original serial and parallel ports that preceded the 'universal serial bus' of modern times.

      For graphics and audio, you *could* control such things over USB, serial, or parallel, but they have become their own core parts of the computer because they've been essential since the dawn of the personal computer, so they have their own means of communication. But you could just as well develop your own protocol for communicating with a robotic body via audio output or video output: it's all about the output signals generated and what they are intended to do on the recipient hardware.

      It's actually not that complicated once you understand how a CPU itself works.
  12. I'm going to take a stab in the dark and guess that the situation is that you have an arbitrary vector in space and you want to derive two perpendicular vectors from it?

      In that case, you're going to have to choose an axis that is 'most expendable', in that you're either going to completely avoid and ignore cases where your arbitrary normal is facing along that axis, or you're going to write special-case code to handle anything that's within a threshold (aka 'epsilon') of that axis.

      It looks like you're already doing this, in a roundabout way. The vector that you have, which is the 'arbitrary normal', will first be used to perform a cross product with a cardinal vector (like '1 0 0', '0 1 0', or '0 0 1', depending on which vector you want to 'avoid the most'). In the case of '0 1 0', which it appears your code is already dealing with, and which happens to be the 'vertical' vector in most graphics cases, the cross product of the vector and '0 1 0' will give you a vector completely perpendicular to whatever plane is formed by the vector and '0 1 0'.

      If you can imagine a vector pointing in any random direction (edit: the 'arbitrary normal vector' you already have), and a vector pointing straight up, these form two sides of a triangle if they are emanating from one point. The direction that triangle is facing is the result of their cross product. From here you would generate the 3rd vector by simply getting the cross product of your original arbitrary vector and the resulting vector of the first cross product. The 'cardinal' vector that you use in the first cross product is just a placeholder, and only has a bearing on your final outcome if your math precision is poor and the arbitrary vector is very close to the cardinal vector itself (edit: because it would form a very skinny triangle whose normal vector is hard to calculate precisely).

      This means that, in the case of using '0 1 0' as your cardinal vector, any arbitrary vector that is very close to it, or to its inverse ('0 -1 0'), will start to cause precision issues, but it would otherwise be fine for all other vectors where the X and Z values are dominant in the starting vector.

      This simple two-step algorithm will yield two vectors that are the 'horizontal' and 'vertical' vectors oriented with your initial arbitrary vector, depending on which cardinal vector you decide to use ('1 0 0', '0 1 0', or '0 0 1'). Typically, the vertical vector is used, because most applications involve 'things' that are facing in more horizontal directions than vertical.

      If I'm way off base and clearly have no idea what you're talking about, please let me know. Otherwise, I hope this helps.

      P.S. Don't forget to normalize the result of each cross product!
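      A compact sketch of the two-cross-product construction described above, using GLM for the vector math; the function name and the 0.99 threshold are my own choices, not from the thread.

          #include <glm/glm.hpp>
          #include <cmath>

          // Build two unit vectors perpendicular to 'n' (assumed normalized).
          // Uses '0 1 0' as the expendable cardinal axis, falling back to '1 0 0'
          // when 'n' is nearly parallel to it, to avoid a degenerate cross product.
          void buildBasis(const glm::vec3& n, glm::vec3& tangent, glm::vec3& bitangent)
          {
              glm::vec3 up(0.0f, 1.0f, 0.0f);
              if (std::fabs(glm::dot(n, up)) > 0.99f) // too close to the cardinal axis
                  up = glm::vec3(1.0f, 0.0f, 0.0f);

              tangent   = glm::normalize(glm::cross(n, up));      // first cross product
              bitangent = glm::normalize(glm::cross(n, tangent)); // second cross product
          }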
  13. Have you taken mass/acceleration into consideration? It sounds like you've completely ignored them and that incorporating them would dampen any 'explosive' issues.
  14. What's your hardware rendering config like? Do you have v-sync enabled?

      Have you tried profiling your code to find out what exactly is stalling for hundreds of ms at a time? This involves adding timing code to your project, where you can instantiate a timer object and then call functions to 'start' and 'stop' timing for that object, much like a stopwatch. EDIT: you would then create these timers for different pieces of your main update loop, so as to differentiate which parts of your code are spending more or less time each frame. At the end of a session you would then output the total execution time for all of the timing objects you instantiated. This should help you narrow down what part of your code is stalling.

      Are you (assuming C/C++) compiling in debug or release mode? What other software do you have running in the background? Have you restarted your computer recently? (Some people just sleep/wake their computer for months at a time, and it slows things down as memory fragments worse and worse.)

      I'd suggest running your project on another machine besides the one you are having problems on, to see if it's your project specifically or your machine itself. However, I think profiling your code will help you figure it out better than anything else.
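      A minimal example of the kind of stopwatch-style timer described above, using std::chrono; the class and method names are illustrative, not from any particular library.

          #include <chrono>
          #include <cstdio>

          // Accumulating stopwatch: call start()/stop() around a block of code
          // each frame, then report() at the end of the session.
          class ProfTimer {
          public:
              void start() { begin_ = std::chrono::steady_clock::now(); }
              void stop()
              {
                  auto end = std::chrono::steady_clock::now();
                  totalMs_ += std::chrono::duration<double, std::milli>(end - begin_).count();
                  ++samples_;
              }
              void report(const char* label) const
              {
                  std::printf("%s: %.2f ms total over %ld calls\n", label, totalMs_, samples_);
              }
          private:
              std::chrono::steady_clock::time_point begin_{};
              double totalMs_ = 0.0;
              long   samples_ = 0;
          };

          // Usage per frame:
          //   physicsTimer.start();  updatePhysics();  physicsTimer.stop();
          //   renderTimer.start();   renderFrame();    renderTimer.stop();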
  15. In my game (project) I have a master server, which is strictly for keeping track of the game servers themselves - which are operated by users/players. All I am doing is offering the game and a means for players to find each other's games. However, NAT punchthrough comes in when a player running a server behind a router would otherwise be invisible to other players' connection attempts.

      To perform the NAT punchthrough, I just have a player/client send a message to the master server that tells it which game server it's trying to connect to. Because the game server is already updating the master server as to its existence (aka a 'heartbeat' to keep itself listed), there is already a NAT tunnel open between the master server and the user's game server. Upon receipt of an 'attempting to connect to...' message from another player, the master server tells the game server what IP/port that player is trying to connect from, at which point the game server sends out a 'trailblazing' packet to that user's IP/port so as to open a path in its own router's NAT. That's what opens up the channel that allows the connecting player's 'connect' packet to get through and establish gameplay.

      If you're running all games on one master server, then there's no need for NAT punchthrough, unless your master is behind some kind of NAT, in which case you just need to set up port forwarding so that all players can connect to it. Otherwise, if for some reason you need clients to be able to intercommunicate, then yes, you're going to need to establish some means of relaying a sort of 'hey, I'm trying to connect with you' protocol across the master server itself between clients.
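      A schematic sketch of that punch-through message flow; the Endpoint struct, sendTo() stub, message tags, and addresses are all made up for illustration (a real implementation would use UDP sockets), and each numbered send actually happens on a different machine.

          #include <cstdint>
          #include <cstdio>
          #include <string>

          struct Endpoint { std::string ip; uint16_t port; };

          // Stub that just logs; stands in for an actual UDP send.
          void sendTo(const Endpoint& dst, const std::string& msg)
          {
              std::printf("-> %s:%u  %s\n", dst.ip.c_str(), (unsigned)dst.port, msg.c_str());
          }

          int main()
          {
              Endpoint master{"master.example.net", 27000};
              Endpoint serverPublic{"203.0.113.7", 27015};  // game server, as the master sees it
              Endpoint clientPublic{"198.51.100.4", 51620}; // client, as the master sees it

              // 1. Client asks the master to introduce it to the game server.
              sendTo(master, "CONNECT_REQUEST " + serverPublic.ip);

              // 2. Master relays the client's public endpoint to the game server
              //    over the NAT tunnel kept open by the server's heartbeats.
              sendTo(serverPublic, "INCOMING_CLIENT " + clientPublic.ip);

              // 3. Game server fires a 'trailblazing' packet at the client's
              //    endpoint, opening a hole in its own router's NAT.
              sendTo(clientPublic, "PUNCH");

              // 4. The client's normal 'connect' packet can now get through.
              sendTo(serverPublic, "CONNECT");
              return 0;
          }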