
racarate

Member
  • Content Count: 46
  • Joined
  • Last visited

Community Reputation: 320 Neutral

About racarate

  • Rank: Member


  1. Thanks for the replies! I was looking more for advice on writing an in-game profiler and performance visualizer, though all those external tools do look nice.
  2. Hey everybody! I am trying to replicate all these cool on-screen debug visuals I see in all the SIGGRAPH and GDC talks, but I really don't know where to start. The only resource I know of is almost 16 years old: http://number-none.com/product/Interactive Profiling, Part 1/index.html Does anybody have a more up-to-date reference? Do people use minimal UI libraries like Dear ImGui? Also, if I am profiling OpenGL ES 3.0 (which doesn't have timer queries in core) is there really anything I can do to measure performance GPU-wise? Or should I just chart CPU-side frame time? I feel like this is something people re-invent for every game; there has gotta be a tutorial out there... right?
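    In case it helps the next reader, a minimal sketch of the CPU-side piece (illustrative, not from any particular talk): keep a ring buffer of recent frame times and draw it as a bar strip every frame; Dear ImGui's PlotLines or a handful of textured quads both work as the display end. On the GPU question, GL ES 3.0 core indeed lacks timer queries, but the EXT_disjoint_timer_query extension is widely implemented and worth probing for before falling back to CPU-only timing.

        // Ring buffer of recent frame times, the core of most in-game profilers.
        function FrameHistory(capacity) {
            this.samples = new Float32Array(capacity);
            this.head = 0;
            this.last = performance.now();
        }
        FrameHistory.prototype.tick = function () {
            var now = performance.now();
            this.samples[this.head] = now - this.last;  // whole-frame time in ms
            this.head = (this.head + 1) % this.samples.length;
            this.last = now;
        };
        // Each frame: history.tick(), then draw one bar per sample, scaled so
        // the 16.6 ms budget line sits at a fixed height on screen.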
  3. Hi! I am making an extremely simple (two mesh) visualization on Windows, and I was wondering if I could just call wglSwapIntervalEXT(1) and run everything as fast as possible. As long as I hit the 16 ms budget, I am guaranteed not to drop any frames, right?

    Related: I would like to measure my times like in X-Plane -- they have a statistic for CPU and GPU that gives millisecond counts. Is that as simple as timing my main game loop? Would my timer include the buffer swap? That would include the blocking wait, right? Also, I don't really even know where to begin with measuring my GPU milliseconds. Is there a function for that in OpenGL, or is it all driver-specific?

    Thanks, Nick
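    A sketch of the GPU side, hedged: a timestamp diff around your loop gives CPU milliseconds, and if the timed region includes the buffer swap it includes any vsync blocking. GPU milliseconds need a timer query; in desktop GL that is glBeginQuery/glEndQuery with GL_TIME_ELAPSED, and since the other code on this page is WebGL-flavored JavaScript, here is the WebGL 2 equivalent via the EXT_disjoint_timer_query_webgl2 extension (one query shown for brevity; real code rotates a small pool because results arrive frames later):

        var ext = gl.getExtension('EXT_disjoint_timer_query_webgl2');
        var query = gl.createQuery();

        function drawTimedFrame() {
            gl.beginQuery(ext.TIME_ELAPSED_EXT, query);
            // ... issue all draw calls for the frame ...
            gl.endQuery(ext.TIME_ELAPSED_EXT);
        }

        // Poll later; never stall waiting for the result.
        function pollGpuMs() {
            var ready = gl.getQueryParameter(query, gl.QUERY_RESULT_AVAILABLE);
            var disjoint = gl.getParameter(ext.GPU_DISJOINT_EXT);
            if (ready && !disjoint) {
                var ns = gl.getQueryParameter(query, gl.QUERY_RESULT);
                return ns / 1e6;  // nanoseconds -> milliseconds
            }
            return null;  // not ready yet, or timing was disrupted
        }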
  4. racarate

    Planet rendering issues

    Look into the multi-frustum technique used by Patrick Cozzi and the Cesium crew. Their book and presentations on terrain rendering are worth reading as well.
  5. I am trying to program an input mechanism wherein the player is standing inside -- but not necessarily at the center of -- a sphere whose interior is populated with video-textured quads. By clicking and dragging, the player may reposition those quads along the interior surface of the sphere. Note that there is no actual geometry for the sphere; rather, it is implied by the plastering of video-textured quads all over its interior.

    I tried so many ways to get the quads to move correctly... rotating about the sphere origin in objectspace, cameraspace, donkeyspace... My current approach is to do classic ray-sphere intersection based on where the player clicked, then snap the selected video-textured quad to that position. Now, I know the origin and radius of the sphere, and I should be able to figure out the origin and direction of the ray using this approach: http://antongerdelan.net/opengl/raycasting.html For collision, I am using the ray-sphere intersection code from Peter Shirley's "Ray Tracing in a Weekend" (not online). My results are off, though -- I am getting hits, but not where I expect -- and I don't know where I am going wrong.

    Hunches as to my mistakes:

    1. Anton's comments about not needing to unproject might be biting me... I have no idea why to use 0.5 for z...
    2. Maybe I should just build the ray myself without unprojecting -- I can use the camera FOV and near-plane distance to build a ray per pixel.
    3. I am probably mixing up my spaces... moving the camera seems to change my results.
    4. What sort of values should I be seeing for cameraspace clicks? I expected huge x and y values, but they seem to be constrained to [-1...1].
    5. Detecting ray-sphere intersection is different if you are inside the sphere (I am testing with an external sphere first).

    Thanks for any help! I feel like this should be raycasting 101!!!

    P.S. I am using threejs
    P.P.S. But I would like to understand this problem in a library-agnostic manner

        // via anton
        var screen_x = event.touches[0].clientX;
        var screen_y = event.touches[0].clientY;
        var ndc_x = screen_x / window.innerWidth;
        var ndc_y = screen_y / window.innerHeight;
        ndc_x = (2.0 * ndc_x) - 1.0;
        ndc_y = (2.0 * ndc_y) - 1.0;
        ndc_y *= -1.0;
        var clipspace_x = ndc_x;
        var clipspace_y = ndc_y;
        var clipspace_z = -1.0;
        var clipspace_pos = new THREE.Vector4(clipspace_x, clipspace_y, clipspace_z, 0.0);
        clipspace_pos.applyMatrix4(this.main_camera.matrixWorldInverse);
        clipspace_pos.z = -1.0;
        clipspace_pos.w = 0.0;
        var cameraspace_pos = clipspace_pos.clone();
        cameraspace_pos.normalize();
        var sphere_center = new THREE.Vector3(0, 0, -40);
        var sphere_radius = 10.0;
        var ray_origin = new THREE.Vector3(0, 0, 0);
        var ray_direction = cameraspace_pos.clone();
        // via peter shirley
        var center_to_origin = ray_origin.clone();
        center_to_origin.sub(sphere_center);
        var a = ray_direction.clone().dot(ray_direction);
        var b = 2.0 * center_to_origin.clone().dot(ray_direction);
        var c = center_to_origin.clone().dot(center_to_origin) - (sphere_radius * sphere_radius);
        var discriminant = b*b - 4*a*c;
        if (discriminant >= 0.0)
            console.log('********************** YOU HIT ME ****************************');
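    For anyone landing here later, a minimal sketch of the standard unproject path, assuming a THREE.PerspectiveCamera with up-to-date matrices. Note that it applies the inverse projection and then the camera's world matrix; the snippet above applies only matrixWorldInverse (the view matrix), which looks like the space mix-up from hunch 3. The hit test below also covers hunch 5: from inside the sphere the smaller root is negative, so it falls through to the larger one.

        // Build a world-space picking ray from a click/touch position.
        function buildPickRay(clientX, clientY, camera) {
            var ndc_x = (clientX / window.innerWidth) * 2.0 - 1.0;
            var ndc_y = -((clientY / window.innerHeight) * 2.0 - 1.0);

            // unproject() applies the inverse projection and then the camera's
            // world matrix: NDC -> camera space -> world space in one call.
            var near_point = new THREE.Vector3(ndc_x, ndc_y, -1.0).unproject(camera);

            var origin = camera.position.clone();
            var direction = near_point.sub(origin).normalize();
            return { origin: origin, direction: direction };
        }

        // Ray-sphere test returning the hit point, or null on a miss.
        function hitSphere(ray, center, radius) {
            var oc = ray.origin.clone().sub(center);
            var b = 2.0 * oc.dot(ray.direction);
            var c = oc.dot(oc) - radius * radius;
            var discriminant = b * b - 4.0 * c;  // a == 1: direction is normalized
            if (discriminant < 0.0) return null;
            var sq = Math.sqrt(discriminant);
            var t = (-b - sq) / 2.0;
            if (t < 0.0) t = (-b + sq) / 2.0;  // inside the sphere: take the far root
            if (t < 0.0) return null;          // sphere entirely behind the ray
            return ray.origin.clone().addScaledVector(ray.direction, t);
        }

        var ray = buildPickRay(event.touches[0].clientX, event.touches[0].clientY, main_camera);
        var hit = hitSphere(ray, new THREE.Vector3(0, 0, -40), 10.0);
        if (hit) console.log('hit at', hit);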
  6. racarate

    Is OpenCL slowly dying?

    Kind of random, but I came across this OpenCL example of flocking today... hope to try it out soon!   http://www.gameaipro.com/GameAIPro/GameAIPro_Chapter45_Introduction_to_GPGPU_for_AI.pdf
  7. I was writing a simple albedo-metallic-roughness demo this morning, and I noticed that when using a single point light, a pure metal doesn't show up at all except for its specular glints. I realize this is part of the currently popular theory -- that metals have zero diffuse response -- but it seems nutty to me that you would have completely black pixels wherever the metal surface doesn't line up with NdotH (or VdotR, depending on your specular formula). Am I only noticing this because I am using a single point light? Is this effect not a problem because most games use multiple lights? Or because they use ambient cubes? Or are materials just never set to pure metal?
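    A toy sketch of why those pixels go black, assuming the usual metallic-roughness convention where the diffuse color is albedo scaled by (1 - metallic); the function and names are illustrative, not from any particular engine:

        // Direct response to one point light, metallic-roughness style.
        // At metallic == 1 the diffuse term vanishes, so any pixel outside
        // the specular lobe receives nothing from this light at all.
        function shade(albedo, metallic, NdotL, specularLobe) {
            var kD = 1.0 - metallic;            // diffuse weight
            var diffuse = kD * albedo * NdotL;  // Lambert term
            return diffuse + specularLobe;      // specularLobe: e.g. a GGX value
        }
        shade(0.9, 1.0, 0.7, 0.0);  // -> 0.0: pure metal, away from the highlight

    With image-based lighting or several lights, some specular response reaches almost every pixel, which is presumably why the all-black look mostly shows up in single-point-light demos.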
  8. racarate

    Particles in idTech 666

    Yes, absolutely, that would be great. I am really baffled about the basics of particle lighting; I feel like I should probably start figuring out some of these offline tools.

    Nick
  9. This is a beginner question, but in this slideshow, when they talk about computing particle lighting... http://advances.realtimerendering.com/s2016/Siggraph2016_idTech6.pdf ...where do the particle normals come from? Are the particle normals just the plane normals of the grid? Or is there something else going on, like a normal map from Houdini, or maybe calculating a normal based on the particle being a sphere or the system being a cylinder? I'm still trying to wrap my head around the basics of particle lighting! Thanks, Nick
  10. racarate

    Particle Lighting 101

    Sorry, one more quick question. In the linked article they specify a "quick and dirty" normal map generation formula to simulate a spherical particle:

        // billboard_normal == -view_direction
        half3 n = lerp(billboard_normal, normalize(corner - center), curvature_amount);

    In this example, where would I get the center and corner of my particles? I am using OpenGL point sprites. I think they are suggesting to compute this lighting in the vertex shader, so I can't use my knowledge of the final rasterized size (via gl_PointSize) to figure this out. Any ideas?

    Nick
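    One way around the problem, sketched (my own workaround, not from the article): with point sprites the fragment shader gets gl_PointCoord in [0,1] across the sprite, so the spherical normal can be built per fragment and no per-vertex center/corner is needed at all. GLSL ES shown as a string, matching the WebGL-flavored code elsewhere on this page; curvature_amount is the same knob as in the formula above:

        var particleFrag = [
            'precision mediump float;',
            'uniform float curvature_amount;',
            'void main() {',
            '    // gl_PointCoord spans the sprite; remap to [-1,1].',
            '    vec2 p = gl_PointCoord * 2.0 - 1.0;',
            '    p.y = -p.y;  // gl_PointCoord starts at the upper left',
            '    float r2 = dot(p, p);',
            '    if (r2 > 1.0) discard;  // outside the unit disc',
            '    // View-space sphere normal; the billboard normal is +z here.',
            '    vec3 sphere_n = vec3(p, sqrt(1.0 - r2));',
            '    vec3 n = normalize(mix(vec3(0.0, 0.0, 1.0), sphere_n, curvature_amount));',
            '    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);  // visualize the normal',
            '}'
        ].join('\n');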
  11. racarate

    Particle Lighting 101

    Ok, I have a follow-up question. Most of what I've found so far is just making up spherical-ish normal maps for the particles. That seems odd to me, but I'm giving it a shot. In this paper, the normal mapping of particles (seems to) depend on reprojecting to the so-called HL2 basis:

    Practical Particle Lighting - Roxlu

    If I'm understanding basis projection correctly, he is only doing this as a way of summing up the scene's light. I could ignore this step and just do regular normal mapping by looping through my analytical lights or looping through my environment cubes, right? What is the point of this HL2 projection in this case?
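    For reference, a sketch of what the projection buys (the basis vectors are the commonly cited HL2 radiosity-normal-mapping ones; everything else is illustrative): the scene's light is summed into just three colors once, and afterwards any normal can be relit with three dot products, no light loop. With only a handful of analytic lights, looping per pixel works fine; the basis pays off when the light sum is expensive and you want to bake it once per particle or vertex.

        // Tangent-space HL2 basis: mutually orthogonal, equally tilted from +z.
        var HL2_BASIS = [
            new THREE.Vector3(-1 / Math.sqrt(6), -1 / Math.sqrt(2), 1 / Math.sqrt(3)),
            new THREE.Vector3(-1 / Math.sqrt(6),  1 / Math.sqrt(2), 1 / Math.sqrt(3)),
            new THREE.Vector3(Math.sqrt(2 / 3),   0.0,              1 / Math.sqrt(3))
        ];

        // Expensive step, done once: fold every light into three colors.
        // lights: [{ dir: THREE.Vector3 (tangent space), color: THREE.Vector3 }]
        function projectLights(lights) {
            return HL2_BASIS.map(function (b) {
                var sum = new THREE.Vector3();
                lights.forEach(function (l) {
                    sum.addScaledVector(l.color, Math.max(b.dot(l.dir), 0.0));
                });
                return sum;
            });
        }

        // Cheap step, per pixel/vertex: relight any normal with three dots.
        function relight(coeffs, n) {
            var out = new THREE.Vector3();
            for (var i = 0; i < 3; i++) {
                out.addScaledVector(coeffs[i], Math.max(HL2_BASIS[i].dot(n), 0.0));
            }
            return out;
        }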
  12. I've never used anything but self-lit particles... I'm going through this talk here, and there are a lot of references to lighting the particles to fit the scene: http://www.slideshare.net/guerrillagames/the-production-and-visual-fx-of-killzone-shadow-fall Also, later on they mention baking out the normals from their Houdini smoke sim. I'm curious if there is a simpler place to start with particle lighting. For example, what is the easiest possible way to make the lighting of particles fit a scene? Everything I find is either self-lit or way too advanced. This isn't for a particular project; I'm just trying to learn. Specifically, I think I am missing the intuition behind how a particle has a normal. Nick
  13. racarate

    Particle System Tools to Share

    This is an old but good paper; also check out the GDC talks about Infamous II and the free PopcornFX editor (similar): http://www.pixar.com/companyinfo/research/pbm2001/pdf/notesc.pdf
  14. racarate

    Link for Graphics Research Papers

    For terrain, I like Cozzi's book on virtual globes.  Also check out the original clipmap paper and the Microsoft GPU clipmap paper.
  15. Ha, I also wanted to ask the same question. We are all so curious! It seems more like material masking than material layering. I was curious about why you would choose this method, though. From the Allegorithmic post, they made it sound like it would only apply to AAA open-world games and not mobile. I have no idea why. Are there a ton of dependent texture lookups happening? Isn't this technique just the same thing as terrain splatting?

    Also, does anybody have any info on this comment from the Allegorithmic post? I'm not sure what exactly is meant by falloff here:

    "When using blended shaders like this on console, there is usually a texture input per material that drives how the material blends with other materials along with some controls for threshold and falloff."
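    My guess at what threshold and falloff mean there (names and shapes illustrative, not from the post): each material carries a blend-weight texture, threshold slides where the transition sits, and falloff widens or sharpens it, something like:

        // Blend weight for material B against material A. `mask` is B's
        // per-material blend texture sample; `threshold` moves the transition
        // point and `falloff` controls how soft the edge is.
        function blendWeight(mask, threshold, falloff) {
            var lo = threshold - falloff;
            var hi = threshold + falloff;
            var t = Math.min(Math.max((mask - lo) / Math.max(hi - lo, 1e-5), 0.0), 1.0);
            return t * t * (3.0 - 2.0 * t);  // smoothstep
        }
        // color = lerp(materialA, materialB, blendWeight(maskB, 0.5, 0.1));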