
frankoguy

Member
  • Content Count: 21
  • Joined
  • Last visited

Community Reputation: 135 Neutral

About frankoguy
  • Rank: Member

Personal Information
  • Interests: Design, DevOps, Programming

  1. I've been tweaking the bloom workflow (luminance calculation, writing thresholded values to a texture that is a fraction of the size of the main target, recombining the original target with the Gaussian-blurred luminance target, then tone-mapping, contrast reduction, and gamma correction to the final device) for a while now, and I've finally got it right. Tonemapping is not an exact science: there are many different tonemapping operators to choose from, and I've been experimenting with several workflows. Some are very complex and explicitly use White and Exposure variables; others are much simpler and still give a pleasing result. The operator I originally wanted to use works great--the explosions are HDR and very bright, with lots of definition--but the background (everything that's not an explosion) looks comparatively very dark, and varying the contrast reduction and gamma correction doesn't quite fix it. So this video shows a visually pleasing fallback operator that turned out to be a breakthrough: bright explosions, and a background that is still easy to see.
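    The post doesn't include the actual operator, so here is a minimal GLSL sketch of the kind of simpler, exposure-based tonemap-plus-gamma pass described above. The sampler and uniform names (u_hdrScene, u_bloomBlur, u_exposure, u_gamma) and the specific curve are assumptions for illustration, not the engine's real code.

        #version 440
        // Hypothetical post-process fragment shader: recombine the blurred bloom
        // target with the main HDR target, apply a simple exposure-based tonemap,
        // then gamma-correct. All names are illustrative only.
        in  vec2 v_texCoord;
        out vec4 o_color;

        uniform sampler2D u_hdrScene;   // full-resolution HDR color target
        uniform sampler2D u_bloomBlur;  // Gaussian-blurred, thresholded luminance target
        uniform float     u_exposure;   // e.g. 1.0
        uniform float     u_gamma;      // e.g. 2.2

        void main()
        {
            vec3 hdr = texture(u_hdrScene,  v_texCoord).rgb
                     + texture(u_bloomBlur, v_texCoord).rgb;      // recombine bloom
            vec3 mapped = vec3(1.0) - exp(-hdr * u_exposure);     // simple exposure tonemap
            o_color = vec4(pow(mapped, vec3(1.0 / u_gamma)), 1.0); // gamma correction
        }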
  2. Improved bloom, tonemapping, contrast reduction, and gamma correction.
  3. Improved lens flares and glow, and tweaked the High Dynamic Range lighting procedure.
  4. frankoguy

    My C++ Game Engine from scratch

    Hey jbadams, Thanks. I'll do that.
  5. frankoguy

    My C++ Game Engine from scratch

    Improvements: Glow, lens flares, tonemapping, HDR, and bloom.
  6. frankoguy

    My C++ Game Engine from scratch

    Many subtle improvements. Light ray occlusion fix: if the center of a light source is occluded by a foreground object, don't render its light rays; if the center is not occluded, render the rays on top of the opaque fragments. How to do it:
    1. Render the light rays into a separate render target. In addition to the standard color output, encode the light source's screen-space center coordinate into every light ray pixel of a second render target (every ray pixel for a given light stores the same encoded center): pos_light = g_mProj * g_mView * light_world; encoded_color = (pos_light.xyz / pos_light.w + 1.0) / 2.0.
    2. Do the same for the opaque fragments (space-ships and such) in another render target, or make sure you have access to the depth buffer.
    3. After rendering the standard scene into the final color buffer, composite the light ray texture in screen space: draw a full-viewport quad using normalized device coordinates (-1,1), (1,1), (1,-1), (-1,-1), and reuse those coordinates as texture coordinates via (coord + 1.0) / 2.0. The light ray color buffer and the encoded light-center buffer must be bound as textures.
    4. For every texture coordinate, look up the encoded screen-space light center, then use that encoded position as a texture coordinate into the encoded opaque-fragment position buffer (or the depth buffer) to fetch the opaque fragment at the light center's screen position.
    5. Compare depths: if the light center's depth is greater than the opaque fragment's depth at that position, discard the current light ray fragment, which effectively discards every ray fragment sharing this light center: if (decoded_light_ray_screenspace.z > opaque_fragment_pos.z) discard; (both values are already z/w).
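    A minimal GLSL sketch of the composite pass described in step 5 above. The sampler names and clear-value assumptions are mine, not the engine's actual interface; it assumes the light-center position was encoded as (pos.xyz/pos.w + 1)/2, and that the opaque position buffer uses the same encoding so the depth comparison stays consistent.

        #version 440
        // Hypothetical screen-space composite pass for the light ray occlusion test.
        // Sampler names are illustrative. The light ray buffer is assumed cleared
        // to zero and blended additively, so pixels with no ray contribute nothing.
        in  vec2 v_texCoord;
        out vec4 o_color;

        uniform sampler2D u_lightRayColor;   // light ray color buffer
        uniform sampler2D u_lightRayCenter;  // encoded light-center position per ray pixel
        uniform sampler2D u_opaquePos;       // encoded screen-space positions of opaque fragments

        void main()
        {
            // Decode the screen-space center of the light that produced this ray pixel.
            vec3 lightCenter = texture(u_lightRayCenter, v_texCoord).xyz; // (ndc + 1) / 2

            // Look up the opaque fragment at the light center's screen position.
            vec3 opaqueFrag = texture(u_opaquePos, lightCenter.xy).xyz;

            // If the light center lies behind the opaque geometry, the whole ray is occluded.
            if (lightCenter.z > opaqueFrag.z)
                discard;

            o_color = texture(u_lightRayColor, v_texCoord);
        }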
  7. frankoguy

    My C++ Game Engine from scratch

    Better cloaking transitions, improved light rays and lens flares. Upcoming: working on a way to guarantee that light rays render on top of occluding opaque fragments when the vector from the eye to the light source is unobstructed (TBD).
  8. frankoguy

    My C++ Game Engine from scratch

    Here it is
  9. Hello there. I'm not really the blogging type--this is my first ever blog, so I'll do my best. I've been trying to showcase my video game engine, written from scratch, in different professional forums with mixed results. I'm currently a happily employed 3D graphics programmer in the medical device field who also loves creating graphics programs as a side hobby. It's been my experience that most people who aren't graphics programmers simply don't appreciate how much learning goes into being able to code even a small fraction of this from scratch. Most viewers will compare it to the most amazing video game they've ever seen (developed by teams of engineers) and dismiss it without considering that I'm a one-man show. What I'm hoping to accomplish with this, I'm not totally sure. I spent a lot of my own personal time creating it from the ground up using only my own code (without downloading any existing make-my-game-for-me SDKs), so I figured it's worth showing off.

    My design:
    1. Octree for scene/game management (optimized collision detection and scene rendering path), from scratch in C++.
    2. All math (linear algebra, trig, quaternions, vectors), containers, sorting, and searching from scratch in C++.
    3. Sound system (DirectSound, DirectMusic, OpenAL) from scratch in C++.
    4. Latest OpenGL 4.0-and-above mechanism (via GLEW on win32/win64) with GLSL 4.4. Very heavy usage of GLSL.

    Unusual/skilled special effects and features captured in the video worth mentioning:
    1. Volumetric explosions via a vertex-shader-deformed sphere animated into a shock wave, further enhanced with bloom post-processing (via compute shader).
    2. Lens flare generator, which projects variable-edge polygon shapes along the screen-space vector from the center of the screen to the light position (size and number of flares based on the intensity and size of the light source).
    3. Real-time animated procedural light ray texture (via fragment shader), additively blended with the volumetric explosions.
    4. Active camouflage (aka Predator camouflage).
    5. Vibrating shield bubble (using the same sphere data as the volumetric explosion), accomplished with a technique very similar to the active camouflage.
    6. Exploding mesh: when I first started creating this, years ago, I was using the fixed-function pipeline. I used one vertex buffer, one optimized index buffer, and another special unoptimized index buffer that traces through all the geometry one volume box at a time, with each spaceship "piece" represented by a starting and ending index offset into that unoptimized index buffer. Unfortunately, the lower the poly resolution, the more obvious it is what I was doing: when a ship explodes you see triangle jaggies on the mesh edges. My engine is currently somewhat married to this design, which is why I haven't redesigned that part yet (it's on the list, but priorities). If I were to design it over again, I'd give each piece its own transform and decide per fragment whether the interpolated object-space position (input to the pixel shader) lies in front of or behind an arbitrary "breaking plane"; if the position is beyond the piece's breaking-plane boundaries, discard the fragment. That way I could keep one vertex buffer plus one optimized index buffer and get better-looking results with faster code.
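    The breaking-plane redesign in point 6 is only described in prose, so here is a minimal GLSL sketch of how such a per-fragment test could look. The plane representation (a point plus a normal per piece) and the uniform names are assumptions for illustration, not code from the engine.

        #version 440
        // Hypothetical fragment-shader test for the "breaking plane" idea above:
        // each spaceship piece shares the same geometry but discards fragments
        // on the far side of its assigned break plane. Names are illustrative.
        in  vec3 v_objectPos;             // interpolated object-space position
        in  vec2 v_texCoord;
        out vec4 o_color;

        uniform sampler2D u_diffuse;
        uniform vec3 u_breakPlanePoint;   // a point on this piece's breaking plane
        uniform vec3 u_breakPlaneNormal;  // plane normal pointing toward the kept side

        void main()
        {
            // Signed distance of this fragment from the breaking plane.
            float side = dot(v_objectPos - u_breakPlanePoint, u_breakPlaneNormal);

            // Fragments beyond the plane belong to the other piece: discard them.
            if (side < 0.0)
                discard;

            o_color = texture(u_diffuse, v_texCoord);
        }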
  10. Hi guys, I have a question you've probably heard/read a million billion times by now, but never quite put this way, I bet. From 2003 to the present I wrote two game engines in C++: one is Direct3D-based, and the other is multi-API/multi-platform (currently using OpenGL), but only the Direct3D one is completed. They're both definitely good enough to impress most programmers, even experienced graphics programmers, but not yet good enough to impress your average consumer (that bar is very, very high--no one would pay 60 bucks to see this... yet?). I'm not in the game programming industry, but I would like to get there now. I have 6 years of professional experience as a software developer, with game-engine- and graphics-related professional accomplishments and a large amount of game engine programming knowledge--not unlike many people here. I think I have a lot to offer. However, I could always be mistaken, of course--this could be a delusion. How could I best leverage my two "kickass" video game engines to land interviews in the game programming industry, or at least maximize my exposure and get noticed more? I appreciate any and all answers, even sarcastic, stupid ones--but please, I'm looking for intelligent, helpful answers. Currently I'm writing security code and working with ASP.NET, and I'm about to blow my brains out--kidding(?). Thanks guys, Franklin
  11. Wow, I solved my own problem; now I feel kind of stupid for asking. I just turned off driver instrumentation, and that solved the issue. Now all my OpenGL programs run correctly, and my Direct3D apps do as well.
  12. I have a very big problem. I just purchased a book called Beginning OpenGL Game Programming, by Dave Astle and Kevin Hawkins, and all the programs that come with the book crash on my laptop with the same error message box every time: "The instruction at 0x69666f60 referenced memory at 0x0019194c. The memory could not be written." The exact same error also happens with every OpenGL program I write. The crash always occurs either on the first glClear(myflags) call, or on the first SwapBuffers(hdc) if I skip the call to glClear(). I'm using WGL to write my OpenGL apps. All GLUT-based and wiggle-based OpenGL code crashes on my laptop: every single program that came with the book, and all of my own code as well. I DON'T GET IT!! I'm guessing it's some kind of graphics driver problem? I just don't know how to resolve it, or where to look. My laptop uses onboard NVIDIA video hardware (a 6800 equivalent). These kinds of runtime crashes never occur with any of the Direct3D applications I've written. I've been coding a very long time, so I'm confident it's not a logic mistake, but this is behavior I haven't seen before. My Windows Platform SDK allows OpenGL and WGL calls to be made; everything compiles and links without problems. Could it be that Microsoft is being evil and forcing the rest of us to use Direct3D? Haha... maybe? Or is it NVIDIA? Or both? Anyway, I'd deeply appreciate any help I can get on this. Thanks
  13. Jason, I re-edited my previous post to wrap the C++ and HLSL source code in the tags you indicated. I believe the code is much more readable now; if not, please let me know. Thanks.