

Radikalizm

Member Since 05 May 2011

#5198498 Current-Gen Lighting

Posted by Radikalizm on 16 December 2014 - 04:10 AM


I also like the FilmicGames blog by John Hable.

 

As far as I know he stopped posting on that blog; his new page is at http://www.filmicworlds.com/




#5194319 Conservation Factor for Epic’s Shading Model

Posted by Radikalizm on 23 November 2014 - 03:18 PM


Is it correct to use just the Fresnel (from the light’s point of view, hence L•N, not V•N) or should it include the distribution/geometric terms as well? I tend to think these are typically ignored for performance, but for the moment I am more interested in accuracy.

 

As far as I understand it the diffuse term should indeed take surface roughness into account in some way. Applying the 1.0 - F trick is to account for the trade-off between diffuse and specular at glancing viewing angles, but is not a completely accurate approach.
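To make the "1.0 - F" trick concrete, here's a toy Python sketch of Schlick's Fresnel approximation and the resulting diffuse trade-off. The function names are mine, and a real shader would pick the appropriate cosine term (N·L, N·V or L·H) for its convention; this is just the shape of the idea, not any engine's actual code:

```python
def fresnel_schlick(f0: float, cos_theta: float) -> float:
    """Schlick's approximation: F = F0 + (1 - F0) * (1 - cos_theta)^5."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def diffuse_weight(f0: float, cos_theta: float) -> float:
    """The '1.0 - F' trade-off: whatever energy is reflected specularly
    is subtracted from the diffuse term. Note this ignores roughness,
    which is exactly why it's only an approximation."""
    return 1.0 - fresnel_schlick(f0, cos_theta)
```

At normal incidence (`cos_theta = 1.0`) this leaves almost all energy for diffuse with a dielectric F0 of 0.04, while at grazing angles (`cos_theta → 0.0`) the diffuse weight drops to zero.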

 

Naty Hoffman has provided some references for this in his "Physics and Math of Shading" SIGGRAPH course notes: http://blog.selfshadow.com/publications/s2013-shading-course/hoffman/s2013_pbs_physics_math_notes.pdf

 

Have a look at the bottom of page 20.

 

EDIT:

 

It's also always possible to just have a look at the shader code provided in UE4. I have the source lying around here somewhere, but haven't looked into their shaders too much yet.




#5194229 Book on Physics-Based Rendering

Posted by Radikalizm on 23 November 2014 - 12:24 AM


Do you know of any books on real-time ray tracing that are helpful?

 

When we're talking about OpenGL, we're in the realm of rasterization, not ray tracing. Certain aspects of ray tracing are starting to become common in traditional rasterization-based renderers (screen-space reflections are a good example), but even though real-time ray tracers do exist, they're generally still quite slow and (as far as I know) not viable for applications such as games.




#5194170 Book on Physics-Based Rendering

Posted by Radikalizm on 22 November 2014 - 02:59 PM

This book is a very good read for anyone interested in designing and programming physically based rendering systems. It will walk you through the theory behind various reflectance models, radiometry, camera systems, various light sources, scattering and light transport in general.

 

It is not however geared towards real-time implementations, but actually walks you through the implementation of a ray tracing system. If you are looking for a book which provides you with OpenGL sample code or with everything explained in relation to OpenGL this is not the right book for you.

 

If you want to get a deep understanding of the theory behind physically based rendering and if you feel like you'd be able to distill and optimize a working real-time implementation out of this material then by all means go for it!




#5187686 Integrating Image Based Lighting

Posted by Radikalizm on 17 October 2014 - 11:54 AM

In an offline quality renderer, you'd sample every pixel in the environment map, treating them as a little directional light (using your full BRDF function), which is basically integrating the irradiance.
This is extremely slow, but correct... Even in an offline renderer, you'd optimise this by using importance sampling to skip most pixels in the environment map.

For a realtime renderer, you can 'prefilter' your environment maps, where you perform the above calculations ahead of time. Unfortunately the inputs to the above are, at a minimum, the surface normal, the view direction, the surface roughness and the spec-mask/colour... That's 4 input variables (some of which are multidimensional), which makes for an impractically huge lookup table.
So when prefiltering, you typically make the approximation that the view direction is the same as the surface normal and the spec-colour is white, leaving you with just the surface normal and roughness.
In your new cube-map, the pixel location corresponds to the surface normal and the mip-level corresponds to the roughness. For every pixel in every mip of this new cube-map, sample all/lots of the pixels in the original cube-map * your BRDF (using the normal/roughness corresponding to that output pixels position).

 

To add to this, there was a quite nice explanation given at SIGGRAPH 2013 on how they filter their cubemaps in Unreal 4. They split the problem into two parts and solve each part separately in different lookup textures. The first part is an actual prefiltered environment map using importance sampling as Hodgman describes; the second part is a lookup table containing your environment BRDF, expressed in terms of your surface roughness and viewing angle.

 

The only real effort involved in implementing this is building your tools to generate the environment map and LUT. Once this is done it will be a breeze to implement the evaluation of these in your shader.

 

Slides can be found here, course notes here
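For what the shader-side evaluation of this split-sum approach ends up looking like, here's a rough Python sketch. The prefiltered colour and the two LUT channels (often called A and B) are assumed to come from the textures you baked; the function names are mine:

```python
def roughness_to_mip(roughness: float, num_mips: int) -> float:
    """Simple mapping from roughness to a mip level of the prefiltered
    cubemap (real engines often use a slightly non-linear mapping)."""
    return roughness * (num_mips - 1)

def split_sum_specular(prefiltered_rgb, f0_rgb, env_brdf_a, env_brdf_b):
    """Combine step: specular = prefiltered * (F0 * A + B), where (A, B)
    are the two channels of the environment-BRDF LUT sampled at
    (N.V, roughness)."""
    return tuple(p * (f * env_brdf_a + env_brdf_b)
                 for p, f in zip(prefiltered_rgb, f0_rgb))
```

The nice property is that the expensive integration happens offline; at runtime you're down to one cubemap fetch and one 2D LUT fetch.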




#5187533 DLL-Based Plugins

Posted by Radikalizm on 16 October 2014 - 06:44 PM

I'm seconding ApochPiQ on this one. I've done DLL-based plugin systems before, and at work I've dealt with a lot of projects involving multiple DLLs at once; there are just a ton of ways things can end up in a messed-up state.

 

The things mentioned above are definitely some of the worst aspects of them; in particular, keeping memory ownership within DLL boundaries can be very non-trivial.

 

Keeping DLL plugins compatible with newer revisions of your codebase can be a huge pain as well.




#5152748 recreate frostbite destruction physics

Posted by Radikalizm on 10 May 2014 - 02:58 PM

There are probably multiple systems at multiple levels of detail at work here. There's a presentation on how they did smaller-scale destruction in their Frostbite 2 engine using a technique they call "destruction masking using volume distance fields"; they're probably also using this to some extent in Frostbite 3.

 

The slides for this can be found here: http://www.slideshare.net/DICEStudio/siggraph10-arrdestruction-maskinginfrostbite2
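The core idea of that technique can be illustrated in a few lines: you store (or accumulate) signed distance fields for the destroyed regions, and a surface point is masked out when it lies inside one of them. This is a deliberately simplified Python sketch of the concept only; the actual DICE implementation works on volume textures on the GPU:

```python
import math

def sphere_sdf(p, center, radius):
    """Signed distance from point p to a sphere: negative means inside."""
    return math.dist(p, center) - radius

def is_destroyed(p, sdf_spheres):
    """A point is masked out (destroyed) if it's inside any damage volume.
    sdf_spheres is a list of (center, radius) pairs."""
    return any(sphere_sdf(p, c, r) < 0.0 for c, r in sdf_spheres)
```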




#5149306 Screenshot of your biggest success/ tech demo

Posted by Radikalizm on 25 April 2014 - 12:01 AM

Can't really make any claims about this being my "biggest success" since the game isn't released yet (2 more weeks!), but I (re)wrote the rendering back-end for this and built the largest part of the lighting system, the terrain rendering system and the water rendering system.

Some more shots here and here




#5104760 Abstracting away DirectX, OpenGL, etc.

Posted by Radikalizm on 27 October 2013 - 07:44 AM

I think it's important at this stage to decide if supporting OpenGL in the future is something you actually want to do.

 

A lot of people start out with the idea of supporting OpenGL (or DirectX if they start out with OpenGL) at a later point in time in their rendering engines; this results in them trying to write an abstraction layer on top of their original graphics API which doesn't really do the underlying API any justice.

 

Designing a proper wrapper exposing the functionality of both APIs to its fullest extent is HARD. It requires a complete understanding of both APIs, their designs and their quirks. On top of that, you'll have to design a system which unifies these sometimes completely different beasts into a single API without introducing too much overhead and without sacrificing functionality.
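To give a feel for what such an abstraction layer looks like at its very smallest, here's a hypothetical Python sketch (the class and method names are mine, and a real wrapper would need dozens more entry points for state objects, resource views, shader reflection, and so on; that surface area is exactly where the difficulty lives):

```python
from abc import ABC, abstractmethod

class GraphicsDevice(ABC):
    """Hypothetical minimal rendering abstraction."""
    @abstractmethod
    def create_buffer(self, size_bytes: int) -> int:
        """Allocate a GPU buffer; returns an opaque handle."""
    @abstractmethod
    def draw(self, vertex_count: int) -> None:
        """Issue a non-indexed draw call."""

class D3D11Device(GraphicsDevice):
    """Stand-in backend; a real one would wrap the native API calls."""
    def __init__(self):
        self.buffers = []
    def create_buffer(self, size_bytes: int) -> int:
        self.buffers.append(bytearray(size_bytes))
        return len(self.buffers) - 1
    def draw(self, vertex_count: int) -> None:
        pass  # would translate to ID3D11DeviceContext::Draw
```

Every method you add here has to map cleanly onto both D3D and GL semantics at once, which is where the "complete understanding of both APIs" requirement comes from.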

 

If you really do want to support both you could always check out some third party utility libraries for doing vector math and such, I'm sure there'll be plenty of those out there.

 

If support for multiple rendering APIs isn't an absolute must, I just wouldn't worry about it and would work with the tools provided by SharpDX; no need to put a lot of effort into something which eventually won't have that much of an impact on your end result (i.e. your game).




#5104403 Deferred Rendering Questions

Posted by Radikalizm on 25 October 2013 - 11:38 AM

Deferred or immediate contexts have nothing to do with deferred rendering. Deferred contexts should be used when your application is designed to issue rendering commands from multiple threads.

 

A simple Google search should provide you with tons of good deferred rendering tutorials and explanations; it's a pretty common beginner's subject :)




#5102930 Flat Projection matrix

Posted by Radikalizm on 20 October 2013 - 02:08 PM

You're probably looking for an orthographic projection matrix to get that "flat" effect.
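For reference, an OpenGL-style orthographic projection just maps an axis-aligned box to the [-1, 1] clip cube with no perspective divide, which is what produces the "flat" look. A small Python sketch (row-major, right-handed, matching the `glOrtho` convention):

```python
def orthographic(left, right, bottom, top, near, far):
    """Builds a glOrtho-style orthographic projection matrix that maps
    the box [left,right]x[bottom,top]x[-near,-far] to clip space [-1,1]^3.
    Returned as a row-major list of rows."""
    return [
        [2.0 / (right - left), 0.0, 0.0, -(right + left) / (right - left)],
        [0.0, 2.0 / (top - bottom), 0.0, -(top + bottom) / (top - bottom)],
        [0.0, 0.0, -2.0 / (far - near), -(far + near) / (far - near)],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

A typical 2D setup would be `orthographic(0, screen_width, 0, screen_height, -1, 1)`.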




#5102140 [Resource Request] Where can I learn how to implement 3D animated models into...

Posted by Radikalizm on 17 October 2013 - 08:37 AM

The terms you're looking for are "skinning", "forward kinematics" or "skeletal animation". There should be tons of tutorials available via a simple Google search, but it's been a while since I researched the topic online.
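The core of skinning is just a weighted blend of bone transforms: v' = Σ wᵢ · (Mᵢ · v). A minimal Python sketch of linear blend skinning for a single vertex (bone matrices as 3x4 row-major affine transforms; the function name and layout are mine, not from any particular engine):

```python
def skin_vertex(position, bone_matrices, bone_indices, weights):
    """Linear blend skinning: v' = sum_i w_i * (M_i * v).
    position: (x, y, z); bone_matrices: list of 3x4 row-major matrices;
    bone_indices/weights: the bones influencing this vertex (weights sum to 1)."""
    out = [0.0, 0.0, 0.0]
    for idx, w in zip(bone_indices, weights):
        m = bone_matrices[idx]
        for r in range(3):
            out[r] += w * (m[r][0] * position[0] + m[r][1] * position[1]
                           + m[r][2] * position[2] + m[r][3])  # + translation
    return tuple(out)
```

In practice this runs per-vertex in a vertex shader with 4 bone influences, but the math is exactly this.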

 

A book I could recommend is "Computer animation - Algorithms and techniques" by Rick Parent, although it might be a bit overkill if you just want to do skeletal animation.




#5101021 How to create a physics engine

Posted by Radikalizm on 13 October 2013 - 08:20 AM


No thanks to the ones who were trying to discourage me.

 

Nobody here is trying to bring you down or anything; we're just trying to give you a realistic picture of what writing a physics engine is about. It's just a fact that if your goal is to build a game anytime soon, it's not a good idea to spend years building a physics engine from scratch first. It's as simple as that.




#5100915 How to create a physics engine

Posted by Radikalizm on 12 October 2013 - 07:10 PM

Note: I'm going to assume this is about implementing a 3D physics engine. 

Additional disclaimer: I've never actually written a 3D physics engine, so I'm just summing up where I'd start off if I were to try and implement one. I have done 2D physics engines though.

 

First of all: Welcome to the forum!

 

Let's get down to business.

 

Is this going to be a physics engine you're actually going to use in a game, or is this a personal educational project?

If it's the first case, I'd recommend against writing your own physics engine, as there are a ton of excellent professional physics solutions available free of charge. Examples would be Bullet or NVIDIA PhysX. Havok would also be an option, as they have licensing options for low-cost and low-budget games.

 

 

If this is an educational project, there are a couple of places where you could start, and I hope I can explain somewhat why it might not be the best idea to write a fully featured physics engine for a project you actually want to release. Fully featured physics engines are massive projects which require quite a bit of expertise.

 

First of all you'll want to have a good grasp of at least classical mechanics and how the concepts found in classical mechanics would translate to actual real-time simulations (warning: this can be tricky). Maybe you should try to get your hands on some introductory textbooks on mechanics and study those until you grasp the concepts explained within to start out.

 

Second, you're going to have to make some decisions on what you want your physics engine to be able to do and what you'll want to use it for. Do you just require basic rigid body physics, or do you want to implement some more advanced stuff like soft body physics as well? Do you want it to be able to handle massive amounts of varying bodies flying around everywhere? Do you want to support just simple shapes like boxes, spheres, etc., or do you want to support more complex bodies (meshes?) or even multiple constrained bodies (e.g. ragdolls, hinged bodies, ball joints, pistons, etc.)? Do you want to implement destruction, like bodies being able to break or shatter? Try to write up a list of things you want to implement in this engine first before doing anything else.

 

Once you've taken these steps you'll probably have an idea of how to start out with writing something very primitive like moving a box around by applying a force on it for example.
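That "moving a box around by applying a force" step is usually a few lines of numerical integration. A sketch in Python of semi-implicit (symplectic) Euler, which is the common starting point because it's more stable than naive explicit Euler (1D here for clarity; a real engine does this per axis with vectors):

```python
def step(position, velocity, force, mass, dt):
    """Semi-implicit Euler: integrate velocity first, then advance the
    position using the *new* velocity."""
    acceleration = force / mass       # Newton's second law: a = F / m
    velocity = velocity + acceleration * dt
    position = position + velocity * dt
    return position, velocity
```

Calling this in a loop with a fixed `dt` is already a tiny physics simulation; everything else in an engine builds on top of an integrator like this.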

 

From here on out you'll probably want some interaction between your entities, so you'll have to look into collision detection algorithms and methods of applying these algorithms as efficiently as possible (read: acceleration structures).
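The most common primitive you'll meet when starting on collision detection is the axis-aligned bounding box overlap test; broad-phase acceleration structures are mostly about avoiding having to run it for every pair of bodies. A minimal Python sketch:

```python
def aabb_overlap(min_a, max_a, min_b, max_b):
    """Two AABBs overlap iff their intervals overlap on every axis.
    min_/max_ are (x, y, z) corner tuples."""
    return all(max_a[i] >= min_b[i] and max_b[i] >= min_a[i]
               for i in range(3))
```

The naive broad phase tests all O(n²) pairs with this; structures like uniform grids, sweep-and-prune or BVHs exist to cut that down.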

 

When you've reached this point you'll probably want to come back and ask more specific questions about the features you'd like to implement (if you haven't figured out how to implement them on your own by then). As I said before, physics engines are massive projects, so I couldn't sum up the process of implementing one in a single forum post (nor do I know any tutorials or books, sorry).

 

To be honest with you, even though I've been tempted to write one on some occasions I've personally always been quite intimidated by physics engines, even with a pretty good knowledge of the fundamental aspects of classical mechanics. 

 

 

Oh and by the way, your choice of graphics API (OpenGL, DirectX) doesn't really have anything to do with physics engines unless you plan on using your GPU to do some of the heavy lifting.




#5089202 Managers, which pattern should I use

Posted by Radikalizm on 26 August 2013 - 10:16 AM

In my humble opinion the term "manager" should immediately ring an alarm bell when it comes to designing your application. I'm not saying that you should never write classes which completely manage some system, but they have a tendency to turn into god classes pretty quickly, and that's something you want to avoid at all costs.

 

Cornstalks made a really good point on which systems to implement to handle your texture resources. A system which handles the loading of resources shouldn't be bothered with maintaining a resource's lifecycle or vice-versa. It just bloats the system and will turn it into an unmanageable wreck.
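As an illustration of that separation, here's a deliberately tiny Python sketch where loading and lifetime/caching are two separate objects (the class names and the byte-string "texture" are hypothetical stand-ins, not anyone's real resource system):

```python
class TextureLoader:
    """Only knows how to load; knows nothing about ownership or lifetimes."""
    def load(self, path: str) -> bytes:
        # Stand-in for real file I/O and image decoding.
        return b"pixels:" + path.encode()

class TextureCache:
    """Only owns and caches; delegates all loading to the loader."""
    def __init__(self, loader: TextureLoader):
        self.loader = loader
        self._cache = {}
    def get(self, path: str) -> bytes:
        if path not in self._cache:
            self._cache[path] = self.loader.load(path)
        return self._cache[path]
```

Either half can now be replaced or tested on its own, which is exactly what a monolithic "TextureManager" makes hard.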

 

The same thing goes for input. You can receive input from many different devices, do you really want to put all of the logic behind receiving that input into one system? Will systems which require input (eg. character controllers) all have to depend on this one bloated system? That's just not a flexible design.





