Jihodg

Member

  • Content Count: 51
  • Joined
  • Last visited

Community Reputation: 1159 Excellent

2 Followers

About Jihodg

  • Rank: Member

Personal Information

  • Role: Programmer
  • Interests: Art, Design, Programming

  1. Have you disabled mipmapping to check whether anything is wrong with your texture's mipmaps or their settings?
  2. Are you using OpenGL? GL_CLAMP mode is deprecated! Use GL_CLAMP_TO_EDGE instead (see the wrap-mode sketch after this list). PS: I haven't checked in WebGL2, but it is probably the same.
  3. How do the systems know which entities they should operate on? They operate on entities that have certain components, not just on components alone. For example, the motion system would operate on all entities that have both a position component and a velocity component, while the rendering system would operate on all entities with a position component and a sprite component, and so on (see the ECS sketch after this list). If you couple the components to the systems, how can different systems operate on the same component?
  4. http://madebyevan.com/shaders/lightmap/
  5. Assuming all your mipmaps are correct: are you setting the magnification AND minification filters? Once you get correct trilinear filtering, you can enable anisotropic filtering on top of it for improved clarity (see the filtering sketch after this list). It looks to me like only the magnification filter is linear in your picture, while the minification is still point filtered.
  6. Have you checked this link? https://mynameismjp.wordpress.com/2010/04/30/a-closer-look-at-tone-mapping/ It compares several tone-mapping functions (including the Uncharted one) with global and local exposure, and it even includes a demo full of sliders to try! This guy always makes amazing posts!
  7. You are missing a fundamental detail! As others said, the point of tone mapping is to bring your color values down from a high dynamic range to the displayable low range, but that high range changes every frame! So you must first measure it to derive an exposure value, usually by computing the average luminance of the current frame (see the exposure sketch after this list). Just think about it: if the exposure were the same for every frame, with a fixed and constant range, you could simply scale all your values in a precomputation step to ensure decent results in the traditional low range, which is basically what was done intuitively before HDR rendering.
  8. ohh... ok... thanks!... I saw you had a framework, I will definitely check it out now!
  9. Looks good, congratulations! The paper talks about a demo; it would be nice to have access to it to test performance and get a better grasp of the artifacts the algorithm produces and how well it deals with them.
  10. Gamma correct? freshen up

    Hahaha, yes! I found it really odd, but since you are still using the classic and really "hacky" ambient + diffuse + specular lighting model, anything was possible ;D Most of us have moved on to some kind of PBR, trying to maintain conservation of energy and using parameters that represent physical properties instead, hence the roughness/smoothness and metallic workflows (even though there are still a lot of hacks and approximations, obviously). PS: the "everything in the same linear space" rule applies to whichever lighting model you choose.
  11. Gamma correct? freshen up

    Also, as I said before, I don't know exactly how you are calculating your lighting, but clearly, if the material diffuse you showed is multiplied by the diffuse from the texture, the result will be a darker, red-tinted brick texture. And is the ambient material value also multiplied by the light's ambient? If so, that would explain the quite dark overall ambient lighting.
  12. Gamma correct? freshen up

    Yes, exactly! And you are correct: if a texture is marked as an sRGB format, its color values will be converted to linear when it is read and back to sRGB when it is written to (the conversion itself is sketched after this list).
  13. Gamma correct? freshen up

    I don't know the actual formulas you are using for your light calculations, but of course all the parameters involved should be in the same linear space before you operate on them!
  14. Decals in tiled forward render (Forward+)

    That is a pretty reasonable result for a decal projection: ugly, but reasonable. If you want to "fix" it, you simply have to acknowledge why it is happening and devise a solution accordingly. You could do some opacity fading dependent on the angle difference from the original projection direction and/or the depth within the OBB (see the decal-fade sketch after this list), or do some blending between multiple projections per decal, or use 3D textures for your decals and project them over the full extent of the OBB, etc.
  15. As others have already said, there is nothing really wrong with the results you are getting, and they are perfectly logical once you truly understand how transformations work. BUT, to address what I understand you are trying to accomplish: a game engine (and by extension its editors) will usually CHOOSE a fixed order in which to apply the rotation axes. For instance, always apply the Euler rotations using the yaw angle first, then pitch, and finally roll, and store those values to build a single final rotation matrix (instead of applying little rotations incrementally; see the rotation sketch after this list). This way you can set the values in ANY order you want and still have a consistent rotation applied.
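
For #2, a minimal OpenGL/C++ sketch of the wrap-mode fix; `tex` is just a placeholder for whatever texture object you are configuring, and the texture is assumed to be already created:

    #include <GL/gl.h>   // or your loader of choice (glad, glew, ...)

    void SetClampToEdge(GLuint tex) {
        glBindTexture(GL_TEXTURE_2D, tex);
        // GL_CLAMP is gone from core profiles; GL_CLAMP_TO_EDGE clamps to the
        // edge texels instead of sampling the (old) border color.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    }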
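
For #3, a minimal C++ sketch of the idea (all names here are made up for illustration; this is not any particular ECS library): each system filters entities by the set of components they carry, so the same Position component can be used by several systems.

    #include <optional>
    #include <vector>

    struct Position { float x, y; };
    struct Velocity { float dx, dy; };
    struct Sprite   { int textureId; };

    // A deliberately naive entity: real ECS implementations store components in
    // separate arrays, but the query logic is the same.
    struct Entity {
        std::optional<Position> position;
        std::optional<Velocity> velocity;
        std::optional<Sprite>   sprite;
    };

    // Motion system: runs on every entity that has BOTH Position and Velocity.
    void MotionSystem(std::vector<Entity>& entities, float dt) {
        for (Entity& e : entities) {
            if (e.position && e.velocity) {
                e.position->x += e.velocity->dx * dt;
                e.position->y += e.velocity->dy * dt;
            }
        }
    }

    // Rendering system: runs on every entity that has Position and Sprite,
    // regardless of whether it also has a Velocity.
    void RenderSystem(const std::vector<Entity>& entities) {
        for (const Entity& e : entities) {
            if (e.position && e.sprite) {
                // drawSprite(e.sprite->textureId, e.position->x, e.position->y);
            }
        }
    }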
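
For #5, a minimal OpenGL/C++ sketch of setting both filters for trilinear filtering and then turning on anisotropic filtering (via the near-universal EXT_texture_filter_anisotropic extension; check for it before relying on it). Again, `tex` is just a placeholder:

    #include <GL/gl.h>
    #include <GL/glext.h>   // for the anisotropy tokens

    void SetTrilinearAnisotropic(GLuint tex) {
        glBindTexture(GL_TEXTURE_2D, tex);
        // Minification: linear filtering within a mip level AND between mip levels.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        // Magnification: mip levels don't apply, so plain linear.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // Optional: anisotropic filtering on top of trilinear.
        GLfloat maxAniso = 1.0f;
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
    }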
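
For #7, a CPU-side C++ sketch of the idea only; in a real renderer the average luminance is computed on the GPU (by downsampling a luminance buffer or with a compute shader). The 0.18 "key value" (middle grey) and the Reinhard curve are just common defaults, not the only choice:

    #include <cmath>
    #include <vector>

    struct Color { float r, g, b; };

    // Geometric mean of the luminance; more robust against a few very bright
    // pixels than a plain average.
    float AverageLuminance(const std::vector<Color>& hdrPixels) {
        double sumLog = 0.0;
        for (const Color& c : hdrPixels) {
            float lum = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
            sumLog += std::log(1e-4 + lum);
        }
        return static_cast<float>(std::exp(sumLog / hdrPixels.size()));
    }

    // Scale by the per-frame exposure first, THEN apply the tone-mapping curve.
    Color ToneMap(Color c, float avgLum, float keyValue = 0.18f) {
        float exposure = keyValue / avgLum;                        // bright frame -> low exposure
        auto reinhard  = [](float x) { return x / (1.0f + x); };   // any curve works here
        return { reinhard(c.r * exposure),
                 reinhard(c.g * exposure),
                 reinhard(c.b * exposure) };
    }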
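
For #12, the conversion the hardware applies for sRGB textures and render targets, written out in C++ (this is the standard piecewise sRGB transfer function; a plain 2.2 power is only an approximation of it):

    #include <cmath>

    // Applied when an sRGB texture is sampled.
    float SrgbToLinear(float s) {
        return (s <= 0.04045f) ? s / 12.92f
                               : std::pow((s + 0.055f) / 1.055f, 2.4f);
    }

    // Applied when writing to an sRGB render target (e.g. with GL_FRAMEBUFFER_SRGB enabled).
    float LinearToSrgb(float l) {
        return (l <= 0.0031308f) ? l * 12.92f
                                 : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
    }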
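
For #14, one possible shape of the opacity fade, sketched in C++; the threshold and the exact falloff are entirely up to taste, and in practice this would live in the decal shader:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };
    float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // surfaceNormal and decalForward are unit vectors; depthInObb goes from 0 at
    // the decal's near plane to 1 at the far face of its OBB.
    float DecalOpacity(Vec3 surfaceNormal, Vec3 decalForward, float depthInObb,
                       float angleFadeStart = 0.3f) {
        float facing    = -Dot(surfaceNormal, decalForward);       // 1 = facing the projector
        float angleFade = std::clamp((facing - angleFadeStart) / (1.0f - angleFadeStart),
                                     0.0f, 1.0f);
        float depthFade = 1.0f - std::clamp(depthInObb, 0.0f, 1.0f);
        return angleFade * depthFade;
    }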
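
For #15, a plain C++ sketch of the "store the angles, rebuild one matrix in a fixed order" approach (row-major 3x3 matrices, no particular math library; pick whichever axis order your engine prefers and stick to it):

    #include <cmath>

    struct Mat3 { float m[3][3]; };

    Mat3 Mul(const Mat3& a, const Mat3& b) {
        Mat3 r{};
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                for (int k = 0; k < 3; ++k)
                    r.m[i][j] += a.m[i][k] * b.m[k][j];
        return r;
    }

    Mat3 RotY(float a) { float c = std::cos(a), s = std::sin(a); return {{{ c, 0, s }, { 0, 1, 0 }, { -s, 0, c }}}; }
    Mat3 RotX(float a) { float c = std::cos(a), s = std::sin(a); return {{{ 1, 0, 0 }, { 0, c, -s }, { 0, s, c }}}; }
    Mat3 RotZ(float a) { float c = std::cos(a), s = std::sin(a); return {{{ c, -s, 0 }, { s, c, 0 }, { 0, 0, 1 }}}; }

    // The editor only stores yaw/pitch/roll; the final rotation is rebuilt from
    // scratch every time, always in the same order, so editing the fields in any
    // order gives the same result.
    Mat3 RotationFromEuler(float yaw, float pitch, float roll) {
        return Mul(Mul(RotY(yaw), RotX(pitch)), RotZ(roll));
    }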