
dpadam450

Member Since 18 Nov 2005
Offline Last Active Yesterday, 08:27 PM

#5296922 Artefacts in triangle mesh

Posted by on 17 June 2016 - 01:32 AM

For one, this is why we have normal maps: you can bake nice normals from a super-high-poly version of the mesh. Your real issue is just not enough edge loops.

 

Any time you have a vertex on an edge that is at 90 degrees, or close to it, you need to create loops around it; otherwise the lighting is interpolated across those sharp angles. This is also why you usually apply subdivision surfaces and then bake the normals onto a low-poly mesh. More polys = better surface representation and better lighting.





#5296249 PBR Shader

Posted by on 12 June 2016 - 03:14 PM

Metals do not diffuse light. All metals and plastics have roughness anywhere from 0 to 100%. There is no single "good" number, just one that matches the surface you are trying to mimic.
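Not from the original post, but a rough sketch of what this means in a typical metalness workflow (the 0.04 dielectric reflectance is a common convention, and the function names are my own):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float r, g, b; };

// Metals have no diffuse term: their albedo feeds specular reflectance
// (F0) instead, so the diffuse color fades out as metallic approaches 1.
Vec3 diffuseColor(Vec3 albedo, float metallic) {
    return { albedo.r * (1.0f - metallic),
             albedo.g * (1.0f - metallic),
             albedo.b * (1.0f - metallic) };
}

// Dielectrics get a fixed F0 of roughly 4%; metals use their albedo.
Vec3 specularF0(Vec3 albedo, float metallic) {
    const float dielectricF0 = 0.04f;
    return { dielectricF0 + (albedo.r - dielectricF0) * metallic,
             dielectricF0 + (albedo.g - dielectricF0) * metallic,
             dielectricF0 + (albedo.b - dielectricF0) * metallic };
}
```

Roughness then only shapes the specular lobe; it has no "correct" value on its own.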




#5296170 CreateWindow limiting to 2560x1440

Posted by on 11 June 2016 - 10:06 PM

I turned my DPI scaling down to 100% from 150% in the system settings and things adjusted accordingly. Now how do I deal with this when Windows DPI scaling is on?




#5295632 Computing Matrices on the GPU

Posted by on 08 June 2016 - 09:48 AM

I don't have experience doing this on the GPU, but are you currently threading your work on the CPU? 100 objects doesn't seem like much.
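Along those lines, a minimal sketch of what threading the matrix builds on the CPU could look like (the names like `positions` and the translation-only matrices are my own simplification, not from the thread):

```cpp
#include <algorithm>
#include <array>
#include <cassert>
#include <cstddef>
#include <thread>
#include <vector>

using Mat4 = std::array<float, 16>;

// Build one world matrix per object, with the object list split into
// contiguous chunks, one chunk per worker thread.
std::vector<Mat4> buildWorldMatrices(const std::vector<std::array<float, 3>>& positions,
                                     unsigned threadCount) {
    std::vector<Mat4> out(positions.size());
    auto work = [&](std::size_t begin, std::size_t end) {
        for (std::size_t i = begin; i < end; ++i) {
            Mat4 m = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1}; // column-major identity
            m[12] = positions[i][0];  // translation lives in the last column
            m[13] = positions[i][1];
            m[14] = positions[i][2];
            out[i] = m;
        }
    };
    std::vector<std::thread> pool;
    std::size_t chunk = (positions.size() + threadCount - 1) / threadCount;
    for (unsigned t = 0; t < threadCount; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(positions.size(), begin + chunk);
        if (begin < end) pool.emplace_back(work, begin, end);
    }
    for (auto& th : pool) th.join();
    return out;
}
```

At 100 objects the threading overhead may well exceed the savings; this only starts to matter in the thousands.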




#5294698 Having trouble using glGetTexImage with a depth texture of an FBO

Posted by on 02 June 2016 - 01:37 PM

Make sure m_TextureID is valid at the time you call glGetTexImage. You could try creating a plain 2D texture right before and calling glGetTexImage on that instead. Are you sure the texture is actually filled with data? Try uploading with glTexImage2D() before the glGetTexImage call, and see whether glTexImage2D fails as well.




#5292835 OpenGL Fonts Tip

Posted by on 22 May 2016 - 01:57 AM

I went back into my old font code, which is from the old NeHe tutorial from years ago. I finally had the need to support multiple fonts because I actually want to release a game. I'm using wglUseFontBitmaps and CreateFont.

 

Anyway, maybe someone will stumble upon this and it will help them down the road, because it took me a while to figure out. The tutorial was using ANSI_CHARSET instead of DEFAULT_CHARSET, so 99% of the fonts I tried seemed to not work at all. For instance, Webdings wasn't showing symbols but was printing like a normal font.

font = CreateFont(-1 * fontSize,              // height (negative = character height, not cell height)
                  0,                          // width (0 = match height)
                  0,                          // escapement
                  0,                          // orientation
                  FW_DONTCARE,                // weight
                  FALSE,                      // italic
                  FALSE,                      // underline
                  FALSE,                      // strikeout
                  DEFAULT_CHARSET,            // the fix: the tutorial used ANSI_CHARSET here
                  OUT_TT_PRECIS,              // prefer TrueType outlines
                  CLIP_DEFAULT_PRECIS,
                  ANTIALIASED_QUALITY,
                  FF_DONTCARE | DEFAULT_PITCH,
                  fontName);



#5289120 Render to screen buffer vs. to texture problem

Posted by on 28 April 2016 - 12:28 PM

No, hodgman is correct.

d.) Render target texture color = ARGB(1, 1, 1, 1) (then when I render this texture all of its pixels have alpha == 1)

 

This is wrong. Your target has an alpha channel, and therefore the draw will write to the alpha component, which in this case will be 0.3.

If you want to do that properly, create the render texture as RGB, not ARGB.




#5288167 How can I get the rotation in PhysX in OpenGL

Posted by on 22 April 2016 - 12:06 PM

[X.x  Y.x  Z.x  Tx]
[X.y  Y.y  Z.y  Ty]
[X.z  Y.z  Z.z  Tz]
[ 0    0    0   1 ]

 

You can get the helicopter's right, up, and forward vectors from the matrix itself. Those are three vectors in world coordinates. Whatever you are doing with heading would deal with the (Z.x, Z.y, Z.z) vector.
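As a sketch (assuming the column-major layout shown above; the function names are my own), pulling the forward vector out and turning it into a heading angle looks like this:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Column-major 4x4: the Z (forward) basis vector sits in elements 8..10.
Vec3 forwardVector(const float m[16]) {
    return { m[8], m[9], m[10] };
}

// Yaw around the world Y axis, measured from the +Z direction.
float headingRadians(Vec3 forward) {
    return std::atan2(forward.x, forward.z);
}
```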
 




#5287136 Computing an optimized mesh from a number of spheres?

Posted by on 15 April 2016 - 10:49 PM

In blender this is called a Boolean operation. That may lead you to something.

 

Your algorithm, however, doesn't sound useful. I'm assuming you want to connect a bunch of mountains or hills that are all spheres? Better to use a heightmap or some other scheme.




#5286983 How to scale a model in OpenGL?

Posted by on 15 April 2016 - 12:35 AM

Trying or doing? glScalef is what you want. It adds to the matrix stack used to multiply all incoming vertices. Make sure you call glLoadIdentity before it; otherwise the scale matrix will be multiplied in again every frame, inflating it to insanity and not working.
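A tiny sketch of why the missing glLoadIdentity bites (this just models the matrix stack's scale factor as a single number, which is enough to show the compounding):

```cpp
#include <cassert>
#include <cmath>

// With resetEachFrame == true (what glLoadIdentity gives you), the scale
// applied each frame is constant; without it, the per-frame scale keeps
// multiplying onto the previous frame's matrix and grows without bound.
float accumulatedScale(float perFrameScale, int frames, bool resetEachFrame) {
    float current = 1.0f;
    for (int i = 0; i < frames; ++i) {
        if (resetEachFrame) current = 1.0f;
        current *= perFrameScale;
    }
    return current;
}
```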




#5286945 Which per-frame operations are expensive in terms of performance in OpenGL?

Posted by on 14 April 2016 - 06:03 PM

You are asking about a problem that may not matter for many years to come. A lot of people ask this, and I tell everyone: optimize when you need to. Hardware is so fast nowadays that this shouldn't be a concern. Focus on your game, if that is what you are building. If you just want to build the best tech in the world, then that is a different story.




#5285033 How to find the center of a modelview matrix?

Posted by on 04 April 2016 - 10:20 AM

To make things simpler: everything is always in world coordinates. Any time a matrix is present, it takes the points in your 3D model as vectors and stretches those vectors to point somewhere else, giving you new vectors. Any matrix operations afterward are applied to those new vectors.

 

If you have played Zelda, or any similar game, you might have a light or creature that circles around the main character. It is circling relative to the player. Keyword: relative. So you need a vector relative to the player first: translate some distance from the origin of the world (treating (0,0,0) as the character's position even though he is thousands of units from the origin), apply your rotation treating the origin as the player, and then finally translate to the player's position.

 

If you don't think locally, you might translate to the player first and then apply the rotation. But every rotation takes vectors from (0,0,0) and rotates them, so if you translate thousands of units away first, you end up rotating vectors around the origin that are thousands of units long.
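The correct order above (offset from the origin, rotate, then move to the player) can be sketched like this for a Y-axis orbit (the function name and parameters are my own, just to illustrate the idea):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Build the orbit relative to the origin first (a radius-length offset
// rotated about Y), and only then translate the result out to the
// player's world position.
Vec3 orbitAroundPlayer(Vec3 player, float radius, float angle) {
    float x = radius * std::sin(angle);  // rotated local offset
    float z = radius * std::cos(angle);
    return { player.x + x, player.y, player.z + z };  // then move to the player
}
```

Doing it the other way (translate to the player, then rotate) would sweep the orbiting object through a huge arc around the world origin instead of a small circle around the player.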

 

Hopefully that clears things up. A local matrix is simply more of a logical concept.




#5284554 When would you want to use Forward+ or Deferred Rendering?

Posted by on 31 March 2016 - 06:59 PM

Forward+ allows more material shaders for specific needs; with deferred you are limited to running one shader over the entire scene. Forward+ also doesn't have to write a G-buffer, so bandwidth is lower. I think most people will be moving to Forward+ if possible. MSAA will be cheaper with Forward+, and you don't have to perform any of the blending operations that you do with deferred.




#5282452 Per Triangle Culling (GDC Frostbite)

Posted by on 21 March 2016 - 04:05 PM

I came across this presentation, and they are talking about using compute shaders for per-triangle culling (standard backface culling, standard Hi-Z). I'm not sure exactly what they are proposing. Is this meant to completely rewrite that part of the pipeline, so that when it comes time to draw you just disable backface culling and the other built-in stages? I don't get why you would write a compute shader to determine which triangles are visible when the pipeline does that already. Even if you turn that stuff off, is this really that much better? Can you even tell the GPU to turn off Hi-Z culling?
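For reference, the backface part of the test they move into compute is the standard one (this is just the textbook check on the CPU, not the slides' actual implementation):

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// A CCW-wound triangle faces away from the camera when its face normal
// points along the ray from the camera to the triangle.
bool isBackfacing(Vec3 v0, Vec3 v1, Vec3 v2, Vec3 cameraPos) {
    Vec3 n = cross(sub(v1, v0), sub(v2, v0));
    return dot(n, sub(v0, cameraPos)) >= 0.0f;  // facing away (or edge-on)
}
```

The pitch in the slides is that doing this in bulk in compute, and compacting the index buffer before the draw, avoids feeding the rasterizer triangles it would only discard anyway.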

 

Slides 41-44

http://www.wihlidal.ca/Presentations/GDC_2016_Compute.pdf

 




#5273377 Combining shadows from multiple light sources

Posted by on 30 January 2016 - 12:34 PM

 

Yes, I'm not getting unwanted artifacts anymore. But I have another question, take a look at this screenshot: http://s16.postimg.org/rzbnd4rv9/ss14538.png Is it normal that point light shadow is brighter than directional light shadow?

 

To this question and your original post: shadows are the lack of light received. Most people think doing Phong shading and then multiplying in shadows works. As pointed out, the shadow term should be multiplied into the N·L diffuse calculation, because it is basically asking: "Is the surface facing the light? (If yes, it is receiving photons.)" and "If it is facing the light, is something blocking it? (If so, it receives no photons.)" If you perform the whole lighting equation and texture lookups as if every photon in the world hit the surface, it gets fully lit, and then some arbitrary multiplication happens afterward to "darken" the image. It should instead be darkened by the fact that no light was hitting it inside the lighting equation.

 

This leads to your second question. Light is additive; without light, every area in the world is completely black. So any surface the sun hits that is not in shadow will be lit quite strongly (lots of photons hitting the surface), and then every other light adds more photons on top.

 

So you have areas with:

Sun + point light (which you can see is very white)

Sun only (this is your brighter shadow, because it still receives quite a lot of photons, just not the extra ones from the point light)

Point light only (this is your darker shadow; only a few photons from the point light hit the surface, so it is still pretty dark)
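The three cases above fall out naturally once the shadow terms sit inside the lighting sum. A minimal per-pixel sketch (the function and parameter names are my own, for illustration):

```cpp
#include <algorithm>
#include <cassert>

// Each light's shadow factor (1 = lit, 0 = blocked) scales that light's
// own N*L term; the darkening happens inside the lighting equation, and
// the lights then add together.
float shadePixel(float nDotLSun, float sunShadow,
                 float nDotLPoint, float pointShadow,
                 float sunIntensity, float pointIntensity) {
    float sun   = std::max(nDotLSun, 0.0f)   * sunShadow   * sunIntensity;
    float point = std::max(nDotLPoint, 0.0f) * pointShadow * pointIntensity;
    return sun + point;  // light is additive
}
```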





