Tyler Camp

  1. Hello, I'm rendering 2D shadow geometry to a separate framebuffer for each light source, for later use in a shader. The number of light sources varies, so I can't have a fixed set of textures to reference. I want to render to a 3D framebuffer so I can index into these layers. I've written my code as follows, but am getting OpenGL errors:

```cpp
void make_scene_occlusion(Render_SceneOcclusion *output)
{
    const int frame_width  = 2048;
    const int frame_height = 2048;

    while (glGetError() != GL_NO_ERROR) {}
    CHECK_ERRORS();

    output->num_maps = 0;

    u32 framebuffer;
    glGenFramebuffers(1, &framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

    u32 rendertextures[12];
    glGenTextures(12, rendertextures);

    output->buf_id = framebuffer;
    output->num_maps = 12;
    CHECK_ERRORS();

    for (int i = 0; i < 12; i++)
    {
        const auto &map = output->occlusion_maps[i];
        glBindTexture(GL_TEXTURE_2D, rendertextures[i]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, frame_width, frame_height,
                     0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
        CHECK_ERRORS();

        glFramebufferTexture3D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_3D, rendertextures[i], 0, i);

        output->occlusion_maps[i].idx = i;
        output->occlusion_maps[i].texid = rendertextures[i];
        CHECK_ERRORS();
    }
}
```

     I get GL_INVALID_ENUM after the call to glFramebufferTexture3D. In some references I've found people using GL_TEXTURE_2D_ARRAY, but I'm rendering with immediate mode, so I don't know if the GL context will support it. Thoughts?
  2. Tyler Camp

    Basic Shadow Mapping

      Good call! After reading your comment I remembered this article: http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-16-shadow-mapping/ which mentions the use of front-face culling to remove self-shadowing. I made the change, updated a few other areas, and it worked! Thank you!
  3. Tyler Camp

    Basic Shadow Mapping

    Encoded depth map:
  4. I've implemented shadow mapping in my WebGL application based on WebGL Programming Guide by Matsuda et al. The shadow mapping is generally working, but when my depth bias is lower than 0.005 I get this weird black box. Additionally, without a depth bias you'd expect shadow acne, but in this case I just get a more exaggerated black box covering more of the area.

     Depth map generation fragment shader:

```glsl
precision mediump float;

vec4 pack_float(float value) {
    const vec4 bit_shift = vec4(256.0*256.0*256.0, 256.0*256.0, 256.0, 1.0);
    const vec4 bit_mask  = vec4(0.0, 1.0/256.0, 1.0/256.0, 1.0/256.0);
    vec4 res = fract(value * bit_shift);
    res -= res.xxyz * bit_mask;
    return res;
}

void main() {
    gl_FragColor = pack_float(gl_FragCoord.z);
}
```

     Shadow receiver fragment shader:

```glsl
precision mediump float;

uniform sampler2D u_Texture0;
uniform sampler2D u_ShadowMap;
uniform vec3 u_LightDirection;
varying vec4 v_PositionFromLight;
varying vec2 v_TexCoord0;
varying vec4 v_Color;
varying vec3 v_Normal;

float unpack_float(vec4 rgba_value) {
    const vec4 bit_shift = vec4(1.0/(256.0*256.0*256.0), 1.0/(256.0*256.0), 1.0/256.0, 1.0);
    float value = dot(rgba_value, bit_shift);
    return value;
}

float getVisibility(sampler2D shadowMap, vec3 normalizedProjectedPosition) {
    const int sampleRadius = 1;
    const vec2 shadowMapStep = vec2(1.0 / 2048.0, 1.0 / 2048.0);
    vec2 shadowMapTexCoord = normalizedProjectedPosition.xy;
    float visibility = 0.0, sample;
    for (int y = -sampleRadius; y <= sampleRadius; y++) {
        for (int x = -sampleRadius; x <= sampleRadius; x++) {
            sample = unpack_float(texture2D(shadowMap, shadowMapTexCoord + vec2(x, y) * shadowMapStep));
            if (normalizedProjectedPosition.z > sample + 0.00015)
                visibility += 1.0;
        }
    }
    return 1.0 - visibility / pow(float(sampleRadius) * 2.0 + 1.0, 2.0);
}

void main (void) {
    vec4 lightColor = vec4(vec3(0.8), 1.0);
    float dotp = max(dot(-u_LightDirection, v_Normal), 0.0);
    float visibility = dotp;
    vec3 shadowCoord = (v_PositionFromLight.xyz / v_PositionFromLight.w) / 2.0 + 0.5;
    if (shadowCoord.x >= 0.0 && shadowCoord.x <= 1.0 &&
        shadowCoord.y >= 0.0 && shadowCoord.y <= 1.0) {
        float visibility = getVisibility(u_ShadowMap, shadowCoord);
        dotp *= visibility;
    }
    dotp = max(dotp, 0.2);
    vec4 texel = texture2D(u_Texture0, v_TexCoord0) * v_Color;
    gl_FragColor = vec4(texel.xyz * dotp, texel.w) * lightColor;
}
```

     Shadow matrices:

```javascript
var cameraPos = Controllers.camera;
var lightNormal = ProcForest.Settings.Lighting.directionalLight.direction;
var lightPos = Math.vecSum(cameraPos, Math.vecMultiply(lightNormal, -1));

shadowModelViewMatrix = new Matrix4();
shadowModelViewMatrix.lookAt(lightPos.x, lightPos.y, lightPos.z,
                             cameraPos.x, cameraPos.y, cameraPos.z,
                             0, 1, 0);

shadowProjectionMatrix = new Matrix4();
var volumeSize = ProcForest.Settings.maxTreeViewDist * 0.8;
shadowProjectionMatrix.setOrtho(-volumeSize, volumeSize, -volumeSize, volumeSize, -volumeSize, volumeSize);
```

     Any ideas on why this is happening? If I leave the bias at 0.005 things look alright, but I want to avoid peter-panning. The black box tends to be centered at the camera's position, which is where the shadow map is being projected from.
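     The RGBA depth packing used in the shaders above can be sanity-checked on the CPU. A minimal JavaScript sketch of the same bit-shift scheme (the helper names `packFloat`/`unpackFloat` are mine, not from the post; on the GPU the result is additionally quantized to 8 bits per channel):

```javascript
// Pack a float in [0, 1) into four [0, 1) channel values, mirroring the
// GLSL pack_float: res = fract(value * bit_shift); res -= res.xxyz * bit_mask;
function packFloat(value) {
  const bitShift = [256 * 256 * 256, 256 * 256, 256, 1];
  const bitMask = [0, 1 / 256, 1 / 256, 1 / 256];
  const fract = (x) => x - Math.floor(x);
  const res = bitShift.map((s) => fract(value * s));
  const xxyz = [res[0], res[0], res[1], res[2]]; // GLSL swizzle res.xxyz
  return res.map((r, i) => r - xxyz[i] * bitMask[i]);
}

// Mirror of the GLSL unpack_float: dot(rgba_value, bit_shift)
function unpackFloat(rgba) {
  const bitShift = [1 / (256 * 256 * 256), 1 / (256 * 256), 1 / 256, 1];
  return rgba.reduce((sum, c, i) => sum + c * bitShift[i], 0);
}
```

     In pure floating point the round trip cancels exactly; the ~1/2^24 quantization error only appears once the channels are stored as bytes in the texture, which is why a small comparison bias (like the 0.00015 above) is still needed even with this packing.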
  5. Tyler Camp

    "Standard" Resolution Performance

    Looking back, I now remember how he claimed that DirectX would drop back to software-based rendering and stop using the GPU if you didn't use standard resolutions. At this point I don't know if he was just joking or really misinformed, but I'm now much more wary of his statements.
  6. Tyler Camp

    "Standard" Resolution Performance

    I'm hesitant to bring it up since I'm a freshman (from what I've heard and seen, he has a "what I say is final in the classroom" attitude, and I can't imagine that being corrected by a freshman would go over well), and the topic was actually brought up months ago. There's a general consensus here that what he proposed didn't make sense, and I'll take that as the answer; I'll correct those who quote him on the subject and will continue to bring questionable ideas to you guys for clarification.
  7. Tyler Camp

    "Standard" Resolution Performance

    Got a lot more responses than I initially thought I would; the whole thing seemed especially weird to me because another part of his reasoning for my tests not showing anything was that "in 2D (only GDI layer) the resolution is less dependent on hardware specs". I really doubted that XNA used GDI for 2D. Thanks everyone for the info; the part about TV latency sounds like it might have been what he was referring to, and I'll keep that in mind for the future.
  8. One of my professors has made the claim that using a "standard" resolution (e.g. 1920x1080, 1024x768, etc.) will provide better performance than using a "non-standard" resolution (1500x755, etc.). I've never heard of this before and can't find anything to back his claims. I know a lot of console games render to lower (and "non-standard") resolutions and then upscale for better performance, which is the opposite of what he stated. I don't know what he really means by "resolution" and he hasn't clarified. (The display resolution? The resolution of any framebuffer being rendered to?) He didn't say this was specific to any platform, but I was in an XNA-on-Windows class at the time.

     I've tried benchmarking it myself (basic 2D sprite rendering via XNA, fullscreen with different backbuffer resolutions) and didn't see any performance penalties or gains out of the ordinary. Asking him about it just got the response "well, it only happens for more advanced graphics algorithms." Has anyone else heard anything like this?
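     One way to frame the benchmark above: in a simple software fill, raster work tracks the number of pixels, not whether the dimensions match a display mode. A toy JavaScript sketch (CPU-only, so only an analogy for GPU behavior; the helper name `fillBuffer` is mine):

```javascript
// Fill a width*height RGBA framebuffer; the work is O(width*height)
// regardless of whether the dimensions happen to be a "standard" mode.
function fillBuffer(width, height, rgba) {
  const buf = new Uint32Array(width * height);
  buf.fill(rgba);
  return buf;
}

// "Non-standard" 1500x755 covers MORE pixels (1,132,500) than "standard"
// 1024x768 (786,432), so per-pixel cost alone would make it slower, not faster.
const standard = fillBuffer(1024, 768, 0xff000000);
const nonStandard = fillBuffer(1500, 755, 0xff000000);
```

     Any real difference would have to come from something mode-specific (scan-out, scaler latency, driver paths), not from the arithmetic of rendering itself.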