mede

Member
  • Content Count

    73
  • Joined

  • Last visited

Community Reputation

122 Neutral

About mede

  • Rank
    Member

Personal Information

  1. Hi everyone. We developed a volume renderer for medical datasets that is performant enough to provide realtime rendering for current VR headsets. So far we have tested the VR renderer with the HTC Vive and HP/Microsoft Mixed Reality headsets. We just created macOS and Linux installers for the desktop version; VR is currently not available for macOS or Linux. It would be nice if some folks could test whether this software works on their systems. Thanks! http://diffuse.ch/
  2. mede

    Specto

    Specto is focused on advanced visualisation of medical image datasets in 3D. The current Specto visualisation algorithms implement state-of-the-art raytracing techniques, providing stunning rendering in realtime. Thanks to the high rendering performance, we are also able to provide a Virtual Reality solution at up to 2880 x 1600 pixels / 90 Hz.
  3. mede

    marvin found sense for his life

    Added a small video to my first post... but there is nothing more than the three tricks I captured.
  4. Marvin found sense for his life... after travelling the universe he started skating mini-ramp. ;) Video including some making-of in Maya. Sadly my old hardware does not support the soft-shadow feature at a framerate usable for video. Some pics from the current exercise I did, which includes:
     - normal mapping
     - realtime cubemap generation for reflection simulation
     - soft shadow simulation using shadowmap sampling (see the sketch below)
     [Edited by - mede on December 19, 2010 12:17:38 PM]
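     The soft-shadow part is just shadowmap sampling with a small filter kernel (PCF). A minimal sketch of the idea; shadowMap, the 1024 resolution and m_shadowCoord (assumed to be the vertex transformed by the light's bias * projection * view matrix) are placeholders:
         // fragment shader: percentage-closer filtering (PCF) - average several
         // shadow-map comparisons around the projected position to soften the edge
         uniform sampler2DShadow shadowMap;   // placeholder name
         varying vec4 m_shadowCoord;          // vertex position in shadow-map clip space

         float softShadow()
         {
             float texel = 1.0 / 1024.0;      // placeholder shadow-map resolution
             float sum = 0.0;
             for (int x = -1; x <= 1; ++x)
                 for (int y = -1; y <= 1; ++y)
                 {
                     vec4 offset = vec4(float(x), float(y), 0.0, 0.0) * texel * m_shadowCoord.w;
                     sum += shadow2DProj(shadowMap, m_shadowCoord + offset).r;
                 }
             return sum / 9.0;                // 0 = fully shadowed, 1 = fully lit
         }

         void main()
         {
             gl_FragColor = vec4(vec3(softShadow()), 1.0);   // visualise the shadow factor only
         }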
  5. Thanks, that is a lot of suggestions. GL_TEXTURE_CUBE_MAP_SEAMLESS would be a good solution, but sadly it seems not to be supported by my "older" Quadro FX 1600M ;( So I would have to look for newer hardware, because I don't like hacking workarounds for features that are natively supported by newer hardware.
  6. I tried a few things to blur a cubemap that is generated in realtime by rendering into a texture, but nothing was really usable. I would like to use the cubemap for glossy reflection shading.
     1. Do the sampling in the shader with several lookups and average them. Theoretically a good solution, but it needs a lot of samples to avoid a noisy result, which is too slow (rough sketch below).
     2. Use the automatically generated mipmap levels of the cubemap. This would be a fast approach, but sadly the 6 sub-images are filtered independently, which gives hard borders between the faces.
     Is there a better and fast way to solve this problem? I found some forum threads from 2007 in which people talked about automatic mipmap creation across the edges in DX10. Is there a similar function in OpenGL?
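     A rough sketch of the in-shader sampling I mean for approach 1; the offset directions, their size and the sample count are placeholders, and m_refVec is assumed to be the reflection vector computed in the vertex shader:
         // fragment shader: average several jittered cubemap lookups around the
         // reflection vector to fake a glossy (blurred) reflection - gets slow
         // quickly because many samples are needed for a smooth result
         uniform samplerCube environmentTex;
         varying vec3 m_refVec;               // reflection vector from the vertex shader

         void main()
         {
             vec3 offsets[8];                 // small fixed offset directions (placeholder values)
             offsets[0] = vec3( 0.05,  0.0 ,  0.0 );
             offsets[1] = vec3(-0.05,  0.0 ,  0.0 );
             offsets[2] = vec3( 0.0 ,  0.05,  0.0 );
             offsets[3] = vec3( 0.0 , -0.05,  0.0 );
             offsets[4] = vec3( 0.0 ,  0.0 ,  0.05);
             offsets[5] = vec3( 0.0 ,  0.0 , -0.05);
             offsets[6] = vec3( 0.05,  0.05,  0.0 );
             offsets[7] = vec3(-0.05, -0.05,  0.0 );

             vec3 color = vec3(0.0);
             for (int i = 0; i < 8; ++i)
                 color += textureCube(environmentTex, normalize(m_refVec + offsets[i])).rgb;
             gl_FragColor = vec4(color / 8.0, 1.0);
         }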
  7. I now pass the inverse of the camera view matrix to the shader and transform the reflection vector back to world space; this seems to be a good solution (rough sketch below). If I use normal mapping, is there another solution than passing the inverse tangent-space transformation to the fragment shader and transforming the reflection vector first back to camera space and then to world space?
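     Roughly what I mean, as a minimal sketch; invViewMatrix is just a placeholder name for the inverse camera view matrix passed in as a uniform:
         // vertex shader
         varying vec3 m_normal;
         varying vec3 m_eyeVec;

         void main()
         {
             m_normal = normalize(gl_NormalMatrix * gl_Normal);
             m_eyeVec = vec3(gl_ModelViewMatrix * gl_Vertex);    // eye-space position = direction from the eye
             gl_Position = ftransform();
         }

         // fragment shader
         uniform samplerCube environmentTex;
         uniform mat4 invViewMatrix;          // inverse of the camera view matrix (placeholder name)
         varying vec3 m_normal;
         varying vec3 m_eyeVec;

         void main()
         {
             vec3 normal   = normalize(m_normal);
             vec3 refVec   = reflect(normalize(m_eyeVec), normal);      // reflection in eye space
             vec3 refWorld = vec3(invViewMatrix * vec4(refVec, 0.0));   // rotate it back to world space
             gl_FragColor  = vec4(textureCube(environmentTex, refWorld).rgb, 1.0);
         }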
  8. Hm, I changed this... Now the reflection is right if I move the camera, but if I rotate the sphere itself the reflection also rotates, which should certainly not be the case...
  9. Hi, I am trying to use a dynamically created cubemap for reflection lookup in a per-pixel shader. I read a lot of stuff on the Internet but I still have a problem: the reflection on a sphere always shows the reflection looking from the origin towards +Z, independent of the camera position and direction (see pictures). I created the cubemap using the following code from the OpenGL wiki, which seems to work: openglwiki. Before rendering the scene I render the 6 faces (+X -X +Y -Y +Z -Z) in WORLD SPACE. Here are the relevant lines of the vertex shader:
         m_normal = normalize(gl_NormalMatrix * gl_Normal);
         m_eyeVec = gl_ModelViewMatrix * gl_Vertex;
         ...
         gl_Position = ftransform();
     and of the fragment shader:
         uniform samplerCube environmentTex;
         ...
         vec3 normal = normalize(m_normal);
         ...
         vec3 refVec = normalize(reflect(m_eyeVec, normal));
         vec3 color = textureCube(environmentTex, refVec).rgb;
  10. Hey, I am currently working on a technique for rendering a correct mirror. For this I create a virtual camera behind the mirror and render the scene from this view into a texture, which I put onto the mirror. At first this sounded simple to me, but I ran into a few problems ;( With the mirror plane and the camera it is easy to calculate the position of the virtual camera.
     --MAIN PROBLEM--
     But HOW DO I GET the camera view direction and camera projection matrix to obtain an optimal projection of the 4 mirror corners? Currently I use the center of the mirror as the view direction and try, with a lot of hacks, to adapt the aspect ratio and field-of-view angle until all 4 corners are inside the frustum.
     - Because of this I also have to adapt the UV coordinates to map the mirror onto the skewed projection in the texture, which results in perspective distortions in the texture interpolation! (A projective-lookup sketch for this part is below.)
     - The center of the mirror is not the center of projection, so the mirror often does not fit completely into the frustum!
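     For the UV part, a minimal sketch of how the distortion can be avoided with a projective lookup instead of per-corner UVs; mirrorTex and mirrorViewProj are placeholder names, and mirrorViewProj is assumed to be the virtual camera's projection * view matrix with the mirror's model matrix already folded in:
         // vertex shader: project the mirror surface with the virtual camera's
         // matrices instead of assigning UVs per corner
         uniform mat4 mirrorViewProj;         // virtual mirror camera, projection * view (placeholder)
         varying vec4 m_projCoord;

         void main()
         {
             mat4 bias = mat4(0.5, 0.0, 0.0, 0.0,
                              0.0, 0.5, 0.0, 0.0,
                              0.0, 0.0, 0.5, 0.0,
                              0.5, 0.5, 0.5, 1.0);   // maps clip space [-1,1] to [0,1]
             m_projCoord = bias * mirrorViewProj * gl_Vertex;
             gl_Position = ftransform();
         }

         // fragment shader: texture2DProj does the perspective divide per pixel,
         // so the interpolation stays perspective-correct
         uniform sampler2D mirrorTex;         // the render-to-texture result (placeholder)
         varying vec4 m_projCoord;

         void main()
         {
             gl_FragColor = texture2DProj(mirrorTex, m_projCoord);
         }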
  11. Quote: Original post by LucasM
     You should average the normals/binormals/tangents etc. on your low-res mesh to avoid this.
     This. This was exactly the problem... Things I changed now:
     - use an indexed triangle strip so that each vertex has ONE averaged tangent, binormal, and normal vector
     - also use averaged normals for the low-poly model in Maya... the generated normal map is now different
     Space between UV seams is also a good hint. Thanks! For completeness, here is the new normal map, which works fine:
  12. The sphere is only to simplify the problem; I get the same with, e.g., the teapot. Later I would like to have a lot of different objects with normal maps in my scene. [Edited by - mede on November 10, 2010 4:56:53 PM]
  13. Hey there, I have a problem and/or not enough know-how with normal mapping. Until now I have used it only for walls and planes, where everything worked well. But now I tried to use it on a full 3D object, and because I had a lot of problems I started with a simple sphere. The shader code is standard (sketch below):
     - transform the light and eye directions to tangent space in the vertex shader
     - read the normals from the map in the fragment shader
     The normal map I generated in Maya:
     - low mesh with face normals
     - high mesh with averaged vertex normals
     - transfer maps -> to generate the normal map
     Sadly I now have a problem at the edges of the triangles. Is there something I can do better, or does anyone know this effect? Thanks a lot...
     simple sphere using the normal map
     the used normal map
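     What I mean by "standard", as a minimal sketch; the tangent attribute, lightPos and normalMap are placeholder names:
         // vertex shader: build the TBN basis and move the light/eye directions into tangent space
         attribute vec3 tangent;              // per-vertex tangent (placeholder attribute name)
         uniform vec3 lightPos;               // light position in eye space (placeholder)
         varying vec3 t_lightDir;
         varying vec3 t_eyeDir;
         varying vec2 m_texCoord;

         void main()
         {
             vec3 n = normalize(gl_NormalMatrix * gl_Normal);
             vec3 t = normalize(gl_NormalMatrix * tangent);
             vec3 b = cross(n, t);

             vec3 eyePos   = vec3(gl_ModelViewMatrix * gl_Vertex);
             vec3 lightDir = lightPos - eyePos;
             vec3 eyeDir   = -eyePos;

             // project the eye-space vectors onto the tangent-space basis
             t_lightDir = vec3(dot(lightDir, t), dot(lightDir, b), dot(lightDir, n));
             t_eyeDir   = vec3(dot(eyeDir,   t), dot(eyeDir,   b), dot(eyeDir,   n));

             m_texCoord  = gl_MultiTexCoord0.xy;
             gl_Position = ftransform();
         }

         // fragment shader: fetch the tangent-space normal from the map and do the lighting there
         uniform sampler2D normalMap;         // placeholder name
         varying vec3 t_lightDir;
         varying vec3 t_eyeDir;
         varying vec2 m_texCoord;

         void main()
         {
             vec3 normal   = normalize(texture2D(normalMap, m_texCoord).rgb * 2.0 - 1.0);
             float diffuse = max(dot(normal, normalize(t_lightDir)), 0.0);
             gl_FragColor  = vec4(vec3(diffuse), 1.0);
         }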
  14. Yeah, the simple alpha test in the shader gives quite good results, thanks.
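     Roughly what I ended up with, as a minimal sketch; treeTex, the texcoord varying and the 0.5 threshold are placeholders:
         // fragment shader: discard fragments whose alpha is below a threshold,
         // so the transparent parts of the tree cut-outs write no depth or color
         uniform sampler2D treeTex;           // placeholder name
         varying vec2 m_texCoord;

         void main()
         {
             vec4 color = texture2D(treeTex, m_texCoord);
             if (color.a < 0.5)               // placeholder threshold
                 discard;
             gl_FragColor = color;
         }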
  15. Hm, so this would be a solution:
     - disable blending, enable depth test
     - render the terrain
     - enable blending, disable depth test
     - render the trees ordered from far to near