Yhonatan

Member
  • Content Count: 72
  • Joined
  • Last visited

Community Reputation: 115 Neutral

About Yhonatan
  • Rank: Member
  1. Hey all, I've got a certain problem: I'm trying to draw a model using a VBO, but in two different windows. What I'm doing is this: I open two windows (Win API); for each I get its Device Context and create a Rendering Context (for OpenGL rendering!). Then, each time I receive WM_PAINT, I draw it this way:

         wglMakeCurrent(_mhDC, _mhRC);
         // ...drawing the scene...
         SwapBuffers(_mhDC);

     _mhDC is the current window's Device Context and _mhRC is the current window's Rendering Context. Everything works fine if I draw a triangle in each window, BUT if I use a VBO (even if it's only in one window and I don't draw anything in the other one), my program starts running, I see one window (the other one is covered behind it), and when I try to move the window, my program crashes. Two more things: if I open just one window and draw in it with the VBO, everything works fine; and if I draw the VBO only in the last-opened window, it works. If I change the order, it will crash. For example, if I open three windows, it must be window number three. Any ideas on what the problem could be and how I can solve it? Thanks in advance! [Edited by - Yhonatan on October 23, 2010 10:08:31 AM]
  2. Hey all, I've been programming in OpenGL for some time, and there are some things I wonder about. When we send vertices into the rendering pipeline in OpenGL, where does the real processing happen? Does the graphics library only transport these vertices to the graphics card, or does the graphics card do only half of the job while the library does the other half? Thanks in advance.
  3. Hey guys, assuming I've got a VBO that contains an entire terrain, and I want each of the triangles to have a different texture that blends together with its neighbors, how can I do it? Do I have to make a VBO for each triangle? Thanks in advance.
  4. Hey all, I'm currently working on an OpenGL program, and I want to add a text box to it. I've tried a few open-source GUIs, but none of them supports multiple languages (like Hebrew, for example), only English. So, after breaking my head, I decided to use MFC and searched for tutorials around the net. The only way I've found to combine the two is to use OpenGL inside MFC, and that's not what I want. I just want a simple text box inside my OpenGL window, not my OpenGL window inside MFC. Any ideas how to achieve this? Thanks in advance.
  5. Yhonatan

    Project Warfare

    Hey all, I'm currently working on a 3D first-person shooter game, and I would like to know what you guys think. I've made a long movie so it will present everything in the game; here is the movie (press "watch in high quality" :P). The game's name is Project Warfare, and I'm developing it in C++ using OpenGL and SDL. It currently contains 16,103 lines of code. I've written everything from scratch (that means I haven't used any engine other than my own). Here's a screenshot: http://img206.imageshack.us/img206/4762/screenvu9.jpg So what do you guys think? What should I change? Did you like it? How about the atmosphere in the game? Thanks
  6. Yhonatan

    Destroying buildings

    Thanks for the answers, I'll try and check it out.
  7. Hey all, I'm currently working on a 3D FPS game, and I want to implement a dynamic environment. So, assuming I have a building and a tank is driving through it, I want the building to break, but only in the place where the tank went into it. I really have no idea how to do this. The only idea I have in mind is creating a lot of meshes (for the building) and then destroying the ones near the tank when the collision occurs, but that wastes a lot of space and is not a very smart approach. Are there any other ways to do this? Thanks in advance.
  8. Yhonatan

    problem with normals

    Without looking at your code, I guess the problem is that you're calculating each face's normal and then applying it to the vertices of that face. That's how you get flat shading. In order to get smooth shading, you'll need to average the normals surrounding each vertex. Assuming you have a vertex that belongs to 4 faces, you compute the normal of each of those faces, sum them up, and then take the average (each n is the normal of a face adjacent to the vertex):

        n1, n2, n3, n4
        VertexNormal = (n1+n2+n3+n4)/4.0f;

    Remember to normalize the result afterwards, since the average of unit vectors is usually not unit length. I hope this helped.
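    The averaging step above can be sketched like this; `Vec3` and `vertexNormal` are hypothetical stand-ins for whatever vector type and naming your engine uses:

```cpp
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
};

// Average the normals of the faces surrounding a vertex, then
// renormalize, since the average of unit vectors is rarely unit length.
Vec3 vertexNormal(const std::vector<Vec3>& faceNormals) {
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (const Vec3& n : faceNormals) {
        sum.x += n.x;
        sum.y += n.y;
        sum.z += n.z;
    }
    float len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    // Degenerate case: opposing face normals cancelled out completely.
    if (len == 0.0f) return Vec3{0.0f, 0.0f, 1.0f};
    return Vec3{sum.x / len, sum.y / len, sum.z / len};
}
```

    Dividing by the face count and then normalizing, as in the post, gives the same direction; the normalization is what matters.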
  9. Yhonatan

    openGl transformation

    I'm not sure about this, but I've got a feeling you're doing the rotation wrong. Instead of calculating how each rotation affects the whole point, you rotate each individual coordinate (y, for example) and then go on to z, etc. What I think you should do is this:

        Point = rotation-of-a-point-around-y-axis(Point);
        Point = rotation-of-a-point-around-x-axis(Point);
        Point = rotation-of-a-point-around-z-axis(Point);

    because each rotation affects all the coordinates of a point. (Still, I haven't looked through the rotation code, but this seems to me like something you have to change anyway.)
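    The three steps above can be sketched as plain functions; `Point` and the angle conventions (radians, right-handed axes) are assumptions for illustration:

```cpp
#include <cmath>

struct Point { double x, y, z; };

// Rotate around the Y axis by `a` radians. The whole point goes in,
// because the rotation mixes the x and z coordinates.
Point rotateY(Point p, double a) {
    return {p.x * std::cos(a) + p.z * std::sin(a),
            p.y,
            -p.x * std::sin(a) + p.z * std::cos(a)};
}

// Rotate around the X axis: mixes y and z.
Point rotateX(Point p, double a) {
    return {p.x,
            p.y * std::cos(a) - p.z * std::sin(a),
            p.y * std::sin(a) + p.z * std::cos(a)};
}

// Rotate around the Z axis: mixes x and y.
Point rotateZ(Point p, double a) {
    return {p.x * std::cos(a) - p.y * std::sin(a),
            p.x * std::sin(a) + p.y * std::cos(a),
            p.z};
}
```

    Applying them in sequence (rotateZ(rotateX(rotateY(p, ry), rx), rz)) feeds each rotation the fully updated point, which is the fix being suggested.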
  10. Make sure that you have disabled lighting, enabled blending, and used the right blending function ( glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); ). If it still doesn't work, try changing the drawing order (e.g. if you've got a triangle and a quad, first draw the triangle and then the quad). I hope that helps :)
  11. Hey all, first of all I must say that I've already been trying for a couple of days to get shadows in my game, with no success. Finally, today I found a good tutorial, but it's not working for me.

      This is what I get: http://img367.imageshack.us/img367/9365/get2al7.jpg
      This is what I'm supposed to get (in the tutorial): http://img367.imageshack.us/img367/7820/supposefx8.jpg
      If I switch the shader order (second is first and first is second), this is what I get: http://img529.imageshack.us/img529/9580/getgm4.jpg

      I really have no idea what's going on, and I'm kind of desperate on this one. This is my rendering code:

          glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, g_frameBuffer );
          glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, g_shadowMapID, 0 );
          glFramebufferRenderbufferEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, g_depthRenderBuffer );
          glViewport( 0, 0, RENDERBUFFER_WIDTH, RENDERBUFFER_HEIGHT );
          glClearColor( 1.0f, 1.0f, 1.0f, 1.0f );
          glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
          glMatrixMode( GL_MODELVIEW );
          glLoadIdentity();
          g_light_pos[0] = Light[0];
          g_light_pos[1] = Light[1];
          g_light_pos[2] = Light[2];
          gluLookAt( g_light_pos[0], g_light_pos[1], g_light_pos[2], // Look from the light's position
                     0.0f, 0.0f, 0.0f,                               // Towards the teapot's position
                     0.0f, 1.0f, 0.0f );
          glGetFloatv( GL_MODELVIEW_MATRIX, g_lightproj_matrix );
          glMatrixMode( GL_TEXTURE );
          glLoadIdentity();
          glTranslatef( 0.5f, 0.5f, 0.5f ); // Offset
          glScalef( 0.5f, 0.5f, 0.5f );     // Bias
          gluPerspective( 45.0, (GLdouble)1024 / 768, 0.1, 1500.0 );
          glMultMatrixf( g_lightproj_matrix );
          glMatrixMode( GL_MODELVIEW );
          glUseProgramObjectARB( first );
          DrawEntireScene();
          glUseProgramObjectARB( 0 );

          glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, 0 );
          glViewport( 0, 0, 1024, 768 );
          glClearColor( 0.0f, 0.0f, 1.0f, 1.0f );
          glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
          glMatrixMode( GL_MODELVIEW );
          glLoadIdentity();
          gluLookAt( 0.0f, 0.0f, 0.0f, // Look from the light's position
                     0.0f, 0.0f, 0.0f, // Towards the teapot's position
                     0.0f, 1.0f, 0.0f );
          glUseProgramObjectARB( Second );
          glBindTexture( GL_TEXTURE_2D, g_shadowMapID );
          DrawEntireScene();

      As you can see, it looks really horrible, and it even changes while I move. This is my shader code:

      --First.Vertex--

          // We will need to send only one var to the fragment shader.
          varying vec3 vertPos;

          void main(void)
          {
              // Set and send output data.
              gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

              // This is just setting 'vertPos' to the vertex's position
              // in view space. This way, it will be easy to get the distance
              // between the light and the vert, as we assume the camera
              // is positioned at the light source.
              vertPos = vec4(gl_ModelViewMatrix * gl_Vertex).xyz;
          }

      --First.Fragment--

          varying vec3 vertPos;

          void main(void)
          {
              // The hardest part to understand is how we will store
              // the depth of the fragment. This will be done in kind of a
              // pseudo-bitshift fashion, i.e. the red, green and blue
              // components will each store 8 bits of the 24-bit depth.
              // NOTE: 16-bit or 32-bit can be done in this fashion too, with
              // no speed gain or loss, but 24 bits seems to be enough for
              // most depth buffers, so we will use that.

              // Firstly, we create the constant 'bit' vectors.
              // 'bitSh' is the bit shift, designed to 'shift' the depth value
              // so that only a certain part lies within the current 8-bit
              // component. The second is the bit mask, designed to cull all
              // numbers below the 8-bit component. Any value above will be
              // culled by the 'fract()' function.
              const vec3 bitSh  = vec3( 256*256, 256, 1 );
              const vec3 bitMsk = vec3( 0, 1.0/256.0, 1.0/256.0 );

              // We will need to calculate and store the distance between the
              // light and the fragment.
              // NOTE: This value needs to be normalized. This can usually be
              // done by dividing the length by the light's range, but for
              // simplicity here we divide it by 100, which is the 'zFar'
              // value set in 'gluPerspective()'.
              float dist = length(vertPos)/100.0;
              vec3 comp;

              // Now, we apply the bit shift; the commented code is what is
              // happening under the hood. If you don't understand bit
              // shifting or masking, just trust me on this. It will move a
              // different section of the 'distance' over the 0.0-1.0 range
              // of each component, so that a 24-bit float can be stored in
              // 3 * 8-bit floats.
              //comp.x = dist*(256*256);
              //comp.y = dist*256;
              //comp.z = dist;
              comp = dist*bitSh;

              // The next part (again, the commented code shows what is
              // happening) simply culls all unwanted 'bits'. All we want is a
              // value between 0.0 and 1.0, with a precision of 1.0/256.0.
              // If the precision were finer (1.0/512.0, for example) the
              // value might be rounded off, and if the value goes above 1.0
              // it will be clamped; either way we'd get a bad value.
              //comp.x = fract(comp.x);
              //comp.y = fract(comp.y) - comp.x/256;
              //comp.z = fract(comp.z) - comp.y/256;
              comp = fract(comp);
              comp -= comp.xxy*bitMsk;

              // For a simple one-texture projection spot light, this is
              // perhaps not the best method (GL_DEPTH_COMPONENT should
              // produce nicer results), but for cube shadow mapping, this is
              // the best way. It also removes the need for floating-point
              // buffers (GL_RGBA_32F) or single-component rendering
              // (GL_LUMINANCE), which in some cases is a plus for
              // compatibility.
              gl_FragColor = vec4(comp, 1);
          }

      --Second.Vertex--

          uniform mat4 lightMatrix;
          varying vec4 lightPos;
          varying vec4 lightProj;

          void main(void)
          {
              gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
              lightProj = gl_TextureMatrix[0] * gl_Vertex;
              lightPos  = gl_LightSource[1].position - gl_Vertex;
          }

      --Second.Fragment--

          uniform sampler2D shadowMap;
          varying vec4 lightPos;
          varying vec4 lightProj;

          // We use another bit-shift constant; notice that it's the
          // reciprocal of the bit-shift constant in the shadow.frag shader.
          const vec3 bitShifts = vec3( 1.0/(256.0*256.0), 1.0/256.0, 1 );

          void main(void)
          {
              // Firstly, we get the 2D coords of the shadow map,
              // and fetch its value.
              vec3 pos = lightProj.xyz/lightProj.w;
              vec4 shadmap = texture2D(shadowMap, pos.xy);
              //vec4 shadmap = texture2D(shadowMap, gl_TexCoord[0]);

              // Next we just have some constants. The bias is used to 'push'
              // the shadow back, to reduce 'Z-fighting' on the shadow maps.
              // The range is used to normalize the distance between the
              // fragment and the light (just like we did in shadow.frag;
              // notice the value 100).
              const float bias  = 0.1;
              const float range = 100.0;

              // We calculate the distance between the fragment and the
              // light, then apply the bias and normalize by the range.
              float lightLen  = length(lightPos);
              float lightDist = (lightLen - bias)/range;

              // Now we apply the bit shift and add all of the components
              // together; a dot product does this very cleanly. This is all
              // we need to do to convert the shadow map into a distance
              // value.
              float shad = dot(shadmap.xyz, bitShifts);

              // Perform the actual shadow map test. If shad is larger than
              // lightDist, then the current fragment is outside the shadow.
              float shadow = float(shad > lightDist);

              // NOTE: For show purposes, the shadow map is also drawn. The
              // only value we need is 'shadow'. This is just to show you how
              // freaky the unconverted shadow map looks.
              gl_FragColor = shadmap * shadow;
          }

      Does anyone know what I should do? Thanks in advance.
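    The trickiest part of the shaders above is the 24-bit depth packing, and that math can be sanity-checked on the CPU. This is a sketch of the pack/unpack round trip the two shaders perform, written in plain C++ with doubles standing in for GLSL floats (function names are mine, not the tutorial's):

```cpp
#include <cmath>

// fract() as GLSL defines it: x - floor(x).
static double fract(double x) { return x - std::floor(x); }

struct Rgb { double r, g, b; };

// Mirrors First.Fragment: spread a normalized depth in [0,1) across
// three channels using multiply (bitSh), fract, and mask (bitMsk).
Rgb packDepth(double d) {
    double x = fract(d * 256.0 * 256.0);
    double y = fract(d * 256.0) - x / 256.0;            // mask off bits x holds
    double z = fract(d) - fract(d * 256.0) / 256.0;     // mask off bits y holds
    return {x, y, z};
}

// Mirrors Second.Fragment: recombine with the reciprocal shifts
// (the dot product with bitShifts).
double unpackDepth(Rgb c) {
    return c.r / (256.0 * 256.0) + c.g / 256.0 + c.b;
}
```

    One thing worth noting: in the real texture each channel is quantized to 8 bits, and linear filtering interpolates the packed channels independently, which produces garbage distances; sampling the shadow map with GL_NEAREST is one common fix for artifacts like the screenshots above.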
  12. Yhonatan

    problem with shaders

    Hey again, sorry for flooding, but I found my problem in this shader: it is LIGHTDIR, and that's probably what's causing it. No matter what, even if I change the light position, it's not working; probably something is not right with that light dir. When I change the line that uses the light dir to this one:

        intensity = dot(vec3(0,0,1), normal);

    it works great! But no matter where I put the light position, it still looks the same.
  13. Yhonatan

    problem with shaders

    Quote: Original post by Chaotic_Attractor
    It probably worked in Shader Designer because it was using transformed normals. The normal vector needs to be transformed in the vertex shader. To do this, you will need to input the transpose of the inverse world matrix to your shader. Then, in the vertex shader, compute the transformed normal like this: normal = normalize(transInvWorld * gl_Normal); Note that if your world matrix is composed only of rotations, translations, and UNIFORM scalings (i.e. the same scaling factor for each axis), you don't need the inverse-transpose of the world matrix, since its rotational part is orthogonal (i.e. transpose(A) = inverse(A)). Also, HLSL isn't better than GLSL. The syntax of the two is almost the same. When compiled, both reduce the code to assembly-language instructions. The only real difference between the two is that HLSL is exclusively for DirectX while GLSL is only for OpenGL.

    Hey, thanks for your answer! It really helped me, and it turned out that this shader wasn't so good, because it also didn't work in Shader Designer (I made a mistake when I said it worked), so I'm using a different one (which looks a bit weird), and I transformed the normals this time. Anyway, I've got another question: in this shader, the new shader, and the per-pixel-light shader I've tried, changing my light position didn't make any effect or change on the colors (while the shader is on). Do you have any idea why this is happening?
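    The quoted advice about the inverse-transpose can be checked numerically: under a non-uniform scale, transforming the normal with the matrix itself breaks its perpendicularity to the surface, while the inverse-transpose preserves it. A minimal sketch under the assumption of a diagonal scale matrix, where the inverse-transpose is trivial to write down:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Apply a diagonal matrix diag(sx, sy, sz) to a vector. For a diagonal
// matrix, the inverse-transpose is simply diag(1/sx, 1/sy, 1/sz).
Vec3 scale(Vec3 v, double sx, double sy, double sz) {
    return {v.x * sx, v.y * sy, v.z * sz};
}

// Check whether a normal stays perpendicular to a transformed surface
// tangent. Tangents transform by the matrix itself; normals should
// transform by the inverse-transpose.
bool staysPerpendicular(Vec3 tangent, Vec3 normal,
                        double sx, double sy, double sz,
                        bool useInverseTranspose) {
    Vec3 t = scale(tangent, sx, sy, sz);
    Vec3 n = useInverseTranspose
                 ? scale(normal, 1.0 / sx, 1.0 / sy, 1.0 / sz)
                 : scale(normal, sx, sy, sz);
    return std::fabs(dot(t, n)) < 1e-9;
}
```

    With a uniform scale the two transforms differ only by a constant factor, which the normalize() call absorbs; that is exactly why the quote says uniform scalings don't need the inverse-transpose.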
  14. Yhonatan

    problem with shaders

    Quote: Original post by Chaotic_Attractor
    I am not very familiar with GLSL (I use HLSL), but there are two things that jump out at me, assuming your card does support dynamic branching: 1. Is your normal vector being transformed in the vertex shader? 2. Perhaps you need to use -lightDir in the dot product. I have run into this problem before; it can be a real pain to find this bug.

    1. I'm not sure what exactly gl_Normal returns. 2. I tried using -lightDir and it didn't work. Another strange thing is that I tried it out in Shader Designer and it worked great there. Maybe I'm setting up the light wrong or something? P.S. Is HLSL better than GLSL?
  15. Yhonatan

    problem with shaders

    Quote: Original post by VerMan
    Perhaps your graphics card does not support branching and conditions within shaders. It happened to me when I was developing something using an ATI X700.

    Hey, thanks for your answer, but I don't think that's the case. I've got a GeForce 9500 GT; it's not the best, but I'm pretty sure it supports it.