About vivendi

  1. Show an image of any size

    I'm developing for Android right now. Currently I use a cube and render a bitmap onto it as a texture. With a fairly small image (800x400) that works fine. Then I tried it with a large image (2400x1630) and for some reason it didn't show (I'll try to find out why later).

    But the actual problem is that the texture is stretched over the entire cube face. I'd like to map it in such a way that the image keeps its aspect ratio.

    These are my position and texture coordinate data for the cube:

    [code]
    final float[] cubePositionData = {
        // Front face
        -1.0f,  1.0f, 1.0f,
        -1.0f, -1.0f, 1.0f,
         1.0f,  1.0f, 1.0f,
        -1.0f, -1.0f, 1.0f,
         1.0f, -1.0f, 1.0f,
         1.0f,  1.0f, 1.0f
    };

    final float[] cubeTextureCoordinateData = {
        // Front face
        0.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 1.0f,
        1.0f, 0.0f
    };
    [/code]
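    To keep the aspect ratio on a square face, one option is to shrink the quad itself on one axis so its shape matches the image, rather than stretching 0..1 texture coordinates over the whole face. A minimal sketch of that calculation (the helper name and the assumption of a square face are mine, not from the post):

```c
#include <assert.h>

/* Shrink a unit quad so its shape matches the image's aspect ratio.
 * half_w/half_h are the quad half-extents to use for the face instead
 * of the fixed 1.0f in cubePositionData (hypothetical helper). */
void aspect_quad(float img_w, float img_h, float *half_w, float *half_h)
{
    if (img_w >= img_h) {          /* wide image: full width, reduced height */
        *half_w = 1.0f;
        *half_h = img_h / img_w;
    } else {                       /* tall image: full height, reduced width */
        *half_w = img_w / img_h;
        *half_h = 1.0f;
    }
}
```

    For an 800x400 image this gives half-extents of 1.0 x 0.5, so the front-face positions would use ±1.0 for x and ±0.5 for y.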
  2. I want to know if it's possible to simply display an image fullscreen. For example, I have two images: one is 1900x500 and the other 2440x940.

    I'm developing for mobile devices, so those resolutions are way too big compared to what most phones have right now. That's why I'd like to "resize" those images so they fit entirely on the screen while keeping their aspect ratio.

    But I have no idea how to do this. All I can find is how to convert an image to a texture and display that on two triangles. I assume it should be possible to simply display an image on the screen without needing a 3D object. If so, how can I do this?
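    One way to work out the on-screen size is to scale by the tighter of the two screen/image ratios; the resulting quad then fills the screen in one direction and letterboxes the other (even a "2D" blit usually ends up as two textured triangles under an orthographic projection on GL/GLES). A sketch with made-up names:

```c
#include <assert.h>

/* Compute the on-screen size of an image scaled to fit entirely on the
 * screen while keeping its aspect ratio (hypothetical helper). */
void fit_to_screen(float img_w, float img_h,
                   float scr_w, float scr_h,
                   float *out_w, float *out_h)
{
    float sx = scr_w / img_w;
    float sy = scr_h / img_h;
    float s = sx < sy ? sx : sy;   /* pick the tighter constraint */
    *out_w = img_w * s;
    *out_h = img_h * s;
}
```

    For 1900x500 on an 800x480 screen this yields roughly 800x211: full width, letterboxed height.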
  3. glClearColor alpha

    [quote name='swiftcoder' timestamp='1328108020' post='4908385'] Having used GLFW extensively for some years now (and reimplemented the entire toolkit once as part of a port), I've *never* seen any sign of support for offscreen rendering (it's sitting on the wish list, but AFAIK is unimplemented). Exactly what branch/fork of GLFW are you using, and what is the exact sequence of GLFW functions you are calling to obtain an offscreen context? [/quote]

    I'm using the glfw-2.7.2 win32 build. I'm just initializing GLFW, that's all (see code below). But I just read that some cards don't care about the pixel ownership test: "A so called pixel ownership test is part of the default OpenGL window buffer operation. The implementation of this test depends on the graphic card and OpenGL driver implementation."

    So maybe it's just my graphics card that allows it (GeForce GTX 560 Ti), while many other cards don't support it. Guess I'll have to test my application on a few other cards first before I continue with reading the back buffer.

    This is all I do. Then I render a model and use glReadPixels() to read it. I take the pixels I got from glReadPixels() and put them in another Photoshop-like application, which displays everything fine.

    [code]
    if (!glfwInit())
    {
        //MessageBoxA(NULL, "gh", "ghj", NULL);
        exit(0);
    }
    [/code]

    BTW, transparency didn't work after all. Everything got transparent, including my model xD
  4. glClearColor alpha

    [quote name='V-man' timestamp='1328035723' post='4908108'] According to http://content.gpwik...utorials:Basics you need to call glfwOpenWindow(int width, int height, int redbits, int greenbits, int bluebits, int alphabits, int depthbits, int stencilbits, int mode) and you can choose 8 for your alpha. If you don't have a window, then you obviously don't have a backbuffer. If you do have a window and it is not visible, then you have a pixel ownership problem = http://www.opengl.or...nership_Problem As for glColorMask, by default all the channels are enabled. [/quote]

    Well, like I said, GLFW doesn't require you to create a window; that's the main reason why I chose this library. I have no window and yet I can successfully read the back buffer with glReadPixels(). I've actually got it working now too. So thanks everyone for the help! I guess setting the BUFFER_BITS did the trick eventually.
  5. glClearColor alpha

    [quote name='Brother Bob' timestamp='1328025579' post='4908034'] It is not related to OpenGL, but to the windowing API you use. Where do you create the window and the rendering context? Somewhere you must say, or imply, what color format you want for your frame buffer. For example, in GLUT you need to specify the GLUT_ALPHA flag to the glutInitDisplayMode function, and in the Windows API you need to set the cAlphaBits field of the PIXELFORMATDESCRIPTOR structure you pass to SetPixelFormat. [/quote]

    Well, the thing is that I'm not creating any window... I'm using the GLFW library, which allows you to render to the back buffer without creating a window (unlike the GLUT lib). That was the whole reason I chose GLFW. I'm clearing my buffer now like this:

    [code]
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glClearColor(1.0f, 0.0f, 0.0f, 0.5f);
    [/code]

    Obviously, it's not working. Guess I'll have to check if GLFW lets me set the pixel format somehow, since I have no real HANDLE to a window, which is needed when setting the pixel format struct (if I understand it right).
  6. glClearColor alpha

    [quote name='Brother Bob' timestamp='1328024209' post='4908021'] If the alpha value is always coming out as fully opaque, then you don't have an alpha channel. Check the pixel format. [/quote]

    I can't really find anything useful about that. I've been looking through a lot of OpenGL APIs... Could you maybe point me in the right direction?
  7. glClearColor alpha

    [quote name='Waterlimon' timestamp='1328023695' post='4908017'] Try glClearColor(0.0f,0.0f,0.0f,1.0f); instead? [/quote]

    Ohh, I thought transparency was set to 0.0f, but I tried it with 1.0f. Still the same result though: no transparency. Any other ideas?
  8. I'm trying to use glClearColor with a fully transparent alpha, but for some reason it isn't working. I only render to the back buffer, so I never call anything like swapBuffers(). After that I read all the pixels from the back buffer with glReadPixels(). All the RGB values are stored correctly, but the alpha value is always 255.

    Every frame I clear the screen and then render a model. Shouldn't that leave me with a fully transparent screen except where my model is drawn? My code is basically like this:

    [code]
    void Render(void)
    {
        glClearColor(1.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        drawModel();

        glReadBuffer(GL_BACK); // probably not needed, but just in case
        glReadPixels(0, 0, (GLsizei)width, (GLsizei)height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }
    [/code]

    But the back color is always red, why not transparent? Is there a setting that I'm missing?
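    For reference, with an RGBA8 back buffer glReadPixels returns each channel as the float value scaled to 0..255, so clearing with alpha 0.0f should read back as 0; a constant 255 usually means the frame buffer was created without alpha bits, which is where the rest of the thread ends up. A tiny sketch of that conversion (hypothetical helper, not part of the post):

```c
#include <assert.h>

/* Convert one float clear-color channel to the byte glReadPixels would
 * return for an 8-bit channel (hypothetical helper). */
unsigned char channel_to_byte(float c)
{
    if (c < 0.0f) c = 0.0f;       /* clamp into the representable range */
    if (c > 1.0f) c = 1.0f;
    return (unsigned char)(c * 255.0f + 0.5f);
}
```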
  9. color per pixel

    I see, now I get it. It works excellently. Thanks a lot for your help!
  10. color per pixel

    Hi, thanks a lot for your help! It seems to work fine when I use the following:

    [code]
    float dp = (max(dot(normalize(EyePos), normalize(Normal)), 0.0) + MixRatio); // MixRatio = for strength
    float shade = max((dp - 0.866025) / (1.0 - 0.866025), 0.0);
    //float shade = dp - cos(30 * 3.14159 * 180);
    [/code]

    Although I wasn't sure what the second formula you posted was for? It didn't seem to give the desired effect. And I'm also not sure how you got to the number '0.866025'. It looks like when I change that number, it also changes the range of the falloff, kind of like what I'm doing with the MixRatio variable when I calculate the 'dp' variable.
  11. color per pixel

    It's working now. I had to replace EyeDir with EyePos. Just one other thing I'm struggling with: right now the gradient goes from all black at 0 degrees (a pixel that faces the camera) to fully transparent at 90 degrees (a pixel that isn't facing the camera), like I wanted. What if I want it to go from 0 degrees (fully black) to 30 degrees (100% transparent) instead of 90 degrees? I've tried some additions and subtractions on the dot product, but they didn't give the desired result. Does anyone have any idea?
  12. color per pixel

    [quote name='FXACE' timestamp='1327853293' post='4907352'] If it's rotated 45 degrees, should it contain 50% black and 50% transparent? If yes:

    [code]
    float shade = max(dot(normalize(EyeDir), normalize(Normal)), 0.0); // clamping because dot returns [-1.0..1.0] but mix expects a result in range [0.0..1.0]
    [/code]

    Is this what you want? Best wishes, FXACE. [/quote]

    Yes, that's exactly what I mean: when it's 45 degrees, the color should be 50% transparent. I tried your example, but it still paints my entire sphere black. I have no compile errors though, so the shader is fine; the calculation is somehow not correct. Any idea what the problem could be?
  13. This should be pretty basic stuff, but I just can't get my head around it. Basically what I want to do is check whether a pixel is facing the camera or not. If the pixel is directly facing the camera, then I want to color it completely black. When it's rotated 90 degrees, it should be fully transparent. I guess I have all the needed stuff in my vertex shader:

    [code]
    varying vec3 Normal;
    varying vec3 EyeDir;
    varying vec4 EyePos;

    void main(void)
    {
        gl_Position = ftransform();
        Normal = normalize(gl_NormalMatrix * gl_Normal);
        vec4 pos = gl_ModelViewMatrix * gl_Vertex;
        EyeDir = pos.xyz;
        EyePos = gl_ModelViewProjectionMatrix * gl_Vertex;
    }
    [/code]

    In my fragment shader I have two textures. One is just some random image and the other is completely black. I guess I have to "mix" them somehow, depending on the orientation of the pixel, but I have no idea how to calculate that. This is something I've tried, but it didn't work out as planned:

    [code]
    vec4 RefractionColor = texture2D(RefractionMap, index);

    // Apply the black material
    vec4 blackMat = texture2D(Black, gl_TexCoord[0].st);

    float shade = dot(normalize(EyeDir), normalize(Normal));

    // Blend the textures
    RefractionColor = mix(blackMat, RefractionColor, shade);
    gl_FragColor = RefractionColor;
    [/code]

    Does anyone have an idea how to do this?
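    For what it's worth, mix() is plain linear interpolation, and the dot product needs clamping into [0,1] before it is used as the mix factor (a negative dot would extrapolate past the black texture). A scalar sketch of both pieces, as hypothetical C helpers mirroring the GLSL built-ins:

```c
#include <assert.h>

/* Scalar equivalent of GLSL mix(a, b, t): linear interpolation. */
float mixf(float a, float b, float t)
{
    return a * (1.0f - t) + b * t;
}

/* Clamp a facing dot product into mix's expected [0,1] range
 * (dot of two unit vectors lies in [-1,1]). */
float facing_shade(float dp)
{
    return dp > 0.0f ? dp : 0.0f;
}
```

    With shade = 0 (facing the camera) the result is entirely the first argument (the black texture); with shade = 1 it is entirely the second.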
  14. using shader variables

    [quote name='zacaj' timestamp='1327525662' post='4906218'] Everything you're using in the vertex shader is just provided by OpenGL. As long as you're calling glVertexPointer etc. correctly, the vertex shader looks fine. You might want to start with a fragment shader that just says gl_FragColor=vec4(1) first, so you can be sure it's not a texturing problem (if you're having problems). Also, when you reply, please quote me so I get an email, or I may not respond for a while [/quote]

    Thanks, I've managed to get the shader I posted to work. But I've made things a bit more difficult for myself with an improved shader. I now need to pass a vec3 CameraPos and a mat4 ModelWorld4x4 to my shader. I think I did the CameraPos part right, but I don't know how to obtain the ModelWorld part. This is what I have so far:

    [code]
    GLuint front, back, campos, modelworld;

    glGenTextures(1, &front);
    glBindTexture(GL_TEXTURE_2D, front);
    int success = glfwLoadTexture2D("C:\\models\\reflect1.tga", GLFW_BUILD_MIPMAPS_BIT);

    glGenTextures(1, &back);
    glBindTexture(GL_TEXTURE_2D, back);
    success = glfwLoadTexture2D("C:\\models\\reflect2.tga", GLFW_BUILD_MIPMAPS_BIT);

    campos = glGetUniformLocation(program, "CameraPos");
    glUniform3f(campos, 0.0f, 0.0f, 3.0f);
    [/code]

    I got the campos values from my gluLookAt call: gluLookAt([b]0.0f, 0.0f, 3.0f[/b], 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);

    Any idea how to get the ModelWorld4x4? For the record, this is the vertex shader:

    [code]
    uniform vec3 CameraPos;
    uniform mat4 ModelWorld4x4;

    varying vec3 R;

    mat3 GetLinearPart( mat4 m )
    {
        mat3 result;
        result[0][0] = m[0][0];
        result[0][1] = m[0][1];
        result[0][2] = m[0][2];
        result[1][0] = m[1][0];
        result[1][1] = m[1][1];
        result[1][2] = m[1][2];
        result[2][0] = m[2][0];
        result[2][1] = m[2][1];
        result[2][2] = m[2][2];
        return result;
    }

    void main()
    {
        gl_Position = ftransform();

        mat3 ModelWorld3x3 = GetLinearPart( ModelWorld4x4 );

        // find world space position.
        vec4 WorldPos = ModelWorld4x4 * gl_Vertex;

        // find world space normal.
        vec3 N = normalize( ModelWorld3x3 * gl_Normal );

        // find world space eye vector.
        vec3 E = normalize( WorldPos.xyz - CameraPos );

        // calculate the reflection vector in world space.
        R = reflect( E, N );
    }
    [/code]
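    One common way to obtain the matrix in fixed-function code is glGetFloatv(GL_MODELVIEW_MATRIX, m) followed by glUniformMatrix4fv (OpenGL matrices are column-major). The shader's GetLinearPart simply copies the upper-left 3x3 block; here is the same extraction in C for a column-major 4x4, as a sketch with my own names:

```c
#include <assert.h>

/* Copy the upper-left 3x3 (rotation/scale) block of a column-major 4x4
 * matrix, mirroring the shader's GetLinearPart (hypothetical helper). */
void linear_part(const float m4[16], float m3[9])
{
    for (int col = 0; col < 3; ++col)
        for (int row = 0; row < 3; ++row)
            m3[col * 3 + row] = m4[col * 4 + row];
}
```

    Note that glGetFloatv gives you model*view, not model-to-world; if the shader really needs world space, the model matrix has to be tracked separately in application code.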
  15. I want to apply a shader, just for learning purposes, but I'm not sure how to pass data from my code to my shader. I have the following shader, used for spherical environment mapping:

    [b]Vertex shader[/b]
    [code]
    void main()
    {
        gl_Position = ftransform();
        gl_TexCoord[0] = gl_MultiTexCoord0;

        vec3 u = normalize( vec3(gl_ModelViewMatrix * gl_Vertex) );
        vec3 n = normalize( gl_NormalMatrix * gl_Normal );
        vec3 r = reflect( u, n );

        float m = 2.0 * sqrt( r.x*r.x + r.y*r.y + (r.z+1.0)*(r.z+1.0) );
        gl_TexCoord[1].s = r.x/m + 0.5;
        gl_TexCoord[1].t = r.y/m + 0.5;
    }
    [/code]

    [b]Pixel shader[/b]
    [code]
    uniform sampler2D colorMap;
    uniform sampler2D envMap;

    void main (void)
    {
        vec4 color = texture2D( colorMap, gl_TexCoord[0].st );
        vec4 env = texture2D( envMap, gl_TexCoord[1].st );
        gl_FragColor = color + env * 0.4;
    }
    [/code]

    How exactly do I pass a sampler2D variable to this shader? And I'm also not sure whether the vertex shader needs to receive data from my code as well... Anyone willing to help me out on this one?