

Member Since 02 Mar 2002
Offline Last Active Feb 06 2013 09:01 AM

#4904275 Rendering textures from obj file

Posted by on 19 January 2012 - 08:43 AM

This is a very common question, so it went into the FAQ.

#4903180 Cubemap problems

Posted by on 16 January 2012 - 04:31 AM

gl_Position = modelview * projection * in_Vertex;

Normally people do:
gl_Position = projection * modelview * in_Vertex;
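Matrix multiplication is not commutative, so swapping the order gives a different clip-space result. A minimal sketch of why, using NumPy as a stand-in for GLSL matrix math (the modelview and projection matrices below are made-up examples, not taken from the original post):

```python
import numpy as np

# Hypothetical modelview: translate 5 units along -Z.
modelview = np.eye(4)
modelview[2, 3] = -5.0

# Hypothetical perspective projection (near = 1, far = 10).
n, f = 1.0, 10.0
projection = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, (f + n) / (n - f), 2.0 * f * n / (n - f)],
    [0.0, 0.0, -1.0, 0.0],
])

vertex = np.array([0.0, 0.0, 0.0, 1.0])  # a point at the origin

correct = projection @ modelview @ vertex  # projection applied last
wrong = modelview @ projection @ vertex    # order swapped

print(correct)  # valid clip-space position (w = 5)
print(wrong)    # garbage: w ends up 0 here
```

The correct order transforms the vertex into eye space first, then projects it; the swapped order projects a point that was never moved in front of the camera.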

#4902984 glDrawArrays segfault, beginner confused

Posted by on 15 January 2012 - 11:37 AM

Am I being stupid? How can GLee report that something's not supported where GL reports that it is?

I think GLee is no longer maintained. Try GLEW or GL3W instead.

#4902626 16 Bit Non-Clamped Texture

Posted by on 14 January 2012 - 05:56 AM

If you actually want SHORT, such integer formats are supported in GL 3. Have a look at the spec. All SM 4 GPUs support integer samplers.

#4900316 The final word on stencil buffer and FBOs

Posted by on 07 January 2012 - 06:04 AM

I don't think any GPU supports a combined color buffer and stencil buffer. Most GPUs support the D24S8 format (depth = 24 bits + stencil = 8 bits = 32 bits total).
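A sketch of how a D24S8 texel lays out its 32 bits — depth in the high 24 bits, stencil in the low 8. The pack/unpack helpers here are illustrative, not a GL API:

```python
def pack_d24s8(depth, stencil):
    """Pack a normalized depth value in [0, 1] and an 8-bit stencil
    value into one 32-bit word (depth high 24 bits, stencil low 8)."""
    d24 = int(round(depth * 0xFFFFFF)) & 0xFFFFFF
    return (d24 << 8) | (stencil & 0xFF)

def unpack_d24s8(word):
    """Recover (depth, stencil) from a packed 32-bit D24S8 word."""
    depth = (word >> 8) / 0xFFFFFF
    stencil = word & 0xFF
    return depth, stencil

# depth = 1.0 and stencil = 0xAB fill all 32 bits:
print(hex(pack_d24s8(1.0, 0xAB)))  # -> 0xffffffab
```

This is why the pair fits in a single 32-bit attachment: 24 + 8 = 32, with no room left for color.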

#4900254 GLSL - Reading Textures

Posted by on 06 January 2012 - 02:49 AM

It depends on the GL specification, and the texture format.

For the glTexImage2D portion, it says that if you use a luminance format, GL replicates the value into RGB and sets alpha to 1.
So that's what happens in GLSL as well. In other words, that's how the GPU is wired.
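The expansion rule the spec describes can be sketched in plain Python (an illustrative helper, not a GL call):

```python
def fetch_luminance_texel(l):
    """What a shader sees when it samples a GL_LUMINANCE texel with
    value l: the luminance is replicated into R, G, and B, and alpha
    is set to 1. Sketch of the spec's rule, not an actual GL fetch."""
    return (l, l, l, 1.0)

print(fetch_luminance_texel(0.5))  # -> (0.5, 0.5, 0.5, 1.0)
```

So a texture2D() lookup on a luminance texture never returns the raw single channel; the expansion happens before the shader ever sees the value.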

#4899992 Multitexturing - Overwriting transparent areas with RGB

Posted by on 05 January 2012 - 10:39 AM

You can create an old context if you want and still use old GL.

#4898203 10 Bit textures and bit-shifting on the GPU

Posted by on 30 December 2011 - 12:15 PM

V-man - Useful to know that the format exists... I'm curious though... Is there a reason that that particular internal format is not mentioned on the following page:


It depends on the GL version. That page is part of http://www.opengl.org/sdk/docs/man/xhtml/
which documents GL 2.1 (it says so at the top).

You might want to use GL 3.3 http://www.opengl.org/sdk/docs/man3/
or GL 4.2 http://www.opengl.org/sdk/docs/man4/

and the spec files

#4897644 10 Bit textures and bit-shifting on the GPU

Posted by on 28 December 2011 - 05:08 PM

Why don't you use a floating point format like GL_RG16F or GL_RG32F?

#4896517 (Help)My matrices give me a blank screen!

Posted by on 22 December 2011 - 07:52 AM

You should not be able to use the in qualifier on a uniform. The in qualifier is only for attributes (in vec4 in_Position) and varyings (in vec4 mid_Colour), so it is possible that the driver is buggy. Is it AMD or nVidia?

#4888742 What to use if both GL 2.0 and ARB shader available?

Posted by on 29 November 2011 - 06:05 AM

This is why there is a FAQ

and more specifically

And yes, you can ask users to buy themselves a GL 2.0 or above card.
There are a lot of Intel GPUs out there that could handle GL 2.0 just fine, but Intel doesn't write proper drivers for them. They are limited to GL 1.4 or 1.5, I think, and are extremely buggy. You'd be better off with Direct3D if you want to support Intel.

#4885302 Texture Colors Not exact as Texture

Posted by on 18 November 2011 - 06:58 AM

From the Wiki

and also

#4882991 Why glEnable(GL_LIGHTING) is raising GL_INVALID_ENUM?

Posted by on 11 November 2011 - 11:39 AM

Perhaps you are using a core profile (glutInitContextProfile(GLUT_CORE_PROFILE)),
and that means the old fixed-function GL is dead and gone. GL_LIGHTING is not a valid argument for glEnable there, so it raises that error.

I suggest you use glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE) or just create a GL 2.1 context.

#4882974 2 GLSL questions

Posted by on 11 November 2011 - 10:51 AM

You didn't even read the question, did you?

Of course I did. I also wrote that Wiki page. glClientActiveTexture does the job, followed by a glEnableClientState(GL_TEXTURE_COORD_ARRAY) and then a call to glTexCoordPointer.

Did you understand his question?

#4881449 ARB_vertex_program on ATI graphics card

Posted by on 07 November 2011 - 10:38 AM

anything i can try to get it running on an ATI card? i'm going to try it on an nVidia card tonight but sadly i have to work on this ATI one!!!:(

ATI is known for strict syntax requirements; nVidia, for once, has a less strict interpretation of GLSL. That often results in shaders running flawlessly on nVidia cards but not running at all on ATI. Always check for errors when compiling your shaders and log them; the error messages contain detailed information about the type and location of syntax errors.

He isn't using GLSL. He is using the 8-year-old ASM extension.