V-man

Member Since 02 Mar 2002
Offline Last Active Feb 06 2013 09:01 AM

#4904275 Rendering textures from obj file

Posted by V-man on 19 January 2012 - 08:43 AM

It is a very common question, so it went into the FAQ:
http://www.opengl.org/wiki/FAQ#Multi_indexed_rendering
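
In short: an OBJ file indexes positions, texcoords, and normals separately, but glDrawElements takes a single index per vertex, so you have to expand the data into one vertex per unique index combination. A minimal sketch of that de-indexing, assuming the OBJ data is already parsed into flat arrays (all names here are made up for illustration):

#include <map>
#include <utility>
#include <vector>

struct Vertex { float px, py, pz; float u, v; };

// objIndices holds one (positionIndex, texcoordIndex) pair per face corner.
void deindex(const std::vector<float>& positions,                  // 3 floats per position
             const std::vector<float>& texcoords,                  // 2 floats per texcoord
             const std::vector<std::pair<int,int> >& objIndices,
             std::vector<Vertex>& outVertices,
             std::vector<unsigned>& outIndices)
{
    std::map<std::pair<int,int>, unsigned> cache;   // (pos, uv) pair -> GL vertex index
    for (const auto& idx : objIndices) {
        auto it = cache.find(idx);
        if (it == cache.end()) {
            Vertex v = { positions[3*idx.first],  positions[3*idx.first+1],
                         positions[3*idx.first+2],
                         texcoords[2*idx.second], texcoords[2*idx.second+1] };
            it = cache.insert(std::make_pair(idx, (unsigned)outVertices.size())).first;
            outVertices.push_back(v);
        }
        outIndices.push_back(it->second);   // single index stream for glDrawElements
    }
}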


#4903180 Cubemap problems

Posted by V-man on 16 January 2012 - 04:31 AM

gl_Position = modelview * projection * in_Vertex;


That multiplication order is backwards. Normally people do
gl_Position = projection * modelview * in_Vertex;

Matrix multiplication is not commutative: the vertex has to be transformed by the modelview matrix first, then by the projection matrix.


#4902984 glDrawArrays segfault, beginner confused

Posted by V-man on 15 January 2012 - 11:37 AM

Am I being stupid? How can GLee report that something's not supported when GL reports that it is?

I think GLee is no longer maintained. Try GLEW or GL3W:
http://www.opengl.org/wiki/OpenGL_Loading_Library
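
For reference, switching to GLEW is just a header and one init call. A minimal sketch, assuming a GL context has already been created and made current:

#include <GL/glew.h>   // include before any gl.h

// Call once after context creation; returns false if loading fails.
bool initLoader()
{
    glewExperimental = GL_TRUE;   // needed on some drivers for core profiles
    GLenum err = glewInit();
    if (err != GLEW_OK)
        return false;             // glewGetErrorString(err) has the details
    // Extension flags like GLEW_ARB_vertex_buffer_object are now usable.
    return GLEW_VERSION_2_0 != 0; // e.g. require at least GL 2.0
}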


#4902626 16 Bit Non-Clamped Texture

Posted by V-man on 14 January 2012 - 05:56 AM

If you actually want SHORT data, such integer formats are supported in GL 3. Have a look at the spec. All SM 4 GPUs support integer samplers.
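
A minimal sketch of what that looks like in GL 3, assuming width, height and a buffer of raw 16-bit values called pixels (all placeholder names):

// Upload unnormalized 16-bit unsigned integer data; values are never clamped to [0, 1].
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // integer textures can't be filtered
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, width, height, 0,
             GL_RED_INTEGER, GL_UNSIGNED_SHORT, pixels);
// In GLSL 1.30+, sample it with an integer sampler:
//   uniform usampler2D tex;
//   uint value = texture(tex, uv).r;   // full 0..65535 range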


#4900316 The final word on stencil buffer and FBOs

Posted by V-man on 07 January 2012 - 06:04 AM

I don't think any GPU supports a standalone stencil buffer alongside a color buffer. Most GPUs support the packed D24S8 format (depth = 24 bits + stencil = 8 bits = 32 bits total):
http://www.opengl.org/wiki/Framebuffer_Object_Examples#Stencil
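
Along the lines of that wiki example, a minimal sketch of attaching a packed depth-stencil renderbuffer to an FBO (width, height and the error handling are placeholders):

GLuint fbo, color, depthStencil;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// Color renderbuffer.
glGenRenderbuffers(1, &color);
glBindRenderbuffer(GL_RENDERBUFFER, color);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, color);

// 24-bit depth and 8-bit stencil packed into one renderbuffer.
glGenRenderbuffers(1, &depthStencil);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencil);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                          GL_RENDERBUFFER, depthStencil);

// Always verify completeness before rendering.
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    { /* handle the error */ }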


#4900254 GLSL - Reading Textures

Posted by V-man on 06 January 2012 - 02:49 AM

It depends on the GL specification and the texture format.

Example:
http://www.opengl.org/sdk/docs/man/
In the glTexImage2D section, it says that if you use a luminance format, GL replicates the value into RGB and sets alpha to 1. So that's what happens in GLSL as well. In other words, that's how the GPU is wired.
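
A hypothetical snippet to illustrate (width, height, pixels, tex and uv are made-up names):

// Each texel is one byte L, but the sampler still returns four components.
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);
// In the shader:
//   vec4 c = texture2D(tex, uv);   // c.r == c.g == c.b == L, c.a == 1.0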


#4899992 Multitexturing - Overwriting transparent areas with RGB

Posted by V-man on 05 January 2012 - 10:39 AM

You can create an old context if you want and still use old GL.
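
For example, with freeglut you can explicitly request an old version (a minimal sketch; GL 2.1 here is just an example):

#include <GL/freeglut.h>

static void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitContextVersion(2, 1);   // explicitly request old GL 2.1
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("legacy GL");
    glutDisplayFunc(display);
    // Fixed-function calls (glMatrixMode, glBegin/glEnd, GL_LIGHTING, ...) all work here.
    glutMainLoop();
    return 0;
}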


#4898203 10 Bit textures and bit-shifting on the GPU

Posted by V-man on 30 December 2011 - 12:15 PM

V-man - Useful to know that the format exists... I'm curious though... Is there a reason that that particular internal format is not mentioned on the following page:

http://www.opengl.or...lTexImage3D.xml


It depends on the GL version. That page is part of http://www.opengl.org/sdk/docs/man/xhtml/,
which says GL 2.1 at the top.

You might want to use GL 3.3 http://www.opengl.org/sdk/docs/man3/
or GL 4.2 http://www.opengl.org/sdk/docs/man4/

and the spec files
http://www.opengl.org/documentation/specs/
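
For completeness, a minimal sketch of creating the 10-bit format in question (width, height and pixels are placeholders):

// 10 bits each for R, G, B plus 2 bits for A, packed into 32-bit texels.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
             GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels);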


#4897644 10 Bit textures and bit-shifting on the GPU

Posted by V-man on 28 December 2011 - 05:08 PM

Why don't you use a floating point format like GL_RG16F or GL_RG32F?
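
A minimal sketch, assuming the source data is already float (placeholder names again):

// Two half-float channels per texel; sampled values are not clamped to [0, 1].
glTexImage2D(GL_TEXTURE_2D, 0, GL_RG16F, width, height, 0,
             GL_RG, GL_FLOAT, pixels);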


#4896517 (Help)My matrices give me a blank screen!

Posted by V-man on 22 December 2011 - 07:52 AM

You should not be able to use the in qualifier on a uniform. The in qualifier is only for attributes (in vec4 in_Position) and varyings (in vec4 mid_Colour), so it is possible that the driver is shitty. Is it AMD or nVidia?
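
For reference, this is how the qualifiers are normally split in a GLSL 1.30 vertex shader (a sketch with made-up identifiers, shown as a C string):

const char* vertexSrc = R"(
    #version 130
    uniform mat4 mvp;      // uniforms take no in/out qualifier
    in  vec4 in_Position;  // in  = per-vertex attribute
    in  vec4 in_Colour;
    out vec4 mid_Colour;   // out = varying, interpolated to the fragment shader

    void main()
    {
        mid_Colour  = in_Colour;
        gl_Position = mvp * in_Position;
    }
)";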


#4888742 What to use if both GL 2.0 and ARB shader available?

Posted by V-man on 29 November 2011 - 06:05 AM

This is why there is a FAQ
http://www.opengl.org/wiki/FAQ

and more specifically
http://www.opengl.or...k_On_Windows.3F
http://www.opengl.or...tting_Functions
http://www.opengl.or...Loading_Library

And yes, you can ask users to buy themselves a GL 2.0 or better card.
There is a lot of Intel hardware out there that could handle GL 2.0 just fine, but Intel doesn't write proper drivers for it. The drivers are limited to GL 1.4 or 1.5, I think, and are extremely buggy. You'd better go with Direct3D if you want to support Intel.
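
At runtime you can check what the driver actually gives you before relying on GL 2.0 features. A minimal sketch (assumes a current GL context and that the GL headers/loader are already set up):

#include <cstdio>

// Returns true if the driver reports OpenGL 2.0 or later.
bool hasGL20()
{
    const char* ver = (const char*)glGetString(GL_VERSION); // e.g. "1.5.0 - Build ..."
    int major = 0, minor = 0;
    if (!ver || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return false;
    return major >= 2;
}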


#4885302 Texture Colors Not exact as Texture

Posted by V-man on 18 November 2011 - 06:58 AM

From the Wiki
http://www.opengl.org/wiki/Common_Mistakes#GL_TEXTURE_MAG_FILTER

and also
http://www.opengl.org/wiki/Common_Mistakes#Automatic_mipmap_generation
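
The gist of those two wiki entries as code (a sketch; assumes a bound GL_TEXTURE_2D and a GL 3.0 context for glGenerateMipmap):

// The MAG filter has only two valid choices; GL_*_MIPMAP_* settings are MIN-filter only.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

// That MIN filter requires mipmaps to exist, otherwise the texture is incomplete.
// Generate them after uploading level 0:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glGenerateMipmap(GL_TEXTURE_2D);   // GL 3.0 / ARB_framebuffer_object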


#4882991 Why glEnable(GL_LIGHTING) is raising GL_INVALID_ENUM?

Posted by V-man on 11 November 2011 - 11:39 AM

Perhaps you are using a core profile (glutInitContextProfile(GLUT_CORE_PROFILE)),
which means the old fixed-function GL is dead and gone. GL_LIGHTING is not valid for glEnable there, so it raises that error.

I suggest you use glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE) or just create a GL 2.1 context.


#4882974 2 GLSL questions

Posted by V-man on 11 November 2011 - 10:51 AM

You didn't even read the question, did you?


Of course I did. I also wrote that Wiki page. glClientActiveTexture does the job, followed by glEnableClientState(GL_TEXTURE_COORD_ARRAY) and then a call to glTexCoordPointer.

Did you understand his question?
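
Concretely, the sequence described above looks like this for texture unit 1 (placeholder data):

// Supply a second set of texture coordinates via the fixed-function vertex array API.
static const GLfloat texcoords1[] = { 0,0,  1,0,  1,1,  0,1 };   // 2 floats per vertex

glClientActiveTexture(GL_TEXTURE1);              // selects the unit for *Pointer calls
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords1);

// Unit 0 is set up the same way after glClientActiveTexture(GL_TEXTURE0).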


#4881449 ARB_vertex_program on ATI graphics card

Posted by V-man on 07 November 2011 - 10:38 AM


Anything I can try to get it running on an ATI card? I'm going to try it on an nVidia card tonight, but sadly I have to work on this ATI one!!! :(

ATI is known for strict syntax requirements, while nVidia has a less strict interpretation of GLSL. That often results in shaders running flawlessly on nVidia cards but not running at all on ATI. Always check for errors when compiling your shaders and log them; the error messages contain detailed information about the type and location of syntax errors.


He isn't using GLSL. He is using the 8-year-old ASM extension (ARB_vertex_program).
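
Since it's the ASM extension, errors are reported differently from GLSL's info log. A minimal sketch of loading an ARB_vertex_program and checking for syntax errors (assumes the entry points are loaded, e.g. via GLEW; the source string is supplied by the caller):

#include <cstdio>
#include <cstring>

// Load an ARB_vertex_program assembly string and report syntax errors.
void loadProgram(const char* src)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(src), src);

    GLint errPos;
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errPos);
    if (errPos != -1)   // -1 means no error
        printf("error at char %d: %s\n", errPos,
               glGetString(GL_PROGRAM_ERROR_STRING_ARB));
}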



