Glass_Knife

GLSL and GLU Teapot


Recommended Posts

If I want to draw a teapot or sphere from the GLUT/GLU library, but I want to use a vertex and fragment shader, I don't understand how this works.  So far, everything I've done with a vertex shader has defined a layout at the top:

layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;

void main() {

   ... and so on...

Then I use a vertex array bound to 0 or 1 to pass the data to the shader.  What I don't understand is: if I draw a teapot with glutWireTeapot(),

what do I define in the shader as the input vector?

Edited by Glass_Knife
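For anyone landing here later, the root of the mismatch is that those layout locations are fed by buffers you fill and wire up yourself, which glutWireTeapot() never does. A minimal sketch of packing interleaved position/color data for locations 0 and 1 (the GL calls in the comments are what a GL3 context would use; the triangle data here is made up for illustration):

```java
import java.nio.FloatBuffer;

public class InterleavedVertexData {
    /** Packs xyzw positions and rgba colors into one interleaved buffer. */
    public static FloatBuffer interleave(float[] positions, float[] colors) {
        int vertexCount = positions.length / 4;
        FloatBuffer buf = FloatBuffer.allocate(vertexCount * 8);
        for (int v = 0; v < vertexCount; v++) {
            buf.put(positions, v * 4, 4); // feeds layout(location = 0) in vec4 position
            buf.put(colors, v * 4, 4);    // feeds layout(location = 1) in vec4 color
        }
        buf.flip();
        return buf;
    }

    public static void main(String[] args) {
        float[] pos = { -0.5f, -0.5f, 0f, 1f,   0.5f, -0.5f, 0f, 1f,   0f, 0.5f, 0f, 1f };
        float[] col = {  1f, 0f, 0f, 1f,        0f, 1f, 0f, 1f,        0f, 0f, 1f, 1f };
        FloatBuffer buf = interleave(pos, col);

        // With a GL3 context this buffer would be uploaded and described roughly as:
        //   gl.glBufferData(GL.GL_ARRAY_BUFFER, buf.capacity() * 4L, buf, GL.GL_STATIC_DRAW);
        //   gl.glVertexAttribPointer(0, 4, GL.GL_FLOAT, false, 32, 0);   // position, 32-byte stride
        //   gl.glVertexAttribPointer(1, 4, GL.GL_FLOAT, false, 32, 16);  // color, 16-byte offset
        System.out.println(buf.capacity()); // 24 floats = 3 vertices * 8 floats each
    }
}
```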


I'm pretty sure GLUT uses the fixed-function pipeline to do that. I don't think you'll get vertex buffers out of it.

 

It's pretty old (though freeGLUT is more up to date), and most people I've seen use it for glutSwapBuffers only.


That's what I thought, but I saw a tutorial that was using the GLU sphere.  I know I saw a tutorial about shaders using the teapot while I was frantically googling, but I may have misunderstood.

The GLUT teapot is actually the Utah teapot, which was originally defined as a set of Bézier patches. There are various versions around on the net, both as Bézier patches and as triangle/quad meshes. You could use a triangle/quad version for a simple implementation using just vertex and fragment shaders, and later a Bézier patch version to try out tessellation shaders.
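If you go the precomputed-mesh route, the patch evaluation is simple enough to do once on the CPU and then upload as triangles. A minimal sketch in plain Java (the stand-in control points here are made up for illustration, not the published Utah teapot patch data):

```java
public class BezierPatch {
    /** Evaluates a bicubic Bezier patch at (u, v) using the Bernstein basis.
     *  ctrl is a 4x4 grid of control points, each a float[3]. */
    public static float[] evaluate(float[][][] ctrl, float u, float v) {
        float[] bu = bernstein(u), bv = bernstein(v);
        float[] p = new float[3];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 3; k++)
                    p[k] += bu[i] * bv[j] * ctrl[i][j][k];
        return p;
    }

    /** Cubic Bernstein basis functions at parameter t. */
    static float[] bernstein(float t) {
        float s = 1 - t;
        return new float[]{ s * s * s, 3 * s * s * t, 3 * s * t * t, t * t * t };
    }

    public static void main(String[] args) {
        // Flat 4x4 grid as stand-in control points (the real teapot has 32 patches).
        float[][][] ctrl = new float[4][4][3];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                ctrl[i][j] = new float[]{ i / 3f, j / 3f, 0f };

        int n = 8; // sample an (n+1) x (n+1) grid; adjacent samples become triangles
        for (int a = 0; a <= n; a++)
            for (int b = 0; b <= n; b++)
                evaluate(ctrl, (float) a / n, (float) b / n);
        System.out.println("evaluated " + (n + 1) * (n + 1) + " vertices"); // 81
    }
}
```

The resulting vertex grid would then go into a buffer exactly like any other mesh, so the shaders never know it started life as patches.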


For anyone in the future, part of my problem is that I am using JOGL, and the different profiles allow for the GL2 family of interfaces, which includes both the old fixed-function pipeline and the newer shader stuff.

 

I have been mixing these technologies.  I am not sure if using a newer profile will break some of the code I already have working, so more testing is needed.  And to make matters worse, I tried working on this stuff on my Mac last night (OS X 10.9) and none of the shaders compile, because the profile I've been using on Windows and Linux isn't working on the Mac, even though the internet makes it sound like it should.

 

Ahh, programming, I was beginning to think it would be easy.  Thanks for not letting me down. 


Ahh yes, they separate functions by the version of OpenGL in which they were created, so you'll be calling GL11 code even if you are using, say, a 3.3 core profile (for glEnable, glGet, and stuff like that).

 

AFAIK, OSX supports core profiles only. 3.2 specifically. I think the newer releases support an OpenGL 4 core profile (4.0 or 4.1 I think).

 

So no mix and match there. I'm not even sure you can load extensions.



AFAIK, OSX supports core profiles only. 3.2 specifically. I think the newer releases support an OpenGL 4 core profile (4.0 or 4.1 I think).

 

This is one of the problems that I am having using JOGL.  There really isn't any documentation about this stuff, and looking at Objective-C code for the iPad doesn't really help me (or if it does, I don't get it).  I did find the docs for the newest OS, and it said it used one of the '4's, but I guess that isn't enough info if you don't really understand the profile stuff.


So now that I've figured out the compatibility/core profile stuff, I tried to get my shaders working, only to discover that whatever books and tutorials I've been using rely on syntax that doesn't seem to be core syntax.  This stuff is really silly.

 

I've been working through tutorials with a #version 330 without "really" understanding what that means.


You should specify #version 330 core. I think that way it tells you if you're using deprecated functions (if you query the infoLog of the individual shaders and the shader program, and if the driver vendor is nice enough to report detailed info, which I hear isn't always the case).
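For reference, here is a complete minimal pair of shaders that a strict core context should accept, held as Java strings since that is how JOGL programs typically embed them. The variable names are my own invention; the essential points are that #version 330 core is the very first line and that the fragment shader writes to a declared out variable instead of the deprecated gl_FragColor:

```java
public class CoreShaderSources {
    static final String VERTEX =
        "#version 330 core\n" +               // must be the first line of the source
        "layout(location = 0) in vec4 position;\n" +
        "layout(location = 1) in vec4 color;\n" +
        "smooth out vec4 theColor;\n" +
        "void main() {\n" +
        "    gl_Position = position;\n" +
        "    theColor = color;\n" +
        "}\n";

    static final String FRAGMENT =
        "#version 330 core\n" +
        "smooth in vec4 theColor;\n" +
        "out vec4 outputColor;\n" +           // core profile has no gl_FragColor
        "void main() {\n" +
        "    outputColor = theColor;\n" +
        "}\n";

    public static void main(String[] args) {
        System.out.println(CoreShaderSources.VERTEX.startsWith("#version 330 core")); // true
    }
}
```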

 

In any case, it works like that for OpenGL in general, not just JOGL/LWJGL. And Apple is especially strict about the versions of OpenGL (or anything, really) they support. I don't know how JOGL context creation works, but in LWJGL you have to ask specifically for a core context, otherwise it defaults to a compatibility context.

 

Have you looked at this? http://www.arcsynthesis.org/gltut/ It uses all core functionality, which means an OpenGL 3.3 core context and GLSL 3.30 core shaders. It won't tell you how to write an SSAO shader with core 330 GLSL, but it's a start. I haven't seen JOGL/LWJGL tutorials that use the core profile; most of them used the fixed-function pipeline, so you're left with mostly C++ resources to figure out how to use more recent OpenGL stuff.

 

You also have the official OpenGL spec in the OpenGL registry to see what's in the core profile and what isn't.

 

I haven't really tried it, so I don't know how it works, but I *think* you can have, say, an OpenGL 3.3 core context using #version 400 in the GLSL. GLSL works through extensions like everything else, so you might be able to use newer GLSL syntax on older contexts. But that's useful only if you're after a specific feature, I guess.



Have you looked at this? http://www.arcsynthesis.org/gltut/

 

BOOM!  That's actually the exact tutorial I'm using that doesn't work on the Mac.  I did try using #version 330 core at the top, but it didn't work.  I think I have not selected the correct profile, because the shader compiler complains that it doesn't understand the 'layout' keyword.  I know my shaders are fine because I've been running these examples on Ubuntu 12.04 with no problems.

 

I have been looking at creating a core profile in JOGL, but so far I've been unsuccessful.
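For future readers, a sketch of how a core context is requested in JOGL may help. I haven't verified this exact snippet (and the package names moved from javax.media.opengl to com.jogamp.opengl in later JOGL releases), but the idea is that the profile is chosen through GLCapabilities before the drawable is created:

```java
import javax.media.opengl.GLCapabilities;   // com.jogamp.opengl.* in newer JOGL
import javax.media.opengl.GLProfile;
import javax.media.opengl.awt.GLCanvas;

public class CoreProfileCanvas {
    public static GLCanvas createCoreCanvas() {
        // GLProfile.GL3 requests a core-profile context; GLProfile.GL2 is the
        // compatibility family that allows the fixed-function mixing above.
        GLProfile profile = GLProfile.get(GLProfile.GL3);
        GLCapabilities caps = new GLCapabilities(profile);
        return new GLCanvas(caps); // the profile is fixed once the canvas is realized
    }
}
```

On OS X this matters because the driver only hands out core contexts for 3.2+, so asking for the GL2 family leaves you on 2.1 where 'layout' is unknown to the shader compiler.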
