GLSL and GLU Teapot


18 replies to this topic

#1 Glass_Knife   Moderators   -  Reputation: 4451


Posted 21 February 2014 - 10:56 PM

If I want to draw a teapot or sphere from the GLU library, but I want to use a vertex and fragment shader, I don't understand how this works.  So far, everything I've done with a vertex shader has defined a layout at the top:

layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;

void main() {

   ... and so on...

Then I use a vertex array bound to 0 or 1 to pass the data to the shader.  What I don't understand is: if I draw a teapot with glutWireTeapot(), what do I define in the shader as the input vector?


Edited by Glass_Knife, 21 February 2014 - 10:56 PM.

I think, therefore I am. I think? - "George Carlin"
Indie Game Programming


#2 TheChubu   Crossbones+   -  Reputation: 4354


Posted 21 February 2014 - 11:02 PM

I'm preeeeeety sure glut uses the fixed-function pipeline to do that. I don't think you'll get vertex buffers out of it.

 

It's pretty old (though freeglut is more updated), and most people I've seen use it for glutSwapBuffers only.


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#3 Glass_Knife   Moderators   -  Reputation: 4451


Posted 21 February 2014 - 11:05 PM

That's what I thought, but I saw a tutorial that was using the GLU sphere.  I know I saw a tutorial about shaders using the teapot while I was frantically googling, but I may have misunderstood.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#4 dave j   Members   -  Reputation: 592


Posted 22 February 2014 - 06:56 AM

The glut teapot is actually the Utah teapot, which was originally defined as a set of Bézier patches. There are various versions around on the net, both as Bézier patches and as triangle/quad meshes. You could use a triangle/quad version for a simple implementation using just vertex and fragment shaders, and later a Bézier patch version to try out tessellation shaders.
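
For illustration, a minimal JOGL sketch of uploading a triangle-mesh teapot so it can feed shader inputs declared with layout locations. This is an untested sketch: the teapotVertices array (x,y,z triples loaded from whichever mesh version you grabbed) and the uploadTeapot name are made up, and gl is assumed to be a current GL3.

import java.nio.FloatBuffer;
import javax.media.opengl.GL;
import javax.media.opengl.GL3;
import com.jogamp.common.nio.Buffers;

// Upload the mesh once, e.g. in init(); returns the VAO to bind before drawing.
public static int uploadTeapot(GL3 gl, float[] teapotVertices) {
	int[] ids = new int[2];
	gl.glGenVertexArrays(1, ids, 0);  // VAO records the attribute setup
	gl.glGenBuffers(1, ids, 1);       // VBO holds the raw vertex data
	gl.glBindVertexArray(ids[0]);
	gl.glBindBuffer(GL.GL_ARRAY_BUFFER, ids[1]);
	FloatBuffer data = Buffers.newDirectFloatBuffer(teapotVertices);
	gl.glBufferData(GL.GL_ARRAY_BUFFER,
			teapotVertices.length * Buffers.SIZEOF_FLOAT, data, GL.GL_STATIC_DRAW);
	// Attribute index 0 is what 'layout(location = 0) in vec4 position;' refers to.
	gl.glEnableVertexAttribArray(0);
	gl.glVertexAttribPointer(0, 3, GL.GL_FLOAT, false, 0, 0L);
	return ids[0];
}

Bind the returned VAO and call glDrawArrays(GL.GL_TRIANGLES, 0, teapotVertices.length / 3) in display().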

#5 Glass_Knife   Moderators   -  Reputation: 4451


Posted 23 February 2014 - 08:51 AM

For anyone in the future: part of my problem is that I am using JOGL, and the different profiles allow for the GL2 family of interfaces, which includes both the old fixed-function pipeline and the newer shader stuff.

 

I have been mixing these technologies.  I am not sure if using a newer profile will break some of the code I already have working, so more testing is needed.  And to make matters worse, I tried working on this stuff on my Mac last night (OS X 10.9) and none of the shaders would compile, because the profile I've been using on Windows and Linux isn't working on the Mac, even though the internet makes it sound like it should.

 

Ahh, programming, I was beginning to think it would be easy.  Thanks for not letting me down. 


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#6 TheChubu   Crossbones+   -  Reputation: 4354


Posted 23 February 2014 - 04:23 PM

Ahh yes, they separate functions by the version of OpenGL in which they were created, so you'll be calling GL11 code even if you are using, say, a 3.3 core profile (for glEnables, glGets, and stuff like that).

 

AFAIK, OSX supports core profiles only, 3.2 specifically. I think the newer releases support an OpenGL 4 core profile (4.0 or 4.1).

 

So no mix and match there. I'm not even sure you can load extensions.


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#7 Glass_Knife   Moderators   -  Reputation: 4451


Posted 23 February 2014 - 07:14 PM


AFAIK, OSX supports core profiles only. 3.2 specifically. I think the newer releases support an OpenGL 4 core profile (4.0 or 4.1 I think).

 

This is one of the problems that I am having using JOGL.  There really isn't any documentation about this stuff, and looking at Objective-C code for the iPad doesn't really help me (or if it does, I don't get it).  I did find the docs for the newest OS, and it said it used one of the 4.x versions, but I guess that isn't enough info if you don't really understand the profile stuff.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#8 Glass_Knife   Moderators   -  Reputation: 4451


Posted 23 February 2014 - 07:35 PM

So now that I've figured out the compatibility/core profile stuff, I tried to get my shaders working, only to discover that whatever books and tutorials I've been using are using syntax that doesn't seem to be core syntax?  This stuff is really silly.

 

I've been working through a tutorial with a #version 330 without *really* understanding what that means.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#9 TheChubu   Crossbones+   -  Reputation: 4354


Posted 23 February 2014 - 09:11 PM

You should specify #version 330 core. I think that way it tells you if you're using deprecated functions (if you query the infoLog of the individual shaders and the shader programs, and if the driver vendor is nice enough to report detailed info, which I hear is not always the case).

 

In any case, it works like that for OpenGL in general, not just JOGL/LWJGL. And Apple is especially anal about the versions they support of OpenGL (or anything, really). I don't know how JOGL context creation works, but in LWJGL you have to ask specifically for a core context; otherwise it defaults to a compatibility context.

 

Have you looked at this? http://www.arcsynthesis.org/gltut/ It uses all core functionality, meaning an OpenGL 3.3 core context and GLSL 3.30 core shaders. It won't tell you how to write an SSAO shader with core 330 GLSL, but it's a start. I haven't seen JOGL/LWJGL tutorials that use the core profile; most of them use the fixed-function pipeline, so you're left with mostly C++ resources to figure out how to use more recent OpenGL stuff.

 

You also have the official OpenGL spec in the OpenGL registry, to see what's in the core profile and what isn't.

 

I haven't really tried it so I don't know how it works, but I *think* you can have, say, an OpenGL 3.3 core context using #version 400 in the GLSL. GLSL works through extensions like everything else, so you might be able to use newer GLSL syntax on older contexts. But that's only useful if you're after a specific feature, I guess.


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#10 Glass_Knife   Moderators   -  Reputation: 4451


Posted 23 February 2014 - 09:45 PM


Have you looked at this? http://www.arcsynthesis.org/gltut/

 

BOOM!  That's actually the exact tutorial I'm using that doesn't work on the Mac.  I did try using #version 330 core at the top, but it didn't work.  I think I have not selected the correct profile, because the shader compiler complains that it doesn't understand the 'layout' keyword.  I know my shaders are fine, because I've been running these examples on Ubuntu 12.04 with no problems.

 

I have been looking at creating a core profile in JOGL, but so far I've been unsuccessful.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#11 TheChubu   Crossbones+   -  Reputation: 4354


Posted 23 February 2014 - 10:09 PM

Well, I can't help you with JOGL, but with LWJGL it's more or less like this:

 

ContextAttribs cAttribs = new ContextAttribs(3, 3).withProfileCore(true); // Request a 3.3 core context.
PixelFormat pFormat = new PixelFormat(8, 24, 8); // 8-bit alpha, 24-bit depth, 8-bit stencil.
 
Display.create(pFormat, cAttribs); // Create the window with that context and pixel format.

 

And that should be it. You can set a specific display mode (i.e. resolution, refresh rate) after creating the display or before; it's up to you.

 

Are you sure about the version of the context you're running on? JOGL should provide some method to check that.
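
Something like this should print it; an untested sketch, assuming you have the GLAutoDrawable (here called drawable) in hand inside init() or display(). GL and GL2ES2 are the javax.media.opengl interfaces:

GL gl = drawable.getGL();
System.out.println("GL_VERSION: " + gl.glGetString(GL.GL_VERSION));
System.out.println("GLSL:       " + gl.glGetString(GL2ES2.GL_SHADING_LANGUAGE_VERSION));
System.out.println("Context:    " + drawable.getContext().getGLVersion());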

 

Now that I remember... I think I saw a user with similar issues; the shader compiler complained about the layout qualifiers, so he had to enable the extension by hand in the shader. I think the syntax for enabling a GLSL extension is to insert this line in the shaders:

 

#extension GL_ARB_explicit_attrib_location : enable
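
So on a 3.2 core context (where the GLSL version is 1.50) the top of the vertex shader would look something like this, assuming the driver actually exposes the extension:

#version 150
#extension GL_ARB_explicit_attrib_location : enable

layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;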

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#12 tanzanite7   Members   -  Reputation: 1296


Posted 24 February 2014 - 03:40 AM

It used to be that Mac only supported OGL 3.2 (there was a hack to make the OS believe it has 3.3; I am not sure whether any official support has been added since), which IIRC does not have attribute layout location in core. The extension for it might be available though, as TheChubu noted.

Glass_Knife, I did not notice you telling us how the "not working" manifests itself. What is the error message given? Are you sure you have OGL 3.3 available in the first place?

Edited by tanzanite7, 24 February 2014 - 03:44 AM.


#13 Glass_Knife   Moderators   -  Reputation: 4451


Posted 24 February 2014 - 08:51 AM


Are you sure about the version of the context you're running on? JOGL should provide some method to check that.

 

I have looked at that stuff, but the docs are sketchy.  I thought LWJGL looked more promising, but I didn't see how it would integrate with a Swing application.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#14 Glass_Knife   Moderators   -  Reputation: 4451


Posted 24 February 2014 - 09:00 AM


Glass_Knife, I did not notice you telling us how the "not working" manifests itself. What is the error message given? Are you sure you have OGL 3.3 available in the first place?

 

I'm a moderator.  We don't have problems, we fix them.  :-)

#version 330

layout (location = 0) in vec4 position;
layout (location = 1) in vec4 color;
out vec4 theColor;

void main()
{
	gl_Position = position;
	theColor = color;
}

Here is the error for the shader:

Compile failure in fragment shader:
ERROR: 0:3: Invalid use of layout 'location'
ERROR: 0:4: Invalid use of layout 'location'
ERROR: 0:9: Use of undeclared identifier 'gl_Position'
ERROR: 0:9: Use of undeclared identifier 'position'
ERROR: 0:10: Use of undeclared identifier 'color'

And so if you're working on this at midnight, you won't notice that you're getting fragment shader errors for your vertex shader.  ;-)
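
For anyone else debugging this: tagging the compile log with the shader's type makes the mix-up jump out. A rough, untested sketch of the JOGL calls (the compileShader name is made up; gl is assumed to be a current GL3):

import javax.media.opengl.GL;
import javax.media.opengl.GL2ES2;
import javax.media.opengl.GL3;

// 'type' is GL2ES2.GL_VERTEX_SHADER or GL2ES2.GL_FRAGMENT_SHADER.
static int compileShader(GL3 gl, int type, String source) {
	int shader = gl.glCreateShader(type);
	gl.glShaderSource(shader, 1, new String[] { source }, new int[] { source.length() }, 0);
	gl.glCompileShader(shader);
	int[] status = new int[1];
	gl.glGetShaderiv(shader, GL2ES2.GL_COMPILE_STATUS, status, 0);
	if (status[0] == GL.GL_FALSE) {
		int[] len = new int[1];
		gl.glGetShaderiv(shader, GL2ES2.GL_INFO_LOG_LENGTH, len, 0);
		byte[] log = new byte[len[0]];
		int[] logLen = new int[1];
		gl.glGetShaderInfoLog(shader, len[0], logLen, 0, log, 0);
		// Label the failure with the shader type so midnight-you can't miss it.
		String which = (type == GL2ES2.GL_VERTEX_SHADER) ? "vertex" : "fragment";
		throw new RuntimeException("Compile failure in " + which + " shader:\n" + new String(log));
	}
	return shader;
}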

 

But that still hasn't solved the profile issue.

Exception in thread "AWT-EventQueue-0-AWTAnimator" java.lang.RuntimeException: javax.media.opengl.GLException: Not a GL2 implementation
	at com.jogamp.common.util.awt.AWTEDTExecutor.invoke(AWTEDTExecutor.java:58)
	at jogamp.opengl.awt.AWTThreadingPlugin.invokeOnOpenGLThread(AWTThreadingPlugin.java:103)
	at jogamp.opengl.ThreadingImpl.invokeOnOpenGLThread(ThreadingImpl.java:206)
	at javax.media.opengl.Threading.invokeOnOpenGLThread(Threading.java:172)
	at javax.media.opengl.Threading.invoke(Threading.java:191)
	at javax.media.opengl.awt.GLCanvas.display(GLCanvas.java:541)
	at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:75)
	at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:416)
	at com.jogamp.opengl.util.Animator$MainLoop.run(Animator.java:188)
	at java.lang.Thread.run(Thread.java:744)

I just haven't figured out the magical formula.  But progress is happening...


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#15 tanzanite7   Members   -  Reputation: 1296


Posted 24 February 2014 - 05:20 PM

And so if you're working on this at midnight, you won't notice that you're getting fragment shader errors for your vertex shader.  ;-)

Reading what the error message says -> the path to victory :D
 

java.lang.RuntimeException: javax.media.opengl.GLException: Not a GL2 implementation

Looks like an OGL2 implementation was asked for... which I am fairly sure MacOS simply cannot give (unless backwards compatibility was added at some point, which I seriously doubt). Unfortunately, the last time I used OGL with Java I used LWJGL, and I do not know anything about JOGL.

Perhaps, if the error message is to be believed, you are not properly asking for the right OGL version; a source dump of the relevant part might help.

Anyway, if Google is to be believed, it should go fairly simply, like this (the example I saw was for GL2; I just changed it to GL3. I mean, it should be good enough, right?):
 
GLCapabilities capabilities = new GLCapabilities(GLProfile.get(GLProfile.GL3));
GLCanvas canvas = new GLCanvas(capabilities);

...

GL3 gl = drawable.getGL().getGL3();
gl.doFunkyStuff...

Edit: Actually, I have no idea what state Mac OGL support is in nowadays; I think it should be able to support OGL2 when excluding OGL3 (i.e. no backwards-compatibility option). That should not change anything in what I said, though.

Edited by tanzanite7, 24 February 2014 - 05:35 PM.


#16 TheChubu   Crossbones+   -  Reputation: 4354


Posted 24 February 2014 - 06:46 PM

It used to be that Mac only supported OGL 3.2 (there was a hack to make the OS believe it has 3.3; I am not sure whether any official support has been added since), which IIRC does not have attribute layout location in core.

Explicit attrib location was made core in 3.3!? That explains a lot. I was sure it was a 3.2 core extension.


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#17 Glass_Knife   Moderators   -  Reputation: 4451


Posted 24 February 2014 - 07:18 PM


Looks like an OGL2 implementation was asked for... which I am fairly sure MacOS simply cannot give (unless backwards compatibility was added at some point, which I seriously doubt). Unfortunately, the last time I used OGL with Java I used LWJGL, and I do not know anything about JOGL.

 

Yes, that seems to be the case.  I fixed the capabilities code to ask for a specific profile, and everything seems to be working.  What I had done was just ask for the default profile on my Linux box, without realizing that it would default to a backwards-compatibility mode, while the Mac only supports the fixed-function pipeline OR shaders, but not both.

 

Switching the code to specifically request a GL3 profile made it crash when I tried to use the FFP, which means it is working, as far as I can tell.


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming

#18 tanzanite7   Members   -  Reputation: 1296


Posted 25 February 2014 - 05:13 AM

Yay! Sweet, sweet progress :D

 

*high-five*



#19 Glass_Knife   Moderators   -  Reputation: 4451


Posted 25 February 2014 - 08:18 AM

For reference, here is the working template Swing app:

package tim.opengl.util;

import java.awt.Container;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;

import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLCapabilities;
import javax.media.opengl.GLEventListener;
import javax.media.opengl.GLProfile;
import javax.media.opengl.awt.GLCanvas;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

import com.jogamp.opengl.util.Animator;

public class GLApp extends JFrame implements GLEventListener {
	
	private GLCanvas mainPanel;
	private Animator animator;
	protected GLUtil util = new GLUtil(); // project-specific helper class (not shown)
	
	public GLApp() {
		
	}
	
	private void createAndShowGUI() {
		
		setTitle( getAppTitle() );
		setSize( 640, 480 );
		Container canvas = getContentPane();
		canvas.add( getMainPanel() );
		
		preCreate();
		animator = new Animator();
		animator.add( getMainPanel() );
		setVisible( true );
		
		animator.start();
	}
	
	private GLCanvas getMainPanel() {
		if( mainPanel == null ) {
			if( !GLProfile.isAvailable( GLProfile.GL3 ) ) {
				throw new RuntimeException("GL3 not available");
			}
			GLProfile profile = GLProfile.get( GLProfile.GL3 );
			mainPanel = new GLCanvas( new GLCapabilities( profile ) );
			mainPanel.addGLEventListener( this );
		}
		return mainPanel;
	}
	
	protected String getAppTitle() {
		return "TBD";
	}
	
	protected void preCreate() {
		
	}
	
	public void onWindowClosing() {
        new Thread(new Runnable() {
            public void run() {
              animator.stop();
              System.exit(0);
            }
          }).start();
	}

	public static void launchApp( final GLApp app ) {
		app.addWindowListener( new WindowAdapter() {
			@Override
			public void windowClosing(WindowEvent e) {
				app.onWindowClosing();
			}
		});
		SwingUtilities.invokeLater( new Runnable() {
			@Override
			public void run() {
				app.createAndShowGUI();
			}
		});
	}

	@Override
	public void display(GLAutoDrawable glad) {
		
	}

	@Override
	public void dispose(GLAutoDrawable glad) {
		
	}

	@Override
	public void init(GLAutoDrawable glad) {
		
	}

	@Override
	public void reshape(GLAutoDrawable glad, int x, int y, int ww, int hh) {
		
	}
}
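
And a hypothetical subclass showing how the template is meant to be used (the HelloTriangle name and the clear color are made up; only getAppTitle(), init(), and display() are overridden):

import javax.media.opengl.GL;
import javax.media.opengl.GL3;
import javax.media.opengl.GLAutoDrawable;

public class HelloTriangle extends GLApp {

	@Override
	protected String getAppTitle() {
		return "Hello, GL3";
	}

	@Override
	public void init(GLAutoDrawable glad) {
		GL3 gl = glad.getGL().getGL3(); // throws if the context is not GL3
		gl.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
	}

	@Override
	public void display(GLAutoDrawable glad) {
		GL3 gl = glad.getGL().getGL3();
		gl.glClear(GL.GL_COLOR_BUFFER_BIT);
		// shader binding and draw calls go here
	}

	public static void main(String[] args) {
		launchApp(new HelloTriangle());
	}
}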


I think, therefore I am. I think? - "George Carlin"
Indie Game Programming



