
Playing with shaders

Now that the terrain engine documentation is out of the way, I can finally put it away for a while and move onto something more interesting: OpenGL shaders!

OK, after much dicking around, I've finally got something. Here's my first attempt at a GLSL shader:

This is my polka-dot shader. OK, it isn't the greatest thing, but I think it looks cool. It's based on the brick shader in Chapter 6 of the OpenGL Shading Language orange book.

Here are its features:
  • Dot size, colors, and spacing are application-controlled

  • Specular highlighting is applied only after the final color is computed

  • Specular highlighting only appears on the dots

  • The dots' edges are smooth; there's no aliasing.
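Here's a rough sketch of how a fragment shader can get those features (this is illustrative only: the uniform names and the per-vertex specular input are my assumptions, not the actual shader's interface):

```glsl
// Sketch only -- uniform names and lighting inputs are assumptions.
uniform vec3  BaseColor;    // background color (application-controlled)
uniform vec3  DotColor;     // dot color (application-controlled)
uniform vec2  DotSpacing;   // distance between dot centers
uniform float DotRadius;    // dot size

varying vec2  TexCoord;          // surface coords from the vertex shader
varying float SpecularIntensity; // specular term, computed per-vertex

void main()
{
    // Position within the current cell, centered on the dot.
    vec2 pos = fract(TexCoord / DotSpacing) - 0.5;
    float dist = length(pos * DotSpacing);

    // smoothstep() blends across the dot's edge, so there's no aliasing.
    float inDot = 1.0 - smoothstep(DotRadius - 0.01, DotRadius + 0.01, dist);

    // Compute the final color first...
    vec3 color = mix(BaseColor, DotColor, inDot);

    // ...then add specular afterward, scaled by inDot so it only
    // appears on the dots.
    color += SpecularIntensity * inDot;

    gl_FragColor = vec4(color, 1.0);
}
```

The brick shader in the orange book follows the same shape: pick a cell, decide which color you're in, then smooth the transition.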

Getting OpenGL shaders set up was a right pain in the ass, though.

I don't know about other operating systems, but it seems like in Windows, it's difficult to do any kind of development using OpenGL version 1.2 or greater. The opengl32.lib file is only set up to use OpenGL 1.1, so you end up having to use extensions.

I've played around with OpenGL extensions in the past (using the glActiveTextureARB multitexturing extension) so I know what it's like to get one function working. Basically, you have to use wglGetProcAddress() to get a pointer to the function, cast it to something like PFNOMGWTFBBQROFLCOPTERPROC (different for each function, so you have to look it up), make sure it actually exists, and then you can use it. This isn't bad when your code uses only a few extensions, but when you're doing shader programming, you have to do this for many, many functions.

There's got to be a better way. Fortunately, there is. On teh intarwebs, I found something called GLEW, the OpenGL Extension Wrangler. It fricking rocks! To get support for all extensions that your card supports, all you have to do is call glewInit(). That's it! Then you have your application check for a specific version of OpenGL and bail if your card doesn't support it. This really makes my life easier. I recommend you download it if you're using OpenGL.
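With GLEW, the whole dance collapses into something like this (a fragment, not a full program: glewInit() needs a rendering context to already exist, so this runs after context creation):

```c
#include <GL/glew.h>   /* must be included before gl.h */
#include <stdio.h>
#include <stdlib.h>

/* Call once, after creating the rendering context. */
void init_extensions(void)
{
    GLenum err = glewInit();
    if (err != GLEW_OK) {
        fprintf(stderr, "GLEW error: %s\n", glewGetErrorString(err));
        exit(EXIT_FAILURE);
    }
    /* Bail if the card doesn't support the version we need. */
    if (!GLEW_VERSION_2_0) {
        fprintf(stderr, "OpenGL 2.0 required for GLSL shaders\n");
        exit(EXIT_FAILURE);
    }
}
```

All the typedefs and wglGetProcAddress() lookups are generated for you; you just call the functions by their normal names.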

Now that this was out of the way, it was time to write OpenGL code to load and run the shaders. This also turned out to be much more difficult than I thought it would be.

To do this, you have to create shader objects for the vertex and fragment program, load text into them using a pointer to an array of strings, compile them, check for errors, return the error log if it fails, create a program, attach the shaders to the program, link the program, check for errors, take it out for a dinner and movie, check for errors, and return the error log if it fails. Fortunately, I've encapsulated this into a single function that takes two source filenames as parameters, one for a vertex shader and one for a fragment shader.
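The sequence, minus the dinner and movie, looks roughly like this. This is a sketch using the OpenGL 2.0 entry points that GLEW exposes; my actual function also reads the two source files from disk, which is omitted here:

```c
/* Compile one shader stage; returns 0 on failure after printing the log. */
GLuint compile_stage(GLenum type, const char *source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL); /* pointer to array of strings */
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "compile failed:\n%s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}

/* Build a program from vertex + fragment shader source strings. */
GLuint build_program(const char *vs_src, const char *fs_src)
{
    GLuint vs = compile_stage(GL_VERTEX_SHADER, vs_src);
    GLuint fs = compile_stage(GL_FRAGMENT_SHADER, fs_src);
    if (!vs || !fs)
        return 0;

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    GLint ok = GL_FALSE;
    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetProgramInfoLog(prog, sizeof(log), NULL, log);
        fprintf(stderr, "link failed:\n%s\n", log);
        glDeleteProgram(prog);
        return 0;
    }
    return prog;
}
```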

Once everything was set up, it was nice and easy to actually write the vertex and fragment shaders. Now I can play around!