Confused with OpenGL direction vectors

radom

Hi guys. I'm a programmer trying to learn OpenGL, but a basic concept is confusing me. For example, take this command, which I copied from the OpenGL Red Book:

GLfloat spot_direction[] = { -1.0, -1.0, 0.0 };
glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, spot_direction);

The book says that the spot_direction parameter passed to glLight*() tells OpenGL the direction of the spot light. But when I try to picture it, spot_direction is just a single point in x, y, z space, and I don't see how a single point can be used as a direction. I feel like I could take that point and say "it is pointing in a zillion directions". In general, these so-called direction vectors confuse me: people give x, y, z values and say the vector points "this way". I would appreciate any explanation. Thanks in advance.

haegarr

Besides the fact that this lighting stuff is deprecated anyway ...

A spot light has a hard angular limit called the "spot cutoff". From the glLight man page, on GL_SPOT_DIRECTION:

params contains three integer or floating-point values that specify the direction of the light in homogeneous object coordinates. Both integer and floating-point values are mapped directly. Neither integer nor floating-point values are clamped.

The spot direction is transformed by the inverse of the modelview matrix when glLight is called (just as if it were a normal), and it is stored in eye coordinates. It is significant only when GL_SPOT_CUTOFF is not 180, which it is initially. The initial direction is (0, 0, -1).

And later, on GL_SPOT_CUTOFF:

params is a single integer or floating-point value that specifies the maximum spread angle of a light source. Integer and floating-point values are mapped directly. Only values in the range [0, 90] and the special value 180 are accepted. If the angle between the direction of the light and the direction from the light to the vertex being lighted is greater than the spot cutoff angle, the light is completely masked. ...

Hence a spot light's radiation is not homogeneous in all directions, and that is why a direction is needed whenever the cutoff is not 180.
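
Putting the two together, here is a minimal sketch of a fixed-function spot light setup (not from the manpage; the position and angle values are made up for illustration):

#include <GL/gl.h>

/* A minimal sketch of a fixed-function spot light: position, direction
   and cutoff together define the light cone. Values are illustrative. */
void setup_spot_light(void)
{
    GLfloat position[]  = { 2.0f, 2.0f, 0.0f, 1.0f };  /* w = 1: a point in space */
    GLfloat direction[] = { -1.0f, -1.0f, 0.0f };      /* the cone's axis, a vector */

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);

    glLightfv(GL_LIGHT0, GL_POSITION, position);
    glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, direction);
    glLightf(GL_LIGHT0, GL_SPOT_CUTOFF, 45.0f);        /* cone half-angle; 180 = shine everywhere */
}

With GL_SPOT_CUTOFF left at its default of 180 the direction is ignored, which is exactly what the manpage text above says.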

HuntsMan

It's the point vs. vector thing. {-1.0, -1.0, 0.0} is interpreted as a vector, starting at {0, 0, 0} and ending at {-1.0, -1.0, 0.0}, which gives you a direction.
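
To make that concrete, here is a small illustrative helper (the names are mine, not from any API): subtracting one point from another gives the vector that points from the first to the second, and {-1.0, -1.0, 0.0} is just that vector when the starting point is {0, 0, 0}.

#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* The vector from 'from' to 'to' is the component-wise difference
   of the two points; normalizing keeps the direction, drops the length. */
Vec3 direction_between(Vec3 from, Vec3 to)
{
    Vec3 d = { to.x - from.x, to.y - from.y, to.z - from.z };
    float len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    d.x /= len;  d.y /= len;  d.z /= len;
    return d;
}

So the three numbers in the original question are only offsets: they say which way the arrow points, not where the arrow is drawn.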

ArKano22

As HuntsMan said, it is a basic concept: a point defines a direction if you take the vector from {0, 0, 0} to that point.

I would suggest reading about vector and matrix algebra, the concept of a "space" or "basis", and why the order of rotation/translation/scale matters (see the sketch below) before using OpenGL, because if you don't, you'll soon be doing things that you don't fully understand. And debugging code that you can't understand is a pain in the ass.
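
On the order point, a quick illustrative sketch (draw_object() and angle are placeholders, not real GL calls):

#include <GL/gl.h>

extern float angle;            /* placeholder animation angle */
extern void draw_object(void); /* placeholder: draws something at the origin */

void order_matters(void)
{
    /* The transform specified last is applied to the object first.
       Here the object is translated to (5, 0, 0) and then rotated
       about the origin: it orbits. */
    glLoadIdentity();
    glRotatef(angle, 0.0f, 1.0f, 0.0f);
    glTranslatef(5.0f, 0.0f, 0.0f);
    draw_object();

    /* Swap the two calls and the object instead spins in place
       at (5, 0, 0). Same calls, different result. */
    glLoadIdentity();
    glTranslatef(5.0f, 0.0f, 0.0f);
    glRotatef(angle, 0.0f, 1.0f, 0.0f);
    draw_object();
}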
