Points and defines
Two questions. In the example code I have, they define the flag points in an array:
float flagPoints[36][20][3]; // flag is 36x20

void InitializeFlag()
{
    int xIdx; // counter index for x-plane
    int yIdx; // counter index for y-plane

    // loop through all of the flag points and calculate the sine wave
    // for the z-coordinate; x- and y-coordinates equal the counter indices
    for (xIdx = 0; xIdx < 36; xIdx++)
    {
        for (yIdx = 0; yIdx < 20; yIdx++)
        {
            flagPoints[xIdx][yIdx][0] = (float)xIdx;
            flagPoints[xIdx][yIdx][1] = (float)yIdx;
            flagPoints[xIdx][yIdx][2] = (float)sin((((float)xIdx * 20.0f) / 360.0f) * 3.14159f * 2.0f);
        }
    }
}
The z coordinate is calculated using sin because the flag is waving. I don't understand what the third subscript is, though.
Secondly, they keep using #defines for things like PI and other constants. I've read that defines are bad; what other implementation could I use that would be just as efficient?
It looks like the first and second subscripts identify which point is being accessed, and the third subscript selects a coordinate of that point - that is, 0 in the third subscript is x, 1 is y, and 2 is z. In this example they just set the x and y coordinates equal to the indices, but later, if you wanted the flag to fold back on itself or fall due to gravity or something, the third subscript gives you the flexibility to do that without changing much code.
As for #define being bad...
My C++ teacher told me never, ever to use them because they were evil. Well, he didn't say "evil," but you got the feeling he crossed himself every time he saw one. The reason he thought that (I'm assuming) is that they are not type-safe. A type-safe alternative is to use const:
const float PI = 3.14159f;
The benefit of this is that the compiler can give you more information about what (if anything) is going wrong, and rules like scope apply normally.
I personally don't have a big problem with #defines, though I did get 5 points off any program I handed in with one. To me it seems like a lot of people get too caught up in the object-oriented paradigm to see when something is just simpler - I mean, when your program is 1.5 pages long, who cares if you use a #define? But then, that's another topic in itself. I hope this helped.