Why can't I plot random pixels? UPDATE - PROBLEM SOLVED!

This has been really frustrating. I posted my first questions about random floating-point values in the General Programming section, but now I think the problem is OpenGL-related. I want to plot random pixels all over the screen. Simple enough, correct? Well, the first thing I need to do is generate random floats. These are a couple of methods people told me to use:
    
          // Method 1: rand()/RAND_MAX is a float in [0, 1];
          // scaling by 80 and then 30 gives a float in [0, 2400].
          bgstars[i].x = ((float)rand() / (float)RAND_MAX * 80.0f) * 30.0f;
          bgstars[i].y = ((float)rand() / (float)RAND_MAX * 80.0f) * 30.0f;

          // Method 2: (rand() % 100) / 100.0 is 0.00..0.99 in steps of
          // 0.01, so this is a fractional float in [-15.0, 14.7].
          bgstars[i].x = (rand() % 100) / 100.0 * 30 - 15;
          bgstars[i].y = (rand() % 100) / 100.0 * 30 - 15;
  

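For reference, here is the same idea wrapped in a small helper. randFloat is just a name I'm using here, and the [-15, 15] range is only an example matching the second method; this is a sketch, not code from the thread:

          #include <cstdlib>   // rand, RAND_MAX

          // Returns a float in [min, max]. Dividing by RAND_MAX before
          // scaling keeps the fractional part, so the results are not
          // limited to whole numbers.
          float randFloat(float min, float max)
          {
              return min + (float)rand() / (float)RAND_MAX * (max - min);
          }

          // e.g.  bgstars[i].x = randFloat(-15.0f, 15.0f);
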
And this is where I'm plotting the pixels:

      
      glPushMatrix();
      glColor3f(1.0f, 1.0f, 1.0f);   // every star is white, so set the color once
      glBegin(GL_POINTS);
      for (int i = 0; i < MAXSTARS; i++) {
          glVertex3f(bgstars[i].x, bgstars[i].y, -20.0f);   // one point per star
      }
      glEnd();
      glPopMatrix();
    
Well, both of these methods seem to work, but EVERY pixel is plotted at a whole integer coordinate. This is my problem: every pixel is exactly one unit away from its neighbors, even though the formulas above should be (and are) generating random FLOATS. Pixels should fall between whole units, not exactly on them. Any help would be greatly appreciated. Thanks.

-Dennis

[edited by - anachronism on December 23, 2002 11:53:34 PM]
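In case it helps anyone hitting the same thing: the actual fix isn't quoted above, but one classic cause of this exact symptom (random floats collapsing onto whole-unit coordinates) is declaring the struct members as int, which silently truncates every float assigned to them. The Star struct below is hypothetical, just to illustrate that failure mode:

          #include <cstdio>
          #include <cstdlib>

          // Hypothetical struct: if x and y are int instead of float,
          // every float stored in them loses its fractional part.
          struct Star {
              int x, y;   // bug: should be  float x, y;
          };

          int main()
          {
              Star s;
              // The right-hand side really is a random float in [-15, 15)...
              s.x = (float)rand() / (float)RAND_MAX * 30.0f - 15.0f;
              // ...but the int member drops the fraction, so the plotted
              // point always lands on a whole-unit coordinate.
              std::printf("x = %d\n", s.x);
              return 0;
          }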

