I'm trying to generate Fibonacci-distributed points on a sphere. However, when I compute sin/cos in the vertex shader on the GPU, the results come out very different from what Math.sin/Math.cos give me on the CPU. Here is the relevant part of the vertex shader:

attribute float index; // values in the range [1, 2500000]
float inc = 3.141592653589793238462643383279 * (3.0 - sqrt(5.0)); // golden angle
float off = 2.0 / 2500000.0; // note: 2.0 / 2500000 (float / int) is invalid in GLSL ES 1.00
float yy = index * off - 1.0 + (off / 2.0);
float rr = sqrt(1.0 - yy * yy);
float phi = index * inc; // large number, on the order of "5476389.695241543"
vec3 fibPoint = vec3(cos(phi) * rr, yy, sin(phi) * rr); // sin/cos come out wrong here

But when I calculate Math.sin(phi) and Math.cos(phi) in JavaScript instead, and pass those values into the shader as attributes, it works. The code then looks like this:

/* short version of code from javascript */
var y = index * off - 1 + (off / 2.0);
var r = Math.sqrt(1 - y * y);
var phi = index * inc;
var cosphi = Math.cos(phi);
var sinphi = Math.sin(phi);
....
..
/* throw cosphi/sinphi into the shader as attributes along with the index */
/* vertex shader */
attribute float index;
attribute float sinphi;
attribute float cosphi;
float inc = 3.141592653589793238462643383279 * (3.0 - sqrt(5.0));
float off = 2.0 / 2500000.0;
float yy = index * off - 1.0 + (off / 2.0);
float rr = sqrt(1.0 - yy * yy);
//float phi = index * inc; // no longer needed
vec3 fibPoint = vec3(cosphi * rr, yy, sinphi * rr); // correct result

So my question is: why doesn't WebGL's sin/cos give the same (or at least a similar) result as JavaScript's Math.sin/Math.cos? As far as I know, both take radians as input and both output values in [-1, 1]. Could it be that phi is simply too large, so part of it gets truncated before the GPU's sin/cos ever sees it?
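To sanity-check this suspicion on the CPU side, float32 truncation can be simulated with Math.fround. This is only a sketch, under the assumption that the GPU stores the attribute and evaluates sin/cos at roughly 32-bit float precision:

```javascript
// Assumption: Math.fround approximates what a highp/mediump float
// pipeline does to phi before sin/cos runs.
var inc = Math.PI * (3 - Math.sqrt(5)); // golden angle, same as the shader's inc
var index = 2500000;
var phi = index * inc;                  // "big" phi, on the order of millions

var phi32 = Math.fround(phi);                 // phi rounded to the nearest float32
var lostRadians = Math.abs(phi - phi32);      // the ulp near 6e6 is 0.5, so this
                                              // can be a sizeable fraction of a radian

// Reducing phi into [0, 2*PI) before upload keeps float32 rounding harmless:
var phiReduced = phi % (2 * Math.PI);
var reducedLoss = Math.abs(phiReduced - Math.fround(phiReduced)); // well below 1e-6

console.log(lostRadians, reducedLoss);
console.log(Math.sin(phi), Math.sin(phiReduced)); // nearly identical in double precision
```

If the truncation theory is right, then reducing phi modulo 2π in JavaScript and uploading the small value (or doing mod(phi, 2.0 * 3.14159265) in the shader) should behave like the precomputed sinphi/cosphi version.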

Thanks for reading.