Psychopathetica


  1. Psychopathetica

    Raycast From Camera To Mouse Pointer

    Never mind, that wasn't really needed. I should slap myself. It turns out that when I updated the matrices of any of my 3D objects, I was multiplying the X values by an aspect ratio that wasn't needed at all, which threw the values off completely, since the picking code wasn't using an aspect ratio. So I took aspect ratios out of the equation and it worked. Can't believe it, this whole time! It worked like a charm when using the camera.position x, y, and z values for the picking origin and the ray origin of my 3D line, but I noticed it was a tiny bit off at certain angles, although acceptable. Just for the hell of it, I replaced the origin's camera position with the translation from the camera's inverted view matrix, and lo and behold, it was pixel-perfect accurate at all angles, even with my picking method using floats instead of doubles!
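(For anyone landing here from a search: the inverted view matrix works because a rigid view matrix stores the rotation R in its upper-left 3x3 and -R·eye in its translation column, so the translation column of its inverse is the camera's world position, i.e. -Rᵀ·t. A minimal plain-Java sketch of that extraction, without the Android `Matrix` helper; it assumes a column-major view matrix as `android.opengl.Matrix` produces, and the class name is mine:)

```java
// Recover the camera's world-space position from a column-major rigid view matrix.
// For V = [R | t] with t = -R * eye, the eye position is -R^T * t, which is
// exactly the translation column (elements 12..14) of inverse(V).
public class CameraPos {
    public static float[] fromViewMatrix(float[] viewM) {
        float tx = viewM[12], ty = viewM[13], tz = viewM[14];
        float[] eye = new float[3];
        // Multiply -t by the transpose of the upper-left 3x3 rotation block.
        for (int i = 0; i < 3; i++) {
            eye[i] = -(viewM[i * 4] * tx + viewM[i * 4 + 1] * ty + viewM[i * 4 + 2] * tz);
        }
        return eye;
    }
}
```

(This avoids a full 4x4 inversion when all you need is the eye position.)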
  2. Psychopathetica

    Raycast From Camera To Mouse Pointer

    I'll give it a shot, even though my near z is 1. In the meantime, I'm constructing a camera array so there are two cameras, to see where the hell my ray origin is going. I'm going to render it as a 3D line; it's pretty hard to see the ray when it's pointing straight at you.
  3. Psychopathetica

    Raycast From Camera To Mouse Pointer

    Hello. For the last two weeks I've been struggling with one thing: 3D object picking. And I'm close to getting it right! It works great when facing front. With a first-person-style camera I can go up, down, forward, backward, strafe left, strafe right, and it works. The problem is, when I rotate the camera, the end of the ray that is not the mouse end goes off somewhere other than the camera, completely throwing it off. So I'm going to go step by step, and see if you guys can spot what went wrong. The first step is to normalize the mouse device coordinates, or in my case, touch coordinates:

```java
public static float[] getNormalizedDeviceCoords(float touchX, float touchY){
    float[] result = new float[2];
    result[0] = (2f * touchX) / Render.camera.screenWidth - 1f;
    result[1] = 1f - (2f * touchY) / Render.camera.screenHeight;
    return result;
}
```

which in turn is converted into homogeneous clip coordinates:

```java
float[] homogeneousClipCoords = new float[]{normalizedDeviceCoords[0], normalizedDeviceCoords[1], -1f, 1f};
```

The next step is to convert the homogeneous clip coordinates into eye coordinates:

```java
public static float[] getEyeCoords(float[] clipCoords){
    float[] invertedProjection = new float[16];
    Matrix.invertM(invertedProjection, 0, Render.camera.projMatrix, 0);
    float[] eyeCoords = new float[4];
    Matrix.multiplyMV(eyeCoords, 0, invertedProjection, 0, clipCoords, 0);
    return new float[]{eyeCoords[0], eyeCoords[1], -1f, 0f};
}
```

Next is to convert the eye coordinates into world coordinates and normalize:

```java
public static float[] getWorldCoords(float[] eyeCoords){
    float[] invertedViewMatrix = new float[16];
    Matrix.invertM(invertedViewMatrix, 0, Render.camera.viewM, 0);
    float[] rayWorld = new float[4];
    Matrix.multiplyMV(rayWorld, 0, invertedViewMatrix, 0, eyeCoords, 0);
    float length = (float) Math.sqrt(rayWorld[0] * rayWorld[0] + rayWorld[1] * rayWorld[1] + rayWorld[2] * rayWorld[2]);
    if(length != 0){
        rayWorld[0] /= length;
        rayWorld[1] /= length;
        rayWorld[2] /= length;
    }
    return rayWorld;
}
```

Putting this all together gives me a method to get the ray direction I need:

```java
public static float[] calculateMouseRay(){
    float touchX = MainActivity.touch.x;
    float touchY = MainActivity.touch.y;
    float[] normalizedDeviceCoords = getNormalizedDeviceCoords(touchX, touchY);
    float[] homogeneousClipCoords = new float[]{normalizedDeviceCoords[0], normalizedDeviceCoords[1], -1f, 1f};
    float[] eyeCoords = getEyeCoords(homogeneousClipCoords);
    return getWorldCoords(eyeCoords);
}
```

I then test for the ray/sphere intersection using this, with double precision:

```java
public static boolean getRaySphereIntersection(float[] rayOrigin, float[] spherePosition, float[] rayDirection, float radius){
    double[] v = new double[4];
    double[] dir = new double[4];
    // Calculate the a, b, c and d coefficients:
    // a = (XB-XA)^2 + (YB-YA)^2 + (ZB-ZA)^2
    // b = 2 * ((XB-XA)(XA-XC) + (YB-YA)(YA-YC) + (ZB-ZA)(ZA-ZC))
    // c = (XA-XC)^2 + (YA-YC)^2 + (ZA-ZC)^2 - r^2
    // d = b^2 - 4*a*c
    v[0] = (double)rayOrigin[0] - (double)spherePosition[0];
    v[1] = (double)rayOrigin[1] - (double)spherePosition[1];
    v[2] = (double)rayOrigin[2] - (double)spherePosition[2];
    dir[0] = (double)rayDirection[0];
    dir[1] = (double)rayDirection[1];
    dir[2] = (double)rayDirection[2];
    double a = (dir[0] * dir[0]) + (dir[1] * dir[1]) + (dir[2] * dir[2]);
    double b = (dir[0] * v[0] + dir[1] * v[1] + dir[2] * v[2]) * 2.0;
    double c = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]) - ((double)radius * (double)radius);
    // Find the discriminant.
    double d = (b * b) - (4.0 * a * c);
    Log.d("d", String.valueOf(d));
    if (d > 0.0) {
        // Two roots. Note the parentheses: the WHOLE numerator (-b ± sqrt(d))
        // must be divided by 2a, not just the square root.
        double x1 = (-b - Math.sqrt(d)) / (2.0 * a);
        double x2 = (-b + Math.sqrt(d)) / (2.0 * a);
        Log.d("X1 X2", String.valueOf(x1) + ", " + String.valueOf(x2));
        if (x1 >= 0.0 || x2 >= 0.0){
            return true; // at least one hit in front of the ray origin
        }
    } else if (d == 0.0) {
        // One root (the ray grazes the sphere): -b / (2a).
    }
    return false;
}
```

After a week and a half of playing around with this chunk of code, and researching everything I could on Google, I found out by sheer accident that the sphere position to use in this method must be the transformed sphere position extracted from the model matrix, not the position itself. Not one damn tutorial or forum article mentioned that! And it works great using this. I haven't tested the object's modelView yet, though. To visually see the ray, I made a class to draw a 3D line, and noticed it has no trouble at all with one end being my mouse cursor. The other end, the origin, is only sort of working: it only messes up when I rotate left or right as I move around with an FPS-style camera. Which brings me to my next point: I have no idea what the ray origin should be for the camera. I have four choices; three of them worked but gave me the same results.

    Ray origin choices:

    1. Using just camera.position.x, camera.position.y, and camera.position.z for my ray origin worked flawlessly facing straight, because the ray origin remained in the center of the screen, but it messed up when I rotated the camera, and moved off screen as I rotated. Theoretically, even if you are facing at an angle, you are still fixated at that point, and the ray origin shouldn't fly away from the center of the screen at all. A point is a point, after all.
    2. Using the camera's model matrix (used for translating and rotating the camera, later multiplied into the camera's view matrix), specifically -modelMatrix[12], -modelMatrix[13], and -modelMatrix[14] (note that I am negating), gave me nearly the same results. The only difference is that camera rotation plays a role in the camera position: great facing straight, but the ray origin is no longer centered at other angles.
    3. Using the camera's view matrix didn't work at all, positive or negative, using elements 12, 13, and 14 of the matrix.
    4. Using the camera's inverted view matrix (positive invertedViewMatrix[12], invertedViewMatrix[13], and invertedViewMatrix[14]) did work, but gave what seemed like the same results as #2.

    So basically, I'm having difficulty getting the other end of the ray, the ray origin. Shooting the ray to the mouse pointer was no problem, like I said, but with the camera end of the ray being off, accuracy suffers a lot at camera angles other than straight ahead. If anyone has any ideas, please let me know. I'm fairly sure my math is correct. If you need any more information, such as the camera code, or how I render the ray (which I don't think is needed), I can show that too. Thanks in advance!
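(One detail worth pinning down in the intersection math is operator precedence: in Java, `-b - Math.sqrt(d) / (2.0 * a)` divides only the square root by 2a, so the numerator needs its own parentheses. A minimal plain-Java reference for the root computation, testable in isolation; the class and method names are mine:)

```java
// Ray/sphere intersection distances via the quadratic formula.
// Returns {t1, t2} with t1 <= t2, or null when the ray line misses the sphere.
public class RaySphere {
    public static double[] intersect(double[] origin, double[] dir,
                                     double[] center, double radius) {
        double vx = origin[0] - center[0];
        double vy = origin[1] - center[1];
        double vz = origin[2] - center[2];
        double a = dir[0] * dir[0] + dir[1] * dir[1] + dir[2] * dir[2];
        double b = 2.0 * (dir[0] * vx + dir[1] * vy + dir[2] * vz);
        double c = vx * vx + vy * vy + vz * vz - radius * radius;
        double d = b * b - 4.0 * a * c;
        if (d < 0.0) return null; // no real roots: no intersection
        double sq = Math.sqrt(d);
        // The WHOLE numerator is divided by 2a; negative roots mean the
        // intersection lies behind the ray origin.
        double t1 = (-b - sq) / (2.0 * a);
        double t2 = (-b + sq) / (2.0 * a);
        return new double[]{t1, t2};
    }
}
```

(A quick sanity check: a ray from the origin down -z against a sphere of radius 2 centered at z = -10 should enter at t = 8 and exit at t = 12.)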
  4. Psychopathetica

    3D Picking Complex Models

    Another idea I had in mind was something to do with the shader, like the polygon model you drew out from it: somehow feeding it the raycast information and testing whether it overlapped, possibly through the fragment shader.
  5. Psychopathetica

    3D Picking Complex Models

    After a tiring search on Google, I came up dry looking for a way to do 3D picking on complex object models, meaning any model that's not a sphere, ellipsoid, or box; a character model would be a good example. I found two ways to do picking. One method is reading back a color on screen, which is no good because the color could be anything. The other method is ray casting. The only problem is that the ray-casting tests in nearly every tutorial are against simplified models such as spheres or cubes. Yet 3D Studio Max, World of Warcraft, and other programs clearly show you can click any model, not just primitives. Does anyone have a good idea how this can be done? Thanks in advance.
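(The usual answer for arbitrary meshes is to keep the ray-cast approach but test the ray against each triangle of the model, typically after a cheap bounding-sphere or AABB rejection test, keeping the smallest positive hit distance. A hedged sketch of the standard Möller–Trumbore ray/triangle test in plain Java; the class and helper names are mine, not from the thread:)

```java
// Möller–Trumbore ray/triangle intersection.
// Returns the distance t along the ray, or -1 if there is no hit.
public class RayTriangle {
    static final float EPSILON = 1e-7f;

    public static float intersect(float[] orig, float[] dir,
                                  float[] v0, float[] v1, float[] v2) {
        float[] e1 = sub(v1, v0);                 // triangle edge v0 -> v1
        float[] e2 = sub(v2, v0);                 // triangle edge v0 -> v2
        float[] h = cross(dir, e2);
        float a = dot(e1, h);
        if (a > -EPSILON && a < EPSILON) return -1f; // ray parallel to triangle plane
        float f = 1f / a;
        float[] s = sub(orig, v0);
        float u = f * dot(s, h);
        if (u < 0f || u > 1f) return -1f;         // outside the v0-v1 edge
        float[] q = cross(s, e1);
        float v = f * dot(dir, q);
        if (v < 0f || u + v > 1f) return -1f;     // outside the remaining edges
        float t = f * dot(e2, q);
        return t > EPSILON ? t : -1f;             // accept hits in front of the origin only
    }

    static float[] sub(float[] a, float[] b) { return new float[]{a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
    static float[] cross(float[] a, float[] b) {
        return new float[]{a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
    }
}
```

(To pick a whole model, loop this over its triangles, transformed by the model matrix, just as the sphere position had to be transformed in the other thread, and take the closest positive t across all models.)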
  6. Psychopathetica

    Vertex Buffer Issue

    Another observation: if you comment out the normal code (which was working, btw), texture mapping works as intended too. It's only when I toss in normals that texture mapping uses just one color. And the entire thing works when vertex buffers are removed. Weird. Not sure if this is just the emulator doing this, but I'm gonna test it on my real phone to see if it works.

```java
public void bindData(){
    int offset = 0;
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false,
            POSITION_COMPONENT_STRIDE_3D, offset);
    offset += POSITION_COMPONENT_COUNT_3D;
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false,
            COLOR_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += COLOR_COMPONENT_COUNT;
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false,
            TEXTURE_COORDINATE_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    //offset += TEXTURE_COORDINATES_COMPONENT_COUNT;
    //glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false,
    //        NORMAL_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    /////////////////////////////////////////////////////
    /*
    vertexBuffer.position(0);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false,
            POSITION_COMPONENT_STRIDE_3D, vertexBuffer);
    colorBuffer.position(0);
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false,
            COLOR_COMPONENT_STRIDE, colorBuffer);
    textureCoordBuffer.position(0);
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false,
            TEXTURE_COORDINATE_COMPONENT_STRIDE, textureCoordBuffer);
    normalBuffer.position(0);
    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false,
            NORMAL_COMPONENT_STRIDE, normalBuffer);
    */
}

public void createVertexBuffer(){
    glGenBuffers(1, vertexBufferObject, 0);
    int vertexLine = POSITION_COMPONENT_COUNT_3D + COLOR_COMPONENT_COUNT + TEXTURE_COORDINATES_COMPONENT_COUNT;
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glBufferData(GL_ARRAY_BUFFER, numberOfVertices * vertexLine * BYTES_PER_FLOAT, null, GL_STATIC_DRAW);
    int offset = 0;
    glBufferSubData(GL_ARRAY_BUFFER, offset,
            numberOfVertices * POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT, vertexBuffer);
    offset += POSITION_COMPONENT_COUNT_3D;
    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
            numberOfVertices * COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT, colorBuffer);
    offset += COLOR_COMPONENT_COUNT;
    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
            numberOfVertices * TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT, textureCoordBuffer);
    //offset += TEXTURE_COORDINATES_COMPONENT_COUNT;
    //glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
    //        numberOfVertices * NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT, normalBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```

[EDIT] OMG, it was the emulator. But I don't understand why? It works great on the phone and tablet. I set the emulator's RAM up pretty high, like 4 gigs.
  7. Not long ago, I created a nice OBJ loader that loads 3D Studio Max files. The only problem is that, although it worked and worked great, I wasn't using vertex buffers. Now that I've applied vertex buffers, it seems to only use the first color of the texture and spread it across the whole polygon. I've examined my code over and over again, and the vertex buffer code looks correct. But when I comment out all of my vertex buffer code, it works as intended. I've practically given up on fixing it on my own, so hopefully you guys will be able to figure out what is wrong.

```java
public static final int BYTES_PER_FLOAT = 4;
public static final int POSITION_COMPONENT_COUNT_3D = 4;
public static final int COLOR_COMPONENT_COUNT = 4;
public static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
public static final int NORMAL_COMPONENT_COUNT = 3;
public static final int POSITION_COMPONENT_STRIDE_2D = POSITION_COMPONENT_COUNT_2D * BYTES_PER_FLOAT;
public static final int POSITION_COMPONENT_STRIDE_3D = POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT;
public static final int COLOR_COMPONENT_STRIDE = COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT;
public static final int TEXTURE_COORDINATE_COMPONENT_STRIDE = TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT;
public static final int NORMAL_COMPONENT_STRIDE = NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT;

int loadFile() {
    ArrayList<Vertex3D> tempVertexArrayList = new ArrayList<Vertex3D>();
    ArrayList<TextureCoord2D> tempTextureCoordArrayList = new ArrayList<TextureCoord2D>();
    ArrayList<Vector3D> tempNormalArrayList = new ArrayList<Vector3D>();
    ArrayList<Face3D> tempFaceArrayList = new ArrayList<Face3D>();
    StringBuilder body = new StringBuilder();
    try {
        InputStream inputStream = context.getResources().openRawResource(resourceID);
        InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
        BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
        String nextLine;
        String subString;
        String[] stringArray;
        String[] stringArray2;
        int[] indexNumberList = new int[3];
        int[] textureCoordNumberList = new int[3];
        int[] normalNumberList = new int[3];
        int i = 0;
        int j = 0;
        int k = 0;
        try {
            while ((nextLine = bufferedReader.readLine()) != null) {
                if (nextLine.startsWith("v ")) {
                    subString = nextLine.substring(1).trim();
                    stringArray = subString.split(" ");
                    try {
                        tempVertexArrayList.add(new Vertex3D(Float.parseFloat(stringArray[0]),
                                Float.parseFloat(stringArray[1]), Float.parseFloat(stringArray[2]), 1f));
                    } catch(NumberFormatException e){
                        Log.d(TAG, "Error: Invalid number format in loading vertex list");
                        return 0;
                    }
                    String x = String.valueOf(tempVertexArrayList.get(i).x);
                    String y = String.valueOf(tempVertexArrayList.get(i).y);
                    String z = String.valueOf(tempVertexArrayList.get(i).z);
                    //Log.d(TAG, "vertex " + String.valueOf(i) + ": " + x + ", " + y + ", " + z);
                    i++;
                }
                if (nextLine.startsWith("vn ")) {
                    subString = nextLine.substring(2).trim();
                    stringArray = subString.split(" ");
                    try {
                        if(reverseNormals){
                            tempNormalArrayList.add(new Vector3D(-Float.parseFloat(stringArray[0]),
                                    -Float.parseFloat(stringArray[1]), -Float.parseFloat(stringArray[2])));
                        }
                        else{
                            tempNormalArrayList.add(new Vector3D(Float.parseFloat(stringArray[0]),
                                    Float.parseFloat(stringArray[1]), Float.parseFloat(stringArray[2])));
                        }
                    } catch(NumberFormatException e){
                        Log.d(TAG, "Error: Invalid number format in loading normal list");
                        return 0;
                    }
                    String nx = String.valueOf(tempNormalArrayList.get(j).x);
                    String ny = String.valueOf(tempNormalArrayList.get(j).y);
                    String nz = String.valueOf(tempNormalArrayList.get(j).z);
                    //Log.d(TAG, "normal " + String.valueOf(j) + ": " + nx + ", " + ny + ", " + nz);
                    j++;
                }
                if (nextLine.startsWith("vt ")) {
                    subString = nextLine.substring(2).trim();
                    stringArray = subString.split(" ");
                    try {
                        tempTextureCoordArrayList.add(new TextureCoord2D(Float.parseFloat(stringArray[0]),
                                Float.parseFloat(stringArray[1])));
                    } catch(NumberFormatException e){
                        Log.d(TAG, "Error: Invalid number format in loading texture coordinate list");
                        return 0;
                    }
                    String tu = String.valueOf(tempTextureCoordArrayList.get(k).tu);
                    String tv = String.valueOf(tempTextureCoordArrayList.get(k).tv);
                    //Log.d(TAG, "texture coord " + String.valueOf(k) + ": " + tu + ", " + tv);
                    k++;
                }
                if (nextLine.startsWith("f ")) {
                    subString = nextLine.substring(1).trim();
                    stringArray = subString.split(" ");
                    for (int index = 0; index <= 2; index++) {
                        stringArray2 = stringArray[index].split("/");
                        try {
                            indexNumberList[index] = Integer.parseInt(stringArray2[0]) - 1;
                            if(indexNumberList[index] < 0){
                                Log.d(TAG, "Error: indexNumberList[] is less than zero");
                                return 0;
                            }
                        } catch(NumberFormatException e){
                            Log.d(TAG, "Error: Invalid number format in loading indexNumberList[]");
                            return 0;
                        }
                        try{
                            textureCoordNumberList[index] = Integer.parseInt(stringArray2[1]) - 1;
                            if(textureCoordNumberList[index] < 0){
                                Log.d(TAG, "Error: textureCoordNumberList[] is less than zero");
                                return 0;
                            }
                        } catch(NumberFormatException e){
                            Log.d(TAG, "Error: Invalid number format in loading textureCoordNumberList[]");
                            return 0;
                        }
                        try{
                            normalNumberList[index] = Integer.parseInt(stringArray2[2]) - 1;
                            if(normalNumberList[index] < 0){
                                Log.d(TAG, "Error: normalNumberList[] is less than zero");
                                return 0;
                            }
                        } catch(NumberFormatException e){
                            Log.d(TAG, "Error: Invalid number format in loading normalNumberList[]");
                            return 0;
                        }
                    }
                    tempFaceArrayList.add(new Face3D(indexNumberList[0], textureCoordNumberList[0], normalNumberList[0],
                            indexNumberList[1], textureCoordNumberList[1], normalNumberList[1],
                            indexNumberList[2], textureCoordNumberList[2], normalNumberList[2]));
                }
                body.append(nextLine);
                body.append('\n');
            }

            //Now that everything has successfully loaded, populate the public variables.
            if(tempVertexArrayList != null && tempVertexArrayList.size() != 0)
                vertexArrayList.addAll(tempVertexArrayList);
            if(tempTextureCoordArrayList != null && tempTextureCoordArrayList.size() != 0)
                textureCoordArrayList.addAll(tempTextureCoordArrayList);
            if(tempNormalArrayList != null && tempNormalArrayList.size() != 0)
                normalArrayList.addAll(tempNormalArrayList);
            if(tempFaceArrayList != null && tempFaceArrayList.size() != 0)
                faceArrayList.addAll(tempFaceArrayList);

            vertexList = new float[faceArrayList.size() * POSITION_COMPONENT_COUNT_3D * NUMBER_OF_SIDES_PER_FACE];
            indexList = new short[faceArrayList.size() * NUMBER_OF_SIDES_PER_FACE];
            colorList = new float[faceArrayList.size() * COLOR_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];
            textureCoordList = new float[faceArrayList.size() * TEXTURE_COORDINATES_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];
            normalList = new float[faceArrayList.size() * NORMAL_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];

            int nextFace = 0;
            int step = POSITION_COMPONENT_COUNT_3D * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < vertexList.length; currentVertex += step){
                vertexList[currentVertex + 0] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).x;
                vertexList[currentVertex + 1] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).y;
                vertexList[currentVertex + 2] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).z;
                vertexList[currentVertex + 3] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).w;
                vertexList[currentVertex + 4] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).x;
                vertexList[currentVertex + 5] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).y;
                vertexList[currentVertex + 6] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).z;
                vertexList[currentVertex + 7] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).w;
                vertexList[currentVertex + 8] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).x;
                vertexList[currentVertex + 9] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).y;
                vertexList[currentVertex + 10] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).z;
                vertexList[currentVertex + 11] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).w;
                nextFace++;
            }
            numberOfVertices = vertexList.length / POSITION_COMPONENT_COUNT_3D;

            nextFace = 0;
            for (int currentIndex = 0; currentIndex < indexList.length; currentIndex += NUMBER_OF_SIDES_PER_FACE){
                indexList[currentIndex + 0] = faceArrayList.get(nextFace).indexNumberList.get(0).shortValue();
                indexList[currentIndex + 1] = faceArrayList.get(nextFace).indexNumberList.get(1).shortValue();
                indexList[currentIndex + 2] = faceArrayList.get(nextFace).indexNumberList.get(2).shortValue();
                nextFace++; // note: this increment was missing, so every entry used face 0
            }

            step = COLOR_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < colorList.length; currentVertex += step){
                colorList[currentVertex + 0] = red;
                colorList[currentVertex + 1] = green;
                colorList[currentVertex + 2] = blue;
                colorList[currentVertex + 3] = alpha;
                colorList[currentVertex + 4] = red;
                colorList[currentVertex + 5] = green;
                colorList[currentVertex + 6] = blue;
                colorList[currentVertex + 7] = alpha;
                colorList[currentVertex + 8] = red;
                colorList[currentVertex + 9] = green;
                colorList[currentVertex + 10] = blue;
                colorList[currentVertex + 11] = alpha;
            }

            nextFace = 0;
            step = TEXTURE_COORDINATES_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < textureCoordList.length; currentVertex += step){
                textureCoordList[currentVertex + 0] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(0)).tu * mult;
                textureCoordList[currentVertex + 1] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(0)).tv * mult;
                textureCoordList[currentVertex + 2] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(1)).tu * mult;
                textureCoordList[currentVertex + 3] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(1)).tv * mult;
                textureCoordList[currentVertex + 4] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(2)).tu * mult;
                textureCoordList[currentVertex + 5] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(2)).tv * mult;
                nextFace++;
            }

            nextFace = 0;
            step = NORMAL_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < normalList.length; currentVertex += step){
                normalList[currentVertex + 0] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).x;
                normalList[currentVertex + 1] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).y;
                normalList[currentVertex + 2] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).z;
                normalList[currentVertex + 3] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).x;
                normalList[currentVertex + 4] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).y;
                normalList[currentVertex + 5] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).z;
                normalList[currentVertex + 6] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).x;
                normalList[currentVertex + 7] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).y;
                normalList[currentVertex + 8] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).z;
                nextFace++;
            }

            vertexBuffer = ByteBuffer.allocateDirect(vertexList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            indexBuffer = ByteBuffer.allocateDirect(indexList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asShortBuffer(); // over-allocates: shorts are 2 bytes, not 4 (harmless)
            colorBuffer = ByteBuffer.allocateDirect(colorList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            textureCoordBuffer = ByteBuffer.allocateDirect(textureCoordList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            normalBuffer = ByteBuffer.allocateDirect(normalList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            vertexBuffer.put(vertexList).position(0);
            indexBuffer.put(indexList).position(0);
            colorBuffer.put(colorList).position(0);
            textureCoordBuffer.put(textureCoordList).position(0);
            normalBuffer.put(normalList).position(0);

            createVertexBuffer();

            uMVPMatrixHandle = glGetUniformLocation(program, U_MVPMATRIX);
            uMVMatrixHandle = glGetUniformLocation(program, U_MVMATRIX);
            uTextureUnitHandle = glGetUniformLocation(program, U_TEXTURE_UNIT);
            aPositionHandle = glGetAttribLocation(program, A_POSITION);
            aColorHandle = glGetAttribLocation(program, A_COLOR);
            aTextureCoordinateHandle = glGetAttribLocation(program, A_TEXTURE_COORDINATES);
            aNormalHandle = glGetAttribLocation(program, A_NORMAL);
            glEnableVertexAttribArray(aPositionHandle);
            glEnableVertexAttribArray(aColorHandle);
            glEnableVertexAttribArray(aTextureCoordinateHandle);
            glEnableVertexAttribArray(aNormalHandle);
            glActiveTexture(GL_TEXTURE0);
            glUniform1i(uTextureUnitHandle, 0);
        } catch(IOException e){
        }
    } catch (Resources.NotFoundException nfe){
        throw new RuntimeException("Resource not found: " + resourceID, nfe);
    }
    return 1;
}

public void draw(){
    glEnable(GL_DEPTH_TEST);
    bindData();
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glDrawArrays(GL_TRIANGLES, 0, faceArrayList.size() * NUMBER_OF_SIDES_PER_FACE);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

public void bindData(){
    int offset = 0;
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false,
            POSITION_COMPONENT_STRIDE_3D, offset);
    offset += POSITION_COMPONENT_COUNT_3D;
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false,
            COLOR_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += COLOR_COMPONENT_COUNT;
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false,
            TEXTURE_COORDINATE_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += TEXTURE_COORDINATES_COMPONENT_COUNT;
    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false,
            NORMAL_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    /////////////////////////////////////////////////////
    /*
    vertexBuffer.position(0);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false,
            POSITION_COMPONENT_STRIDE_3D, vertexBuffer);
    colorBuffer.position(0);
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false,
            COLOR_COMPONENT_STRIDE, colorBuffer);
    textureCoordBuffer.position(0);
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false,
            TEXTURE_COORDINATE_COMPONENT_STRIDE, textureCoordBuffer);
    normalBuffer.position(0);
    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false,
            NORMAL_COMPONENT_STRIDE, normalBuffer);
    */
}

public void createVertexBuffer(){
    glGenBuffers(1, vertexBufferObject, 0);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glBufferData(GL_ARRAY_BUFFER, numberOfVertices * (POSITION_COMPONENT_COUNT_3D + COLOR_COMPONENT_COUNT
            + TEXTURE_COORDINATES_COMPONENT_COUNT + NORMAL_COMPONENT_COUNT) * BYTES_PER_FLOAT, null, GL_STATIC_DRAW);
    int offset = 0;
    glBufferSubData(GL_ARRAY_BUFFER, offset,
            numberOfVertices * POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT, vertexBuffer);
    offset += POSITION_COMPONENT_COUNT_3D;
    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
            numberOfVertices * COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT, colorBuffer);
    offset += COLOR_COMPONENT_COUNT;
    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
            numberOfVertices * TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT, textureCoordBuffer);
    offset += TEXTURE_COORDINATES_COMPONENT_COUNT;
    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT,
            numberOfVertices * NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT, normalBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```
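(Since the VBO above is laid out in blocks — all positions, then all colors, then all texture coordinates, then all normals — rather than interleaved, each attribute's byte offset is `numberOfVertices * componentsSoFar * BYTES_PER_FLOAT`. A small sketch of that arithmetic, with made-up counts, just to make the layout explicit; the class name is mine:)

```java
// Byte offsets for a block (non-interleaved) VBO layout:
// [ positions | colors | texcoords | normals ]
public class BlockLayout {
    static final int BYTES_PER_FLOAT = 4;

    // Returns the byte offset of each attribute block, in declaration order.
    public static int[] offsets(int numberOfVertices, int[] componentCounts) {
        int[] result = new int[componentCounts.length];
        int componentsSoFar = 0;
        for (int i = 0; i < componentCounts.length; i++) {
            result[i] = numberOfVertices * componentsSoFar * BYTES_PER_FLOAT;
            componentsSoFar += componentCounts[i];
        }
        return result;
    }
}
```

(For example, with 36 vertices and component counts {4, 4, 2, 3}, the blocks start at bytes 0, 576, 1152, and 1440. An off-by-one-block error here produces exactly the "whole model is one color" symptom, which is why it is worth checking first.)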
  8. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    atan2(x, z) is essentially atan(x / z), except that it uses the signs of both arguments to pick the correct quadrant, and it handles z == 0 without dividing by zero. If you use plain atan in JavaScript instead, you'd have to guard the z == 0 case with a simple if statement yourself.
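(To illustrate the difference — shown in Java here, but `Math.atan2` behaves the same way in JavaScript:)

```java
// atan(x / z) loses the quadrant: it cannot tell (x, z) apart from (-x, -z),
// and the division blows up when z == 0. atan2(x, z) handles both cases.
public class Atan2Demo {
    public static double yawDegrees(double x, double z) {
        return Math.toDegrees(Math.atan2(x, z));
    }
}
```

(For (x, z) = (1, 1) both approaches give 45 degrees, but for (1, -1) atan(x / z) returns -45 while atan2 correctly returns 135, and at z == 0 atan2 simply returns 90.)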
  9. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Success! I created a wormhole mesh. Since I have both the linear path and the mesh, I set out to find the angle between the two points in 3D, and managed to solve that as well. I need this for many applications, such as sprite following and camera following at that particular angle. This is the math I used; I actually found two methods. Roll is always zero, because you never see the line "roll", and it threw off the values anyway. Method one was to use this:

```cpp
float x, y, z;
D3DXVECTOR3 a = { -100.0f, 0.0f, 200.0f };
D3DXVECTOR3 b = { 50.0f, 0.0f, 250.0f };
D3DXVECTOR3 angle;

x = b.x - a.x;
y = b.y - a.y;
z = b.z - a.z;
angle.x = atan2f(sqrtf(x * x + z * z), y) * (180.0f / static_cast<float>(PI)) - 90.0f; //Pitch
angle.y = atan2f(x, z) * (180.0f / static_cast<float>(PI));                            //Yaw
angle.z = 0.0f;                                                                        //Roll
```

Another way to do the same thing with less math was this:

```cpp
float x, y, z;
D3DXVECTOR3 a = { -100.0f, 0.0f, 200.0f };
D3DXVECTOR3 b = { 50.0f, 0.0f, 250.0f };
D3DXVECTOR3 angle;

x = b.x - a.x;
y = b.y - a.y;
z = b.z - a.z;
angle.x = -atan2f(y, z) * (180.0f / static_cast<float>(PI)); //Pitch
angle.y = atan2f(x, z) * (180.0f / static_cast<float>(PI));  //Yaw
angle.z = 0.0f;                                              //Roll
```

It did not matter what angle the line was at; the polygon always moved with it. With some lerping, or slerping, I can have it move along the line if I want to.
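(A quick plain-Java port of the first method, handy for sanity-checking the numbers away from the renderer; the class and method names are mine. For the sample points above, a = (-100, 0, 200) and b = (50, 0, 250), the delta is (150, 0, 50), so pitch comes out 0 and yaw is atan2(150, 50) ≈ 71.57 degrees:)

```java
// Pitch/yaw (in degrees) of the direction from point a to point b; roll fixed at 0.
public class LineAngles {
    public static float[] anglesDeg(float[] a, float[] b) {
        float x = b[0] - a[0];
        float y = b[1] - a[1];
        float z = b[2] - a[2];
        // Pitch: angle of the direction above/below the horizontal plane.
        float pitch = (float) Math.toDegrees(Math.atan2(Math.sqrt(x * x + z * z), y)) - 90f;
        // Yaw: heading in the x-z plane.
        float yaw = (float) Math.toDegrees(Math.atan2(x, z));
        return new float[]{pitch, yaw, 0f}; // roll stays 0, as in the post
    }
}
```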
  10. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    When I get home, I'll check out the Turn To Mesh / Poly modifier. I think this will be easier. I still need the line-segment path for the camera to follow, though, which means I still need to properly get the angles of the line so the camera can turn to that angle. I may need to resort to quaternions; I don't know.
  11. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Here is what I'm trying to do. I made a wormhole mesh in Autodesk 3D Studio Max using interconnected line segments, and set the line radius to something like 150, which turned it into a wormhole tunnel. The problem is, when I exported the mesh, I got a line path out of it instead of a real mesh. So I had an idea: use this data in my program. Every line segment will have a series of walls around it (like 8 or 10 walls per segment), with the radius being the distance from the line segment, so as I load all the line-segment data, it would spit out the wormhole I created. At the moment, though, I'm only using one polygon quad hovering over the line segment. When the line segment is at a different angle, the polygon should rotate to be at the same angle as the line segment, like in this image: Unfortunately, at other angles it doesn't follow. So I'm just trying to figure out the trig for the quad to always hover over the line. Once I figure this out, the other walls circling the line will just be offsets.
  12. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Wait a tick, I think I'm on to something. Since I need the results independently (angle X, angle Y, angle Z), I applied the 2D angle formula per axis: to get X I take atan2() of y and z, to get Y I take atan2() of x and z, and to get Z I take atan2() of y and x. So far the yaw is 45 degrees while pitch and roll are zero, as intended. I just hope the math is right o.O:

    #include <iostream>
    #include <cmath>
    using namespace std;

    struct Vertex3D { float x; float y; float z; };
    struct Vector3D { float x; float y; float z; };

    const double PI = 4.0 * atan(1.0);

    Vertex3D a = { 0.0f, 0.0f, 0.0f };
    Vertex3D b = { 1.0f, 0.0f, 1.0f };
    Vector3D angle;
    float x, y, z;

    int main()
    {
        // acos domain: -1 to 1, range: 0 <= x <= pi
        // asin domain: -1 to 1, range: -pi/2 <= x <= pi/2
        // atan domain: all real numbers, range: -pi/2 < x < pi/2

        // Previous attempt (direction cosines via dot products with each axis):
        // mag = sqrtf(u.x * u.x + u.y * u.y + u.z * u.z);
        // angle.x = acosf(dot.x / mag) * 180.0f / static_cast<float>(PI); // Pitch
        // angle.y = acosf(dot.y / mag) * 180.0f / static_cast<float>(PI); // Yaw
        // angle.z = acosf(dot.z / mag) * 180.0f / static_cast<float>(PI); // Roll

        x = b.x - a.x;
        y = b.y - a.y;
        z = b.z - a.z;

        angle.x = atan2f(y, z) * 180.0f / static_cast<float>(PI); // Pitch
        angle.y = atan2f(x, z) * 180.0f / static_cast<float>(PI); // Yaw
        angle.z = atan2f(y, x) * 180.0f / static_cast<float>(PI); // Roll

        cout << "Point a is at " << a.x << ", " << a.y << ", " << a.z << endl;
        cout << "Point b is at " << b.x << ", " << b.y << ", " << b.z << endl;
        cout << "angle x (Pitch): " << angle.x << endl;
        cout << "angle y (Yaw): " << angle.y << endl;
        cout << "angle z (Roll): " << angle.z << endl;
        system("pause");
        return 0;
    }
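Worth noting: a line direction only pins down two Euler angles, since roll is a rotation about the line itself and a bare line cannot define it. A hedged sketch of one common Y-up convention (yaw from atan2(x, z), pitch from asin(y / length); the names and struct are mine):

```cpp
#include <cmath>

struct Angles { float pitch, yaw; };

// Convert a direction vector to pitch/yaw under a Y-up convention:
// yaw is the heading in the XZ plane, pitch the elevation out of it.
// Roll is intentionally absent -- a line direction cannot determine it.
Angles directionToAngles(float x, float y, float z) {
    const float RAD2DEG = 180.0f / 3.14159265f;
    float len = std::sqrt(x * x + y * y + z * z);
    Angles out;
    out.yaw = std::atan2(x, z) * RAD2DEG;
    out.pitch = std::asin(y / len) * RAD2DEG;
    return out;
}
```

For the post's example direction <1, 0, 1> this gives yaw 45 and pitch 0, which matches the expected result.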
  13. Hello. Basically I'm trying to get the angle of a 3D line as X, Y, and Z angles, in order to rotate polygons to be parallel with that line. In other words, I'm trying to create a "wormhole wall" by taking a series of lines and wrapping 8 polygons around each line at a certain distance from it. But honestly, I don't think the results of getting the angle of a 3D line are correct. If, for example, point A is at <0, 0, 0> and point B is at <1, 0, 1>, you would think the yaw is at a 45 degree angle, while the pitch and roll remain zero. Instead, my pitch is 45, yaw is 90, and roll is 45. I made a simplified version to see if you can help spot anything wrong. Thanks in advance:

    #include <iostream>
    #include <cmath>
    using namespace std;

    struct Vertex3D { float x; float y; float z; };
    struct Vector3D { float x; float y; float z; };

    const double PI = 4.0 * atan(1.0);

    Vertex3D a = { 0.0f, 0.0f, 0.0f };
    Vertex3D b = { 1.0f, 0.0f, 1.0f };
    Vector3D u;
    Vector3D vecX = { 1.0f, 0.0f, 0.0f };
    Vector3D vecY = { 0.0f, 1.0f, 0.0f };
    Vector3D vecZ = { 0.0f, 0.0f, 1.0f };
    Vector3D angle;
    Vector3D dot;
    float mag;

    int main()
    {
        // acos domain: -1 to 1, range: 0 <= x <= pi
        // asin domain: -1 to 1, range: -pi/2 <= x <= pi/2
        // atan domain: all real numbers, range: -pi/2 < x < pi/2

        u.x = b.x - a.x;
        u.y = b.y - a.y;
        u.z = b.z - a.z;

        mag = sqrtf(u.x * u.x + u.y * u.y + u.z * u.z);

        dot.x = u.x * vecX.x + u.y * vecX.y + u.z * vecX.z;
        dot.y = u.x * vecY.x + u.y * vecY.y + u.z * vecY.z;
        dot.z = u.x * vecZ.x + u.y * vecZ.y + u.z * vecZ.z;

        angle.x = acosf(dot.x / mag) * 180.0f / static_cast<float>(PI); // Pitch
        angle.y = acosf(dot.y / mag) * 180.0f / static_cast<float>(PI); // Yaw
        angle.z = acosf(dot.z / mag) * 180.0f / static_cast<float>(PI); // Roll

        cout << "Point a is at " << a.x << ", " << a.y << ", " << a.z << endl;
        cout << "Point b is at " << b.x << ", " << b.y << ", " << b.z << endl;
        cout << "angle x (Pitch): " << angle.x << endl;
        cout << "angle y (Yaw): " << angle.y << endl;
        cout << "angle z (Roll): " << angle.z << endl;
        system("pause");
        return 0;
    }
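As a side note, the acos(dot / mag) values in that code are direction cosines (the angle the line makes with each coordinate axis), not pitch/yaw/roll, which is why <1, 0, 1> yields 45, 90, 45. A small sketch making that explicit (function name is mine):

```cpp
#include <cmath>

// Angle, in degrees, between a direction vector and coordinate axis
// `axis` (0 = X, 1 = Y, 2 = Z). These are the classic direction cosines:
// cos^2(ax) + cos^2(ay) + cos^2(az) always equals 1, and they are three
// independent axis angles rather than a pitch/yaw/roll triple.
float directionAngleWithAxis(float x, float y, float z, int axis) {
    float mag = std::sqrt(x * x + y * y + z * z);
    float comp = (axis == 0) ? x : (axis == 1) ? y : z;
    return std::acos(comp / mag) * 180.0f / 3.14159265f;
}
```

So the 45/90/45 output is self-consistent, it just measures something different from the Euler angles the post expects.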
  14. Psychopathetica

    Wormhole Effect

    Well, I created the wormhole, and it works great in 3D Studio Max. It was easy, actually. I just made line splines that interconnect into one twisted pretzel loop with smooth curves, changed the thickness to 100, added a camera to follow it around, and voilà. Once I successfully load the wormhole tunnel mesh, how can I go about moving a camera inside it? I was thinking of using nodes.
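The node idea can be sketched as simple interpolation along the loaded path points (a minimal linear version under an assumed `Node` type of my own; a Catmull-Rom spline between the same nodes would smooth the corners):

```cpp
#include <cmath>
#include <vector>

struct Node { float x, y, z; };

// Position on the node path for t in [0, 1], where t spans the whole path.
// Finds the segment t falls in and linearly interpolates between its endpoints.
Node cameraOnPath(const std::vector<Node>& nodes, float t) {
    if (nodes.size() == 1) return nodes[0];
    float scaled = t * (nodes.size() - 1);
    int i = (int)scaled;
    if (i >= (int)nodes.size() - 1) return nodes.back();
    float f = scaled - i;  // fraction along segment i -> i + 1
    const Node& a = nodes[i];
    const Node& b = nodes[i + 1];
    return {a.x + (b.x - a.x) * f,
            a.y + (b.y - a.y) * f,
            a.z + (b.z - a.z) * f};
}
```

Aiming the camera at `cameraOnPath(nodes, t + smallStep)` gives a look-ahead target so it turns with the tunnel.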
  15. Psychopathetica

    Wormhole Effect

    Hey guys. I was wondering if there is a mathematical formula to produce the famous "wormhole" effect found in many games and movies, such as Marvel vs Capcom 2's vs screen, StarGate, etc. I tried Googling it, but the closest I've come was this https://gamedev.stackexchange.com/questions/27684/how-do-i-create-a-wormhole-effect-in-c-and-directx but I don't think it's quite what I'm looking for. I want to be able to have it in my game, but so far no luck. It should be similar to this (example stills from pond5.com and stargate.wikia.com). I'm good at math and all, but this is a beast that'll take some time to tackle. Hopefully you guys can help point me in the right direction. Thanks in advance. [EDIT] I may have to create a tunnel mesh in 3D Studio Max, warp the hell out of it, load it into my program, and have a camera follow through, because this guy did: https://forum.unity.com/threads/wormhole-tunnel-effect.97032/ Guess that's one way of doing it.
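One formula-based starting point (a sketch of my own, not the method from either link): build the tunnel parametrically as rings of vertices around a center line, then fly the camera, or scroll the texture coordinates, along it to get the motion:

```cpp
#include <cmath>
#include <vector>

struct V3 { float x, y, z; };

// Straight parametric tunnel: `rings` cross-sections spaced along -Z over
// `length`, each a circle of `segments` vertices at `radius`. Adjacent rings
// are stitched into quads by the index buffer (not shown). Offsetting each
// ring's center with a curve gives the twisted StarGate look; animating the
// texture's V coordinate over time gives the fly-through effect.
std::vector<V3> buildTunnel(int rings, int segments, float radius, float length) {
    std::vector<V3> verts;
    verts.reserve((size_t)rings * segments);
    for (int r = 0; r < rings; ++r) {
        float z = -length * r / (rings - 1);
        for (int s = 0; s < segments; ++s) {
            float a = 2.0f * 3.14159265f * s / segments;
            verts.push_back({radius * std::cos(a), radius * std::sin(a), z});
        }
    }
    return verts;
}
```

Rendered from inside with back-face culling flipped (or the winding reversed), the rings read as the tunnel wall.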