Psychopathetica

Member
  • Content count

    172
Community Reputation

259 Neutral

About Psychopathetica

  • Rank
    Member

Personal Information

  • Industry Role
    Game Designer
  • Interests
    Audio
    Programming


  1. Psychopathetica

    3D Picking Complex Models

    Another idea I had in mind was to do something in the shader: somehow feed the ray-cast information to the shader that draws the polygon model and test whether the ray overlaps it, possibly in the fragment shader.
  2. Psychopathetica

    3D Picking Complex Models

    After a tiring search on Google, I came up dry looking for a way to do 3D picking on complex object models, meaning any model that isn't a sphere, ellipsoid, or box; a character model would be a good example. I found two ways to do picking. One method is reading a color back from the screen, which is not good because the color could be anything. The other method is ray casting. The only problem is that the ray-casting tests in nearly every tutorial are done against simplified models such as spheres or cubes. Yet 3D Studio Max, World of Warcraft, and other programs clearly show that you can click any model, not just primitives. Does anyone have a good idea of how this can be done? Thanks in advance.
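    One approach that isn't mentioned in the post: cast the picking ray against the model's own triangles, usually after a cheap bounding-sphere or bounding-box rejection test, and keep the nearest hit. A minimal Moller-Trumbore ray/triangle sketch in Java (class and helper names are illustrative, not from any engine):

// Sketch only: Moller-Trumbore ray/triangle intersection.
// Returns the distance t along the ray to the hit, or -1 if there is no hit.
// Picking a model means running this over its triangles (ideally after a
// bounding-volume rejection test) and keeping the smallest positive t.
public final class RayPick {

    private static final float EPSILON = 1e-6f;

    public static float intersectTriangle(float[] origin, float[] dir,
                                          float[] v0, float[] v1, float[] v2) {
        float[] e1 = sub(v1, v0);
        float[] e2 = sub(v2, v0);
        float[] p  = cross(dir, e2);
        float det  = dot(e1, p);
        if (Math.abs(det) < EPSILON) return -1f;     // ray parallel to triangle
        float invDet = 1f / det;
        float[] t = sub(origin, v0);
        float u = dot(t, p) * invDet;
        if (u < 0f || u > 1f) return -1f;            // outside first barycentric bound
        float[] q = cross(t, e1);
        float v = dot(dir, q) * invDet;
        if (v < 0f || u + v > 1f) return -1f;        // outside second barycentric bound
        float dist = dot(e2, q) * invDet;
        return dist > EPSILON ? dist : -1f;          // hit only if in front of the ray origin
    }

    private static float[] sub(float[] a, float[] b) {
        return new float[] { a[0] - b[0], a[1] - b[1], a[2] - b[2] };
    }

    private static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    private static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }
}

    The ray itself would come from unprojecting the click position through the inverse of the view-projection matrix, and the triangles from the same vertex/index data the model is rendered with.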
  3. Psychopathetica

    OpenGL ES Vertex Buffer Issue

    Another observation: if you comment out the normal code (which was working, by the way), texture mapping works as intended too. It's only when I toss in normals that texture mapping uses only one color. And the entire thing works when vertex buffers are removed. Weird. Not sure if this is just the emulator doing this, but I'm going to test it on my real phone to see if it works.

public void bindData(){
    int offset = 0;

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);

    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false, POSITION_COMPONENT_STRIDE_3D, offset);
    offset += POSITION_COMPONENT_COUNT_3D;

    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false, COLOR_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += COLOR_COMPONENT_COUNT;

    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false, TEXTURE_COORDINATE_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    //offset += TEXTURE_COORDINATES_COMPONENT_COUNT;

    //glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false, NORMAL_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);

    glBindBuffer(GL_ARRAY_BUFFER, 0);

    /////////////////////////////////////////////////////

    /*
    vertexBuffer.position(0);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false, POSITION_COMPONENT_STRIDE_3D, vertexBuffer);

    colorBuffer.position(0);
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false, COLOR_COMPONENT_STRIDE, colorBuffer);

    textureCoordBuffer.position(0);
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false, TEXTURE_COORDINATE_COMPONENT_STRIDE, textureCoordBuffer);

    normalBuffer.position(0);
    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false, NORMAL_COMPONENT_STRIDE, normalBuffer);
    */
}

public void createVertexBuffer(){
    glGenBuffers(1, vertexBufferObject, 0);

    int vertexLine = POSITION_COMPONENT_COUNT_3D + COLOR_COMPONENT_COUNT + TEXTURE_COORDINATES_COMPONENT_COUNT;

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glBufferData(GL_ARRAY_BUFFER, numberOfVertices * vertexLine * BYTES_PER_FLOAT, null, GL_STATIC_DRAW);

    int offset = 0;
    glBufferSubData(GL_ARRAY_BUFFER, offset, numberOfVertices * POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT, vertexBuffer);
    offset += POSITION_COMPONENT_COUNT_3D;

    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT, colorBuffer);
    offset += COLOR_COMPONENT_COUNT;

    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT, textureCoordBuffer);
    //offset += TEXTURE_COORDINATES_COMPONENT_COUNT;

    //glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT, normalBuffer);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

    [EDIT] OMG, it was the emulator. But I don't understand why. It works great on the phone and tablet, and I set the emulator's RAM pretty high, like 4 GB.
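    For reference, the layout this code builds is batched rather than interleaved: each attribute gets its own block inside the one VBO, one block after another. A small sketch (illustrative names, not from the code above) of how the byte offsets of those blocks fall out, which is what the int-offset form of glVertexAttribPointer expects as its last argument once the VBO is bound:

// Sketch only: byte offsets of each attribute block in a batched (non-interleaved) VBO,
// using this loader's component counts (4 position, 4 color, 2 texture coordinate,
// 3 normal floats per vertex).
public final class BatchedLayout {
    public static final int BYTES_PER_FLOAT = 4;

    public static int[] offsets(int numberOfVertices) {
        int positionOffset = 0;
        int colorOffset    = positionOffset + numberOfVertices * 4 * BYTES_PER_FLOAT;
        int texCoordOffset = colorOffset    + numberOfVertices * 4 * BYTES_PER_FLOAT;
        int normalOffset   = texCoordOffset + numberOfVertices * 2 * BYTES_PER_FLOAT;
        return new int[] { positionOffset, colorOffset, texCoordOffset, normalOffset };
    }
}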
  4. Not long ago, I created a nice OBJ loader that loads 3D Studio Max files. The only problem is that, although it works great, I wasn't using vertex buffers. Now that I've applied vertex buffers, it seems to use only the first color of the texture and spread it across the whole polygon. I've examined my code over and over again, and the vertex buffer code is correct. But when I comment out all of my vertex buffer code, it works as intended. I've practically given up on fixing it on my own, so hopefully you guys will be able to figure out what is wrong.

public static final int BYTES_PER_FLOAT = 4;
public static final int POSITION_COMPONENT_COUNT_3D = 4;
public static final int COLOR_COMPONENT_COUNT = 4;
public static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
public static final int NORMAL_COMPONENT_COUNT = 3;
public static final int POSITION_COMPONENT_STRIDE_2D = POSITION_COMPONENT_COUNT_2D * BYTES_PER_FLOAT;
public static final int POSITION_COMPONENT_STRIDE_3D = POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT;
public static final int COLOR_COMPONENT_STRIDE = COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT;
public static final int TEXTURE_COORDINATE_COMPONENT_STRIDE = TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT;
public static final int NORMAL_COMPONENT_STRIDE = NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT;

int loadFile() {
    ArrayList<Vertex3D> tempVertexArrayList = new ArrayList<Vertex3D>();
    ArrayList<TextureCoord2D> tempTextureCoordArrayList = new ArrayList<TextureCoord2D>();
    ArrayList<Vector3D> tempNormalArrayList = new ArrayList<Vector3D>();
    ArrayList<Face3D> tempFaceArrayList = new ArrayList<Face3D>();
    StringBuilder body = new StringBuilder();

    try {
        InputStream inputStream = context.getResources().openRawResource(resourceID);
        InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
        BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
        String nextLine;
        String subString;
        String[] stringArray;
        String[] stringArray2;
        int[] indexNumberList = new int[3];
        int[] textureCoordNumberList = new int[3];
        int[] normalNumberList = new int[3];
        int i = 0;
        int j = 0;
        int k = 0;

        try {
            while ((nextLine = bufferedReader.readLine()) != null) {
                if (nextLine.startsWith("v ")) {
                    subString = nextLine.substring(1).trim();
                    stringArray = subString.split(" ");

                    try {
                        tempVertexArrayList.add(new Vertex3D(Float.parseFloat(stringArray[0]), Float.parseFloat(stringArray[1]), Float.parseFloat(stringArray[2]), 1f));
                    } catch (NumberFormatException e) {
                        Log.d(TAG, "Error: Invalid number format in loading vertex list");
                        return 0;
                    }

                    String x = String.valueOf(tempVertexArrayList.get(i).x);
                    String y = String.valueOf(tempVertexArrayList.get(i).y);
                    String z = String.valueOf(tempVertexArrayList.get(i).z);
                    //Log.d(TAG, "vertex " + String.valueOf(i) + ": " + x + ", " + y + ", " + z);
                    i++;
                }

                if (nextLine.startsWith("vn ")) {
                    subString = nextLine.substring(2).trim();
                    stringArray = subString.split(" ");

                    try {
                        if (reverseNormals) {
                            tempNormalArrayList.add(new Vector3D(-Float.parseFloat(stringArray[0]), -Float.parseFloat(stringArray[1]), -Float.parseFloat(stringArray[2])));
                        } else {
                            tempNormalArrayList.add(new Vector3D(Float.parseFloat(stringArray[0]), Float.parseFloat(stringArray[1]), Float.parseFloat(stringArray[2])));
                        }
                    } catch (NumberFormatException e) {
                        Log.d(TAG, "Error: Invalid number format in loading normal list");
                        return 0;
                    }

                    String nx = String.valueOf(tempNormalArrayList.get(j).x);
                    String ny = String.valueOf(tempNormalArrayList.get(j).y);
                    String nz = String.valueOf(tempNormalArrayList.get(j).z);
                    //Log.d(TAG, "normal " + String.valueOf(j) + ": " + nx + ", " + ny + ", " + nz);
                    j++;
                }

                if (nextLine.startsWith("vt ")) {
                    subString = nextLine.substring(2).trim();
                    stringArray = subString.split(" ");

                    try {
                        tempTextureCoordArrayList.add(new TextureCoord2D(Float.parseFloat(stringArray[0]), Float.parseFloat(stringArray[1])));
                    } catch (NumberFormatException e) {
                        Log.d(TAG, "Error: Invalid number format in loading texture coordinate list");
                        return 0;
                    }

                    String tu = String.valueOf(tempTextureCoordArrayList.get(k).tu);
                    String tv = String.valueOf(tempTextureCoordArrayList.get(k).tv);
                    //Log.d(TAG, "texture coord " + String.valueOf(k) + ": " + tu + ", " + tv);
                    k++;
                }

                if (nextLine.startsWith("f ")) {
                    subString = nextLine.substring(1).trim();
                    stringArray = subString.split(" ");

                    for (int index = 0; index <= 2; index++) {
                        stringArray2 = stringArray[index].split("/");

                        try {
                            indexNumberList[index] = Integer.parseInt(stringArray2[0]) - 1;
                            if (indexNumberList[index] < 0) {
                                Log.d(TAG, "Error: indexNumberList[] is less than zero");
                                return 0;
                            }
                        } catch (NumberFormatException e) {
                            Log.d(TAG, "Error: Invalid number format in loading indexNumberList[]");
                            return 0;
                        }

                        try {
                            textureCoordNumberList[index] = Integer.parseInt(stringArray2[1]) - 1;
                            if (textureCoordNumberList[index] < 0) {
                                Log.d(TAG, "Error: textureCoordNumberList[] is less than zero");
                                return 0;
                            }
                        } catch (NumberFormatException e) {
                            Log.d(TAG, "Error: Invalid number format in loading textureCoordNumberList[]");
                            return 0;
                        }

                        try {
                            normalNumberList[index] = Integer.parseInt(stringArray2[2]) - 1;
                            if (normalNumberList[index] < 0) {
                                Log.d(TAG, "Error: normalNumberList[] is less than zero");
                                return 0;
                            }
                        } catch (NumberFormatException e) {
                            Log.d(TAG, "Error: Invalid number format in loading normalNumberList[]");
                            return 0;
                        }
                    }

                    tempFaceArrayList.add(new Face3D(indexNumberList[0], textureCoordNumberList[0], normalNumberList[0], indexNumberList[1], textureCoordNumberList[1], normalNumberList[1], indexNumberList[2], textureCoordNumberList[2], normalNumberList[2]));
                }

                body.append(nextLine);
                body.append('\n');
            }

            //Now that everything has successfully loaded, you can now populate the public variables.
            if (tempVertexArrayList != null && tempVertexArrayList.size() != 0)
                vertexArrayList.addAll(tempVertexArrayList);

            if (tempTextureCoordArrayList != null && tempTextureCoordArrayList.size() != 0)
                textureCoordArrayList.addAll(tempTextureCoordArrayList);

            if (tempNormalArrayList != null && tempNormalArrayList.size() != 0)
                normalArrayList.addAll(tempNormalArrayList);

            if (tempFaceArrayList != null && tempFaceArrayList.size() != 0)
                faceArrayList.addAll(tempFaceArrayList);

            vertexList = new float[faceArrayList.size() * POSITION_COMPONENT_COUNT_3D * NUMBER_OF_SIDES_PER_FACE];
            indexList = new short[faceArrayList.size() * NUMBER_OF_SIDES_PER_FACE];
            colorList = new float[faceArrayList.size() * COLOR_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];
            textureCoordList = new float[faceArrayList.size() * TEXTURE_COORDINATES_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];
            normalList = new float[faceArrayList.size() * NORMAL_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE];

            int nextFace = 0;
            int step = POSITION_COMPONENT_COUNT_3D * NUMBER_OF_SIDES_PER_FACE;

            for (int currentVertex = 0; currentVertex < vertexList.length; currentVertex += step) {
                vertexList[currentVertex + 0] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).x;
                vertexList[currentVertex + 1] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).y;
                vertexList[currentVertex + 2] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).z;
                vertexList[currentVertex + 3] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(0)).w;

                vertexList[currentVertex + 4] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).x;
                vertexList[currentVertex + 5] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).y;
                vertexList[currentVertex + 6] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).z;
                vertexList[currentVertex + 7] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(1)).w;

                vertexList[currentVertex + 8] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).x;
                vertexList[currentVertex + 9] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).y;
                vertexList[currentVertex + 10] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).z;
                vertexList[currentVertex + 11] = vertexArrayList.get(faceArrayList.get(nextFace).indexNumberList.get(2)).w;

                nextFace++;
            }

            numberOfVertices = vertexList.length / POSITION_COMPONENT_COUNT_3D;

            nextFace = 0;
            for (int currentIndex = 0; currentIndex < indexList.length; currentIndex += NUMBER_OF_SIDES_PER_FACE) {
                indexList[currentIndex + 0] = faceArrayList.get(nextFace).indexNumberList.get(0).shortValue();
                indexList[currentIndex + 1] = faceArrayList.get(nextFace).indexNumberList.get(1).shortValue();
                indexList[currentIndex + 2] = faceArrayList.get(nextFace).indexNumberList.get(2).shortValue();
            }

            step = COLOR_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < colorList.length; currentVertex += step) {
                colorList[currentVertex + 0] = red;
                colorList[currentVertex + 1] = green;
                colorList[currentVertex + 2] = blue;
                colorList[currentVertex + 3] = alpha;

                colorList[currentVertex + 4] = red;
                colorList[currentVertex + 5] = green;
                colorList[currentVertex + 6] = blue;
                colorList[currentVertex + 7] = alpha;

                colorList[currentVertex + 8] = red;
                colorList[currentVertex + 9] = green;
                colorList[currentVertex + 10] = blue;
                colorList[currentVertex + 11] = alpha;
            }

            nextFace = 0;
            step = TEXTURE_COORDINATES_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < textureCoordList.length; currentVertex += step) {
                textureCoordList[currentVertex + 0] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(0)).tu * mult;
                textureCoordList[currentVertex + 1] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(0)).tv * mult;

                textureCoordList[currentVertex + 2] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(1)).tu * mult;
                textureCoordList[currentVertex + 3] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(1)).tv * mult;

                textureCoordList[currentVertex + 4] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(2)).tu * mult;
                textureCoordList[currentVertex + 5] = textureCoordArrayList.get(faceArrayList.get(nextFace).textureCoordNumberList.get(2)).tv * mult;

                nextFace++;
            }

            nextFace = 0;
            step = NORMAL_COMPONENT_COUNT * NUMBER_OF_SIDES_PER_FACE;
            for (int currentVertex = 0; currentVertex < normalList.length; currentVertex += step) {
                normalList[currentVertex + 0] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).x;
                normalList[currentVertex + 1] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).y;
                normalList[currentVertex + 2] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(0)).z;

                normalList[currentVertex + 3] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).x;
                normalList[currentVertex + 4] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).y;
                normalList[currentVertex + 5] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(1)).z;

                normalList[currentVertex + 6] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).x;
                normalList[currentVertex + 7] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).y;
                normalList[currentVertex + 8] = normalArrayList.get(faceArrayList.get(nextFace).normalNumberList.get(2)).z;

                nextFace++;
            }

            vertexBuffer = ByteBuffer.allocateDirect(vertexList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            indexBuffer = ByteBuffer.allocateDirect(indexList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asShortBuffer();
            colorBuffer = ByteBuffer.allocateDirect(colorList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            textureCoordBuffer = ByteBuffer.allocateDirect(textureCoordList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();
            normalBuffer = ByteBuffer.allocateDirect(normalList.length * BYTES_PER_FLOAT).order(ByteOrder.nativeOrder()).asFloatBuffer();

            vertexBuffer.put(vertexList).position(0);
            indexBuffer.put(indexList).position(0);
            colorBuffer.put(colorList).position(0);
            textureCoordBuffer.put(textureCoordList).position(0);
            normalBuffer.put(normalList).position(0);

            createVertexBuffer();

            uMVPMatrixHandle = glGetUniformLocation(program, U_MVPMATRIX);
            uMVMatrixHandle = glGetUniformLocation(program, U_MVMATRIX);
            uTextureUnitHandle = glGetUniformLocation(program, U_TEXTURE_UNIT);
            aPositionHandle = glGetAttribLocation(program, A_POSITION);
            aColorHandle = glGetAttribLocation(program, A_COLOR);
            aTextureCoordinateHandle = glGetAttribLocation(program, A_TEXTURE_COORDINATES);
            aNormalHandle = glGetAttribLocation(program, A_NORMAL);

            glEnableVertexAttribArray(aPositionHandle);
            glEnableVertexAttribArray(aColorHandle);
            glEnableVertexAttribArray(aTextureCoordinateHandle);
            glEnableVertexAttribArray(aNormalHandle);

            glActiveTexture(GL_TEXTURE0);
            glUniform1i(uTextureUnitHandle, 0);
        } catch (IOException e) {
        }
    } catch (Resources.NotFoundException nfe) {
        throw new RuntimeException("Resource not found: " + resourceID, nfe);
    }

    return 1;
}

public void draw() {
    glEnable(GL_DEPTH_TEST);

    bindData();

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glDrawArrays(GL_TRIANGLES, 0, faceArrayList.size() * NUMBER_OF_SIDES_PER_FACE);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

public void bindData() {
    int offset = 0;

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);

    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false, POSITION_COMPONENT_STRIDE_3D, offset);
    offset += POSITION_COMPONENT_COUNT_3D;

    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false, COLOR_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += COLOR_COMPONENT_COUNT;

    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false, TEXTURE_COORDINATE_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);
    offset += TEXTURE_COORDINATES_COMPONENT_COUNT;

    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false, NORMAL_COMPONENT_STRIDE, numberOfVertices * offset * BYTES_PER_FLOAT);

    glBindBuffer(GL_ARRAY_BUFFER, 0);

    /////////////////////////////////////////////////////

    /*
    vertexBuffer.position(0);
    glVertexAttribPointer(aPositionHandle, POSITION_COMPONENT_COUNT_3D, GL_FLOAT, false, POSITION_COMPONENT_STRIDE_3D, vertexBuffer);

    colorBuffer.position(0);
    glVertexAttribPointer(aColorHandle, COLOR_COMPONENT_COUNT, GL_FLOAT, false, COLOR_COMPONENT_STRIDE, colorBuffer);

    textureCoordBuffer.position(0);
    glVertexAttribPointer(aTextureCoordinateHandle, TEXTURE_COORDINATES_COMPONENT_COUNT, GL_FLOAT, false, TEXTURE_COORDINATE_COMPONENT_STRIDE, textureCoordBuffer);

    normalBuffer.position(0);
    glVertexAttribPointer(aNormalHandle, NORMAL_COMPONENT_COUNT, GL_FLOAT, false, NORMAL_COMPONENT_STRIDE, normalBuffer);
    */
}

public void createVertexBuffer() {
    glGenBuffers(1, vertexBufferObject, 0);

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject[0]);
    glBufferData(GL_ARRAY_BUFFER, numberOfVertices * (POSITION_COMPONENT_COUNT_3D + COLOR_COMPONENT_COUNT + TEXTURE_COORDINATES_COMPONENT_COUNT + NORMAL_COMPONENT_COUNT) * BYTES_PER_FLOAT, null, GL_STATIC_DRAW);

    int offset = 0;
    glBufferSubData(GL_ARRAY_BUFFER, offset, numberOfVertices * POSITION_COMPONENT_COUNT_3D * BYTES_PER_FLOAT, vertexBuffer);
    offset += POSITION_COMPONENT_COUNT_3D;

    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * COLOR_COMPONENT_COUNT * BYTES_PER_FLOAT, colorBuffer);
    offset += COLOR_COMPONENT_COUNT;

    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * TEXTURE_COORDINATES_COMPONENT_COUNT * BYTES_PER_FLOAT, textureCoordBuffer);
    offset += TEXTURE_COORDINATES_COMPONENT_COUNT;

    glBufferSubData(GL_ARRAY_BUFFER, numberOfVertices * offset * BYTES_PER_FLOAT, numberOfVertices * NORMAL_COMPONENT_COUNT * BYTES_PER_FLOAT, normalBuffer);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
  5. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    atan2(x, z), I believe, is essentially the same as atan(x / z), except that atan2 also handles negative z and the z = 0 case by picking the correct quadrant. If JavaScript doesn't allow division by zero, you can guard against it with a simple if statement.
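    For illustration only (not from the thread), a quick comparison in Java; JavaScript's Math.atan2 follows the same (y, x) argument convention and full-circle range:

// Sketch: atan(x / z) only matches atan2(x, z) when z > 0.
// atan2 resolves the full 360-degree range and handles z == 0 without dividing.
public class YawDemo {
    public static void main(String[] args) {
        double x = 1.0, z = -1.0;                        // direction pointing back and to the right
        double naive = Math.toDegrees(Math.atan(x / z)); // -45: wrong quadrant
        double full  = Math.toDegrees(Math.atan2(x, z)); // 135: correct yaw
        System.out.println(naive + " vs " + full);
    }
}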
  6. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Success! I created a wormhole mesh. Since I have both the linear path and the mesh, I tried to work out how to get the angle between the 2 points in 3D, and managed to solve that as well. I need this for many applications, such as sprite following and camera following at that particular angle. This is the math I used, and I actually found 2 methods. Roll is always zero because you never see the line "roll", and it threw off the values anyway. Method one was to use this:

float x, y, z;
D3DXVECTOR3 a = { -100.0f, 0.0f, 200.0f };
D3DXVECTOR3 b = { 50.0f, 0.0f, 250.0f };
D3DXVECTOR3 angle;

x = b.x - a.x;
y = b.y - a.y;
z = b.z - a.z;

angle.x = atan2f(sqrtf(x * x + z * z), y) * (180.0f / static_cast<float>(PI)) - 90.0f; //Pitch
angle.y = atan2f(x, z) * (180.0f / static_cast<float>(PI));                            //Yaw
angle.z = 0.0f;                                                                        //Roll

    Another way to do the same thing with less math was this:

float x, y, z;
D3DXVECTOR3 a = { -100.0f, 0.0f, 200.0f };
D3DXVECTOR3 b = { 50.0f, 0.0f, 250.0f };
D3DXVECTOR3 angle;

x = b.x - a.x;
y = b.y - a.y;
z = b.z - a.z;

angle.x = -atan2f(y, z) * (180.0f / static_cast<float>(PI)); //Pitch
angle.y = atan2f(x, z) * (180.0f / static_cast<float>(PI));  //Yaw
angle.z = 0.0f;                                              //Roll

    It did not matter what angle the line was at; the polygon always moved with it. With some lerping or slerping, I can have it move along the line if I want to.
  7. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    When I get home, I'll check out the Turn to Mesh / Turn to Poly modifier. I think this will be easier. I still need the line segment path, though, for the camera to follow, which means I still need to properly get the angles of the line for the camera to turn at that angle. I may need to resort to quaternions. I don't know.
  8. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Here is what I'm trying to do. I made a wormhole mesh in Autodesk 3D Studio Max using interconnected line segments and set the line radius to something like 150, which turned it into a wormhole tunnel. The problem is that when I exported the mesh, I got a line path out of it instead of a real mesh. So I had an idea to use this data in my program: every line segment will have a series of walls around it (like 8 or 10 walls per segment), with the radius being the distance from the line segment, so as I load all the line segment data it would spit out the wormhole I created. At the moment, though, I'm only using one polygon quad hovering over the line segment. When the line segment is at a different angle, the polygon should rotate to be at the same angle as the line segment, like in this image: Unfortunately, when I try other angles, it doesn't follow along. So I'm just trying to figure out the trig needed for it to always hover over the line. Once I figure this out, the other walls that circle the line will just be offsets.
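    Not from the original post, but one way to sketch the "walls around a segment" idea without computing Euler angles at all: build two vectors perpendicular to the segment direction and sweep them around a circle to get the wall positions. Illustrative Java (to match the OpenGL ES code elsewhere in this list; the same math carries over to C++):

// Sketch only: 'count' points in a ring of radius r around the midpoint of segment a->b,
// lying in the plane perpendicular to the segment. Each neighboring pair of ring points,
// extruded along the segment, would form one wall quad.
public final class SegmentRing {
    public static float[][] around(float[] a, float[] b, float r, int count) {
        float dx = b[0] - a[0], dy = b[1] - a[1], dz = b[2] - a[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
        dx /= len; dy /= len; dz /= len;                      // unit segment direction

        // Pick a helper vector that is not parallel to the direction.
        float ux = 0f, uy = 1f, uz = 0f;
        if (Math.abs(dy) > 0.99f) { ux = 1f; uy = 0f; }

        // side = normalize(direction x helper), perpendicular to the segment
        float sx = dy * uz - dz * uy, sy = dz * ux - dx * uz, sz = dx * uy - dy * ux;
        float sLen = (float) Math.sqrt(sx * sx + sy * sy + sz * sz);
        sx /= sLen; sy /= sLen; sz /= sLen;

        // perp = direction x side, second axis of the ring plane (already unit length)
        float px = dy * sz - dz * sy, py = dz * sx - dx * sz, pz = dx * sy - dy * sx;

        float mx = (a[0] + b[0]) * 0.5f, my = (a[1] + b[1]) * 0.5f, mz = (a[2] + b[2]) * 0.5f;
        float[][] ring = new float[count][3];
        for (int i = 0; i < count; i++) {
            double t = 2.0 * Math.PI * i / count;
            float c = (float) Math.cos(t), s = (float) Math.sin(t);
            ring[i][0] = mx + r * (c * sx + s * px);
            ring[i][1] = my + r * (c * sy + s * py);
            ring[i][2] = mz + r * (c * sz + s * pz);
        }
        return ring;
    }
}

    The single hovering quad in the post corresponds to just one of these ring points; the other 7 to 9 walls come out of the same loop for free.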
  9. Psychopathetica

    Get Angle of 3D Line and Rotate Polygon With Result

    Wait a tick, I think I'm on to something. Since I need the results independently (angle X, angle Y, angle Z), I applied the 2D angle approach to 3D: to get X I take atan2() of y and z, to get Y I take atan2() of x and z, and to get Z I take atan2() of y and x. So far the yaw is 45 degrees while pitch and roll are zero, as intended. I just hope the math is right o.O:

#include <iostream>
#include <cmath>
#include <cstdlib>

using namespace std;

int main(void);

struct Vertex3D
{
    float x;
    float y;
    float z;
};

struct Vector3D
{
    float x;
    float y;
    float z;
};

const double PI = 4.0 * atan(1.0);

Vertex3D a = { 0.0f, 0.0f, 0.0f };
Vertex3D b = { 1.0f, 0.0f, 1.0f };
Vector3D u;
Vector3D vecX = { 1.0f, 0.0f, 0.0f };
Vector3D vecY = { 0.0f, 1.0f, 0.0f };
Vector3D vecZ = { 0.0f, 0.0f, 1.0f };
Vector3D angle;
Vector3D dot;
float mag;
float x, y, z;

int main()
{
    //acos domain: -1 to 1         range: 0 < x < pi
    //asin domain: -1 to 1         range: -pi/2 < x < pi/2
    //atan domain: all real num    range: -pi/2 < x < pi/2

    //u.x = b.x - a.x;
    //u.y = b.y - a.y;
    //u.z = b.z - a.z;

    //mag = sqrtf(u.x * u.x + u.y * u.y + u.z * u.z);

    //dot.x = u.x * vecX.x + u.y * vecX.y + u.z * vecX.z;
    //dot.y = u.x * vecY.x + u.y * vecY.y + u.z * vecY.z;
    //dot.z = u.x * vecZ.x + u.y * vecZ.y + u.z * vecZ.z;

    //angle.x = acosf(dot.x / mag) * 180.0f / static_cast<float>(PI); //Pitch
    //angle.y = acosf(dot.y / mag) * 180.0f / static_cast<float>(PI); //Yaw
    //angle.z = acosf(dot.z / mag) * 180.0f / static_cast<float>(PI); //Roll

    x = b.x - a.x;
    y = b.y - a.y;
    z = b.z - a.z;

    angle.x = atan2f(y, z) * 180.0f / static_cast<float>(PI);
    angle.y = atan2f(x, z) * 180.0f / static_cast<float>(PI);
    angle.z = atan2f(y, x) * 180.0f / static_cast<float>(PI);

    cout << "Point a is at " << a.x << ", " << a.y << ", " << a.z << endl;
    cout << "Point b is at " << b.x << ", " << b.y << ", " << b.z << endl;
    cout << "angle x (Pitch): " << angle.x << endl;
    cout << "angle y (Yaw): " << angle.y << endl;
    cout << "angle z (Roll): " << angle.z << endl;

    system("pause");

    return 0;
}
  10. Hello. Basically I'm trying to get the angle of a 3D line, with the result as X, Y, and Z angles, in order to rotate polygons to be parallel with that angle. In other words, I'm trying to create a "wormhole wall" by taking a series of lines and wrapping 8 polygons around each line at a certain distance from the line. But honestly, I don't think my results for the angle of a 3D line are correct. If, for example, point A is at <0, 0, 0> and point B is at <1, 0, 1>, you would think the yaw is at a 45 degree angle while the pitch and roll remain zero. Instead, my pitch is 45, yaw is 90, and roll is 45. I made a simplified version to see if you can help spot anything wrong. Thanks in advance:

#include <iostream>
#include <cmath>
#include <cstdlib>

using namespace std;

int main(void);

struct Vertex3D
{
    float x;
    float y;
    float z;
};

struct Vector3D
{
    float x;
    float y;
    float z;
};

const double PI = 4.0 * atan(1.0);

Vertex3D a = { 0.0f, 0.0f, 0.0f };
Vertex3D b = { 1.0f, 0.0f, 1.0f };
Vector3D u;
Vector3D vecX = { 1.0f, 0.0f, 0.0f };
Vector3D vecY = { 0.0f, 1.0f, 0.0f };
Vector3D vecZ = { 0.0f, 0.0f, 1.0f };
Vector3D angle;
Vector3D dot;
float mag;

int main()
{
    //acos domain: -1 to 1         range: 0 < x < pi
    //asin domain: -1 to 1         range: -pi/2 < x < pi/2
    //atan domain: all real num    range: -pi/2 < x < pi/2

    u.x = b.x - a.x;
    u.y = b.y - a.y;
    u.z = b.z - a.z;

    mag = sqrtf(u.x * u.x + u.y * u.y + u.z * u.z);

    dot.x = u.x * vecX.x + u.y * vecX.y + u.z * vecX.z;
    dot.y = u.x * vecY.x + u.y * vecY.y + u.z * vecY.z;
    dot.z = u.x * vecZ.x + u.y * vecZ.y + u.z * vecZ.z;

    angle.x = acosf(dot.x / mag) * 180.0f / static_cast<float>(PI); //Pitch
    angle.y = acosf(dot.y / mag) * 180.0f / static_cast<float>(PI); //Yaw
    angle.z = acosf(dot.z / mag) * 180.0f / static_cast<float>(PI); //Roll

    cout << "Point a is at " << a.x << ", " << a.y << ", " << a.z << endl;
    cout << "Point b is at " << b.x << ", " << b.y << ", " << b.z << endl;
    cout << "angle x (Pitch): " << angle.x << endl;
    cout << "angle y (Yaw): " << angle.y << endl;
    cout << "angle z (Roll): " << angle.z << endl;

    system("pause");

    return 0;
}
  11. Psychopathetica

    Wormhole Effect

    Well, I created the wormhole, and it works great in 3D Studio Max. It was easy, actually. I just made line splines that interconnect into one twisted pretzel loop with smooth curves, changed the thickness to 100, added a camera to follow around it, and voila. Once I successfully load the wormhole tunnel mesh, how can I go about having a camera move inside it? I was thinking of using nodes.
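    Not from the original post, but one sketch of the node idea: keep the path nodes the tunnel was built from and interpolate the camera position between them; Catmull-Rom interpolation passes smoothly through the nodes. Illustrative Java, names assumed:

// Sketch only: Catmull-Rom interpolation through the tunnel's path nodes for a camera
// fly-through. nodes holds float[3] control points; t in [0, 1] sweeps the usable path.
public final class TunnelPath {
    public static float[] pointOnPath(float[][] nodes, float t) {
        int segments = nodes.length - 3;             // segments that have 4 control points
        float f = t * segments;
        int i = Math.min((int) f, segments - 1);     // segment index, clamped at the end
        float u = f - i;                             // local parameter inside the segment

        float[] p0 = nodes[i], p1 = nodes[i + 1], p2 = nodes[i + 2], p3 = nodes[i + 3];
        float[] out = new float[3];
        for (int c = 0; c < 3; c++) {
            out[c] = 0.5f * (2f * p1[c]
                    + (-p0[c] + p2[c]) * u
                    + (2f * p0[c] - 5f * p1[c] + 4f * p2[c] - p3[c]) * u * u
                    + (-p0[c] + 3f * p1[c] - 3f * p2[c] + p3[c]) * u * u * u);
        }
        return out;
    }
}

    Each frame the camera would sit at pointOnPath(nodes, t) and look at pointOnPath(nodes, t + smallStep), so it always faces down the tunnel.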
  12. Psychopathetica

    Wormhole Effect

    Hey guys. I was wondering if there is a mathematical formula to produce the famous "wormhole" effect found in many games and movies, such as Marvel vs Capcom 2's vs screen, Stargate, etc. I tried Googling it, but the closest I've come is this: https://gamedev.stackexchange.com/questions/27684/how-do-i-create-a-wormhole-effect-in-c-and-directx and I don't think it's the same thing I'm looking for. I want to be able to have it in my game, but so far no luck. It should look similar to these reference images (from pond5.com and stargate.wikia.com). I'm good at math and all, but this is a beast that'll take some time to tackle. Hopefully you guys can point me in the right direction. Thanks in advance.

    [EDIT] I may have to create a tunnel mesh in 3D Studio Max, warp the hell out of it, load it into my program, and have a camera follow through it, because this guy did: https://forum.unity.com/threads/wormhole-tunnel-effect.97032/ I guess that's one way of doing it.
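    For what it's worth, the old-school 2D "tunnel" effect seen in a lot of games isn't a mesh at all: each screen pixel is mapped to a texture coordinate by its angle around the screen center and its inverse distance from it. A hedged sketch of that mapping in Java (names and constants are illustrative, not from this thread):

// Sketch only: classic screen-space tunnel mapping. Angle around the screen center
// becomes one texture coordinate, inverse distance becomes the other, and scrolling
// v over time gives the fly-through illusion.
public final class TunnelUV {
    public static float[] at(int px, int py, int width, int height, float time) {
        float dx = px - width * 0.5f;
        float dy = py - height * 0.5f;
        float dist = (float) Math.sqrt(dx * dx + dy * dy) + 0.0001f;    // avoid divide by zero
        float u = (float) (Math.atan2(dy, dx) / (2.0 * Math.PI) + 0.5); // angle mapped to [0, 1]
        float v = 100.0f / dist + time;                                 // depth plus scroll
        return new float[] { u, v - (float) Math.floor(v) };            // wrap v into [0, 1)
    }
}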
  13. Psychopathetica

    OpenGL ES 2D OrthoM Matrix Issue

    I don't think there's a point, considering it's doing what I want. Anyways, I fixed it. There was a flaw in my code that was stretching the image in landscape mode.
  14. Hey guys. Wow, it's been super long since I've been here. Anyways, I'm having trouble with my 2D orthoM matrix setup for phones and tablets. Basically, I want my coordinates to start at the top left of the screen. I also want my polygons to remain square regardless of whether the device is in portrait or landscape orientation. At the same time, if I translate the polygon to the middle of the screen, I want it to go to the middle regardless of whether it's in portrait or landscape mode. So far I'm pretty close with this setup:

private float aspectRatio;

@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height) {
    Log.d("Result", "onSurfacedChanged()");

    glViewport(0, 0, width, height);

    if (MainActivity.orientation == Configuration.ORIENTATION_PORTRAIT) {
        Log.d("Result", "onSurfacedChanged(PORTRAIT)");
        aspectRatio = ((float) height / (float) width);
        orthoM(projectionMatrix, 0, 0f, 1f, aspectRatio, 0f, -1f, 1f);
    } else {
        Log.d("Result", "onSurfacedChanged(LANDSCAPE)");
        aspectRatio = ((float) width / (float) height);
        orthoM(projectionMatrix, 0, 0f, aspectRatio, 1f, 0f, -1f, 1f);
    }
}

    When I translate the polygon using translateM(), however, it goes to the middle in portrait mode, but in landscape it only moves partially to the right, as though portrait mode occupied only part of the left of the screen. The only time I can get the translation to match is if, in landscape, I move the aspectRatio variable in orthoM() from the right argument to the bottom argument and make right 1f. That works, but then the polygon is stretched. Do I just multiply the translation values by aspectRatio only when it's in landscape mode to fix this, or is there a better way?

if (MainActivity.orientation == Configuration.ORIENTATION_PORTRAIT) {
    Matrix.translateM(modelMatrix, 0, 0.5f, 0.5f * aspectRatio, 0f);
} else {
    Matrix.translateM(modelMatrix, 0, 0.5f * aspectRatio, 0.5f, 0f);
}

    Thanks in advance.
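    Not from the original post, but one way to keep the translation orientation-independent under this setup (short screen edge spans 1 world unit, long edge spans aspectRatio units) is to convert "fraction of the screen" coordinates into world units in a single helper, so call sites never branch on orientation. Illustrative sketch:

// Sketch only: convert normalized screen coordinates (0..1 across the full screen,
// origin top-left) into world units for an ortho setup where the short screen edge
// spans 1 unit and the long edge spans aspectRatio units.
public final class ScreenUnits {
    public static float[] toWorld(float fx, float fy, boolean isPortrait, float aspectRatio) {
        if (isPortrait) {
            return new float[] { fx, fy * aspectRatio };   // x spans [0, 1], y spans [0, aspectRatio]
        } else {
            return new float[] { fx * aspectRatio, fy };   // x spans [0, aspectRatio], y spans [0, 1]
        }
    }
}

    The screen center is then toWorld(0.5f, 0.5f, ...) in either orientation, which matches the aspect-ratio multiply in the snippet above.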
  15. Psychopathetica

    Create Polygonal View Frustum

    DX11 forces you to use shaders, which unfortunately are not my strong point. DX9 gives you the option to use either. And I've been working with DirectX 9 the longest, so I'm more comfortable with it; that's the only reason I was testing my idea on DirectX 9. On top of that, they screwed up the entire library: anything D3DX is now deprecated, and you have to use a separate set of C++ and header files called DirectXTK, and finding code for working with that on the internet is pretty far-fetched, considering most websites, and even books, still use the deprecated D3DX library. Hell, the Windows SDK no longer even packages the D3DX lib. I didn't want to get into a struggle of spending days just getting the code to draw one measly polygon. With that said, I've played plenty of games where the options menu gives you the choice of using DX9 or DX11 (notice they skipped DX10), World of Warcraft being a fine example. I don't want to turn this thread into a DX war, but you must understand why I still use DirectX 9 to test my idea. I will see if I can find anything on using AABBs to cull quadrants from the view frustum.