About Spykam22

  1. I just tried that, and the module no longer matches up correctly. The modules still only rotate correctly when the attachment point rotations are different rather than exact opposites. When they are exactly opposite, the module just rotates around and faces the same direction as the other attachment point (the attachment point on the module that the new module is trying to connect to).
  2. Ashaman, you are absolutely awesome. I almost have it working perfectly.

     ```cpp
     FTransform NewTransform = FTransform(ThisAttachmentPoint->GetRelativeTransform().GetTranslation()).Inverse();
     NewTransform *= ThisAttachmentPoint->GetRelativeTransform().GetRotation();
     NewTransform *= OtherAttachmentPoint->GetRelativeTransform();
     NewTransform *= OtherModule->GetTransform();
     SetActorTransform(NewTransform);
     ```

     I am having one problem, though. When I try to connect two modules whose rotations are the exact opposite of each other, 'Tm1' ends up equaling 'Tm2'. This is very odd, and it only happens on modules with exactly opposite rotations. For example, attaching Module 2 Attachment Point 2 (Yaw = -90) to Module 1 Attachment Point 1 (Yaw = 0) gives the correct result, but attaching Module 2 Attachment Point 3 (Yaw = -180) to Module 1 Attachment Point 1 (Yaw = 0) does not (screenshots omitted). Any idea? Thanks! :-)
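     The composition above can be sketched outside the engine. The following minimal 2D rigid-transform demo (illustrative types, not UE4 API) shows the intended chain: invert the attachment point's local transform, then append the target attachment point's world transform, so that pushing the local attachment through the resulting module transform lands exactly on the target. It deliberately omits the extra 180-degree flip needed when mating faces must point toward each other.

     ```cpp
     #include <cassert>
     #include <cmath>

     const double PI = std::acos(-1.0);

     // Minimal 2D rigid transform: rotate by 'yaw' (radians), then translate.
     struct Xform2 {
         double yaw, x, y;
     };

     // Compose: apply 'a' first, then 'b' (matches UE4's A * B ordering).
     Xform2 compose(const Xform2& a, const Xform2& b) {
         double c = std::cos(b.yaw), s = std::sin(b.yaw);
         return { a.yaw + b.yaw,
                  b.x + c * a.x - s * a.y,
                  b.y + s * a.x + c * a.y };
     }

     // Inverse of a rigid transform: p -> R^-1 (p - t).
     Xform2 inverse(const Xform2& t) {
         double c = std::cos(-t.yaw), s = std::sin(-t.yaw);
         return { -t.yaw, -(c * t.x - s * t.y), -(s * t.x + c * t.y) };
     }

     int main() {
         // Attachment point local to the new module: 1 unit along +X, yaw 0.
         Xform2 localAttach = { 0.0, 1.0, 0.0 };
         // The other module's attachment point in world space.
         Xform2 worldAttach = { PI, 5.0, 2.0 };
         // New module world transform: cancel the local offset, then snap to the target.
         Xform2 moduleWorld = compose(inverse(localAttach), worldAttach);
         // Verify: the local attachment, pushed through the module's world
         // transform, must coincide with the target attachment point.
         Xform2 attachInWorld = compose(localAttach, moduleWorld);
         assert(std::abs(attachInWorld.x - worldAttach.x) < 1e-9);
         assert(std::abs(attachInWorld.y - worldAttach.y) < 1e-9);
         return 0;
     }
     ```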
  3. Hello Ashaman! Thanks for the response. Please forgive me, but I'm a little confused as to how this translates into C++. My transformation code now looks like this:

     ```cpp
     FTransform NewTransform = OtherModule->GetTransform()
         * OtherAttachmentPoint->GetComponentTransform()
         * FTransform(ThisAttachmentPoint->GetComponentRotation())
         * FTransform(ThisAttachmentPoint->GetComponentLocation()).Inverse();
     SetActorTransform(NewTransform);
     ```

     Does this look correct? I tried it out and it doesn't seem to be working. Thanks man, I really appreciate it. :-)
  4. Hello all! I am working on a project in Unreal Engine 4 and I am not sure I am calculating this correctly. Here is some information on what I am trying to do; please read all the way down to "What needs to be done?" I would like to tackle one problem at a time. The problem I am trying to solve right now is rotating a module and moving it so that the attachment points line up correctly and the rotations match. This is what I currently have:

     ```cpp
     float ZRotation = FMath::RadiansToDegrees(acosf(FVector::DotProduct(
         -ThisAttachmentPoint->GetForwardVector().GetSafeNormal(),
         OtherAttachmentPoint->GetForwardVector().GetSafeNormal())))
         * -FMath::Sign(ThisAttachmentPoint->GetForwardVector().X);
     AddActorWorldRotation(FRotator(0, ZRotation, 0));
     AddActorWorldOffset(OtherAttachmentPoint->GetComponentLocation() - ThisAttachmentPoint->GetComponentLocation());
     ```

     The way I move the object so the attachment points meet works, but the rotation is proving really tricky for me. I thought I'd have the object rotate first and then move to line up, but while this works in some cases, in others it doesn't. I thought the process would be: find the yaw angle between the attachment point's rotation and the object's rotation, add that result to the rotation, then move the object so that both attachment points line up. I can't seem to calculate 'ZRotation' correctly so that it lines up no matter how I rotate the main module that the new module will attach to. I've also tried this:

     ```cpp
     float ZRotation = FMath::RadiansToDegrees(
         FMath::GetAzimuthAndElevation(OtherAttachmentPoint->GetForwardVector().GetSafeNormal(),
             FVector::ForwardVector, FVector::ZeroVector, FVector::ZeroVector).X
         - FMath::GetAzimuthAndElevation(-ThisAttachmentPoint->GetForwardVector().GetSafeNormal(),
             FVector::ForwardVector, FVector::ZeroVector, FVector::ZeroVector).X);
     ```

     Any help? Thanks! :)
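     One pitfall in the acos-based approach above: acos(dot) returns only an unsigned angle in [0, 180], so the sign has to be patched in by hand, and it is numerically poor near 0 and 180 degrees. A common alternative (sketched here in plain C++, not engine code) is to take atan2 of the 2D cross product and the dot product of the two forward vectors, which yields the full signed yaw directly:

     ```cpp
     #include <cassert>
     #include <cmath>

     // Signed yaw (degrees) that rotates direction 'from' onto direction 'to',
     // both taken as 2D vectors in the XY plane. atan2(cross, dot) covers the
     // full signed range (-180, 180], unlike acos(dot), which loses the sign.
     double signedYawDegrees(double fx, double fy, double tx, double ty) {
         double cross = fx * ty - fy * tx;   // z-component of the 3D cross product
         double dot   = fx * tx + fy * ty;
         return std::atan2(cross, dot) * 180.0 / std::acos(-1.0);
     }

     int main() {
         // +X onto +Y is a +90 degree yaw...
         assert(std::abs(signedYawDegrees(1, 0, 0, 1) - 90.0) < 1e-9);
         // ...and +X onto -Y is -90, which acos(dot) alone cannot distinguish.
         assert(std::abs(signedYawDegrees(1, 0, 0, -1) + 90.0) < 1e-9);
         // Exactly opposite directions give a clean 180 instead of a
         // sign-ambiguous, degenerate result.
         assert(std::abs(signedYawDegrees(1, 0, -1, 0) - 180.0) < 1e-9);
         return 0;
     }
     ```

     In engine terms, the inputs would be the XY components of the two attachment points' forward vectors.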
  5. Calculating Right Vector from Two Points

    Hey haegarr! I got it working thanks to you! Thanks! :D
  6. Calculating Right Vector from Two Points

    Hey! I got some sleep and I understand it now. I don't know why I thought it was a multiplication sign. I've changed it to "1 + FVector::DotProduct(s1, s2)" now. :D I will be testing it soon.
  7. Calculating Right Vector from Two Points

    Hey haegarr! Thanks for the help so far, I really appreciate it. Sorry to bug you with all these questions, but I found a way to get the right vector using the point's rotation (I didn't know it had a function like that until yesterday) and was wondering if this looks right now:

    ```cpp
    float distance = Path->GetDistanceAlongSplineAtSplinePoint(i) + (segmentLength * s);
    float distance2 = Path->GetDistanceAlongSplineAtSplinePoint(i) + (segmentLength * s) + segmentLength;
    FVector v0 = GetTransform().InverseTransformPosition(Path->GetWorldLocationAtDistanceAlongSpline(distance));
    FVector v2 = GetTransform().InverseTransformPosition(Path->GetWorldLocationAtDistanceAlongSpline(distance2));
    FRotator v0Rotation = Path->GetWorldRotationAtDistanceAlongSpline(distance);
    FRotator v2Rotation = Path->GetWorldRotationAtDistanceAlongSpline(distance2);
    FVector s1 = FRotationMatrix(v0Rotation).GetScaledAxis(EAxis::Y);
    FVector s2 = FRotationMatrix(v2Rotation).GetScaledAxis(EAxis::Y);
    FVector h = s1 + s2;
    FVector extendDir = h * width / (FVector(1, 1, 1) + s1 * s2);
    ```

    I was a little confused about calculating vi, though (extendDir is that vector). The way I read it was "[halfway vector] multiplied by [width] divided by [a vector with all components set to 1] plus [sideway vector 1] multiplied by [sideway vector 2]". I'm pretty sure I didn't read it right. Can you type the equation out in words? And are these the right points for each calculation? Thanks for the help, I really appreciate it again, man!
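    For reference, one consistent reading of that formula treats the denominator as the scalar 1 + dot(s1, s2), not a component-wise vector. This is the standard miter-join offset: it displaces the shared edge vertex along the halfway vector by just enough that the strip keeps a constant width through the corner. A plain-C++ sketch (illustrative types, not engine code):

    ```cpp
    #include <cassert>
    #include <cmath>

    struct Vec2 { double x, y; };

    // Miter offset for a strip edge: given the unit sideway (right) vectors
    // s1 and s2 of two adjacent segments, the shared edge vertex is displaced by
    //   v = (s1 + s2) * width / (1 + dot(s1, s2))
    // which keeps the strip 'width' units wide through the corner instead of
    // pinching. (Degenerate when the segments fold back, i.e. dot -> -1.)
    Vec2 miterOffset(Vec2 s1, Vec2 s2, double width) {
        double dot = s1.x * s2.x + s1.y * s2.y;
        double k = width / (1.0 + dot);
        return { (s1.x + s2.x) * k, (s1.y + s2.y) * k };
    }

    int main() {
        // Straight spline: both segments agree, offset is simply width * s.
        Vec2 straight = miterOffset({0, 1}, {0, 1}, 3.0);
        assert(std::abs(straight.x - 0.0) < 1e-9 && std::abs(straight.y - 3.0) < 1e-9);

        // 90-degree corner: sideway vectors (1,0) and (0,1); the miter is
        // longer than 'width' (by 1/cos(45 deg)) so the strip stays as wide.
        Vec2 corner = miterOffset({1, 0}, {0, 1}, 3.0);
        double len = std::hypot(corner.x, corner.y);
        assert(std::abs(len - 3.0 * std::sqrt(2.0)) < 1e-9);
        return 0;
    }
    ```

    In words: the offset is the halfway vector (s1 + s2), scaled by width, divided by the scalar one-plus-the-dot-product of the two sideway vectors.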
  8. Calculating Right Vector from Two Points

      What I am trying to do is get away from this:

      ```cpp
      FVector extendDir = FVector(0, width, 0);
      ```

      Because it always extends the mesh on the Y axis, I get a squished result if I move the spline back while moving it left or right (screenshot omitted). What I would like is for the mesh to maintain its width wherever it is. I don't want it to get squished together; it needs to be the same width at the end as it was before. So I would like to have everything extend according to the right vectors of the points.

      1) I have everything sub-divided and I can change how many segments each part is sub-divided into (screenshot omitted).
      2) I've calculated the sideway vectors for each segment (bottom-left point of the triangle to top-left point of the triangle). Does this look correct?

      ```cpp
      // Transform the Y axis by the rotation at each end of the segment to get the right vectors.
      FVector v0right = FRotationMatrix(Path->GetWorldRotationAtDistanceAlongSpline(
          Path->GetDistanceAlongSplineAtSplinePoint(i) + (segmentLength * s)))
          .TransformVector(FVector(0, 1, 0)).GetSafeNormal();
      FVector v2right = FRotationMatrix(Path->GetWorldRotationAtDistanceAlongSpline(
          Path->GetDistanceAlongSplineAtSplinePoint(i) + (segmentLength * s) + segmentLength))
          .TransformVector(FVector(0, 1, 0)).GetSafeNormal();
      ```

      3) How would I compute the common vector?
      4) How would I compute a sideways displacement?

      Thanks!
  9. Original Thread from Unreal Engine Forums

     Hello all! I am working on a water/river mode plugin for Unreal Engine 4. It seems to be coming along alright. I have the base features working, except for one that only partially works: moving a spline to adjust the path. The problem is that as I move it more to the right or left, it gets weird (screenshot omitted). I already know what the problem is:

     ```cpp
     FVector extendDir = FVector(0, width, 0);
     one.Vertex0.Position = v0;
     one.Vertex1.Position = v0 + extendDir;
     one.Vertex2.Position = v2;
     two.Vertex0.Position = one.Vertex2.Position + extendDir;
     two.Vertex1.Position = one.Vertex2.Position;
     two.Vertex2.Position = one.Vertex1.Position;
     ```

     It is because I'm extending it on the Y axis. I did that intentionally for testing purposes. Now that I know the mesh renders alright, the next thing on my list is to get the right vector of a segment on the spline component so I can extend the mesh in that direction. That way, it will never get squished like that, and the width of the mesh will stay constant throughout. How would I do this? Is it possible? I've been trying all day, and I cannot use the point's rotation because the point has not been rotated, only moved. Is there a way to calculate this using the first point and the connected point or something? Thanks!
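     A right vector can indeed be computed from two points alone: treat the direction between consecutive points as the segment's forward vector, then cross the world up vector with it. A plain-C++ sketch (illustrative types, not engine code; the cross-product order assumed here matches UE4's left-handed, Z-up convention, where up x forward = right):

     ```cpp
     #include <cassert>
     #include <cmath>

     struct Vec3 { double x, y, z; };

     Vec3 cross(Vec3 a, Vec3 b) {
         return { a.y * b.z - a.z * b.y,
                  a.z * b.x - a.x * b.z,
                  a.x * b.y - a.y * b.x };
     }

     Vec3 normalized(Vec3 v) {
         double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
         return { v.x / len, v.y / len, v.z / len };
     }

     // Right vector of a spline segment from two points on it: take the
     // segment direction as forward, then cross world up with it. This needs
     // no stored rotation, so it works for points that were only moved.
     Vec3 rightFromTwoPoints(Vec3 p1, Vec3 p2) {
         Vec3 forward = normalized({ p2.x - p1.x, p2.y - p1.y, p2.z - p1.z });
         Vec3 up = { 0, 0, 1 };
         return normalized(cross(up, forward));
     }

     int main() {
         // A segment running along +X has a right vector of +Y.
         Vec3 r = rightFromTwoPoints({0, 0, 0}, {10, 0, 0});
         assert(std::abs(r.x) < 1e-9);
         assert(std::abs(r.y - 1.0) < 1e-9);
         assert(std::abs(r.z) < 1e-9);
         return 0;
     }
     ```

     Caveat: this degenerates when the segment is vertical (forward parallel to up), where some fallback axis is needed.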
  10. Hello everyone! Thanks for the answers. I think I may just write my DirectX version first. It is being developed alongside a game that I am using it for. It will be my first game to be released. After I am finished, then I'll focus on GL versions. How many people do you guys think use Bootcamp on OS X? Thanks! :)
  11. Hello! I am rewriting my game engine from scratch and trying to decide whether I should support both DirectX and OpenGL, or just DirectX. I understand that OpenGL is cross-platform while DirectX is restricted to Windows and the Xbox. I also have a couple more questions. I am developing on Windows using Visual Studio 2013 Ultimate. If I were to implement OpenGL, how would I build for different platforms, e.g. Linux or Mac? Would I have to build my project on that operating system? If I do develop for both APIs, how should I set up my project? Should I create two separate projects, or keep them both in one project using enums and interfaces? Example:

      ```cpp
      enum NXAPIChoice {
          NXAPI_DirectX,
          NXAPI_OpenGL,
      };

      NXAPIChoice NXGame::DefaultAPI = NXAPIChoice::NXAPI_DirectX;

      void NXGame::FinalInitialize() {
          int sWidth = 0, sHeight = 0;
          InitializeWindows(sWidth, sHeight);
          SetupGPU(sWidth, sHeight);
          //TODO: [HIGH PRIORITY] Initialize Input
          //TODO: [HIGH PRIORITY] Initialize Audio
          Initialize();
          //TODO: [HIGH PRIORITY] Create & Start GameTime
      }

      void NXGame::SetupGPU(int width, int height) {
          //stuff...
          NXGPU::Initialize(DefaultAPI, width, height, ...other);
      }

      void NXGPU::Initialize(NXAPIChoice choice, int width, int height, ...other) {
          if (choice == NXAPIChoice::NXAPI_DirectX)
              InitializeDirect3D(width, height, D3D_FEATURE_LEVEL_11_0, ...other);
          else
              InitializeOpenGL(width, height, GL_VERSION, ...other);
      }
      ```

      Also, if I develop for both and have to compile my code on other platforms (Linux or Mac), will I be able to skip my DirectX code (assuming it would cause errors because DX is not available) and compile just the GL code? Is it even worth adding support for both? Thanks! [EDIT: Or perhaps I should finish my engine with DirectX first (creating empty methods for GL in the process) and start working on GL support (filling in the empty methods) after DirectX is fully implemented?]
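      One common answer to the "skip my DirectX code on Linux/Mac" question is to combine a pure-virtual backend interface with preprocessor guards, so the DirectX code is simply not compiled outside Windows and needs no empty stub methods. A minimal sketch (class names here are illustrative, not from any real engine):

      ```cpp
      #include <cassert>
      #include <string>

      // Abstract GPU backend: each API gets one concrete implementation.
      class GPUBackend {
      public:
          virtual ~GPUBackend() = default;
          virtual void Initialize(int width, int height) = 0;
          virtual std::string Name() const = 0;
      };

      #ifdef _WIN32
      // Compiled only on Windows, so non-Windows builds never see DirectX code.
      class D3DBackend : public GPUBackend {
      public:
          void Initialize(int, int) override { /* D3D11 device/swap-chain setup */ }
          std::string Name() const override { return "Direct3D"; }
      };
      #endif

      class GLBackend : public GPUBackend {
      public:
          void Initialize(int, int) override { /* GL context setup */ }
          std::string Name() const override { return "OpenGL"; }
      };

      // Factory: Direct3D where available, OpenGL everywhere else. The rest of
      // the engine talks only to GPUBackend and never branches on the API.
      GPUBackend* CreateDefaultBackend() {
      #ifdef _WIN32
          return new D3DBackend();
      #else
          return new GLBackend();
      #endif
      }

      int main() {
          GPUBackend* gpu = CreateDefaultBackend();
          gpu->Initialize(1280, 720);
          assert(gpu->Name() == "Direct3D" || gpu->Name() == "OpenGL");
          delete gpu;
          return 0;
      }
      ```

      Compared to a runtime `if (choice == ...)` in every function, this keeps the per-API code in separate translation units, which also answers the cross-compiling question: each platform's build only ever touches the files its API supports.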
  12. Hello! I have two classes: NXGameObject and NXSceneObject.

      ```cpp
      class NXGameObject {
      public:
          LPCWSTR Name;
          bool RenderToTexture = true;
          bool IsActive = false;
          virtual void NEXENGINE_API Initialize() = 0;
          virtual void NEXENGINE_API Update(NXGameTime*) = 0;
          virtual void NEXENGINE_API Render(NXGameTime*) = 0;
          virtual void NEXENGINE_API RenderDeferred(NXGameTime*) {}
          virtual void NEXENGINE_API Dispose() = 0;
      };

      class NXSceneObject : public NXGameObject {
      public:
          void Initialize() {}
          void Render(NXGameTime* gameTime) {}
          virtual void ReadData(NXBinaryReader* reader) {}
          virtual void Update(NXGameTime* gameTime) {}
          virtual void Render(NXCamera3D* camera, NXGameTime* gameTime) {}
          virtual void Dispose() {}

          NXGameObject* ParentScene;
          map<string, NXSceneObject*> Children;
          bool IsVisible;

          XMMATRIX WorldMatrix() {
              return XMMatrixScaling(Size.x, Size.y, Size.z)
                   * XMMatrixTranslation(Position.x, Position.y, Position.z)
                   * XMMatrixRotationRollPitchYaw(Rotation.x, Rotation.y, Rotation.z);
          }

          XMFLOAT3 Position;
          XMFLOAT3 Rotation;
          XMFLOAT3 Size;
          NXBoundingBox Bounds;

          NXSceneObject() {}
          NXSceneObject(NXBinaryReader* reader) {
              // Read main data
              IsVisible = reader->ReadBoolean();
              Position = reader->ReadVector3(false);
              Rotation = reader->ReadVector3(false);
              Size = reader->ReadVector3(false);
              Bounds = reader->ReadBoundingBox(false);
              // Call the created scene-object type's ReadData method to read
              // custom, object-specific data.
              ReadData(reader);
          }
      };
      ```

      Now I have a bunch of different classes (player markers, enemy spawn points, etc.) derived from NXSceneObject. Each class overrides ReadData(NXBinaryReader*). Here's an example:

      ```cpp
      class NXMarker : virtual public NXSceneObject {
      public:
          NXMarker(NXBinaryReader* reader) : NXSceneObject(reader) {}
      };

      class NXPlayerStartMarker : public NXMarker {
      public:
          float Health;
          float MoveSpeed;
          bool Realistic;

          NXPlayerStartMarker(NXBinaryReader* reader) : NXMarker(reader) {}

          void ReadData(NXBinaryReader* reader) override {
              Realistic = reader->ReadBoolean();
              Health = reader->ReadFloat();
              MoveSpeed = reader->ReadFloat();
          }
      };
      ```

      My problem is that, no matter what I try, the derived classes don't seem to be overriding ReadData, and the created class's ReadData method is never called. I put breakpoints inside NXPlayerStartMarker's ReadData method, and they are never hit. I am trying to override the method so that after the main data is read in the NXSceneObject constructor, it calls the ReadData method of whatever type of class is being created. Sorry if I wasn't clear. Any help? Thanks! :)
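      The likely cause, sketched in a minimal standalone example: C++ does not dispatch virtually while a base-class constructor is running, so a virtual call made from NXSceneObject's constructor always runs the base version of ReadData, never the derived override. The usual fix is two-phase initialization, calling the virtual through the object after construction finishes (the names below are illustrative, not from the original code):

      ```cpp
      #include <cassert>

      // During Base's constructor, the dynamic type of the object is still
      // Base, so the virtual call resolves to Base::Init -- not the override.
      struct Base {
          int value = 0;
          Base() { Init(); }            // runs Base::Init even for derived objects
          virtual ~Base() = default;
          virtual void Init() { value = 1; }
      };

      struct Derived : Base {
          void Init() override { value = 2; }   // never reached from Base::Base()
      };

      // Two-phase initialization: construct first, then call the virtual on a
      // fully-constructed object (often wrapped in a factory like this).
      template <typename T>
      T* Create() {
          T* obj = new T();
          obj->Init();                  // virtual dispatch works here
          return obj;
      }

      int main() {
          Derived inConstructor;
          assert(inConstructor.value == 1);     // override was NOT called

          Derived* twoPhase = Create<Derived>();
          assert(twoPhase->value == 2);         // override IS called after construction
          delete twoPhase;
          return 0;
      }
      ```

      Applied to the code above, that would mean moving the ReadData call out of the NXSceneObject constructor into a factory or loader that calls it after the most-derived object exists.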
  13. What do you want in a Next Generation Console?

    Hello everyone! Thanks, all the information you are providing is really helpful. I know, right! That is not going to be the case with my product. Great info and ideas; you read my mind for the most part. Trust me, it will have lots of memory. Thanks again! Continue to leave replies about what you would want to see in a next-generation console. Kickstarter coming soon.