# Search the Community

Showing results for tags 'Algorithm'.

Found 170 results

1. ## Algorithm Raycasts or Rigidbody2D for 2D Platformer in Unity

So I am trying to prototype a 2D platformer-like game in Unity and trying to figure out the best method I should be investigating for controlling the player character. I see generally two different ways to handle this: either using Rigidbody2D or using raycasts. The list of things that I am looking to do right now:

- has some sort of gravity-like effect
- moving side to side
- jumping
- double jumping
- being able to walk up slopes of a certain angle (but prevent it past a certain angle and have the player slide down if the slope is at that angle or larger)
- being able to hang off an edge and pull up

I'm sure some things might come up later in development if I get that far, but these are what I want to get working at least at a very basic level before I move on from the character controller for the prototype (if you watch this video for about 15 - 20 seconds, you will see generally most of the functionality I am looking to achieve: https://www.youtube.com/watch?v=6rzTnx6IHqM&start=375).

Now, I started with Rigidbody2D since I do want some level of gravity in the game (not just for the player but for items, enemies, etc.), and I can get the first four working pretty easily, but the fifth is more troublesome (and I have not gotten to the sixth). I can move up slopes (I am setting Rigidbody2D.velocity for movement, as it seems to be the most recommended approach), but, for example, if I am walking up a slope and then stop, my character jumps up a little (I am guessing because of the forward velocity it has when I stop applying horizontal velocity, there is still extra vertical velocity).

So I have two questions:

1. Should I be looking at Rigidbody2D or manual raycasts?
2. Either way, do you have any resources that you would recommend looking at for reference (video tutorials, articles, etc.)?
2. ## OpenGL Wrap around clip space points created in the vertex shader

Hello, I am currently drawing an FFT ocean into a texture, including Dx, Dy and Dz. As I need the real height at a point for another algorithm, I am passing the points through a vertex shader as follows:

```glsl
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec2 texCoords;

// Displacement and normal map
uniform sampler2D displacementMap;
uniform mat4 MVPMatrix;
uniform int N;
uniform vec2 gridLowerLeftCorner;

out float displacedHeight;

void main()
{
    // Displace the original position by the amount in the texture
    vec3 displacedVertex = texture(displacementMap, texCoords).xyz + position;

    // Scale vertex to the 0 -> 1 range
    vec2 waterCellIndices = vec2((displacedVertex.x - gridLowerLeftCorner.x) / N,
                                 (displacedVertex.z - gridLowerLeftCorner.y) / N);

    // Scale it to the -1 -> 1 range
    waterCellIndices = (waterCellIndices * 2.0) - 1.0;

    displacedHeight = displacedVertex.y;
    gl_Position = vec4(waterCellIndices, 0, 1);
}
```

This works correctly (it writes the correct height at a given point). The issue is that some points, due to the Dx and Dz displacement, will end up outside the clip space. These points should instead wrap around, as the ocean is a collection of tiles. As you can see in the attached file, the edges fit together perfectly inside the white square if they would wrap around (this is the clip space as seen in RenderDoc).

Is there any way I could wrap around this texture (in reality, wrap around the clip-space positions) so it all stays inside the viewport correctly? I tried to wrap around in the vertex shader by checking the boundaries and wrapping, but it doesn't work when a triangle has at least one vertex inside the viewport and others outside.

Many thanks,
André
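A quick sketch of the wrapping math (in Python rather than GLSL, just to illustrate the idea): fold a displaced clip-space coordinate back into the [-1, 1) tile with a modulo. Note that this alone only relocates vertices; a triangle straddling the tile seam still ends up stretched across the viewport, so those triangles typically need to be split or the boundary tiles drawn twice, which matches the failure you describe.

```python
def wrap_ndc(x):
    """Fold a displaced NDC coordinate back into the [-1, 1) tile."""
    return ((x + 1.0) % 2.0) - 1.0

# A vertex pushed out to x = 1.5 by Dx re-enters the tile on the left side.
print(wrap_ndc(1.5), wrap_ndc(-1.2), wrap_ndc(0.3))
```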

4. ## Would you pay money for this service?

Hello everyone, my name is Valerio and I'm an expert in business development and product innovation. At the moment I'm working for a big gambling company, and I'm also working on a side project to launch a startup aimed at the whole gaming world. I have an idea and I would like the help of this community to validate it.

Thanks to machine learning it is possible to predict user behavior. For example, from the tests I have done, I can assure you that it is possible to predict, with an accuracy between 80 and 90 percent (depending on the quality of the data), which users will use a certain app next week. My idea is to create a Software as a Service, with a monthly fee, which allows developers and business owners of a game to load tables of data into the software and receive the desired predictive output. For example, thanks to this software you might know which users will play next week and which ones will not, or analyze more specific metrics, like who will make a purchase and who will not, who will use a particular feature and who will not, and so on. With this information, the team that manages the app can set up marketing activities to increase engagement, such as sending push notifications only to those who you know will not play next week, or special offers to those who will not make a purchase.

Here are the questions I need you to answer:
- Do you find this service useful?
- If so, how much would you pay per month for such a service?

Thank you all for your participation, Valerio.
5. ## Algorithm Scalable Data Driven design for RPG

Hello! I'm currently developing a top-down RPG styled game with an Entity Component System architecture and, as the game grows in features, so do my game entities, that is, item models, enemy prototypes, etc. Those definitions are currently in JSON files, but at the end of the day I still have long factory classes that read from those files and populate the entities with their respective components and properties in the most naive way.

Reading through a presentation about Techniques and Strategies for Data-driven design in Game Development (slides 80–93) (warning: big pdf file), there is this "prototyping approach" where you can build up each game entity from multiple prototypes. I find this really interesting; however, the presentation doesn't mention any implementation details and I'm totally in the dark. I don't know how powerful this system should be. By the way, I'm using Java and LibGDX's engine.

My first idea is making a robust prototype-instancing factory where, given a JSON file, it will be able to return an entity populated with its corresponding components. For example:

Enemies.json
```json
{
    "skeleton": {
        "id": 0,
        "components": {
            "HealthComponent": { "totalHealth": 100 },
            "TextureComponent": { "pathToTexture": "assets/skeleton.png" }
        }
    }
}
```

If I wanted to instantiate a Skeleton entity, I would read its prototype, iterate over its components and somehow instantiate them correctly. With this approach I have the following issues:

- It will most likely involve using Java Reflection to instance entities from a JSON file. This is a topic that I know little about, and it will probably end up in dark-magic code.
- Some instance properties can't be prototyped and will have to be passed as parameters to the factory. For example, when creating an enemy entity, an (x, y) position will have to be provided. Suddenly, creating instances is not so straightforward.
- How powerful should this system be? Should it have this "recursive" behavior where you can extend a prototype with another, and so on? This sounds a little bit like dependency injection. Am I reinventing the wheel? Is there anything done already I can make use of?

Even though it's still in its infancy, here is a short demo (under one minute) of my game. Thank you!
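On the "recursive" question: prototype extension can be done with a plain recursive dictionary merge, no reflection needed for the data side. Here is a minimal sketch in Python (the post uses Java/LibGDX; the `extends` key, the helper names, and the `enemy` base prototype are my own invention for illustration):

```python
import json

def merge(base, override):
    """Recursively merge an override dict onto a base prototype dict."""
    result = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge(result[key], value)
        else:
            result[key] = value
    return result

def resolve(name, prototypes):
    """Resolve a prototype, applying its optional 'extends' chain first."""
    proto = prototypes[name]
    parent = proto.get("extends")
    base = resolve(parent, prototypes) if parent else {}
    return merge(base, {k: v for k, v in proto.items() if k != "extends"})

prototypes = json.loads("""
{
  "enemy":    { "components": { "HealthComponent": { "totalHealth": 50 } } },
  "skeleton": { "extends": "enemy",
                "components": { "HealthComponent": { "totalHealth": 100 },
                                "TextureComponent": { "pathToTexture": "assets/skeleton.png" } } }
}
""")

skeleton = resolve("skeleton", prototypes)
print(skeleton["components"]["HealthComponent"]["totalHealth"])  # 100
```

Instance-only properties like (x, y) would then be applied on top of the resolved prototype by the factory, after the merge.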
6. ## Algorithm Flexible Room Layout algorithm

While making a roguelike game, procedural generation has to be quick and yet intriguing enough for the generated level to be fun to just pick up and play. There are many ways to generate and lay out procedurally generated rooms. In The Binding Of Isaac, for example, you have many different types of regular room presets. The generator just picks a preset based on things like room placement and size. Because those rooms are always of fixed size, this is a nice compromise. By having handmade presets, the generated level is somewhat believable (i.e. there are no gaps or obstacles below a room door or secret room and whatnot).

Another example would be Nuclear Throne. The game takes a different approach to procedural generation by keeping it relatively simple. Because it's not room-based like The Binding Of Isaac, there are more things like caves and large open areas. The gameplay also plays into this, as the player needs to eliminate every enemy to spawn a portal to the next level.

Because my game is more inspired by The Binding of Isaac, the right way to procedurally generate rooms would be to use presets, and this is how I make special rooms. However, there's a big difference between The Binding Of Isaac and my game: my regular rooms aren't always the same size. This means that rather than having presets of regular rooms as well as special rooms, I need something more flexible and, most importantly, dynamic.

**The anatomy of a Room**

In my game, as I've said in a previous post, levels are big two-dimensional arrays from which the level geometry is generated. Every room of a level is made using a BSP tree. I won't go into much detail on how rooms are generated, but in essence, we create a grid from which we trace a path between two rooms and sparsely attach bonus rooms along the way. Because I already have room sizes and whatnot from that level generation, I could reuse the same idea for room layouts.
Within rooms, I've also set special anchor points from which props (or, more precisely, prop formations; more on that later...) can be generated.

**Basic Layouts**

The idea here is to have room layout presets. These presets are an array of prop formations, and each of these formations is linked to a specific anchor point. A formation has a two-dimensional boolean array that indicates whether or not there should be a prop in each cell. Let's take, for example, a diamond array. The dimension of the array depends on its room's dimensions. Here's how it's done:

$$size = \left\lceil \frac{2\,\max(RoomSize_{x},RoomSize_{y})}{3} \right\rceil$$

In order to fill in the array's content we actually use common image manipulation algorithms...

**Bresenham's Line Algorithm**

The first algorithm used is Bresenham's Line Algorithm. Its purpose is simply to render a line described by two bitmap points onto a raster image. To put it simply, we get the deviation (delta, or "d" for short) in both X and Y of each point of the described line and compare the two. Depending on which is bigger, we simply increment the point on that axis and colour it in. Here's the implementation:

```csharp
public void TraceLine(Vector2Int p0, Vector2Int p1)
{
    int dx = Mathf.Abs(p1.x - p0.x), sx = p0.x < p1.x ? 1 : -1;
    int dy = Mathf.Abs(p1.y - p0.y), sy = p0.y < p1.y ? 1 : -1;
    int err = (dx > dy ? dx : -dy) / 2, e2;

    while (true)
    {
        m_propArray[p0.x][p0.y] = true;
        if (p0.x == p1.x && p0.y == p1.y) { break; }
        e2 = err;
        if (e2 > -dx) { err -= dy; p0.x += sx; }
        if (e2 < dy)  { err += dx; p0.y += sy; }
    }
}
```
**Midpoint Circle Algorithm**

The midpoint circle algorithm is used to render a circle onto an image. The idea is somewhat similar to Bresenham's Line Algorithm, but rather than drawing a line we draw a circle. To do this we also, for simplicity's sake, divide the circle into 8 pieces, called octants. We can do this because circles are always symmetric. (It's also a nice way to unroll loops.) Here's the implementation:

```csharp
private void TraceCircle(Vector2Int center, int r, AbstractPropFormation formation)
{
    int d = (5 - r * 4) / 4;
    int x = 0;
    int y = r;

    do
    {
        // Ensure the index is in range before setting (depends on your image implementation):
        // here we check that the pixel location is within the bounds of the image before setting it.
        if (IsValidPoint(center + new Vector2Int(x, y)))   { formation.m_propArray[center.x + x][center.y + y] = true; }
        if (IsValidPoint(center + new Vector2Int(x, -y)))  { formation.m_propArray[center.x + x][center.y - y] = true; }
        if (IsValidPoint(center + new Vector2Int(-x, y)))  { formation.m_propArray[center.x - x][center.y + y] = true; }
        if (IsValidPoint(center + new Vector2Int(-x, -y))) { formation.m_propArray[center.x - x][center.y - y] = true; }
        if (IsValidPoint(center + new Vector2Int(y, x)))   { formation.m_propArray[center.x + y][center.y + x] = true; }
        if (IsValidPoint(center + new Vector2Int(y, -x)))  { formation.m_propArray[center.x + y][center.y - x] = true; }
        if (IsValidPoint(center + new Vector2Int(-y, x)))  { formation.m_propArray[center.x - y][center.y + x] = true; }
        if (IsValidPoint(center + new Vector2Int(-y, -x))) { formation.m_propArray[center.x - y][center.y - x] = true; }

        if (d < 0)
        {
            d += 2 * x + 1;
        }
        else
        {
            d += 2 * (x - y) + 1;
            y--;
        }
        x++;
    } while (x <= y);
}
```
**Flood Fill Algorithm**

This is quite a classic, but it's still useful nevertheless. The idea is to progressively fill a section of an image with a specific colour. The implementation uses a coordinate queue rather than recursion, for optimization's sake. We also fill the image using a west-east orientation: basically, we fill the westmost pixel first, the eastmost second, and finally go north-south. Here's the implementation:

```csharp
public void Fill(Vector2Int point)
{
    Queue<Vector2Int> q = new Queue<Vector2Int>();
    q.Enqueue(point);

    while (q.Count > 0)
    {
        Vector2Int currentPoint = q.Dequeue();

        if (!m_propArray[currentPoint.x][currentPoint.y])
        {
            Vector2Int westPoint = currentPoint,
                       eastPoint = new Vector2Int(currentPoint.x + 1, currentPoint.y);

            while ((westPoint.x >= 0) && !m_propArray[westPoint.x][westPoint.y])
            {
                m_propArray[westPoint.x][westPoint.y] = true;

                if ((westPoint.y > 0) && !m_propArray[westPoint.x][westPoint.y - 1])
                {
                    q.Enqueue(new Vector2Int(westPoint.x, westPoint.y - 1));
                }
                if ((westPoint.y < m_propArray[westPoint.x].Length - 1) && !m_propArray[westPoint.x][westPoint.y + 1])
                {
                    q.Enqueue(new Vector2Int(westPoint.x, westPoint.y + 1));
                }
                westPoint.x--;
            }

            while ((eastPoint.x <= m_propArray.Length - 1) && !m_propArray[eastPoint.x][eastPoint.y])
            {
                m_propArray[eastPoint.x][eastPoint.y] = true;

                if ((eastPoint.y > 0) && !m_propArray[eastPoint.x][eastPoint.y - 1])
                {
                    q.Enqueue(new Vector2Int(eastPoint.x, eastPoint.y - 1));
                }
                if ((eastPoint.y < m_propArray[eastPoint.x].Length - 1) && !m_propArray[eastPoint.x][eastPoint.y + 1])
                {
                    q.Enqueue(new Vector2Int(eastPoint.x, eastPoint.y + 1));
                }
                eastPoint.x++;
            }
        }
    }
}
```

**Formation Shapes**

Each formation also has a specific shape. These shapes simply define the content of the formation array, and we can build them using the previously mentioned algorithms. There are 9 different types of shapes as of now:

- **Vertical line** — a simple vertical line with a width of one
- **Horizontal line** — a simple horizontal line with a width of one
- **Diamond** — a rather nice diamond shape, especially pretty in corners
- **Circle** — rendered using the midpoint circle algorithm; especially pretty in the center of rooms
- **Cross** — a simple cross shape, i.e. a vertical and a horizontal line aligned at the center
- **X shape** — an "X"-shaped cross, i.e. two perpendicular diagonal lines aligned at the center
- **Triangle** — an isosceles triangle
- **Square** — a solid block; every cell of the formation is true
- **Checkers** — a nice variation of the square shape; every other cell is false

There might be more types of shapes as time goes by, but that's about it for now.

**Placing props**

Once the array is set, we simply need to place the actual props in the room. Each formation is of an actual prop type, i.e. rocks, ferns, etc. (For simplicity's sake, let's say that every prop is a 1x1x1 m cube. This simplifies the later steps.) In order to find their positions, we simply align the array's center to the formation's specified anchor point. For each prop formation, we then instantiate props for each true cell while checking whether or not the prop would be outside its room. Afterwards, we do a precise position check to make sure no props are either partially or fully outside a room. Finally, we make sure no room connections are obstructed by props. And voilà, we have a nicely decorated room.

**In-Game Screenshots**

Here's a couple of screenshots of what it looks like in-game.
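As a concrete illustration of the sizing formula above, here is a small Python sketch that builds a diamond formation array. The Manhattan-distance fill rule is my own guess at how the diamond shape is filled; the article only gives the sizing formula.

```python
import math

def diamond_formation(room_w, room_h):
    """Boolean prop array holding a diamond, sized per size = ceil(2*max(w,h)/3).

    The fill rule (Manhattan distance from the array center) is an assumption.
    """
    size = math.ceil(2 * max(room_w, room_h) / 3)
    c = (size - 1) / 2  # geometric center of the array
    return [[abs(x - c) + abs(y - c) <= c for y in range(size)]
            for x in range(size)]

grid = diamond_formation(9, 6)  # size = ceil(2*9/3) = 6
for row in grid:
    print("".join("#" if cell else "." for cell in row))
```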

8. ## Polygons and the Separating Axis Theorem

I have programmed an implementation of the Separating Axis Theorem to handle collisions between 2D convex polygons. It is written in Processing and can be viewed on Github here. There are a couple of issues with it that I would like some help in resolving. In the construction of Polygon objects, you specify the width and height of the polygon and the initial rotation offset by which the vertices will be placed around the polygon. If the rotation offset is 0, the first vertex is placed directly to the right of the object. If higher or lower, the first vertex is placed clockwise or counter-clockwise, respectively, around the circumference of the object by the rotation amount. The rest of the vertices follow by a consistent offset of TWO_PI / number of vertices. While this places the vertices at the correct angle around the polygon, the problem is that if the rotation is anything other than 0, the width and height of the polygon are no longer the values specified. They are reduced because the vertices are placed around the polygon using the sin and cos functions, which often return values other than 1 or -1. Of course, when the half width and half height are multiplied by a sin or cos value other than 1 or -1, they are reduced. This is my issue. How can I place an arbitrary number of vertices at an arbitrary rotation around the polygon, while maintaining both the intended shape specified by the number of vertices (triangle, hexagon, octagon), and the intended width and height of the polygon as specified by the parameter values in the constructor? 
The Polygon code:

```java
class Polygon
{
  PVector position;
  PShape shape;
  int w, h, halfW, halfH;
  color c;
  ArrayList<PVector> vertexOffsets;

  Polygon(PVector position, int numVertices, int w, int h, float rotation)
  {
    this.position = position;
    this.w = w;
    this.h = h;
    this.halfW = w / 2;
    this.halfH = h / 2;
    this.c = color(255);
    vertexOffsets = new ArrayList<PVector>();

    if (numVertices < 3) numVertices = 3;

    shape = createShape();
    shape.beginShape();
    shape.fill(255);
    shape.stroke(255);

    for (int i = 0; i < numVertices; ++i)
    {
      PVector vertex = new PVector(position.x + cos(rotation) * halfW,
                                   position.y + sin(rotation) * halfH);
      shape.vertex(vertex.x, vertex.y);
      rotation += TWO_PI / numVertices;

      PVector vertexOffset = vertex.sub(position);
      vertexOffsets.add(vertexOffset);
    }

    shape.endShape(CLOSE);
  }

  void move(float x, float y)
  {
    position.set(x, y);
    for (int i = 0; i < shape.getVertexCount(); ++i)
    {
      PVector vertexOffset = vertexOffsets.get(i);
      shape.setVertex(i, position.x + vertexOffset.x, position.y + vertexOffset.y);
    }
  }

  void rotate(float angle)
  {
    for (int i = 0; i < shape.getVertexCount(); ++i)
    {
      PVector vertexOffset = vertexOffsets.get(i);
      vertexOffset.rotate(angle);
      shape.setVertex(i, position.x + vertexOffset.x, position.y + vertexOffset.y);
    }
  }

  void setColour(color c)
  {
    this.c = c;
  }

  void render()
  {
    shape.setFill(c);
    shape(shape);
  }
}
```

My other issue is that when two polygons with three vertices each collide, they are not always moved out of collision smoothly by the Minimum Translation Vector returned by the SAT algorithm. The polygon moved out of collision by the MTV does not rest against the other polygon as it should; it instead jumps back a small distance. I find this very strange, as I have been unable to replicate this behaviour when resolving collisions between polygons of other vertex counts, and I cannot find the flaw in the implementation, though it must be there.
What could be causing this incorrect collision resolution, which from my testing appears to only occur between polygons of three vertices? Any help you can provide on these issues would be greatly appreciated. Thank you.
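For the first issue (the shrinking width and height), one possible fix — a sketch of my own, not taken from the thread — is to place the vertices on a unit circle first and then rescale and recentre each axis so the polygon's bounding box matches the requested w and h exactly, regardless of rotation. Python for brevity; the same math ports directly to Processing:

```python
import math

def polygon_vertices(cx, cy, n, w, h, rotation):
    """Place n vertices so the polygon's bounding box is exactly w x h.

    Vertices are first placed on a unit circle at the requested rotation, then
    each axis is rescaled (and recentred) to fit the desired extents.
    """
    angles = [rotation + 2 * math.pi * i / n for i in range(n)]
    xs = [math.cos(a) for a in angles]
    ys = [math.sin(a) for a in angles]
    # Rescale each axis so the full span equals w (resp. h), and recentre it.
    sx = w / (max(xs) - min(xs))
    sy = h / (max(ys) - min(ys))
    mx = (max(xs) + min(xs)) / 2
    my = (max(ys) + min(ys)) / 2
    return [(cx + (x - mx) * sx, cy + (y - my) * sy) for x, y in zip(xs, ys)]
```

Note the trade-off: the shape is stretched to fill the box, so it is no longer a regular polygon inscribed in an ellipse; whether that matches the intent depends on what "width and height" should mean for, say, a rotated triangle.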
9. ## Algorithm Reading the RVA memory address of a given exported function by demangled name

Suppose I don't have any linker at hand but I am calling an exported function from a C++ DLL on Windows, e.g. sqrt from mvcrt14.dll. How would I get just the Relative Virtual Address of sqrt from that DLL, to simulate what the linker does and convert this call into a call to that RVA in the hex-coded generated .exe file? Likewise, how would I read the RVA from the Mac, Android, iOS and Linux library formats?
10. ## Ray intersection on (axis aligned) swept curve?

I'm struggling to find the correct way to do ray intersection with a curve that is swept along an axis. In theory I thought it should be simple; I decompose the problem into components:

- find the intersection on the curve (cross section), which is easy
- extrude that intersection point into an axis-aligned line, then solve the slope intersection against that line to get the final offset.

To be sure I got it right, I'm starting with a swept 45° line centered on the origin (a line-plane intersection with a dot product would be more efficient, but remember I'm trying to validate sweeping along a line).

- line equation: origine + directionVector * t
- line-to-line intersection: t = (origine1 - origine2) / (directionVector2 - directionVector1), assuming they are never parallel.

So let line2dIntersection(directionVector1, origine1, directionVector2, origine2). Assuming the ray starts on the xy plane (pseudo-code), I first compute the cross section in xz:

intersection1 = line2dIntersection(rayDir.xz, vector2(origine.x, 0), vector2(1,1).normalize(), vector2(0,0));
result.xz = rayDir.xz * intersection1;

Then I find the slope sweep offset in yz:

intersection2 = line2dIntersection(rayDir.yz, vector2(origine.y, 0), vector2(1,0).normalize(), vector2(0, result.z));
result.y = rayDir.y * intersection2;

But all my results are garbage. What am I doing wrong? Where is the leap of logic I made?
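One likely culprit: the line-to-line intersection formula above divides vectors component-wise, which isn't well-defined in 2D (each component gives a different t in general). A standard scalar form uses the 2D cross product. A sketch in Python, under the assumption that lines are given as origin + direction * t:

```python
def line2d_intersection(d1, o1, d2, o2):
    """Parameter t along line 1 (o1 + d1*t) where it meets line 2 (o2 + d2*s).

    Uses the 2D cross product: t = cross(o2 - o1, d2) / cross(d1, d2).
    """
    cross = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(cross) < 1e-12:
        raise ValueError("lines are parallel")
    return ((o2[0] - o1[0]) * d2[1] - (o2[1] - o1[1]) * d2[0]) / cross

# Ray going diagonally up-right from (0, -1) hits the horizontal line y = 0:
t = line2d_intersection((1.0, 1.0), (0.0, -1.0), (1.0, 0.0), (0.0, 0.0))
print(t)  # 1.0 -> hit point (0, -1) + 1 * (1, 1) = (1, 0)
```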

12. ## Simple organic and brute force dungeon generation

Last month, I made a pretty simple dungeon generator algorithm. It's an organic brute-force algorithm, in the sense that the rooms and corridors aren't carved into a grid and that it stops when an area doesn't fit in the graph. Here's the algorithm:

1. Start from the center (0, 0) in 2D
2. Generate a room
3. Choose a side to extend to
4. Attach a corridor to that side
5. If it doesn't fit, stop the generation
6. Attach a room at the end of the corridor
7. If it doesn't fit, stop the generation
8. Repeat steps 3 to 7 until enough rooms are generated

It allowed us to test out our pathfinding algorithm (A* & string pulling). Here are some pictures of the output in 2D and 3D:
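The steps above can be sketched in a few lines of Python. This is a simplified illustration, not the author's code: corridors are reduced to a gap between rooms, and all sizes are made-up parameters.

```python
import random

def overlaps(a, b):
    """Axis-aligned rectangle overlap; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def generate(max_rooms, seed=1):
    """Brute-force layout: grow from the last room, stop on the first misfit."""
    rng = random.Random(seed)
    rooms = [(-3, -3, 6, 6)]                       # steps 1-2: room at (0, 0)
    for _ in range(max_rooms - 1):
        x, y, w, h = rooms[-1]
        side = rng.choice("NSEW")                  # step 3: choose a side
        nw, nh = rng.randint(4, 8), rng.randint(4, 8)
        gap = rng.randint(2, 4)                    # step 4: corridor length
        if side == "E":
            nx, ny = x + w + gap, y
        elif side == "W":
            nx, ny = x - gap - nw, y
        elif side == "N":
            nx, ny = x, y + h + gap
        else:
            nx, ny = x, y - gap - nh
        new = (nx, ny, nw, nh)
        if any(overlaps(new, r) for r in rooms):   # steps 5/7: doesn't fit
            break
        rooms.append(new)                          # step 6: attach the room
    return rooms

print(generate(10))
```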

14. ## Look what I found....

Haha! Take a look at what I uncovered the other day while digging through my books... It even came with a good old floppy disk! (I couldn't even use it if I wanted to now.) This book was my very first on 3D graphics programming, before 3D graphics cards were even a thing (for the mainstream masses, anyway). Published in 1994, it's a book that served as a really good primer to understanding 3D geometry and programming, even if I never bothered with the later chapters on curved geometry. All the code samples were in C (not even C++) and there was no mention of things such as texture mapping (although it did cover shadows, reflections and refraction!). It took you right from the beginning: from the equation of a straight line, translations, matrices etc. to projections (viewing pyramid) and clipping algorithms. Great stuff! I still find it useful now as I re-read the introductory chapters and refresh myself on a few basics. My other go-to book is "Real-Time Collision Detection" by Christer Ericson, an absolute diamond of a book! Do you guys and girls have any books that you rely on or hold in high regard for your game development? It would be interesting to know if anyone has ever seen or read the Georg Glaeser book above. Anyway, I'll soon have another update on The Berg, but it's bed time for me now. Night all.

16. ## How to do projective texturing and save the projected part of the image into the mesh's texture?

Hi, guys. I need to project a picture from a projector (maybe a camera) onto some meshes and save the result into the mesh texture according to the mesh's unfolded UVs. It's just like a light map, which encodes the lighting info into a texture, except here it's the projected image instead of the lighting info. The following picture is an example (but it just projects, without writing into the texture). I noticed Blender actually has a function that allows you to draw a texture onto a mesh, but I have no idea how to save those projected pixels into the mesh's texture. I think maybe I can build this feature if I get a better understanding of how light maps are produced. Any advice or materials that can help me out? (Any idea, any platform, or reference?)
17. ## Algorithm Reverse Engineering .RAW files from PS2 game

Hello everyone, I'm new here, and sorry if this isn't the right place to ask, but I've asked in a few forums around the internet and no one has helped with it yet. I have been trying to mod this game for years, but I'm still stuck on the raw files from RACJIN games. Raw Files [ Mod edit: Removed ] I would like to identify the compression algorithm used to compress these files so that they can be decompressed and analyzed. Game: Naruto Uzumaki Chronicles 2... a.k.a. Naruto Konoha Spirits in Japan.
18. ## Removing double vertices algorithm, like Blender's?

I'm looking for an algorithm that I can use to remove vertices that are close to each other within a margin of error in a triangular mesh, pretty much like Blender's "Remove Doubles" feature, if anyone is familiar with it. I think the issue isn't just removing the doubles, but also how I would handle the face indices once I remove the "duplicate" vertices.
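A minimal sketch of one common approach (not Blender's actual implementation): hash vertices into a grid of cell size eps, weld anything within eps (measured per axis here, for simplicity), and remap the face indices through an old-index-to-new-index table, dropping any face that collapses.

```python
def remove_doubles(vertices, faces, eps=1e-5):
    """Weld vertices closer than eps and remap faces to surviving indices."""
    grid = {}    # quantized coordinate -> surviving vertex index
    remap = []   # old index -> new index
    unique = []
    for v in vertices:
        key = tuple(round(c / eps) for c in v)
        found = None
        # Check neighbouring cells too, so near-boundary pairs still merge.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    idx = grid.get((key[0] + dx, key[1] + dy, key[2] + dz))
                    if idx is not None and all(abs(a - b) <= eps
                                               for a, b in zip(v, unique[idx])):
                        found = idx
        if found is None:
            found = len(unique)
            unique.append(v)
            grid[key] = found
        remap.append(found)
    new_faces = [tuple(remap[i] for i in face) for face in faces]
    # Drop faces that collapsed to fewer than 3 distinct vertices.
    new_faces = [f for f in new_faces if len(set(f)) >= 3]
    return unique, new_faces
```

Two faces that become identical after welding survive as duplicates here; deduplicating them is a straightforward extra pass if needed.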
19. ## 3D What's the real tech behind GPU triangle clipping?

I've learned that triangle clipping in the rasterization process usually uses the Sutherland–Hodgman algorithm. I also found a technique called "guard-band clipping". I'm writing a software rasterizer, so I want to know which technique GPUs actually use; I want to implement it for study. Thanks!

Updated: what's the proper algorithm for triangulating the clipped polygon?
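For reference, a single Sutherland–Hodgman step against one clip plane is only a few lines; a guard band just pushes those planes out well beyond the viewport so most triangles never need clipping at all. A sketch in Python (2D, one half-plane; the GLSL/C++ version is the same loop per clip plane):

```python
def clip_polygon(polygon, inside, intersect):
    """Sutherland-Hodgman: clip a polygon against a single edge/plane."""
    out = []
    for i, cur in enumerate(polygon):
        prev = polygon[i - 1]
        if inside(cur):
            if not inside(prev):
                out.append(intersect(prev, cur))  # entering: add crossing point
            out.append(cur)
        elif inside(prev):
            out.append(intersect(prev, cur))      # leaving: add crossing point
    return out

def clip_to_halfplane_x(polygon, xmin):
    """Clip against the half-plane x >= xmin (one guard-band plane)."""
    def inside(p):
        return p[0] >= xmin

    def intersect(a, b):
        t = (xmin - a[0]) / (b[0] - a[0])
        return (xmin, a[1] + t * (b[1] - a[1]))

    return clip_polygon(polygon, inside, intersect)

tri = [(-2.0, 0.0), (2.0, 2.0), (2.0, -2.0)]
print(clip_to_halfplane_x(tri, 0.0))  # the triangle becomes a quad
```

The clipped result is a convex polygon, so triangulating it is just a fan from the first vertex: (v0, v1, v2), (v0, v2, v3), ...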

21. ## Algorithm RNG

Hi guys, I'm new here; I really hope my question won't sound utterly stupid. I'd like to know whether it's better to use a PRNG or a regular RNG if you are trying to program your own video slot machine. Actually, I haven't even clearly understood the difference between the two =D Second question: which developer should I rely on? I'm following this guide; they talk about an RNG, but not which one or where to find it. Thank you in advance :)

23. ## How can I fix this bug affecting my AABB function?

Simply put, my function compares two AABBs for collision detection. When they are around the same size it works fine, but I noticed that if I greatly decrease the size of one of them (or increase the size of the other), then the collision isn't detected correctly; I have to make them intersect for a collision to be registered, rather than having it register when they are at least in direct contact, which is the intended behavior. Below is my code.

```lua
local function DetectCollision(a, b)
    -- AABB to AABB
    local collisionX = (a.Position.X + a.Size.X) >= b.Position.X and (b.Position.X + b.Size.X) >= a.Position.X
    local collisionY = (a.Position.Y + a.Size.Y) >= b.Position.Y and (b.Position.Y + b.Size.Y) >= a.Position.Y
    local collisionZ = (a.Position.Z + a.Size.Z) >= b.Position.Z and (b.Position.Z + b.Size.Z) >= a.Position.Z
    return collisionX and collisionY and collisionZ
end
```

EDIT - To be more specific, the issues start to occur when I cut the size of one of the AABBs in half. For instance, if I have two cubes where one's size is 12 on all axes and the other's is 6 on all axes, then the collision will not register. Upon debugging, I noticed that only one of the collision bools becomes false. This seems to depend on which axis the smaller bounding box approaches from relative to the bigger one, so if I move the smaller AABB away from the bigger one on the y-axis, then collisionY will be false.
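A guess at the cause (an assumption; the post doesn't say how Position is defined): the test above is only correct if Position is each box's minimum corner. If Position is the box's center, the test becomes asymmetric in exactly the way described: it fails only when the smaller box sits on the negative side of the bigger one. A Python sketch of both forms:

```python
def aabb_min_corner(a_pos, a_size, b_pos, b_size):
    """The form used in the post: positions are the boxes' MINIMUM corners."""
    return all(a_pos[i] + a_size[i] >= b_pos[i] and
               b_pos[i] + b_size[i] >= a_pos[i] for i in range(3))

def aabb_centered(a_pos, a_size, b_pos, b_size):
    """Equivalent test when positions are the boxes' CENTERS:
    the center distance may not exceed the sum of the half-sizes."""
    return all(abs(a_pos[i] - b_pos[i]) * 2 <= a_size[i] + b_size[i]
               for i in range(3))

# A size-12 cube at the origin touching a size-6 cube on its negative-x side:
big, small = ((0, 0, 0), (12, 12, 12)), ((-9, 0, 0), (6, 6, 6))
print(aabb_centered(*big, *small))    # touching -> collision
print(aabb_min_corner(*big, *small))  # same data fed to the min-corner test
```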
24. ## A Brief Introduction to Lerp

Linear interpolation (sometimes called 'lerp' or 'mix') is a really handy function for creative coding, game development and generative art. The function interpolates within the range [start..end] based on a 't' parameter, where 't' is typically within a [0..1] range. For example, divide 'loop time' by 'loop duration' and you get a 't' value between 0.0 and 1.0.

Now you can map this 't' value to a new range, such as lerp(20, 50, t) to gradually increase a circle's radius, or lerp(20, 10, t) to gradually decrease its line thickness.

Another example: you can use linear interpolation to smoothly animate from one coordinate to another. Define a start point (x1, y1) and end point (x2, y2), then interpolate the 'x' and 'y' dimensions separately to find the computed point in between.

Or use linear interpolation to spring toward a moving target. Each frame, interpolate from the current value to the target value with a small 't' parameter, such as 0.05. It's like saying: walk 5% toward the target each frame.

A more advanced example, but built on the same concept, is interpolating from one color (red) to another (blue). To do this, we interpolate the (R, G, B) or (H, S, L) channels of the color individually, just like we would with a 2D or 3D coordinate. Another quick example is to choose a random point along a line segment.

There are lots of ways to use linear interpolation, and lots more types of interpolation (cubic, bilinear, etc). These concepts also lead nicely into areas like curves, splines and parametric equations. Source code for each of these examples is available here: https://gist.github.com/mattdesl/3675c85a72075557dbb6b9e3e04a53d9

About the author: Matt DesLauriers is a creative coder and generative artist based in London. He combines code and emergent systems to make art for the web, print media, and physical installations.
Note: This brief introduction to lerp was originally published as a Twitter thread and is republished here with the kind permission of the original author. [Wayback Machine Archive]
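The examples above, as a small runnable Python sketch:

```python
def lerp(start, end, t):
    """Interpolate within [start..end]; t = 0 gives start, t = 1 gives end."""
    return start + (end - start) * t

def lerp_color(c1, c2, t):
    """Interpolate each (R, G, B) channel separately, as described above."""
    return tuple(lerp(a, b, t) for a, b in zip(c1, c2))

print(lerp(20, 50, 0.5))                          # circle radius mid-loop: 35.0
print(lerp_color((255, 0, 0), (0, 0, 255), 0.5))  # halfway from red to blue
```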
25. ## Calculating Experience - Logic? Formula?

Hi! How is XP calculated? For example, to reach level 4 the player should reach 200 XP, and to reach level 5 the player requires 450 XP. Is there a formula? Is there a tutorial available online? Any source where I can learn how XP is calculated? Can you give me a simple example from any existing game where the XP increases whether the player wins or loses? Thanks!
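There's no universal formula; most games pick a polynomial or exponential curve and tune it. As an illustration, the two numbers in the question happen to fit a simple quadratic (a curve fitted to those two data points, not one taken from any particular game):

```python
def xp_required(level):
    """Total XP needed to reach `level`.

    50 * (level - 2)^2 fits the question's numbers:
    200 XP for level 4 and 450 XP for level 5.
    """
    return 50 * (level - 2) ** 2

print([xp_required(n) for n in range(3, 7)])  # [50, 200, 450, 800]
```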