Unity Making an endless runner similar to Turbo Pug

Hey everyone!


I'm pretty new to this forum, so I don't know if this is the right place to post this. I'm a hobbyist game developer and have been teaching myself Game Maker for a few years now, with a view to moving over to Unity in the near future.


I've been thinking about how to make a 2D side-scrolling, procedurally generated endless runner, and have worked out the basic programming logic (move the background rather than the player, have platforms spawn and scroll across the screen, etc.). I've been playing an endless runner called Turbo Pug recently, and have been trying to work out the programming logic behind its platforms. I've posted a video below.

Most endless runners I've come across vary the length of their platforms (somewhat like Canabalt), but Turbo Pug seems to have set sizes/types of platform. I'm unsure whether a different sprite is drawn for each platform 'type' (which seems laborious and unlikely), or whether the developers have meshed together different combinations of tiles. This still seems really complex to me, so if anyone has any ideas about how this might have been done, that would be really helpful! I'm not asking anyone to code this for me, just to give me some idea of the logic behind it. Cheers!


Turbo Pug: (embedded gameplay video)
Yes - and also whether they've drawn different platform types, or whether they somehow generate them through code?


"with a view to moving over to Unity in the near future."


I would look at UE4's Paper 2D side of the engine. It has some very good resources for this.

Edited by Navyman


Procedural generation like this relies on creating a set of rules defining what you want to see and then determining how you can generate data within those bounds.


From what I saw of that game, they made a handful of concepts for floors and ceilings, then added possible variations to each concept, such as how high or low the thing is, or how wide it is. If your runner accelerates over time, then you probably want to increase the width of things (gaps, floors, etc.) a little over time as well. Other than that, you just use the RNG to pick all the variable factors (within the ranges you specified) for the thing, spawn the thing with those factors, and then use the RNG one more time to determine how long it will be before another thing of that type is spawned.


For example, you could have one function for spawning floors, one for spawning ceilings, and one for spawning groups of score tokens. If you use the distance you've run as a timer, then each of those functions can generate and spawn its thing and return the timestamp at which it should next be called. Every frame you check the timer against those timestamps and call the functions that are due.
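That scheduling scheme could be sketched like this (in Python for brevity rather than Unity C#; the `Spawner` class, names, and gap ranges are illustrative assumptions, not Turbo Pug's actual code):

```python
import random

class Spawner:
    """One spawner per kind of 'thing' (floors, ceilings, score tokens)."""

    def __init__(self, name, min_gap, max_gap):
        self.name = name          # e.g. "floor" -- illustrative label
        self.min_gap = min_gap    # shortest run distance before the next spawn
        self.max_gap = max_gap    # longest run distance before the next spawn
        self.next_at = 0.0        # distance at which this spawner fires next

    def spawn(self, distance):
        """Spawn one thing, then use the RNG to schedule the next spawn."""
        # A real game would instantiate the object here, with its other
        # variable factors (width, height, ...) also picked from allowed ranges.
        self.next_at = distance + random.uniform(self.min_gap, self.max_gap)
        return (self.name, distance)

def update(spawners, distance):
    """Call every frame with the distance run so far; fires any due spawners."""
    return [s.spawn(distance) for s in spawners if distance >= s.next_at]
```

In Unity you could poll the same logic from `Update()`, using the distance scrolled (e.g. the camera's x position) as the timer.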


Make sense?


Thank you, that makes perfect sense! You've set out a really good method for coding procedural generation, and I think it's within my ability to put something like that together. However, I'm still unsure whether they used your specific method in Turbo Pug. Most sets of blocks seem to be the same width and appear at the same height every time, and the gaps between them seem to be the same width. If that's the case, do you think they've simply drawn separate sets of sprites (whilst using the procedural generation method you suggested above)?

Edited by matt93

I didn't look too closely at it, but when I made a runner with some other folks a while back we had a few tall but narrow sprites that we tiled together to make spans of rooftop of varying lengths and heights.

Whether the developers of Turbo Pug used large sprites or just predefined chunks with tiles doesn't have much impact on the final process, but if they used tiles (it sure looks like it) then it would have been easier for them to create and edit the chunks during development. Tiles are also smaller on disk and in memory, although the difference in this case is probably trivial.
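The "predefined chunks meshed together from tiles" idea can be sketched like this (Python for brevity; the chunk layouts and tile IDs are invented for the example, not taken from Turbo Pug):

```python
# Each chunk is a small grid of tile IDs; spawning a chunk just stamps its
# tiles at an offset. 'L'/'M'/'R' = left-cap/middle/right-cap surface tiles,
# '#' = filler beneath the surface, '.' = empty cell. All made up here.
CHUNKS = {
    "short_floor":  ["LMR"],
    "long_floor":   ["LMMMMR"],
    "raised_floor": ["LMR",
                     "###"],
}

def stamp_chunk(name, origin_x, origin_y):
    """Return (tile_id, x, y) placements for one predefined chunk."""
    placements = []
    for row_idx, row in enumerate(CHUNKS[name]):
        for col_idx, tile in enumerate(row):
            if tile != ".":
                placements.append((tile, origin_x + col_idx, origin_y + row_idx))
    return placements
```

Editing a chunk during development is then just editing a few strings, which is the authoring convenience mentioned above.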

  • Similar Content

    • By Innocuous
      If you want to incorporate noise into your shaders, the Turbulence Library has you covered. Using code I gathered from this library, I made a cginc file that contains all you need to easily implement noise in your Unity shaders. Who knows how this stuff works, but man, does it work well!
      Here is an example of what you can create using these noise functions.
    • By Nio Martinez
      I'll be buying a new laptop as my workstation for building games, mostly 3D but nothing hardcore.
      I'm stuck choosing between the two specs below. Does this really matter, and if so, can someone tell me how and why it matters?
      Intel Core i5-8250U (8th gen Kaby Lake Refresh; 6 MB Smart Cache, 1.6 GHz base with Turbo Boost up to 3.4 GHz), 4 cores / 8 threads
      RAM 8 GB DDR4 (2400 MHz)
      GPU 2 GB DDR5 Nvidia MX150 256 bit
      SSD: yes
      Intel Core i7-7500U (4 MB Cache, 2.70 GHz base, up to 3.50 GHz boost), 2 cores / 4 threads
      RAM 4 GB DDR4 (1800 MHz)
      GPU 2 GB DDR5 Nvidia GeForce 940MX 256 bit
      SSD: No
    • By Manuel Berger
      Hello fellow devs!
      Once again I started working on a 2D adventure game, and right now I'm doing the character movement/animation. I'm not a big math guy; I was happy with my solution, but soon realized that it's flawed.
      My player has 5 walking animations, mirrored for the left side: up, up-right, right, down-right, down. With the atan2 function I get the angle between the player and the destination. To get an index from 0 to 4, I divide PI by 5 and see how many times it goes into the player-destination angle.

      In Pseudo-Code:
      angle = atan2(destination.x - player.x, destination.y - player.y) //swapped y and x to get mirrored angle around the y axis
      index = (int) (angle / (PI / 5));
      PlayAnimation(index); //0 = up, 1 = up_right, 2 = right, 3 = down_right, 4 = down

      Besides the fact that an angle equal to PI produces an index of 5, this works like a charm. Or at least I thought so at first. When I tested it, I realized that the up and down animations play more often than the others, which is pretty logical, since they effectively cover double the angle.

      What I'm trying to achieve is something like this, but with equal angles, so that up and down has the same range as all other directions.

      I can't get my head around it. Any suggestions? Is the whole approach doomed?

      Thank you in advance for any input!
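One common way to make the sectors equal, sketched in Python (this is not code from the thread; it assumes the same swapped atan2(dx, dy) convention as the pseudocode above, with y increasing upward):

```python
import math

def direction_index(dx, dy):
    """Map a direction vector to 0=up, 1=up_right, 2=right, 3=down_right, 4=down.

    abs() folds the mirrored left side onto the right, and round() gives every
    direction an equal 45-degree share of the circle: 'up' and 'down' each get
    22.5 degrees per side of the y axis, which the mirroring doubles back to 45.
    """
    angle = abs(math.atan2(dx, dy))   # x/y swapped as in the pseudocode; 0 = up, pi = down
    return round(angle / (math.pi / 4))
```

This also fixes the PI edge case: at angle == PI, round(4.0) is 4 ('down'), not an out-of-range 5.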
    • By devbyskc
      Hi Everyone,
      Like most here, I'm a newbie, but I have been dabbling with game development for a few years. I am currently working full-time overseas and learning the craft in my spare time. It's been a long but highly rewarding adventure. Much of my time has been spent working through tutorials. In all of them, as well as in my own attempts at development, I used the audio files supplied by the tutorial author or obtained from one of the numerous sites online.
      I am working solo, and will be for a while, so I don't want to get too wrapped up in any one skill set. Regarding audio, the files I've found and used were good for what I was doing at the time, but I would now like to try my hand at customizing the audio more. My game engine of choice is Unity, and it has a built-in audio mixer that I have experimented with following their tutorials. I have obtained a great book called Game Audio Development with Unity 5.x that I am working through. Halfway through, the book introduces FMOD to supplement the Unity Audio Mixer; later, the author introduces Reaper (a very popular DAW) as an external program for composing and mixing music to be integrated with Unity.
      I did some research on DAWs and quickly became overwhelmed. Much of what I found was geared toward professional sound engineers and sound designers, and I am in no way trying, or even thinking about, getting to that level. All I want to be able to do is take a music file and tweak it a bit to get the sound I want for my game. I've played with Audacity as well, but it didn't seem to fit the bill, which is why I am looking at a better-quality DAW. Being solo, I am also under a budget constraint, so of all the DAW software out there I am considering Reaper or PreSonus Studio One due to their pricing.
      My question is: is investing the time to learn a DAW to tweak sound files worth it? Are there any solo developers currently using a DAW as part of their overall workflow? If so, which one? I've also come across Fabric, a Unity plug-in that enhances the built-in audio mixer. Would that be a better alternative?
      I know this is long, and maybe I haven't communicated well in trying to be brief, but any advice from the gurus/vets would be greatly appreciated. I've learned so much and had a lot of fun in the process. BTW, I am also a senior citizen (I cut my programming teeth using punch cards and Structured Basic when it first came out). If anyone needs more clarification of what I am trying to accomplish, please let me know. Thanks in advance for any assistance/advice.
    • By Yosef BenSadon
      Hi, I was considering this startup, http://adshir.com/, for investment, and I would like a little feedback on what the developer community thinks about the technology.
      So far what they have is a demo that runs in real time on a tablet at over 60 FPS; it runs locally on the integrated GPU of an i7. They have a 20,000-triangle dinosaur that looks impressive, better than anything I've seen on a mobile device, with reflections and shadows looking very close to how they would look in the real world. They achieved this thanks to a new algorithm for a rendering technique called path tracing/ray tracing, which is very demanding and so far has been used mostly for static images.
      From what I've checked, there is no real option for real-time ray tracing (60 FPS on consumer devices). Imagination Technologies was supposed to release a chip that supports real-time ray tracing, but I have not found that they have a product on the market, or even that the technology is finished, as the last demo I found ran on a PC. The other one is OTOY with their Brigade engine, which is still unreleased and, if I understand correctly, is more a cloud solution than a hardware solution.
      Would there be sizable interest in the developer community in having such a product as a plug-in for existing game engines? How important is ray tracing to the future of high-end real-time graphics?