Blogs

Featured Entries

  • Game Design Document, Scoping, Prototyping

    By Stefan Hendriks

    In the monthly progress post I figured I needed a Game Design Document (GDD) before working on extra features. It would be a waste of time working on features when I have no clear picture of where the game should evolve to.

    The first thing I did was search for online resources. I found a bunch, and decided to share those on a page dedicated to Game Development Resources for other game developers.

    While working on the GDD I noticed it takes quite a lot of time, and it is easy to get lost in details for a long time. Then, after a week or so, I came into contact with a guy named Richie, and he had the tip that I had to scope my game into 5 minutes of gameplay. So in 5 minutes, the player has to know:

      • what the game is about
      • the basic mechanics
      • if it is fun to play

    With that I began to look at the GDD I made so far. I immediately noticed that I had way too much material for 5 minutes of gameplay. Obviously this is because an RTS game has several phases. It would be easy to discard the 5 minute thing, but I wanted to follow it through: what would be the essence, the end goal of the game?

    Together with Dorus Verhoeckx, someone I work together with on our own products and dreams, I started stripping out everything that might be ‘distracting’ from the main objective. This meant base building is minimized (2 buildings left). Resource gathering was cut – just get money from structures. Have only 1 unit type. Etc.

    This was done on paper. A lot of stuff was stripped and the core objective exposed. Then I started writing down what was left to do to get this into a working prototype: the prototype, with todos on the right. Lots of them have been crossed off, since I worked on them lately.

    So what is the essence? Domination in RTS games. Conquer specific structures, and hold out long enough to score domination points. Resource gathering? I ditched that in favor of getting money for specific captured structures. (Not all captured structures give you ‘domination points’, though.) You play 5 rounds of 1 minute. After 5 rounds – the player with the most domination points wins!

    Is it a good idea? Is it fun? What about all the other cool stuff I wanted in my RTS game?

    Well, let’s first finish the prototype – I want to get it out before the end of the month. I’d love to hear what you think about this kind of game. And if you’re interested in testing out the prototype, just let me know at stefan[at]indienamic.com. Then, after that, I will decide if I miss stuff, want to re-add things, and iterate further. My expectation is that there will be quite a few iterations / prototypes / refinements before the final ‘game’ mechanics are there.
  • Games Look Bad, Part 1: HDR and Tone Mapping

    By Promit

    This is Part 1 of a series examining techniques used in game graphics and how those techniques fail to deliver a visually appealing end result. See Part 0 for a more thorough explanation of the idea behind it.

    High dynamic range. First experienced by most consumers in late 2005, with Valve’s Half Life 2: Lost Coast demo. Largely faked at the time due to technical limitations, but it laid the groundwork for something we take for granted in nearly every blockbuster title. The contemporaneous reviews were nothing short of gushing. We’ve been busy making a complete god awful mess of it ever since.

    Let’s review, very quickly. In the real world, the total contrast ratio between the brightest highlights and darkest shadows during a sunny day is on the order of 1,000,000:1. We would need 20 bits of just luminance to represent those illumination ranges, before even including color in the mix. A typical DSLR can record 12-14 bits (16,000:1 in ideal conditions). A typical screen can show 8 (curved to 600:1 or so). Your eyes… well, it’s complicated. Wikipedia claims 6.5 (100:1) static. Others disagree.

    Graphics programmers came up with HDR and tone mapping to solve the problem. Both film and digital cameras have this same issue, after all. They have to take enormous contrast ratios at the input, and generate sensible images at the output. So we use HDR to store the giant range for lighting computations, and tone maps to collapse the range to screen. The tone map acts as our virtual “film”, and our virtual camera is loaded with virtual film to make our virtual image. Oh, and we also throw in some eye-related effects that make no sense in cameras and don’t appear in film for good measure. Of course we do.

    And now, let’s marvel at the ways it goes spectacularly wrong. In order: Battlefield 1, Uncharted: Lost Legacy, Call of Duty: Infinite Warfare, and Horizon Zero Dawn.
    HZD is a particular offender in the “terrible tone map” category, and it’s one I could point to all day long. And so we run head first into the problem that plagues games today and will drive this series throughout: at first glance, these are all very pretty 2017 games and there is nothing obviously wrong with the screenshots. But all of them feel videogamey, and none of them would pass for a film or a photograph. Or even a reasonably good offline render. Or a painting. They are instantly recognizable as video games, because only video games try to pass off these trashy contrast curves as aesthetically pleasing. These images look like a kid was playing around in Photoshop and maxed the Contrast slider. Or maybe that kid was just dragging the Curves control around at random.

    The funny thing is, this actually has happened to movies before. Hahaha. Look at that Smaug. He looks terrible. Not terrifying. This could be an in-game screenshot any day. Is it easy to pick on Peter Jackson’s The Hobbit? Yes, it absolutely is. But I think it serves to highlight that while technical limitations are something we absolutely struggle with in games, there is a fundamental artistic component here that is actually not that easy to get right even for film industry professionals with nearly unlimited budgets.

    Allow me an aside here into the world of film production. In 2006, the founder of Oakley sunglasses decided the movie world was disingenuous in their claims of what digital cameras could and could not do, and set out to produce a new class of cinema camera with higher resolution, higher dynamic range, higher everything than the industry had, one that would exceed the technical capabilities of film in every regard. The RED One 4K was born, largely accomplishing its stated goals and being adopted almost immediately by one Peter Jackson. Meanwhile, a cine supply company founded in 1917 called Arri decided they didn’t give a damn about resolution, and shipped the 2K Arri Alexa camera in 2010.
    How did it go? 2015 Oscars: four of the five nominees in the cinematography category were photographed using the ARRI Alexa. Happy belated 100th birthday, Arri.

    So what gives? Well, in the days of film there was a lot of energy expended on developing the look of a particular film stock. It’s not just chemistry; color science and artistic qualities played heavily into designing film stocks, and good directors/cinematographers would (and still do) choose particular films to get the right feel for their productions. RED focused on exceeding the technical capabilities of film, leaving the actual color rendering largely in the hands of the studio. But Arri? Arri focused on achieving the distinctive feel and visual appeal of high quality films. They better understood that even in the big budget world of motion pictures, color rendering and luminance curves are extraordinarily difficult to nail. They perfected that piece of the puzzle and it paid off for them.

    Let’s bring it back to games. The reality is, the tone maps we use in games are janky, partly due to technical limitations. We’re limited to a 1D luminance response where real film produces both hue and saturation shifts. The RGB color space is a bad choice to be doing this in the first place. And because nobody in the game industry has an understanding of film chemistry, we’ve all largely settled on blindly using the same function that somebody somewhere came up with. It was Reinhard in years past, then it was Hable, now it’s ACES RRT. And it’s stop #1 on the train of Why does every game this year look exactly the goddamn same?

    The craziest part is we’re now at the point of real HDR televisions showing game renders with wider input ranges. Take this NVIDIA article, which sees the real problem and walks right past it. The ACES tone map is destructive to chroma. Then they post a Nikon DSLR photo of a TV in HDR mode as a proxy for how much true HDR improves the viewing experience.
    Which is absolutely true – but then why does the LDR photo of your TV look so much better than the LDR tone map image? There’s another tone map in this chain which nobody thought to examine: Nikon’s. They have decades of expertise in doing this. Lo and behold, their curve makes a mockery of the ACES curve used in the reference render. Wanna know why that is? It’s because the ACES RRT was never designed to be an output curve in the first place. Its primary design goal is to massage differences between cameras and lenses used on set so they match better. You’re not supposed to send it to screen! It’s a preview/baseline curve which is supposed to receive a film LUT and color grading over top of it.

    “Oh, but real games do use a post process LUT color grade!” Yeah, and we screwed that up too. We don’t have the technical capability to run real film industry LUTs in the correct color spaces, we don’t have good tools to tune ours, they’re stuck doing double duty for both “filmic look” as well as color grading, the person doing it doesn’t have the training background, and it’s extraordinary what an actual trained human can do after the fact to fix these garbage colors. Is he cheating by doing per-shot color tuning that a dynamic scene can’t possibly accomplish? Yes, obviously. But are you really going to tell me that any of these scenes from any of these games look like they are well balanced in color, contrast, and overall feel?

    Of course, while we’re all running left, Nintendo has always had a fascinating habit of running right. I can show any number of their games for this, but Zelda: Breath of the Wild probably exemplifies it best when it comes to HDR. No HDR. No tone map. The bloom and volumetrics are being done entirely in LDR space. (Or possibly in 10 bit. Not sure.) Because in Nintendo’s eyes, if you can’t control the final outputs of the tone mapped render in the first place, why bother? There’s none of that awful heavy handed contrast. No crushed blacks.
    No randomly saturated whites in the sunset, and saturation overall stays where it belongs across the luminance range. The game doesn’t do that dynamic exposure adjustment effect that nobody actually likes. Does stylized rendering help? Sure. But you know what? Somebody would paint this. It’s artistic. It’s aesthetically pleasing. It’s balanced in its transition from light to dark tones, and the over-brightness is used tastefully without annihilating half the sky in the process.

    Now I don’t think that everybody should walk away from HDR entirely. (Probably.) There’s too much other stuff we’ve committed to which requires it. But for god’s sake, we need to fix our tone maps. We need to find curves that are not so aggressively desaturating. We need curves that transition contrast better from crushed blacks to mid-tones to blown highlights. LUTs are garbage in, garbage out, and they cannot be used to fix bad tone maps. We also need to switch to industry standard tools for authoring and using LUTs, so that artists have better control over what’s going on and can verify those LUTs outside of the rendering engine. In the meantime, the industry’s heavy hitters are just going to keep releasing this kind of over-contrasty garbage.

    Before I finish up, I do want to take a moment to highlight some games that I think actually handle HDR very well. First up is Resident Evil 7, which benefits from a heavily stylized look that over-emphasizes contrast by design. That’s far too much contrast for any normal image, but because we’re dealing with a horror game, it’s effective in giving the whole thing an unsettling feel that fits the setting wonderfully. The player should be uncomfortable with how the light and shadows collide. This particular scene places the jarring transition right in your face, and it’s powerful.

    Next, at risk of seeming hypocritical, I’m going to say Deus Ex: Mankind Divided (as well as its predecessor). The big caveat with DX is that some scenes work really well.
    The daytime outdoors scenes do not. The night time or indoor scenes that fully embrace the surrealistic feeling of the world, though, are just fantastic. Somehow the weird mix of harsh blacks and glowing highlights serves to reinforce the differences between the bright and dark spots that the game is playing with thematically throughout. It’s not a coincidence that Blade Runner 2049 has many similarities. Still too much contrast, though.

    Lastly, I’m going to give props to Forza Horizon 3. Let’s be honest: cars are “easy mode” for HDR. They love it. But there is a specific reason this image works so well. It is low contrast. Nearly all of it lives in the mid-tones, with only a few places wandering into deep shadow (notably the trees) and almost nothing in the bright highlights. But the image is low contrast because cars themselves tend to use a lot of black accents and dark regions which are simply not visible when you crush the blacks as we’ve seen in other games. Thus the toe section of the curve is lifted much more than we normally see. Similarly, overblown highlights would mean whiting out the car in the specular reflections, which are big and pretty much always image-based lighting for cars. It does no good to lose all of that detail, but the entire scene benefits from the requisite decrease in contrast. The exposure level is also noticeably lower, which actually leaves room for better mid-tone saturation. (This is also a trick used by Canon cameras, whose images you see every single day.)

    The whole image ends up with a much softer and more pleasant look that doesn’t carry the inherent stress we find in the images I criticized at the top. If we’re looking for an exemplar for how to do HDR correctly in a non-stylized context, this is the model to go by. Where does all this leave us? With a bunch of terrible looking games, mostly.
    There are a few technical changes we need to make right up front, from basic decreases in contrast to simple tweaks to the tone map to improved tools for LUT authoring. But as the Zelda and Forza screenshots demonstrate, and as the Hobbit screenshot warns us, this is not just a technical problem. Bad aesthetic choices are being made in the output stages of the engine that are then forced on the rest of the creative process. Engine devs are telling art directors that their choices in tone maps are one of three, and two are legacy options. Is it bad art direction or bad graphics engineering? It’s both, and I suspect both departments are blaming the other for it.

    The tone map may sit at the end of the graphics pipeline, but in film production it’s the first choice you make. You can’t make a movie without loading film stock in the camera, and you only get to make that choice once (digital notwithstanding). Don’t treat your tone map as something to tweak around the edges when balancing the final output LUT. Don’t just take someone else’s conveniently packaged function. The tone map’s role exists at the beginning of the visual development process, and it should be treated as part of the foundation for how the game will look and feel. Pay attention to the aesthetics and visual quality of the map upfront. In today’s games these qualities are an afterthought, and it shows.

    UPDATE: User “vinistois” on HackerNews shared a screenshot from GTA 5, and I looked up a few others. It’s very nicely done tone mapping. Good use of mid-tones and contrast throughout, with great transitions into both extremes. You won’t quite mistake it for film, I don’t think, but it’s excellent for something that is barely even a current gen product. This is proof that we can do much better from an aesthetic perspective within current technical and stylistic constraints. Heck, this screenshot isn’t even from a PC – it’s the PS4 version.

  • Node Graphs and the Terrain Editor

    By JTippetts

    I've been working on the node graph editor for noise functions in the context of the Urho3D-based Terrain Editor I have been working on. It's a thing that I work on every so often, when I'm not working on Goblinson Crusoe or when I don't have a whole lot of other things going on. Lately, it's been mostly UI stuff plus the node graph stuff. The thing is getting pretty useful, although it is still FAR from polished, and a lot of stuff is still just broken.

    Today, I worked on code to allow me to build and maintain a node graph library. The editor has a tool, as mentioned in the previous entry, that allows me to use a visual node graph system to edit and construct chains/trees/graphs of noise functions. These functions can be pretty complex. I'm working on code to allow me to save these graphs as they are, and also to save them as Library Nodes.

    Saving a graph as a Library Node works slightly differently than just saving the node chain: it allows you to import the entire thing as a single 'black box' node. In the above graph, I have a fairly complex setup with a cellular function distorted by a couple of billow fractals. In the upper left corner are some constant and seed nodes, explicitly declared. Each node has a number of inputs that can receive a connection. If there is no connection, when the graph is traversed to build the function, those inputs are 'hardwired' to the constant value they are set to. But if you wire up an explicit seed or constant node to an input, then when the graph is saved as a Library Node, those explicit constants/seeds will be converted to the input parameters for a custom node representing the function. For example, in the custom node for the above graph, any parameter to which a constant node was attached is now tweakable, while the rest of the graph node is an internal structure that the user cannot edit. By linking the desired inputs with a constant or seed node, they become the customizable inputs of a new node type.

    (A note on the difference between Constant and Seed. They are basically the same thing: a number. Any input can receive either a constant or a seed or any chain of constants, seeds, and functions. However, there are special function types such as Seeder and Fractal which can iterate a function graph and modify the value of any seed functions. This is used, for example, to re-seed the various octaves of a fractal with different seeds to use different noise patterns. Seeder lets you re-use a node or node chain with different seeds for each use. Only nodes that are marked as Seed will be altered.)

    With the node graph library functionality, it will be possible to construct a node graph and save it for later, useful for certain commonly-used patterns that are time-consuming to set up, which pretty much describes any node graph using domain turbulence. With that node chain in hand, it is easy enough to output the function to the heightmap. Then you can quickly apply the erosion filter to it, follow that up with a quick Cliffify filter to set cliffs, and finish it off with a cavity map filter to place sediment in the cavities.

    The editor now lets you zoom the camera all the way in with the scroll wheel; when on the ground, you can use WASD to rove around the map, seeing what it looks like from the ground. Still lots to do on this, such as, you know, actually saving the node graph to file. But already it's pretty fun to play with.
  • Code Reuse In Actual Practice

    By ApochPiQ

    It’s very common to hear engineers talking about "code reuse" - particularly in a positive light. We love to say that we’ll make our designs "reusable". Most of the time the meaning of this is pretty well understood; someday, we want our code to be able to be applied to some different use case and still work without extensive changes. But in practice, code reuse tends to fall flat. A common bit of wisdom is that you shouldn’t even try to make code reusable until you have three different use cases that would benefit from it. This is actually very good advice, and I’ve found it helps a lot to step back from the obsession with reusability for a moment and just let oneself write some "one-off" code that actually works. This hints at the possibility of a few flaws in the engineering mindset that reuse is a noble goal.

    Why Not Reuse?

    Arguing for reuse is easy: if you only have to write and debug the code once, but can benefit from it multiple times, it’s clearly better than writing very similar code five or six times… right? Yes and no. Premature generalization is a very real thing. Sometimes we can’t even see reuse potential until we’ve written similar systems repeatedly, and then it becomes clear that they could be unified. On the flip side, sometimes we design reusable components that are so generic they don’t actually do what we needed them to do in the first place.

    This is a central theme of the story of Design Patterns as a cultural phenomenon. Patterns were originally a descriptive thing. You find a common thread in five or six different systems, and you give it a name. Accumulate enough named things, though, and people start wanting to put the cart before the horse. Patterns became prescriptive - if you want to build a Foo, you use the Bar pattern, duh!

    So clearly there is a balancing act here. Something is wrong with the idea that all code should be reusable, but something is equally wrong with copy/pasting functions and never unifying them.
    But another, more insidious factor is at play here. Most of the time we don’t actually reuse code, even if it was designed to be reusable. And identifying reasons for this lapse is going to be central to making software development scalable into the future. If we keep rewriting the same few thousand systems we’re never going to do anything fun.

    Identifying Why We Don’t Reuse

    Here’s a real world use case. I want to design a system for handling callbacks in a video game engine. But I’ve already got several such systems, built for me by previous development efforts in the company. Most of them are basically the exact same thing with minor tweaks:

      • Define an "event source"
      • Define some mechanism by which objects can tell the event source that they are "interested" in some particular events
      • When the event source says so, go through the container of listeners and give them a callback to tell them that an event happened

    Easy. Except Guild Wars 2 alone has around half a dozen different mechanisms for accomplishing this basic arrangement. Some are client-side, some are server-side, some relay messages between client and server, but ultimately they all do the exact same job. This is a classic example of looking at existing code and deciding it might be good to refactor it into a simpler form. Except GW2 is a multi-million line-of-code behemoth, and I sure as hell don’t want to wade through that much code to replace a fundamental mechanism. So the question becomes: if we’re going to make a better version, who’s gonna use it?

    For now the question is academic, but it’s worth thinking about. We’re certainly not going to stop making games any time soon, so eventually we should have a standardized callback library that everyone agrees on. So far so good. But what if I want to open-source the callback system, and let other people use it? If it’s good enough to serve all of ArenaNet’s myriad uses, surely it’d be handy elsewhere!
    Of course, nobody wants a callback system that’s tied to implementation details of Guild Wars 2, so we need to make the code genuinely reusable. There are plenty of reasons not to use an open-source callback library, especially if you have particular needs that aren’t represented by the library’s design. But the single biggest killer of code reuse is dependencies.

    Some dependencies are obvious. Foo derives from base class Bar, therefore there is a dependency between Foo and Bar, for just one example. But others are more devilish. Say I published my callback library. Somewhere in there, the library has to maintain a container of "things that care about Event X." How do we implement the container? Code reuse is the name of the game here. The obvious answer (outside of game dev) is to use the C++ Standard Library, such as a std::vector or std::map (or both). In games, though, the standard library is often forbidden. I won’t get into the argument here, but let’s just say that sometimes you don’t get to choose what libraries you rely on.

    So I have a couple of options. I can release my library with std dependencies, which immediately means it’s useless to half my audience. They have to rewrite a bunch of junk to make my code interoperate with their code, and suddenly we’re not reusing anything anymore. The other option is to roll my own container, such as a trivial linked list. But that’s even worse, because everyone has a container library, and adding yet another lousy linked list implementation to the world isn’t reuse either.

    Policy-Based Programming to the Rescue

    The notion of policy-based architecture is hardly new, but it is sadly underused in most practical applications. I won’t get into the whole exploration of the idea here, since that’d take a lot of space, and I mostly just want to give readers a taste of what it can do. Here’s the basic idea. Let’s start with a simple container dependency.
        class ThingWhatDoesCoolStuff {
            std::vector<int> Stuff;
        };

    This clearly makes our nifty class dependent on std::vector, which is not great for people who don’t have std::vector in their acceptable tools list. Let’s make this a bit better, shall we?

        template <typename ContainerType>
        class ThingWhatDoesCoolStuff {
            ContainerType Stuff;
        };

        // Clients do this
        ThingWhatDoesCoolStuff<std::vector<int>> Thing;

    Slightly better, but now clients have to spell a really weird name all the time (which admittedly can be solved to a great extent with a typedef and C++11 using declarations). This also breaks when we actually write code:

        template <typename ContainerType>
        class ThingWhatDoesCoolStuff {
        public:
            void AddStuff (int stuff) {
                Stuff.push_back(stuff);
            }

        private:
            ContainerType Stuff;
        };

    This works provided that the container we give it has a method called push_back. What if the method in my library is called Add instead? Now we have a compiler error, and I have to rewrite the nifty class to conform to my container’s API instead of the C++ Standard Library API. So much for reuse.

    You know what they say, you can solve any problem by adding enough layers of indirection! So let’s do that real quick.

        // This goes in the reusable library
        template <typename Policy>
        class ThingWhatDoesCoolStuff {
        private:
            // YES I SWEAR THIS IS REAL SYNTAX
            typedef typename Policy::template ContainerType<int> Container;

            // Give us a member container of the desired type!
            Container Stuff;

        public:
            void AddStuff (int stuff) {
                using Adapter = typename Policy::template ContainerAdapter<int>;
                Adapter::PushBack(&Stuff, stuff);
            }
        };

        // Users of the library just need to write this once:
        struct MyPolicy {
            // This just needs to point to the container we want
            template <typename T>
            using ContainerType = std::vector<T>;

            template <typename T>
            struct ContainerAdapter {
                static inline void PushBack (ContainerType<T> * container, const T & element) {
                    // This would change based on the API we use
                    container->push_back(element);
                }
            };
        };

    Let’s pull this apart and see how it works. First, we introduce a template "policy" which lets us decouple our nifty class from all the things it relies on, such as container classes. Any "reusable" code should be decoupled from its dependencies. (This is by no means the only way to do so, even in C++, but it’s a nice trick to have in your kit.) The hairy parts of this are really just the syntax for it all. Effectively, our nifty class just says "hey, I want to use some container, and an adapter API that I know how to talk to. If you can give me an adapter to your container I’ll happily use it!" Here we use templates to avoid a lot of virtual dispatch overhead. Theoretically I could make a base class like "Container" and inherit from it and blah blah vomit I hate myself for just thinking this. Let’s not explore that notion any further.

    What’s cool is that I can keep the library code 100% identical between projects that do use the C++ Standard Library, and projects which don’t. So I could publish my callback system exactly once, and nobody would have to edit the code to use it. There is a cost here, and it’s worth thinking about: any time someone reuses my code, they have to write a suitable policy. In practice, this means you write a policy about once for every time you change your entire code base to use a different container API. In other words, pffffft.
    For things which aren’t as stable as containers, the policy cost may become more significant. This is why you want to reuse in only carefully considered ways, preferably (as mentioned earlier) when you have several use cases that can benefit from that shared abstraction.

    Concluding Thoughts

    One last idea to consider is how the performance of this technique measures up. In debug builds, it can be a little ugly, but optimized builds strip away literally any substantial overhead of the templates. So runtime performance is fine, but what about build times themselves? Admittedly this does require a lot of templates going around. But the hope is that you’re reusing simple and composable components, not huge swaths of logic. So it’s easy to go wrong here if you don’t carefully consider what to apply this trick to. Used judiciously, however, it’s actually a bit better of a deal than defining a lot of shared abstract interfaces to decouple your APIs.

    I’ll go into the specific considerations of the actual callback system later. For now, I hope the peek at policy-based decoupling has been useful. Remember: three examples or you don’t have a valid generalization!
  • Dystopian Lights: Dev Blog #1

    By kostile

    So to start the day I am going to share with you guys how I use touch controls and touch "joysticks" in Dystopian Lights (previously Neon Android) to control movement and crosshair aim. Initially I had a floating joystick system where you would tap and hold in a certain area but the response to that system has been heavily negative, so I have redone the code running the joysticks and with refactoring I have made them static as well as extremely reusable. This code is tested in monogame for android and should work for iOS as well if you are using the mono and the monogame framework. What I wasn't aware of when I wrote my first joysticks, is that monogame has a cool feature for touchlocation IDs that allows for greater control without having to worry about which touch is which, we already know if we assign it an ID on touch. You can view the code and use however you want if you decide you like it. If you have ever used monogame and you look at the scaling in the draw method, you will notice that I multiply the scale by 0.001f. This is because the Primitives.circle texture has a radius of 500. There may be some other oddities in the code, if you do use it and have any questions, don't hesitate to ask. The code is at the end of the blog. So this is one of the updates for the game, as well as a proper title. I have been developing the game under a working title "Neon Android" for quite some time but I have always known that this isn't going to fly as the actual title of the game. Dystopian Lights rang a good bell for me when thinking about how I want the game portrayed. For now it sits as a very similar, almost clone of, Geometry Wars. This is intended for art and play style to some degree, but with many varying factors I think the two games will stand far apart from one another once Dystopian Lights is a finished product. One of the other upcoming changes for the game is the change in the GUI. It has been heavily improved on mobile platforms. 
    Previously the GUI was tiny and fit for a computer, but now it looks okay on Android and you can actually navigate the menus when upgrading your weapons. I have also changed the weapon selection buttons to be highlighted when selected, so you actually know what's firing without having to watch your projectiles.

    From here, the plan is to release an update on my website by this Friday, which will include the changes I have showcased here as well as a few other minor changes. Right now the game is only available on Android and can be downloaded from my website, https://solumgames.com, so please head there if you want to check it out. I will have more dev blogs in the future and hope to do a minimum of one a week until the end of the year, when I will release it on the Android app market and then see about working toward a PC release.

    The largest problem I have encountered so far while developing this game is the spawning pattern. If anyone has any recommendations for setting up a spawner that gets infinitely more difficult, please feel free to give your input. I have a system right now that works, but its scaling is terrible and it feels like end game within about 5 minutes. There is some obvious tweaking and balancing that I still need to work out as well, so if anyone tries the game and would enjoy something being different, feel free to let me know what I should change.

    Some of the ideas I have for the future of the game include a story mode. My thoughts keep going around to scripted spawns, and then some animations with full screen-width chat boxes that show an avatar of the character speaking. There will be at least a few boss fights and some over-the-top story that would make this game seem like a space opera straight out of the cheesiest movie you have ever seen on the Sci-Fi network. I have also been working on an idea where your player has a capacitor that can only sustain so many weapons for so long.
    If I do add it, the shot weapon will probably always be available. When you use your missile and laser in tandem with something else, then depending on your capacitor recharge rate and maximum capacity you will run out of energy, which will cause your capacitor to short; the only weapon available until it fully recharges will be the shot weapon. Health packs are coming as well. They will be random drops from enemies and will heal different amounts based on the enemy killed.

    So far I am happy to see that I have about 40 downloads of the game from my website alone. I have not received any feedback from users I don't know personally, so if you are downloading it, please speak up with any critique or recommendations. I would love to hear the good, the bad, and the ugly.

    public class Joystick
    {
        private Vector2 outerLocation;
        private Vector2 innerLocation;
        private Vector2 direction;
        private Color innerColor = new Color(0.2f, 0.2f, 0.2f, 0.5f);
        private Color outerColor = new Color(0.5f, 0.5f, 0.5f, 0.5f);
        private float innerRadiusRatio;
        private float outerRadius;
        private int touchId;

        public Joystick(Vector2 location, int radius)
        {
            outerLocation = location;
            innerLocation = location;
            outerRadius = radius;
            innerRadiusRatio = 0.3f;
        }

        public Vector2 Direction
        {
            get { return direction; }
        }

        public void Update(TouchCollection touchCollection)
        {
            foreach (TouchLocation tl in touchCollection)
            {
                // Claim the touch if it started inside the outer circle.
                if (tl.State == TouchLocationState.Pressed)
                {
                    if (AOneMath.Distance(outerLocation.X, tl.Position.X, outerLocation.Y, tl.Position.Y) <= outerRadius)
                    {
                        touchId = tl.Id;
                    }
                }
                // Ignore touches that don't belong to this joystick.
                if (touchId != tl.Id)
                {
                    continue;
                }
                if (tl.State == TouchLocationState.Moved)
                {
                    float dirX = tl.Position.X - outerLocation.X;
                    float dirY = tl.Position.Y - outerLocation.Y;
                    direction.X = dirX;
                    direction.Y = dirY;
                    direction.Normalize();
                    // Clamp the inner circle to half the outer radius.
                    float length = AOneMath.Distance(tl.Position.X, outerLocation.X, tl.Position.Y, outerLocation.Y);
                    if (length > outerRadius - outerRadius * 0.5f)
                    {
                        length = outerRadius - outerRadius * 0.5f;
                    }
                    innerLocation.X = outerLocation.X + direction.X * length;
                    innerLocation.Y = outerLocation.Y + direction.Y * length;
                }
                if (tl.State == TouchLocationState.Released)
                {
                    innerLocation = outerLocation;
                    direction.X = 0;
                    direction.Y = 0;
                }
            }
        }

        public void Draw(SpriteBatch spriteBatch)
        {
            spriteBatch.Draw(Primitives.circle,
                position: Vector2.Subtract(outerLocation, new Vector2(outerRadius / 2, outerRadius / 2)),
                scale: new Vector2(outerRadius * 0.001f, outerRadius * 0.001f),
                color: innerColor);
            spriteBatch.Draw(Primitives.circle,
                position: Vector2.Subtract(innerLocation, new Vector2(outerRadius * innerRadiusRatio / 2, outerRadius * innerRadiusRatio / 2)),
                scale: new Vector2(outerRadius * innerRadiusRatio * 0.001f, outerRadius * innerRadiusRatio * 0.001f),
                color: outerColor);
        }
    }

Our community blogs

  1. My best day-to-day tool will surprise you!

    Hi everyone, I'm super happy to be back and I'm really sorry it took some time. I decided to rework how we post information to now include video, for a more visual interaction. The life of an indie game developer is hard, so any tool you can find to save time, produce faster, improve your workflow and enhance your skills is always welcome. I'm going to share with you some of the tools that make my life easier.

    Make sure to tell us what your favorite tools are in the comments below.

    Not in the mood for reading? Watch the full episode.

     

     

    Tools for tablet owners

    The first plugin you should look into if you own a tablet is Lazy Nezumi Pro. Your tablet already cost you a lot, and I personally really enjoy having one because it's much better for creating sketches and concept art, and it's super precise for texture editing and creation. It works well in Substance to paint your assets and it's essential for any ZBrush work you need. That said, getting the right pressure feeling is hard, and you often end up with jaggy lines or not enough pressure range to achieve the effect you are expecting. I have a tendency to push hard on my pen, and making my Wacom less sensitive helps, but not much. This tool improves your overall results by smoothing out the pressure and the line itself, removing all the unwanted movement that breaks that professional-looking smooth line. It also comes with a ton of useful aids such as parallel lines, ellipses, perspective, fish-eye perspective, isometric, etc. Really awesome tool; Wacom should provide something like this in the first place, but they don't, so grab it :)

    Tools for Photoshop

    You spend a lot of time in Photoshop, so why not improve your toolset there? The really simple Brush Box plugin lets you better visualize and organize your brushes by assigning folders, colors, favorites, etc. Simple drag-and-drop operations and better management make it quicker to find the brush you need.

    I also really enjoy Coolorus, which replaces Photoshop's color swatch with a much better and more natural color circle that Corel Draw users are used to. It helps you save and manage custom swatches, it lets you work your colors based on lighting and shade, and you can limit your gamut to a smaller range so you can work within a certain color palette. It's a very simple tool that will help you every day.

    All those tools would be worthless without a good set of brushes to make your art. There is one fabulous artist, Roman Melentyev, who sells an impressive brush collection on Creative Market. Have a look and find the ones that are right for you: https://creativemarket.com/RainbowWings I picked up the Concept Artist III bundle, which comes with many of his brushes, and I'm really pleased with them. They work well. I especially enjoy the Markers, which I like to draw with. Those are my go-to for sketches.

    Tools for coders

    This tool is probably the most interesting one. If you are a coder, you need it. It's been my favorite recent discovery because it literally saved me a week on a single asset. I buy assets on the Unity Asset Store that I find interesting; many are of very high quality, created by expert developers, sometimes from renowned studios. There is no reason not to use them. That said, once you include them in your game, most of the time they require tweaking and modification to perform the way you want. I see most assets as a starting point that you need to improve. Best practice when changing the behavior of an asset, or linking it to your needs, is to stay out of the original scripts and push/pull data from outside. Eventually, though, you end up having no choice but to modify one directly, either because it's just way easier and faster or because there is no other way. Once you start doing this, you create a barrier between your work and the ability to update the asset, since an update would overwrite your code. Manually replacing and rebuilding an asset on each and every update is time consuming and annoying.

    With the tool called Code Compare, all the comparison work is done for you and you can update your scripts in a couple of clicks. We have been improving one asset for a few months, working with the original developer to get more features into it, and after a long period of skipping updates we had no choice but to update. We estimated it would take a week, since the asset contains around 30 scripts, with thousands and thousands of lines, and we had made important modifications in at least half of them. The most recent update was mostly a full rewrite. With this tool, we were able to accomplish the update in less than a day, and everything worked flawlessly. Check out the video or the software's website for more information.

    The big tools that cost money

    If you have a little budget, take a look at the Indie license from Allegorithmic. Substance is THE thing right now and their tools are amazing; they're well known and worth it. If you don't have enough money, you can at least make use of CrazyBump, an old application that creates normal maps from your textures. Very sweet, but honestly Bitmap2Material from Allegorithmic does a much better job. If you are looking for a LOD tool, I especially like Polygon Cruncher, but it's pricey; the Unity Asset Store offers cheaper solutions that do all the naming and compression for you. ZBrush doesn't need any introduction if you have a fat wallet. For 3D modeling we use Lightwave3D. This is simply the BEST 3D software you can get: the quality of the render may not be the best, but it comes really complete with all the tools you need for animation and modeling without buying anything else, and it costs less than any Autodesk solution.

    Made In Canada - a short word of introduction in French!

     


  2. Another week of development has come and gone, with a bit less overt productivity but a much more game-like product. So far the game is coming together decently and has a "has potential" feel, in my opinion. I've not spent much time on getting the pacing right and, without music, the game is lacking in atmosphere. However, shooting feels good, and I feel the scoring mechanic gives the player just enough impetus to be aggressive and keep the pace up.

    What we did last week

    Day 1 was finishing up what was started at the end of last week. When the player is hit, the screen shakes and fades out. If the player is out of lives the game ends; if not, the player respawns after a brief delay. Also set up a proper model for the player's gun.

    large.gameover.gif.14013c43a91b1e5c635bb

    Day 2 was lost to personal commitments.

    Day 3 created models for different shots and a model for the player character.

    Day 4 added scoring mechanics and got the status bar working completely.

    chain.gif.518757e9826cb6f6d9e122a3bf536785.gif

    Day 5 rigged the player character and got some really minimal inverse kinematics going. Looks janky but it's a start.  

    ik.gif.976970c24280763d3272cde4ca227790.gif

    Day 6 not visibly productive. Animated the player character then ran into tons of issues with importing the animations causing, among other issues, the player model to embed itself in the floor.

    Day 7 resolved the import issues by moving the player model in Blender so its median is at the origin. This stopped the model from jumping around so I could place it accurately. I suspect it's due to parenting issues, which are discussed in the final paragraphs of this article.

  3. If you haven’t peeked into the Corona Marketplace recently, it now offers dozens of plugins and assets, from art packs to audio tracks to useful utility plugins. Periodically, we will highlight a few exciting products which can help you develop your dream app using Corona.

    Halloween Skeleton Game 2D Character Sprite

    Halloween may seem like it's a long time off, but now is a good time to start planning your fall games out. This collection of animated sprites for a pumpkin hero can get you started. Check it out!

    Synthwave Vol.2

    Synthwave Vol.2 is a collection of sci-fi sounds from Nocturnal Animals, including music and sound effects that can be used in a variety of games. The pack contains 4 music tracks and 20 sound effects.

    Progress Ring

    The Progress Ring plugin from Jason Schroeder allows you to add customizable circular progress rings to your projects. They can be used for anything from health bars to timers to business apps. They can be added to your project in as little as one line of code.

     


    View the full article

  4. Release of the kit "Close Combat: Fighter". This add-on is devoted to hand-to-hand combat. A small story in the style of Max Payne and John Wick; there are subtitles.

  5. I'm a man on a Mobile Gaming Quest (MGQ) to play a new mobile game every day, documenting my first impressions here and on YouTube. Below is the latest episode. Here's a quick overview of all games I've covered so far.

    Don't let the graphics fool ya: Wizard's Wheel: ReRolled is an indie idle RPG with more depth than most mobile games can hope to achieve.

    There are buildings to buy, loot to find, weapons to upgrade, dragons to slay, heroes to hire, and of course time to warp (which makes you restart from zero with some advantages), as you experience the game and world slowly expanding. The game monetizes through a few IAPs going up to $10, and through incentivized ads, which aren't pushed heavily either. 

    Although I was a bit intimidated by the sheer complexity of the game at first, I've come to love it more and more the longer I play. It's close to everything I'd expect from a great indie game! 

    My thoughts on Wizard's Wheel:


    Google Play: https://play.google.com/store/apps/details?id=com.WindingClock.WizardsWheel&hl=en
    iOS: https://itunes.apple.com/us/app/wizards-wheel/id1273827438?mt=8

    Subscribe on YouTube for more commentaries: https://goo.gl/xKhGjh
    Or join me on Facebook: https://www.facebook.com/mobilegamefan/
    Or Instagram: https://www.instagram.com/nimblethoryt/
    Or Twitter: https://twitter.com/nimblethor

  6. Greetings! 

    The third devlog is a video demonstration of what I have been working on in the past several weeks. So, here is what the gameplay looks like in the current state of development:


    There is still a lot of work to do and the release date is not yet defined, but the Steam page for the game has already been set up:

    http://store.steampowered.com/app/803850/Pixelpunk_XL/

  7. Latest Entry

    I thought of limiting the communication between players and the host by changing how often they receive updates based on distance. If the screen is 2000 pixels wide and a character takes up the whole screen at 1 meter, then at a distance of d meters the character only covers about 2000/d pixels, so far enough away a full meter of movement produces barely a pixel of change on screen. If the maximum speed is 1 m per 2 s, you don't need to send that player updates any more often than every 2 s. Also, for bullets there would be a last agreed-upon position sent between each pair of players, so they know whose bullet cone affects them and who they have to send trigger updates to. The on-screen width shrinks linearly with distance, but the area of the screen that we need to worry about updating shrinks with the square of the distance.
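    As a sanity check on that reasoning, here is a small sketch of a distance-based update interval. The 2000 px screen, the character filling the screen at 1 m, and the 0.5 m/s speed (1 m per 2 s) come from the numbers above; the function name is my own.

```python
def min_update_interval(distance_m, screen_px=2000, max_speed_mps=0.5):
    # The character fills the whole 2000 px screen at 1 m, so at distance d
    # it spans roughly screen_px / d pixels; one pixel therefore corresponds
    # to d / screen_px meters of movement.
    meters_per_pixel = distance_m / screen_px
    # Movement below one pixel per update is invisible, so the interval
    # between updates can grow linearly with distance.
    return meters_per_pixel / max_speed_mps
```

    With these numbers, a player 2000 m away only needs an update every 2 seconds, matching the estimate above.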

    • The player can now get dropped into the game and go from beginning to end.
    • Player HUD completed.
    • New player tutorial completed.

    Image 442.png

  8. Mobile apps are expensive to build and, given the competition they face from the millions of others already on the market, a risky proposition. So, before you rush out to hire application developers to give shape to your fantastic idea, it is necessary to have already figured out what comes after.

    Additionally, despite having considerable technological developments over the years, developing a sophisticated mobile app is still a slow process that requires even the top mobile app development companies months, if not a year, to deploy.

    Now, to overcome this overhead of cost and time of deployment, starting your venture with an MVP (Minimum Viable Product) may just be the best route to take. Not only is its development relatively quick and inexpensive for developing a full-scale app, but it also helps businesses to test their idea and gain the first-mover advantage before they commit considerable resources and efforts in their app venture.

    Now that it is clear why businesses should start their app idea with an MVP, here is how to actually do it:

    Figure out the core objective- Yes, your app performs a function and has some unique features that you can market later, but unless you figure out exactly what problem your app solves, it is nearly impossible to draw a roadmap from idea conception to success. Not only will it bring clarity to the project, but it will also indicate who your core audience is going to be from the start.

    Look for competitors- Having competitors isn't as bad as it sounds. Without them, the only way you can learn is from your own mistakes. So, when you start to implement your idea, look for the businesses already providing similar services and study what users like and don't like about their products. Eventually, this will help you refine your own service without actually risking your prospects.

    Layout and user flow- The only correct way of defining the layout and workflow of your app is the way your users want it to be. They should, at no point, feel that the app is forcing them to take multiple steps to achieve the simplest of tasks. A general rule of thumb is to minimize the number of clicks a user needs to reach certain sections or avail certain services. Simply reminding yourself that you are not the intended user, and that what seems right to you might not be the same for your potential users, will go a long way toward achieving the optimum outcome.

    List and prioritize your features- Now, this is where it gets tricky. Though it is possible for an app to incorporate any number of features, it is simply not feasible to incorporate them all. Not only would it cost an inordinate amount of time and money, but the usability of such apps declines sharply. So, even though you might have many ideas about what your app should do, you need to rationally evaluate all such features and create a priority list. The most feasible way to achieve this is to measure the value each feature brings to your app against its overhead.

    Agile development- Once you have everything figured out, it’s time for the actual development to kick-off. The best way is to take the agile route where all the concerned parties are constantly kept in the loop to promote transparency, eliminating any chances of miscommunication.

    Learn as you lead- The whole purpose of deploying an MVP is to give businesses the liberty of making quick mistakes; improving upon them, they can create a more reliable and feasible product in the future. So, in a sense, it is only after the MVP is completely developed that the real development starts.
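    The value-versus-overhead comparison in the prioritization step can be made concrete with a simple score, here value divided by estimated cost. This is only a sketch; the feature names and numbers are purely hypothetical.

```python
def prioritize(features):
    # Rank candidate features by value per unit of cost; both are rough estimates.
    return sorted(features, key=lambda f: f["value"] / f["cost"], reverse=True)

backlog = [
    {"name": "push notifications", "value": 3, "cost": 5},
    {"name": "core checkout flow", "value": 9, "cost": 3},
    {"name": "social sharing", "value": 2, "cost": 4},
]
ranked = prioritize(backlog)
# "core checkout flow" ranks first, so it belongs in the MVP;
# the rest can wait for later iterations.
```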

    Do you have an idea that you are itching to turn into a marketable product, but aren't sure of the response it will get? While you might be inclined to either take or drop that risk, an app development company would suggest you adopt a pure business approach: the calculated risk. Project your idea in the shape of an MVP, which can be built in a small percentage of the cost and time that the full-fledged app requires, while still allowing yourself enough room to make rational changes in the future.

  9. hero.jpg.fba97f9a5478c7dbc95cb46e539b0ea8.jpg

    For a long time I had been delaying finding a solution to feet and other extremities interpenetrating the terrain in my game. Finally I asked for suggestions here, and came to the conclusion that Inverse Kinematics (IK) was probably the best solution.

    https://www.gamedev.net/forums/topic/694967-animating-characters-on-sloping-ground/

    There seem to be quite a few 'ready built' solutions for Unity and Unreal, but I'm doing this from scratch so had to figure it out myself. I will detail here the first foray into getting IK working, some more steps are remaining to make it into a working solution.

    Inverse Kinematics - how is it done?

    The two main techniques for IK seem to be an iterative approach such as CCD or FABRIK, or an analytical solution where you directly calculate the solution. After some research CCD and FABRIK looked pretty simple, and I will probably implement one of these later. However for a simple 2 bone chain such as a leg, I decided that the analytical solution would probably do the job, and possibly be more efficient to calculate.

    The idea is that based on some school maths, we can calculate the change in angle of the knee joint in order for the foot to reach a required destination.

    The formula I used was based on the 'law of cosines':
    https://en.wikipedia.org/wiki/Law_of_cosines

    I will not detail it here, but it is easy enough to look up.
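    For those who want the gist without looking it up, here is a minimal sketch of the two-bone calculation (in Python for brevity; the function name and the clamping are my own, not from the game's code):

```python
import math

def knee_angle(upper_len, lower_len, target_dist):
    """Interior knee angle (radians) placing the foot at target_dist from the
    hip, via the law of cosines: d^2 = a^2 + b^2 - 2ab*cos(theta)."""
    # Clamp the target to what the leg can physically reach.
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    cos_theta = (upper_len ** 2 + lower_len ** 2 - d ** 2) / (2 * upper_len * lower_len)
    return math.acos(max(-1.0, min(1.0, cos_theta)))

# A fully stretched leg gives an angle of pi (straight); as the ground rises
# and the target distance shrinks, the knee bends toward smaller angles.
```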

    For the foot itself I used a different system: I calculated the normal of the ground under the foot during collision detection, then matched the orientation of the foot to the ground.
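    Matching the foot to the ground boils down to finding the rotation that carries the world up axis onto the ground normal. A sketch in axis-angle form (the helper name is mine, and the normal is assumed to be unit length):

```python
import math

def up_to_normal(normal):
    """Axis-angle rotation taking world up (0,1,0) onto the unit ground normal."""
    up = (0.0, 1.0, 0.0)
    axis = (up[1] * normal[2] - up[2] * normal[1],   # cross(up, normal)
            up[2] * normal[0] - up[0] * normal[2],
            up[0] * normal[1] - up[1] * normal[0])
    dot = max(-1.0, min(1.0, up[0] * normal[0] + up[1] * normal[1] + up[2] * normal[2]))
    return axis, math.acos(dot)

# Flat ground gives angle 0, so the animated foot pose is untouched; on a
# slope the foot is tilted about an axis lying in the ground plane.
```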

    leg.jpg.e4f905195e630f5d7c456486b4d79669.jpg

    My test case was to turn off the animation and just have animals in an idle pose, and get the IK system to try to match the feet to the ground as I move them around. The end effect is like ice skating over the terrain. First I attempted to get it working with the main hero character.

    Implementing

    The biggest hurdle was not understanding IK itself, but in implementing it within an existing skeletal animation system. At first I considered changing the positions of the bones in local space (relative to the joint), but then realised it would be better to calculate the IK in world space (actually model space in my case), then somehow interpolate between the local space animation rotations and the world space IK solution.

    I was quite successful in getting it working until I came to blending between the animation solution and the IK solution. The problems I was having seemed to be stemming from my animation system concatenating transforms using matrices, rather than quaternions and translates. As a result, I was ending up trying to decompose a matrix to a quaternion in order to perform blends to and from IK.

    This seemed a bit ridiculous, and I had always been meaning to see whether I could totally work the animation system using quaternion / translate pairs rather than matrices, and it would clearly make things much easier for IK. So I went about converting the animation system. I wasn't even absolutely sure it would work, but after some fiddling, yay! It was working.

    I now do all the animation blending / concatenation / IK as quaternions & translates, then only as a final stage convert the quaternion/translate pairs to matrices, for faster skinning.

    This made it far easier in particular to rotate the foot to match the terrain.
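    Concatenating quaternion/translate pairs instead of matrices looks roughly like the following sketch (Hamilton product, (w, x, y, z) order; all names are mine, not from the engine):

```python
import math

def quat_mul(q1, q2):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q: q * (0, v) * q^-1."""
    w, x, y, z = q
    qv = quat_mul(q, (0.0,) + tuple(v))
    return quat_mul(qv, (w, -x, -y, -z))[1:]

def concat(parent, child):
    """Compose two (quaternion, translate) transforms: apply child, then parent."""
    (pq, pt), (cq, ct) = parent, child
    rotated = quat_rotate(pq, ct)
    return quat_mul(pq, cq), tuple(p + r for p, r in zip(pt, rotated))

# Example: a 90-degree rotation about Z carries +X onto +Y.
qz90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
```

    Only at the very end would each (quaternion, translate) pair be converted to a matrix for skinning, as described above.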

    monkey.jpg.fba6503f671b3cbdcc597e3c5a8fe5bd.jpg

    Another snag I found was that Blender seemed to be exporting some bones with an 'extra' rotation, i.e. if you use an identity local rotation the skin doesn't always point along the bone axis. I did some tests with an ultra simple 3 bone rig, trying to figure out what was causing this (perhaps I had set up my rig wrong?) but no joy. It is kind of hard to explain and I'm sure there is a good reason for it. But I had to compensate for it in my foot rotation code.

    Making it generic

    To run the IK on legs, I set up each animal with a number of legs, and for each the foot bone ID, the number of bones in the chain, etc. Thus I could reuse the same IK routines for different animals, just changing these IK chain lists. I also had to flip the polarity of the IK angles in some animals... maybe because some legs work 'back to front' (look at the anatomy of e.g. a horse's rear leg).

    The IK now appears to be working on most of the animals I have tested. This basic solution simply bends the knees when the ground level is higher than the foot placed by the animation. This works passably with 2 legged creatures but it is clear that with 4 legged creatures such as elephant I will also have to rotate the back / pelvis to match the terrain gradient, and perhaps adjust the leg angles correspondingly to line up with gravity.

    At the moment the elephant looks like it is sliding in snow down hills. :)

    elephant.jpg.bcd92ddb3f9fc1b4ca79e45e80ee854a.jpg

    Blending

    Blending the IK solution with the animation is kind of tricky to get looking perfect. It is clear that when the foot from the animation is at ground level or below, the IK solution should be blended in fully. At a small height above the ground I gradually blend back from the IK into the animation. This 'kind of' works, but doesn't look as good as the original animation; I'm sure I will tweak it.
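    The height-based blend described above can be as simple as a linear ramp over a small band above the ground (the 0.1 m band width and the function name are placeholders of mine, not values from the game):

```python
def ik_blend_weight(foot_height_above_ground, blend_band=0.1):
    """1.0 means full IK at or below ground level, fading linearly to 0.0
    (pure animation) once the animated foot is blend_band meters up."""
    if foot_height_above_ground <= 0.0:
        return 1.0
    return max(0.0, 1.0 - foot_height_above_ground / blend_band)
```

    The resulting weight would then drive the quaternion blend between the animated pose and the IK pose for each leg.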

    Another issue is that when one leg is on an 'overhang', you can end up with a situation where the fully outstretched leg cannot reach the ground. I have seen that others offset the skeleton downwards in these cases, which I will experiment with. Of course this means that the other leg may have a knee bent further than physically possible. So there are limits to what can be achieved without rotating the animals pelvis / back.
     
     Anyway, this is just a description of the trials I had, hopefully helpful to those who haven't done IK, and maybe it will generate some tips from those of you who have already solved these problems. :)

  10. This coming week, my game design club will (finally) start working on Digital Games.

    Last week we made paper concepts.  Most of us have ZERO game engine experience, so this is going to be thrilling!!!
    I've decided to bring everyone into a 2D engine called Defold, which outputs cross-platform (mostly HTML5) games with Lua scripting and joint animations.

    That's great Timm, but who's going to answer their questions?
    They are, of course!  I have never used Defold, but in the Game Dev industry, they will

    • routinely have to self-teach to keep up
    • rely on teammates to solve problems that nobody really knows the answer to
    • rarely if ever start a game from square zero; they'll always build on others' work.

    To that end, rather than making a game from zero (/*programmers NEVER start at square one*/), we are going to mod a public platformer template.  

    Hopefully, we can divide into some kind of logical teams based on specialty and ability.  Good groups are small enough to enable everyone's input, but big enough to explode productivity.

    My Experience:

    Modding is better than square zero for learning game development:

    • THOUGHT PROCESS:  Since every large company has their own proprietary engine, learning how to learn an unfamiliar engine is invaluable
    • WORKFLOW: Game Companies will teach you by letting you dive into existing code, which is exactly what modders do
    • SPECIALIZATION: You can focus on your specialty (programming, art, music, level design) instead of trying to juggle ALL OF THEM so that you can get a job in ONE OF THEM.
    • SCALE: You get experience in a HUGE PROJECT that you may never fully understand rather than a tiny demo 
    • RESULTS: You can make something awesome (though not quite as accessible) in a shorter time since most of the heavy lifting is done
    • PLAYERS: You already have a huge player base and a known target audience if you mod a popular game.  This looks great on a resume.
    • FEEDBACK: If you do have lots of players, you have lots of complaints.  Learn to deal with it, noobs.

    Today, I got to see an eight-year-old open his VERY FIRST Raspberry Pi.  I taught him to install NOOBS and use it, and he's really excited to change the world (for one, he won't be bored at home anymore).
    I showed him the built-in python games and how to edit their code (to make yourself faster, bigger, etc.).   
    Even though I can code faster than I can make bad jokes, I would never have been able to make a game with him... but just editing a couple lines of code in an existing game brought about some super-fun results. 

    So basically, I showed him how to mod as a gateway* into programming :)

     

    *Not a Gateway 2000, he's too young for those

    27908399_2133855000230658_7971844948527972012_o.jpg

  11. Steam achievements for Mine Seeker are now complete. I will for sure be including these in all games going forward, along with cloud game saves and other services Steam offers, now that I am more aware of all they have to offer. Integrating with Steam was a particularly rewarding experience. I currently have 24 achievements players can earn. I had 30, but some were either not good or didn't fit the game well, so in the end I removed a few. Still a good number to keep people busy. So this screenshot says 9 of 30 achievements, but it's 24 now.

    SteamInGameMenu.thumb.png.08118f72dbeea9a1d9b4a529c049cd34.png

     

    But now that the hard work is done, I will be putting together some marketing materials. I'm not very good at this part, but I do my best. I've also learned a few new things, so I'm looking forward to applying them and seeing if it has an effect on my sales. I'll be making a video, a bunch of pics and descriptions, etc. Once I have this together I'll be uploading the game to Steam so I can more easily have people test the game. I didn't know this until a couple days ago, but it's a bit of a pain to get the game running without Steam's assistance installing dependencies. So once I get the game up on Steam, hopefully in a week or so, I'll be reaching out to testers, bloggers, YouTubers, etc. to see if anyone has an interest in testing, reviewing, or talking about my game. 

    I also came across a service I'd never heard of before, Keymailer. They help put game creators in touch with streamers, so I signed up to check it out and see what it involves. If anyone has any experience with them, good or bad, I would love to hear about it. Also, I will be passing out Steam keys for the first time, so if anyone has any tips or suggestions on that it would be greatly appreciated. 

  12. Hi everybody

    After having quite some success Sound Effects Album sales @ www.ogsoundfx.com we are now also selling single sound effects, starting at $0,99.

    This is perfect for very low budgets, and projects that only require a few specific sound effects. We are in the process of uploading hundreds (and probably soon thousands) of sound effects. You can already start browsing through the first batch uploaded so far: Monster Sounds !!! It's over here !

    And if you have absolutely no budget, you can still subscribe to the OGsoundFX newsletter and get 120MB of free sounds ! And more free sounds @  www.ogsoundfx.com !!!

    Don't forget to check out my YouTube channel, where you can learn how to make your own sound effects like this one:

     

     
  13. Sorry for the delayed update this week.  Family's been sick and I'm not feeling that great myself.

    ENGINE UPDATE PROGRESS REPORT


    Currently the engine appears to have a PP version brewing for mid-February, which should resolve many of the issues we've seen, with (hopefully) a quick follow-up to a regular release. That'll put us on track for Lee's usual 6-month release cycle for major changes.

    Beyond that, it appears there's ongoing work on giving previously released download content and add-ons for Game-Guru the 'PBR treatment'. This now includes the Mega Pack 1 DLC, as mentioned here.

    NEW PRODUCTS IN THE STORE

    store-0212-1.PNG

    Mad lobster keeps adding more and more to his laboratory kit. I'm astonished he'd continue to give current owners extra value on what was ALREADY a good value. The price of the kit HAS gone up to reflect that, but if you're a current owner, make sure you download the latest goodies for the laboratory kit.

    store-0212-2.PNG

    It looks like the venerable (and somewhat wacky) Colosso has considerably improved his modeling proficiency, and his newer objects continue to add quality and value to his portfolio. His newest pack and objects are proof of that.
    store-0212-3.PNG

    Teabone has added some fine clutter objects, including the best looking food object I've seen in a long time.  Not sure how he does it, but definitely worth a purchase for that low of a price.

    Also, as a side note, purchasers of my Advanced Time of Day and Weather kit will have an update available to them.  It's a fairly major bugfix and update.  You can see the details below in the 'in my own works' section.

    FREE STUFF


    The big news here is that Lafette has given away a very major piece of work for free out of what seems to be boredom with the project.  It's an extremely high quality science fiction kit that can be found here: https://forum.game-guru.com/thread/219315

    THIRD PARTY TOOLS


    Heightmap import (HIMP) tool is on hold due to BOTR having a newborn son.  Congrats!
    Entity welder is also on hold for the same reason.


    RANDOM ACTS OF CREATIVITY


    Looks like Dimoxiland is doing some more work on his project "Space Losers" for GG.  These updates are always exciting because he's pretty much the cream of the crop for Game-Guru developers.  Check out this screenshot!

    wipmeu.jpg


    That's what I'm talking about. All custom code, graphics, etc. It's impressive beyond any reason and if he finishes it will probably put Game-Guru on the map.


    IN MY OWN WORKS


    As mentioned my Advanced Time of Day and Weather kit has had some fairly major updates.

     02/12/2018    
    • Added specular effect for snow to give it whitish appearance for terrain objects on highest values.  I basically changed specularity to 50x for white.  This can be commented out if it's not desirable but overall I think it works well enough considering the engine itself.
    • Repaired a broken build caused by files from my PP dev build cross-pollinating the live non-PP (DX9) build.
    • Updated test map file
    • Fixed/repaired time function(s), they sync to states now and also rollover seamlessly day to day 
    • Fixed broken cycleloopcounter pointers which were still working in old code (singleuse/singlestate/etc). 
    • Fixed bag_w_indoors script, huzzah!
    • Added 'pp' versions of the weather effects.  If they don't work, let me know the error you get. 
    So to sum up: a non-PP build got mixed with a PP build that was accidentally uploaded. I've now made separate FPE files for the PP weather effect decals, which can be used in place. The script remains the same for both versions, as it functions on both. I'm also including both .fx files for the effectbank folder, which should resolve any conflict between the two versions. The biggest fix is the resolution of the w_indoors script, which now functions seamlessly.

    Camerakit is almost ready to be bundled up and posted to the store, expect that very shortly!

    View the full article


    There are lots of benefits of outsourcing 3D animation:

    Saving costs 

    Outsourcing 3D animation can cost 50-60% less than in-house development.

    Profitability 

    Lower operational costs boost profit margins. No overhead costs and no need for capital-intensive investment help you utilise financial resources elsewhere.

    Quality 

    The outsourcing partners have expert-level skills in this field and all the necessary tools and resources.

    No overhead costs 

    You do not need to pay for training and conferences for your animators; we take care of everything.

    Our company provides both keyframe and motion-capture-based animation, or a combination of the two. Wide experience and deep knowledge ensure the highest quality, meeting all of our clients' requirements. Humans, cartoon characters, creatures, engineering, medicine, machines, weapons, etc. – we provide various types of 3D animation and are not limited to particular areas. Our company works in different 3D software packages such as Maya, 3ds Max, and MotionBuilder, and we are flexible in adjusting our workflow to client requirements.

    Please find more information about us at http://tavo-art.com/


    Hello.

    A somewhat special post, as it is not about my game project (https://www.gamedev.net/projects/4-shoot/), hence the blog to which I am posting this. Still, it will look familiar to anyone who has seen a few screenshots...

    The central parts are in the prototyping state. I just finished building the side blocks above the wings. The completed sections are made of nearly 760 parts. The process I am following is:

    - prototype a section with the parts my children have (my own models are on shelves, not glued, so I am not using them),

    - re-create the section on the computer at http://www.mecabricks.com, sometimes with better parts than the ones I used for the prototype,

    - export the parts list to https://www.bricklink.com/v2/main.page to buy them, so I can give the parts back to my children,

    - enjoy the assembly.

    I do not know how long it will take to finish this thing (delivery charges increase the cost a lot, so I have to delay the purchases a bit), but I hope it will be done before the end of the year...

    ship_00.jpg

    ship_01.jpg

    ship_02.jpg

    ship_03.jpg

  14. What's the story behind the game Charly Men's BIZARRE?

     

    The life of Charly Clearwater, a newlywed, young, and successful businessman, changed dramatically after he was shot in the head by an unknown assailant.

    Happy to have survived the attack, he henceforth experiences various real and surreal visions and anxiety attacks that begin to ruin his career and his life in every respect.

    Since his recovery, Charly Clearwater has been confronted every day by the bizarre, shocking fantasies and dreams of complete strangers around him. At first, not knowing whether these visions are real or just imagination, he simply tries to ignore them all, because all he wants is to stay focused on the career that means so much to him. But when the visions become more realistic and shocking, and Clearwater begins to fear he is descending into madness, his brother John persuades him not to flee but to make these bizarre dreams come true!

    That’s when he starts to fulfill the first of 13 bizarre wishes of unknown people, a process that turns him into a henchman without him noticing.

    Charly Clearwater feels temporary relief from his attacks and visions when making these people’s absurd dreams come true, and for that reason, and because he starts sympathizing with the bizarreness in a sexual and emotional way, he doesn’t refuse when his brother John encourages him to fulfill a further 11 dreams.

    But shortly before the fulfillment of the 13th dream, he receives a wish from his wife Amanda that puts him on the trail of his own attempted murder and lets him become human again.

    If he makes her greatest wish come true, he’ll save his marriage and his life, but he’ll also free a dark power that will lead us all into anarchy!

     

     *We are a German gamedev team, so please excuse any mistakes in our English.

  15. A week ago I decided to scrap the RPG idea for the game, as it would require way too much time to complete. Instead, I will be making a zombie shooter game for Android.

    During this week I added smooth camera movement, camera shakes, bigger and faster bullets and blood particles to make the shooting more enjoyable.

    You can see the current state of the game here:

     

  16. Have a good Monday, everyone! Last week was supposed to be devoted to creating the dialogue for the first cutscene, but a great misfortune befell my production hard drive: it ended its life with a loud crashing noise. So I had to buy another one and reinstall all my software in order to be able to work on the game again. Fortunately, thanks to Bitbucket, I only lost about an hour of work. If you don't yet use external source control, I really recommend it. It can save hours and hours of work in case of hardware problems. Before my hard drive crashed, I had begun drawing a little civilian who will walk around the base during the cutscene. I still have some work to do to improve it. Here is an animation of it:

    CivilianA.gif

    During the week, I discovered an extraordinary website for planning video game projects. It's called HacknPlan. On this website, you can write your complete documentation. In addition, you can plan your work with the Agile method. Most importantly, there is a free version that can still do a lot for individual developers. I recommend you take a look at it: http://hacknplan.com/

    In the next update, there should be more to show, now that I've replaced my hard drive. If you have questions or comments, you can post them here.

  17. Player Panel
    DOMEN KONESKI

    The player panel is now fully implemented and ready to be taken further. It serves as an indicator of how well you are doing. It’s also where you manage your gear – head items, torso items, and so-called ‘plugins’, which increase your stats. You will be able to craft and find these plugins throughout the world.

    2vl5hn4.jpg
     

    Navigation
    DOMEN KONESKI

    Since the world won’t be that small, we will give players some sense of direction by adding a simple compass to the top of the screen. We had a map in a previous iteration, but during testing we saw that no one was using it, probably because there weren’t any interesting things on the map, so this may change throughout development. The main idea remains the same: players should know where they are (we are about to add memorable points of interest and floating islands). Biomes are also the easiest way to tell where you are (the snow biome is usually in the north, the desert biome in the south).
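    For readers curious what the math behind a compass strip like this looks like: the usual approach is to turn the camera's flat forward vector into a bearing. Here is a minimal sketch in Python (the real implementation would live in Unity C#; the function names and the eight-label quantization are illustrative assumptions, not Floatlands' code):

```python
import math

def heading_degrees(forward_x: float, forward_z: float) -> float:
    """Map a flat forward vector to a compass heading in degrees,
    where 0 = north (+z) and 90 = east (+x)."""
    angle = math.degrees(math.atan2(forward_x, forward_z))
    return angle % 360.0

def heading_label(degrees: float) -> str:
    """Quantize a heading into the eight labels a HUD compass strip shows."""
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return labels[int((degrees + 22.5) // 45) % 8]
```

    The only subtlety is passing x before z to atan2, which rotates the zero point from east to north.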

    compass_lowpoly_floatlands.jpg?fit=394%2

    GPU Rendered Grass Test
    DOMEN KONESKI

    We had plenty of concerns about the recent grass renderer I made. It uses Unity’s DrawMeshInstanced() function over multiple meshes that are spawned throughout the world. The problem with this system isn’t the rendering itself, but batching the grass meshes near you. We used the fastest Octree implementation we could find to get the nearest grass meshes. These were then handed to the renderer, which caused a lot of GC allocation and consumed a lot of CPU time if we wanted dense and diverse grass. It also required a lot of draw calls when we wanted to draw different meshes with different textures at the same time (for example, when you stand on the edge of two or three biomes).

    gpu_geometry_lowpoly_grass_floatlands.pn
    Rendering grass on the GPU by using geometry shaders

    The last and only solution I could think of was doing everything on the GPU using geometry shaders. There are next to no examples of how these things work in Unity, so I started experimenting. The shader successfully draws triangles on a list of vertices I provide, along with the color and normal of the surface beneath. The shader has tweakable settings: grass width and height, shadow intensity, wind strength, and more. For now this is only a test; we will probably take it further from here, but we have to figure out how the grass will fit our art style.
     

    Shared world models for items
    DOMEN KONESKI

    The item framework system has a new addition which enables us to (re)use world item models for different items. This means we can create more items with less modeling. Example: two torso sets that look the same but have different stat bonuses. More about it in future blog posts.
     

    Timelapse #6: Companion modeling and rigging
    ANDREJ KREBS

    The companion will work as a tool and is therefore modeled with a relatively high polygon count, since it will appear close to the player’s view. The model was rigged so I could animate all the moving parts, and then animated together with the first-person rig for the various actions that will be performed.
     


     

    Outfits and skins
    ANDREJ KREBS

    The outfits and armors we are adding to the player character are separate models that fit over the player character and are connected to the same skeleton. I model and weight-paint them with the player model in Blender, so I can test them with different animations and see how they work and look on the player, but I then save each in its own file for import into Unity. This way we can keep adding more outfits.

    outfits_lowpoly_floatlands.jpg?resize=72
    Outfits

    Skins are basically new models that represent the same item. Sometimes I copy an existing armor and modify it, and other times I make a new armor from scratch, depending on what I’m trying to make. The player and outfit models are then brought back together in Unity and remapped to the same skeleton, so they work with the animations in the engine.
     

    Floating islands biome concept
    MITO HORVAT

    The floating islands above the ground are going to need an overhaul as well. We’re redefining their shapes and sizes, plus they will have their own mini biome. The old island models look too boring and simple to recycle, so we had to come up with something fresh. I drew up concepts of various shapes and sizes, which were approved, and based on those I painted this quick concept art.

    floating_islands_biome_floatlands.jpg?re
    concept of floating islands biome


    New icons for UI/Inventory screen
    MITO HORVAT

    I’d been given another side task by Domen: painting new icons for the game’s UI/inventory screen. It’s going to take ages to complete, but it’s a necessary evil that has to be dealt with.

    icons_UI_inventory_lowpoly_floatlands.pn
    New icons for UI/Inventory screen


    Improving resource helper
    TADEJ VRANEŠIČ

    This week was all about improving the resource helper. I made a simple server/client FTP connector, which resolves version disputes quite easily. It has Upload (push) and Download (pull) functions, which write to and read from the server. This storing and reading technique will be used heavily internally, but it also has potential as a modding tool: a user could download an item resources file with settings different from the ones set on their PC, completely altering the behaviour of Floatlands, and after they are done “playing around”, they can simply download the official latest version, which resets the game’s behaviour to default.
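    As a rough illustration of what a push/pull FTP connector can look like, here is a generic sketch using Python's standard ftplib. The file name, the integer versioning, and the example host are invented placeholders, not Floatlands' actual tool:

```python
import io
from ftplib import FTP

RESOURCE_FILE = "item_resources.json"  # hypothetical resource file name

def push(ftp: FTP, local_bytes: bytes, remote_name: str = RESOURCE_FILE) -> None:
    """Upload (push) the local resource file, overwriting the server copy."""
    ftp.storbinary(f"STOR {remote_name}", io.BytesIO(local_bytes))

def pull(ftp: FTP, remote_name: str = RESOURCE_FILE) -> bytes:
    """Download (pull) the server's resource file, e.g. to restore defaults."""
    buffer = io.BytesIO()
    ftp.retrbinary(f"RETR {remote_name}", buffer.write)
    return buffer.getvalue()

def should_pull(local_version: int, remote_version: int) -> bool:
    """Resolve a version dispute: pull whenever the server copy is newer."""
    return remote_version > local_version

# Usage against a real server (host and credentials are placeholders):
# with FTP("ftp.example.com") as ftp:
#     ftp.login("user", "password")
#     data = pull(ftp)
```

    The same pair of calls covers both the internal workflow and the modding case described above: push a tweaked file, play with it, then pull the official one to reset.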


    World bounds
    VILI VOLČINI

    Lately I’ve been working on world bounds, which are automatically generated based on a bounds size. The intent is to prevent players from falling off the world.
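    The core of "bounds generated from a size value" is simple enough to sketch; this is Python pseudocode for the engine-side idea (a square region centered on the origin is an assumption, as is clamping rather than, say, an invisible wall):

```python
def make_bounds(size: float) -> tuple[float, float]:
    """Generate square world bounds centered on the origin from one size value."""
    half = size / 2.0
    return (-half, half)  # min/max, applied to both x and z

def clamp_to_bounds(x: float, z: float, size: float) -> tuple[float, float]:
    """Keep a position inside the generated bounds so nobody falls off the world."""
    lo, hi = make_bounds(size)
    return (max(lo, min(hi, x)), max(lo, min(hi, z)))
```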

    world_bounds_lopwoly_floatlads.png?fit=7
    World bounds


    Charting software
    VILI VOLČINI

    I was searching for charting software for planning, because it felt like we were somewhat in the dark. I found TeamGantt; their charts let you plan tasks on a timeline and, more importantly, make tasks depend on each other – “if Y depends on X, then we must finish X before working on Y”. With this software, we can estimate how much work there is to do and how much time it will take.
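    The dependency rule quoted above ("if Y depends on X, then we must finish X before working on Y") is exactly a topological ordering, which is what Gantt tools compute under the hood. A tiny sketch with made-up task names, using Python's standard graphlib (3.9+):

```python
from graphlib import TopologicalSorter

# Hypothetical tasks: each maps to the set of tasks it depends on.
tasks = {
    "compass UI": {"world bounds"},
    "grass shader": set(),
    "world bounds": set(),
    "playtest build": {"compass UI", "grass shader"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(tasks).static_order())
```

    A cycle in the dependencies (X needs Y, Y needs X) raises an error here, which is also the signal in a Gantt chart that the plan itself is impossible.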

    gantt_charting_lowpoly_floatlands.png?fi

    Gantt online charting software

     

    Serene track
    CHRIS PLEXIDAS

    From now until the release date, every blog post will come with one track of the official Floatlands soundtrack, along with remixes and experimental tracks. Happy listening, and let’s make a start with the track Serene.

    Serene is a track that sets out to make you feel the vastness of the scenery in Floatlands. A multitude of layered, swelling synthesizer patches guide you slowly into dazzling, dreamlike worlds.


    More about Floatlands:


     

  18. Creating an awesome trailer for your indie game (on a budget)

    In this post I’ll talk about:

    • Background information about our trailer;
    • Features and spec of our trailer;
    • General advice to make a good trailer;
    • How to create an awesome trailer for your indie game (on a budget);
    • Translating the trailer (and website);
    • Preparing the video to be shared;
    • Make all your effort worth it.
    Background information

    The teaser trailer for “3 Minutes to Midnight” (Scarecrow Studio’s first point-and-click adventure game) has been officially released today! At Scarecrow Studio we couldn’t be prouder! In fact, we’ve been working on this teaser for the whole of last month, without losing focus on the point-and-click part of this adventure game.

    First, a few specs about the “3 Minutes to Midnight” trailer:

    • Voices in English.
    • 16 subtitle languages (English, Spanish, Chinese Simplified, Chinese Traditional, Czech, Polish, German, French, Italian, Portuguese, Hindi, Turkish, Catalan, Korean, Japanese, and Russian);
    • Length: 2:20;
    • Resolution: 4K;
    • Youtube Link.

    General advice for creating a good trailer

    • Keep it simple and to the point; avoid unnecessary logo intros. Cut to the chase – you don’t want people to close your video before even seeing the actual footage;
    • Avoid showing too many black screens with text. That’s a resource many indie games use because they don’t have enough material. If you don’t have enough material, don’t make a video; it will hurt you more than it helps;

     

    • First impressions count! Don’t ever think they don’t; So try to make your first impression a good one;
    • Show the video first to people who wouldn’t mind hurting your feelings; That means forget about family and friends, you want to know the truth about what you created, not someone who tickles your ears;
    • Make it short and interesting; try to keep the viewer’s attention the whole time. We managed to make it interesting, which is why our video is 2:20 long; however, I would advise you to keep the length between 60 and 90 seconds;
    • Make sure you show what your game is about and the main features.

    Now, how did we do it? Aka, creating an awesome trailer for your indie game (on a budget)

    My first piece of advice would be to make a list of the features that make your game stand out. After creating the list, make sure every item on it is shown in your trailer. Since there’s a huge variety of games, I’m going to use our game, “3 Minutes to Midnight”, as an example. The features that make our game stand out are:

    • Environment art;
    • Character design;
    • Fluid animations;
    • Great story and background stories of the characters;
    • Voices in English (and translations into 16 languages);
    • A high dose of humor.

    So, how could we show all that in our teaser trailer without spending a lot of time (money) on it? At that point we had no gameplay ready, so we couldn’t show that. At the same time, we had a lot of material we couldn’t show (to avoid major spoilers or ruining the story), and the final script was still in the works.

    We wanted to create something unique and original while re-using some of the material we already had, keeping game development unaltered. Creating a trailer video to show what we were doing allowed us to start promoting the game.

    The idea was to present the game as a movie that is about to be filmed. For instance, we could use the scenes we already had as the trailer’s background sets. The same went for the characters: we could use them in their already-animated poses from the game. Moreover, we could write a parallel script pretending the characters of the game are ‘actors’. The script should also clearly show the game’s sense of humor. This also allowed the voice actors to begin the voice-over before the trailer was done.

    Translating the game

    A big piece of advice: do all you can to have enough budget to reach as many users as possible. We realized how important localization and translation are, and we believe players will certainly appreciate it. In case your budget is really, really tight, here are the most important languages, in order of number of users:

    • English | Spanish | Russian | Chinese | Portuguese | German | French | Polish | Turkish

    We hired freelancers for the translation (our budget doesn’t allow permanent positions for this task). To find the right freelancers we used a couple of websites (I’m not going to list them here, but google “freelancers” and you’ll find them easily).

    • First, search for freelancers and sort them by reviews and amount earned (that will ensure quality);
    • After that, contact them and negotiate terms and costs (really important to do this beforehand);
    • Then, have them do a test (a small one; you might get it free of charge);
    • Finally, have someone else proofread the test to see how many mistakes the first one made, and repeat the process until you find the right person.

    Important advice when dealing with other languages:

    • Make an Excel file with all the sentences in one column;
    • Always specify the gender of the speakers, even if the character is talking to him/herself (some languages change completely depending on the speaker’s gender);
    • Be really careful with rhymes (they might work in your language, but in others they will need a lot of localization effort; in the end, that might either increase the cost or lose the meaning);
    • Also, don’t use expressions or sentences that only make sense in your country (such as inside jokes that only people from your country would understand); that will save you a lot of time explaining things to the translators. Simply try to make the process easy and smooth;
    • Try to write short sentences and use as much punctuation as you can; avoid long sentences AT ALL COSTS! A long sentence might force the translator to paraphrase it completely, and it might lose the sense you were originally aiming for.

    Since you are an indie company and your resources are limited, you don’t want to spend a lot of time answering questions from the translators. In our case we have 16 languages; imagine answering questions from 16 people at the same time.
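    To make the spreadsheet advice concrete, here is a minimal sketch of a script that generates such a sheet: one sentence per row, with an explicit speaker-gender column and an empty column for the translator to fill in. The line IDs and dialogue are invented for illustration:

```python
import csv
import io

# Hypothetical dialogue lines. Gender is spelled out per row because some
# target languages inflect the whole sentence on the speaker's gender.
lines = [
    ("intro_01", "female", "I can't believe we made it."),
    ("intro_02", "male", "Keep your voice down."),
]

def write_translation_sheet(rows) -> str:
    """Build the CSV the translators receive: id, speaker gender,
    source text, and an empty translation column."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["line_id", "speaker_gender", "english", "translation"])
    for line_id, gender, text in rows:
        writer.writerow([line_id, gender, text, ""])
    return out.getvalue()

sheet = write_translation_sheet(lines)
```

    One sheet per language keeps the 16 parallel efforts independent, and the stable line IDs let you merge everything back later without matching on the English text.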

    Preparing the video to be shared

    One of the main features of our game is that it’s going to be in 4K. A 4K, 2:20 video is about 53GB when it comes out of editing software, so we recommend not uploading that directly. Why not:

    • When you upload a video that big, YouTube will automatically resize it, which means you’ll have to wait until YouTube processes the whole video.
    • You can’t control the output quality, since YouTube’s algorithms decide how the file is processed.

    Recommendations:

    • After creating the video in your editing software, find out how that software generates a YouTube-ready video. What you get by doing it this way:
    • A smaller video, which uploads much quicker – the bigger the file, the longer the upload takes and the higher the chance of something interrupting it.
    • YouTube won’t alter a YouTube-ready video at all.
    • Your video will be available right after the upload, with no waiting, so you can start working on it immediately.
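    If your editor can't produce a YouTube-ready file, a common alternative is ffmpeg with H.264 video, AAC audio, and the "fast start" flag that moves the index to the front of the file. The sketch below builds such a command in Python; these are widely used generic settings, not necessarily the ones Scarecrow Studio's editor produces, and the quality numbers are assumptions you should tune:

```python
import subprocess

def youtube_export_cmd(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command for a YouTube-friendly upload."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",          # H.264 video
        "-crf", "18",               # high quality, far smaller than an editor master
        "-preset", "slow",          # better compression at the cost of encode time
        "-pix_fmt", "yuv420p",      # widest player compatibility
        "-c:a", "aac", "-b:a", "384k",
        "-movflags", "+faststart",  # index at the front of the file
        dst,
    ]

# Usage (file names are placeholders):
# subprocess.run(youtube_export_cmd("trailer_master.mov", "trailer_4k.mp4"), check=True)
```

    At CRF 18 a 2:20 4K clip typically lands in the low single-digit gigabytes rather than 53GB, which is the whole point of the re-encode before upload.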

    Uploading the video and working with subtitles:

    • We suggest uploading it in one place – in our case, YouTube. Since you are indie and small, try to concentrate all the viewers, visits, and comments in one spot.
    • Create your own channel. If people like it they will subscribe, and your updates will reach the people who are actually interested in your game.
    • Create a good description of your game. You obviously know what it’s about, but explain it clearly to other people. In our case, it’s a 2D classic point-and-click adventure game. Help people find out more about the game: add a link to your website or social media in the description.
    • Work at the best quality you can – in our case 4K – and YouTube will automatically create duplicates at lower resolutions so anyone can watch it.
    • Fill in all the information about the video: tags, description, suitable for all viewers, etc.

    Make all your effort worth it!

    Let people know about all the languages your game supports, IN THEIR LANGUAGE. Make sure your website has at least one page that talks about the game in every language your game will be available in. In our case, we made the whole website in English except the press kit; our press kit is in 16 languages and auto-generates its content depending on your browser’s preferred language. Give it a try if you want:
    Press Kit

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (2).png

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (3).png

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (4).png

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (5).png

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (6).png

    Point-and-Click Graphic Adventure Game - 3 Minutes to Midnight (From Scarecrow Studio) (7).png

  19. We spent 3 days making a game mode for the UE4 WinterJam. Here is what we were able to accomplish. It is 100% multiplayer, and it also works without anybody else (with bots). Let me know what you think, and report any and all bugs you might find!

    Download the game and play from our itch.io link: https://riuthamus.itch.io/cracked-ice

     
