Everything posted by SillyCow

  1. SillyCow

    How to avoid bugs

    My philosophy is: "Bugs will happen, so make sure you mitigate them." Make each bug as easy as possible to detect and find:
    - Write automatic system tests (ex: write an automatic test where the AI plays your game).
    - Make sure you have satisfactory code coverage.
    - Make small commits, so that you can later browse through them and see which one introduced the bug.
    - Try not to use untyped languages or data. Python, JavaScript, etc. tend to surface bugs only at run time.
    - When using C++, try to avoid using pointers whenever possible: Can a function use a const reference instead? Can I use a smart pointer without hurting performance too much? This pays off 100x when you are multithreading. (See the sketch after this list.)
    - Use logging as much as possible (when it doesn't cause performance problems). That way you can investigate a sporadic bug when it happens.
    - Make sure you have the option of IDE debugging. (This can sometimes require defensive coding, as not all languages are IDE friendly.)
    - Write small functions: if a function does not fit on a screen, you should probably break it down. That way, bugs remain contained.
    - etc...
    Basically: bugs will happen, and you cannot do much about that. However, make sure that you have favourable conditions to: 1. Understand that you have a bug as soon as possible. 2. Isolate the bug (find out what is causing it and what is affected by it). 3. Have the right tools to fix it quickly.
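    To make the pointer advice concrete, here's a minimal C++ sketch (the types and names are made up for illustration, not from any real codebase): a read-only helper takes a const reference, and ownership goes through std::unique_ptr so nothing has to be deleted by hand.

        #include <memory>
        #include <vector>

        // Hypothetical example type, just to illustrate the advice above.
        struct Enemy { float x = 0.f, y = 0.f; };

        // Read-only access: take a const reference instead of a raw pointer,
        // so the function cannot accidentally modify or delete the object.
        float DistanceSquared(const Enemy& a, const Enemy& b) {
            const float dx = a.x - b.x;
            const float dy = a.y - b.y;
            return dx * dx + dy * dy;
        }

        // Ownership: a unique_ptr frees the Enemy automatically, so there is no
        // "forgot to delete" or "deleted twice" class of bug to hunt for.
        std::unique_ptr<Enemy> SpawnEnemy(float x, float y) {
            auto e = std::make_unique<Enemy>();
            e->x = x;
            e->y = y;
            return e;
        }

        int main() {
            std::vector<std::unique_ptr<Enemy>> enemies;
            enemies.push_back(SpawnEnemy(0.f, 0.f));
            enemies.push_back(SpawnEnemy(3.f, 4.f));
            return DistanceSquared(*enemies[0], *enemies[1]) > 0.f ? 0 : 1;
        }   // all enemies are released here automatically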
  2. I would consider working for equity under the following conditions: 1. I am very passionate about the company's tech. 2. I get a unique opportunity to further my career (like getting VP status very early in my career). 3. I believe the company has a sound business plan: more likely to go for a sure million than a longshot billion. When it comes to making games, 1 is easy, but 2 and 3 are hard.
  3. SillyCow

    How to avoid bugs

    For me it's automatic regression tests with ample code coverage. I find system tests do the job better than unit tests here. My most complicated hobby project to date was a multiplayer RTS game. There are lots of bugs that can cause the game to go out of sync. So I created a test where two bots played each other automatically for every build that I made. This was a complete end-to-end test: there were actually two separate instances of the program playing each other over the network. All of this was launched automatically, so it was painless for me to test. It took me several days to set this up, but it helped me catch 75% of my bugs. In a hobby game this is especially important, because bugs demotivate you. (A rough sketch of the idea is below.)
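    To give a rough idea of what such a desync test looks like, here is only a sketch with a made-up GameSim stand-in, not my actual harness (which ran two full game instances over the network): run two deterministic simulations with the same seed and inputs, and fail the build on the first checksum mismatch.

        #include <cstdint>
        #include <cstdio>

        // Hypothetical deterministic simulation; a stand-in for the real game core.
        struct GameSim {
            std::uint64_t state;
            explicit GameSim(std::uint64_t seed) : state(seed) {}
            void Step(int botCommand) {                 // same inputs -> same state
                state = state * 6364136223846793005ULL
                      + static_cast<std::uint64_t>(botCommand) + 1442695040888963407ULL;
            }
            std::uint64_t Checksum() const { return state; } // stand-in for hashing the game state
        };

        // Minimal "two bots play each other" regression test: run two instances with
        // the same seed, feed them the same commands, and fail on the first divergence.
        int main() {
            GameSim a(42), b(42);
            for (int tick = 0; tick < 100000; ++tick) {
                const int command = tick % 7;           // stand-in for the bots' scripted moves
                a.Step(command);
                b.Step(command);
                if (a.Checksum() != b.Checksum()) {
                    std::printf("Desync at tick %d\n", tick);
                    return 1;                           // non-zero exit fails the automated build
                }
            }
            std::printf("Still in sync after 100000 ticks\n");
            return 0;
        }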
  4. Sometimes I enjoy grinding. Why would you want to remove it? Instead of punishing the grinder, reward the "brave" player! Add a mechanic where if you slay an advanced monster or complete an advanced dungeon you get some special reward. For example: if I kill a dragon at level 50, I get the usual reward loot. But if I kill the dragon at level 10, then the king rewards me with a special artefact for my valour. Maybe it even unlocks some part of the story. Maybe you can even add a "level down" mechanic (drinking alcohol in a pub 🙂 ) to allow players to level back down and make the game interesting again. It could be interesting to level up to earn certain traits, and then level down to start earning new traits instead. Or, to elaborate on your "getting harder" idea: make the player a fugitive. The longer they stay in one place, the more run-ins they will have with "the law" or the assassins chasing them.
  5. Actually, assuming a single-screen game/app, it is usually useful to select one dominant axis as 1.0 and scale the other according to the aspect ratio. Otherwise your image will get stretched. (A small sketch is below.)
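    A minimal sketch of what I mean (the names are made up for illustration): the width is the dominant axis, so x runs from 0 to 1 and y is divided by the same factor, which keeps proportions intact.

        #include <cstdio>

        // Pick the width as the dominant axis (x runs 0..1) and scale y by the same
        // factor so a "square" of 0.1 x 0.1 stays square on screen.
        struct Screen { float widthPx, heightPx; };

        void PixelToNormalized(const Screen& s, float px, float py, float& nx, float& ny) {
            nx = px / s.widthPx;    // 0..1 across the screen
            ny = py / s.widthPx;    // same scale as x, so no stretching
        }

        int main() {
            Screen s{1920.f, 1080.f};
            float nx, ny;
            PixelToNormalized(s, 1920.f, 1080.f, nx, ny);
            std::printf("bottom-right corner = (%.4f, %.4f)\n", nx, ny);  // (1.0000, 0.5625)
            return 0;
        }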
  6. If your game doesn't scroll, I would recommend making the top left corner (0,0) with coordinates increasing towards the bottom right. This is in line with how popular image files are encoded. (I assume you will be loading sprites from PNG or JPG files...) If you are making a scrolling game (Mario), it doesn't really matter; the center of the screen is as good an origin as anything. PS: I wouldn't spend too much time thinking about it, just start drawing stuff and see what happens. Eventually you can make any coordinate system work. Just try *something* and see if it works for you. (It's more important to start prototyping and gain some experience.) Even if you end up making a mistake, you will have more tools to understand it.
  7. First you need to decide what kind of 2D game you want to have. I usually normalise my coordinates to one of two things: If it's a single-screen game (no scrolling), I usually choose one of the screen's axes (usually the width) as 1.0. This is useful when you do GUI work: menus, buttons, etc. (ex: card games). If it's a scrolling tile-based game, I usually normalise to the tile size: 1 tile = 1.0 x 1.0 (ex: strategy games / platformers). It doesn't really matter how you normalise your coordinates, but it's useful to normalise them to something meaningful to your game. (In anything with physics I usually use 1.0 = 1 meter.) A sketch of the tile version is below.
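    A small sketch of the tile-normalised version (Camera and tileSizePx are made-up names for illustration): game logic works in tile units, and pixels only show up when converting for drawing.

        #include <cstdio>

        // World coordinates in tiles (1 tile = 1.0 x 1.0); pixels only appear at draw time.
        struct Camera { float xTiles, yTiles; float tileSizePx; };

        void WorldToScreen(const Camera& cam, float wx, float wy, float& sx, float& sy) {
            sx = (wx - cam.xTiles) * cam.tileSizePx;
            sy = (wy - cam.yTiles) * cam.tileSizePx;
        }

        int main() {
            Camera cam{10.f, 5.f, 32.f};               // camera at tile (10,5), 32 px per tile
            float sx, sy;
            WorldToScreen(cam, 12.5f, 6.f, sx, sy);    // a unit standing at tile (12.5, 6)
            std::printf("draw sprite at (%.0f, %.0f) px\n", sx, sy);   // (80, 32)
            return 0;
        }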
  8. Good C++ is always faster than good C# or Java. However, the point I was trying to make about game engines is that even if you write your game logic in C#, it is very rare for most of your execution time to be game logic, and the actual engine core code is usually written in C++. Compare this to the mobile browser wars of earlier this decade: Android always boasted a faster JavaScript engine than iOS, yet mobile Safari always blew it away in terms of performance. Why? Because when dealing with multimedia websites, JavaScript was rarely the performance bottleneck. It was the browser's rendering engine which was doing all the heavy lifting. Game engine performance tends to work in much the same way. However, if you are one of the rare few doing cutting-edge CPU work on your game, then you should be using C++. It's not about "no heap allocation whatsoever", but rather about not doing any real-time heap allocations. Try not to call "new/malloc" when your game is running. If you are using C++, try not to call "delete/free" when your game is running. Just preallocate as much as you can at the beginning and recycle it as much as you can. For example: I wrote a rather complex pathfinding algorithm for an RTS. The performance more than doubled when I "recycled" my memory structures when I was done with them, so subsequent calls would just reinitialise the memory structures from the previous calls. (See the sketch below.)
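    A minimal sketch of the "preallocate and recycle" idea (this is a stand-in, not the actual RTS pathfinder): the scratch containers are reserved once and cleared between calls, so no heap allocation happens while the game is running.

        #include <cstddef>
        #include <vector>

        // Node and the "search" below are stand-ins, not the original code.
        struct Node { int x, y; float cost; };

        class Pathfinder {
        public:
            explicit Pathfinder(std::size_t maxNodes) {
                open_.reserve(maxNodes);               // allocate once, up front
                visited_.reserve(maxNodes);
            }

            // Called every request; reuses the buffers allocated in the constructor.
            std::size_t FindPath(int startX, int startY) {
                open_.clear();                         // clear() keeps the capacity
                visited_.clear();
                open_.push_back({startX, startY, 0.f});
                while (!open_.empty()) {               // stand-in "search" loop
                    visited_.push_back(open_.back());
                    open_.pop_back();
                }
                return visited_.size();
            }

        private:
            std::vector<Node> open_;
            std::vector<Node> visited_;
        };

        int main() {
            Pathfinder pf(4096);
            for (int i = 0; i < 1000; ++i)             // many calls, no per-call heap traffic
                pf.FindPath(i, i);
            return 0;
        }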
  9. If it's only the GC that's bothering you, you can circumvent this problem by not using the "new" keyword too often and recycling your classes. Using "new" in C++ isn't such a great idea either, because it will lead to a lot of memory fragmentation. I tend to try and avoid heap memory allocation in real-time games altogether. You are completely right that in Unity's C#, for example, "new Vector3" does not create a reference. But what kind of game are you making (and on what hardware?) where this is a real performance issue? Today's smartphones are powerful enough to overcome most of this, and the really core performance code in engines is written in C++ anyway. Whether you are using Unity or Unreal, both engines' core functions are not written in a scripting language. While you might be coding for some extremely low-spec device, I doubt any of the engines target such devices, so I assume you are talking about smartphones here. And with smartphones, it's usually the GPU's fragment shaders that are the performance bottleneck. On the CPU side, unless you are writing your own physics or some really fancy AI, I doubt any of this would matter, unless you are not optimising your code (making too many real-time memory allocations and such). But you can write bad code in any language...
  10. The best code completion and refactoring tools for the other popular languages (JavaScript [WebStorm], Python [PyCharm], C++ [Visual Assist], etc.) do not come close to what you get for C# and Java. In C# and Java I sometimes create whole classes and interfaces just by using automatic refactoring tools. In C++, for example, refactoring doesn't yield the same trusted result, and usually fails for all but the most trivial refactoring tasks. The same goes for auto completion. I think the preprocessor is to blame for this, and many engines make heavy use of preprocessor directives.
  11. This is your problem, not Godot's. I don't want to sound mean, but if you're having trouble with Godot's scripting language then you aren't going to have a fun time with any engine. One of the great things with Unity (when you use it with Visual Studio) is the great IntelliSense support. I rarely have to look up an API on the internet; it's usually all there when I press "Ctrl + Space". This is one of the reasons I prefer C# (Unity) to other scripting languages (C++ [Unreal] or custom languages). C# is great for IDE functions, and therefore easier to learn (while I am a good C# programmer, I don't necessarily remember all of Unity's specific API functions). This is an example of a "good" language property of C# and Java compared to other languages: they are much more "well defined" and thus easier for an IDE to parse. Therefore, you can get the IDE to analyse your code for you, and even write code for you (automatic method creation and extraction). They will always win in this regard compared to JavaScript/Python/C++. The number of times I have to open a browser and google something when I work with Unreal is much larger, because C++ is harder for the IDE to parse.
  12. It's true that in 2006 that model gave the best graphics cards a run for their money, and today you could render 10 of them in game at the same time on a mid-range card. However, it was not the same as witnessing Doom 3's lighting for the first time. Doom 3's lighting looked like nothing you had ever seen in real time before. Do most games look like the GIF you posted? I'd argue that most games wouldn't put in the time to create that face artistically. On the other hand, I don't think the original Quake had a lot of "art" in it. It was more about optimising rendering techniques which had not been seen in real time before. I'm not against artwork or anything like that 🙂 . It's just that, as a programmer with a lifelong infatuation with graphics, I don't get as excited by this tech anymore. Maybe I'm just old, but then again I don't remember any game marketing heavily on its engine. Maybe "No Man's Sky", but their engine's uniqueness didn't lie in its graphics.
  13. As a gamer I prefer gameplay to graphics. Heck, as an "old" gamer, I sometimes play some of my old terrible-looking favourites just because I like the gameplay. However... as a hobby engine programmer I get really excited by graphics technology. The first 3D games that I played (Wolfenstein and Ultima) made my mind explode. I had to figure out how they were made! It resulted in me learning a lot of math and building several software-based 3D engines. By far, this is the reason I am a programmer today. So there is nothing like a cool programming technique to get me interested in a game. These are the games that really geeked me out when I saw them. When I call them "first", I mean the first games I saw as a kid; it doesn't mean they were the first to use a technique.
    - Wolfenstein: first "3D" game I played.
    - Novalogic's Comanche: first time I saw voxel graphics (it was beautiful!).
    - The Seventh Guest: first fully rendered, fully animated environment. (This made me learn 3D modelling software.)
    - Quake 2: first 3D game I ran with H/W acceleration.
    - GTA 3: first game with a seamless 3D open world.
    - Doom 3: first game with modern lighting techniques.
    - Crysis: first game with modern shader code.
    These are not necessarily my favourite games to play, but as a programmer each of them caused me to pick up a book or create my own version of them. Unfortunately for my programming self, most games today invest in "art" rather than graphics. They create bigger and more impressive set pieces, but I haven't really encountered any revolutionary rendering system that knocked my socks off. I sort of get the feeling that the 3D graphics revolution is over. We are now more limited by what artists can create than by what the computer can render. Ex: creating realistic faces today is more about putting in the 3D modelling work than inventing new rendering techniques. This Nvidia demo from 2006 looks good enough today (12 years later). The reason most games don't look like this isn't missing graphics tech; rather, it's that it requires a daunting amount of artwork. So I think since DX10 (when the pipelines became flexible) we are at an age where art is more important than graphics. This is sort of reminiscent of 1990s CG movies like Terminator 2, where viewers flocked to theatres to see what the latest CG could do, whereas today (I'd say since around Lord of the Rings) if you notice the CG it means it's bad. I actually liked the parts in "The Last Jedi" where they were CG'ing the admiral and the princess (badly). It made me feel like a kid again, and someone was finally taking some risks with new graphics technology. I am tempted to say that no more GFX revolutions await us in gaming. But I wonder what the next revolutionary tech will be. Fully ray-traced games? Something else? I personally have not gotten excited about a game engine since "Crysis".
  14. SillyCow

    Defining AAA

    http://www.mobygames.com/game/windows/stalker-shadow-of-chernobyl/credits This team does not look tiny to me. For god's sake they have the "Prague Symphonic Orchestra" making their music for them! This is the difference between an indie and a funded studio (I will not go as far as calling GSC "AAA"). Both have 3-4 lead engine programmers. But the funded studio will hire an intern to make the particles look slightly better.
  15. SillyCow

    Defining AAA

    Here's a thought: the current discontent over indie vs AAA has nothing to do with the quality of either. It has more to do with mainstream attention. AAAs have made the entry barrier so high that your chances of getting a spotlight as a small studio have plummeted. It's not that AAA games have gotten worse, and it's not that small games have gotten better. It's just that the percentage of household brands which are also "indie" (or "developed in a garage", as one might have said in the 80s) has dropped. That is because a household name means you are one of the top 20(?) grossing games in your category. $1M cut it in the past, but today you need to make $50M to be in the top-selling charts. The indie game can still make $1M, just as before; it's just that making $1M is considered non-mainstream today. It will amount to a game which only a small percentage of gamers have played. One might argue that quantitatively the number of players remains the same; it's just that selling 10k copies today isn't considered much.
  16. SillyCow

    Defining AAA

    I would define this as the time when small passionate teams (sometimes one-man teams) accounted for gaming blockbusters. This was a time when no one thought you could ever make billions from video games, so no one thought a multimillion dev budget would be justifiable. Think about a hit like Prince of Persia. It was a game with unbelievable production value for its time. I'm not talking about innovation, just production value. And it was largely made by one entry-level programmer. I think in those "romantic" times it was expected that a small passionate indie team *could* publish a polished hit and compete with the big guys. It was probably the same earlier on (in the Atari period), but I started gaming in the 80s, so I am not sure. Today I don't think any indie believes they can compete with the production value of a AAA title. Sure, you can strike gold with innovation (ex: Minecraft), but there will be no indie "Call of Duty". If you compare blockbuster indie developers Crytek to id Software, you will see that Crytek needed a much bigger team to make a blockbuster than id did. Whereas id were just "a couple of kids", the very talented Yerli brothers needed an established team around them (and a AAA publisher) to break out.
  17. SillyCow

    Motivation

    I use the drunken self-messaging method from "How I Met Your Mother". (The drunken characters would leave answering machine messages to themselves when drunk so that they would remember important stuff in the morning.) I make sure to leave an intentional error where I left off in the code. A lot of the time that error is a self-directed comment without the "//" prefix. That way I can immediately find my place in the code and continue working. Something like:

        int a = math.pow(a, 2);
        Note to self: the squared function makes the physics look bad, try something else    // this line will not compile :-)

    Then all I need to do is start up my project and get a compilation error on a certain line. And seeing that red compiler message immediately gets me motivated to keep working!
  18. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    I also don't get all the anti-IDE hate. I see it in "the younger generation" regardless of whether it's a gamedev or a regular dev job. It's become very fashionable to minimise your tool chain. I don't get it; I feel old 😞 ... I'm always confused when I see my teammates using text editors to code. But this is definitely a growing trend.
  19. SillyCow

    Defining AAA

    I would argue that "nowadays" might be wrong. In the romantic days of gaming there was no big budget involved (except for the console market). What defines our era is that most game sales revenue is AAA (for good and for bad). I specifically remember that when COD: MW2 came out, it made more money than any summer blockbuster. That was the moment I understood that the gaming industry had changed. It had made the transition into an "industry". Here you go. I swear I googled this link only after I wrote my reply above, but look who's on top 🙂 . And also look at how much more they spent on marketing! And regardless of whether you think it's a good game or not, it proves that AAA marketing pays off in the end. https://en.wikipedia.org/wiki/List_of_most_expensive_video_games_to_develop
  20. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    But there is a very valid reason to bring up Unreal: Unity 2017 is a middleweight and Unreal is a heavyweight engine. I think I may have misused the word "heavy". I didn't mean feature-wise, just the footprint. The Unreal Editor is slower than Unity (I'm not talking about the actual resulting game performance), takes up more space, uses more memory, and requires a stronger machine. I am not talking about the number of features or the resulting game quality, just what kind of specs you need to gamedev with it.
  21. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    Visual Studio is an awesome dev tool if you are using C#. I always prefer Visual Studio over MonoDevelop. The only time I used MonoDevelop was when I had to develop stuff on a Mac. By the way, what's Unity's solution for Mac and Linux in that regard? "Visual Studio Code" is not Visual Studio; it doesn't have half the features or customisability that "Visual Studio Community" has. It's actually more similar to MonoDevelop in that respect. You should not bring Unreal into this conversation. Besides being a much "heavier" engine, Unreal is C++ based, and C++ requires a much more powerful IDE than C# or JavaScript to get stuff done. I haven't encountered a lightweight C++ IDE yet (just text editors). Because of the cross-module dependencies created by the preprocessor, C++ is much more complex for an IDE to understand. So when I work with C++ in Unreal I don't dare use anything other than Visual Studio. There are many valid reasons to use Unreal, but none of them are "small editor footprint" or "easier coding experience".
  22. SillyCow

    Scrum methodology

    Doesn't conditional branching (conditional jump instructions) hurt branch prediction? Of course virtual functions have their own disadvantages, but I always assumed an abundance of conditional jumps has its own performance problems as well, certainly when there are a hundred of them. (The sketch below shows the two styles I mean.)
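    To make the comparison concrete, here is a small sketch of the two styles being discussed (the types are made up, and I'm not claiming either is faster; that depends on how predictable the data is): a per-entity chain of conditional branches versus a single indirect call through a vtable.

        #include <memory>
        #include <vector>

        // Style 1: one type tag and a chain of conditional branches per update.
        enum class Kind { Soldier, Tank, Medic };
        struct Unit { Kind kind; float hp; };

        void UpdateBranchy(std::vector<Unit>& units) {
            for (auto& u : units) {
                if (u.kind == Kind::Soldier)      u.hp += 1.f;
                else if (u.kind == Kind::Tank)    u.hp += 2.f;
                else /* Kind::Medic */            u.hp += 3.f;
            }
        }

        // Style 2: virtual dispatch -- one indirect call instead of a branch chain.
        struct IUnit {
            virtual ~IUnit() = default;
            virtual void Update() = 0;
            float hp = 100.f;
        };
        struct Soldier : IUnit { void Update() override { hp += 1.f; } };
        struct Tank    : IUnit { void Update() override { hp += 2.f; } };

        void UpdateVirtual(std::vector<std::unique_ptr<IUnit>>& units) {
            for (auto& u : units) u->Update();     // indirect branch per element
        }

        int main() {
            std::vector<Unit> flat(100, Unit{Kind::Tank, 100.f});
            UpdateBranchy(flat);

            std::vector<std::unique_ptr<IUnit>> poly;
            poly.push_back(std::make_unique<Soldier>());
            poly.push_back(std::make_unique<Tank>());
            UpdateVirtual(poly);
            return 0;
        }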
  23. Would it be possible to use GameDev.net to manage my public game project? I am looking to build an open-source game, and I really like the community here. I would much rather use a feature in these forums than open a Slack for my project (because I like the community here). Ideally it would be a place for brainstorming and getting members (gamedev.net members) to contribute to the project (if they want to). Does this functionality already exist here?
  24. After spending some time with my Oculus Rift, I would like to create a combat game. I am looking for weapon designs which would translate well into VR. The current problematic weapons that I've seen are: 1. Regular swords: you have no feedback with swords. Eventually, in sword games, I find that I just tend to stick them "inside" my opponent and flail them around until I kill them. They just go through the bodies of your opponents, breaking immersion. 2. Guns (and ranged weapons with a trigger): quite disappointing to see so many "Virtua Cop" clones (I love Virtua Cop!). This is becoming the FPS cliché style for VR combat. And what does using your hands for pointing add over a non-VR experience (pointing with the mouse)? So I am looking for better weapons, ones that would feel right when you use them. For example: a flail. The swinging balls would not require you to hit your hand against something, and their bounce would also provide good feedback on how hard something was hit. I have also seen archery done well (although I have mixed feelings about it, because it is really tiring to hold up both hands [in the same way real archery is tiring!]). So I was wondering: what kind of good VR weapons can you think of which "feel" right and would be special in VR? PS: I was impressed with how well "Superhot" manages this feeling by turning all characters into glass. That way it makes sense when you effortlessly cut through them. That's a really cool game design hack 🙂 .
  25. SillyCow

    Virtual Reality Weapons

    You are right, that started out being rhetorical. It was more of a statement on how repetitive VR shooting games are becoming. The game you are describing would actually be pretty interesting. Dead and Buried already has the sight mechanic nailed down pretty well, and also the cover mechanic. I'd be sceptical that someone could create a good reloading experience, though. If you've ever fired a gun (which it seems like you have), I would think that reloading is a tactile action: you feel the cartridge, you don't really look at it. That's why all the games I've played just have you "shaking" the barrel out of the gun to reload, which does not feel realistic, or any more complex than pressing a reload button. Putting that aside, a realistic gun game would actually be fun to play (so it would be a good idea!). But I don't really see how you can make guns more "tactile" than they already are in Dead and Buried or Robo Recall. I value tactility over "realism" because I think that's what makes the weapon feel good. But really, how many shooters can a guy play? I'm really looking for something other than a gun.