Everything posted by SillyCow

  1. Hello, Guten Tag! This sounds interesting. I am a developer based in Europe, and I am experienced in programming browser-based multimedia projects. I would be happy to discuss this with you. My questions:
     Is this game free? (I noticed you said it was not intended to make money.)
     If it is free, would you be willing to make it open source? (It would be easier to get people to work on it.)
     Is it a desktop game? (Browser games for mobiles are much harder to make.)
     Is it multiplayer? Is it real time? (How long is a turn?) Is it massively multiplayer?
     Is it a single persistent universe, or is it made up of separate game rooms? Would you need a sophisticated lobby?
     How many people are currently working on this game, and how did you start? Are you friends? Are you students? Are you all located in the same city/country?
  2. SillyCow

    What do game testers learn?

    I have encountered some really good testers in my non-games jobs. A good tester was really helpful to the team. I have also seen them advance through the ranks. The best testers I've known went beyond the testing script that they got:
    * They spoke to customers (they learned customer relations / focus groups).
    * They insisted on testing in better "field conditions" (they improved the testing scripts).
    * They got involved to the point where they were given outside tasks (they learned the company they were working in, and networked within it).
    I've seen testers transition to other jobs such as: developers, operational officers, salespeople / marketing / PR, and testing managers. That said, if you are already sure you want to be any of these things, just go ahead and start learning them. It will be quicker. But if you are not sure what you want, the barrier to entry as a tester is lower, so it is a good job for getting exposed to the industry. Although I probably wouldn't recommend a full-on degree in S/W testing. It should be an entry-level job.
  3. SillyCow

    How do you build your learning roadmap?

    I try to learn at least one new thing with every project that I start. It can be a new language (golang), tool (Kubernetes), platform (Oculus Rift), or algorithm (pathfinding). (The above list is a collection of things from my last 4 projects.) I learn through making a prototype, usually guided by step-by-step internet tutorials. I would say that the thing that sets game programmers apart from other programmers is interactivity. Your design needs to be user-facing, with all that that entails.
  4. SillyCow

    What are you working on?

    I am developing an Oculus Rift game with free-roaming mechanics. I am developing a "nausea-free" locomotion system which will enable an open-world shooter game (Far Cry X play style). Of course, since this is a hobby game, it will all be low-poly graphics and no epic story. Still, the prototype is already really fun to play!
  5. SillyCow

    How to avoid bugs

    My philosophy is: "Bugs will happen, so make sure you mitigate them". Make each bug as easy as possible to detect and find.
    * Write automatic system tests (ex: write an automatic test where the AI plays your game; see the sketch below).
    * Make sure you have satisfactory code coverage.
    * Make small commits, so that you can later browse through them and see which one introduced the bug.
    * Try not to use untyped languages or data. Python, Javascript, etc. will ensure that you only find bugs at run time.
    * When using C++, try to avoid using pointers whenever possible: Can a function use a const reference instead? Can I use a smart pointer without hurting performance too much? This pays off 100x when you are multithreading.
    * Use logging as much as possible (when it doesn't cause performance problems). That way you can investigate a sporadic bug when it happens.
    * Make sure you have the option for IDE debugging. (This can sometimes require defensive coding, as not all languages are IDE friendly.)
    * Write small functions: if a function does not fit on a screen, you should probably break it down. That way, bugs remain contained.
    Basically: bugs will happen, and you cannot do much against that. However, make sure that you have favourable conditions to: 1. Understand that you have a bug as soon as possible. 2. Isolate the bug (find out what is causing it and what is affected by it). 3. Have the right tools to fix it quickly.
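    A minimal, self-contained sketch of the "automatic test where the AI plays your game" idea. The TinyGame class here is a toy stand-in for real game logic; in practice you would drive your actual simulation in headless mode:

```csharp
// A bot plays thousands of turns and we assert cheap invariants every turn,
// so a bug fails close to its cause. TinyGame is a toy placeholder.
using System;

class TinyGame
{
    public int PlayerHealth { get; private set; } = 100;
    public int Gold { get; private set; } = 0;
    public bool IsOver => PlayerHealth <= 0;

    // One simulated action: fight (risky, earns gold) or rest (safe).
    public void Step(Random rng)
    {
        if (rng.Next(2) == 0) { PlayerHealth -= rng.Next(0, 15); Gold += 10; }
        else { PlayerHealth = Math.Min(100, PlayerHealth + 5); }
    }
}

static class BotSystemTest
{
    static void Main()
    {
        var rng = new Random(1234);        // fixed seed => reproducible failures
        var game = new TinyGame();

        for (int turn = 0; turn < 10_000 && !game.IsOver; turn++)
        {
            game.Step(rng);

            if (game.Gold < 0)
                throw new Exception($"Gold went negative on turn {turn}");
            if (game.PlayerHealth > 100)
                throw new Exception($"Health exceeded the cap on turn {turn}");
        }
        Console.WriteLine("Bot run finished without violating any invariant.");
    }
}
```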
  7. I would consider working for equity under the following conditions: 1. I am very passionate about the company's tech. 2. I get a unique opportunity to further my career (like getting VP status very early in my career). 3. I believe the company has a sound business plan: more likely to go for a sure million than a longshot billion. When it comes to making games, 1 is easy, but 2 and 3 are hard.
  7. SillyCow

    How to avoid bugs

    For me it's automatic regression tests with ample code coverage. I find system tests do the job better than unit tests here. My most complicated hobby project to date was a multiplayer RTS game. There are lots of bugs that can cause the game to go out of sync. So I created a test where 2 bots played each other automatically for every build that I made. This was a complete end-to-end test: there were actually 2 separate instances of the program playing each other over the network. All of this was launched automatically, so it was painless for me to test (a rough sketch of such a harness is below). It took me several days to set this up, but it helped me catch 75% of my bugs. In a hobby game this is especially important, because bugs demotivate you.
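    A rough sketch of such a bot-vs-bot harness. The executable name, the command-line flags, and the idea that each instance dumps a final state hash to a file are assumptions for illustration, not the actual project:

```csharp
// Launch two headless instances of the game as host and client, let the bots
// play, then compare their final state hashes to detect a desync.
using System;
using System.Diagnostics;
using System.IO;

static class BotMatchTest
{
    static Process Launch(string args) =>
        Process.Start(new ProcessStartInfo
        {
            FileName = "MyRtsGame.exe",      // hypothetical headless build of the game
            Arguments = args,
            UseShellExecute = false
        });

    static void Main()
    {
        var host   = Launch("--bot --host --dump-state host_state.txt");
        var client = Launch("--bot --join 127.0.0.1 --dump-state client_state.txt");

        host.WaitForExit();
        client.WaitForExit();

        // If the two simulations went out of sync, their final hashes differ.
        string hostState   = File.ReadAllText("host_state.txt").Trim();
        string clientState = File.ReadAllText("client_state.txt").Trim();

        if (hostState != clientState)
            throw new Exception("Desync detected: host and client ended in different states.");

        Console.WriteLine("Bot match finished in sync.");
    }
}
```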
  8. Sometimes I enjoy grinding. Why would you want to remove it? Instead of punishing the grinder, reward the "brave" player! Add a mechanic where if you slay an advanced monster or complete an advanced dungeon you get some special reward. For example: if I kill a dragon at level 50, I get the usual reward loot. But if I kill the dragon at level 10, then the king rewards me with a special artefact for my valour. Maybe it even unlocks some part of the story. Maybe you could even add a "level down" mechanic (drinking alcohol in a pub 🙂 ) to allow players to level back down to make the game interesting again. It could be interesting to level up to get certain traits, and then level down to start earning new traits instead. Or, to elaborate on your "getting harder" idea: make the player a fugitive. The longer they stay in one place, the more run-ins they will have with "the law" or the assassins chasing them.
  9. Actually, assuming a single-screen game/app, it is usually useful to select one dominant axis as 1.0 and scale the other according to the aspect ratio. Otherwise your image will get stretched.
  10. If your game doesn't scroll, I would recommend making the top-left corner (0,0), with sizes increasing towards the bottom right. This is in line with how popular image files are encoded. (I assume you will be loading sprites from PNG or JPG files...) If you are making a scrolling game (Mario), it doesn't really matter. The center of the screen is as good an origin as anything. PS: I wouldn't spend too much time thinking about it; just start drawing stuff and see what happens. Eventually you can make any coordinate system work. Just try *something* and see if it works for you. (It's more important to start prototyping and gain some experience.) Even if you end up making a mistake, you will have more tools to understand it.
  11. First you need to decide what kind of 2D game you want to have. I usually normalise my coordinates to one of two things: If it's a single-screen game (no scrolling), I usually choose one of the screen's axes (usually width) as 1.0. This is useful when you do GUI work: menus, buttons, etc. (ex: card games). If it's a scrolling tile-based game, I usually normalise to tile size: 1 tile = 1.0 x 1.0 (ex: strategy games / platformers). It doesn't really matter how you normalise your coordinates, but it's useful to normalise them to something meaningful to your game. (In anything with physics I usually use 1.0 = 1 meter.)
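    Here is a minimal sketch of those two normalisation schemes in plain C# (no engine assumed; the screen resolution and tile size are just example values):

```csharp
// Two common 2D coordinate normalisations: screen-width units and tile units.
using System;

static class Coords
{
    // Scheme 1: screen width = 1.0; divide BOTH axes by the width so nothing stretches.
    static (float x, float y) PixelsToScreenUnits(float px, float py,
                                                  float screenW, float screenH)
    {
        return (px / screenW, py / screenW);
    }

    // Scheme 2: tile-based world, 1 tile = 1.0 x 1.0 world units.
    static (float x, float y) PixelsToTiles(float px, float py, float tileSizePx)
    {
        return (px / tileSizePx, py / tileSizePx);
    }

    static void Main()
    {
        // On a 1920x1080 screen the bottom-right pixel maps to (1.0, 0.5625).
        Console.WriteLine(PixelsToScreenUnits(1920, 1080, 1920, 1080));

        // With 64-pixel tiles, pixel (160, 96) sits at tile coordinate (2.5, 1.5).
        Console.WriteLine(PixelsToTiles(160, 96, 64));
    }
}
```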
  12. Good C++ is always faster than good C# or Java. However, the point I was trying to make about game engines is that even if you write your game logic in C#, it is very rare for most of your execution time to be game logic, and the actual engine core code is usually written in C++. Compare this to the mobile browser wars of earlier this decade: Android always boasted a faster JavaScript engine than iOS, yet mobile Safari always blew it away in terms of performance. Why? Because when dealing with multimedia websites, JavaScript was rarely the performance bottleneck; it was the browser's rendering engine which was doing all the heavy lifting. Game engine performance tends to work in much the same way. However, if you are one of the rare few doing cutting-edge CPU work in your game, then you should be using C++. It's not about "no heap allocation whatsoever", but rather about trying not to do any real-time heap allocations. Try not to call "new/malloc" when your game is running. If you are using C++, try not to call "delete/free" when your game is running. Just preallocate as much as you can at the beginning and recycle it as much as you can. For example: I wrote a rather complex pathfinding algorithm for an RTS. The performance more than doubled when I "recycled" my memory structures when I was done with them, so subsequent calls would just reinitialise the memory structures from the previous calls.
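    As a sketch of what that "recycle instead of reallocate" approach can look like in C# (the collection names and sizes are illustrative, not the actual RTS code):

```csharp
// The big search structures are allocated once and cleared between calls,
// so repeated pathfinding searches cause no per-call heap allocations.
using System;
using System.Collections.Generic;

class PathfinderScratch
{
    // Allocated once, reused by every search.
    readonly List<int> openList = new List<int>(capacity: 4096);
    readonly HashSet<int> closedSet = new HashSet<int>();
    readonly Dictionary<int, int> cameFrom = new Dictionary<int, int>(4096);

    public List<int> ResultPath { get; } = new List<int>(256);

    public void BeginSearch()
    {
        // Reinitialise the structures left over from the previous call.
        openList.Clear();
        closedSet.Clear();
        cameFrom.Clear();
        ResultPath.Clear();
    }
}

static class PathfinderDemo
{
    static void Main()
    {
        var scratch = new PathfinderScratch();
        for (int i = 0; i < 100; i++)        // 100 searches, zero per-call allocations
        {
            scratch.BeginSearch();
            scratch.ResultPath.Add(i);       // stand-in for the real search writing output
        }
        Console.WriteLine("Ran 100 searches reusing the same buffers.");
    }
}
```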
  13. If it's only the GC that's bothering you, you can circumvent this problem by not using the "new" keyword too often and recycling your classes. Using "new" in C++ isn't such a great idea either, because it will lead to a lot of memory fragmentation. I tend to try and avoid heap memory allocation in real-time games altogether. In Unity's C#, for example, "new Vector3" does not create a reference. You are completely right. But what kind of game are you making (and on what hardware?) where this is a real performance issue? Today's smartphones are powerful enough to overcome most of this, and the really core performance code in engines is written in C++ anyway. Whether you are using Unity or Unreal, neither engine's core functions are written in a scripting language. While you might be coding for some extremely low-spec device, I doubt any of the engines target such devices, so I assume you are talking about smartphones here. And with smartphones, it's usually the GPU's fragment shaders that set the performance bottleneck. With CPUs, unless you are writing your own physics or some really fancy AI, I doubt any of this would matter. Unless you are not optimising your code (making too many real-time memory allocations and such). But you can write bad code in any language...
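    For the "recycle your classes instead of calling new at runtime" advice, a simple generic object pool is the usual pattern. A minimal, engine-agnostic sketch (the Bullet class and pool size are just examples):

```csharp
// Preallocate objects up front and hand them out at runtime, so gameplay code
// never triggers a garbage collection by calling "new".
using System;
using System.Collections.Generic;

class Pool<T> where T : class, new()
{
    readonly Stack<T> free = new Stack<T>();

    public Pool(int preallocate)
    {
        // Pay the allocation cost up front, e.g. during a loading screen.
        for (int i = 0; i < preallocate; i++) free.Push(new T());
    }

    public T Get() => free.Count > 0 ? free.Pop() : new T(); // fallback allocation
    public void Return(T item) => free.Push(item);           // caller resets state
}

class Bullet { public float X, Y, Speed; }

static class PoolDemo
{
    static void Main()
    {
        var bullets = new Pool<Bullet>(preallocate: 256);

        var b = bullets.Get();       // no heap allocation during gameplay
        b.X = 0; b.Y = 0; b.Speed = 12f;
        // ... bullet flies, hits something ...
        bullets.Return(b);           // recycled for the next shot

        Console.WriteLine("Pooled bullet recycled without a runtime allocation.");
    }
}
```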
  14. The best code completion and refactoring tools for the other popular languages (JavaScript [WebStorm], Python [PyCharm], C++ [Visual Assist], etc.) do not come close to what you get for C# and Java. In C# and Java I sometimes create whole classes and interfaces just by using automatic refactoring tools. In C++, for example, refactoring doesn't yield the same trusted result, and usually fails for all but the most trivial refactoring tasks. The same goes for auto-completion. I think the preprocessor is to blame for this, and many engines make heavy use of preprocessor directives.
  15. This is your problem, not Godot's. I don't want to sound mean, but if you're having trouble with Godot's scripting language then you aren't going to have a fun time with any engine. One of the great things with Unity (when you use it with Visual Studio) is the great IntelliSense support. I rarely have to look up an API on the internet; it's usually all there when I press "Ctrl + Space". This is one of the reasons I prefer C# (Unity) to other scripting languages (C++ [Unreal] or custom languages). C# is great for IDE functions, and therefore easier to learn (while I am a good C# programmer, I don't necessarily remember all of Unity's specific API functions). This is an example of a "good" language property of C# and Java compared to other languages: they are much more "well defined" and thus easier for an IDE to parse. Therefore, you can get the IDE to analyse your code for you, and even write code for you (automatic method creation and extraction). They will always win in this regard compared to Javascript/Python/C++. The number of times I have to open a browser and google something when I work with Unreal is much larger, because C++ is harder for the IDE to parse.
  16. It's true that in 2006 that model gave the best graphics card a run for its money, and today you could render 10 of them in-game at the same time on a mid-range card. However, it was not the same as witnessing Doom3's lighting for the first time. Doom3's lighting looked like nothing you had ever seen in real time before. Do most games look like the GIF you posted? I'd argue that most games wouldn't put in the time to create that face artistically. On the other hand, I don't think the original Quake had a lot of "art" in it. It was more about optimising rendering techniques which had not been seen in real time before. I'm not against artwork or anything like that 🙂 . It's just that as a programmer with a life-long infatuation with graphics, I don't get as excited by this tech anymore. Maybe I'm just old, but then again I don't remember any game marketing heavily on its engine. Maybe "No Man's Sky", but their engine's uniqueness didn't lie in its graphics.
  17. As a gamer I prefer gameplay to graphics. Heck, as an "old" gamer, I sometimes play some of my old terrible-looking favourites just because I like the gameplay. However... as a hobby engine programmer I get really excited by graphics technology. The first 3D games that I played (Wolfenstein and Ultima) made my mind explode. I had to figure out how they were made! It resulted in me learning a lot of math and building several software-based 3D engines. By far, this is the reason I am a programmer today. So there is nothing like a cool programming technique to get me interested in a game. These are the games that really geeked me out when I saw them. When I call them "first", it's the first games I saw as a kid; it doesn't mean they were the first to use a technique.
    * Wolfenstein: first "3D" game I played
    * Novalogic's Comanche: first time I saw voxel graphics (it was beautiful!)
    * Seventh Guest: first fully rendered, fully animated environment (this made me learn 3D modelling software)
    * Quake2: first 3D game I ran with H/W acceleration
    * GTA 3: first game with a seamless 3D open world
    * Doom3: first game with modern lighting techniques
    * Crysis: first game with modern shader code
    These games are not necessarily my favourite games to play, but as a programmer each of them caused me to pick up a book or create my own version of them. Unfortunately for my programming self, most games today invest in "art" rather than graphics. They create bigger and more impressive set pieces, but I haven't really encountered any revolutionary rendering system that knocked my socks off. I sort of get the feeling that the 3D graphics revolution is over. We are now more limited by what artists can create than by what the computer can render. For example, creating realistic faces today is more about putting in the 3D modelling work than inventing new rendering techniques. This nvidia demo from 2006 looks good enough today (12 years later). The reason most games don't look like this isn't missing graphics tech; rather, it's that it requires a daunting amount of artwork. So I think since DX10 (when the pipelines became flexible) we are at an age where art is more important than graphics. This is sort of reminiscent of 1990s CG movies like Terminator 2, where viewers flocked to theatres to see what the latest CG could do. Whereas today (I'd say since ~ Lord of the Rings) if you notice the CG it means that it's bad. I actually liked those parts in the "Last Jedi" where they were CG'ing the admiral and the princess (badly). It made me feel like a kid again, and someone was finally taking some risks with some new graphics technology. I am tempted to say that no more GFX revolutions await us in gaming. But I wonder what will be the next revolutionary tech? Fully ray-traced games? Something else? I personally have not gotten excited about a game engine since "Crysis".
  18. SillyCow

    Defining AAA

    http://www.mobygames.com/game/windows/stalker-shadow-of-chernobyl/credits This team does not look tiny to me. For god's sake they have the "Prague Symphonic Orchestra" making their music for them! This is the difference between an indie and a funded studio (I will not go as far as calling GSC "AAA"). Both have 3-4 lead engine programmers. But the funded studio will hire an intern to make the particles look slightly better.
  19. SillyCow

    Defining AAA

    Here's a thought: the current discontent over indie vs AAA has nothing to do with the quality of either. It has more to do with mainstream attention. AAAs have made the entry barrier so high that your chances of getting a spotlight as a small studio have plummeted. It's not that AAAs have gotten worse. It's not that small games have gotten better. It's just that the percentage of household brands which are also "indie" (or "developed in a garage", as one might say in the 80s) has dropped. That is because a household name means you are one of the top 20(?) grossing games in your category. $1M cut it in the past, but today you need to make $50M to be in the top selling charts. The indie game can still make $1M, just as before. It's just that making $1M is considered non-mainstream today. It will amount to a game which only a small percentage of gamers have played. One might argue that quantitatively the number of players remains the same; it's just that selling 10k copies today isn't considered much.
  20. SillyCow

    Defining AAA

    I would define this as the time when small passionate teams (sometimes one-man teams) accounted for gaming blockbusters. This was a time when no one thought you could ever make billions from video games, so no one thought a multimillion dev budget would be justifiable. Think about a hit like Prince of Persia. It was a game with unbelievable production value for its time. I'm not talking about innovation, just production value. And it was largely made by one entry-level programmer. I think in those "romantic" times it was expected that a small passionate indie team *could* publish a polished hit and compete with the big guys. It was probably the same earlier on (in the Atari period), but I started gaming in the 80s, so I am not sure. Today I don't think any indie thinks they can compete with the production value of a AAA title. Sure, you can strike gold with innovation (ex: Minecraft), but there will be no indie "Call of Duty". If you compare blockbuster indie developers Crytek to ID software, you will see that Crytek needed a much bigger team to make a blockbuster than ID did. Whereas ID were just "a couple of kids", the very talented Yerli brothers needed an established team around them (and a AAA publisher) to break out.
  21. SillyCow

    Motivation

    I use the drunken self-messaging method from "How I Met Your Mother". (The drunken characters would leave answering machine messages to themselves when drunk so that they would remember important stuff in the morning.) I make sure to leave an intentional error where I left off in the code. A lot of times that error would be a self-directed comment without the "//" prefix. That way I can immediately find my place in the code and continue working. Something like: int a=math.pow(a,2); Note to self: squared function makes physics look bad, try something else //this will not compile :-) Then all I need to do is start up my project and get a compilation error on a certain line. And seeing that red compiler message immediately gets me motivated to keep working!
  22. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    I also don't get all the anti-IDE hate. I see it in "the younger generation", whether in gamedev or a regular dev job. It's become very fashionable to minimise your tool chain. I don't get it; I feel old 😞 ... I'm always confused when I see my teammates using text editors to code. But this is definitely a growing trend.
  23. SillyCow

    Defining AAA

    I would argue that "nowadays" might be wrong. In the romantic days of gaming there was no big budget involved (except for the console market). What defines our era is that most game sales revenue is AAA (for good and for bad). I specifically remember when COD MW II made more money than any summer blockbuster. That was the moment I understood that the gaming industry had changed. It had made the transition into an "industry". Here you go. I swear that I googled this link only after I wrote my reply above, but look who's on top 🙂 . And also look at how much more they spent on marketing! And regardless of whether you think it's a good game or not, it proves that AAA marketing pays off in the end. https://en.wikipedia.org/wiki/List_of_most_expensive_video_games_to_develop
  24. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    But there is a very valid reason to bring up Unreal: if Unity 2017 is a middleweight, Unreal is a heavyweight engine. I think I may have misused the word "heavy". I didn't mean feature-wise, just the footprint. The Unreal Editor is slower than Unity (not talking about the actual resulting game performance), takes up more space, uses more memory, and requires a stronger machine. I am not talking about the number of features or the resulting game quality, just what kind of specs you need to gamedev with it.
  25. SillyCow

    Unity dropping Monodevelop a let down for small indie?

    Visual Studio is an awesome dev tool if you are using C#. I always prefer Visual Studio over MonoDevelop. The only time I used MonoDevelop was when I had to develop stuff on a Mac. By the way, what's Unity's solution for Mac and Linux in that regard? "Visual Studio Code" is not Visual Studio. It doesn't have half the features or customisability that "Visual Studio Community" has. It's actually more similar to MonoDevelop in that respect. You should not bring Unreal into this conversation. Besides being a much "heavier" engine, Unreal is C++-based. C++ requires a much more powerful IDE than C# or Javascript to get stuff done. I haven't encountered a lightweight C++ IDE yet (just text editors). Because of the cross-module dependencies created by the preprocessor, C++ is much more complex for an IDE to understand. So when I work with C++ in Unreal, I don't dare use anything other than Visual Studio. There are many valid reasons to use Unreal, but none of them are "small editor footprint" or "easier coding experience".