

Member Since 26 Feb 2007
Offline Last Active Today, 02:49 AM

#5206047 Lone Poor developer protected from Mega companies

Posted by Ravyne on 22 January 2015 - 02:37 PM

Actually I have gone beyond the business-plan stage...
I mean, with your idea you haven't got the funds to fully develop and mass-market it. So, with the aim of developing it further in versions/sequels, you release a watered-down version (so the idea is out), which means the rich guys (big companies) with the huge marketing resources see the idea and your toil becomes their gain.


So don't make a watered-down version of your game if you're that worried about it. I think this is actually not as much a threat as you make it out to be, but in any case trying to make a big splash against the AAAs by playing their own game is very much a case of your reach exceeding your grasp -- unless you can get someone to give you enough money to play that game effectively.


But you're falling into a false dichotomy, because you don't have to go big-budget or go low-budget. You could make a different game, and thereby build experience and perhaps earn a nest egg with which to take on a bigger idea.


These days cool ideas come about (in addition to through intelligence and creativity) by accident, by experiment, or both; otherwise they would be obvious and would have been done already.
If you don't agree, you are saying the rest of the world is not as smart (or maybe is dumb) -- in other words, you are saying "I can easily see what others cannot see."


Think of all the thoughts you've had and never executed on -- and not just unexecuted because the idea was bad, but unexecuted because life stands in your way, or because watching your favorite television show is just too important to give up in pursuit of an idea. You have more ideas, I'm sure, than you would ever get to in 10 lifetimes. Everyone is like this. Follow-through is far more important than an idea, and it's far rarer. Many good ideas will have been thought of already (at least in broad strokes), and some ideas do have a kind of latent value, but no idea has extant value until it's made real and shared.

#5206042 Options for GPU debugging DX11 on Windows 7

Posted by Ravyne on 22 January 2015 - 02:17 PM

Can you switch from Express to VS2013 Community? With the Express SKUs a decision was made not to support graphics diagnostics for developers of desktop apps, but VS2013 Community supports all forms of development and all the tools -- Community is essentially Pro with some license restrictions to prevent larger commercial entities and enterprises from using it. You should check the terms for yourself, but in essence Community is free for evaluation, open-source contribution, and for teams of up to 5, and yes, you can use it commercially if you fit under those restrictions. You *can't* use it if you're a development shop larger than 5 people, or are part of an "enterprise" as Microsoft defines it (I forget exactly how it's defined, but it's either making millions of dollars per year or having some large number (hundreds?) of employees).


Community, as a pro-derived SKU, also supports the rest of the advanced features that weren't necessarily in Express, and also supports plugins, which the express SKUs don't. If you're a Unity3D user, this means you can use Visual Studio Tools for Unity (formerly UnityVS) entirely free now.

#5206034 Encapsulation through anonymous namespaces

Posted by Ravyne on 22 January 2015 - 01:32 PM

Another point worth making here is that you seem to have gone down this path because, firstly, "everything is a class" (which leads to singleton thinking), and secondly, you had some sense that having non-member functions as part of the interface was somehow icky (hence wanting to put at least some of them in this anonymous namespace).


You're correct in the first case that "everything is a class" is generally harmful -- its being a requirement is the biggest part of my distaste for Java -- but you're mistaken that non-member functions are bad or harmful. In fact, you should prefer in general to implement any "method" of a class that you can as a non-member, non-friend function, because it decreases coupling, reduces the size of the class's essential interface (which is especially good when that class might be inherited from), and encourages less code duplication. As an example, I have a set of classes for vectors and matrices -- only the operators and usual arithmetic functions that modify the self-same instance (+=, *=, normalize(), etc.) are actual member functions; the non-modifying versions of all of those functions are non-member, non-friend functions in the parent namespace, and they're all implemented in terms of the self-modifying operators using the canonical pattern. These non-member, non-friend functions in the same namespace as the class they relate to are a part of the class's interface every bit as much as its member functions, and Koenig lookup (argument-dependent lookup) ensures that the right thing happens with respect to overload resolution and such.


Anonymous namespaces are useful when you want to restrict visibility to the file, as you've identified. It's the right tool if that's the thing you want. But in the general case, when you have a free function that deals with a particular class, you do want it to live in the same namespace as that class; sometimes you might want to put such functions that are infrequently used and numerous into a different namespace (such as a 'utilities' namespace), or into several namespaces demarcated by problem domain, although the inline namespaces feature new in C++11 might be a better choice there.


Also, as others have said, any state in an anonymous namespace is still shared state -- it's a less-global global, but a global in some measure just the same. It suffers all the same ills as any global; it's just that its effects are generally limited to influencing less code. Be cautious, however, of the fact that just because something has file scope and can't be directly touched by outside code does not mean it's entirely hidden: if a change to its state can be observed indirectly through an interface that's available to outside code, then it still has an effect outside what you would assume its sphere of influence to be (the same is true of private member variables). Again, this is not a bad thing, and more-hidden is almost always better than less-hidden -- it's just that usually what we want from hiding something is not hiding for its own sake, but hiding as a means of maintaining invariants. Private members achieve this at the granularity of a class, things with (private) namespace scope achieve it at the level of a namespace, and things with file scope achieve it at the granularity of a file (which might define a system of otherwise unrelated classes).

#5205894 Lone Poor developer protected from Mega companies

Posted by Ravyne on 21 January 2015 - 06:54 PM

You won't beat the big guys by playing their game -- they're expertly structured to thrive in that competitive environment and have the resources necessary to do so. But, as in evolution, they're a kind of creature that's over-specialized at doing what they do such that they have trouble doing anything else -- they rule the oceans, but they walk dry land about as well as you'd expect of a fish.


So build your camp on dry land and do the things that a big studio cannot do. You'll face other competition for sure, but at least you'll be playing a game you can make a reasonable go at.

#5205891 Should I leave the current company and take the risk?

Posted by Ravyne on 21 January 2015 - 06:44 PM

It's not a personal affront -- they had work that needed to be done, and they had to map the resources they have (that's you and your coworkers) to the work that needs doing to ship a product. It's that simple. It's every bit as likely that they chose you because they thought you'd be better at the server stuff than your coworkers as it is that they chose you because they thought your coworkers are better than you at the client stuff. More likely still is that they considered the sum total of effort vs. productivity given all the combinations of assignments they could make. Let's say your coworker -- let's call him Bill -- does 1 unit of work on the client side, but would only do 0.8 units of work server-side; and let's also say, for the sake of argument, that you are objectively better than Bill at both, doing 1.2 units of work on either the client or the server side. If your manager places Bill on the server work and you on the client work, the two of you generate 2 work units total. On the other hand, if Bill stays on the client where he's more competent, and you go to the server where you do just as well as on the client and certainly better than Bill would, the two of you generate 2.2 units of work. A manager would have to be dead between the ears not to take that 10% productivity advantage. Again, I think your attachment to the notion that you should always get to do the work that most interests you is coloring how you see things.


Managers can sometimes be daft, but you can always trust that they are doing the thing they think leads to quickest/cheapest success, if for no other reason than self-interest. If you're contributing to their success, then they'll put you where you're most useful -- that's how they get noticed, get raises, get promoted -- and even bad managers know to reward those who keep them in success almost all of the time. 

#5205843 Should I leave the current company and take the risk?

Posted by Ravyne on 21 January 2015 - 02:56 PM

You should worry a little about why those two guys are leaving, especially at the same time, and especially about why the employer wouldn't or couldn't move mountains to get at least one of them to stay -- because even in big companies, the teams themselves are usually small, and losing two out of even a dozen team members is a significant blow unless they weren't contributing at all.


But this isn't a binary decision. It sounds like you have your current job for as long as you want to keep it and can put up with doing work that's not as fulfilling as you might like. That gives you time, which is a force multiplier when it comes to having options. Your choice is not "Do I stay or do I jump to the AR company?"; your choice is "Do I stay, do I jump to the AR company, or do I make do where I am while I take my time looking for a better opportunity?" Time is always the most valuable asset you have when making a decision; don't be so quick to throw it away in a panic.

#5205831 How to get a job in Graphic programming?

Posted by Ravyne on 21 January 2015 - 02:20 PM

You probably need to re-aim your portfolio -- accepting that jumping straight into a graphics programmer role is unlikely, landing a more attainable entry job like gameplay programmer isn't going to happen with a portfolio that only shows your graphics chops. It's always good to show well-roundedness, but every employer wants to know that you have deeper capability in the particular area they can fit you into. Valve describes their ideal employee as being "T-shaped", by which they mean an employee who knows a good bit about every subject and knows everything about one particular subject; and while Valve is known best for saying it, it's my observation that this is really almost everyone's ideal employee.

#5205826 Should I leave the current company and take the risk?

Posted by Ravyne on 21 January 2015 - 02:02 PM

To be perfectly blunt, it sounds like something in this working relationship has soured already -- whether it's your manager, this other programmer, or yourself, I don't know. One thing I pick up on, though, is that you seem very attached to the idea of working on the things that interest you -- and while that is normal, healthy, and good in your own hobbies and in the effort you put into growing and expanding your skills, that's not the reality of how any job anywhere works. You were hired to do a portion of the work that needs doing, because the business of your company is getting products out the door -- always keep in mind that shipping products is what pays your salary at the end of the day, and you can't buy bread with your job satisfaction. Think hard and be honest with yourself about whether you are part of (or perhaps the whole of) the problem. Are you really as skilled as you think? Should you rightfully earn the higher wages some of your (soon-to-be) coworkers do? Are you able to execute in the areas you are interested in better than the coworkers assigned to them? Have you paid your dues and earned the right of first refusal to the work that interests you? (Hint on that last one: you've been there a year; you haven't.)


Now, I'm not saying you should always stick with an unhappy but stable job -- I did that once, in my first job out of college, and ultimately my enthusiasm waned, my performance suffered, and my manager's opinion of me soured to the point that I was not only held responsible for my own shortcomings at the time but became the scapegoat for my direct supervisor's mistakes as well. Who was to blame in exactly what measure is mostly irrelevant, except that I acknowledge and accept my part; the combined environment of that particular workplace, my coworkers, and myself was simply toxic. I was shown the door, but it was business -- nothing personal -- and even though it was a blow to my ego at the time, I was also happy to be free, and I quickly found happier work that even came with a pay raise. Happily, I've not experienced that anyplace else, and everyone since seems happy with my contributions (plus, I take a little solace in knowing that supervisor was let go about a year later). You might be at a junction similar to my own, but life is always a balance in all things, and a very personal one; only you can make the final call.

#5205604 Best Computer Type for Game Dev?

Posted by Ravyne on 20 January 2015 - 01:20 PM

For your needs, just about any off-the-shelf computer will do. The only thing I would really look out for, in a desktop, is to do some research and make sure you get a standard ATX power supply, rather than something proprietary like some of the less-expensive manufacturers use. The reason is that oftentimes these proprietary units are low-wattage and lack the additional power connectors that even some lower-end discrete GPUs require. If it's a standard size and layout, at least you can replace it should you need to. Luckily, proprietary PSUs are less common than they once were, but you still have to be aware of them. Doubly lucky, many mainstream OEM-style PSUs will supply enough additional GPU power connectors to drive at least one middle-of-the-road GPU.


Other than that, shoot for at least 8GB of RAM, and get a decent SSD if you can afford the luxury -- even a smallish (128 GB) one if that's all you can afford. Both are relatively small costs, but they are the best things you can do to increase your computer's perceived performance. Oftentimes, when we feel that our computers are "being slow", it's because the system is waiting on its slowest component, the hard disk -- either because you're accessing files, or because you're using more RAM than your computer has and the OS is swapping pages of memory out to the hard disk.


For myself, I usually build from scratch, or buy a base workstation (as in, a computer for serious work, not a consumer PC) and make the upgrades I want. If you're feeling adventurous and maybe have a technical friend who can help you out, a self-build might be worthwhile if you have a local place where you can get components for fair prices. Newegg is great too, but it's a lot more hassle to deal with if you happen to get a faulty component, so I don't recommend it for an inexperienced builder.


If you're in the market for a laptop, the biggest thing is getting a good-quality screen that's a comfortable size and resolution. Like Alessio said, 1366x768 is miserable to work on -- 1600x900 is passable, but 1080p (1920x1080) is better. For me, a 14" screen is the ideal compromise of screen size vs. computer size, but right now I've got a 15.6" laptop (1080p) because I wanted hardware you really can't find in a smaller machine today, let alone two years ago when I bought it. You'll also find some very-high-resolution displays nowadays -- they're nice because you get really sharp text and images, but at small physical screen sizes more pixels don't translate to more usable desktop space, because the user interface becomes impractically small; in terms of usable space, scale those high resolutions down to a half or two-thirds to get a sense of how much usable workspace you get. You'll also want to be wary that you can't add more RAM to some laptops today because it's soldered in to save space (mostly a problem on very thin computers, and on all Apple computers regardless of size) -- you have to buy the amount you want up front and be content with it. The next biggest factor is the keyboard and pointing device -- you can use an external keyboard and mouse (and display) at home, but you won't have those at the coffee shop, and even just a mouse is a no-go in cramped spaces.

#5204749 What is a good average vertex-per-face count?

Posted by Ravyne on 16 January 2015 - 12:38 PM

Think of it like this -- your goal in making greater use of the vertex cache is to avoid paying the cost of transforming another vertex, right? You can also avoid that cost simply by not adding another vertex to the model -- in fact, fewer vertices are better because they increase the relative occupancy of useful verts in the cache. Getting the most visual fidelity from the fewest verts is the first-order optimization; it'll give you the most bang for your buck.


The vertex cache is there to optimize the GPU's execution of whatever mesh it's handed. It's good to consider cache behavior, but it's a second-order optimization at best. You're just so much more likely to be hosed by having too many verts, or state changes, or API overhead, than by poor vertex cache utilization -- and the workflow to exercise any control over it is so time-consuming -- that it's not really worth thinking about.


Your tools should try to emit good patches/strips, and your artists should be vaguely aware of things they shouldn't do because it will prevent the tools from doing a good job.


If all you're really asking for is a means to identify outliers to investigate for iteration, you already have it: an outlier is the model whose vert/face ratio is unusually high compared to similar models.

#5204556 I am beginning to hate the IT and gaming industry.

Posted by Ravyne on 15 January 2015 - 03:28 PM

I don't think it does anyone any good to espouse that the game industry is stable or "fair" -- it operates increasingly like a movie production, where only core staff have any reasonable expectation of being retained until the next project; the other 85% or more either have a next gig to fall back on, or they don't. Most studios aren't large enough to run multiple projects concurrently, thus the time when they need the most people (pre-launch crunch) is immediately followed by the period when they need the fewest (conceptualization and early work on the next project). The number of active employees at these single-game studios is very cyclical. At least the modern ability/trend to patch titles and offer DLC keeps some of those folks in a job for a bit longer, so it's not quite the cliff it would be otherwise. Also, some perpetually updated titles like League of Legends exempt themselves by having a steadier flow of work and income over long stretches.


But I would say that most game industry employees don't know whether they'll have a job when their current project ends, and this, combined with the number of newcomers wanting to break into the industry and fewer jobs than people wanting them, all conspires to make it really hard to get a job inside -- for newcomers especially, but even for those with experience. Longish periods of unemployment, or stories of taking months or years to break in, are not uncommon.


Now, lots of "sexy" jobs are like this, so I don't mean to overly vilify the industry -- save those studios that abuse their buyer's market, the industry has become what it is because that's just the model that has evolved to maximize profit (keep in mind that the largest publishers/studios are publicly traded, and that even those who aren't mostly have to compete with or appeal to those who are, forcing most players to follow the same practices one way or another). In some ways I think the game industry gets the most negative press over this because it's the most attainable career out of the "sexy" jobs that operate in roughly the same way (movies, TV, music, etc.).


To Shogun -- keep trying to achieve whatever path it is that you want. You can only do your best, and try to accept that at the end of the day the hiring decision is out of your hands, and is mercurial at best. It's frustrating to be turned down, and it can be scary to be without work (like many, I've spent a 10-month stint unemployed too). Good luck. The only advice I can offer is: don't compromise or sell yourself short out of frustration -- in the end it will only make you feel less valued.

#5204541 What is a good average vertex-per-face count?

Posted by Ravyne on 15 January 2015 - 02:48 PM

I don't think anyone is saying that the number of shared vertices doesn't have an impact; I think they're saying that it's not a factor you can control for -- you can't go to your artists and say something like "I need you guys to make sure you share more vertices" and expect anything other than an incredulous stare. The sharing factor is essentially a function of the subject matter and the level of detail, assuming the artists (or model-processing tools) aren't doing anything incompetent.


As a thought experiment, consider a simple cube: 12 faces and 8 vertices is 0.66... verts/face. You can get a much lower ratio by adding a vertex to the center of each side of the cube: 24 faces and 14 vertices is 0.583... verts/face. So you can achieve more sharing of vertices in a cube, but only by adding vertices, which is bad. Going the other direction, if you take away even one vertex from the original cube, it's no longer a cube. Thus the conclusion: more sharing is just more sharing; it does not produce an optimal model. 0.66... is the optimal sharing ratio for a cube -- other kinds of shapes have different optimal ratios.


My gut instinct tells me that for any given subject matter (person, machine, etc.) and given level of detail (that is, total vertex count), the law of averages says the model will converge towards the ratio that's optimal for those parameters -- again, assuming the artist and processing tools aren't incompetent.


As to the cache, yes, you want to make effective use of it, but you need to understand how it works -- it doesn't remember every vertex that's been transformed anywhere in the model. It's been a long time since I've investigated, but last I remember the cache was only 16 verts deep; you don't benefit from the cache at all if you come back to a vertex 17 indices later. While it is the case that more shared vertices are likely to exercise the cache more, that doesn't necessarily translate to better overall performance, because you might be gaining that sharing at the cost of simply having more vertices, each of which has to be processed at least once. That's why the general approach to optimizing a mesh is to reduce vertex count rather than trying to increase sharing.

#5204349 How to validate restored purchases?

Posted by Ravyne on 14 January 2015 - 06:35 PM

A simple solution might be to cap the number of times the server will validate a given receipt without intervention from you -- say, 5 or 10 times. That way, if a group posts a receipt in order to attack you, only the first few attempts will succeed; thereafter the crack will fail, and it will likely be downvoted or forgotten on whatever site it was posted. Now, it's still possible that a user might legitimately want to install your app on that many devices over the lifetime of the app, so when the server finds the number of validations has been exceeded, pop up a message box with instructions (or better yet, a template) for contacting you to lodge a support ticket to unstick their validation. That gives you a chance to intervene in mass piracy while not costing you too much developer effort, or the customer too much trouble -- they write you an email, you make sure their story checks out, and you reset the validation counter in your database. If you see a thousand requests against that receipt, you know it's been compromised and can react accordingly. This incurs some work on your part to intervene, but the difference between (large-scale) piracy and legitimate users will be obvious -- and if your game should ever be so popular that the volume of this work overwhelms you, you'll probably be wealthy enough to hire the help.


A skilled cracker will be able to side-step all of this, but IMO the above describes about the right level of effort to go to in stemming the tide of casual piracy.



In another approach: even if your game has no multiplayer content, if you can provide *some* kind of service (friends, leaderboards, "cloud" saves, simple online inspection of their character) to online account holders, that can still incentivise them to create an account -- and a fringe benefit is that it simplifies this whole validation business by giving you an account to tie them to.

#5204094 When you realize how dumb a bug is...

Posted by Ravyne on 13 January 2015 - 06:40 PM

My "best" bug happened way back, in a compiled version of QuickBasic (4.5, for those who remember). I had well exceeded what most bedroom QB coders had done in terms of volume and sophistication (not to say I was a genius, only that few people were content pushing QB rather than moving on to, say, C). Anyway, my source code had grown too large for the DOS-based editor to keep it all in memory, so I started writing code in Windows using a programmer's editor and a make-like build system someone in the QB community had written to solve this problem.


One day I had made a large changeset to convert my tile engine from a grown-too-large switch statement to indexing my tiles via pointers into a big memory buffer -- sounds simple, but pointers weren't really a concept in QB originally; they had been added in v4.5 (or maybe v3.0), and these were DOS 16-bit pointers, where you had to juggle code and data segments separately, in sync with whatever you were doing with the pointers. I was maybe 14-ish at the time, and I knew a little bit about 32-bit pointers from C, but the segments and near/far pointers of DOS were new to me. This was the Windows 95 era, so Windows at the time was still intimately connected with 16-bit DOS in some ways, but was supposed to be isolated at least to the point that Windows would remain standing even if a DOS app crashed.


Whatever changes I had made that day proved the isolation theory wrong. I had introduced a bug that instantly blue-screened Windows, bringing down the whole computer. A proper debugger was an unknown luxury, so you guessed and checked your way to a solution -- logging mileposts and conditions to a file, commenting out sections of code, or even counting beeps (which was only scalable if you knew the ballpark your bug was in). I narrowed it down within an hour (several crash/reboot cycles later) to the lines of code that were setting up segments and pointers. There was a pair of lines, repeated, to set up each of the 4-5 segments of memory I was using and then compute the pointer within it, given an index. I narrowed it down to one of these pairs -- now, the pairs themselves were identical save for which segment they were modifying, and as I remember it the API was such that they couldn't be refactored into a function, so I had just retyped the lines several times.


So I compare the lines in the pair to each other and everything lines up. I look at the pairs above and below, and each pair also lines up with each other. The surrounding pairs also work just swell. The program doesn't crash when I comment out the offending pair. I spend my evenings for 3-4 frustrating days researching segments, and pointers, theorizing, testing, failing -- wondering what could be causing the one pair to fail when the others worked. Everything lined up, or so I thought.


It turns out that every pair did line up internally, but the offending pair failed to match the working pairs in just one way: in the others I had done the pointer arithmetic correctly with an addition operator, and in the failing pair I had mistakenly transcribed a multiplication operator in its place. I didn't realize this until, as a last resort, I compared the offending lines character by character with the preceding, working, pair. Days wasted. Fucking Eureka's lament. Wherever the pointer landed was somewhere in the weeds that Windows, or DOS, or something in between didn't much like.

#5204045 Code appearance, is it really important?

Posted by Ravyne on 13 January 2015 - 03:55 PM

Code should be these things, in this order: correct, clear, fast -- at least for most of its life. Sometimes, when a project approaches its launch date and there's still some performance lacking, you can trade away clarity (and sometimes even some correctness) for speed, but you never do that until you need to, and you never trade away more than is necessary.


The code you posted looks to do the same thing as your friend's, and if your solutions are correct (first), you're starting off just fine. Your friend's code is indeed clearer, because he does in one expression what it takes you 5 lines and a loop to do. To understand what your code does, a reader has to parse and mentally execute those 5 lines, doing enough loops in their head to be satisfied they understand any loop-wise variations (here, which index receives the new random number); to understand your friend's code, a reader only has to know what the random.sample method does, and because they know that the implementation of random.sample is sound, they only need to check its parameters for potential errors.


To be clear, it's not always about having the fewest expressions -- usually that's a good metric for clarity, but it's possible to take brevity so far that it actually obscures clarity. A better guide is this: whenever you write an expression (or series of expressions) that maps to a concept in your problem domain, strongly consider making it a function -- if it contains more expressions/statements than the number of times you use it throughout your program, it should definitely* be a function. For common problems, it's almost certain that someone else has already done that work for you, and you should use that instead, especially if it's part of your language's standard library or is already provided by another library dependency you've already taken.


*definitely: There are no concrete rules, so 100% definitely is not strictly accurate. I mean to express something less than always but well north of almost-always.