
#5308537 Visual Studio Hardware Requirements Seem Lower

Posted by on 29 August 2016 - 01:40 PM

Also, depending on where you live and whether you can wait for a good deal, you can really get a lot of bang for your buck. At least in the US, late fall seems to be a great time to buy, because computer sellers are blowing out old stock before the late-fall hardware refresh cycle. If you can wait, consider that, but don't hold yourself back if you can't. Sales typically start around a month before school starts back up, and again around Thanksgiving/Black Friday.


Lenovo seems to always have pretty great deals around that time, and makes excellent machines.

#5308534 Visual Studio Hardware Requirements Seem Lower

Posted by on 29 August 2016 - 01:31 PM

I want a really nice experience, no lag, etc.



I would use an external SSD with either.


It was said to get the most RAM and processing power for my money, so that's what I tried to do.  However, now I'm looking at the i3s and the money is less and the display is 17" and beyond.


What do you all recommend? Is the larger screen still much better, or is the smaller laptop screen (15.6") just what's used now, and just as good for the amount viewed?


Honestly, you're going to have to make some compromises at the price point you seem to be targeting. You can get a pretty decent computing experience out of a $350-$400 laptop these days, but it's not going to be the nicest experience.


The machines you linked to have low-resolution displays -- 1366x768 -- and that's really too low to be very productive with Visual Studio or other productivity tools, and the pixels will be quite chunky on a 15.6-inch screen. Plus, it's not screen size alone that gives you usable real estate, it's having a balance of screen size and resolution. I find that you really want 1920x1080 or better, and that it's a good fit for most screen sizes (though it's too many pixels for anything smaller than 13.3 inches, and even that is stretching it). 1600x900 is a good resolution for a 13.3" screen too, though I'd say it's the bare minimum for on-the-go productivity. Screen quality and viewing angles are also important considerations for your comfort and ergonomics.


You also want an internal SSD. You can get some pretty speedy external SSDs, but they're not inexpensive, and you could have an internal SSD for the same price. Try to find a laptop with a 128GB or larger SSD inside, or get one with a mechanical drive that you can easily swap yourself without voiding your warranty -- then buy an SSD and install it yourself. Afterwards, you can put the mechanical drive in an external USB enclosure and use it for extra space and backups. Be aware that if you install your own SSD this way, you'll need to jump through a couple of hoops to get your OS onto it, but it's doable.


In a laptop, screen size also directly affects portability. A 17-inch screen sounds good at first, but you might not feel that way after lugging the thing around for a day. If it's so heavy or bulky that you never want to move it, you might as well have built or bought a desktop instead, since you'll usually get more computer for the same money and have more options to expand and upgrade. If you're going to pay the laptop premium, it needs to be portable in practice, not just portable in theory.

#5308528 2D huge tile maps handling?

Posted by on 29 August 2016 - 01:01 PM

Tangletail explains the concept well. To implement this, you'll want a map data structure and file format that let you handle 'chunks' of the map as a unit, rather than trying to pull only what you want from a monolithic map structure (e.g. a big array of tiles in memory or on disk).


So, rather than one giant array of tiles, you want to break your map into regular pieces -- you can choose different sizes to suit your needs, but for now let's just say that chunks are 16x16 tiles (power-of-two squares have some nice properties that can be used to optimize some of the math you'll need to do). The chunks themselves are just smaller versions of the big array.
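
For illustration, here's a minimal C++ sketch (all names are hypothetical placeholders) of the coordinate math that a power-of-two chunk size buys you: splitting a global tile coordinate into a chunk coordinate and a local coordinate becomes a shift and a mask instead of a divide and a modulo.

// 16x16 tiles per chunk (a power of two); all names here are placeholders.
constexpr int CHUNK_SHIFT = 4;                 // log2(16)
constexpr int CHUNK_SIZE  = 1 << CHUNK_SHIFT;  // 16
constexpr int CHUNK_MASK  = CHUNK_SIZE - 1;    // 15

struct Tile  { unsigned char type = 0; };
struct Chunk { Tile tiles[CHUNK_SIZE][CHUNK_SIZE]; };

// Which chunk a global tile coordinate falls in, and where inside that chunk.
inline int ChunkCoord(int tile) { return tile >> CHUNK_SHIFT; }
inline int LocalCoord(int tile) { return tile & CHUNK_MASK; }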


On disk, your map file won't be a big array of tiles any more; it will be a smaller array of chunk references, plus a list of chunks. From disk, you load a whole chunk whenever any part of it is within the radius you care about.
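
As a concrete (and purely hypothetical) example of what that layout might look like on disk -- a small header, a table of chunk references, then the chunk records themselves:

#include <cstdint>

// Hypothetical on-disk layout; a chunk reference is just an offset into the
// file (or an 'empty' sentinel), so whole chunks can be read independently.
struct MapFileHeader {
    uint32_t chunkSize;       // tiles per side, e.g. 16
    uint32_t widthInChunks;
    uint32_t heightInChunks;
    // ...followed by widthInChunks * heightInChunks ChunkRef entries,
    // then the chunk records they point to.
};

struct ChunkRef {
    uint64_t fileOffset;      // where this chunk's tile data starts; 0 = empty
};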


In memory, your map structure is similar to the disk contents. One option is to load the entire array of chunk references into memory, and use it to determine which chunks are nearby and need to be loaded. If your maps are really big, you might increase the chunk size or even chunk up the array of chunk references itself (giving you two layers of indirection instead of just one). When a chunk is inside your area of interest, it needs to be in memory -- and when a chunk you've previously loaded goes outside the area of interest, you can reclaim that memory so it can be used for another chunk. This works really well with memory management pools: you set aside enough memory for the maximum number of chunks that fit inside your area of interest, and maybe a bit more, and then you just reuse those chunk structures over and over to hold different chunks as the player moves around. The chunks in memory don't have to be stored in map order; the array of chunk references takes care of spatial ordering.
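
A minimal sketch of that pooling idea, building on the Chunk type from the earlier sketch (again, every name here -- ChunkPool, LoadChunkFromDisk, and so on -- is hypothetical):

#include <cstddef>
#include <map>
#include <utility>
#include <vector>

class ChunkPool {
public:
    // Reserve enough slots for everything that can be inside the area of
    // interest at once, plus a little slack.
    explicit ChunkPool(std::size_t capacity) : storage(capacity) {
        for (Chunk& c : storage) freeSlots.push_back(&c);
    }

    // Called when a chunk enters the area of interest.
    Chunk* Acquire(int cx, int cy) {
        auto it = resident.find({cx, cy});
        if (it != resident.end()) return it->second;  // already loaded
        if (freeSlots.empty()) return nullptr;        // pool sized too small
        Chunk* slot = freeSlots.back();
        freeSlots.pop_back();
        LoadChunkFromDisk(cx, cy, *slot);             // hypothetical loader
        resident[{cx, cy}] = slot;
        return slot;
    }

    // Called when a chunk leaves the area of interest; its slot is recycled.
    void Release(int cx, int cy) {
        auto it = resident.find({cx, cy});
        if (it == resident.end()) return;
        freeSlots.push_back(it->second);
        resident.erase(it);
    }

private:
    static void LoadChunkFromDisk(int /*cx*/, int /*cy*/, Chunk& /*out*/) { /* ... */ }

    std::vector<Chunk> storage;                       // the pool itself
    std::vector<Chunk*> freeSlots;                    // unused slots
    std::map<std::pair<int, int>, Chunk*> resident;   // chunk coord -> slot
};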


This is all pretty straightforward, but you're dealing with a few different coordinate systems here (the camera view, the area of interest, the array of chunk references -- one per layer of chunking -- and the chunks themselves), so there are a lot of little details to get right.



Now, all that assumes that the world is sort of static -- if not, you'll need to additionally track which chunks the player has changed, and write those changes back to disk somehow, and you might need to do that per-player if they have independent worlds. 
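
One simple way to handle that, sketched against the hypothetical chunk types above (WriteChunkToDisk is likewise a placeholder), is a dirty flag per resident chunk that gets flushed before the slot is recycled:

// Hypothetical writer, defined elsewhere.
void WriteChunkToDisk(int cx, int cy, const Chunk& data);

struct TrackedChunk {
    Chunk data;
    bool  dirty = false;
};

// Mark the chunk dirty whenever the player edits one of its tiles.
void SetTile(TrackedChunk& chunk, int lx, int ly, Tile t) {
    chunk.data.tiles[ly][lx] = t;
    chunk.dirty = true;
}

// Before a chunk's slot is reused (or on save), flush it if it changed.
void FlushChunk(TrackedChunk& chunk, int cx, int cy) {
    if (chunk.dirty) {
        WriteChunkToDisk(cx, cy, chunk.data);
        chunk.dirty = false;
    }
}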

#5307987 Style preferences in for loop structure

Posted by on 26 August 2016 - 01:25 AM

I'm personally a fan of decrementing my for-loops rather than incrementing. So instead of this:

for(int i = 0; i < MAX; i ++) {
    // ...
}

There's this:

for(int i = MAX; i --> 0;) {
    // ...
}

It comes in handy when you're iterating over a container and are deleting elements along the way. This way you don't have to add as much logic for moving to the next element after deleting the one prior.


Methinks that's one of those "too clever by half" moves. It's a neat syntactic side-show, but the result is mostly that anyone who reads your code scratches their head for a minute, has a minor eureka, then hits Google to confirm and winds up at that Stack Overflow thread. It wouldn't pass any code review I was part of, just on the grounds of it not being idiomatic.


Theoretically, comparisons against 0 can be faster, but this form also gives up the pre-decrement operator, which is similarly theoretically faster. And, IMO, if you're doing anything interesting with those indices, you've probably got to jump through hoops to adjust your thinking to operate with 'unnatural' relationships between indices.


Looping backwards does make good sense when you want to modify elements of a container in place based on preceding elements, though. As for removing elements from the container, something like the std::remove_if/std::erase idiom is probably better, especially if you leave those "removed" elements around, as in an object pool.
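
For reference, a minimal sketch of that idiom (Entity and its 'dead' flag are just illustrative):

#include <algorithm>
#include <vector>

struct Entity { bool dead = false; };

void RemoveDeadEntities(std::vector<Entity>& entities) {
    // std::remove_if packs the surviving elements to the front and returns an
    // iterator to the start of the leftover tail; erase then drops that tail.
    entities.erase(
        std::remove_if(entities.begin(), entities.end(),
                       [](const Entity& e) { return e.dead; }),
        entities.end());
}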

#5307734 Visual Studio Hardware Requirements Seem Lower

Posted by on 24 August 2016 - 05:57 PM

Most any new laptop these days should be enough -- you probably want at least an i3, at least 8GB of RAM, and an SSD, and it sounds like you want at least one external video port (easy) if not two (many -- if not most -- laptops have this). Even with one external port, you should still be able to use the laptop's display for debugging while your program runs on an external monitor.


On the topic of GPUs, even integrated graphics are quite capable these days -- you can play a game like BioShock Infinite on modest settings at lower resolutions at 30FPS or more with a mid-tier integrated Intel GPU, and the Skylake ones, at least, support Direct3D 12. They're more than enough for 2D, and more than enough to experiment and grow into modest 3D graphics. They'll just never be a 3D powerhouse.

#5306232 Tax Deductions

Posted by on 16 August 2016 - 03:32 PM

You really need to consult your own tax professional now, and should have done so before paying anyone out of your own pocket. I would imagine that--depending on the formal structure of your company (e.g. DBA, sole proprietorship, partnership, LLC, or Corp) or lack thereof--how you put money into the "company" from your own pocket can have wildly different tax implications, just as how you take money *out* of your company has wildly different tax implications -- both for you as an individual and for your company.


In particular, I'm 95% certain that putting your personal wages into a "company" wouldn't adjust your income unless, potentially, it was somehow structured as an investment. I certainly would not expect your employer to adjust for it.

#5306068 Is it pathetic to never get actually helpful answers here?

Posted by on 15 August 2016 - 08:46 PM

I would say pretty universally that good-quality questions attract good-quality answers -- often excellent answers, given the level of expertise around here. If you find yourself gathering low-quality answers, follow-up questions, or even aggression, the best way to remedy that situation is to ask better questions. Also, I've seen a handful of people over the years who were upset not because their questions weren't being answered, but because they weren't getting the answer they wanted or expected -- if this is you, stop it. Part of the art of answering a question here is reading between the lines and getting to the heart of the issue, and sometimes the problem that needs to be addressed is not the one you've asked about. No one wants to give you the right answer to the wrong problem, but without context to your question it's easy to come to the wrong conclusion about what your problem really is. Again, providing more context to your question is key to getting useful answers.


Likewise, there's an art to asking questions as well. Tell us the what, where, why, and how of the problem in your question. What are you trying to do? What's wrong with the results you see (and what did you expect to see)? What are the surrounding conditions/assumptions that restrict your solution? Where are you doing this--platforms, libraries, programming languages? Why is this problem important? How have you tried to solve it already?


Keep your question focused -- it's easy to drift into giving too many details or irrelevant ones.

#5305666 Preparing for a 'XBox one' code stream

Posted by on 13 August 2016 - 02:43 PM

I would say that your perception is off with regard to market share if you're talking about new-style apps using the Windows 8.1 SDK -- Windows 8.0/8.1 combined account for only 11% of OS market share according to the Steam hardware survey. Windows 10 currently accounts for 46% (with only 3% of those 32-bit) and Windows 7 accounts for 36% (with 20% being 32-bit). In other words, you don't gain much addressable market by setting Windows 8 as your minimum bar -- it seems to me that if you're going to go through the trouble, you'd want to go back to Windows 7, not 8 -- and by not fully embracing Windows 10 UWP over Windows 8, it costs you the ability to reach Xbox (assuming you were accepted into ID@Xbox), which is one hardware configuration with a 20+ million install base.


That said, those are the July survey numbers, and the Windows free upgrade offer expired at the end of July, which probably prompted a final push for people to upgrade (anecdotally reinforced by the panic on my Facebook feed). I'm really interested to see August's survey; I expect a larger than usual decline for both Windows 8 and Windows 7, and I would expect Windows 10 to reach perhaps as high as 60%.


After that, Windows 7 will remain a relevant, if ever-shrinking, market for probably another year or two. Windows 8 will be even less relevant than it already is.

#5305573 Preparing for a 'XBox one' code stream

Posted by on 12 August 2016 - 08:11 PM

You can use the Windows 10 SDK to make UWP apps, which work across PC/Xbox, but as an "app", not a native game. That's probably closer to XNA / XBLIG on the Xbox 360.


The Xbox uses Direct3D 11.x and Direct3D 12.x, so you'd still have to do some small amount of porting.


If you mean to take the 'I don't have an AAA publisher' route, you have only two paths to get software running on other people's Xbox Ones: the open "UWP Apps on Xbox One" program, and the ID@Xbox program.


Ultimately, the direction things are going in is that Xbox will become, more or less, just another Windows form-factor. Still, though, the two programs I mentioned are different, and have different intentions behind them.


Currently, the UWP-on-Xbox program that's completely open to anyone is for apps, not games -- this is a policy decision, not a technical one. As a developer you can write a game using the tools made available, and you can deploy and test on your own Xbox. You cannot, however, put that same application on the Store where other people can buy it. This could change, but it's not how it is today. Furthermore, the apps you create using these tools don't have full access to the hardware (all 'apps' on Xbox work this way, e.g. Netflix): you're limited to 1GB of RAM (and you have to be able to suspend quickly using only 128MB), your app shares 2-4 CPU cores with the OS and other apps, you have the vanilla DirectX 11 API but only at feature level 10_0 (e.g. no async), and you get only 45% of the GPU when your app is in the foreground.
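
For illustration, requesting and verifying a specific feature level with plain D3D11 looks roughly like this -- a minimal sketch with error handling trimmed; in a real UWP renderer you'd also pass whatever creation flags (e.g. D3D11_CREATE_DEVICE_BGRA_SUPPORT) your setup requires:

#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Ask for feature level 10_0 explicitly -- the level the open UWP-on-Xbox
// sandbox is described as exposing above -- and check what we actually got.
bool CreateDeviceAtFeatureLevel10()
{
    const D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_10_0 };
    ComPtr<ID3D11Device>        device;
    ComPtr<ID3D11DeviceContext> context;
    D3D_FEATURE_LEVEL           obtained = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                        // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                        // no software rasterizer module
        0,                              // creation flags
        requested, _countof(requested),
        D3D11_SDK_VERSION,
        &device, &obtained, &context);

    return SUCCEEDED(hr) && obtained == D3D_FEATURE_LEVEL_10_0;
}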


For games, the ID@Xbox program is what you want, but it's not open to everyone. If approved, you get the same level of access to the hardware and to Xbox Live that AAA games get.



As far as setting yourself up for an easy transition, you've gotten good advice: create a Windows 10 UWP game, stick to the new APIs as much as you're able (not the legacy/Win32 ones), assume until further notice that you'll have vanilla D3D11 (not D3D12 or 11.x), and assume that libraries which work under Windows 10 UWP will work under Xbox UWP (or that support will be added quickly once things open up).


Design-wise, the Xbox is not just a PC with a controller. Do design for first-class controller input (look at 7 Days to Die to see what happens when you don't), but that doesn't just mean gameplay input: UI best practices are completely different for gamepad vs. keyboard/mouse, and you design quite differently for a 40" screen at 6-10 feet than for a 24" screen at 18-30 inches.


Here are some links:

Games and DirectX (UWP), especially the Windows 10 game development guide and Game technologies for UWP apps.

ID@Xbox program page or check out what indies are making.

UWP on XBox One

#5304983 Ide For Linux

Posted by on 09 August 2016 - 04:05 PM

For the full story: Evil Mode: Or, How I learned to stop worrying and love Emacs


The short version is that while Vim is a remarkable contribution to free software, and is awesome for many people, the quality of the code itself is not so great, it's unlikely that Vim proper will ever be able to support asynchronous processes (one of the motivating factors behind Neovim), and there is effectively no continuity of ownership once its originator and sole maintainer, Bram, passes away. My reasons for moving away have less to do with anything Vim currently lacks (other than async processes); I just see it as inevitable that once Bram is no longer around, it's effectively the end of the line for evolution of the Vim codebase. There's a lot of spaghetti inside, and a lot of not-so-great code, and Bram is basically the only one who understands it properly. In an interview he was once asked "What can we do to keep Vim around?" and he quipped "Keep me alive."


As I said, I prefer Vim's approach to the keyboard and modality -- but I can get that through Evil Mode under Emacs for the most part. Then there's the fact that for Emacs the torch has already been passed to the community, plus its abundance of modes and its support for glyphs/images (no more Vim font hacks) -- the ecosystem is equally good, if not better, and there's no specter of reaching the end of the line.


Plus Org-Mode -- praise be to Org-Mode, it's truly awesome. Access to Org-Mode is reason enough alone to make the switch to Emacs, IMO.

#5304825 Using Other Cultural Traditions In A Fantasy Game?

Posted by on 08 August 2016 - 11:40 PM

Sure -- and to be clear, I don't mean my line of questioning as an attack on your motives, but more as a prompt for examining and challenging your own motivations (for you, and for anyone who might come along asking questions in this same vein). Likewise, I'm not in a position to promote or defend your intended use because I'm not a follower of the Hindu faith or culturally Indian -- in fact, I'm a white American like you.


I think it's entirely possible to present a work of fantasy that borrows from history and traditions that are not your own without misappropriating them. And honestly, much of contemporary humanity is not much more connected to 'their own' traditions than to others' -- it's not like being a white Catholic from the States makes you an expert in Catholicism or early Vatican politics. I think the way to think about it is that the more directly you attempt to crib from any source, the greater the risk that you get it wrong, and potentially the greater the harm you do by getting it wrong. It's also more difficult to gauge that risk or potential for harm the further you're removed from the things you're borrowing. For those reasons, extra caution and care when dealing with such things are your due diligence, IMO.


I think as designers and storytellers we also need to keep sight of the fact that once we put something out into the world, we lose control of it. We created it, but it's not ours anymore. What we meant to put out into the world doesn't matter then, only how it was received. If some people were offended by it, then it was offensive -- maybe only to one single person, but no amount of our own good intentions erases or excuses how they experienced it. And still, some people might think it's a beautiful thing -- maybe lots of people, maybe people very much like the one person it offended -- all of that can happen simultaneously, and it doesn't make one perception any better or more correct than another. This is true of any axis of criticism -- we do our best to put our visions out there, and in the end you're the one who's ultimately responsible for how well you can continue to sleep at night.

#5304801 Headless Server Side Software Renderer

Posted by on 08 August 2016 - 08:59 PM

Sean makes a very salient point -- it's not free to do this server-side. The CPU costs are probably pretty minimal, and can be amortized across users, but bandwidth costs money, and even small updates (assuming you chunk your map and program your clients to deal with it) add up. It's not always worthwhile to try to please everybody -- you're going to undertake a lot of work to please a few people who will make up your largest active operational costs. Are the margins wide enough to support them? Is the opportunity cost of implementing and maintaining that support, now and in the future, more important than all the other work you could be doing? From a business perspective you're often better off trying to capture more people who are like your core users (e.g. those with WebGL) than capturing entirely new segments of users (especially already marginal and ever-shrinking segments like those without WebGL). It's just a form of economies of scale: you may want to expand your definition of who a core user is after a time, but only when you believe it will be easier/cheaper to capture new people from the expanded definition than to continue capturing from the original.

#5304776 Adequate Windows Operating System Testing

Posted by on 08 August 2016 - 05:57 PM

For games, XP has some relevance left if you're targeting internet cafes in China or nearby regions (and maybe some others), but even that is giving way to Windows 7 or newer -- the XP share is already small, and only going to get smaller. Keep in mind that supporting XP means supporting Direct3D 9 and other legacy APIs, which is an entirely different level of expense than supporting the modern APIs you're already using on down-level operating systems that support them.


Also ask yourself if a PC that meets the hardware requirements of your game is likely to be running XP to begin with -- the number of people running XP on hardware suitable for Windows 7 is pretty small.


I'd say that Windows 10 is now a pretty safe bet -- since Windows 10 was a strict improvement over Windows 8/8.1, there was no reason not to take the free upgrade that Microsoft offered. Add to that that the app-store model for Windows 8/8.1 is now effectively deprecated, and I'd personally be entirely comfortable assuming Windows 10. If I felt compelled to support down-level operating systems I'd support Windows 7 in addition to 10; there's no compelling technical or market-share argument for supporting Windows 8 or 8.1.

#5304772 Headless Server Side Software Renderer

Posted by on 08 August 2016 - 05:37 PM

Look at SwiftShader on GitHub. It's the software renderer used by Chrome when hardware acceleration is not available, and it implements very modern software rendering techniques. It's got an interface for OpenGL ES already, which should be quite similar to any WebGL rendering you already have. I don't know what dependencies it takes on, but it should do the trick.


Before being purchased by Google, it was owned by Transgaming, which sold it as a solution for games that wanted a fast software fallback when users lacked sufficient 3D acceleration, and for helping Windows games port to Linux or run better under Wine (part of the SwiftShader distribution is a shared library that implements the Direct3D 9 interface exactly, entirely in software).

#5304769 Rendering 3D Over 2D

Posted by on 08 August 2016 - 05:22 PM

I don't know about MonoGame, but XNA (from which it descends) had an overloaded GraphicsDevice.Clear method that took flags and other parameters corresponding to the depth buffer.


Anticipating another potential issue -- whether you want to clear between 2D and 3D rendering at all depends on whether each scene's depth is described in terms of independent Z-spaces. For example, if your 2D elements are entirely behind the 3D elements, then your current approach works (but it's not optimal*) -- this would be the case if you are using the depth buffer for Z-ordering during the 2D phase and for normal depth buffering during the 3D phase. Or, if you mean for your 2D and 3D elements to be intermixed (some 2D behind 3D, some 2D in front of 3D), then what you want is to not clear the depth buffer here (only before each frame is rendered) and to pre-bake scene-appropriate depth values into your 2D assets.


*On optimization -- if your 3D assets are entirely in front of your 2D elements, then it's often better to render your 3D things first while writing the depth buffer, and then draw your 2D things afterwards with depth values that you know are larger than any of your 3D depth values. This skips drawing 2D pixels that would ultimately get covered by 3D objects anyway. If those 2D pixels are complex to shade, or if much of the screen gets covered by 3D objects, this can save your GPU a lot of work.


If you undertake this optimization, be aware that transparency has to be considered. You would draw fully opaque 3D stuff first, from nearest objects to furthest, then 2D stuff (still with larger depth than anything in the 3D scene) in your usual order, then return to draw the partially transparent 3D stuff from furthest objects to nearest.