
About irreversible

  1. As noted above, the broadphase accepts bounding boxes and/or spheres and performs basic preliminary overlap checks to determine potentially colliding pairs. For moving objects, the bounding box becomes a composite of the object's bounding boxes at the start and the end of the update. To minimize potential pairs (i.e. the input set to the broadphase), you'll likely want to use something as simple as you can get away with that best matches the nature of the game you're working on. Generally speaking this can be as basic as a regular grid, unless you're dealing with heavily asymmetric object placement, which can really benefit from something like an octree. It really depends on what kind of a game you're working on... So you mean, what are "the cutting edge" level partitioning schemes these days? Is it safe to assume you're working on an FPS game? Is it indoors/outdoors, or does it have hybrid environments? Are the levels large? Is the world detailed and graphically heavy? Is geometry procedurally generated/loaded, or are you dealing with static levels that are loaded once? PS - regarding the use of the acronym "CSG" in your topic title: this is something you don't generally come across outside of an editor. Boolean operations are usually performed prior to generating collision meshes, unless you're generating something procedurally. Though even then you're likely setting yourself up for a world of hurt performance-wise.
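The composite box for a moving object mentioned above can be sketched in a few lines - a minimal 2D AABB example with illustrative names, not code from any particular engine:

```cpp
#include <algorithm>
#include <cassert>

// Minimal axis-aligned bounding box; 2D for brevity.
struct AABB {
    float minX, minY, maxX, maxY;
};

// Swept/composite box for a moving object: the union of its bounds at the
// start and at the end of the update. Anything that doesn't overlap this
// union cannot collide with the object during the step.
inline AABB SweptBounds(const AABB& start, const AABB& end) {
    return AABB{
        std::min(start.minX, end.minX), std::min(start.minY, end.minY),
        std::max(start.maxX, end.maxX), std::max(start.maxY, end.maxY)
    };
}

// Cheap broadphase overlap test between two boxes.
inline bool Overlaps(const AABB& a, const AABB& b) {
    return a.minX <= b.maxX && b.minX <= a.maxX &&
           a.minY <= b.maxY && b.minY <= a.maxY;
}
```

The swept box is what you'd insert into the grid or octree; Overlaps() is the preliminary check that prunes the candidate pair set before narrowphase.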
  2. Broadphase generally operates on bounding boxes and spheres. There is no difference in a brush-based environment in this regard. For production-quality examples, look at the source code of physics libraries such as Box2D and Bullet. This is literally a Google search away - not only for broadphase, but for collision in general. The term you might be looking for is narrowphase. Quake et al. used point-vs-brush collision, meaning that for collision, world geometry was extruded by the player's size. This meant that you could run a point or a pair of points along the collision mesh, thus reducing the complexity of the test even further. Note that brushes, while fast, have a few inherent drawbacks - trouble with turning corners smoothly comes to mind first. Things have, indeed, changed dramatically. For narrowphase, read up on GJK.
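The Quake-style extrusion trick can be illustrated with plain AABBs - a simplification, since real brushes are convex sets of planes, and the names here are made up for the example:

```cpp
#include <cassert>

struct AABB { float minX, minY, maxX, maxY; };

// Instead of testing a player-sized box against the world, extrude
// (Minkowski-expand) the world geometry by the player's half-extents
// once, ahead of time; afterwards a single point test stands in for
// the full box-vs-geometry test.
inline AABB ExtrudeByHalfExtents(const AABB& brush, float hx, float hy) {
    return AABB{ brush.minX - hx, brush.minY - hy,
                 brush.maxX + hx, brush.maxY + hy };
}

inline bool PointInside(const AABB& b, float x, float y) {
    return x >= b.minX && x <= b.maxX && y >= b.minY && y <= b.maxY;
}
```

The point here is that the expensive shape-vs-shape problem is paid for once at build time, and the runtime test degenerates to point containment (or a segment sweep for a moving player).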
  3. I've wondered this myself, because the US has traditionally been a role model - for good or bad - especially for smaller countries such as the one I live in, which are economically and politically dependent on maintaining good relations with it. However, that's really been changing recently - if not politically (although you can see considerable misalignment and even pushback from larger EU countries), then at least ideologically and morally. Trump is not popular (sic!) and that is undermining the US' credibility, making other allegiances increasingly important. In short, I think you guys are screwed (for now), but it'd be far more difficult to push something like this through in the EU. Not speaking for China or Russia here...
  4. I can't help but feel sorry for the current state of things for you Americans. The net neutrality "debate" is so far removed from reason that it boggles my mind it exists in the first place.
  5. I think I did acknowledge both points (regarding support and the reason Spotify did what it did with the web player) in my original lengthy post, but to me the problem runs even deeper and fundamentally reflects this attitude. This effectively applies to Spotify as well, because I still happen to remember their previous desktop app, which was far more open and powerful. Sticking to my original examples: how jaded do you have to be to screw up text selection in a text editing module? Or how little QA do you have to have to not realize that the new version of your text messaging service is nigh unusable, because it lags and doesn't include something as basic as emoticons, even if you type them in manually? There's supporting more than one version of a thing, and then there's scrapping the previous version and releasing a broken, laggy app with the functionality of something that belongs in 1998. It's our fault as consumers for turning the other cheek here, and it angers me that we do so ever so readily. Besides - if I'm the product, then I want to, at the very least, be in the main aisle, not left out back to rot. One of the worst completely legal things big companies can do these days is to leverage their position via the ecosystem they've locked us into. I'm on Skype and I use Skype for my messaging needs, because the people I need to communicate with are also there. That being said, it's actually not THAT big of a problem if I wanted to ditch Skype. But migrating from Adobe Cloud to an alternative because of a broken feature is a no-go. Just as you really think thrice before upgrading your IDE because the dev team went Apple and streamlined something fundamental like the solution explorer or the search feature into oblivion.
  6. The new Spotify web player is nigh on unusable if you were used to features like following your friends, last.fm integration, playlist management (they even removed sorting) and so forth. The web player was functionally basically a mirror of the desktop application. Now it's a shell of a music player with a design fit for tablets. They may have patched stuff under the hood since then. The Skype update is actually on par with the Spotify changes if you took what it offered before for granted. Personally, it took me about 3 hours to get an older apk and block the app from updating. Look - I'm not saying "don't do your own thing". And I'm not disputing there may be other reasons at work here. What I am saying is that when you're radically changing your stuff, do one of the following:
    1) make it opt-in (or at least opt-out)
    2) allow users to sample the changes and then make it easy (or at the very least possible) to revert
    3) provide a huge red button that says "FEEDBACK" somewhere - anywhere, really, as long as it's visible
    4) warn about the changes before you silently apply them
    Oh, and I realize free software makes us the product, but there's such a thing as bad design, and these three examples are all very notable cases of good old poor design at work. Which is pretty much laughable if your turnover is in the tens of billions.
  7. I'm bringing this up because today was the third time recently that my tech bubble was violated in a way I would classify as consumer-hostile and borderline intrusive. I'm talking about the recent update to Spotify's web player interface and my latest surprise heart attack - the spectacular assault on reality known as the redesigned Skype 8 update (in my case on Android - I've no idea where else they've rolled out the changes at this point). I've managed to calm myself marginally about the former, but right now I'm probably even more infuriated about the latter than I was about Spotify when the change hit. Also, if you're familiar with video editing, this also applies to the latest change Adobe made to text editing in Premiere Pro, when they completely hid the traditional titling tool and blindly forced the new (still) heavily lacking Essential Graphics travesty down the throat of anyone brave enough to keep their software up to date. So - let's recap what happened in all three cases:
    - a multi-billion dollar company took a functional and well-established product and changed it
    - they got rid of MOST of the features any user marginally more serious than a grandma used on a daily basis
    - they "streamlined" the service by making it less responsive and less intuitive to use
    - they removed any semblance of anything that might be construed as "settings" or "options"
    - they replaced the above with meaningless tripe like "choose your color"
    - they completely revamped the UI, in particular towards a minimalistic tablet-style solution (large letters, lots of empty space)
    - they apparently fired anyone who had the guts to say anything about the changes, or flat out executed their product testing team
    - they rolled out the changes overnight in an automatic update with none of the following:
      - a way to opt out
      - a way to opt in
      - a chance to sample the update
      - a chance to roll back (not true in Adobe's case)
      - a chance to give feedback directly
      - a chance to
understand what the hell just happened.
    These are massive companies making groundbreaking changes like children. Make no mistake - these changes were not tested. Because if they were, they would know that the new version of Skype lags like hell. They would know that the new version of Spotify's web interface does not find the music you're looking for - even if you're fine with it not having 90% of the features it used to have. And they would know that when you write a text editing tool, Shift-Arrow, Shift-Home/End and Ctrl+A should probably do something with the selection of the text you're currently working with, not scrub the timeline in a completely different window. These problems are beyond obvious and should never have made it past basic quality testing. These changes were likely unpopular internally already, because if they weren't, they would have been released as optional "cool new" alternative features (kind of like what Lastpass does with its revamped UI), not MASSIVE overnight changes that absolutely obliterate the user experience and are introduced prematurely (at best) with a 40-second marketing video. These changes are not immutable milestones on a timeline, because these companies have the means (I'm not necessarily talking about the will) to scrap a failed iteration. Neither are these products facing a myriad of expectations in terms of changes. In fact, while in Adobe's case progress was likely widely hoped for, none of these three products were flat out broken. Now they are. Can I find a friend or their playlists on Spotify? NO. Can I intuitively edit something as basic as text in one of the premier video editing tools in the world (pun somewhat intended)? NO. Can I choose a goddamn emoticon that isn't in the empty "most recent" list, in the most widely used (hey, I'm assuming!) messaging app in the world? HELL. NO. What about changing my online status then? Get outta here!
I'll give Adobe SOME leeway here, as they've actually rolled out an update to partially alleviate their respective problems since then. But that's an update that should have been part of the update that broke their software in the first place. Like, what the hell? Nobody expects you to release updates on a predetermined schedule that YOU decide. Just make a better schedule, and release software that doesn't break people's workflow and in many cases cause them to lose real money because of time lost to learning how to sidestep your broken features. Or, you know, warn us about it... That being said, the lackluster and absolutely nerve-wracking changes both Spotify and now Microsoft have made are beyond any form of logic to me. I have one question to ask: why!? I mean, I'm a semi-advanced (or in these particular cases, a relatively expert) user. I know my settings and I love my options. Though I can also understand if you want to reduce clutter and make your thing more slick. Just give me that one option which spells out "Advanced" in itsy bitsy tiny letters. Don't resort to completely removing anything that might even remotely resemble a settings menu from your application. You still have the settings. They haven't gone anywhere. I know you have them, because you need to configure your bloody application somehow! And I'm not saying I was necessarily happy with Skype's interface before (okay, I thought it was convoluted and unwieldy to the point where it felt like it should be completely redone). But now they've torched everything and gone all the way back to 1970. Like... goddamn, you people... Stop! Just stop. And THINK for a moment. That being said - I'm not quite as mad about Spotify, because they transitioned their player from Flash to HTML5. It was still hostile to the consumer, but I do realize that many people use the web player to circumvent the paywall (heyho, Adblock), so that might have been their little revenge.
But I'm going to fault Adobe (which is in the business of designing prosumer productivity tools) and Microsoft (which in this case is in a unique position to provide a service that is both well-established and heavily used) to the fullest. Most importantly, all these companies have recently made changes that are going to screw up any semblance of a nuanced software market in the future - these changes in popular products are all designed to dumb everything down to the point of "how straight can we make this curve without turning it into a line", and they are inevitably going to act like erasers on most consumers. They're going to dim our memory and eventually (which, let's be honest, is a matter of months to a few years) completely reform our expectation of what software is and what it can be. And I feel SO sorry for that, because the services these products purport to provide are actually fantastic. The set of applications that have lost their accessibility in favor of "accessibility" (usually at the expense of performance and, you know, accessibility) is ever-growing: Windows itself, the Office suite (which now even has the cool new feature of crashing randomly), proprietary Android distros (which are becoming more like iOS every day, but still fail to copy the most basic of the accessibility features they so desire to emulate, like searchable settings) and so forth. These changes are being force-fed to us, and feedback like this probably doesn't even make it near the review board. So, if anyone from the Skype product management team happens to read this - well done, you've not only screwed your customer base, but you've undermined both your product and the future of software in general. Claps to you! Which brings me back to the question posed in the topic - given this trend, what is the future of software anyway?
  8. Hm - I accidentally stumbled across the usWeightClass member inside the TT_OS2 structure, which seems to be present for most fonts (at least on my system). Statistically speaking, most other fonts either seem to be Regular (weight 400) in nature or can blindly be assumed to have the default weight.
  9. What is the standardized way to quantify font weights in FreeType? FT seems to expose the style name via FT_Face, which provides a string-based descriptive name, but I'm encountering a number of styles that do not have a one-to-one match on the weight scale used by WinAPI (I'm not sure where that weight scale originates from). For instance, "Narrow" does not seem to be a quantifiable descriptor. Nor does "Condensed". I presently have the following table, but it's incomplete (e.g. in addition to the two above-mentioned styles it lacks things like "Semilight", etc.):

    Extra Light = 100
    Ultra Light = 100
    Light = 200
    Thin = 200
    Book = 300
    Demi = 300
    Normal = 400
    Regular = 400
    Medium = 500
    Semibold = 600
    Demibold = 600
    Bold = 700
    Black = 800
    Extra Bold = 800
    Heavy = 800
    ExtraBlack = 900
    Fat = 900
    Poster = 900
    Ultra Black = 900

    In addition to this, the PostScript descriptor seems to have a weight member (though I cannot tell whether it uses the same scale as above), and there seems to be no entirely consistent way in which the literal style tokens are written in the first place ("Semi Light" != "Semilight"). Am I missing something obvious or is there some voodoo involved here?
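One way to make a table like the above resilient to the inconsistent spellings ("Semi Light" vs "Semilight") is to normalize the style token before the lookup. This is just the table from the post sketched as code - the weight values are the ones listed above, not an official mapping, and the function name is made up:

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>
#include <unordered_map>

// Map a style-name token to a numeric weight. Spaces and hyphens are
// stripped and the token is lowercased first, so "Semi Bold", "Semi-Bold"
// and "Semibold" all hit the same entry. Unknown tokens fall back to 400.
inline int StyleNameToWeight(std::string name) {
    name.erase(std::remove_if(name.begin(), name.end(),
                   [](unsigned char c) { return c == ' ' || c == '-'; }),
               name.end());
    std::transform(name.begin(), name.end(), name.begin(),
                   [](unsigned char c) { return std::tolower(c); });

    static const std::unordered_map<std::string, int> table = {
        {"extralight", 100}, {"ultralight", 100},
        {"light", 200},      {"thin", 200},
        {"book", 300},       {"demi", 300},
        {"normal", 400},     {"regular", 400},
        {"medium", 500},
        {"semibold", 600},   {"demibold", 600},
        {"bold", 700},
        {"black", 800},      {"extrabold", 800}, {"heavy", 800},
        {"extrablack", 900}, {"fat", 900},
        {"poster", 900},     {"ultrablack", 900}
    };
    const auto it = table.find(name);
    return it != table.end() ? it->second : 400; // fall back to Regular
}
```

Note that "Narrow" and "Condensed" deliberately fall through to the default here - they describe width rather than weight, which is presumably why they have no slot on the weight scale.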
  10. C++

    And still, the description applies to the allocator proxy, not the container itself.
  11. C++

    Right. I might look into upgrading to 2017 after all then. The behavior was suspicious enough that I actually googled for a compiler bug and as noted above, apparently there's some truth to that. Thanks for confirming. is_standard_layout seems to cut it for now, although I'm unclear what the shortcomings might be in the grand scheme.
  12. C++

    The problem is that in the above test case I don't see where the code would be used :). I'm not omitting anything.
  13. C++

    Fair enough - that makes sense. Thanks for the heads up, although I'm somewhat curious how this becomes invalid in practice as long as I use the derived class - the compiler doesn't seem to be bothered by it. Also, that being said, why does the code below generate a deleted-function compile-time error for both is_copy_assignable and is_trivially_copy_assignable:

    error C2280: 'myvec<int32> myvec<int32>::operator =(const myvec<int32> &)' : attempting to reference a deleted function

    is_trivially_copy_constructible yields false and is_copy_constructible yields true. My expectation would be that all of these cases would a) compile and b) yield false on account of my explicitly disabling the copy semantics. As mentioned above, I'm using VS2013.

    template<typename T>
    class myvec {
    private:
        std::vector<T> detail;
    public:
        myvec() { }
        myvec<T> operator=(IN const myvec<T>& copy) = delete;
        myvec(IN const myvec<T>& copy) = delete;
    };

    class NonCopiableA {
        myvec<int32> vec;
    };

    ... (DOUTEX() is a debug output macro) ...

    DOUTEX(std::is_copy_assignable<NonCopiableA>::value);
    DOUTEX(std::is_trivially_copy_assignable<NonCopiableA>::value);
    DOUTEX(std::is_copy_constructible<NonCopiableA>::value);
    DOUTEX(std::is_trivially_copy_constructible<NonCopiableA>::value);
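For reference, this is what a conforming compiler should report for that shape of code - the traits simply evaluate to false rather than producing C2280. Sketched here with standard types in place of the IN/int32 macros:

```cpp
#include <cassert>
#include <type_traits>
#include <vector>

// Same shape as the myvec example above, minus the user-defined macros.
template<typename T>
class myvec2 {
    std::vector<T> detail;
public:
    myvec2() = default;
    myvec2& operator=(const myvec2&) = delete;
    myvec2(const myvec2&) = delete;
};

struct NonCopiableB {
    myvec2<int> vec; // the member's deleted copy propagates to the enclosing class
};

// On a conforming compiler all four traits are simply false - querying
// them never attempts to reference the deleted functions.
static_assert(!std::is_copy_assignable<NonCopiableB>::value, "");
static_assert(!std::is_trivially_copy_assignable<NonCopiableB>::value, "");
static_assert(!std::is_copy_constructible<NonCopiableB>::value, "");
static_assert(!std::is_trivially_copy_constructible<NonCopiableB>::value, "");
```

The is_copy_constructible == true result in the post is therefore the VS2013 oddity; the implicitly declared copy operations of the enclosing class should be defined as deleted, and the traits should just report that.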
  14. C++

    I'm just going to assume that I'm extremely poor at communicating my problem at this point. This is exactly the thing I'm trying not to do. I want to automatically NOT generate code in a generic class that would handle non-copiables in a container, throwing instead if I accidentally do end up trying to do so. I feel like a really bad communicator here. How could I further clarify my question?
  15. C++

    I'm doing three things in my overloaded vector class:
    1) provide a couple of wrapper functions like quick_erase()
    2) disable the copy constructor and copy assignment operator
    3) provide a custom allocator
    I don't see how any of these violate the extensibility of any standard library class. I'm not mixing shared and raw pointers. Neither is this particular issue in any way related to controlling lifetimes, although fundamentally it does stem from it. The real problem is exactly the one I described in my post. Given (untested pseudocode):

    struct NonCopiable {
        // disables copy semantics
        VECTOR<int32> data;
    };

    template<typename T, bool ISCOPIABLE>
    class Allocator {
    public:
        // should always work
        void AllocPointer(void* basePtr) {
            VECTOR<shared_ptr<T>>* container = reinterpret_cast<VECTOR<shared_ptr<T>>*>(basePtr);
            container->push_back(NewShared<T>());
        }

        // should not work with T = NonCopiable. I want to selectively compile this
        // into a stub that throws, indicating an error...
        template <bool ISCOPIABLE2>
        typename std::enable_if<ISCOPIABLE2, void>::type DoAllocPlain(void* basePtr) {
            VECTOR<T>* container = reinterpret_cast<VECTOR<T>*>(basePtr);
            container->push_back(T());
        }

        // ... which means that for T = NonCopiable it should become
        template <bool ISCOPIABLE2>
        typename std::enable_if<!ISCOPIABLE2, void>::type DoAllocPlain(void* basePtr) {
            throw("You done goofed");
        }

        void AllocPlain(void* basePtr) { DoAllocPlain<ISCOPIABLE>(basePtr); }
    };

    ...

    void LoadToContainer(void* basePtr, int32 contType) {
        Allocator<NonCopiable, std::is_copy_assignable<NonCopiable>::value> allocator;
        ...
        if (contType == ContainerOfSharedPointers)
            allocator.AllocPointer(basePtr);
        else
            allocator.AllocPlain(basePtr);
    }

    It would make no difference if I was using std::vector directly and passing it a plain non-copiable type. It wouldn't compile, resulting in the same situation.
So, to reiterate - my question is about type traits and how to properly employ them to decide which version of AllocPlain() should be compiled. Edit: added enable_if to the sample code and fixed a few mistakes. PS - code editing in the new forums is apparently exceptionally broken.
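A compilable sketch of the dispatch the pseudocode above is aiming at, using a defaulted template parameter so that exactly one DoAllocPlain() overload survives SFINAE. The names follow the pseudocode; using std::is_copy_constructible as the gate (rather than is_copy_assignable) is an assumption on my part, since push_back(T()) is what needs to compile:

```cpp
#include <cassert>
#include <stdexcept>
#include <type_traits>
#include <vector>

template<typename T>
class Allocator {
    // Enabled only when T is copy-constructible: performs the real push_back.
    template<typename U = T>
    typename std::enable_if<std::is_copy_constructible<U>::value>::type
    DoAllocPlain(void* basePtr) {
        auto* container = static_cast<std::vector<U>*>(basePtr);
        container->push_back(U());
    }

    // Enabled otherwise: a stub that throws instead of failing to compile.
    template<typename U = T>
    typename std::enable_if<!std::is_copy_constructible<U>::value>::type
    DoAllocPlain(void*) {
        throw std::logic_error("AllocPlain() called for a non-copiable type");
    }

public:
    void AllocPlain(void* basePtr) { DoAllocPlain<T>(basePtr); }
};

struct NonCopiable {
    NonCopiable() = default;
    NonCopiable(const NonCopiable&) = delete;
};
```

With this shape, Allocator<int>::AllocPlain() compiles into a real push_back, while Allocator<NonCopiable>::AllocPlain() compiles into the throwing stub - the push_back body is never instantiated for the non-copiable type, which is exactly why the two-overload indirection is needed.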