About irreversible
  1. I've spent a ton of time over the past several months looking for a decent IDE around which to start seriously expanding my shader code ecosystem. Since I have quite a few custom bindings, I can't use a standalone suite out of the box, but require something that is strongly typed while also being able to leverage dynamic shared code. I want to stress that I'm NOT looking for a complete build/debug solution - just the editor/IDE portion, so I can write shaders while expanding my codebase independently. For months I thought there was nothing that would suit my needs, since any extensions that deal with shading languages seem to be limited mostly to syntax highlighting. But I wanted so much more. Finally, lo and behold, there was an answer to my call. Well, sorta. Although in all honesty it's actually a surprisingly good answer from my perspective. Thing is, it comes with a hefty caveat. For the non-TL;DR crowd - here's what I want:

    Enter Visual Studio Code with the C/C++ extension. Thanks to the similarity of GLSL to C, it can actually do ALL of the above (except for the first point, obviously), and the whole thing is free.

    Here's some more words to set up the scene: the offending construct is uniform blocks. Take the following piece of code in GLSL:

        layout(std140, binding = 0) uniform _uniblock {
            my_struct _struct;
        };

    In GLSL, _struct is treated like a global. There are no namespaces or other scope-related concepts that would allow me to "extract" _struct from _uniblock. In fact, as far as I know, the uniform block is purely syntactic from a code-writing point of view, and from a logical standpoint pretty much does not exist. This is an issue, because I want the C++ IntelliSense to treat _struct like a global, so I don't lose suggestions when I try to access its members.

    Now, to highlight the type of connivery I'm employing - in order to deal with the layout qualifier part, I'm resolving it through the following macro:

        #define layout(arg) struct { int arg; };

    As another example, in order to resolve something like:

        precision highp float;

    I'm defining the offending keywords and qualifiers as:

        #define highp
        #define precision typedef

    Oh, and "uniform" is defined as:

        #define uniform const

    Of course this isn't valid code - but it's not supposed to be. After the replacement, vscode thinks it's valid and consequently doesn't complain. Which is exactly what I want, so long as I don't make a mistake while writing the shader.

    Now I want to apply the same kind of logic to the uniform block problem to get rid of the implicit scope as invisibly as possible. Basically anything goes - macros, typedefs, type obfuscation via templates, and - if all else fails - modifying the GLSL syntax using the preprocessor, as long as the resulting code will compile without any additional modifications. Ideas?

    PS - I hope my description was in the very least marginally clear
  2. As noted above, broadphase accepts bounding boxes and/or spheres and performs basic preliminary overlap checks to determine potentially colliding pairs. For moving objects, the bounding box becomes a composite of the object's bounding boxes at the start and at the end of the update.

    To minimize potential pairs (i.e. the input set to the broadphase), you'll likely want to use something as simple as you can get away with that best matches the nature of the game you're working on. Generally speaking this can be as basic as a regular grid, unless you're dealing with heavily asymmetric object placement, which can really benefit from something like an octree. It really depends on what kind of game you're working on...

    So you mean, what are "the cutting edge" level partitioning schemes these days? Is it safe to assume you're working on an FPS game? Is it indoors/outdoors, or does it have hybrid environments? Are the levels large? Is the world detailed and graphically heavy? Is geometry being procedurally generated/loaded, or are you dealing with static levels that are loaded once?

    PS - regarding use of the acronym "CSG" in your topic title: this is something you don't generally come across outside of an editor. Boolean operations are usually performed prior to generating collision meshes, unless you're generating something procedurally. Though even then you're likely setting yourself up for a world of hurt performance-wise.
  3. Broadphase generally operates on bounding boxes and spheres. There is no difference in this regard in a brush-based environment. For production-quality examples of these techniques, look at the source code of physics libraries such as Box2D and Bullet. This is literally a Google search away - not only broadphase, but collision in general. The term you might be looking for is narrowphase.

    Quake et al. used point-vs-brush collision, meaning that for collision, world geometry was extruded by the player's size. This meant that you could run a point (or a pair of points) along the collision mesh, thus reducing the complexity of the test even further. Note that brushes, while fast, have a few inherent drawbacks - trouble with turning corners smoothly comes to mind first.

    Things have, indeed, changed dramatically. For narrowphase, read up on GJK.
  4. So, I have this question about the future of software...

    I've wondered this myself, because the US has traditionally been a role model - for good or bad - especially for smaller countries such as the one I live in, which are economically and politically dependent on maintaining good relations with it. However, that's really been changing recently - if not politically (although you can see considerable misalignment and even pushback from larger EU countries), then at least ideologically and morally. Trump is not popular (sic!) and that is undermining the US's credibility, making other allegiances increasingly important. In short, I think you guys are screwed (for now), but it'd be far more difficult to push something like this through in the EU. Not speaking for China or Russia here...
  5. So, I have this question about the future of software...

    I can't help but feel sorry for the current state of things for you Americans. The net neutrality "debate" is so far removed from reason that it boggles my mind it exists in the first place.
  6. So, I have this question about the future of software...

    I think I did acknowledge both points (regarding support and the reason Spotify did what it did with the web player) in my original lengthy post, but to me the problem runs even deeper and fundamentally reflects this attitude. This effectively applies to Spotify as well, because I still happen to remember their previous desktop app, which was far more open and powerful.

    Sticking to my original examples: how jaded do you have to be to screw up text selection in a text editing module? Or how little QA do you have to have to not realize that the new version of your text messaging service is nigh unusable, because it lags and doesn't include something as basic as emoticons, even if you type them in manually? There's supporting more than one version of a thing, and then there's scrapping the previous version and releasing a broken, laggy app that has the functionality of something that belongs in 1998. It's our fault as consumers for turning the other cheek here, and it angers me that we do so ever so readily. Besides - if I'm the product, then I want, at the very least, to be in the main aisle, not left out back to rot.

    One of the worst completely legal things big companies can do these days is leverage their position via the ecosystem they've locked us into. I'm on Skype and I use Skype for my messaging needs, because the people I need to communicate with are also there. That being said, it's actually not THAT big of a problem if I wanted to ditch Skype. But migrating from Adobe Cloud to an alternative because of a broken feature is a no-go. Just as you really think thrice before upgrading your IDE because the dev team went Apple and streamlined something fundamental - like the solution explorer or the search feature - into oblivion.
  7. So, I have this question about the future of software...

    The new Spotify web player is nigh on unusable if you were used to features like following your friends, integration, playlist management (they even removed sorting) and so forth. The web player used to be functionally a mirror of the desktop application. Now it's a shell of a music player with a design fit for tablets. They may have patched stuff under the hood since then. The Skype update is actually on par with the Spotify changes if you took what it offered before for granted. Personally, it took me about 3 hours to get an older apk and block the app from updating.

    Look - I'm not saying "don't do your own thing". And I'm not disputing there may be other reasons at work here. What I am saying is that when you're radically changing your stuff, do one of the following:

    1) make it opt-in (or at least opt-out)
    2) allow users to sample the changes, and then make it easy (or at the very least possible) to revert
    3) provide a huge red button that says "FEEDBACK" somewhere - anywhere, really, as long as it's visible
    4) warn about the changes before you silently apply them

    Oh, and I realize free software makes us the product, but there's such a thing as bad design, and these three examples are all very notable cases of good old poor design at work. Which is pretty much laughable if your turnover is in the tens of billions.
  8. I'm bringing this up because today was the third time recently that my tech bubble was violated in a way I would classify as consumer-hostile and borderline intrusive. I'm talking about the recent update to Spotify's web player interface and my latest surprise heart attack - the spectacular assault on reality known as the redesigned Skype 8 update (in my case on Android - I've no idea where else they've rolled out the changes at this point). I've managed to calm myself marginally about the former, but right now I'm probably even more infuriated about the latter than I was about Spotify when the change hit. Also, if you're familiar with video editing, this also applies to the latest change Adobe made to text editing in Premiere Pro, when they completely hid the traditional titling tool and blindly forced the new (still) heavily lacking Essential Graphics travesty down the throat of anyone brave enough to keep their software up to date.

    So - let's recap what happened in all three cases:

    - a multi-billion dollar company took a functional and well-established product and changed it
    - they got rid of MOST of the features any user marginally more serious than a grandma used on a daily basis
    - they "streamlined" the service by making it less responsive and less intuitive to use
    - they removed any semblance of anything that might be construed as "settings" or "options"
    - they replaced the above with meaningless tripe like "choose your color"
    - they completely revamped the UI, in particular towards a minimalistic tablet-style solution (large letters, lots of empty space)
    - they apparently fired anyone who had the guts to say anything about the changes, or flat out executed their product testing team
    - they rolled out the changes overnight in an automatic update with none of the following:
      - a way to opt out
      - a way to opt in
      - a chance to sample the update
      - a chance to roll back (not true in Adobe's case)
      - a chance to give feedback directly
      - a chance to understand what the hell just happened

    These are massive companies making groundbreaking changes like children. Make no mistake - these changes were not tested. Because if they were, they would know that the new version of Skype lags like hell. They would know that the new version of Spotify's web interface does not find the music you're looking for - even if you're fine with it not having 90% of the features it used to have. And you would know that when you write a text editing tool, then Shift+Arrow, Shift+Home/End and Ctrl+A should probably do something with the selection of the text you're currently working with, not scrub the timeline in a completely different window. These problems are beyond obvious and should never have made it past basic quality testing.

    These changes were likely unpopular internally already, because if they weren't, they would have been released as optional "cool new" alternative features (kind of what LastPass does with its revamped UI). They wouldn't be MASSIVE overnight changes that absolutely obliterate the user experience and are introduced prematurely (at best) with a 40-second marketing video. Nor are these changes immutable milestones on a timeline, because these companies have the means (I'm not necessarily talking about the will) to scrap a failed iteration. Neither are these products facing a myriad of expectations in terms of changes. In fact, while in Adobe's case progress was likely widely hoped for, none of these three products were flat out broken. Now they are.

    Can I find a friend or their playlists on Spotify? NO. Can I intuitively edit something as basic as text in one of the premier video editing tools in the world (pun somewhat intended)? NO. Can I choose a goddamn emoticon that is not in the empty most-recent list, in the most widely used (hey, I'm assuming!) messaging app in the world? HELL. NO. What about changing my online status then? Get outta here!
    I'll give Adobe SOME leeway here, as they've actually rolled out an update since then to partially alleviate their respective problems. But that's an update that should have been part of the update that broke their software in the first place. Like, what the hell? Nobody expects you to release updates on a predetermined schedule that YOU decide. Just make a better schedule and release software that doesn't break people's workflow and, in many cases, cause them to lose real money because of time lost learning how to sidestep your broken features. Or, you know, warn us about it...

    That being said, the lackluster and absolutely nerve-wracking changes both Spotify and now Microsoft have made are beyond any form of logic to me. I have one question to ask: why!? I mean, I'm a semi-advanced (or in these particular cases, a relatively expert) user. I know my settings and I love my options. Though I can also understand if you want to reduce clutter and make your thing more slick. Just give me that one option which spells out "Advanced" in itsy bitsy tiny letters. Don't resort to completely removing anything that might even remotely resemble a settings menu from your application. You still have the settings. They haven't gone anywhere. I know you have them, because you need to configure your bloody application somehow!

    And I'm not saying I was necessarily happy with Skype's interface before (okay, I thought it was convoluted and unwieldy to the point where it felt like it should be completely redone). But now they've torched everything and gone all the way back to 1970. Like... goddamn, you people... Stop! Just stop. And THINK for a moment.

    That being said - I'm not quite as mad about Spotify, because they transitioned their player from Flash to HTML5. It was still hostile to the consumer, but I do realize that many people use the web player to circumvent the paywall (heyho, Adblock), so that might have been their little revenge.
    But I'm going to fault Adobe (which is in the business of designing prosumer productivity tools) and Microsoft (which in this case is in a unique position to provide a service that is both well-established and heavily used) to the fullest.

    Most importantly, all these companies have recently made changes that are going to screw up any semblance of a nuanced software market in the future. These changes in popular products are all designed to dumb everything down to the point of "how straight can we make this curve without turning it into a line", and they are inevitably going to act like erasers on most consumers. They're going to dim our memory and eventually (which, let's be honest, is a matter of months to a few years) completely reform our expectation of what software is and what it can be. And I feel SO sorry for that, because the services these products purport to provide are actually fantastic.

    The set of applications that have lost their accessibility in favor of "accessibility" (usually at the expense of performance and, you know, accessibility) is ever-growing: Windows itself, the Office suite (which now even has the cool new feature of crashing randomly), proprietary Android distros (which are becoming more like iOS every day, but still fail to copy the most basic of the accessibility features they so desire to emulate, like searchable settings) and so forth. These changes are being force-fed to us, and feedback like this probably doesn't even make it near the review board. So, if anyone from the Skype product management team happens to read this - well done, you've not only screwed your customer base, you've undermined both your product and the future of software in general. Claps to you!

    Which brings me back to the question posed in the topic - given this trend, what is the future of software anyway?
  9. Font weights and thickness classification (in Freetype)

    Hm - I accidentally stumbled across the usWeightClass member inside the TT_OS2 structure, which seems to be present for most fonts (at least on my system). Statistically speaking, most other fonts either seem to be Regular (weight 400) in nature or can blindly be assumed to have the default weight.
  10. What is the standardized way to quantify font weights in FreeType? FT seems to expose the style name via FT_Face, which provides a string-based descriptive name, but I'm encountering a number of styles that do not have a one-to-one match on the weight scale used by WinAPI (I'm not sure where the weight scale originates from). For instance, "Narrow" does not seem to be a quantifiable descriptor. Nor does "Condensed". I presently have the following table, but it's incomplete (e.g. in addition to the two above-mentioned styles it lacks things like "Semilight", etc.):

        Extra Light = 100
        Ultra Light = 100
        Light       = 200
        Thin        = 200
        Book        = 300
        Demi        = 300
        Normal      = 400
        Regular     = 400
        Medium      = 500
        Semibold    = 600
        Demibold    = 600
        Bold        = 700
        Black       = 800
        Extra Bold  = 800
        Heavy       = 800
        ExtraBlack  = 900
        Fat         = 900
        Poster      = 900
        Ultra Black = 900

    In addition to this, the PostScript descriptor seems to have a weight member (though I cannot tell whether it uses the same scale as above), and there seems to be no entirely consistent way in which the literal style tokens are written in the first place ("Semi Light" != "Semilight"). Am I missing something obvious, or is there some voodoo involved here?
  11. C++ A correct way of writing this?

    And still, the description applies to the allocator proxy, not the container itself.
  12. C++ A correct way of writing this?

    Right. I might look into upgrading to 2017 after all then. The behavior was suspicious enough that I actually googled for a compiler bug and as noted above, apparently there's some truth to that. Thanks for confirming. is_standard_layout seems to cut it for now, although I'm unclear what the shortcomings might be in the grand scheme.
  13. C++ A correct way of writing this?

    The problem is that in the above test case I don't see where the code would be used :). I'm not omitting anything.
  14. C++ A correct way of writing this?

  14. Fair enough - that makes sense. Thanks for the heads-up, although I'm somewhat curious how this becomes invalid in practice so long as I use the derived class - the compiler doesn't seem to be bothered by it.

    Also, that being said, why does the code below generate a deleted-function compile-time error for both is_copy_assignable and is_trivially_copy_assignable:

        error C2280: 'myvec<int32> myvec<int32>::operator =(const myvec<int32> &)' : attempting to reference a deleted function

    is_trivially_copy_constructible yields false and is_copy_constructible yields true. My expectation would be that all of these cases would a) compile and b) yield false, on account of my explicitly disabling the copy semantics. As mentioned above, I'm using VS2013.

        template<typename T>
        class myvec {
        private:
            std::vector<T> detail;

        public:
            myvec() { }

            myvec<T> operator=(IN const myvec<T>& copy) = delete;
            myvec(IN const myvec<T>& copy) = delete;
        };

        class NonCopiableA {
            myvec<int32> vec;
        };

        ... (DOUTEX() is a debug output macro) ...

        DOUTEX(std::is_copy_assignable<NonCopiableA>::value);
        DOUTEX(std::is_trivially_copy_assignable<NonCopiableA>::value);
        DOUTEX(std::is_copy_constructible<NonCopiableA>::value);
        DOUTEX(std::is_trivially_copy_constructible<NonCopiableA>::value);
  15. C++ A correct way of writing this?

  15. I'm just going to assume that I'm extremely poor at communicating my problem at this point. This is exactly the thing I'm trying NOT to do. I want to automatically NOT generate code in a generic class that would handle non-copiables in a container, throwing instead if I accidentally do end up trying to do so. I feel like a really bad communicator here. How could I further clarify my question?