Promit

Senior Moderators
  • Content count

    15706
Community Reputation

13246 Excellent

About Promit

  • Rank
    Moderator - Graphics and GPU Programming


  1. If you're porting and building a Mac OS X bundle app, you'll have some file handling rework ahead of you. First, I recommend that you add your assets as references rather than directly in the project, because Xcode will preserve folder structure for referenced folders while flattening the hierarchy of files that are included directly in the project. Second, you will need to follow these instructions to get a real file path: https://stackoverflow.com/questions/8768217/how-can-i-find-the-path-to-a-file-in-an-application-bundle-nsbundle-using-c After that you'll be able to use fopen as normal. While we're on the topic, I'll point out that this approach will not work on Android, where you have to go through explicit platform-specific IO APIs to load content embedded in your app.
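    A rough sketch of that lookup using CoreFoundation from C/C++ (my own illustration, not taken from the linked answer; the resource name, extension, and subdirectory are made up for the example):

      #include <CoreFoundation/CoreFoundation.h>
      #include <cstdio>
      #include <climits>

      // Hypothetical example: resolve data/level1.dat inside the app bundle, then fopen it.
      FILE* OpenBundledAsset()
      {
          CFBundleRef bundle = CFBundleGetMainBundle();
          CFURLRef url = CFBundleCopyResourceURL(bundle, CFSTR("level1"), CFSTR("dat"), CFSTR("data"));
          if (!url)
              return nullptr;

          char path[PATH_MAX];
          FILE* file = nullptr;
          if (CFURLGetFileSystemRepresentation(url, true, (UInt8*)path, sizeof(path)))
              file = fopen(path, "rb");   // regular stdio from here on

          CFRelease(url);
          return file;
      }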
  2. Redefine all string literal types in my program

    MessageBox(0, L"Whatever", 0, 0); requires UNICODE to be defined in the preprocessor to compile correctly, which is irritating. I want my code to build correctly with the minimum amount of configuration. Using the actual function name also has the added perk that IntelliSense gives you a real function declaration to work with.
  3. Redefine all string literal types in my program

    I wish. No, you will have to prefix everything with L. And don't bother using the TEXT macro at all, as it's pointless. Personally I explicitly call the W versions of Windows API functions - always MessageBoxW, never MessageBox. I am not on board with the macro horrors MS manufactured.
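    To illustrate, a minimal sketch (my own example, not from the thread) of calling the W function explicitly so the code builds the same way whether or not UNICODE is defined:

      #include <windows.h>

      int main()
      {
          // Call the wide-character entry point directly; no dependence on the
          // UNICODE macro and no TEXT()/TCHAR machinery. Literals just get the L prefix.
          MessageBoxW(nullptr, L"Whatever", L"Caption", MB_OK);
          return 0;
      }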
  4. I'm not sure there's an actual problem at all. And these days I'm very much against creating a solution before there's a specific problem being solved.
  5. This is true in the world where your game lives on a hard drive. Some of us still remember the world where the game lives on optical media which practically mandates this type of packaging to have any semblance of control over read patterns. If you can block out a package to load a bunch of assets in one long sequential read to get through loading from optical, it makes a world of difference. Probably not relevant to the question at hand of course, but I don't want to lose the plot of why some of these things were developed in the first place.
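    As a rough illustration of that idea (the names and file layout are made up here, nothing from the original discussion): write all the assets for one load group contiguously in the package, so loading becomes one seek plus one long sequential read rather than a pile of per-file seeks.

      #include <cstdio>
      #include <cstdint>
      #include <vector>

      // Hypothetical pack layout: all assets for a load group are written back to back,
      // so an optical drive can stream the whole group without seeking between files.
      std::vector<char> LoadGroupSequential(FILE* pack, uint64_t groupOffset, uint64_t groupSize)
      {
          std::vector<char> blob(groupSize);
          fseek(pack, (long)groupOffset, SEEK_SET);   // one seek to the start of the group...
          fread(blob.data(), 1, groupSize, pack);     // ...then one big sequential read
          return blob;                                // individual assets are sliced out of the blob afterwards
      }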
  6. Going to take a slightly different tack here. I've had a couple of informal discussions and it appears that executives increasingly consider custom engines/tech to be a liability. This is different from years past. It used to be that an engine was considered a very large but viable investment. In other words, a lot of studios used licensed tech mostly because they couldn't afford to build out custom systems, but there was generally a sense that there was indeed value in doing so. In 2017, however, it's considered a poor use of funds to even build custom tech at a studio level, regardless of whether they can afford it or not. It's only worthwhile, in management's eyes, at a publisher level where the engine can be shared across a lot of teams. That's the energy behind Frostbite, for example. There are some studios which have long term investments in custom tech, although publishers are largely working to share those codebases across their member studios.

    Otherwise, execs don't want to have custom tech on hand. It's considered a problem that can be farmed out to someone else, with minimal gain to doing it yourself. Having your own engine means a permanent staff of X people there, some of whom can't be replaced at any cost because of how well they know that system. Managers don't like irreplaceable staff. They also don't like long term variable costs. Training/onboarding new hires is harder. Hell, hiring engine devs has itself become difficult. Homebrew mobile tech is so rare as to be effectively nonexistent in the marketplace.

    What that really leaves us with is a rather small group of studios who are building tech as a significant differentiating factor. Games in this category include No Man's Sky and Ashes of the Singularity. While there are arguments to be made both ways, I personally dislike the increased homogenization that is the inevitable result of centralizing on a handful of technology packages. There are consequences from architectural decisions that impact the final look and feel of games, more so than people often wish to admit.
  7. University or job experience?

    The game industry is relatively forgiving in substituting experience for a degree - but not just any random work experience. If I have an applicant who doesn't have a degree, I need to be assured of a few things:
      • There's a good reason for not having a degree. Money (in the US) is a valid one. "I didn't think it was useful" is not.
      • The work experience and independent projects are sufficient to produce a capable engineer in general.
      • The actual abilities and skillsets are directly relevant to the job at hand.
      • There is intellectual potential for continued growth and advancement.

    In that light, I have some comments.

    "It's two years of formation, VERY practical, it doesn't enter into abstract things like algebra." Red flag. Algebra (and linear algebra) are not abstract nonsense; they're absolutely foundational. If you can't demonstrate basic mathematical competence, then I have zero interest in hiring you for any programming position.

    "Right now, I'm doing a 1 month Unity course in my city, organized by an university, very solid (I was given a scholarship)." While this is a perfectly good way to achieve personal growth, it has no relevance to employment.

    "The design teacher (a really good professional) told us that he can help us to get a job as QA tester in a videogame company." While I have some friends who have followed this route, I don't advise it. In general, I'd rather see someone doing non-game programming work than spending time in the QA trenches just to say they were in games. QA is a better route for those who have minimal relevant skills of any sort.

    "Also, I've given a job offer for Java Programmer in a consulting company. It's not videogames, but it's programming experience. That's a safer route." Unfortunately, Java consulting is very unlikely to get you into the game industry. It's a good way to make ends meet while doing independent projects that can demonstrate your abilities, though. To be candid, I get the distinct vibe that foundational Java programming is all you know. That is not even remotely sufficient to get a games job.

    While both the university and job experience routes are ultimately viable, neither one automatically gets you there. In either situation, you're going to need to do substantial independent work to be considered qualified for a game programmer job. Generally the university path is much more likely to give you a smooth entry into the industry, as it raises fewer questions about your abilities and decisions. The main reason for not doing so in the US is the financial challenge of going to a university. If you're going to attempt to use job experience instead, it should either be because you need to make ends meet or because the job experience is strongly relevant to the game industry.
  8. There are occasional bugfixes in GitHub but we haven't done a fully packaged release basically since MS stopped doing "DirectX" releases. I'm happy to merge this change, but at this stage we tend to encourage people to do their own builds. I've been thinking about doing a modernized version of the library (DX11/12, XA2, XI against current languages and libs) but it hasn't materialized yet.
  9. Where is this Hot Teacher Trend coming from?

    Insert South Park reference here
  10. In general, the reason for different types of seemingly similar resources is that at least one major IHV has (potentially legacy) fast-path hardware that differentiates between them. There are a number of buffer types which perform differently on NV GPUs while AMD's GCN GPUs simply don't care. You're seeing hardware design issues leaking through the software abstractions. Ideally, we would just have buffers read by shaders and nothing else, not even textures. (I mean come on, texture buffers?) GPUs haven't reached that level of generalized functionality yet. MS originally pitched this design when they were sketching out D3D 10 and of course the IHVs explained it wasn't feasible.
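    For a concrete (and hedged, D3D11-flavored) illustration of what I mean, here's the same shader-visible data created as a structured buffer versus a typed buffer; which of the two hits the fast path has historically depended on the vendor:

      #include <d3d11.h>

      // Sketch only: create 64 KB of shader-readable data two different ways.
      // Both are just "a buffer the shader reads"; the distinction mostly reflects
      // hardware fast paths rather than anything fundamental.
      void CreateBufferVariants(ID3D11Device* device)
      {
          D3D11_BUFFER_DESC desc = {};
          desc.ByteWidth = 65536;
          desc.Usage = D3D11_USAGE_DEFAULT;
          desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

          // (a) StructuredBuffer<float4> in HLSL
          D3D11_BUFFER_DESC structuredDesc = desc;
          structuredDesc.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
          structuredDesc.StructureByteStride = 16;
          ID3D11Buffer* structuredBuf = nullptr;
          device->CreateBuffer(&structuredDesc, nullptr, &structuredBuf);

          // (b) Buffer<float4> in HLSL; the element format goes in the SRV instead
          ID3D11Buffer* typedBuf = nullptr;
          device->CreateBuffer(&desc, nullptr, &typedBuf);

          // (SRV creation and Release calls omitted for brevity.)
      }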
  11. why are people using c# over java

    While people have covered the social side of things, I'm going to jump in and claim that C# is overall a better technical choice of language than Java. Yes, I went there. Java was designed first and foremost as safety scissors, a tool for people who couldn't be trusted not to hurt themselves. And honestly, that's true for most developers, particularly in the web and client/business app space. There was absolutely no desire to expose any "system level" functionality. It was meant to be a simple, sandboxy, safe environment to do most of the boring every day software development that makes the world tick. While C# partly shares this worldview as well, both the language and the underlying runtime were designed with the option to step outside that box, as long as you do it explicitly. (Notably, VB.NET was not designed this way.) There's a lot more capability in C# to manipulate memory, integrate with native libraries, control allocation, and do a lot of the direct manipulation of buffers that is inappropriate for most types of apps but is crucial for graphics code in particular and to some extent game code in general. It's the relative ease of doing many common game and graphics programming tasks that has made C# preferable here and in the industry in general. It's not that you can't do things in Java, but it always feels like you're fighting the language, working through kludges like FloatBuffer to get things done.
  12. I exhibited at this conference in 2016, immediately adjacent to the prize winning VR Spacewalk. At one point, two actual astronauts who had spacewalked previously came by and tried it out. I can very much vouch for the value and credibility of FoST.
  13. Worldwide ransomware cyber attack

    Oh ho, seems things are getting a bit more interesting still: "Fearing Shadow Brokers leak, NSA reported critical flaw to Microsoft"
  14. Worldwide ransomware cyber attack

    There's some evidence supporting this view, while the other two are purely speculation. There is no need to speculate on this point, as MS has a well-established source code access program which goes out to many different organizations. For that matter, I personally had full Windows source access. Of course, China is simply using hardware and software where they added the backdoors themselves, so it's not particularly helpful to those who would like neither China nor the US to have access to their systems. And no, before someone invariably brings it up, going to open source doesn't even remotely address the problem. All backdoors are broken eventually... At the end of the day, the US government requires significant visibility into systems running all kinds of operating systems and software, whether the parties responsible for that software are cooperative or not. That includes a variety of foreign and non-consumer equipment. This means that they have to have a major program to penetrate all of those systems, and we know factually that they do exactly that. Once you invest in all of that infrastructure, there is essentially no need to coerce Microsoft into adding or protecting vulnerabilities (which weren't present in W10 in the WCry case, by the way). You already have everything, and you have it on your own terms. The conspiracy theory adds a bunch of extra idiotic steps for no reason. Spooks are nothing if not ruthlessly efficient.
  15. Worldwide ransomware cyber attack

    What complete and utter nonsense. Take your conspiracy mongering garbage somewhere else. This was a system bug like anything else, including Heartbleed. Entirely false. System components are now checksummed, admin privileges are not assigned by default, and there aren't significant configuration problems. While these things CAN be circumvented, the circumvention approaches are just as effective on other operating systems. We live in a world where it's now likely possible at any given time to attack a Linux server running on a VM, jump the Xen hypervisor, and take over the host. We have SEEN these bugs being sold in the wild. The registry does not work that way. I said it already, but the permission system is a perfectly robust ACL-based design shared by many other systems. I'm more concerned that you might think the old owner/group/user octal permission system is a good thing, which would be a shockingly lax security approach. In what way, exactly? You don't know, do you. Come back when you can explain why it's somehow less separated than Linux or OSX. Are you just making shit up now? No, it wasn't. It was assigning admin access to all users by default, which was bad. That's no longer the case, and the exploits we see in the wild are privilege escalations that exist in some form or another on all operating systems. Not only is this wrong, it's also not how the exploits today work. Because it's wrong. That is not happening, save a few cases where program installers are deliberately assigning bad permissions to their own files. I've seen that all the time on Linux boxes. No, it's not. The kernel is one file and it loads dynamic drivers pretty much just like Linux loads drivers. Yeah, that's called a privilege escalation exploit. They happen to every OS. Yes, they're bad. No, they're not at all the same thing as users having admin permissions. You don't know the first fucking thing about how NHS databases are configured. You don't know the first thing about how medical systems are configured. Frankly, a lot of the companies that put these systems together don't understand security in the first place, and no OS could save them from their own idiocy. These are frequently people who would have a chmod -R 777 in their install script if it were a unix-style platform. Go back to Slashdot or whatever random hole you crawled out of to waste our time.