Oberon_Command

Member · Rank: Contributor · Role: Programmer · Interests: Design, Programming
Content count: 4295 · Days won: 2 (last won the day on October 20) · Community reputation: 6283 (Excellent) · 1 follower

  1. If the machines you're having poor performance on are laptops, make sure they're actually using their dedicated graphics card. Many laptops that ship these days have two graphics cards - an integrated GPU on the motherboard and a dedicated GPU. Sometimes OS settings cause apps to run on the integrated card, which usually doesn't perform nearly as well as the dedicated GPU.
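On Windows, hybrid-GPU drivers also honor a pair of exported symbols that ask for the dedicated GPU at process start. A minimal sketch of the widely documented NVIDIA Optimus / AMD PowerXpress convention (guarded so the file also compiles on non-Windows toolchains):

```cpp
// Hint hybrid-GPU drivers to prefer the dedicated GPU for this process.
// These exported globals are read by the NVIDIA Optimus and AMD
// PowerXpress drivers when the executable is loaded.
#ifdef _WIN32
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
#else
// Non-Windows builds: plain globals so the translation unit still compiles;
// the values have no effect outside Windows.
extern "C" {
    unsigned long NvOptimusEnablement = 0x00000001;
    int AmdPowerXpressRequestHighPerformance = 1;
}
#endif
```

Users (and IT departments) can also force this per-app in the NVIDIA Control Panel or the Windows graphics settings; the exports just make the dedicated GPU the default choice.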
  2. Oberon_Command

    Global Variables

    Try to minimize those dependencies as much as possible, right down to the function level. Does your class need a thing you want global? Try to make it so that only a few of its functions do. That way you can pass the "global" in as a function argument. If class A has functions foo and bar, but only bar needs a reference to class B, don't make class A store a reference to class B!

    Try to keep your function call hierarchies as shallow as practical. Instead of A calling B that calls C, try to rework your code so that A, B, and C are called from the same place, in sequence. This way, if C has a dependency that A and B do not, you don't have to pass that dependency to A and B.

    Have a section of your code that exists just to glue things together, and put your global objects there. Your main event loop or some kind of "app" class is the place I would typically put this. You would invoke your actual business logic from the glue code, without the business logic knowing more than it absolutely needs to about the glue code.
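The advice above can be sketched in a few lines. The names here (AudioDevice, Player, enqueue, playAll, runApp) are invented for illustration, not from the post; the point is that the would-be "global" is passed only to the one function that needs it, and the glue layer owns it:

```cpp
#include <iostream>
#include <string>
#include <vector>

// The thing you might be tempted to make global.
struct AudioDevice {
    void play(const std::string& clip) { std::cout << "playing " << clip << "\n"; }
};

struct Player {
    std::vector<std::string> queue;

    // enqueue() has no audio dependency, so it takes none.
    void enqueue(std::string clip) { queue.push_back(std::move(clip)); }

    // Only playAll() needs the device, so it arrives as a function
    // argument instead of being stored as a member of Player.
    void playAll(AudioDevice& audio) {
        for (const auto& clip : queue) audio.play(clip);
        queue.clear();
    }
};

// The "glue" layer: the shared objects live here, and the business logic
// is invoked from here in sequence rather than through a deep call chain.
void runApp() {
    AudioDevice audio;
    Player player;
    player.enqueue("intro.wav");
    player.playAll(audio);
}
```

Because Player never stores the AudioDevice, testing it or reusing it with a different device is just a matter of passing a different argument to playAll.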
  3. Oberon_Command

    Diablo:Immortal

    Alright, incomplete analogy. Regardless, it's still looking a lot like pointless whining to me and it's still coming across as childish to me. Perhaps you could engage with the actual point rather than nitpicking my bad analogies. HOW does adding a new game to the franchise destroy the memory of it?
  4. Oberon_Command

    Diablo:Immortal

    I asked a simple question. How does Blizzard releasing a mobile game kill the memory of what Diablo "was"? The games that created those memories are still around and are still being sold, and it's not as though Blizzard can somehow reach into your mind and remove your memories of PC Diablo. I flat-out don't see how what you're saying logically follows.

    Frankly, all I see is whining that you (plural) aren't the complete center of Blizzard's attention with regard to one of their major franchises - JTippetts even admitted as much - and consequently you are all coming across like a child whining that their younger sibling got a new toy for Christmas. Is that really how you want to be seen? I had faith that folks here were more mature than the gaming "hoi polloi", though that faith is being tested right now.
  5. Oberon_Command

    Diablo:Immortal

    How? Last I looked, Blizzard was still selling Diablo 2 and 3.
  6. Oberon_Command

    Diablo:Immortal

    "Climb down off your smug high horse, ffs. A gamer doesn't suddenly put on their cold robot suit just because they also become a developer."

    Most of the ones I know did. They have to. Putting on a "cold robot suit" (assuming I understand what you mean by this) is the most straightforward way to deal with abusive rhetoric from their own fans, the loudest of whom are entitled shitheads who don't understand how the game development process works and make constant demands that have nothing to do with technical or business realities. Many developers live in active fear of their own fanbase. If I ever start publishing my side projects, I'm not sure I won't release my games anonymously, just to avoid that harassment nonsense. We live in a world where Reddit did its best to get a developer fired and the employer caved in and fired the developer. Perhaps you can forgive me for being startled and worried to see a fellow developer echoing the exact same kind of angry rhetoric that the entitled shitheads are using.

    On the mobile thing - I expect young teenagers, as much as the Asian markets, are the target audience here. My employer sent a few people to an industry event a little while ago where loads of kids came to see a booth we had. We discovered - but should really have expected, considering the way things are going - that these days kids under 10 don't know what a computer mouse is or how to use one. They expect everything to be a touchscreen, because in their world, everything is a touchscreen. Blizzard is, presumably, aiming at the kids who grew up with touchscreens - the Fortnite demographic - rather than us, so in my mind this is all whining about the fact that Blizzard isn't catering to your whims with this particular product.
  7. "It is key. Here, successful gamedev companies are able to offer 3x-10x the salary of companies in other industries, and 2x that of the top local banks, which have self-developed, world-class banking software. But all of the local gamedev companies have self-made proprietary engines, so they primarily need engine developers - positions that locally trained bachelors can't get into."

     That's quite different from here, then. I know someone who was making (or claimed to be making) $100k+ straight out of university, with a bachelors, by working for a large tech company. I interviewed with a company that was willing to pay me barely half that amount as a junior dev (I did not take the job; I got the impression that this particular company didn't respect developers in general). My understanding was that most gamedevs (at least the ones who don't live in Silicon Valley or the Bay Area) don't break the $100k level until they have considerable seniority. But game developers don't often talk about salary with one another, so it can be hard to say. Still, everything I have heard indicates that the finance and medical industries pay substantially better than the game industry.

     This is now 4 years out of date, but is probably still somewhat relevant: https://www.gamasutra.com/view/news/221533/Game_Developer_Salary_Survey_2014_The_results_are_in.php

     As you can see, the management layer makes the most money. I would expect this pattern to hold across North American industry in general.
  8. Worth noting that I was mainly talking about the game industry, as that's the industry I have the most experience with. Other parts of the software industry may well be a bit different. YMMV on whether being a gamedev is a "good job," of course, considering that gamedevs are often paid less than other software developers.

     I definitely didn't mean to suggest that having a degree is irrelevant. There are certainly employers who will say they won't take anyone without a degree. However, my impression is that the degree requirement is mainly for entry-level positions, and hard degree requirements are not something you can assume are universal. Someone with 10+ years of experience and no degree would be preferable to someone with a degree and only 3 years of experience. Degree requirements are usually imposed not by the developers at the company themselves, but by HR or management, and if you can point to a list of games that you actually shipped, you may well be able to bypass them.

     Now, masters degrees I would consider almost useless, as my expectation is that a bachelors is sufficient to bypass the majority of HR filters. I'm also not entirely sure what the average software developer would do in a masters degree that would make them more useful to a software company than someone with only a bachelors. But a degree will help get you in the door of the industry and will doubtless improve your chances of reaching a high seniority level.
  9. Curious - is that required by law, where you are? It sounds like the engineering laws we have here, but of course those don't apply to software, and I've never heard of a government setting down a law that software developers have to have a particular qualification. I suspect the game industry wouldn't have gotten all that far off the ground if we had that here. Too many foundational members of the industry started out selling games out of their basements (figuratively speaking).
  10. Not sure where OP lives, but I can confirm that it isn't like that here in North America. Work experience is more important than having any degree, unless you're very junior and have little work experience. I have coworkers of all ages who don't have degrees. Some of them even dropped out of university because they were offered full-time jobs as software developers before graduation.
  11. Here, those are generally different faculties of the same institution. To be clear, I'm talking about computer science programs, since that's the program I went through. Engineering programs might be a bit better at the actual teaching part, but "software engineering" isn't really "engineering" according to the engineering faculties, so stuff that's software-related is usually left to the math or science faculties. The faculty of computer science is in itself a scientific research institution. I believe this arrangement is quite common in North America.
  12. It's valuable experience, I don't question that, but the kind of experience I'm talking about is best attained by working with teams of dozens of people on projects of at least 50-100 thousand lines of code that are going to be extended and maintained for years. Do those semester projects measure up to that standard? If so, awesome - your school was better than mine at the "prepping students for the workplace" thing, because the "semester projects" I did sure didn't measure up.
  13. That sounds a lot like our co-op programs, which unfortunately are not mandatory. Would be good for students if they were! One thing to note is that, at least in North America, universities are research institutions. The primary job of a professor is often not teaching students, but conducting research, and some of them have little interest in the teaching part. Some students forget this (or never learn it in the first place) and expect professors to hold their hand throughout their classes when it comes to learning. This is a bad attitude to take; if you do go to university and don't take responsibility for your own learning, you will get very little from it. My school's motto was "Tuum est" - "It's yours", in the sense of "It's your responsibility".
  14. Okay, I can agree with this story, as you present it - but the claim I'm specifically taking issue with is that 5 years of university will make you into a "god-level" programmer. It won't. It takes university (or equivalent studying) and work experience to become a genuinely good developer. Some people are quite bright and quite dedicated and can get the equivalent base without the formal degree, but one still needs to get the base somehow.
  15. State of the art when it comes to things like graphics and deep learning research, sure - but judging by my own experience plus all the interview experiences I've been hearing about from my coworkers who interview junior devs, definitely not state of the art when it comes to software engineering or modern C++. Most grad students I've known went into grad school straight from their undergrads, with no actual work experience in their field. I have no doubt that for some of them it was an attempt to dodge the "real world" for as long as they could. Perhaps it's different in Europe.