
Promit

Member Since 29 Jul 2001

#5302945 How Do You Deal With Errors On GPUs? Do You At All?

Posted by Promit on 28 July 2016 - 10:01 AM

I'm in the camp that says yes, (consumer) GPUs are not as reliable as CPUs. They take a lot of shortcuts and optimizations that generate nearly correct results. That said, a blue screen failure due to a specific operation is definitely out of the ordinary, though using Vulkan in these early days exposes you to it more than you otherwise would be. This actually does sound like something you should build a test case for and submit to AMD; a lot of these weird corner cases are dependent on the specific GPU, and they may not have noticed.

 

As far as handling errors - it is common to have workarounds for certain hardware configurations that are known to break. It is also common to have workarounds for particular drivers, particular operating systems, etc. These are all derived from testing before release (or afterwards...) to find out what works and what doesn't. What you don't have, though, is the ability to detect and handle GPU or driver errors in any sensible way. A blue screen is a kernel mode unhandled exception, and there is not a damn thing you can do about it after the bug has been invoked.
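To make the "workarounds for known-bad configurations" idea concrete, here's a minimal C++ sketch. In Vulkan you'd get the vendor from VkPhysicalDeviceProperties::vendorID; the vendor IDs below are the real PCI-SIG assignments, but the workaround flags themselves are purely hypothetical examples, not statements about actual driver bugs.

```cpp
#include <cstdint>

// Hypothetical per-vendor workaround flags. Real engines maintain tables
// like this, populated by pre-release (and post-release...) testing.
struct GpuWorkarounds {
    bool avoidPersistentMapping = false;  // skip a code path known to break
    bool flushAfterCompute      = false;  // insert extra barriers/flushes
};

// Select workarounds from the PCI vendor ID reported by the device.
GpuWorkarounds SelectWorkarounds(uint32_t pciVendorId) {
    GpuWorkarounds w;
    switch (pciVendorId) {
        case 0x1002:  // AMD
            w.flushAfterCompute = true;
            break;
        case 0x10DE:  // NVIDIA
            w.avoidPersistentMapping = true;
            break;
        case 0x8086:  // Intel
            w.avoidPersistentMapping = true;
            w.flushAfterCompute = true;
            break;
        default:
            break;  // unknown vendor: take no special paths
    }
    return w;
}
```

Note that this is all you can do: steer around bugs you already know about up front. Nothing in this pattern helps once the kernel-mode crash has actually fired.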




#5302705 you all say oculus rift but why not google glass?

Posted by Promit on 26 July 2016 - 09:53 PM

I have a HoloLens. It's a really cool toy, but attempting to sell this device to the general public would be a disaster and MS knows it. I think there's a wonderful future ahead for AR goggles once they get the tech and use cases nailed, but VR is here now and AR is just a prototype.

 

By the way, the truly surprising thing about the HoloLens isn't the holograms - it's the world tracking. I can lay down holograms in a couple of offices and the hallways, wander about freely, and it always knows exactly where I am. And it persists the world correctly at all times. No cameras or laser projectors or outboard sensors needed. This makes room-scale VR look like child's play.

 

Lastly, on the subject of VR: Vive is fantastic. The dual tracked controllers give you a real tactile connection to the world. Oculus, on the other hand, sucks due to not having them. (Although their optics might be better, I'm still deciding.) Being in VR and using an Xbox controller to play is weird, disconnected, and fundamentally not that fun. (The single-camera setup also means Oculus has some nasty occlusion issues.) Vive feels like what VR should be. Hopefully Oculus Touch will address the issue later this year.




#5300389 Posibility of getting to game industry?

Posted by Promit on 12 July 2016 - 09:09 AM

It wouldn't be devastating for a AAA company to invest a month in allowing a developer to learn C++ - the pay would probably reflect it. The lead physics programmer I work with was hired with zero C++ experience (early 2000s). I'm not going to say this is the case now but it's certainly happened in the past.

A month? A month of C++ experience is just enough to really get in the groove of breaking everything. You'll have to do detailed code reviews on every check-in for a good while. The early 2000s were a different time - industry C++ code was much closer to C, lots of people had C experience, and frankly the industry was small and much easier to get into. Each successive generation of tech has brought increased requirements for candidates as games themselves have evolved into extremely sophisticated systems.

 

There are a few exceptions for people who are not being hired for their programming prowess. Physics people were rare and massively in demand for several years from about 2002 onward - right when Havok was becoming a big name and HL2 was about to arrive. Even now they're in demand, but the good physics people all know C++ now :rolleyes: The same thing happened when the PS3 arrived: low-level optimization/SPU people, high-end GPGPU/shader people, etc. C++ was not required for those jobs because they were based on specialized, hard-to-find experience.

 

maths/physics

 
This is certainly the skill that I really am bad at.

 

That is something you will need to fix, no exceptions.




#5299844 Posibility of getting to game industry?

Posted by Promit on 08 July 2016 - 10:31 PM

 

 

C# is a great, productive language to get a lot of things done in. But C++ gets you the jobs, not C#.

 
Hi, just want to know if this is kind of true? Like, does the industry prefer C++ people?

 


Promit knows whereof he speaks.

 

When it comes to the game industry, it's simply an expected and generally required part of your skill set. It's been the standard for a very long time now, and there's a reasonably healthy supply of capable candidates with that feather in their cap. If someone comes in with solid C++ abilities and not much else, it's assumed they'll pick up other things as needed. Whether that's justified or not, it is a deeply ingrained, widespread attitude in the industry. It's never assumed that a junior hire will just pick up C++, so the company basically plans to train you for a while before you're productive. And if I'm choosing between the candidate I have to train and the candidate who is ready to go out of the gate, well... it can be overcome, but it's easier just to have that knowledge up front.

 

This comment is ONLY applicable to the game industry. There are many types of software development that don't have this expectation.




#5299400 Is DirectXMath thread safe?

Posted by Promit on 06 July 2016 - 04:58 PM

Don't all of the DirectXMath functions operate on local by-value parameters and return by value? They don't know anything about threads or references, they're just functions operating on data. Anything YOU do with assigning or reading thread-shared variables is on you, and no different from anything else.
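To illustrate why pure value-in/value-out functions are inherently thread-safe, here's a small self-contained C++ sketch. The Vec4 and Add below are simplified stand-ins for DirectXMath's XMVECTOR and XMVectorAdd, not the real types; the point is the calling pattern, not the library.

```cpp
#include <thread>
#include <vector>

// Stand-in for an XMVECTOR-style value type: plain data, no hidden state.
struct Vec4 { float x, y, z, w; };

// Pure function in the style of XMVectorAdd: it reads only its by-value
// arguments and writes only its return value. Two threads calling this
// simultaneously cannot interfere with each other.
Vec4 Add(Vec4 a, Vec4 b) {
    return { a.x + b.x, a.y + b.y, a.z + b.z, a.w + b.w };
}

// Each thread works on its own inputs and its own output slot; the only
// "shared" object is the result array, but the slots are disjoint, so no
// synchronization is required.
std::vector<Vec4> ParallelAdd(const std::vector<Vec4>& a,
                              const std::vector<Vec4>& b) {
    std::vector<Vec4> out(a.size());
    std::vector<std::thread> threads;
    for (size_t i = 0; i < a.size(); ++i)
        threads.emplace_back([&, i] { out[i] = Add(a[i], b[i]); });
    for (auto& t : threads) t.join();
    return out;
}
```

Where you get into trouble is storing results into a variable that other threads also read or write; that store is on you to synchronize, exactly as with any other shared data.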




#5299352 Are there too many Unity Developers?

Posted by Promit on 06 July 2016 - 11:34 AM

At your stage in the game, producing something is vastly more important than what you use to make it. Use whatever gets you to the goal of a completed project.

 

Long term, "Unity" or "CryEngine" or "Unreal" alone is not generally sufficient to get a job; you need a portfolio of projects AND significant primary skills, either art or programming.




#5299328 Posibility of getting to game industry?

Posted by Promit on 06 July 2016 - 09:22 AM

2nd. As a person wanting to get into the game industry, is it right to aim for the bigger game dev companies, like EA or SE?

Sure. One of the positive things about the game industry is that it's wonderfully neutral to your upbringing, background, education, etc. If you can do the job, you'll probably get the job. I feel like the economic downturn of recent years has made it a bit harder than it used to be, but it's absolutely doable.

Will continuing C++ be a good asset for me? Or should I just use C#, which a lot of indie game devs use?
C# is a great, productive language to get a lot of things done in. But C++ gets you the jobs, not C#.

3rd. What are the chances that in my mid-20s I'll be able to get into the game industry? How high would you say they are?
If you can learn enough to keep up, then you'll get a job. It'll be entry level for sure, and it will be challenging, but it's not an unrealistic goal. 


#5298917 Porting OpenGL to Direct3D 11 : How to handle Input Layouts?

Posted by Promit on 03 July 2016 - 01:57 PM

Won't claim this is the best/only way of doing this, but I define a "Geometry Input" object that is more or less equivalent to a VAO. It holds a vertex format and the buffers that are bound all together in one bundle. The vertex format is defined identically to D3D11_INPUT_ELEMENT_DESC in an array. In GL, this pretty much just maps onto a VAO. (It also virtualizes neatly to devices that don't have working implementations of VAO. Sadly they do exist.) In D3D, it holds an input layout plus a bunch of buffer references and the metadata for how they're bound to the pipeline. This is as simple as can be:

struct GeometryInput
{
    ID3D11InputLayout* _layout;                 // input layout built from the vertex format (plus shader bytecode)
    std::vector<ID3D11Buffer*> _vertexBuffers;  // one buffer per bound input slot
    std::vector<UINT> _strides;                 // per-buffer vertex stride, in bytes
    std::vector<UINT> _offsets;                 // per-buffer starting offset, in bytes
};

That's more or less sufficient to emulate a VAO for the base case. You can create duplicate input layouts for each input, or build a hashed cache and look them up. There's one nasty catch, which is the shader bytecode. I decided to make it optional to supply a shader when creating one of these objects (but mandatory to supply the vertex format array). If you do supply a shader, the GL path ignores it and the D3D path uses the stored bytecode. If you don't supply one, I generate a fake shader matching the supplied vertex format and compile bytecode from it. This sounds annoying but is actually really easy to do.
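As a sketch of that fake-shader trick, here's hypothetical C++ that builds a dummy HLSL vertex shader source string from a simplified element list; in a real D3D11 path you would feed the result to D3DCompile and hand the bytecode to CreateInputLayout, which only validates the input signature. The VertexElement struct and the format-to-type mapping are simplified assumptions for illustration, not the actual engine code.

```cpp
#include <string>
#include <vector>

// Simplified stand-in for D3D11_INPUT_ELEMENT_DESC: just the pieces a
// dummy shader needs. Real code would map each DXGI_FORMAT to an HLSL type.
struct VertexElement {
    std::string semantic;  // e.g. "POSITION", "TEXCOORD"
    int semanticIndex;     // e.g. 0 for TEXCOORD0
    std::string hlslType;  // e.g. "float3", "float2"
};

// Build a do-nothing vertex shader whose input struct matches the vertex
// format element-for-element. Compiling this yields bytecode with the
// right input signature for input layout creation.
std::string BuildDummyShader(const std::vector<VertexElement>& elems) {
    std::string src = "struct VSInput {\n";
    for (size_t i = 0; i < elems.size(); ++i) {
        const VertexElement& e = elems[i];
        src += "    " + e.hlslType + " f" + std::to_string(i) +
               " : " + e.semantic + std::to_string(e.semanticIndex) + ";\n";
    }
    src += "};\n"
           "float4 main(VSInput input) : SV_Position\n"
           "{\n"
           "    return float4(0, 0, 0, 1);\n"
           "}\n";
    return src;
}
```

The generated shader is never executed; it exists only so the compiler produces an input signature matching the format array.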




#5298707 Creating OpenGL context in different OpenGL driver versions

Posted by Promit on 01 July 2016 - 10:33 AM

So for example, if you specify version 4.5 in the profile struct but the driver responds with version 4.1, then this is what you could get.

That's not legal behavior. The spec is very clear that the implementation is not allowed to respond with a version less than requested in the attribs submitted to wglCreateContextAttribsARB.
 
 

The same is true if you want a core profile, where deprecated functions are removed, or a compatibility one, where you can still use glBegin, glEnd, ...

If you ask for a core profile, the implementation is required to give you a core profile. Incidentally, a core profile has a slight negative performance impact on NVIDIA and probably on the other major manufacturers too. It's not recommended to use it unless you have a concrete reason.
 

The reason is that if you specify a specific version, the driver is going to load that version's function code into process memory, and then you could access that version's extensions.

No driver I've ever seen works that way. The driver is a single monolithic implementation which supports some particular version of OpenGL. It will load in its entirety and respond with that version of OpenGL no matter what you request during context creation. (Mac might be a bit different since it only has a few versions, I can't remember now.) It wouldn't be practical to slice the versions into separate libraries and defer loads, as that would create more problems without any benefits.




#5298595 Creating OpenGL context in different OpenGL driver versions

Posted by Promit on 30 June 2016 - 12:52 AM

Here's a modern day OpenGL context creation on Windows. It's a goddamn mess, because you have to create a context to ask what kind of context you can create. But that's how it's set up. You will probably want to read the documentation page for WGL_ARB_create_context for all the gross details.

	//Create a basic OpenGL context
	g_hRC = wglCreateContext(g_hDC);
	if(!g_hRC)
	{
		GlobalLogger->Write(LP_Critical, "Failed to create any OpenGL context at all.");
		KillGraphics();
		return false;
	}

	wglMakeCurrent(g_hDC, g_hRC);

	//Attempt to create a new GL 3.0+ context
	int flags = 0;
	int profile = WGL_CONTEXT_CORE_PROFILE_BIT_ARB;
	if(ap_OpenGLForwardCompat)
		flags |= WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB;
	if(ap_GraphicsDebug)
		flags |= WGL_CONTEXT_DEBUG_BIT_ARB;
	if(ap_OpenGLCompat)
		profile = WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB;
	glewInit();
	if(wglewIsSupported("WGL_ARB_create_context"))
	{
		int attribs[] =
		{
			WGL_CONTEXT_MAJOR_VERSION_ARB, ap_OpenGLMajorVersion,
			WGL_CONTEXT_MINOR_VERSION_ARB, ap_OpenGLMinorVersion,
			WGL_CONTEXT_FLAGS_ARB, flags,
			WGL_CONTEXT_PROFILE_MASK_ARB, profile,
			0
		};

		//New context is available, spool it up and destroy the old one
		wglMakeCurrent(NULL, NULL);
		wglDeleteContext(g_hRC);
		g_hRC = wglCreateContextAttribsARB(g_hDC, 0, attribs);
		if(!g_hRC)
		{
			logger.WriteFormat(LP_Critical, "Failed to create OpenGL %d.%d context.", ap_OpenGLMajorVersion, ap_OpenGLMinorVersion);
			KillGraphics();
			return false;
		}

		wglMakeCurrent(g_hDC, g_hRC);

		glGetIntegerv(GL_MAJOR_VERSION, &ap_OpenGLMajorVersion);
		glGetIntegerv(GL_MINOR_VERSION, &ap_OpenGLMinorVersion);
		logger.WriteFormat(LP_Info, "Created OpenGL context version: %d.%d%s", ap_OpenGLMajorVersion, ap_OpenGLMinorVersion, 
			ap_OpenGLCompat ? " (compatibility)" : "");
	}
	else
	{
		logger.WriteFormat(LP_Warning, "Using OpenGL legacy context version: %s", glGetString(GL_VERSION));
	}



#5298485 Dear people who actually work at videogame industry:

Posted by Promit on 28 June 2016 - 09:05 PM

IMO you should focus all your efforts on getting into a US university. Long story short, someone without a degree from Mexico is unemployable outside Mexico. (I don't know how it is within the country itself.) And if you're thinking about leaving high school, you will never work a tech job. It doesn't matter who you are or what you've done. No one will even talk to you.

 

If someone came to me for a job without a degree, I would point blank ask them: "Why didn't you do the degree?" If it's a challenge with money or an unusual life situation or something, that wouldn't be a major issue for me. But if the answer is that they couldn't be bothered, it felt like a waste of time, etc, I would show that candidate the door. There's no room in a professional setting for someone who won't face up to work because of general lack of interest. I'm also not interested in hiring anyone who doesn't have experience working with other people as a team.

 

In a broader sense... I worked in the industry for a year without a degree. But that was in the healthy job market of 2007, with three years of a degree done and a clear plan to return. Oh, and years of tinkering with game development, twin professional internships at major companies, deep technical knowledge (for my age), several working projects and open source work, and some article writing work too. Nowadays I'm not sure I could get a job in this market with the background I had then.

 

P.S. The above applies only to programmers. You need to understand this: artists are 100% unemployable for games without a full four year art degree. End of story. There is NO wiggle room on this one. The industry is literally drowning in talent who have done incredible work alongside their degrees, there's no reason to even entertain candidates who skipped it.




#5298274 Link for Graphics Research Papers

Posted by Promit on 27 June 2016 - 01:17 PM

More still: http://gamedevs.org/




#5296834 MagicLeap teams up with LucasArts

Posted by Promit on 16 June 2016 - 12:16 PM

They always make some great looking videos, but I'm very much a rubber-meets-the-road kind of guy. We met with a rep from ML recently, so I'm hoping to get some eyes-on time in the near future. The HoloLens is super cool but the field of view is a real buzzkill. And as if that weren't enough, there's still Meta 2 in the mix.

 

In pretty much all of the VR and AR cases, the tech and hardware have well outstripped the content in both concept and execution. We have all these amazing toys, but everyone seems just a bit confused about why. We're buying them on the strength of what we individually imagine them to be. This feels the same - LA and ML are opening a joint research center, but with pretty vague goals.




#5296270 CreateWindow limiting to 2560x1440

Posted by Promit on 12 June 2016 - 08:47 PM

Are you single-mon or multi-mon? If multi-mon, are all monitors at the same scaling? This page has a lot of detail:

https://msdn.microsoft.com/en-us/library/windows/desktop/dn469266(v=vs.85).aspx

Apparently they want you to use the app manifest, but you said you already did that. Also try disabling scaling on just your app from Windows properties and running it:

https://technet.microsoft.com/en-us/library/dn528847.aspx

If that works, then your DPI awareness declaration isn't taking hold, for whatever reason.




#5296176 CreateWindow limiting to 2560x1440

Posted by Promit on 12 June 2016 - 12:00 AM

You probably want to start by marking your game as DPI aware.

https://msdn.microsoft.com/en-us/library/ms701681(v=vs.85).aspx
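For reference, the standard way to declare this is the application manifest fragment below (the asm.v3 windowsSettings section), embedded alongside or into your exe; there's also a programmatic route via SetProcessDPIAware, but the manifest is what Microsoft recommends.

```xml
<!-- yourapp.exe.manifest: declares the process as DPI-aware, so Windows
     stops virtualizing the resolution and bitmap-scaling the window. -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <application xmlns="urn:schemas-microsoft-com:asm.v3">
    <windowsSettings>
      <dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
    </windowsSettings>
  </application>
</assembly>
```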





