
Fredericvo

Member Since 06 Mar 2012
Offline Last Active Yesterday, 09:05 PM

Topics I've Started

Async asset loading

13 June 2015 - 08:39 AM

I've been wondering how loading assets while the game loop is running might work, and I can only think of a few methods, all with pitfalls.

Which one of these, if any, is correct?

 

1. The Device can be used from another thread, so just create and destroy textures, vertex buffers, etc. from a worker thread.

2. The DeviceContext is NOT thread-safe, but does that mean no other thread may touch it under any circumstances, or only that it mustn't touch the currently used textures and buffers? E.g. can I still map/unmap an unused texture while other ones are bound to the pipeline? (I have a bad feeling about this.)

This would allow keeping a pool of ID3D11ShaderResourceViews that get reused.

 

3. Is there a way a different DeviceContext can load assets and share/transfer them to the first one? (Sounds icky.)

 

I do know how to use ReadFile and GetOverlappedResult asynchronously, but what good is the ability to load large image files that way if I cannot also create the actual assets asynchronously?

The only code examples I've seen used something like C++/CX and weren't really that helpful, although from the high-level view the code gave me, I got the impression that a combination of ReadFile and option 1 was used. They didn't pass FILE_FLAG_OVERLAPPED though; instead they seem to spawn a thread manually with this kind of semantics I'm unfamiliar with: task<Platform::Array<byte>^>

I'm talking about this for example: https://msdn.microsoft.com/en-us/library/windows/apps/jj651549.aspx
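An editor's note on option 1: since ID3D11Device methods are documented as thread-safe, the usual pattern is to create resources on a worker thread and hand only the finished objects to the main thread, which alone touches the immediate context. Below is a minimal sketch of that hand-off with stand-in types (FakeTexture, AssetLoader are made up for illustration); the D3D11 calls themselves are stubbed out, and this sketch supports one outstanding load per loader.

```cpp
// Worker thread "creates" the resource (mirrors calling
// ID3D11Device::CreateTexture2D off-thread); the main thread drains a
// mutex-protected queue once per frame and is the only one that would
// ever bind the result via the immediate context.
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <utility>
#include <vector>

struct FakeTexture {              // placeholder for a created GPU resource
    std::string name;
    std::vector<unsigned char> pixels;
};

class AssetLoader {
public:
    // Called from the main thread; the actual creation runs on a worker.
    void LoadAsync(std::string name) {
        worker_ = std::thread([this, name = std::move(name)] {
            FakeTexture tex{name, std::vector<unsigned char>(64, 0xFF)};
            std::lock_guard<std::mutex> lock(mutex_);
            ready_.push(std::move(tex));
        });
    }

    // Called once per frame from the main thread: collect finished assets.
    bool TryPop(FakeTexture& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (ready_.empty()) return false;
        out = std::move(ready_.front());
        ready_.pop();
        return true;
    }

    void Join() { if (worker_.joinable()) worker_.join(); }

private:
    std::thread worker_;
    std::mutex mutex_;
    std::queue<FakeTexture> ready_;
};
```

In a real engine the lambda body would do the overlapped ReadFile plus the CreateTexture2D/CreateShaderResourceView calls; only the queue hand-off needs locking.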

 

 


Extremely weird, illogical bug. Part of false if clause executed.

28 May 2015 - 09:09 AM

I'm trying to find the high-powered device in an Optimus system (Intel HD Family + AMD 6470M).

For this I tried the following code:

// We are going to iterate the adapter array to find the one with the largest memory size.
// This might be a simplistic way to find the high powered device though.
UINT memorySize = 0;
UINT adapterIndex = 0;
WCHAR* cardName = 0;
for (UINT i = 0; i < adapterArray.size(); ++i)
{
    result = adapterArray[i]->GetDesc(&adapterDesc);
    if (FAILED(result))
    {
        MessageBoxA(hwnd, "Cannot get video card description.", "DXGI", MB_OK);
        return false;
    }

    if (adapterDesc.DedicatedVideoMemory > memorySize)
    {
        cardName = reinterpret_cast<WCHAR*>(&adapterDesc.Description);
        memorySize = adapterDesc.DedicatedVideoMemory;
        adapterIndex = i;
        OutputDebugStringA("if clause executed; larger memory size found\r\n");
    }
}
The second adapter on that system is the Intel HD, with less memory.
The if clause is successfully skipped, as the first adapter already has the largest memory and is assumed to be the high-powered device.
Yet part of my if clause, the line that sets the name pointer, does seem to take effect.
(In the running engine you see the Intel HD name but the larger memory size mixed together, which prompted me to debug this.)

I assume the compiler really hates my coercion via reinterpret_cast, as I initially got compiler errors, hence its use, but would that explain this bug? The string is set correctly at first, to the AMD as it should be, but then changes to the Intel, which is the bug.
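An editor's note, not from the original thread: one plausible explanation is that the if clause is never re-executed at all. cardName points *into* adapterDesc, and GetDesc overwrites adapterDesc on every iteration, so by the end the pointer reads whatever description was written last (the Intel's), while memorySize still holds the AMD's value. The fix is to copy the string out rather than keep a pointer into the reused struct. A minimal sketch with stand-in types (Desc, PickAliased, PickCopied are made up for illustration):

```cpp
// Desc stands in for DXGI_ADAPTER_DESC; `scratch` plays the role of the
// single adapterDesc variable that GetDesc rewrites each iteration.
#include <string>
#include <vector>

struct Desc {
    char Description[128];
    unsigned DedicatedVideoMemory;
};

// Buggy pattern: keeps a pointer into `scratch`, which is reused.
const char* PickAliased(const std::vector<Desc>& adapters, Desc& scratch) {
    const char* name = nullptr;
    unsigned best = 0;
    for (const Desc& d : adapters) {
        scratch = d;                      // mimics GetDesc(&adapterDesc)
        if (scratch.DedicatedVideoMemory > best) {
            best = scratch.DedicatedVideoMemory;
            name = scratch.Description;   // points into scratch!
        }
    }
    return name;  // reflects whatever was written *last*, not the winner
}

// Fixed pattern: deep-copies the winning name, immune to later overwrites.
std::string PickCopied(const std::vector<Desc>& adapters, Desc& scratch) {
    std::string name;
    unsigned best = 0;
    for (const Desc& d : adapters) {
        scratch = d;
        if (scratch.DedicatedVideoMemory > best) {
            best = scratch.DedicatedVideoMemory;
            name = scratch.Description;   // copies the characters
        }
    }
    return name;
}
```

In the original code the equivalent fix would be e.g. wcsncpy into a buffer the loop owns (the reinterpret_cast is a separate smell: taking &adapterDesc.Description yields a pointer to the array, so plain adapterDesc.Description decays to WCHAR* without a cast).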

IG + Nvidia, IG faster

26 May 2015 - 12:03 PM

I had access to my uncle's laptop for a week as he wanted me to do some maintenance, so I thought it'd be fun to see how my Direct3D11 apps would run somewhere other than my own laptop. This is where I'm running into the typical dual-video-card situation, with an Intel 3000 + Nvidia, and I ran into two issues.

First, when DXGI enumerates adapters it can only find one, the integrated one. Second, if I use the Nvidia Control Panel to force my app onto the Nvidia, it still seems to run on the Intel (I output the card name) and runs VERY slowly (yet prints 63 fps, an odd value with vsync, but oh well).

My code is based on Rastertek's tutorials with modifications, although not to his DX init code. Any ideas what may be happening?
Is this a common situation with a simple fix?

Modern cartoon shading that's not cel shading

01 April 2015 - 10:09 AM

Some time ago my little nephew was watching a cartoon I had never heard of (Raving Rabbids), and I was thinking: wow, that kind of shading would be perfect for the kind of games I would like to make, especially on mobile.

One has to choose a niche that fits. AAA-style MASSIVELY MORPG games will obviously forever remain beyond my capabilities and resources, puzzle/board games aren't my cup of tea, and 2D isn't what I want to aim for since I have a reasonable knowledge of 3D. So small 3D adventures with a cartoonish look are exactly what I want to do (small as in the size of the game world, not necessarily the whole scope, which can be divided into manageable chunks with game states, as the old SNES Zelda did).

I've covered all the Rastertek tutorials and am in the process of completely refactoring that codebase into something more reusable and flexible; in short, creating a modest but somewhat capable 3D engine. I'm comfortable with HLSL as well.

 

The kind of shading I want to achieve can be seen here, in case you don't know that cartoon (though here it's also a game):

 

Now for the actual question (TL;DR)

 

When I say cartoonish I don't mean cel shading, which doesn't resemble the style I'm after at all. As you can see it's far from "flat", and there aren't necessarily black outlines. Is the shading style I want actually a form of processing, or is it just that the artists use brightly coloured textures while the shaders are still plain good ol' Blinn-Phong, shadow maps, point lights, etc.?
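For reference, here is the "plain good ol'" Blinn-Phong specular term the question mentions, sketched as scalar C++ (the per-pixel HLSL version is structurally identical); the Vec3 helpers are made up for this sketch.

```cpp
// Blinn-Phong specular: instead of reflecting L about N (classic Phong),
// raise N.H to a shininess exponent, where H is the half vector between
// the light direction L and the view direction V.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// N: surface normal, L: direction to light, V: direction to viewer.
float BlinnPhongSpecular(Vec3 N, Vec3 L, Vec3 V, float shininess) {
    Vec3 H = Normalize({L.x + V.x, L.y + V.y, L.z + V.z});  // half vector
    float nDotH = Dot(Normalize(N), H);
    if (nDotH < 0.0f) nDotH = 0.0f;                          // clamp backside
    return std::pow(nDotH, shininess);
}
```

A "soft cartoon" look is often exactly this lighting model applied to bright, hand-painted textures, sometimes with the diffuse term remapped through a ramp; nothing about it requires cel-shading outlines.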

 

As somewhat of a second question, some effects you see puzzle me a bit too. When one character flexes his muscles (0:32) you see a left-to-right, white-coloured swipe (0:37 - 0:38) that briefly highlights them. Maybe it's just a strong specular effect with the light moving left to right, or is it something more sophisticated? And the fireball (0:19) lights its surroundings a bit; a (screen-space?) point light or something, I guess?

 

Thanks for clarifying things a bit for me.


VS 2013 Community Edition while keeping VS 2012 Express

27 February 2015 - 07:27 AM

I'm slowly being tempted to move from VS 2012 to VS 2013 Community Edition, but I don't want to do so if it implies uninstalling VS 2012 Express, as I'd want to be sure that all my past projects still compile fine.

Are there any conflicts between both versions or can you safely install the newer version into its own folder?

Is it even necessary to worry that some projects wouldn't compile without involved settings modifications, paths, etc.?

 

 

EDIT:

 

Sorry, don't bother; I finally found the clear & concise answer that I was looking for.

If only the legalese could do the same :P

 

Q: Can you install Visual Studio Community 2013 and Visual Studio Express editions on the same machine?

 

A: Yes.

