Lighting problem on Intel HD Graphics 3000

11 comments, last by Phil UK 87 11 years, 1 month ago

Hi guys, I'm working on a project that uses DirectX 9 to render a detailed 3D scene, and I have a problem. The program was developed for machines with Nvidia graphics, on which it works correctly, but it now needs to be altered to run on a laptop with Intel HD Graphics 3000 (2nd-gen i7).

It seems that lighting isn't working at all: textures are visible, but they look as though they're being lit with flat white ambient light instead of the lights in the scene. Everything that isn't textured appears white (invisible against the white background), although some edges of objects are partially visible. Does this sound like a familiar problem to anyone? Any advice would be greatly appreciated. I've already tried setting the program to load only one object and that didn't change anything, so it doesn't seem to be the amount of 3D data that's causing the problem.


Unfortunately, this can happen for a lot of reasons. Your best bet is to bring up your scene in PIX and see exactly what is going wrong in the shader.

Perception is when one imagination clashes with another

Also, if you haven't already, update the Intel graphics driver :)

The drivers commonly pre-installed by OEMs are usually very poor in terms of features, even if they are otherwise stable.

It is also worth noting that Intel integrated graphics have never been - and likely never will be - as feature-rich as discrete GPUs from Nvidia or AMD, even though later versions of D3D do try to fix the feature-parity issue.

Niko Suni

Thanks for the help. I couldn't get PIX to work with the program; I think I'd have to strip it down to just the DirectX code, which would take ages because it's a massive project and the DirectX code is mixed in with lots of other classes.

I already updated the graphics driver which didn't change anything. Nik, I'm not sure what you mean by "later versions of D3D", do you mean that I should update the DirectX runtime? i.e. from here: http://www.microsoft.com/en-us/download/details.aspx?id=8109

One more thing: the renderer isn't particularly complicated, in that it just uses diffuse lighting and some texturing - no shaders as far as I can tell. It's just that there is a massive amount of geometry (hundreds of MBs of .x files). Because of that, I suspect the programmers who worked on it before did something to optimize the renderer for Nvidia cards, which could be why the Intel GPU has trouble with it. Unfortunately, the other programmers moved on a while ago, so I can't ask them what they did.

Can you post your device creation code?
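For comparison, a minimal D3D9 device creation sequence looks something like the sketch below - hwnd (the window handle) and all the parameter choices here are assumptions, not your project's actual values. The vertex-processing flag is the part I'd look at first, since fixed-function lighting runs on the vertex-processing path.

#include <d3d9.h>

IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

D3DPRESENT_PARAMETERS pp = {};
pp.Windowed               = TRUE;
pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
pp.BackBufferFormat       = D3DFMT_UNKNOWN;   // take the current desktop format
pp.EnableAutoDepthStencil = TRUE;
pp.AutoDepthStencilFormat = D3DFMT_D24S8;

// If the app requests D3DCREATE_HARDWARE_VERTEXPROCESSING unconditionally,
// it should first confirm the caps report D3DDEVCAPS_HWTRANSFORMANDLIGHT.
IDirect3DDevice9* device = NULL;
HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                               D3DCREATE_HARDWARE_VERTEXPROCESSING,
                               &pp, &device);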

Perception is when one imagination clashes with another

Try analysing a frame with Intel GPA (Intel's version of PIX/Nsight). It's pretty stable, and should give you a good idea of what is going wrong.

Assuming it's not a driver issue, it's most likely some assumption your code is making - perhaps some undefined state that you're relying on, which just happens to be OK on most graphics cards. Or maybe (given the description of your problem) some problem with setting shader constants. The sketch below shows the kind of state I mean.
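For instance - purely a sketch, not your actual code - explicitly setting the lighting-related render states instead of trusting their defaults rules out one class of problem; the ambient colour here is an arbitrary assumption:

// Set everything the lighting path depends on; don't rely on defaults.
device->SetRenderState(D3DRS_LIGHTING, TRUE);                      // fixed-function lighting on
device->SetRenderState(D3DRS_AMBIENT, D3DCOLOR_XRGB(32, 32, 32));  // dim ambient, not full white
device->SetRenderState(D3DRS_NORMALIZENORMALS, TRUE);              // guard against scaled normals
device->SetRenderState(D3DRS_COLORVERTEX, FALSE);                  // use material colours, not vertex colours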

Have you tried enabling the DirectX debug runtime and seeing if it spits out any interesting messages?
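For D3D9 you switch the debug runtime on in the DirectX Control Panel that ships with the SDK; in a debug build you can also define D3D_DEBUG_INFO before the header include so the D3D objects carry extra debug information - a tiny sketch:

// Debug builds only: this must appear before d3d9.h is included.
#if defined(_DEBUG)
#define D3D_DEBUG_INFO
#endif
#include <d3d9.h>

The runtime's warnings then appear in the debugger's output window.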

"the renderer isn't particularly complicated in that it just uses diffuse lighting and some texturing, no shaders as far as I can tell"

No shaders? Is it using the fixed function pipeline?
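If so, your symptoms (everything white and unlit) are exactly what a broken fixed-function light/material setup produces. For comparison, a minimal setup looks roughly like this - the light direction and material values are placeholder assumptions:

// One white directional light, pointing straight down (placeholder direction).
D3DLIGHT9 light = {};
light.Type        = D3DLIGHT_DIRECTIONAL;
light.Diffuse.r   = light.Diffuse.g = light.Diffuse.b = 1.0f;
light.Direction.x = 0.0f;
light.Direction.y = -1.0f;
light.Direction.z = 0.0f;
device->SetLight(0, &light);
device->LightEnable(0, TRUE);
device->SetRenderState(D3DRS_LIGHTING, TRUE);

// Without a material, lit untextured geometry often comes out black or white.
D3DMATERIAL9 mtrl = {};
mtrl.Diffuse.r = mtrl.Diffuse.g = mtrl.Diffuse.b = mtrl.Diffuse.a = 1.0f;
mtrl.Ambient = mtrl.Diffuse;
device->SetMaterial(&mtrl);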

If the geometry is extremely massive (as you say), you may be hitting the limit on how many primitives can be drawn in one call - you can check this against the device caps, as sketched below.
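A sketch of the check, assuming device is your IDirect3DDevice9 and that primCount/maxIndex come from your own mesh data:

D3DCAPS9 caps;
device->GetDeviceCaps(&caps);

// primCount: primitives in one DrawIndexedPrimitive call (assumed variable);
// maxIndex: highest vertex index referenced by that call (assumed variable).
if (primCount > caps.MaxPrimitiveCount || maxIndex > caps.MaxVertexIndex)
{
    // Split the draw into smaller batches - exceeding these caps is
    // undefined, and different drivers fail in different ways.
}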

By "later versions" I mean D3D10 and later; in contrast to D3D9, the newer APIs aim to unify the feature set across same-era hardware, while the performance is the biggest varying factor between low and high end.

Niko Suni

The reason could be a lack of the required shader model. Maybe the GPU has no PS 2.0/3.0 support, which can't be emulated.

The latest Intel GPUs work correctly, I think.
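That's easy to verify from the caps - a quick sketch, assuming the IDirect3D9 interface has already been created as d3d:

D3DCAPS9 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
{
    // No PS 2.0 - any SM2 shaders the renderer carries can't run here.
}
if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
{
    // PS 3.0 available.
}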

"The reason could be a lack of the required shader model. Maybe the GPU has no PS 2.0/3.0 support, which can't be emulated. The latest Intel GPUs work correctly, I think."

The Intel HD 3000 is DX 10.1 capable - I don't think we need to be concerned with this as a possible explanation.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

This topic is closed to new replies.
