DirectX phoning home?

ggambett
Scary stuff. Apparently the DirectX runtime tries to phone home (crl.microsoft.com) at game startup, which obviously frightens players (a firewall alert isn't nice!). Google says it's Direct3D 9 trying to verify that the certified video drivers still have valid WHQL certificates. After some more googling I've found some ways to disable it manually from Windows, but this isn't an acceptable solution for users. Is there a way to programmatically disable this "feature"? BTW I'm looking for a D3D 7.0 compatible solution.

Because it slows down startup (1-2 secs best case, up to 20 secs if there is no internet connection), and because players (especially casual game players, my customers) are scared to death of a firewall dialog popping up. I want them to buy my game, not to run away screaming "OMG SPYWARE" :)

Quote:
Original post by ggambett
Because it slows down startup (1-2 secs best case, up to 20 secs if there is no internet connection), and because players (especially casual game players, my customers) are scared to death of a firewall dialog popping up. I want them to buy my game, not to run away screaming "OMG SPYWARE" :)

Given that players don't seem to be running away screaming from every other D3D game, why would they from yours?

Guest Anonymous Poster
Quote:
Original post by ggambett
Scary stuff. Apparently the DirectX runtime tries to phone home (crl.microsoft.com) at game startup, which obviously frightens players (a firewall alert isn't nice!). Google says it's Direct3D 9 trying to verify that the certified video drivers still have valid WHQL certificates.

After some more googling I've found some ways to disable it manually from Windows, but this isn't an acceptable solution for users. Is there a way to programmatically disable this "feature"? BTW I'm looking for a D3D 7.0 compatible solution.



In D3D9, you have to opt into this check on GetAdapterIdentifier() using the D3DENUM_WHQL_LEVEL flag. On D3D8, you had to opt out of it using the D3DENUM_NO_WHQL_LEVEL flag. I don't believe the check was even there on D3D7.
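
For illustration, here's a minimal, untested sketch of the D3D9 side (the function name QueryAdapterWithoutWhql is just a placeholder). Passing 0 for Flags skips the certificate lookup entirely; passing D3DENUM_WHQL_LEVEL opts into it. On D3D8 it's reversed: pass D3DENUM_NO_WHQL_LEVEL to opt out.

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

void QueryAdapterWithoutWhql()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return;

    // Flags = 0: no WHQL check, so no crl.microsoft.com traffic.
    // Flags = D3DENUM_WHQL_LEVEL: the runtime validates the driver's
    // WHQL certificate, which can hit the network.
    D3DADAPTER_IDENTIFIER9 id;
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    d3d->Release();
}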

Paul

I'm curious - do you have an example of a game that does this? I've never come across it... ever. And I play a lot of games.

Our latest one does ( http://www.playfirst.com/game/piratepoppers ), and by googling around I've found many other references, such as the Rome: Total War demo ( http://www.gamespot.com/pc/strategy/rometotalwar/download.html?sid=6105478 ) and a couple of others.

So you're saying that you have D3D7-based code that is phoning home?

What Paul said, and what I think too, is that D3D7 doesn't; I'm pretty sure from my own memory that the check was new with D3D8 and later. That is, it shouldn't be an issue with D3D7.

Jack

Believe me, my Direct3D7 code does. I think D3D7 itself didn't do the check, so its API has no way to disable it. D3D8 and D3D9 do the check, and the corresponding D3D8 and D3D9 APIs do have ways to disable it. Apparently the problem is using the D3D7 API on a machine with the D3D8 or D3D9 runtime installed.

Fortunately, not all D3D7 code does this (on the same machine), so I'm beginning to think that D3DX, which I use, does something most people don't do manually, and that's what triggers this behavior.

Quote:
Original post by ggambett
Fortunately, not all D3D7 code does this (on the same machine), so I'm beginning to think that D3DX, which I use, does something most people don't do manually, and that's what triggers this behavior.

I suppose it's possible, but doubtful. What are you using from D3DX? Since you're using DD7, it would have to be the math stuff; otherwise you'd be using actual D3D9 stuff.

I use D3DX for initialization (D3DXCreateContext()), a little bit of math, and texture uploading. Actual drawing (just quads - this is a hardware-accelerated 2D engine) uses the Device object.

I guess it's D3DXCreateContext(). The D3D8 and D3D9 flags that control this behavior, the ones Paul listed above, are for GetAdapterIdentifier(), which I'm not calling directly, but D3DX probably does. Doing my own init without calling GetAdapterIdentifier() may fix the problem - something like the sketch below.
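
A rough, untested sketch of what that D3DX-free init could look like (InitD3D7 and the g_* globals are placeholder names; this assumes a windowed-mode setup and omits most error handling):

#include <ddraw.h>
#include <d3d.h>
#pragma comment(lib, "ddraw.lib")
#pragma comment(lib, "dxguid.lib")

static IDirectDraw7        *g_dd;
static IDirectDrawSurface7 *g_primary;
static IDirectDrawSurface7 *g_back;
static IDirect3DDevice7    *g_dev;

HRESULT InitD3D7(HWND hwnd, DWORD width, DWORD height)
{
    if (FAILED(DirectDrawCreateEx(NULL, (void **)&g_dd, IID_IDirectDraw7, NULL)))
        return E_FAIL;

    g_dd->SetCooperativeLevel(hwnd, DDSCL_NORMAL);   // windowed mode

    // Primary surface: the visible screen.
    DDSURFACEDESC2 ddsd;
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize = sizeof(ddsd);
    ddsd.dwFlags = DDSD_CAPS;
    ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
    if (FAILED(g_dd->CreateSurface(&ddsd, &g_primary, NULL)))
        return E_FAIL;

    // Offscreen back buffer for the 3D device to render into.
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize = sizeof(ddsd);
    ddsd.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_3DDEVICE;
    ddsd.dwWidth = width;
    ddsd.dwHeight = height;
    if (FAILED(g_dd->CreateSurface(&ddsd, &g_back, NULL)))
        return E_FAIL;

    // IDirect3D7 comes straight off the DirectDraw object, and no
    // GetAdapterIdentifier-style call is made anywhere, so nothing
    // here should trigger the WHQL lookup.
    IDirect3D7 *d3d = NULL;
    if (FAILED(g_dd->QueryInterface(IID_IDirect3D7, (void **)&d3d)))
        return E_FAIL;
    HRESULT hr = d3d->CreateDevice(IID_IDirect3DHALDevice, g_back, &g_dev);
    d3d->Release();
    return hr;
}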

Quote:
Original post by ggambett
Solution: it was indeed D3DXCreateContextEx(). I replaced the init with plain DDraw7/D3D7 code and the problem went away.

Hmm... interesting. I heard once that older DD interfaces were actually routed through newer D3D interfaces, supposedly to ease backwards compatibility in hardware. If this is true, it could be that a D3D device is created internally, and that device has the driver-checking functionality.

