
Developing a Graphics Driver I

Well, I've finished my first week. I want to write about what it's like to work on developing a graphics driver. First, we need to cover some basic architecture. Then I'll talk about the actual process. And remember, I'm on the DirectX driver team, so I'm going to focus on that. (I looked at OpenGL too, but I'm less clear on the architecture.) Not surprisingly, most development is currently focused on Windows Vista. We need to bring the Vista driver up to par with the XP driver, both in terms of correctness and performance.

If you take a D3D application, it's actually sitting on top of several layers. The first layer is the D3D runtime itself. What sits underneath that changes somewhat between XP and Vista. In XP, D3D is on top of a kernel mode driver (KMD), which interacts with the HAL and a miniport driver (also kernel). There's a splitting of responsibilities between the various pieces. D3D essentially accumulates and validates information, and then forwards it to the kernel driver at appropriate times. (Draw calls, present calls, etc.) The miniport driver handles low level communication details. Everything else is handled by the main kernel driver. The driver is very much on its own here, and has a relatively direct connection to the application.
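To make that XP-era split concrete, here's a toy Python model of the flow just described: the runtime validates and accumulates state, then pushes the accumulated work down to the kernel driver at a draw call. This is purely illustrative; every class and method name here is invented, and real driver interfaces look nothing like this.

```python
# Illustrative model (NOT real driver code) of the XP-era layering:
# the D3D runtime accumulates and validates, then forwards work to
# the kernel-mode driver at draw/present time. All names are invented.

class KernelModeDriver:
    """Stands in for the vendor KMD that talks to the hardware."""
    def __init__(self):
        self.submitted = []

    def submit(self, commands):
        # In XP the KMD itself decides how memory and DMA are handled.
        self.submitted.append(list(commands))

class D3DRuntime:
    """Stands in for the runtime: validate, accumulate, forward."""
    def __init__(self, kmd):
        self.kmd = kmd
        self.pending = []

    def set_state(self, name, value):
        if value is None:
            raise ValueError(f"invalid state for {name}")  # validation
        self.pending.append(("state", name, value))

    def draw(self, vertex_count):
        if vertex_count <= 0:
            raise ValueError("nothing to draw")  # validation
        self.pending.append(("draw", vertex_count))
        # A draw call is one of the points where accumulated work
        # gets forwarded down to the kernel driver.
        self.kmd.submit(self.pending)
        self.pending.clear()

kmd = KernelModeDriver()
d3d = D3DRuntime(kmd)
d3d.set_state("texture", 7)
d3d.draw(36)
print(len(kmd.submitted))  # prints 1: one batch reached the kernel driver
```

The point of the sketch is only the shape of the pipeline: validation lives up top, and the kernel driver sees coherent batches rather than individual state changes.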

In Vista, things shifted around and the split changed. The main graphics driver no longer lives in the kernel, but rather in user mode as a normal DLL which D3D loads for you. D3D communicates with this layer through a series of calls and callbacks. There's an equivalent, separate layer for OpenGL. Both the OGL and D3D user mode drivers communicate with a common kernel mode layer. The KMD interacts with Vista's backend graphics scaffolding. In particular, Vista takes over on two fronts: resource management and GPU scheduling. In XP and earlier, the driver was solely responsible for deciding how allocations and deallocations were placed and managed, as well as how DMA was handled and when the GPU did what. All of those responsibilities have been taken over by the OS* now. Vista provides the scheduling, virtual memory, and so on. The driver negotiates with Vista over that, and it's worth noting that direct pointers are no longer manipulated, because physical addresses are not always available to the driver. (Quite rarely, actually.) Kernel mode switches in this setup only happen when information needs to be communicated down to the kernel layer; this happens only when the user mode command buffer is full, when a present happens, or in a couple of other situations which force the KMD to become involved.
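The "kernel switches only on flush" behavior can be sketched in a few lines. Again, this is a made-up model, not WDDM code: a user mode driver records commands into a buffer, and only a full buffer or a Present forces a transition into kernel mode.

```python
# Toy sketch of the Vista-era flow described above (invented names,
# not the WDDM API): the user mode driver fills a command buffer and
# crosses into kernel mode only when the buffer fills or on Present.

BUFFER_CAPACITY = 4  # deliberately tiny, to make flushes visible

class UserModeDriver:
    def __init__(self):
        self.buffer = []
        self.kernel_transitions = 0

    def _flush(self):
        if self.buffer:
            self.kernel_transitions += 1  # one switch down to the KMD
            self.buffer.clear()

    def record(self, cmd):
        self.buffer.append(cmd)
        if len(self.buffer) >= BUFFER_CAPACITY:
            self._flush()  # buffer full: forced kernel transition

    def present(self):
        self.record("present")
        self._flush()  # a present always reaches the kernel layer

umd = UserModeDriver()
for i in range(6):
    umd.record(("draw", i))
umd.present()
print(umd.kernel_transitions)  # prints 2: one full-buffer flush + the present
```

Six draws plus a present cost only two kernel transitions here; under the XP model every one of those calls would have had a path into the kernel driver.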

It should be clear by this point that the interactions between the various components are more complex under Vista. It shouldn't be a surprise, of course -- virtual memory, scheduling, etc weren't going to come for free. For the most part, the existing NVIDIA driver architecture seems to have fit in quite well; I assume that the designs of the ATI and NVIDIA drivers were both major factors in the design of the new Windows Display Driver Model (WDDM, formerly LDDM for Longhorn). Admittedly I don't know what things looked like pre-Vista, but right now it all looks fairly sane. The behavior of the driver isn't that different between XP and Vista; the main difference is that under Vista, some of the pieces aren't written by us anymore.

One particularly uncomfortable question is whether or not Vista is inherently slower than XP when it comes to graphics. That's a sticky discussion, and like all performance related discussions, it's hideously complex and intertwined. All I can really tell you is that Vista is for the most part slower than XP right now -- benchmarks by hardware sites have established that quite soundly. There are many issues to be worked through, both in our drivers and in Vista itself. We have plenty of tuning to do, and MS has some mistakes to fix. I suspect that things will get much, much better after Vista SP1 is released, thanks to work on the part of both companies. (Also, I can't share the expected timeline for that. Sorry.) Things are getting better every day and will continue to improve.

That's enough for now. In the next part, I'll discuss the actual process of working on the driver.

* This is described in internal documentation under the sub-heading: "All Your Bufs are Belong to OS".
Recommended Comments

Shouldn't the Vista version eventually be faster? The move of the graphics driver to user mode should provide a big speed increase for each of the API calls that would normally have crossed the user -> kernel mode boundary in DirectX 9. The virtualization is a penalty, but if a benchmark is performed apples to apples (i.e. XP with a single fullscreen app using the graphics card vs. Vista with a SINGLE fullscreen app using the graphics card - no fancy UI to render to simultaneously) does it still fall behind?

Is it your expectation that the speed would eventually be better in Vista? I thought that was one of the whole reasons for the major overhaul of the driver system, to improve the efficiency of API calls. Please let me know if I am wrong - it's not like I am a graphics driver author for Nvidia or anything [grin]

I like the sound of where your journal is going [grin]

My interpretation of the WinHEC slides (2006?) was that the WDDM 1.0 included in Vista RTM was almost a placeholder. It checked the boxes but didn't really unleash the true potential - WDDM 2.0/2.1/3.0 were where/when this piece of the puzzle would come into play. Promit - I'd be curious as to whether you agree with this given your new insight (provided you can comment).

As for the Vista SP1 release. That's OLD news [razz]


Original post by Jason Z
Shouldn't the Vista version eventually be faster?
It's a tricky question. You can argue that the extreme cheapness of draw calls under Vista should give a nice boost. But remember that games are designed for XP. We have engines batching heavily and aggressively to minimize state changes and draw calls. So that tends to counter the effect somewhat.
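A back-of-the-envelope model shows why heavy batching shrinks the benefit of cheaper draw calls. The per-call overheads and call counts below are made-up numbers chosen only to illustrate the shape of the argument, not measurements of either OS.

```python
# Hypothetical numbers: cheaper draw calls help most when a frame makes
# many of them; an engine already batched aggressively for XP issues few
# calls per frame, so the absolute saving shrinks.

def frame_call_overhead_us(draw_calls, per_call_us):
    """Total per-frame overhead attributable to draw call cost."""
    return draw_calls * per_call_us

XP_CALL_US = 10.0     # assumed per-call overhead on the XP path
VISTA_CALL_US = 2.0   # assumed (cheaper) per-call overhead on Vista

# A naive engine making 5000 draw calls per frame:
naive_saving = (frame_call_overhead_us(5000, XP_CALL_US)
                - frame_call_overhead_us(5000, VISTA_CALL_US))

# A heavily batched engine making only 300 calls per frame:
batched_saving = (frame_call_overhead_us(300, XP_CALL_US)
                  - frame_call_overhead_us(300, VISTA_CALL_US))

print(int(naive_saving), int(batched_saving))  # prints: 40000 2400
```

Under these assumptions the naive engine saves 40 ms of call overhead per frame while the batched one saves 2.4 ms: the cheaper calls are real, but engines tuned for XP have already engineered most of that cost away.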

I do think Vista will run the same games faster, eventually. The drivers are still evolving and exploring the new model, so it's all a little touch and go. Like I said, there are plenty of changes coming down the line, and things should be really solid before too much longer.

Promit - I'd be curious as to whether you agree with this given your new insight (provided you can comment).
I don't know the details of what's coming in later WDDMs, as I'm not directly involved in that work. It also ties into D3D 10.1, which is coming with Vista SP1.

I hear WDDM introduces a mandatory copy operation of hardware command buffers. So the command buffer authored by the user-side driver is copied by the kernel-side driver before use by the hardware.

Isn't this a terrible overhead that can't be worked around with WDDM?

