DX maybe dead before long....

Started by
41 comments, last by Ravyne 13 years, 1 month ago

[quote name='MARS_999' timestamp='1300463348' post='4787514']
http://www.bit-tech....ll-to-directx/1

Comments....

"It seems pretty amazing, then, that while PC games often look better than their console equivalents, they still don't beat console graphics into the ground. According to AMD, this could potentially change if PC game developers were able to program PC hardware directly at a low level, rather than having to go through an API, such as DirectX."
[/quote]
Ok - so the argument goes like this:
** Consoles have worse hardware, but you can program the device at a low level, resulting in better bang for your buck.
** PC is stuck having to go through DX's abstractions, which adds unnecessary overhead.

Both these points are true, but the thing that makes it seem like nonsense to me is that the low-down, close-to-the-metal API on the Xbox 360, which lets us get awesome performance out of the GPU, is.... DX 9 and a half.
It's DirectX, with some of the layers peeled back. You can do your own VRAM allocations, you can create resources yourself, you've got access to some of the API source and can inline your API calls, you've got access to command buffers, you can take ownership of individual GPU registers controlling things like blend states, and you've got amazing debugging and performance tools compared to the PC.... but you still do all of these things through the DirectX API!

This means the argument is a bit of a red herring. The problem isn't DirectX itself; the problem is the PC-specific implementations of DirectX that are lacking these low-level features.

The above argument is basically saying that DirectX 9.5 games can achieve better performance than DirectX 9 games... which is true... but also seems like a fairly obvious statement...

[quote name='MARS_999']I been saying DX is a dog for years, all the DX nut jobs, no its fast your doing something wrong… Bah eat crow…[/quote]
Wow. Way to start a nice level-headed discussion... Attacking fanbois just makes you look like a fanboi from a different camp... Don't do that.

And if you read at the end of my post

"Anyway I am for whatever gets us the best IQ and FPS on the hardware gamers spend their hard earned money for."

I could care less about the API, but what I am sick of is people saying DX is better when in fact it's not, so if you are sticking up for DX then you are just as much of a fanboy....

And that should have been apparent by me saying I wish Larrabee would have taken off...

You want to take another shot....
[edit]Wow, BIG thumbs up for using the S word below!! I'm sorry for going off topic by getting stuck on the nut job comment too...[/edit]

Sorry about jumping on you... Not usually what I like to do. :)

[quote]I could care less about the API, but what I am sick of is people saying DX is better when in fact it's not, so if you are sticking up for DX then you are just as much of a fanboy....[/quote]


Better than "what", exactly? What is a practical and realistic alternative to DX or OpenGL on the PC? That's the crux of the issue here. If you know anything about how these PC APIs work, then you know why they're slower than the console equivalents, but you also know that they're 100% necessary.

[quote]Better than "what", exactly? What is a practical and realistic alternative to DX or OpenGL on the PC? That's the crux of the issue here. If you know anything about how these PC APIs work, then you know why they're slower than the console equivalents, but you also know that they're 100% necessary.[/quote]


I think there is both truth and over-pessimism in that post.

It is true that we will always need a way to talk to the hardware in our PCs — a way to feed the right information to the GPU at the right time, in order to get it to calculate and display what we want, when we want. In this case the chain goes DirectX -> drivers -> hardware. The drivers are really part of the machine's operating system, so I don't think that layer can be removed, and therefore neither can the DirectX layer.

Having said that, I don't think it's true that this chain has to be slow. The driver layer and the DirectX layer simply need to be better. Given that the drivers are already about as low-level as you can get, the biggest improvement in that chain can probably come from making DirectX more low-level (as has been stated before in this thread). And I don't think there is any reason why that can't happen.

Programmable shading was a step in the right direction. Now we need full programming access to the GPU in a way that facilitates parallel programming. I think we need to get rid of the idea of treating the GPU as a way to process individual triangles through the vertex and pixel shaders. Instead, we need to start thinking about the GPU as a way to perform a task thousands of times at the same time, and thus get that task done thousands of times faster. This will mean that the G in GPU no longer means Graphics, but instead means General. It also means that filling the ~10^6 pixels on a monitor will become *a* purpose of the GPU instead of *the* purpose.
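A toy sketch of that mental shift, in plain Python (no real GPU API involved; the kernel and the numbers are invented for illustration): one small pure function applied to thousands of independent items, rather than triangles pushed through fixed stages.

```python
# Illustrative sketch only: think of the processor as running one small
# task over thousands of independent items, instead of pushing triangles
# through fixed vertex/pixel stages.
def kernel(x):
    # the per-item "task" -- a made-up per-pixel computation,
    # clamped to [0, 1] like a colour channel
    return min(max(0.5 * x + 0.25, 0.0), 1.0)

# On real hardware these invocations would execute in parallel; here
# map() just expresses the "same task, many items" shape of the work.
pixels = [i / 5000.0 for i in range(10_000)]   # 10,000 independent inputs
shaded = list(map(kernel, pixels))
```

The point is the shape of the computation, not the arithmetic: every invocation depends only on its own input, which is exactly what lets the hardware run them all at once.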

Either that or we get rid of the GPU and go from ~10 core CPUs to ~1000 core CPUs. I would prefer that tbh, would be more efficient I think. Duplicating information on the GPU that I already have much better organised on the CPU is a pain for a start. Let the GPU die.

[quote]The problem isn't DirectX itself, the problem is the PC-specific implementations of DirectX that are lacking these low-level features.[/quote]

And the question really should be, is that low level even feasible on a PC?

Let's just talk draw calls. The 360 and PS3 GPUs read the same memory that the CPU writes. The API level is just blasting the draw call and state information out of the CPU caches to main RAM. On a PC, that data has to be flushed, then DMA'd (which depends on hardware other than the GPU or CPU) to the GPU's memory, where it can be processed. It may not seem like much, but it's a substantial amount of work that makes this all happen reliably.

Even if the PC version could be stripped down to the barest essentials, you would still see large discrepancies in the number of things you could draw without instancing. PC draw-call limits haven't really gone up in years despite huge performance improvements in GPU chipsets. This is because that type of data transfer (small-block DMA) is not what the rest of the PC architecture has been made to handle.
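To see why per-draw-call overhead ends up as a ceiling, here's a hedged back-of-envelope model in Python — the cost constants are made up for illustration, not measured from any real driver:

```python
# Back-of-envelope cost model (all numbers hypothetical): every draw call
# pays a fixed CPU/driver/DMA overhead, so drawing N objects one call at
# a time scales with N, while instancing pays that overhead roughly once.
CALL_OVERHEAD_US = 20.0    # hypothetical per-draw-call CPU cost, in microseconds
PER_INSTANCE_US = 0.05     # hypothetical GPU-side cost per drawn instance

def naive_draw(n_objects):
    # one API call per object: overhead is paid n_objects times
    return n_objects * (CALL_OVERHEAD_US + PER_INSTANCE_US)

def instanced_draw(n_objects):
    # one API call for all objects: overhead is paid once
    return CALL_OVERHEAD_US + n_objects * PER_INSTANCE_US

n = 10_000
naive_cost = naive_draw(n)        # dominated by call overhead
instanced_cost = instanced_draw(n)  # dominated by actual per-instance work
```

With numbers in this (invented) ballpark, 10,000 individual draws cost ~200 ms of pure call overhead while one instanced draw stays near half a millisecond — which is why instancing is the standard workaround for the draw-call ceiling.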


[quote]Either that or we get rid of the GPU and go from ~10 core CPUs to ~1000 core CPUs. I would prefer that tbh, would be more efficient I think. Duplicating information on the GPU that I already have much better organised on the CPU is a pain for a start. Let the GPU die.[/quote]


I actually like the separation; it makes programming GPUs far easier. Modern GPUs are a nice example of constrained and focused parallelism. Even neophyte programmers can write shaders that run massively in parallel and never deadlock the GPU. Most veteran programmers I've met who work on general massively parallel systems still fight with concurrency issues and deadlocks.

Sure, you could constrain your massively parallel system on these new 1000-core CPUs such that you have the same protection, but then you have just re-invented shaders on less focused and probably slightly slower hardware.

With great power comes great responsibility. My experience with seasoned programmers and dual-core machines has led me to be skeptical that these massively parallel systems will actually be "general" in practice.

My prediction: 95% of the engineers who would use such a thing would subscribe to an API/paradigm as restrictive as the current shader model on GPUs. The other 5% will release marginally better titles at best, and go stark raving mad at worst.

[quote]And the question really should be, is that low level even feasible on a PC?[/quote]
Yeah, I don't think it is — not without putting a whole lot of extra work onto game developers... but that's what the ATI rep in the article is suggesting. Some companies with enough manpower/money might be able to take advantage of such a low-level (portability-nightmare) API, though...
[quote]The 360 and the PS3 GPUs read the same memory that the CPU writes. The API level is just blasting the draw call and state information out of the CPU caches to main RAM. On a PC, that data has to be flushed, then DMA'd (which depends on hardware other than the GPU or CPU) to the GPU's memory, where it can be processed.[/quote]
Well, the PS3's GPU can read from the CPU-local system RAM, but it does have two main banks (one local to the GPU and one local to the CPU), just like a PC does.
The CPU can write directly into VRAM, but sometimes it's faster to have an SPU DMA from system RAM into its LS and then DMA from there into VRAM (thanks, Sony). However, on the PS3, the CPU actually writes draw calls to system RAM, and the GPU reads them directly from system RAM (not from VRAM).
But yes, it's these kinds of details that DX (thankfully) protects us from ;)

[quote]Well, the PS3...[/quote]


I did gloss over a lot of the gory details (hard to believe in a post as long as it was). The whole main RAM/VRAM distinction on the PS3 is the kind of thing that makes direct hardware access a pain. Yet it is still less restrictive than a modern PC architecture.

Perhaps this is the solution to this thread... all those who are in favor of ditching DX for "low level goodness", spend a dev cycle on a PS3. You will learn to love your DirectX/PC safety mittens roughly the first time you read about or independently re-invent an idea like using the GPU to DMA memory between RAM banks.
[quote]Perhaps this is the solution to this thread... all those who are in favor of ditching DX for "low level goodness", spend a dev cycle on a PS3. You will learn to love your DirectX/PC safety mittens roughly the first time you read about or independently re-invent an idea like using the GPU to DMA memory between RAM banks.[/quote]


I am in favour of low-level goodness, but I am not in favour of ditching DirectX (unless we also ditch the GPU altogether). I simply think DirectX should be evolved: just as it evolved from fixed function to programmable shading, it should now evolve to more flexible, efficient, and powerful programmable shading — which, if I'm not mistaken, is a synonym for lower-level.

[quote]Either that or we get rid of the GPU and go from ~10 core CPUs to ~1000 core CPUs. I would prefer that tbh, would be more efficient I think. Duplicating information on the GPU that I already have much better organised on the CPU is a pain for a start. Let the GPU die.[/quote]


The problem with this is that any significant move in one direction will hurt what the CPU or GPU is good at, as I previously indicated.

A 1000-core CPU still isn't going to match a GPU, simply because of how a modern GPU processes workloads versus how a CPU does.

SPUs are a pretty good indication of what a more 'general' GPU would look like: horrible branch performance but great at data processing. A CPU made up of cores like that, however, is going to fail horribly at the more general tasks which a CPU has to do.

From a hardware point of view, imo, an ideal setup would be:
  • x64/ARM cores for normal processing
  • SPU/ALU array on the same die as the above for stream-style processing
  • DX11+ class GPU for graphics processing
That way you get a decent mix of the various processing requirements you have in a game; I'm kinda hoping the next consoles pick up this sort of mix of hardware, tbh.


