How to do 'from scratch GPU programming'?
I believe they talk directly to the driver. As for how, I couldn't say; that's where the trade secrets come in. My best guess is that the programmers of DX and OGL are kernel-level programmers who know exactly how to tell the OS to send such-and-such instructions to the driver, and the driver passes it all on to the GPU. You must remember Microsoft and the makers of OGL work in tandem with Nvidia and ATI.
Quote:Original post by blewisjr
I believe they talk directly to the driver.
They are the driver.
Quote:
You must remember Microsoft and the makers of OGL work in tandem with Nvidia and ATI.
The 'makers of OpenGL' actually work at NVidia or ATI (well, not so much ATI [wink]). Each chipset vendor writes their own drivers, and that includes the OpenGL/D3D parts of it. So obviously they know all the secret GPU protocols - because they own them. From the point of view of a GPU manufacturer, both OpenGL and D3D are just a specification for an interface. They have to implement this interface for their respective chipset, which is done as part of the driver.
You, the application developer, then talk to this interface from the "other side" if you want (well, it's actually routed through an additional thin layer, but conceptually you directly talk to the driver when using OpenGL or D3D).
Thanks for clearing that up, Yann L. So if I get this right, the DX and OGL APIs are interfaces to the driver.
You could do like I did a few months ago...
Pick up a few microcontrollers or microprocessors, preferably 8-bit. Build the additional hardware for one to run on, and learn to program it in assembly. Then develop the crazy idea to build a home-built game console with it. Connect it to an RCA jack and program a GPU in software for the MCU/MPU that rasterizes buffered graphics. Then purchase an FPGA/CPLD and write a simple 3D rasterizer in VHDL/Verilog. If you haven't fried your TV by now, you have created a graphics card (well, basically) and learned to program it at the lowest level.
After all that... direct register access isn't scary, and an API is a luxury. Seriously, it's a great way to learn to program at the lowest level... such as directly interfacing with a GPU.
Quote:Original post by Craig_jb
You could do like I did a few months ago...
Pick up a few microcontrollers or microprocessors, preferably 8-bit. Build the additional hardware for one to run on, and learn to program it in assembly. Then develop the crazy idea to build a home-built game console with it. Connect it to an RCA jack and program a GPU in software for the MCU/MPU that rasterizes buffered graphics. Then purchase an FPGA/CPLD and write a simple 3D rasterizer in VHDL/Verilog. If you haven't fried your TV by now, you have created a graphics card (well, basically) and learned to program it at the lowest level.
After all that... direct register access isn't scary, and an API is a luxury. Seriously, it's a great way to learn to program at the lowest level... such as directly interfacing with a GPU.
Rate up for being a total power geek :D
Quote:Original post by blewisjr
Thanks for clearing that up, Yann L. So if I get this right, the DX and OGL APIs are interfaces to the driver.
With OpenGL, you're directly talking to the driver, yes. For D3D, you're routed through an additional layer, which is rather thin on XP and significantly heavier on Vista (but this doesn't necessarily have performance implications, it's just that MS does a lot more work on 'their' part of the driver on Vista than on XP).
Quote:Original post by Craig_jb
You could do like I did a few months ago...
You must have a lot of time on your hands... ;)
So I can't do 3D graphics on the video card without Direct3D or OpenGL? Are those my only two choices?
Well, from what I've understood about "low-level" graphics programming, the lowest and safest level you can work with is the shader languages: HLSL, GLSL, and Cg. They're essentially "portable" assembly languages for video cards (at least, that's how I heard them described a year or two ago).
If you really want to, and you have an Nvidia card, install Linux and the Nouveau driver and you can use DRM to map and write directly to the GPU command buffer. Here is an example of such a program.
http://nouveau.cvs.sourceforge.net/nouveau/nv40_demo/main.c?revision=1.8&view=markup
I suggest you don't waste your time writing code at such a low level, unless you're interested in writing drivers.
Quote:Original post by Yann L
Yes you can. You can increase the GPU clock rate beyond its thermal tolerance level. NVidia even lets you do this from the driver panel, if you agree to a lengthy legal agreement which will void your warranty.
Simple, don't use NVidia's overclocking tools [grin]