Fletch F Fletch

OpenGL: Drawing pixel by pixel to framebuffer



Hi all,

I recently started playing around with a little bit of old school software rendering. I'm trying to start from scratch before I actually make the move to OpenGL/D3D or whatever.

I wrote my little demo in C# as a Windows Forms app, but I've come to realize that that is far from ideal. My program works by writing to an internal framebuffer (int[]) and then copying this data into a standard .NET Bitmap that is displayed as the background image in the form. This feels all kinds of wrong though, so I've been looking into a more "modern" approach.
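For reference, here is roughly what that WinForms approach looks like. This is a minimal sketch rather than the actual demo code, assuming a 32-bit ARGB framebuffer and that the form owns both the int[] and the Bitmap:

using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using System.Windows.Forms;

class SoftwareRenderForm : Form
{
    const int W = 640, H = 480;
    readonly int[] framebuffer = new int[W * H];  // one 0xAARRGGBB value per pixel
    readonly Bitmap bitmap = new Bitmap(W, H, PixelFormat.Format32bppArgb);

    void PresentFrame()
    {
        // Lock the bitmap and copy the raw int array straight into its pixel memory.
        BitmapData data = bitmap.LockBits(
            new Rectangle(0, 0, W, H),
            ImageLockMode.WriteOnly,
            PixelFormat.Format32bppArgb);
        Marshal.Copy(framebuffer, 0, data.Scan0, framebuffer.Length);
        bitmap.UnlockBits(data);

        // Display the finished frame as the form's background image.
        BackgroundImage = bitmap;
        Invalidate();
    }
}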

I looked at MonoGame, and although the whole game loop, input handling, etc. seems very nice, it doesn't appear to have a provision for writing a pixel directly to a texture/framebuffer and then displaying it. And I get the feeling there's a reason there is no such method. I read somewhere that a modern way of doing 2D rendering is to render to a texture and then display this texture on a 3D quad (with an orthographic projection), but how is that different?

Can anyone point me in the right direction? I just want an array of integers that I can manipulate however I want and then display.

Thanks


Ultimately everything in Windows will end up in a D3D texture, and will get composited with the rest of the desktop by the Desktop Window Manager (DWM) using the GPU. By going through GDI+ (which is what WinForms uses under the hood) you'll be going through some layers of abstraction first, but at the end of the day your framebuffer is going to end up in a texture. By skipping GDI you can probably get better performance, but I couldn't tell you how much faster it would run or whether it would be meaningful for you. So if you want to learn a bit about MonoGame, D3D, or OpenGL, then go ahead and see if you can get your program working by more directly writing to a texture. But otherwise, I probably wouldn't worry about it too much unless you've determined (or strongly suspect) that overhead from GDI is really limiting your performance.

Anyhow, I have no experience with MonoGame but I did a bunch of XNA work back in the day. Their Texture2D class has a SetData method that you can use to fill a texture with data from an array of integers, which should be easy for you to use. To get that on the screen, probably the easiest way would be to create a SpriteBatch and pass your Texture2D to SpriteBatch.Draw. You could also of course manually draw a triangle to the screen that reads your texture, either by using a built-in shader program or by writing your own.
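To make that concrete, here is a rough sketch of a MonoGame Game that fills a Texture2D from an int[] every frame and draws it with a SpriteBatch. It's untested and only a guess at how such a program would be structured, not code from this thread:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

class SoftwareRenderGame : Game
{
    const int W = 640, H = 480;
    readonly GraphicsDeviceManager graphics;
    readonly int[] framebuffer = new int[W * H];   // one packed colour value per pixel
    SpriteBatch spriteBatch;
    Texture2D texture;

    public SoftwareRenderGame()
    {
        graphics = new GraphicsDeviceManager(this);
    }

    protected override void LoadContent()
    {
        spriteBatch = new SpriteBatch(GraphicsDevice);
        // The default SurfaceFormat.Color packs each int as ABGR (red in the low byte),
        // which is not the same channel order as GDI+'s 32bppArgb, so the software
        // renderer may need to pack its pixels accordingly.
        texture = new Texture2D(GraphicsDevice, W, H);
    }

    protected override void Draw(GameTime gameTime)
    {
        // ... fill framebuffer[] with this frame's pixels here ...

        texture.SetData(framebuffer);              // upload the whole frame to the texture

        GraphicsDevice.Clear(Color.Black);
        spriteBatch.Begin();
        spriteBatch.Draw(texture, Vector2.Zero, Color.White);
        spriteBatch.End();

        base.Draw(gameTime);
    }
}

In XNA, at least, SetData couldn't be called while the texture was still set on the device, which is why the upload happens before Begin rather than between Begin and End.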

Edited by MJP


Maybe I wasn't looking closely enough, but I previously had a look at the SetData method you mentioned. For some reason I figured it wouldn't work for me. It does! I just knocked up a small test app featuring my fantastic 3D starfield code (yay!), and the Windows Forms version gets around 120 fps on my laptop. The MonoGame version, where I copy the int array to a texture and draw the texture in a SpriteBatch, gets around 220 fps (after disabling the 60 fps lock/vsync).
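For anyone else landing here: the 60 fps lock/vsync is normally lifted in MonoGame with something like the following in the Game constructor. This is a guess at what was done in this case, reusing the names from the sketch in the earlier reply:

public SoftwareRenderGame()
{
    graphics = new GraphicsDeviceManager(this);
    IsFixedTimeStep = false;                          // don't clamp Update/Draw to 60 Hz
    graphics.SynchronizeWithVerticalRetrace = false;  // disable vsync so frames present immediately
}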

Thanks a bunch!

Edited by Fletch F Fletch
