Drawing to own buffer?

3 comments, last by shinypixel 9 years, 11 months ago

Every time I've tried to be brutal to OpenGL it still manages to do what I want. So far I'm impressed.

My next attempt is trying to write to my own buffer (like an old DirectDraw software rasterizer). Please note I have no professional intentions of writing this way, but just want to see if it works.

GLubyte *back_buffer = new GLubyte[1280 * 800 * 32];   // 32-bit color back buffer in software

My stupid draw function

void drawPixel(const int x, const int y)
{
    back_buffer[y * SCREEN_WIDTH + x] = 0xFF; // draw something white here
}

Then, while rendering...

    // write to software double buffer
    glReadPixels(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, back_buffer);
    
    // draw to the buffer
    for (int i = 0; i < 1000; i++)
        drawPixel(rand()%SCREEN_WIDTH, rand()%SCREEN_HEIGHT);
    
    // throw the buffer onto the color buffer
    glRasterPos2f(0.0f, 0.0f);
    glDrawPixels(SCREEN_WIDTH, SCREEN_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, back_buffer);

Why am I getting a blank screen?


First of all, it's 32-bit (4-byte) color, not 32-byte color. Your buffer size should be 1280 * 800 * 4, not 1280 * 800 * 32.
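For example, the corrected allocation might look like this (assuming SCREEN_WIDTH and SCREEN_HEIGHT are 1280 and 800, as in your code):

GLubyte *back_buffer = new GLubyte[SCREEN_WIDTH * SCREEN_HEIGHT * 4];   // 4 bytes (RGBA) per pixel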

Second of all, you are writing to the wrong pixels. Each pixel contains 4 bytes, but you assume the size of a pixel is just 1 byte.


void drawPixel(const int x, const int y)
{
    int offset = (y * SCREEN_WIDTH + x) * 4;
    back_buffer[offset + 0] = 0xFF; // set R value to 255
    back_buffer[offset + 1] = 0xFF; // set G value to 255
    back_buffer[offset + 2] = 0xFF; // set B value to 255
    back_buffer[offset + 3] = 0xFF; // set A value to 255
}

I still think the problem is somewhere else; you'd be better off posting your whole code. Most probably the alpha values were never set, making the result fully transparent. The blending setup can also affect the result of such operations, likely discarding pixels whose alpha is zero.
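As a quick check (just a sketch, not necessarily your actual fix), you can disable blending before drawing the buffer so the alpha values can't hide the pixels:

    // make sure blending can't discard pixels with zero alpha while testing
    glDisable(GL_BLEND);
    glDrawPixels(SCREEN_WIDTH, SCREEN_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, back_buffer);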

There are a lot more things that can go wrong and explain why you don't see an image. I suggest you work through a detailed tutorial first and then start doing experiments like this.

Finally, why would you do such a thing? Just use textures and shaders to produce images. You are throwing GPU acceleration out of the window with such an approach.

Thanks. I still have a blank screen, but at least it's going in the right direction. I agree about starting my experiments after I'm done with the book.


It's in the plan to teach how graphics used to be done.

Pixel Toaster. Or SDL, Allegro, etc. All of these are better suited to "old-fashioned" software rendering, where you draw pixels to the screen yourself.

Emulating this via glDrawPixels() is full of pitfalls. You'd probably be better off using a textured quad to draw your pixels to the screen, and upload the pixels to the texture every frame via glTexImage2D().
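A rough sketch of that approach using fixed-function GL (tex is a placeholder name; SCREEN_WIDTH, SCREEN_HEIGHT, and back_buffer are from the code above; this uses glTexSubImage2D for the per-frame upload rather than glTexImage2D, to avoid reallocating the texture each frame):

// one-time setup: create a texture the size of the software buffer
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, SCREEN_WIDTH, SCREEN_HEIGHT, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// every frame: upload the software buffer and draw a screen-sized quad
// (coordinates assume identity projection/modelview, so the quad fills the viewport)
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, SCREEN_WIDTH, SCREEN_HEIGHT,
                GL_RGBA, GL_UNSIGNED_BYTE, back_buffer);

glEnable(GL_TEXTURE_2D);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();
glDisable(GL_TEXTURE_2D);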

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I like the idea of using textured quads. Thanks for the thoughts everyone.

This topic is closed to new replies.
