Designing a bitmapped engine on a microcontroller


I'm trying to implement sprite animation on a microcontroller (an ARM Cortex-M4).

I can already draw a bitmap; the bitmap is stored in EEPROM or static memory as an array.

My plan now is to load a sprite sheet and then animate a sprite from it.

What is the easiest approach for that?

Can anyone recommend resources, such as books or tutorials, on old-school bitmapped graphics programming?


That sounds like a fun project! I love writing software rasterizers on ARM!

What is your exact hardware?

What exactly do you want to program? A lot of old-school tutorials cover platforms with special sprite hardware, e.g. the Game Boy.

But otherwise, everything involving sprites is quite straightforward. Maybe ask about something specific we can help with. :)

My hardware is the STM32F429 Discovery board.

I'm trying to crop an image out of a bitmap stored as a static array, but I get scan lines.

I call GrapBitmap with grap_x, grap_y = 0,0, so it should draw the whole image, but it draws scan lines instead.


void Rasterizer::GrapBitmap(uint16_t *bitmap, sprite_ptr sprite, int sprite_frame, int grap_x, int grap_y)
{
    sprite->frames[sprite_frame] = (uint16_t*)malloc(sprite->width*sprite->height*2);

    uint16_t *sprite_data = sprite->frames[sprite_frame];

    for (size_t row = 0; row < sprite->width; row++)
    {
        for (size_t col = 0; col < sprite->height; col++)
        {
            sprite_data[row*sprite->width  + col] = bitmap[((row + grap_y)*sprite->width) + (grap_x) + col];
        }
    }
}

void Rasterizer::DrawSprite(sprite_ptr sprite)
{
    int i, j;

    for (i = 0; i < sprite->width; i++)
        for (j = 0; j < sprite->height; j++) {
            int width = sprite->width;
            uint16_t *work = sprite->frames[sprite->curr_frame];
            uint16_t color = work[j*sprite->width + i];
            SetPixel(i + sprite->x, j + sprite->y, color);
        }
}

 


Well, it seems like you're iterating "row" from 0 to width, and not from 0 to height... and the converse mistake with "col". Unless your graphics data and screen buffer are column-major, and you're deliberately rendering rotated 90 degrees from the screen's natural orientation?

If you're on a tiny microcontroller like this, why are you burning RAM by copying image data from one "bitmap" area in EEPROM to many "sprite" areas in the RAM heap at runtime? Is EEPROM access significantly slower than RAM access on this platform?

In your DrawSprite method, you only need to calculate the *work pointer once per method call, not once per pixel.

And last, if that processor has any kind of CPU cache and cache lines are wider than a UINT16, you will very likely get much better performance if you make Y coordinates your outer loop; currently it's your inner loop.
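Putting those points together, here is a rough sketch of how the two functions might look. It reuses your own sprite_ptr/SetPixel types; the extra sheet_width parameter is my assumption, since indexing the source with sprite->width only works when the sprite sheet is exactly as wide as the sprite.

void Rasterizer::GrapBitmap(uint16_t *bitmap, int sheet_width, sprite_ptr sprite, int sprite_frame, int grap_x, int grap_y)
{
    // Allocate one frame's worth of 16-bit pixels.
    sprite->frames[sprite_frame] = (uint16_t*)malloc(sprite->width*sprite->height*sizeof(uint16_t));

    uint16_t *sprite_data = sprite->frames[sprite_frame];
    if (!sprite_data)
        return; // allocation failed

    // row walks the sprite's height, col walks its width.
    for (int row = 0; row < sprite->height; row++)
    {
        for (int col = 0; col < sprite->width; col++)
        {
            sprite_data[row*sprite->width + col] = bitmap[(row + grap_y)*sheet_width + grap_x + col];
        }
    }
}

void Rasterizer::DrawSprite(sprite_ptr sprite)
{
    // Fetch the frame pointer once per call, not once per pixel,
    // and make y the outer loop so pixels are touched in memory order.
    uint16_t *work = sprite->frames[sprite->curr_frame];

    for (int y = 0; y < sprite->height; y++)
        for (int x = 0; x < sprite->width; x++)
            SetPixel(x + sprite->x, y + sprite->y, work[y*sprite->width + x]);
}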


Thanks for your comments. I will keep this thread updated with my progress.

Right now I'm prototyping with SDL as a simulator for the C code that will be ported to the ARM board once I get the kit.

 

 

On 5/31/2018 at 12:33 AM, AhmedSaleh said:

My hardware is the STM32F429 Discovery board.

Picking a board with a built-in screen was a good idea - things get much more awkward, and slower, if you have to drive the screen 'manually'.

You'll definitely want to look at the reference manual for the STM32F429.

The STM32Cube libraries have lots of example code you can get ideas from.

Interestingly, that microcontroller has a hardware 2D accelerator. You probably want to do things in software to start with though. If you look at how the 2D engine works, you can write your software implementation so it can be easily changed to work with the hardware later.
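For example (this is just an illustrative C++ sketch, not the STM32Cube API), a software copy routine whose parameters mirror what a memory-to-memory 2D engine deals with - start addresses, a rectangle size, and a per-line offset to skip at the end of each row - is easy to swap for the hardware path later:

#include <cstdint>

// Copy a width x height rectangle of 16-bit pixels from src to dst.
// The line offsets are the number of pixels to skip after each row,
// i.e. (buffer stride in pixels) - width.
void CopyRect(const uint16_t *src, int src_line_offset,
              uint16_t *dst, int dst_line_offset,
              int width, int height)
{
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
            *dst++ = *src++;

        src += src_line_offset;
        dst += dst_line_offset;
    }
}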

@dave j

@Wyrframe

I'm doing bitmapped graphics and it eats up the memory. For example, one frame of animation takes about 10 KB: 70*70*2 bytes (16 bits per pixel).

Any suggestions for reducing memory consumption?

1 hour ago, AhmedSaleh said:

I'm doing bitmapped graphics and it eats up the memory. For example, one frame of animation takes about 10 KB: 70*70*2 bytes (16 bits per pixel).

Any suggestions for reducing memory consumption?

Use 8-bit colour mode

The built-in display controller supports 8 bits per pixel with a palette. So you could halve the memory requirements by switching to 8-bit mode if you can limit your sprites to 256 colours from a single palette.
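Rough arithmetic (assuming the Discovery board's 240x320 panel - illustrative only, check your actual resolution):

constexpr int SCREEN_W = 240;
constexpr int SCREEN_H = 320;

constexpr int FB_16BPP = SCREEN_W * SCREEN_H * 2; // 153,600 bytes (RGB565)
constexpr int FB_8BPP  = SCREEN_W * SCREEN_H * 1; //  76,800 bytes (indexed)
// ...plus one 256-entry colour lookup table shared by the whole screen.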

Use paletted sprites

You could stick with a 16 bits per pixel screen and store the sprites with fewer colours, but with a palette for each sprite. This would cost a palette lookup per pixel but would mean you need fewer bits per pixel. You'd still get the full range of colours overall, but each individual sprite would be limited. Importantly, this scheme is supported by the hardware 2D accelerator I mentioned in my earlier post, with both 8 and 4 bits per pixel.
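A minimal sketch of the 4 bits-per-pixel case (the names are made up and it reuses your existing SetPixel; a 70x70 frame drops from 9,800 bytes at 16 bpp to 2,450 bytes, plus a 32-byte palette):

struct PalettedSprite
{
    int width, height;
    const uint8_t  *pixels;   // packed 4 bpp palette indices, two pixels per byte
    const uint16_t *palette;  // 16 RGB565 entries for this sprite
};

void Rasterizer::DrawPalettedSprite(const PalettedSprite &s, int dst_x, int dst_y)
{
    // Each packed row is (width + 1) / 2 bytes long.
    for (int y = 0; y < s.height; y++)
    {
        const uint8_t *row = s.pixels + y * ((s.width + 1) / 2);
        for (int x = 0; x < s.width; x++)
        {
            uint8_t packed = row[x / 2];
            uint8_t index  = (x & 1) ? (packed & 0x0F) : (packed >> 4);
            SetPixel(dst_x + x, dst_y + y, s.palette[index]); // palette lookup per pixel
        }
    }
}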

 

Progress:

 

 


You need external EPROM to store the bitmaps.

