8-bit or 16-bit?

34 comments, last by therapistgames 10 years, 3 months ago
Hey guys,
Thanks for all your input. This will come in handy in the future.

Honestly, I have only been drawing in this style for the past few days. So I've only been experimenting till now. I'm sorry if I have mislabeled these artworks.

I'm now going to base my characters on the third generation Pokemon games.

Thank you again,
Arannir

Depends on which 8-bit: Sega Master System or NES?

You can't do those sprites on the NES, as sprites can only have 3 colours (2-bit sprites).

But you can do those sprites on the Sega Master System, as it can do 15 colour sprites (4-bit sprites).

So yes, those look like sprites from the 8-bit Sega Master System, a console you may not be aware of.
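To make the "2-bit sprites" point concrete, here's a minimal C sketch of how an NES-style tile is decoded (the function names are mine, just for illustration): each 8x8 tile is stored as two bitplanes, so every pixel ends up with a 2-bit index, and index 0 is treated as transparent for sprites, which is where the 3-colour limit comes from.

#include <stdint.h>
#include <stdio.h>

/* Decode one 8x8 NES-style 2bpp tile. The 16 bytes hold two bitplanes:
 * bytes 0-7 carry bit 0 of each pixel row, bytes 8-15 carry bit 1.
 * Each pixel therefore gets a 2-bit index (0-3); index 0 is treated as
 * transparent for sprites, leaving 3 usable colours. */
void decode_2bpp_tile(const uint8_t tile[16], uint8_t out[8][8])
{
    for (int y = 0; y < 8; y++) {
        uint8_t plane0 = tile[y];
        uint8_t plane1 = tile[y + 8];
        for (int x = 0; x < 8; x++) {
            int bit = 7 - x;  /* leftmost pixel lives in the most significant bit */
            out[y][x] = (uint8_t)(((plane0 >> bit) & 1) | (((plane1 >> bit) & 1) << 1));
        }
    }
}

int main(void)
{
    /* A made-up tile: top row entirely colour 3, the rest transparent. */
    uint8_t tile[16] = { 0xFF, 0, 0, 0, 0, 0, 0, 0,
                         0xFF, 0, 0, 0, 0, 0, 0, 0 };
    uint8_t px[8][8];
    decode_2bpp_tile(tile, px);
    printf("top-left pixel index: %d\n", px[0][0]);  /* prints 3 */
    return 0;
}

The Master System works the same way but with four bitplanes per row, giving 4-bit indices: 15 usable colours plus transparency.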

[Image: Sega Master System console]

Easiest way to make games, I love LÖVE && My dev blog/project

*Too lazy to renew domain, ignore above links

For computers, you can use almost all bit types which is another advantage of computers over consoles.

Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software. The better the workflow pipeline, the greater the potential output for a quality game. Completing projects is the last but finest order.

by Clinton, 3Ddreamer


For computers, you can use almost all bit types which is another advantage of computers over consoles.

Whut?

I assume you mean color depth (usually counted in number of bits).

For any modern console, you can use any color depth.

For the old consoles, it might be more relevant to compare them to their contemporary counterparts, and then the PC systems were very much limited in color depth too...

I'd say this art style is somewhere in between 16-bit-style graphics and 8-bit-style graphics.

Too colorful for 8-bit, but a bit too low-res for 16-bit.

People are coding for PCs with 32-bit color depth just for the fun of it. I've even heard of 64-bit experiments. Granted, they are not in the mainstream, but you won't see that with consoles at all to the best of my [limited] knowledge. Software solutions for hardware limitations have been used for many years.

Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software. The better the workflow pipeline, the greater the potential output for a quality game. Completing projects is the last but finest order.

by Clinton, 3Ddreamer

I'd classify both as 8-bit. The jump up to 16-bit was a fundamental leap. With 8-bit color, games almost by necessity had to work with a predefined palette of colors. This tended to reduce shading ability and machine-generated effects. Colors tended to also have a greater amount of contrast, although a careful choice of palette could go a long way toward minimizing this constraint. (The original Starcraft was incredible in this regard. It technically had a palettized 8-bit color depth, but stylistically felt like 16-bit to me.)
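To illustrate what "palettized 8-bit color" means in code, here's a small C sketch (the names are mine, not from any particular engine): the frame buffer stores one byte per pixel, and that byte is only an index into a 256-entry table of RGB values chosen ahead of time.

#include <stdint.h>

typedef struct { uint8_t r, g, b; } RGB;

/* Palettized 8-bit colour: the frame buffer holds one byte per pixel,
 * and that byte is an index into a 256-entry palette chosen up front.
 * Changing a palette entry instantly recolours every pixel that uses it,
 * which is also why effects had to be designed around the palette. */
RGB sample_8bit(const RGB palette[256], const uint8_t *framebuffer,
                int width, int x, int y)
{
    return palette[framebuffer[y * width + x]];
}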

16-bit color allowed games to treat colors as a composite of red, green, and blue channels, generally with red and blue each getting 5 bits (32 distinct levels), and green getting 6 bits (64 distinct levels) since the human eye is more sensitive to green. This allowed developers and artists to escape from the limitations of predefined palettes, enabling a variety of new options, such as applying machine generated effects, both at production time and game runtime. This also reduced the need to work with or around high contrast colors, enabling a large variety of stylistic options.
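A quick C sketch of that 5-6-5 packing, with helper names made up for illustration:

#include <stdint.h>

/* Pack 8-bit-per-channel RGB into the 5-6-5 layout described above:
 * 5 bits red, 6 bits green, 5 bits blue in a single 16-bit word. */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Unpack again, rescaling so each channel spans the full 0-255 range. */
void unpack_rgb565(uint16_t c, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    *g = (uint8_t)(((c >> 5)  & 0x3F) * 255 / 63);
    *b = (uint8_t)( (c        & 0x1F) * 255 / 31);
}

Since every pixel carries its own colour, you can blend, tint, or fade arbitrarily at runtime instead of swapping palette entries.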

I would also avoid classifying either of the two images as 4-bit, even though both could technically be represented with a 4-bit palette. The reason is that outside of clever tricks, the 4-bit limitation applies to the entire scene, not individual sprites. When you only have 16 colors to work with for everything, each individual sprite is unlikely to make effective use of most of those colors. Most sprites would still only use 2 to 5 colors each. (But with care, similar to 8-bit palettes, you could construct a pretty effective 4-bit palette. Warlords II, I feel, did a great job of this. It had black, white, and 4 shades of gray, 3 shades of green, 2 shades each of brown and blue, and just one shade each for yellow, orange, and red. This gave just enough gradient flexibility for the important stuff, while still keeping enough variety in the overall color selection.)
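As a rough illustration of what a scene-wide 4-bit palette looks like, here's a hypothetical 16-entry table in C laid out along the lines described above. The RGB values are invented for this example, not taken from Warlords II, and the nibble order is an assumption.

#include <stdint.h>

typedef struct { uint8_t r, g, b; } RGB;

/* A hypothetical 16-colour (4-bit) palette: black, white, four greys,
 * three greens, two browns, two blues, and one yellow/orange/red each.
 * The values are made up for illustration. Note that the whole scene
 * shares this single table. */
static const RGB scene_palette[16] = {
    {  0,   0,   0}, {255, 255, 255},                                   /* black, white      */
    { 64,  64,  64}, {112, 112, 112}, {160, 160, 160}, {208, 208, 208}, /* greys             */
    { 32,  96,  32}, { 48, 144,  48}, { 96, 192,  96},                  /* greens            */
    { 96,  64,  32}, {144,  96,  48},                                   /* browns            */
    { 32,  64, 160}, { 96, 128, 224},                                   /* blues             */
    {240, 224,  64}, {240, 144,  32}, {208,  48,  48}                   /* yellow/orange/red */
};

/* At 4 bits per pixel, two pixels share one byte of the frame buffer. */
RGB pixel_4bpp(const uint8_t *framebuffer, int width, int x, int y)
{
    uint8_t byte  = framebuffer[(y * width + x) / 2];
    uint8_t index = (x & 1) ? (byte & 0x0F) : (byte >> 4); /* even pixel in high nibble (assumed) */
    return scene_palette[index];
}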

"We should have a great fewer disputes in the world if words were taken for what they are, the signs of our ideas only, and not for things themselves." - John Locke

I'd put them as late-gen 8-bit sprites; remember the Master System, guys.

Easiest way to make games, I love LÖVE && My dev blog/project

*Too lazy to renew domain, ignore above links


People are coding for PCs with 32-bit color depth just for the fun of it. I've even heard of 64-bit experiments. Granted, they are not in the mainstream, but you won't see that with consoles at all to the best of my [limited] knowledge. Software solutions for hardware limitations have been used for many years.

Going a bit off topic here, but both the Xbox 360 and the PS3 support 32-bit color depth (128-bit pixels). I'm no console developer, so I don't know how widely used it is, but I assume it's used by high-end titles for HDR lighting.

Any software implementation of higher color depth would be possible to use on a console too; they are not that different from a PC, and they're getting more similar with every generation.

128 bit pixels? You're saying that they have 32 bits for each channel plus 32 bits alpha?

32-bit color depth means 8 bits per channel (RGB) + 8 bits alpha: 32 bits per pixel.

128 bits per pixel, according to my maths, means roughly 32 MB per frame on a 1080p display. Which means that hardware like that could push 4K resolution (at 32 bits per pixel) without breaking a sweat, and I don't think it works like that just now.

And yes, 32 bpp is what consoles and PCs have been using for years now. Both doing HDR in the GPU.

EDIT: Ohhh, google-fu tells me this: each color channel is stored in a float (i.e., 32 bits) for HDR rendering. Though I have also heard of the FP16 format being used for HDR. So that means 8 bits per channel + 8 bits alpha for output, possibly 32 bits per channel for computation.
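For what it's worth, the per-frame numbers for those formats at 1080p work out like this (a quick back-of-the-envelope C program, nothing console-specific):

#include <stdio.h>

/* Frame-buffer sizes at 1920x1080 for the pixel formats discussed above:
 * 32 bpp (8-bit RGBA), 64 bpp (FP16 per channel), 128 bpp (FP32 per channel). */
int main(void)
{
    const long pixels = 1920L * 1080L;
    const int  bits[] = { 32, 64, 128 };

    for (int i = 0; i < 3; i++) {
        double mib = pixels * (bits[i] / 8.0) / (1024.0 * 1024.0);
        printf("%3d bpp: %6.1f MiB per frame\n", bits[i], mib);
    }
    return 0;
}

That prints about 7.9, 15.8, and 31.6 MiB respectively, which lines up with the roughly 32 MB figure above for 128-bit pixels.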

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

My journals: dustArtemis ECS framework and Making a Terrain Generator

Well, it's only 64 bits, but my card is able to use float formats for output. I'm not sure if newer cards support full 128-bit modes.


This topic is closed to new replies.
