8-bit or 16-bit?

Started by
34 comments, last by therapistgames 10 years, 2 months ago

It seems there is some confusion here?

The "bit" aspect of consoles that was so popular back in the day wasn't about the bit depth of still images, but a property of the hardware, specifically the processors.

The NES had an 8-bit processor, the SNES a 16-bit processor, the PS1 a 32-bit processor, and the N64 a 64-bit processor.

There is a reason why companies stopped marketing 'bits' for their consoles, and that is because up until the PS4 and Xbox One, nothing more than a 32-bit CPU was actually needed.

Even high-end computers today still use either 32-bit or 64-bit processors. 32-bit processors are good enough until you need more than 4 GB of RAM. 64-bit processing is only now starting to become mainstream, as more and more heavy applications are made and users tend to multitask more. 128-bit processing isn't coming any time soon, as 64-bit processors can address up to 16 exabytes of memory.
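The numbers above are just powers of two. A quick back-of-the-envelope check in Python, computing how much memory an N-bit address space can reach:

```python
# An N-bit address space can address 2**N bytes.
limits = {bits: 2 ** bits for bits in (16, 32, 64)}

kb = limits[16] // 2 ** 10   # 16-bit: 64 KB
gb = limits[32] // 2 ** 30   # 32-bit: 4 GB -- the ceiling the post mentions
eb = limits[64] // 2 ** 60   # 64-bit: 16 exabytes (binary exabytes)

print(kb, gb, eb)  # 64 4 16
```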

That basically means that no matter the pixel count of each individual art piece you make, if you want to make, say, a true 16-bit game, what you should do is make a game that is limited to what 16-bit processors could do.

A limitation of that is that the game shouldn't need more than 16 MB of RAM to run...

There is no confusion; terms like "8-bit" used to describe artwork etc. have become separated from any technical meaning.


It seems there is some confusion here?

The "bit" aspect of consoles that was so popular back in the day wasn't about the bit depth of still images, but a property of the hardware, specifically the processors.

The NES had an 8-bit processor, the SNES a 16-bit processor, the PS1 a 32-bit processor, and the N64 a 64-bit processor.

...

That basically means that no matter the pixel count of each individual art piece you make, if you want to make, say, a true 16-bit game, what you should do is make a game that is limited to what 16-bit processors could do.

Yes and no -- Certainly most of the early consoles were marketed by the "bits" of their CPU, but regardless of whether the word size of the CPU was tied to the graphics capabilities or not (mostly not), there's still a strong correlation with the display chips that were available and inexpensive at the time -- the graphics chips in the NES, Master System, Genesis, and some of the contemporary 8-bit home computers were all relatively similar in capability, and some of them were within the same chip family. When people say "8-bit graphics" they don't necessarily mean an 8-bit graphics processor (and really, it doesn't even make much sense to speak of the graphics chips of the day in terms of bit-ness), they mean "graphics typical of consoles with 8-bit CPUs".

Definitely, though, 8-bit graphics are about the feel -- the NES color palette is probably the strongest example. When you see a screenshot of an old NES game, you know it's NES because of the somewhat odd palette; look at a Master System game from the same time period and you're not sure -- it could be a Master System game, or it could just as easily be an early EGA/VGA PC game. Then there are limits on the use of colors -- 4 colors in any 8x8 pixel tile -- which affected the overall aesthetics of the full-screen images.
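That tile-level color limit is easy to state as code. Here's a hypothetical Python sketch (the function name and the flat row-major list of palette indices are my own invention, not from any real tool) that checks whether indexed-color pixel data respects an NES-style limit of 4 colors per 8x8 tile:

```python
def tile_palette_ok(pixels, width, height, max_colors=4, tile=8):
    """pixels: flat list of palette indices, row-major order."""
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            # Collect the distinct palette indices used in this tile.
            colors = {pixels[y * width + x]
                      for y in range(ty, min(ty + tile, height))
                      for x in range(tx, min(tx + tile, width))}
            if len(colors) > max_colors:
                return False
    return True

# An 8x8 tile using 4 colors passes; one using 5 colors fails.
flat4 = [i % 4 for i in range(64)]
flat5 = [i % 5 for i in range(64)]
print(tile_palette_ok(flat4, 8, 8))  # True
print(tile_palette_ok(flat5, 8, 8))  # False
```

A faux-retro renderer could run a check like this over its art assets to keep them honest.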

And 8-bit games are about the feel too -- Take a game that is otherwise faithfully 8-bit, but allow for perfect sprite rotation -- completely breaks the 8-bit illusion. Same for having animation that's too detailed, or having sprites that are too large (that aren't static -- some games had large "sprites" that were part of the background, so had the same limitations as the background).


A limitation of that is that the game shouldn't need more than 16 MB of RAM to run...

Not necessarily. Getting a retro-looking game to run feasibly on modern hardware in a straightforward way (i.e. not writing what's essentially an emulator) is going to eat up more memory. The run-time format of your images alone will be 4-16 times larger, and sound effects and music can be many hundreds of times larger (even if compressed to MP3 format) -- modern hardware just doesn't work the way the old consoles did.
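The "4-16 times larger" figure is simple arithmetic. As a rough illustration (the 2-bits-per-pixel indexed format is an assumption standing in for old tile formats), compare one 256x240 screen stored as 2-bit indexed pixels versus modern 32-bit RGBA:

```python
# One 256x240 screen: NES-style 2bpp indexed vs. modern 32bpp RGBA.
width, height = 256, 240

indexed_2bpp = width * height * 2 // 8  # 2 bits per pixel, in bytes
rgba_32bpp = width * height * 4         # 4 bytes per pixel

print(indexed_2bpp)                  # 15360 bytes (~15 KB)
print(rgba_32bpp)                    # 245760 bytes (240 KB)
print(rgba_32bpp // indexed_2bpp)    # 16x larger
```

That's the high end of the range; an 8-bit indexed image expanded to RGBA is the 4x case.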

The real litmus test is whether your retro game would be feasible to implement on one of those old systems.

throw table_exception("(╯°□°)╯︵ ┻━┻");

Do you think we'll get a 128-bit version of Windows 10?

Easiest way to make games, I love LÖVE && My dev blog/project

*Too lazy to renew domain, ignore above links

Do you think we'll get a 128-bit version of Windows 10?


Nah, it's going to be a very long time before we see a 128-bit word size/address space in a commodity CPU.

While the transitions from 8 to 16 bits and from 16 to 32 bits were driven by both address-space *and* computation concerns, the transition from 32 to 64 bits was driven almost entirely by address space. 32-bit math is plenty large for most things (except pointer arithmetic on 64-bit machines), and for everything that's left over, a big-number library is probably a practical necessity anyway. There's no push for 128-bit integers, and even large clusters are nowhere near exhausting a 64-bit address space. I bet Google, Facebook, the other top-10 internet sites, and the NSA combined aren't near it either.

Perhaps in specialized machines or big-iron, but not in the commodity market.



Do you think we'll get a 128-bit version of Windows 10?
We're almost past the point of "CPU bits" being meaningful at all.

Modern CPUs can operate on 8-bit, 16-bit, 32-bit, 64-bit, 80-bit, 128-bit, and 512-bit data types, using different instructions and different registers.
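You can see those different widths even from a high-level language. A small sketch using Python's `struct` module, which packs values into fixed-size representations much like the machine integer types:

```python
import struct

value = 1000

# 1000 doesn't fit in an 8-bit unsigned type ("<B" maxes out at 255).
try:
    struct.pack("<B", value)
except struct.error:
    print("doesn't fit in 8 bits")

# The same value stored at 16, 32, and 64 bits takes 2, 4, and 8 bytes.
for fmt, bits in (("<H", 16), ("<I", 32), ("<Q", 64)):
    print(bits, "->", len(struct.pack(fmt, value)), "bytes")
```

The width of the register doing the work and the width of the data being worked on are separate questions -- which is exactly why "64-bit CPU" undersells what the chip can do.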

Despite that, we'd just call it a 64-bit CPU.

I'd put them under 8-bit artwork rather than 16-bit, as 8-bit art tends to have less detail than 16-bit.

I would go with 8-bit from a purely audience-expectation perspective. It 'looks' 8-bit... regardless of whether it is or not.

This topic is closed to new replies.
