It seems there is some confusion here?
The "bit" aspect of consoles that was so popular back in the day wasn't about the bit depth of still images, but a property of the hardware, specifically the processors.
The NES had an 8-bit processor, the SNES a 16-bit processor, the PS1 a 32-bit processor, and the N64 a 64-bit processor.
There is a reason companies stopped marketing 'bits' for their consoles: up until the PS4 and Xbox One, nothing more than a 32-bit CPU was actually needed.
Even high-end computers today still use either 32-bit or 64-bit processors. 32-bit processors are good enough until you need more than 4 GB of RAM. 64-bit processing is only now becoming mainstream, as heavier applications appear and users multitask more. 128-bit processing isn't coming any time soon, since 64-bit processors can address up to 16 exabytes of memory.
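The 4 GB and 16 exabyte figures both fall out of the same address-space math: an n-bit address can refer to 2^n distinct bytes. A quick sketch in Python (the helper name is just for illustration):

```python
# Maximum directly addressable memory for an n-bit flat address space:
# an n-bit address can name 2**n distinct bytes.
def addressable_bytes(bits: int) -> int:
    return 2 ** bits

print(addressable_bytes(32) // 2**30, "GiB")  # 4 GiB
print(addressable_bytes(64) // 2**60, "EiB")  # 16 EiB
```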
That basically means that no matter the pixel count of each individual art piece you make, if you want to make, say, a true 16-bit game, you should limit the game to what 16-bit processors could do.
One such limitation is that the game shouldn't need more than 16 MB of RAM to run...
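The 16 MB ceiling comes from the address bus rather than the 16-bit data width: the SNES's 65C816, for example, is a 16-bit CPU with 24-bit addressing, and 2^24 bytes is exactly 16 MiB. A one-liner confirms it:

```python
# A 16-bit CPU with a 24-bit address bus (e.g. the SNES's 65C816)
# tops out at 2**24 bytes of address space.
print(2 ** 24 // 2 ** 20, "MiB")  # 16 MiB
```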
There is no confusion; terms like '8-bit' used to describe artwork etc. have become separated from any technical meaning.