4K UHD


Yeah, I meant it's even worse for movies. I read they do this to adjust for the conditions in a movie theatre.


It might only be a novelty, but I could imagine a sidescroller with a traditional cel-animation style would look really great at 4K, would be perfectly OK running at 30 Hz, and would probably also be OK with the available GPU horsepower on either the Xbox One or PS4, assuming they could make the fillrate work.

Basically, imagine a Rayman Legends UltraHD remix, running at 4K/30 Hz with actual 30-cels-per-second animation; it would be gorgeous. One of the things that struck me both times, going from VHS to DVD and then from DVD to Blu-ray, was just how great animation looks. I suspect that going from 1080p to 4K animation would strike me in the same way.
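On the fillrate point, here's a rough back-of-envelope sketch (the 4x overdraw figure is my own assumption, not a measured number), just to show the raw pixel count isn't the scary part:

```cpp
#include <cstdio>

int main()
{
    // Back-of-envelope for a 4K/30 Hz 2D sprite game. The overdraw
    // factor is a guess -- parallax layers, particles and UI can easily
    // touch several full-screen layers per frame.
    const double width    = 3840.0;
    const double height   = 2160.0;
    const double fps      = 30.0;   // target refresh
    const double overdraw = 4.0;    // assumed average layers per pixel

    const double pixelsPerFrame  = width * height;            // ~8.3 Mpix
    const double pixelsPerSecond = pixelsPerFrame * fps * overdraw;

    std::printf("%.1f Mpix per frame, %.2f Gpix/s required\n",
                pixelsPerFrame / 1e6, pixelsPerSecond / 1e9);
    // Prints roughly: 8.3 Mpix per frame, 1.00 Gpix/s required
}
```

If the published ROP numbers for the current consoles are anywhere near right, ~1 Gpix/s is an order of magnitude below their raw fill throughput, so the real costs would be blending, bandwidth and texture sampling rather than sheer pixel count.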

Whoa, that would be drool-worthy... add some good artists and enough budget to get some spectacular, detailed sprites and smooth animations going, add some lighting glitz (in 2D, that shouldn't be too expensive GPU-wise), and your 4K/30 Hz 2D game might really get attention even in this day and age.

For example, a UHD remaster of some 2D brawlers...

What I'm about to say is one of those "easier said than done" sort of things, but it's an ideal that, I feel, any developer worth their salt should consider...

Ignore screen size and develop your game assuming any resolution! This way, if someone is masochistic enough to try playing the game on a 2" 320x240 screen, so be it, but if they play it on a 100000000x100000000000000000 screen, all hail their C/GPU!

... in general, let the resolution be limited by the CPU or graphics processor, not by what you believe to be the standard.

Like I said, "easier said than done", but that doesn't mean it's something that should be brushed away. Mipmaps exist for this reason, after all.
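To make the "assume any resolution" idea concrete: the usual approach is to do all layout in a virtual coordinate space and only map to physical pixels at the very end. A minimal sketch, assuming nothing about any particular engine (all names here are made up for illustration):

```cpp
#include <algorithm>

// Game logic and layout live in a fixed virtual space; only this final
// transform knows the real screen, whether it's 320x240 or 8K.
struct Screen { int width; int height; };   // queried from the OS/driver
struct Vec2   { float x; float y; };

constexpr float kVirtualW = 1920.0f;        // arbitrary design-space size
constexpr float kVirtualH = 1080.0f;

// Map a virtual-space point to physical pixels, letterboxing so the
// aspect ratio survives on screens that aren't 16:9.
Vec2 virtualToScreen(Vec2 v, Screen s)
{
    const float scale = std::min(s.width  / kVirtualW,
                                 s.height / kVirtualH);
    const float offX  = (s.width  - kVirtualW * scale) * 0.5f;
    const float offY  = (s.height - kVirtualH * scale) * 0.5f;
    return { v.x * scale + offX, v.y * scale + offY };
}
```

The same scale factor can also drive mip/asset selection, which is where the mipmap point comes in.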

It's not easier said than done -- it's done routinely. That's how basically all PC games work, at least the ones that are 3D-rendered or vector-based 2D; 2D bitmap graphics tend to look better if designed for a smaller set of pre-chosen resolutions.

But on a console you have a known CPU/GPU, and you also know that most players will have 1080p TVs, with a smattering of 720p now and a smaller but increasing number of 4K TVs. Given what you know, it only really makes sense to cater to 1080p right now on consoles. PCs, on the other hand, have a wide range of CPU and GPU capabilities and are attached to displays with a wide variety of resolutions, so it makes sense there.
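For the 2D-bitmap caveat, "a smaller set of pre-chosen resolutions" in practice means authoring a few asset tiers and snapping to the closest one at startup, something along these lines (the tier list is an invented example, not from any shipping game):

```cpp
#include <array>

// Hypothetical asset tiers authored by the art team.
struct AssetTier { int targetHeight; const char* folder; };

const std::array<AssetTier, 3> kTiers = {{
    {  720, "assets/720p"  },
    { 1080, "assets/1080p" },
    { 2160, "assets/2160p" },   // the 4K set this thread is speculating about
}};

// Pick the smallest tier at least as tall as the output, so hand-drawn
// sprites are only ever downscaled -- upscaling them tends to look mushy.
const AssetTier& pickTier(int screenHeight)
{
    for (const AssetTier& t : kTiers)
        if (t.targetHeight >= screenHeight)
            return t;
    return kTiers.back();   // taller than our biggest tier: use 2160p anyway
}
```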

throw table_exception("(╯°□°)╯︵ ┻━┻");


I can't speak for AAA developers, nor, really, indie devs; I'm just a hobbyist at the moment, but... limiting your engine/game because of the "known specs" of the system you're running on still feels silly and short-sighted to me. Why cater to one setup when, by developing to be resolution agnostic, you cater to everyone? Not only would that make your game as "future-proof" as possible (say the Xbox One or PS4 goes through a revision and gets a slight boost somewhere), it would also allow the game/engine to migrate to platforms that support wildly different or varying resolutions with a minimum of effort.

Really... if this is being done already, why is there even a question as to whether 4K is worth it (from a development standpoint)? Games should be written without any resolution assumptions at all, and the question of whether 4K is worth it should be left purely in the hands of the people deciding whether to buy one.

As to the question of whether 4K is worth it from a financial standpoint... my opinion is no. I make a pretty decent salary at the moment, and a $400+ purchase is still a big decision for me. At the current price point of 4K TVs, they're definitely not worth it. If I could nab a 30" or 40" 4K set at about $400, then maybe.



Consoles don't generally improve in ways that are visible to software developers over their lifetimes, and on the few occasions they have (Nintendo DS -> DSi, 3DS -> New 3DS), software has generally catered to one or both hardware models explicitly rather than deriving its behavior from whatever capabilities it finds available. On the DSi and New 3DS, you literally ask the system which model it is and then choose a path that takes advantage of the new abilities; you don't provide for arbitrary feature configurations the way you do when, say, targeting PC GPUs of various flavors and capabilities.
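To put that pattern in code form: it's one explicit model query and a hard branch, not capability probing. A sketch with a made-up query function standing in for whatever the platform SDK actually exposes:

```cpp
#include <cstdio>

enum class HardwareModel { Base, Revised };

// Placeholder: a real title would call the platform SDK here, not this stub.
HardwareModel queryHardwareModel()
{
    return HardwareModel::Base;
}

void runBasePath()    { std::puts("base-model path"); }
void runRevisedPath() { std::puts("revised-model path: extra CPU/RAM assumed outright"); }

int main()
{
    // One branch at startup, each side written against a fully known spec --
    // not the per-capability probing you do across the PC GPU landscape.
    switch (queryHardwareModel())
    {
    case HardwareModel::Revised: runRevisedPath(); break;
    case HardwareModel::Base:
    default:                     runBasePath();    break;
    }
}
```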

The reason it's this way is that a console is a fixed platform, and will never not be a fixed platform. Even when there are differences, they are more like two similar fixed platforms than they are like the open PC ecosystem.

And it can never not be this way, because the fixed platform -- and the fact that developers can simply take advantage of that platform and know it's there, rather than testing for millions of possible configurations -- is what gives consoles their advantage. Look at Halo 4 on the Xbox 360: that generation of consoles was incredibly powerful for a console when it launched, but in nominal terms the hardware really isn't very strong -- the Xbox 360 GPU is just 240 shaders at 500 MHz. A PC with four times the GPU on paper would likely struggle to keep up at the same resolution, and a PC with a spec identical to the Xbox 360 wouldn't even dream such things were possible.

So why does the PC suffer so much? Because on the PC it is normal and expected to do the things you are talking about. On a console there is exactly one path, which receives all the attention and is known to the last detail, and therefore it is fast; on the PC there are tens, if not hundreds, of paths that mostly work well enough, and which might receive special attention if the market share of that particular GPU family is especially high around launch day -- plus you have to pay the tax of all the other layers of OS and hardware abstraction on a PC.

throw table_exception("(╯°□°)╯︵ ┻━┻");

