Even a modern Intel CPU's built-in hardware acceleration will give the RSX a run for its money!
That PC spec is definitely aimed at delivering much higher quality than current-gen consoles do.
It's noticeable for high-frequency details. Many details only change at a low frequency (except at edges), like indirect lighting, ambient occlusion, smoke/dust absorption, circle-of-confusion radius, etc... These can be computed at low resolution, and then upscaled with a bilateral filter to fix edges, or the edges can be re-rendered at high resolution (using Hi-Stencil to avoid re-rendering the upscaled data).
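To make the bilateral upscale concrete, here's a minimal CPU sketch of the idea. Real engines do this per-pixel on the GPU with a weighted 2x2 tap; this illustrative version just picks the low-res sample whose depth best matches the full-res pixel, so values never bleed across a depth edge. All names and the `depth_sigma` parameter are my own assumptions, not from any particular engine:

```python
import numpy as np

def bilateral_upsample(low_res, low_depth, high_depth, depth_sigma=0.1):
    """Upscale a low-res buffer (e.g. ambient occlusion) to full resolution,
    using the full-res depth buffer to reject low-res samples that lie on a
    different surface (i.e. across a depth edge)."""
    H, W = high_depth.shape
    h, w = low_res.shape
    out = np.empty((H, W), dtype=low_res.dtype)
    for y in range(H):
        for x in range(W):
            ly, lx = min(y * h // H, h - 1), min(x * w // W, w - 1)
            best, best_w = low_res[ly, lx], -1.0
            # Examine the 2x2 low-res neighbourhood around the mapped pixel.
            for dy in (0, 1):
                for dx in (0, 1):
                    sy, sx = min(ly + dy, h - 1), min(lx + dx, w - 1)
                    # Weight by depth similarity: a sample across a depth
                    # edge gets a near-zero weight and is effectively rejected.
                    wgt = np.exp(-((low_depth[sy, sx] - high_depth[y, x]) ** 2)
                                 / (2 * depth_sigma ** 2))
                    if wgt > best_w:
                        best, best_w = low_res[sy, sx], wgt
            out[y, x] = best
    return out
```

On flat surfaces this degenerates to ordinary upsampling; only at depth discontinuities does the filter do anything different, which is why computing these buffers at quarter resolution is usually invisible.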
Dropping down to that resolution would be very noticeable... But are you saying that this is not done in the PC ports of such games?
PC games likely use the same techniques, especially when you select low detail settings. It's a standard optimization these days -- e.g. if you're playing at 2048*1152 on PC, expect some calculations to take place at 1024*576.
Perhaps if you select uber detail settings, they won't perform these optimizations.
Halo 3 internally renders at 1152×640 and Reach at 1152×720, while Halo 4 uses a new engine that apparently renders at full 720p.
I'd like to read that information, sources?
I remember Bungie explained their choice somewhere (probably here), but googling [halo 720p 640 1152] brings up some other links besides the Wikipedia one above.
You can find lists like this by searching for something like [1080p ps3 games], but keep in mind that these will include false positives --
When a game boots, it checks your XMB/dashboard settings to see what your desired TV resolution is, and then it can choose to create its "front buffer" at that resolution, or a lower one (e.g. if you've selected 1080i, the game can still make a 720p front buffer). However, even if the game does create a 1080p front buffer, it may still be rendering at 720p and then up-scaling the result to 1080p itself.
Here's a good list: http://forum.beyond3...ead.php?t=46241
Most games choose 720p over 1080p because it's less than half the pixel cost (about 0.92M pixels vs 2.07M), and compared to modern PCs, the consoles suck at pixel processing.
Games under a lot of pressure to look great (like Halo or Modern Warfare) often go further, like the 640p example. Others dynamically change the resolution, like Wipeout. On the last game I worked on, we'd time the GPU, and if it started taking more than 33ms per frame, we'd continually reduce the horizontal resolution until the frame times stabilized (or until we hit a minimum resolution of 1024*720).
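The dynamic-resolution scheme above boils down to a simple feedback loop. Here's a hypothetical sketch of it -- the 33ms budget and the 1024-wide floor come from the post, while the 32-pixel step size and the 90% grow-back threshold are my own illustrative choices:

```python
# Feedback controller for dynamic horizontal resolution: shrink the render
# width while the GPU is over its frame budget, grow it back when there's
# comfortable headroom.
TARGET_MS = 33.0                    # 30Hz frame budget (from the post)
MIN_WIDTH, MAX_WIDTH = 1024, 1280   # floor of 1024x720, full 720p width
STEP = 32                           # resize granularity (assumption)

def adjust_width(width, gpu_frame_ms):
    """Return the render width to use for the next frame."""
    if gpu_frame_ms > TARGET_MS:
        width -= STEP               # over budget: render fewer pixels
    elif gpu_frame_ms < TARGET_MS * 0.9:
        width += STEP               # headroom: claw quality back
    return max(MIN_WIDTH, min(MAX_WIDTH, width))
```

Scaling only the horizontal axis is a nice trick: the up-scaled result is stretched along one dimension instead of two, so the blur is less objectionable than a uniform resolution drop of the same pixel count.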
Isn't eDRAM an advantage?
It's good and bad. It's separate from the regular 512MiB of RAM, which means that if you want to bind a render-target and keep its previous contents, you've got to copy the previous contents from RAM into eDRAM. When you've finished rendering a render-target, you've also got to copy the results out of eDRAM into regular RAM. This means that switching render-targets can be very expensive on the 360, so you've got to avoid it (on other GPUs, switching render-targets can be as simple as changing a single pointer).
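The render-target traffic described above can be modelled as a toy bookkeeping class -- every name here is illustrative, not a real API, but it shows why careless target switching hurts on this architecture:

```python
# Toy model of the 360-style eDRAM flow: binding a target whose previous
# contents must survive forces a RAM -> eDRAM copy, and finishing a target
# forces a "resolve" copy eDRAM -> RAM before anything can sample it.
class EdramModel:
    def __init__(self):
        self.copies = []  # record of expensive full-target transfers

    def bind(self, target, preserve_contents):
        if preserve_contents:
            # On most GPUs re-binding is ~free (a pointer change);
            # here it costs a full copy of the old contents.
            self.copies.append(("RAM->eDRAM", target))

    def resolve(self, target):
        # Results must leave eDRAM before they're visible in main RAM.
        self.copies.append(("eDRAM->RAM", target))
```

Ping-ponging between two targets N times therefore costs O(N) full-target copies on this model, versus O(1) pointer swaps on a conventional GPU -- hence the advice to batch all rendering into a target before moving on.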
Another down-side is that eDRAM is fixed size, and fairly small -- just 10MiB on the 360. This makes deferred rendering very hard (e.g. a 720p G-Buffer with 3 layers + a depth buffer is 14MiB) and also makes HDR complicated (e.g. a 720p RGBA FP16 + depth buffer is 10.5MiB). If you want to use those types of render-targets, then you either have to reduce their resolution until they do fit into the 10MiB limit, or split the target into multiple parts, and render your scene twice (doubling your vertex cost).
The upside is that eDRAM is lightning fast, so you're almost never ROP-bound, even with alpha blending and high bits-per-pixel formats.