jollyjeffers

ATI's new R520 hardware - no SM3 vertex texturing..

There's a more general thread on the ATI Radeon X1x00 series here... but I was just reading the coverage over at Beyond3D and I spotted this little fragment:
Quote:
From this page: However, one element of Vertex Shader 3.0 compliance is the capability for vertex texturing, yet there appears to be an absence of any texture lookup capabilities from ATI's diagrams
I'm gonna go double-check my docs, but that article indicates that vertex texturing via D3D9 only works with the set of texture formats the driver exposes, and the ATI hardware exposes no compatible formats. It seems there is a loophole in the SM3 specification that allows hardware to be "SM3 compliant" yet effectively disable vertex texturing. Clever loophole, I suppose.
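For anyone who wants to check rather than trust the diagrams, this is queryable through CheckDeviceFormat() with the D3DUSAGE_QUERY_VERTEXTEXTURE usage. A minimal sketch - the display-mode format and the helper name are my own choices:

#include <d3d9.h>

// Ask the driver whether a given format can be sampled from the vertex shader.
bool SupportsVertexTexture(IDirect3D9* d3d, D3DFORMAT fmt)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                 // assumed display-mode format
        D3DUSAGE_QUERY_VERTEXTEXTURE,    // "can the VS sample this?"
        D3DRTYPE_TEXTURE, fmt);
    return SUCCEEDED(hr);
}

// e.g. SupportsVertexTexture(d3d, D3DFMT_R32F) - if the article is right,
// this fails for every format on the new ATI parts.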
Quote:
as somewhat of an alternative to Vertex Texturing ATI will be promoting the use of a new extension to DirectX known as Render to Vertex Buffer... --- Snip --- ...rather than rendering to a displayable surface or texture the results are rendered to a buffer in memory that can be directly read as an input to the vertex shader. The upshot of this process is that an application can have access to the capabilities of the Pixel Shader which can then be fed back into the geometry processing pipeline, which should result in a superset of capabilities of vertex texturing and should actually perform better than current vertex texturing schemes because the pixel pipelines are inherently built to better cope with the latencies of texturing.
Sounds clever from a hardware point of view, but a little clunky from a graphics engine / software development point of view. Anyone here played with vertex texturing on the GeForce 6x00/7x00? Do they have the same limitation? Can they do this "R2VB" stuff that's mentioned by ATI? A little tech demo I was thinking about would require vertex texturing support [headshake]...

Cheers,
Jack
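PS - for reference, this is the sort of vs_3_0 fetch the tech demo would need. Just a sketch; the HLSL is embedded as a string and all the names are made up:

// vs_3_0 displacement sketch: tex2Dlod is required in the VS because there
// are no gradients there, and the texture is bound on the application side
// to the dedicated vertex sampler stage D3DVERTEXTEXTURESAMPLER0.
const char* g_displaceVS =
    "sampler2D heightMap : register(s0);                     \n"
    "float4x4  worldViewProj;                                \n"
    "void main(float4 pos : POSITION, float2 uv : TEXCOORD0, \n"
    "          out float4 oPos : POSITION)                   \n"
    "{                                                       \n"
    "    float h = tex2Dlod(heightMap, float4(uv, 0, 0)).r;  \n"
    "    pos.y += h * 10.0f;   // assumed displacement scale \n"
    "    oPos = mul(pos, worldViewProj);                     \n"
    "}                                                       \n";

// App side: device->SetTexture(D3DVERTEXTEXTURESAMPLER0, pHeightMap);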

I thought that their stalling of the PC graphics tech front for a year-plus was bad enough, yet it continues. It's obvious that ATI only cares about money, and not the advancement of one of the most beneficial aspects of computing.

Meanwhile, everyone with a 6000 series nVidia card is sitting here waiting for their GPU to become useful to its full extent.

Rather an old article: http://www.theregister.co.uk/2004/05/04/ati_confirms_no_shader_3/


Quote:
Obviously the pixel-shader processors have been upgraded to provide full support for SM 3.0.

http://www.hardwareanalysis.com/content/article/1817

Dang, that's annoying. I was kinda pumped for this new line of cards to come out, too. The stream output stuff is interesting and useful, but it will be really annoying to implement code that uses it just for ATI cards. In essence, not only will you have different code paths for each D3D version, but also for the IHVs (within DX9, anyway).

Quote:
Original post by Sneftel
That's a little strange. Didn't the x800 support vertex texturing?

I'm not aware of any previous ATI hardware supporting (properly or otherwise) vertex texturing. The previous parts did "hack" a form of "hardware instancing" through SM2 that is normally (iirc) reserved for SM3 hardware...
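If memory serves, that hack worked via a fake FOURCC format - treat the exact incantation below as an assumption rather than gospel:

// ATI's SM2 "instancing" unlock, as I remember it being documented:
// support is advertised via a bogus FOURCC surface format, and a magic
// value in D3DRS_POINTSIZE enables SetStreamSourceFreq() on SM2 parts.
const D3DFORMAT INST_FOURCC = (D3DFORMAT)MAKEFOURCC('I','N','S','T');

bool SupportsSM2Instancing(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        0, D3DRTYPE_SURFACE, INST_FOURCC));
}

// device->SetRenderState(D3DRS_POINTSIZE, (DWORD)INST_FOURCC);
// ...then use SetStreamSourceFreq() as you would on SM3 hardware.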

Quote:
It's obvious that ATI only cares about money, and not the advancement of one of the most beneficial aspects of computing.

Sad truth is that pretty much any/every company on this planet is only concerned about money. If innovation and advancement of a "beneficial aspect" is likely to generate them more $$$ then they'll probably go for it [rolleyes]

Jack

Damn. I was looking forward to ATI's new cards as well. Was hoping that if ATI finally came out with some cards in the same league as Nvidia's flagship models, we'd finally see some price competition again so prices would come down to reasonable levels. To me, SM3.0 is all about vertex texturing. Alas, I guess we'll have to wait.

My guess is that one reason ATI's been so late on providing SM3.0 hardware is that they've been busy working on chips for the Revolution and Xbox 360. Those are two totally different chips, and the teams can't even talk to each other because of agreements with Nintendo and Microsoft. So it's basically double the work. I'm sure they've got people working on the PC cards, but they've probably taken a backseat to the development of the console chips right now.

neneboricua

Quote:
Original post by neneboricua19
To me, SM3.0 is all about vertex texturing. Alas, I guess we'll have to wait.

Yup, but given that a lot of the high-end/commercial engines tend to have separate codepaths for NV/ATI (even if just for performance) I'd imagine we'll still see vertex texturing effects on the Radeon X1x00 cards.

It also seems to turn vertex texturing into a two-pass approach, which (as I read it) is a fundamental difference from the standard implementation. Thus any code will probably be very different [headshake]

Quote:
Original post by neneboricua19
My guess is that one reason ATI's been so late on providing SM3.0 hardware is that they've been busy working on chips for the Revolution and Xbox 360.

That's probably a substantial factor, but in that article I linked in my original post it seemed like they also had some unfortunate (and badly timed) engineering problems to contend with.

Hopefully they'll take this as a small stumble and do whatever they can to make sure they don't carry it over into whatever they come up with next [grin]

Jack

Quote:
Original post by jollyjeffers
Hopefully they'll take this as a small stumble and do whatever they can to make sure they don't carry it over into whatever they come up with next [grin]

Considering the D3D10 caps are all-or-nothing, they are gonna be pretty much forced to implement the full shader specs. Performance is certainly another issue, though...

You have got to love their spec sheets - "Shader Model 3.0 done right" [lol] What a crock of $*#(@

Quote:
Original post by circlesoft
You have got to love their spec sheets - "Shader Model 3.0 done right" [lol] What a crock of $*#(@

[lol] I like it ...

Although, as mentioned above, it seems they found a perfectly valid loophole in the specs that Microsoft were "okay with", thus it is technically implemented correctly. Just in a somewhat retarded way from a software engineering perspective.

Jack

Well, considering the SM3.0 spec seems to be more of a joint venture between NVidia and Microsoft, I can't really blame ATI. When I was looking to buy an X850 I made a little comparative sheet between the SM3.0 specs and the X850 specs (SM2.0b?) and it all comes down to glossy terms ATI can't legally use. The only true feature the X850 missed was the separate programmable specular & fog shader. And of course, with the whopping 255 shader instructions missing for full SM3.0 compliance, you're gonna have a hard time with the 65280 instructions available ;p

Anyway, in case anyone's interested:

This is very frustrating. I have been buying Nvidia cards for test machines to test all of Shader 3.0's beauty. Now I suppose it won't help much for ATI's 3.0 support. It's kinda hard to knock out the high % of ATI users. Maybe they will start to see the light and just go after Nvidia's 7x00 series for the next gens.

Quote:
Original post by Myopic Rhino
FYI, Jack, your email is bouncing (which is why you're not getting reply notifications):

User jhoxley not listed in Domino Directory

Thanks for the heads-up. Seems like I forgot to change my email address from my work account when I left the company 2 months ago. Whoops [oh]

Quote:
This is very frustrating. I have been buying Nvidia cards for test machines to test all of Shader 3.0's beauty. Now I suppose it won't help much for ATI's 3.0 support.

So far it's only this one major feature that ATI seems to have differed on. Maybe there are more... but in general, the rest of the ATI SM3 implementation seems fairly respectable.

Quote:
Maybe they will start to see the light and just go after Nvidia's 7x00 series for the next gens.

I doubt it. The benchmarks that matter to the end user ("which card makes my games look prettiest and run fastest?") indicate that the ATI X1x00 cards are at the very least competing with the Gf7x00's and sometimes beating them.

This ATI card might annoy us developers and not really offer us much more than what NV have been doing for a while, but we're a relatively small percentage of potential customers [rolleyes]

Jack

hmm... I'm not getting anywhere near SM3.0 this year, I guess... first I have to brush up my skills in the basics of that 3D stuff. Still, I think that s*cks big time... the shader models were supposed to be a standard of some kind, weren't they? What good is a standard if around 50% of sold hardware doesn't comply with it exactly, but still says it does? Of course there are some tricks for NVidia and ATI cards to get more performance out of them, but if I understand it correctly, ATI effectively abuses SM3.0 for advertising purposes on the outside of their boxes, without really complying with it, therefore forcing developers to do twice the work if they want to use SM3.0 in their app? bleh....

Quote:
Original post by matches81
the shader models were supposed to be a standard of some kind, weren't they? What good is a standard if around 50% of sold hardware doesn't comply with it exactly, but still says it does?

I don't have the exact D3D9 specs to hand, but my understanding is that they state a minimum feature set needed to be SM2 or SM3 "compliant". Through the enumeration functions there is quite a lot of room for the IHVs to manoeuvre - hence you can get different cards supporting different numbers of shader instructions (etc.), so long as they support the minimum demanded by the spec.
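To illustrate the room for manoeuvre - a quick sketch, assuming an already-created device (the local names are mine):

D3DCAPS9 caps;
device->GetDeviceCaps(&caps);

// "SM3 compliant" only pins down the version numbers and some floors...
bool sm3 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
           caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);

// ...while limits like these only have spec-mandated minimums (512 slots),
// so two "compliant" cards can legitimately report very different numbers.
DWORD vsSlots = caps.MaxVertexShader30InstructionSlots;
DWORD psSlots = caps.MaxPixelShader30InstructionSlots;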

As for advertising... do you know many end-users/gamers that'll know (or care) about the different vertex texturing implementations for ATI/NV? As long as that's the case I'm sure neither company will be too fussed about "blurring" the facts [grin]

hth
Jack

Quote:
Original post by KronosGL
There is more: the R520 doesn't have filtering for FP textures.
This is very very annoying...

Do you have a link/reference for that?

This page from the B3D article includes this fragment:
Quote:
ATI have supported various HDR methods since the introduction of R300 with its floating point texturing capabilities, however NVIDIA have supported a more optimal method of High Dynamic Range blending, something that ATI's part have not been able to previously. With the entire range of chips using the R520 architecture ATI will now support HDR blending, but will do so under a number of formats:

* FP16 - 64-bit floating point
* Int16 - 64-bit integer
* Int10 - 32-bit 10-10-10-2
* Custom formats (eg Int10+L16)
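Worth noting that blending support of this sort is queryable per-format through the usual caps mechanism - a minimal sketch (format choices are illustrative):

// Ask whether a render-target format supports post-pixel-shader blending,
// e.g. fmt = D3DFMT_A16B16G16R16F for the FP16 case above.
bool SupportsHDRBlending(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, fmt));
}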


Cheers,
Jack

Quote:
Original post by jollyjeffers
Do you have a link/reference for that?

This page from the B3D article includes this fragment:
Quote:
ATI have supported various HDR methods since the introduction of R300 with its floating point texturing capabilities, however NVIDIA have supported a more optimal method of High Dynamic Range blending, something that ATI's part have not been able to previously. With the entire range of chips using the R520 architecture ATI will now support HDR blending, but will do so under a number of formats:

* FP16 - 64-bit floating point
* Int16 - 64-bit integer
* Int10 - 32-bit 10-10-10-2
* Custom formats (eg Int10+L16)


Cheers,
Jack


What does that have to do with floating-point texture filtering? ATI only allows point filtering - no mipmapping or anisotropic.

http://www.beyond3d.com/reviews/ati/r520/index.php?p=03
Texture processing is achieved in a similar fashion to ATI's previous parts. Although the earlier pipeline diagram indicates a texture sampler array, the texture units are not re-allocatable to different pipelines; instead 4 are dedicated to each of the quads. At present ATI see the primary use of float texture sampling as being for lookup data, which only requires point sampling, and so at present haven't put floating point texture filtering in place, instead relying on filtering in the shader if required.

Quote:
Original post by KronosGL
What does that have to do with floating-point texture filtering? ATI only allows point filtering - no mipmapping or anisotropic.

Okay, good point - you win [smile]

I misread the "blending" as "filtering".

Sucks that they don't see it as a "primary use". My recent HDR Demo looks a *lot* nicer/smoother when running under the REF where all the FP data is linearly filtered.

The thing with implementing it in the shader is that it adds just that little bit more complexity - even if it is only ~30 more instructions (see the sketch below). Guess in light of what I started this thread about, I/we shouldn't be too surprised [rolleyes].
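For the curious, something like this is what "filtering in the shader" boils down to - a sketch with the HLSL embedded as a string; texSize is the FP texture's dimensions, and every name here is made up:

// Bilinear filtering built by hand from four point samples of an FP texture.
const char* g_bilinearPS =
    "sampler2D fpTex : register(s0);                                \n"
    "float2    texSize;   // e.g. (256, 256)                        \n"
    "float4 tex2DBilinear(float2 uv)                                \n"
    "{                                                              \n"
    "    float2 texel = uv * texSize - 0.5;   // texel-space coord  \n"
    "    float2 f     = frac(texel);          // blend weights      \n"
    "    float2 base  = (floor(texel) + 0.5) / texSize;             \n"
    "    float2 inv   = 1.0 / texSize;                              \n"
    "    float4 t00 = tex2D(fpTex, base);                           \n"
    "    float4 t10 = tex2D(fpTex, base + float2(inv.x, 0));        \n"
    "    float4 t01 = tex2D(fpTex, base + float2(0, inv.y));        \n"
    "    float4 t11 = tex2D(fpTex, base + inv);                     \n"
    "    return lerp(lerp(t00, t10, f.x), lerp(t01, t11, f.x), f.y);\n"
    "}                                                              \n";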

Thanks,
Jack

Quote:
Original post by jollyjeffers
Quote:
Original post by matches81
the shader models were supposed to be a standard of some kind, weren't they? What good is a standard if around 50% of sold hardware doesn't comply with it exactly, but still says it does?

I don't have the exact D3D9 specs to hand, but my understanding is that they state a minimum feature set needed to be SM2 or SM3 "compliant". Through the enumeration functions there is quite a lot of room for the IHVs to manoeuvre - hence you can get different cards supporting different numbers of shader instructions (etc.), so long as they support the minimum demanded by the spec.

As for advertising... do you know many end-users/gamers that'll know (or care) about the different vertex texturing implementations for ATI/NV? As long as that's the case I'm sure neither company will be too fussed about "blurring" the facts [grin]

hth
Jack


I know that the shader models are a set of minimum specs/abilities... but that's exactly my point: if the info in this thread is correct, ATI's next chip won't fulfil those minimum specs, but it is still advertised as Shader Model 3 "compliant".
Sure, no end-user will care, as long as the software developers (mainly game developers) find a way to get the wanted result - which was my point again. As a developer I would really love to be able to check whether the hardware complies with Shader Model 3 and implement my shaders the same for all Shader Model 3 hardware, which I obviously can't do when the ATI cards are unable to do some features that SM3 requires but still claim that they can. It seems I would have to write a completely different shader for the ATI cards, which I really dislike...

To make things short I simply dislike that ATI seems to expect something like special treatment when they could have done it the "standard" way, which results in more work for the devs.

Quote:
Original post by matches81
To make things short I simply dislike that ATI seems to expect something like special treatment when they could have done it the "standard" way, which results in more work for the devs.

I agree.

From everything I've read about it... the R5x0 cards seem to have had big hardware problems - and the big indication of no "normal" vertex texturing is that the silicon between the VShader units and the texture samplers simply doesn't exist. Which, to me, reads more like a case of the hardware engineers doing it the way they wanted and then sorting out the software later on [rolleyes]

I think ATI should give everyone in this thread a free dev-board to verify that it is (or isn't) rubbish [grin]

Jack

Quote:
Original post by jollyjeffers
I think ATI should give everyone in this thread a free dev-board to verify that it is (or isn't) rubbish [grin]

I think I can already verify that it is rubbish, but I think they should still give us the dev board. Maybe then I could sell it and make enough $$$ to buy a new Nvidia card, and actually have real 3.0 [lol]

Quote:
Original post by circlesoft
Quote:
Original post by jollyjeffers
I think ATI should give everyone in this thread a free dev-board to verify that it is (or isn't) rubbish [grin]

I think I can already verify that is rubbish, but I think they should still give us the dev board. Maybe then, I could sell it and make enough $$$ to buy a new Nvidia card, and actually have real 3.0 [lol]

Ssshhhhhhhhhh!! they might hear your plan [looksaround]

Did you not get enough money by selling your controller on eBay then? [lol]

Jack

Quote:
Original post by jollyjeffers
Ssshhhhhhhhhh!! they might hear your plan [looksaround]

Did you not get enough money by selling your controller on eBay then? [lol]

Nope, it's just that I usually have to take out a small loan and refinance my house in order to buy a new graphics card, hehe
