HDR+AA In Action with ATI's Radeon X1800/X1900s


Over the past several months, we've detailed the performance of ATI's latest Radeon X1K cards at length. Back in October we took our first look at the Radeon X1800 family, previewing the performance of the X1800 XT and XL cards; the Radeon X1800 XT 512MB in particular turned into an impressive performer with subsequent driver updates from ATI. Then, at the beginning of this year, we took a look at ATI's Super AA anti-aliasing modes.

But the one topic we hadn't discussed at length (and that you kept reminding us about in the comments) is ATI's unique ability to support anti-aliasing alongside high dynamic range (HDR) lighting, a feature found on all of their X1K cards. Only ATI's Radeon X1K cards are capable of pulling off this feat.
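To make the feature concrete: under Direct3D 9, an application can ask up front whether multisampling is available on a floating-point render target, which is exactly the combination HDR+AA requires. The sketch below is illustrative rather than taken from the article, and assumes a standard D3D9 setup; on GeForce 6/7-class hardware the query fails for FP16 surfaces, while the X1K parts report support.

```cpp
#include <d3d9.h>
#include <cstdio>

// Sketch: query whether 4x multisampling is supported on an FP16
// render target -- the combination needed for HDR+AA. Assumes a
// created IDirect3D9 interface; error handling kept minimal.
bool SupportsHdrPlusAA(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,     // 64-bit floating-point (FP16) surface
        TRUE,                     // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);
    return SUCCEEDED(hr);
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    std::printf("4xAA on FP16 target: %s\n",
                SupportsHdrPlusAA(d3d) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}
```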



Conclusion:


Adding 4xAA/8xAF to Far Cry running with HDR had very little effect on the Radeon cards, relatively speaking. At 1600x1200 the Radeon X1900 XTX's performance drops by just 2 fps, or a little over 4%, while the X1800 XT 512MB sees an even slimmer 2% drop-off. Even the slower Radeon X1800 GTO and X1900 GT cards see only slight declines once AA is added on top of HDR in Far Cry.
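For anyone checking the arithmetic, the drop is simply the frame-rate delta over the HDR-only baseline. The baseline itself isn't stated, but a figure in the high 40s is implied; assuming roughly 48 fps:

\[
\text{drop} = \frac{\Delta \text{fps}}{\text{fps}_{\text{HDR}}} = \frac{2}{48} \approx 4.2\%
\]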

Under the greater demands of Oblivion, the margins are definitely greater, but we still saw manageable frame rates; adding AA to HDR actually comes free at 1024x768 for all cards except the Radeon X1800 GTO, and keep in mind that we could easily turn down the graphics settings a little for even better performance. In our outdoor testing the Radeon X1900 XTX saw a performance drop-off of nearly 30%, while the Radeon X1800 XT took a performance hit of 21% once HDR+AA was enabled. Similarly, the Radeon X1900 GT took a greater hit than the GTO.

This is probably because the GPU on the older R520 cards is already heavily bottlenecked once HDR is running in Oblivion; with AA added on top, performance can't drop much further. The R580-based cards, by contrast, aren't quite as overtaxed with just HDR running, so enabling HDR+AA produces a more significant performance decline.

The differences between running with HDR and HDR+AA aren't quite as significant in the foliage testing simply because the foliage area is already more stressful on the graphics card than the outdoor area.

Looking over the results, HDR+AA is certainly a lot more feasible on ATI’s Radeon X1K cards than we initially thought. Getting playable frame rates with HDR+AA shouldn’t be too hard as long as you keep the eye candy in check, and with older games like Far Cry you should be able to turn it all on while also running HDR+AA without any problems.



Source:
HDR+AA In Action with ATI's Radeon X1800/X1900s
 
Yeah ... probably one reason that makes you get an X1900 XTX over a 7900 GTX, even though the latter has a better SLI implementation ...
 
deathvirus_me said:
Yeah ... probably one reason that makes you get an X1900 XTX over a 7900 GTX, even though the latter has a better SLI implementation ...
That's CrossFire for ATI :P
And the XT and XTX don't have much in the way of difference. You'd be better served with an XT.

For games like FEAR, HDR doesn't do much because of the gloomy atmosphere throughout the game, so the 7900 GTX is keeping me happy :)
 
That's CrossFire for ATI

What I meant was that SLI has a better performance gain than CrossFire ...

And the XT and XTX don't have much in the way of difference. You'd be better served with an XT.

The XTX can do more shader operations than the XT; in fact, if I'm not mistaken, almost twice as many ...

For games like FEAR, HDR doesn't do much because of the gloomy atmosphere throughout the game, so the 7900 GTX is keeping me happy

But the extra shaders in the X1900 XTX matter quite a bit ...

so the 7900 GTX is keeping me happy

It has to ... you probably get over 50 fps anyway ... does it matter? HDR still needs better implementation; for example, HL2: Episode One looks a bit better than Oblivion with HDR, even though the former doesn't implement true HDR ... that's a bit funny, isn't it?
 
I am just curious about the HDR and AA limitation in NVIDIA cards. Is it that you cannot enable both of them simultaneously in games, or that you can enable both but only one works internally? Or is it that the cards cannot handle full HDR and AA due to the sheer processing power required?

I was just playing HL2: Lost Coast yesterday (on a 6800 GT); I had kept anti-aliasing at 2x and HDR at Full, and I felt that both were in action.
 
Lost Coast uses fake HDR and does not blend in FP16. The basic reason why it doesn't work on NV is that they reuse some of the transistors used for AA when doing HDR. I'm not aware of the internals.
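The FP16-blending capability mentioned here is also something an application can query directly. A minimal sketch (again Direct3D 9, illustrative rather than from the post, and assuming a D3DFMT_X8R8G8B8 display format) checks whether the hardware can alpha-blend into an FP16 texture:

```cpp
#include <d3d9.h>

// Sketch: ask whether the device supports post-pixel-shader blending
// into an FP16 texture -- the capability that "fake HDR" render paths
// avoid relying on. Assumes a created IDirect3D9 interface.
bool SupportsFp16Blending(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                         // adapter (display) format
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING, // blending after the PS
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);                   // FP16 texture format
    return SUCCEEDED(hr);
}
```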

@Deathvirus: The X1900 XT and X1900 XTX are the same basic GPU; both have 16 TMUs and 48 pixel shader processors, unlike what you believe. The only difference is in the clocks: the XT is clocked at 625/725 whereas the XTX is clocked at 650/775. Also, some of the XTs use 1.2 ns RAM whereas all XTX boards use 1.1 ns RAM.
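The clock deltas alone roughly account for the small real-world gap between the two cards; treating shader throughput as proportional to core clock and bandwidth as proportional to memory clock:

\[
\frac{650}{625} \approx 1.04 \;(\text{+4\% core}), \qquad \frac{775}{725} \approx 1.07 \;(\text{+7\% memory})
\]

which lines up with the 4-6% difference mentioned a couple of posts down.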
 
deathvirus_me said:
What I meant was that SLI has a better performance gain than CrossFire ...

No it doesn't; in most cases the gains are pretty similar. In a number of cases CrossFire's gains are greater than SLI's, and vice versa.

deathvirus_me said:
The XTX can do more shader operations than the XT; in fact, if I'm not mistaken, almost twice as many ...

Lol, no way, it's just about 4-6%. The XTX is exactly the same as the XT except for the clock speeds, which are 650 MHz core and 1550 MHz (effective) memory compared to 625 and 1450.

lord_nemesis said:
I am just curious about the HDR and AA limitation in NVIDIA cards. Is it that you cannot enable both of them simultaneously in games, or that you can enable both but only one works internally? Or is it that the cards cannot handle full HDR and AA due to the sheer processing power required?

I was just playing HL2: Lost Coast yesterday (on a 6800 GT); I had kept anti-aliasing at 2x and HDR at Full, and I felt that both were in action.

You can't enable both simultaneously on NVIDIA cards. It's not because of processing power; it's the architecture: something about not being able to do both in the pipeline at the same time. In HL2, HDR is implemented in a different way, hence you can use them both at the same time.
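As a rough illustration of the two approaches discussed in this thread, here is a hedged sketch (the helper name and structure are hypothetical, not from any engine): try to create the multisampled FP16 render target that true HDR+AA needs, and if the driver refuses, as GeForce 6/7-class parts do, fall back to a multisampled 8-bit target with tone mapping done in the pixel shader, which is roughly how Lost Coast's HDR stays compatible with AA.

```cpp
#include <d3d9.h>

// Sketch (hypothetical helper): prefer a multisampled FP16 render
// target for "true" HDR+AA; fall back to a multisampled integer
// target, relying on shader-side tone mapping, where FP16+MSAA is
// unsupported. Assumes a created IDirect3DDevice9.
IDirect3DSurface9* CreateHdrTarget(IDirect3DDevice9* dev,
                                   UINT width, UINT height)
{
    IDirect3DSurface9* rt = nullptr;

    // Path 1: FP16 target with 4x multisampling (works on Radeon X1K).
    if (SUCCEEDED(dev->CreateRenderTarget(
            width, height, D3DFMT_A16B16G16R16F,
            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &rt, NULL)))
        return rt;

    // Path 2: multisampled 8-bit target; HDR is approximated by
    // tone mapping inside the pixel shader before the write.
    if (SUCCEEDED(dev->CreateRenderTarget(
            width, height, D3DFMT_A8R8G8B8,
            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &rt, NULL)))
        return rt;

    return nullptr; // no multisampled target available at all
}
```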
 