HDR+AA In Action with ATI's Radeon X1800/X1900s
Over the past several months, we’ve detailed the performance of ATI’s latest Radeon X1K cards at length. Back in October we took our first look at the Radeon X1800 family, previewing the performance of the X1800 XT and XL cards; the Radeon X1800 XT 512MB in particular turned into an impressive performer with subsequent driver updates from ATI. Then, at the beginning of this year, we took a look at ATI’s Super AA anti-aliasing modes.
But the one topic we hadn’t discussed at length (and one that you continued to remind us about in the comments) is ATI’s unique ability to support anti-aliasing alongside high dynamic range (HDR) lighting, a feature found on all of their X1K cards. Only ATI’s Radeon X1K cards are capable of pulling off this feat.
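What makes this combination possible is that the X1K series can multisample the floating point (FP16) render targets these games use for their HDR modes. As a rough illustration only (this sketch is not from the article, and the helper name and 4x sample count are our own placeholders), a Direct3D 9 title could probe for that capability along these lines:

```cpp
#include <d3d9.h>

// Hypothetical helper: ask the adapter whether it can apply 4x multisampling
// to an FP16 render target, the surface type typically used for HDR output.
bool SupportsHdrPlusAA(IDirect3D9* d3d, UINT adapter)
{
    DWORD qualityLevels = 0;

    // D3DFMT_A16B16G16R16F is the 16-bit-per-channel floating point format
    // commonly used as an HDR render target in games of this era.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        adapter,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // FP16 surface format
        FALSE,                     // full-screen mode
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &qualityLevels);

    // Success with at least one quality level means the hardware can
    // multisample the HDR surface, i.e. HDR+AA is possible.
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```

On hardware that fails this check, a game has to choose between HDR and anti-aliasing; on the X1K cards it can enable both, which is exactly the scenario we benchmark below.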
Conclusion:
Adding 4xAA/8xAF to Far Cry running with HDR had very little effect on the Radeon cards, relatively speaking. At 1600x1200 the Radeon X1900 XTX’s performance drops by just 2 fps, or a little over 4%, while the X1800 XT 512MB sees an even slimmer 2% drop-off. Even the slower Radeon X1800 GTO and X1900 GT see only slight declines once AA is added on top of HDR in Far Cry.
Under the greater demands of Oblivion the margins are definitely greater, but we still saw manageable frame rates; adding AA on top of HDR actually comes free at 1024x768 for all cards except the Radeon X1800 GTO, and keep in mind that we could easily turn down the graphics settings a little for even better performance. In our outdoors testing the Radeon X1900 XTX saw a performance drop-off of nearly 30%, while the Radeon X1800 XT took a performance hit of 21% once HDR+AA was enabled. Similarly, the Radeon X1900 GT took a greater hit than the X1800 GTO.
This is probably because the GPU on the older R520-based cards is already heavily bottlenecked once HDR is running in Oblivion; when AA is added, performance can’t bottom out much further. The R580-based cards, by contrast, aren’t quite as overtaxed with just HDR running, so enabling HDR+AA produces a more noticeable decline.
The differences between running with HDR and HDR+AA aren’t quite as significant in our foliage testing, simply because the foliage area is already more stressful on the graphics card than the outdoors area, leaving less room for AA to drag performance down further.
Looking over the results, HDR+AA is certainly a lot more feasible on ATI’s Radeon X1K cards than we initially thought. Getting playable frame rates with HDR+AA shouldn’t be too hard as long as you keep the eye candy in check, and with older games like Far Cry you should be able to turn it all on while also running HDR+AA without any problems.
Source: HDR+AA In Action with ATI's Radeon X1800/X1900s