I don't have the 7xxx, but 2x6970.
The rig is not operational at the moment, but when it was, I never noticed any problems. I was running the cards pretty much at their limit at 2560x, all details maxed out, and max AA/AF. I never had noticeable frame stutter, ever.
I suspect that any single card will be fine at 1920x.
[rant]Please note it is NOT 1080p; that is a video scanline format, whereas 1920x1080 is a pixel count. Games use the latter, not the former. Monitor manufacturers love to keep this distinction blurry, as they immediately saved 15% of screen area for the same diagonal size and added to some people's retirement funds. There is a reason good review sites still quote resolution as an HxV number, which tells you the exact pixel count. Users should know the difference.[/rant]
Now, to the TR graphs. Those are NOT FPS, but frame render times.
Our first result is a simple plot of the time needed to render each frame during one of our test runs. Because the frame render times are reported in milliseconds, lower times are preferable. Note that, although you may see FPS-over-time plots elsewhere, those usually are based on averaging FPS over successive one-second intervals; as a result, they tend to mask momentary slowdowns almost entirely. Our plots are sourced from the raw frame time data instead.
While it is an interesting approach, I don't see this as anything but creative marketing. 30 FPS works out to 33.333 ms per frame (33.333 ms x 30 ≈ 1000 ms). This test basically shows that the 7950 cannot hold a 30 FPS minimum in some games at the chosen settings - which a simple FPS logger could also tell you. I don't see the point of displaying the data this way.

Stutter is a subjective quality, not an objective one. Different people have different sensitivity to it, and between your monitor and your input peripherals, anything under 50 ms is pretty difficult to a) detect and b) react to in a gaming situation. The best sim racers struggle to judge lap time differences under 300 ms, and that is on a known track with a lot of the thinking done in advance (unlike FPS games, where you do a lot of sailing by the seat of your pants, so even larger differences swamp the seemingly irritating 50 ms spikes).
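For anyone who wants to see the arithmetic, here is a rough sketch (Python, with made-up frame times rather than anything from the TR data) of how a frame-time log maps onto the usual FPS numbers and a simple "did it drop below 30 FPS" count:

[code]
# Rough sketch of the arithmetic above (made-up frame times, not TR data).
# It converts a frame-time log into the familiar FPS figures and counts
# the frames that miss the 30 FPS (33.333 ms) mark.

# Hypothetical log: mostly 15 ms frames (~66 FPS) with two 70 ms hitches.
frame_times_ms = [15.0] * 60 + [70.0, 15.0, 70.0] + [15.0] * 60

THRESHOLD_MS = 1000.0 / 30.0   # 30 FPS = 33.333 ms per frame

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
slow = [t for t in frame_times_ms if t > THRESHOLD_MS]
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average FPS         : {avg_fps:.1f}")
print(f"worst single frame  : {max(frame_times_ms):.0f} ms (~{worst_fps:.0f} FPS)")
print(f"frames below 30 FPS : {len(slow)} of {len(frame_times_ms)}")
[/code]

Whether you call a 70 ms frame a "14 FPS dip" or a spike on a frame-time plot, it is the same event measured in different units.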
Obviously the reviewer has noticed visible micro-stutter. What we do not know is whether that impression came before the data or after it, and we never will. It also does not necessarily mean that you will or will not notice it. I do not have enough data to recommend a card at this point; I have not done enough background checking, not being in the market for a card myself. You must understand, though, that every reviewer in the world, in every category, tries to offer a fresh approach and a fresh perspective. While I do not believe the TR article is dishonest, I also do not believe that the 7950 has trouble at 1920x that an average Joe will notice - unless there is something desperately wrong with the system.
Good Luck with your search.
- - - Updated - - -
ALPHA17: "Shouldn't the fact that AMD has acknowledged the problem be enough to lay any doubts about the test's authenticity to rest?"
Unfortunately, no.
If I were AMD and a particular way of testing the cards put me at a huge disadvantage to a rival, I would definitely try to fix it.
Whether or not this affected the perceived performance of the card, or made it a better or worse buy, would not have mattered. Older users will remember how nVidia used to cheat in 3DMark tests by killing image quality in 'optimised' drivers, and users had to resort to hacks to get their better IQ back.
The world is run by marketing, my friend. The simplest fix would be to set your refresh rate to 30 Hz, switch Vsync on in all games, and back off the IQ settings in games that force the card to dip below 30 FPS.
In real-world terms, I would bet that nobody but the best fighter pilots could tell the difference between smooth motion at 24 FPS versus 30 FPS.