Graphics Cards: 2x 6600GT vs 6800GT

Get the 6800GT, since it will outscore 2x 6600GTs at higher resolutions with AA and AF enabled. The 6600GT is memory-bandwidth starved at higher resolutions due to its 128-bit bus.
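For the curious, the bandwidth gap is easy to put numbers on. A quick Python sketch, assuming ~1000 MT/s effective GDDR3 on both cards (reference clocks; actual models vary):

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    # peak bandwidth = bytes per transfer * transfers per second
    return bus_width_bits / 8 * effective_mt_s / 1000

print(bandwidth_gb_s(128, 1000))  # 6600GT: 16.0 GB/s
print(bandwidth_gb_s(256, 1000))  # 6800GT: 32.0 GB/s

And SLI doesn't pool that: each 6600GT still only has its own 16 GB/s to work with, which is why high resolutions with AA hurt.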
 
Blade_Runner said:
Get the 6800GT, since it will outscore 2x 6600GTs at higher resolutions with AA and AF enabled. The 6600GT is memory-bandwidth starved at higher resolutions due to its 128-bit bus.

Exactly :D The 6600GT's 128-bit memory interface just doesn't cut it.

The only reason to get 2x 6600GTs is if you already had one and wanted to upgrade...

Although 2x 6600GTs will be faster at medium resolutions with moderate AA...

but I would also go for a 6800GT over 2x 6600GTs.
 
The only thing good about 2x 6600GT is a better score in 3DMark.

With high resolutions and AA/AF enabled, it loses out in gaming to a single 6800GT.
 
6800 GT > 2 6600GT SLI.

The SLI rig may get you a little more fps at lower resolutions like 800x600, but once you go to 1024x768, you'll see the 6800GT fly past the 2x 6600GTs.
 
Chaos said:
Single 6800GT or X800XL (if you care about image quality more than fps) is the best bet ;)
I really don't think ATI still has the superior image quality advantage.
High quality mode + 16x AF + clamped negative LOD bias = as good as it gets, with hardly any performance hit.
 
Chaos said:
Single 6800GT or X800XL (if you care about image quality more than fps) is the best bet ;)
I think you're right about the 9xxx series looking better than the 5xxx series, BUT now there is no difference... check.
 
saumilsingh said:
I really don't think ATI still has the superior image quality advantage.
High quality mode + 16x AF + clamped negative LOD bias = as good as it gets, with hardly any performance hit.
NVIDIA still doesn't have the AA quality of ATI... it really has no answer to ATI's 6x temporal AA (effectively 12x if you maintain 60fps and above). I know you'll tell me that there is 8xAA, which is nothing but 4x MSAA and 2x SSAA. Still, it doesn't get close to ATI's quality, since the 6800 lacks gamma-corrected blends. Also, the performance in 8xAA is really bad, and it's not really a usable mode! Gamma-corrected blends are present in the 7800 series, but even that is currently plagued with bad AF issues, which can't be solved even by high quality + negative LOD clamping. Check the link below for more info about that issue.

http://3dcenter.org/artikel/g70_flimmern/index_e.php
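To see what gamma-corrected blends actually buy you, here's a toy Python sketch (assuming a simple 2.2 gamma curve for illustration, not any vendor's actual resolve hardware):

GAMMA = 2.2

def naive_blend(samples):
    # average the stored (gamma-encoded) values directly
    return sum(samples) / len(samples)

def gamma_corrected_blend(samples):
    # decode to linear light, average, re-encode
    linear = [s ** GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1 / GAMMA)

# a 4xAA edge pixel: half white coverage, half black
samples = [1.0, 1.0, 0.0, 0.0]
print(naive_blend(samples))            # 0.50 -> edge comes out too dark
print(gamma_corrected_blend(samples))  # ~0.73 -> perceptually correct

The naive blend is roughly what a card without gamma-corrected blends does, which is part of why its AA edges look harsher.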
 
Get the 6800GT.

And I didn't notice any difference in image quality between a 9700 Pro and a 6600GT in any game other than Far Cry, which was fixed in the later patches.
 
stormblast said:
Get the 6800GT.

And I didn't notice any difference in image quality between a 9700 Pro and a 6600GT in any game other than Far Cry, which was fixed in the later patches.

At the default settings (quality, not high quality), try running Serious Sam. You'll notice loads and loads of shimmering. Most of it is fixed by negative LOD clamping + disabling the AF/trilinear optimizations, but with the optimizations disabled, the performance drop is pretty dramatic!
 
lol, temporal AA is useless; even 0xAA is better than something that keeps switching on and off as you play.

NVIDIA's quality mode isn't to be used anyway unless one owns a GF4 MX; only that card benefits at all from the ~5% boost it provides by wrecking image quality.

High quality mode, 16x AF, negative LOD clamp (antialiasing as required) is where it's at, and no game shows any noticeable shimmering at those settings.
In quality mode, some areas in KOTOR 2 show heavy shimmering, but high quality mode looks picture perfect.

And the new transparency AA in the 7800 series improves the image quality even more.
 
Have you ever used temporal AA? It's awesome where it works! The shimmering is present only on 7800 cards, not on NV4x cards, so obviously you wouldn't notice it ;). Also, about the performance difference between quality and high quality: even on a 6600GT, it's around 25% and not <5% as you mentioned. I just checked it out after you mentioned it... The screenies were taken from the Serious Sam: The Second Encounter technology test (1280x960, 4xAA, 8xAF) at the position where it begins. Quality yields 111fps with severe texture shimmering. High quality with all optimizations off yields 84fps. Quite a severe drop, I must say...
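For anyone who wants to check the math on those numbers, a quick Python sketch:

quality_fps, high_quality_fps = 111, 84
drop = (quality_fps - high_quality_fps) / quality_fps
print(f"{drop:.1%}")  # 24.3% -- right around the 25% quoted above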

Here are the screenies.

Quality: [screenshot]

High Quality: [screenshot]
Edit: Forgot to mention... no shimmering on high quality. This is the mode websites should use for benchmarking against ATI, since the default image quality of Radeons is as good as high quality.
 
It's strange that Serious Sam should suffer such a big performance hit.

In Doom 3 it cost me 4-5fps in the timedemo at 1280x960, 2xAA, 8xAF - down to around 39 from 43.
Similar results in KOTOR 2 at 1280x960, 4xAA, 16xAF and Far Cry at 1280x960, 2xQ AA, 8xAF (although Far Cry doesn't show any shimmer in the first place).
After that I just force high quality in every game.

ATI definitely has better AA, but the difference isn't big enough to be a deciding factor.
 
lol, no one cares about Serious Sam.

And in the time I had the 9700 Pro, I didn't find any use for temporal AA. Maybe some games implement it well now.
 
stormblast said:
lol, no one cares about Serious Sam.

And in the time I had the 9700 Pro, I didn't find any use for temporal AA. Maybe some games implement it well now.
LOL, I did it in Serious Sam as it loads quickly :P. It's the same in every app that I've seen. In GL stuff that I write, I've noticed the same 20-25% drop with AA and AF enabled. Without AA it's slightly less.

@saumil: Doom ain't texture-intensive enough. Try HL2 (don't have it right now or else I would have tried) or the FEAR demo or even UT2k4. You'll get the same 25% drop.

@funky... it's not a hack... since ATI cards support programmable sample patterns, it makes sense to use them :P. Something similar existed on the SGI Octane systems that I worked on long ago, for line antialiasing. It even exists on the current Prism range, as it has ATI graphics chips in it.
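For the curious, the basic temporal AA idea is simple enough to sketch in Python (the sample offsets below are made-up illustrative values, not ATI's actual patterns):

# alternate between two 2-sample patterns on even/odd frames so the
# eye integrates ~4x coverage from 2x hardware cost -- which is also
# why it needs a high, steady framerate to avoid visible flicker
PATTERN_A = [(-0.25, -0.25), (0.25, 0.25)]
PATTERN_B = [(-0.25, 0.25), (0.25, -0.25)]

def sample_pattern_for(frame_index):
    # pick the sub-pixel sample offsets for this frame
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

for frame in range(4):
    print(frame, sample_pattern_for(frame))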
 