Graphics Cards: The Nvidia vs ATI Thread

Yes, Switch, you are right. But then for the fanATIcs, ATI is god and supposedly engages in none of the tricks adopted by Nvidia.
But apart from market segmentation, the entire scheme of ATI so far has been nothing more than clock speed increases. Even Nvidia radically improved their FX series (from the 5600 and 5800 to the 5700 and 5900) in a few months' time. I don't deny Nvidia resorted to desperate tactics during the NV3x days, but ATI's X850 is nothing better to the discerning eye.
It's been a year since SM3.0 cards from Nvidia came out (now fanatics will rant about SM3.0 being weak and so on, but games have taken advantage of it), and now we will have Nvidia's second-generation SM3.0 against ATI's first-generation SM3.0. My bets are with Nvidia all the way. ATI's dominance was a one-off, starting and ending with the R3XX series.
 
Well, Undertaker, again, if you look closely at what ATI has done, then you would like them more and not hate them... The X800 and X800 XL are smaller chips... I don't know exactly how much smaller, but I believe 110 nm, while the X800 Pro and X800 XT were 130 nm... That is one of the main reasons why ATI was able to introduce these cards at such lower prices, whereas when nVIDIA was doing it, it was at the $400 mark when we talk about the 5900 chip... So in a way ATI is true to their fanatics (delivering a 6800GT-equivalent card at $100 less)... What do you say about that...

P.S. In my heart I am an nVIDIOT, but I use my brain more.
 
undertaker said:
But apart from market segmentation, the entire scheme of ATI so far has been nothing more than clock speed increases.

My bets are with Nvidia all the way. ATI's dominance was a one-off, starting and ending with the R3XX series.

Talking about innovations, nVidia came up with SLI.. ATI is going one up on them and bringing a similar (dual-GPU) concept, and it doesn't need two cards with the same specs.. :hap2:
 
Wouldn't it be cool if you had a graphics mobo on which you could just swap the chip, like NV40 to G70, and add more GDDR3, and that's the upgrade... Or you could add another chip when you need the power...

Edit: by chip I meant the GPUs...
 
^^ Pure speculation.. the thing I don't understand is why people argue about stuff when they sure don't plan to buy it :P.. I would always buy something that gives me a good value-for-money ratio.. I wanted an X800XL, but then I got a good deal on a 6600GT so I bought it instead.. :D.. As of now, there is no better VFM card than the Sapphire X800 that sells in the US for $195 (inclusive of shipping).
 
undertaker said:
But apart from market segmentation, the entire scheme of ATI so far has been nothing more than clock speed increases.

My bets are with Nvidia all the way. ATI's dominance was a one-off, starting and ending with the R3XX series.
ATI has always brought more features to their hardware than Nvidia.
Better antialiasing, TruForm, SmartShader, Catalyst AI, Overdrive...

Nvidia only has Digital Vibrance, AFAIK.
SM3.0 isn't innovation; it's simply the next logical step after SM2.0b.
And SLI was done by 3dfx originally.
 
Out of the things you mentioned, only better AA is something that ATI has to its credit.
The rest are just gimmicks.
DV, on the other hand, improves what you do 80-90% of the time: 2D work.
So that's a draw.
BTW, Nvidia owns 3dfx's research and has many people from the original 3dfx. Anyway,
Nvidia perfected SLI to where it is today. Gotta give 'em that.
ATI's response to SLI has so far been vaporware at best.
 
OK, I don't want to disrupt the discussion, but are we forgetting something? The hardware we buy is used to run software (games), so we are essentially buying it to play better. I have just bought a new rig and stuck to AGP; I didn't move to PCIe even though the X800XL was within my budget. The fact of the matter is that when HL2 was released, my 10th exams were approaching and I wanted to play a bit at home. But I had a GeForce 2 then. By chance I installed it and it worked!

After all the hype, my good old GF2 played one of the most anticipated computer games in history at 800x600 with medium settings at 32-34 fps! Playable!

New technologies come, but we don't need them. DX10 is bound to come, and so is DX11, so should we keep waiting? Nope, not necessary.

Even though you guys criticized the FX series, they do play HL2 at a decent resolution, and the 5950 will play it with all the eye candy.

My final word: look one step ahead, not two, or you will end up where you started.
 
Anish, don't get me wrong, but 30 fps is more like a slideshow than smooth... Also, as mentioned before, the game engine was made in such a way that even low-end machines could play the game... Try playing Far Cry on the same machine and you would know... The Source engine was developed long back but got delayed due to certain reasons which we all know...
 
Well, Switch, I must say that 30 fps was playable, but I got about 44 when I lowered the settings at 800x600.

Come on, that game was made for the FX series, and a GeForce4 Ti would play it as well as a 9800 Pro at 1024x768.

My point is that every time a new card or technology is launched, or for that matter even a new version of DX, we don't need to jump for it.

My friend is still having great fun with his GeForce4 Ti 4600, which is a DX8.1 card.
 
You talk about games... why do you play games? Answer: to enjoy your time in front of the computer.

If you turn up the effects like water, shadows, lighting, etc., there is no way you will get even 10 FPS on a GF2. But on a 6800 they will play very well, and you will enjoy the effects too.
 
undertaker said:
Some people have 'selective reading disorder'. Nvidia have not altered the base clocks of the 6800 series since launch. Unlike ATI's X800 (how many are there: plain, Pro and XL?) -> X800 XT -> X800 XT PE -> X850 -> X850 XT -> X850 XT PE. (wtf?)
Well, nVidia tried to confuse us all (though it was a brilliant move business-wise) by putting the XT suffix on downgraded cards in their FX series.
 
Well, there was a 5600 XT and a 5500 XT too.
And agreed, the 5900 XT was the best buy in the entire FX series thanks to its aggressive pricing, but still, of all the possible suffixes in the world, they had to use XT :P
 
The FX series cards below the NV35 were very slow, no denying that. Even the NV35+ cards required shader optimisation for good performance.
For me, the 5900 and 5700 were actually very decent cards despite their drawbacks.
I bought the 5700 LE knowing that it would only do DX8.1 in HL2. However, overclocking it to 425/506 gives very good fps at 1024x768; slowdowns are very few.
The worst of the FX series was the 5600, because it was slower than even the Ti 4200, which it was supposed to replace. The 5200 may have lacked in performance, but it was the bottom-end card anyway.
The 9600 series cards were good and they did offer better performance in DX9 titles; however, they are not magical solutions. They are still limited by the bandwidth of slower 128-bit DDR1, like the 5700 series (see the rough arithmetic below), and their geometry units are weaker (geometry has always been Nvidia's turf, which is why Quadro dominates). The very positive thing about them was excellent shader performance for the price. Sadly, they were always way overpriced and poorly marketed here.
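To put the 128-bit vs 256-bit point in perspective, here is a rough back-of-the-envelope sketch; the clock figures are illustrative numbers I have assumed, not the exact specs of any particular card:

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * effective clock.
# The clocks below are illustrative, roughly in the range of 2004/05
# 128-bit mid-range vs 256-bit high-end cards, not exact specifications.
def mem_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

print(mem_bandwidth_gb_s(128, 550))  # ~8.8 GB/s  (128-bit mid-range)
print(mem_bandwidth_gb_s(256, 700))  # ~22.4 GB/s (256-bit high-end)
```

Roughly 2.5x the raw bandwidth, which is why high resolutions and AA hit the 128-bit cards so much harder.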
 
TruForm, SmartShader and Overdrive are gimmicks?

If ATI had promoted TruForm effectively, similar to how Nvidia does with their TWIMTBP spinner, we would be seeing smoother-looking models today.
It didn't always work perfectly, but models built with TruForm in mind looked noticeably better.

The HDR-ish SmartShader can add bloom to all old and new OpenGL games, instantly making them look much better.
Funny how you call this a gimmick while DV is perfectly alright in your books.

Overdrive is a free performance boost without any headache, trial and error, fear of damaging the hardware, or voiding warranties.
 
TruForm is known to cause a lot of problems. There were many threads back in the days of the 8500 on this.
Overdrive: well, Nvidia drivers have an "auto-detect" feature for auto-overclocking (if you are smart enough to use CoolBits, which some manufacturers provide on their driver disks; my 5700 came with CoolBits on the driver disk! A rough sketch of what enabling it amounts to is below).
Anyway, no self-respecting geek will depend on either of these to overclock.
Plus, Nvidia's drivers are more responsive and smoother; I had posted a thread about this, if you remember, with a direct comparison of Nvidia's vs ATI's drivers.
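For the curious, a minimal sketch of what the CoolBits tweak amounts to, as I understand it: it just sets a registry value that the ForceWare drivers check before exposing the clock-frequency page. The exact key path and value are from memory of FX-era drivers and may differ between driver versions, so treat this as illustrative only:

```python
# Illustrative sketch only: write the classic CoolBits registry value that
# FX-era ForceWare drivers look for before showing the clock-frequency page.
# The key path and value (3) are from memory and may vary by driver version.
# Run on Windows with administrator rights, then reopen the driver panel.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits written; clock controls should now appear in the driver panel.")
```

The utilities shipped on some driver disks do essentially the same thing as this one registry write.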
 
undertaker said:
Overdrive: well, Nvidia drivers have an "auto-detect" feature for auto-overclocking (if you are smart enough to use CoolBits, which some manufacturers provide on their driver disks; my 5700 came with CoolBits on the driver disk!)
Which, before anything else, flashes the EULA on your screen, making it very clear that your hardware may be damaged and warranties voided.
Also, it cannot throttle clock speeds in real time like Overdrive.

Anyhow, the point remains: ATI hasn't been about simply pushing clock speeds.
They have always tried interesting things with their GPUs.

Whether they got accepted or not isn't important.
 