Graphics Cards: The Nvidia vs ATI Thread

Well, the thing is, a system with a 6800U or even 6800GT SLI is unmatched for now.
Look at games like SC:CT. It brings every single-card system to its knees. But you can enjoy it with the eye candy on with an SLI 6800GT system.
Hey, don't get me wrong, I backed out of SLI even when I had the chance. I even sold my 6800GT and went to a 6600GT for now, as I'm getting a new card in June.

About the ATI cards: I'm not sure about the ATI OpenGL fiasco anymore.
Have you guys seen the latest reviews with the newer drivers? The X800XT/X850XT is now within 2 FPS of the 6800GT and 6800U in Doom 3 with the new Catalysts. That's a big improvement on ATI's part in the OpenGL department.
I mean, even the 6800NU and 6600GT used to outperform it a few months back.

Also, even now the card with the most horsepower is the X850XT. IMHO it's the better card to have if you are choosing between a 6800U and an X850XT.
 
Yup, with the newer Cats I think ATI owners will have to worry less about reduced performance in OpenGL games. Most newer benchies carried out with the newer versions of the Catalysts confirm that the OpenGL performance of ATI cards is now on par with Nvidia's. So the Nvidia fanboys can stop crying hoarse on that front. The only thing that remains to be seen is the actual implementation of AMR.
 
Well, HL2 was the original "optimised for ATi" game. Doom 3 is not much optimised for Nvidia at all. If ATi produces a good enough OpenGL driver they will become very competitive in Doom 3. But the way ATi treated the GeForce FX cards is unforgivable, this despite it being proven now that FP16 was all that HL2 needed in more than 95% of the game, something that the NV3x does with ease. And I am not saying this because I own one! I know that the NV3x can't do FP32 well (which is why it sucks in Far Cry when enabling ultra quality in D3D, because then it does FP32 all the time).
 
undertaker said:
Well, HL2 was the original "optimised for ATi" game. Doom 3 is not much optimised for Nvidia at all. If ATi produces a good enough OpenGL driver they will become very competitive in Doom 3. But the way ATi treated the GeForce FX cards is unforgivable, this despite it being proven now that FP16 was all that HL2 needed in more than 95% of the game, something that the NV3x does with ease. And I am not saying this because I own one! I know that the NV3x can't do FP32 well.
You couldn't be more wrong. Hell, on Beyond3D and Rage3D they have gone as far as to say that John Carmack was sleeping with an Nvidia PR person!! That game is optimised thoroughly for the FX architecture. The Nvidia FX series was absolutely pathetic and required immense optimisations!!!
What about the pathetic shader performance of the FX series!!! And what about their stupid marketing gimmick of pipeline calculations (4x2) etc.? Their FX line was hopeless, which is one of the main reasons I dislike them, along with the way they started optimising their drivers, then maiming 3DMark, and then their "The Way It's Meant To Be Played" campaign.

HL2 is optimised for the Direct3D API, hence the NV4x series, which has proper shader support, does pretty well in it.

Any guy who supports the NV3x series is a plain ignorant NVIDIOT. No offense meant, Undertaker :)
 
Undertaker dude, no one expected Nvidia cards to run so well in HL2. But HL2 is one of the best-coded games when it comes to graphics. Even the FX cards run pretty well. Try running Far Cry on a 5900U with AA and AF enabled.
Sure, with very powerful systems even the FX will be able to perform well, even with DX9 forced. That's what many people did using 3DAnalyze, but if you own a legal copy of HL2 you cannot use that anymore, as 3DAnalyze alters Steam files; that's now considered a hack and it gets the user banned.
But seriously, I did not expect my 5900U to play HL2 the way it's played. Far Cry ran like crap compared to my 9800XT. Almost every graphically good DX9 game sucked on the 5900U when you cranked up the details.
I bought the 5900U for a totally different reason. I softmodded it to a Quadro, and that's what the card is good for, not the level of gaming it should have provided. NV3x was misery for Nvidia. They themselves admitted it when they launched NV40; they made fun of their own 5800 fan blower at the NV40 launch presentation ;)
 
Aces170 said:
Look at the number of titles the Doom 3 engine has been licensed for: only 3. On the other hand, 12 titles have been registered with the Unreal 3 engine and 4 titles with the Source engine.
You got that reversed.
UE3 has more than 12; Doom 3 tech has 6 (Q4, RTCW2, Prey, Quake Wars, and 2 as yet undisclosed); Source has 3 (plus Bloodlines, my benchmark for pathetic coding).

For SLI, the two cards don't need to be from the same manufacturer; only the card/BIOS need to be identical.
 
I will agree that the entire FX line was plain crap. Nothing explains it better.

I also had a 9800 Pro, which I really liked a lot.
Even if I get that card again, I will be more than happy to use it in my gaming rig.
For now I am satisfied with the 6800GT in my main setup.
 
funkymonkey said:
Undertaker dude, no one expected Nvidia cards to run so well in HL2. But HL2 is one of the best-coded games when it comes to graphics. Even the FX cards run pretty well. Try running Far Cry on a 5900U with AA and AF enabled.

Far Cry ran like crap compared to my 9800XT. Almost every graphically good DX9 game sucked on the 5900U when you cranked up the details.
HL2 needs to reload every 5-10 minutes; Far Cry could fit 10 HL2 levels in a single level and not break a sweat.

And don't forget HL2 doesn't use advanced effects even half as much as Far Cry does.
There was some excellent shader work in the glass panels and water, but those levels especially reloaded every 500 metres or less, leading one to believe that Source cannot handle large maps at all.
I could be horribly wrong, but I base this on HL2 and Bloodlines, which everyone has experienced.

The game only had the most amazing artwork I have seen yet (hi-res photo texture work stuck on every wall/skybox).
It gave 40+ FPS on my 8500 at 1280x960 with 16xAF, everything on high except textures at medium.

Far Cry, on the other hand, has HUGE levels, virtually never-ending 3D vegetation, complex AI pathing and advanced shaders all running in one single map.
In spite of all this, it also ran very well on my 8500 at 1152x864 at medium to high details (water at medium, textures at high), around 40 FPS outdoors and 70-80 indoors.

Of course, who gives a damn how it's done when the end result looks good, but to say Source is the most efficient engine around is wrong.
If its current record is anything to go by, it must be the most inefficient one.
 
I do not defend the NV3x's faults. But I do hate fanatics (hint, hint) who keep waging a 'crap-war' against Nvidia.
BTW Aces, about the forums: people "claim" that Carmack was sleeping with Nvidia. But I posted a link to a forum some time back where, using 3DAnalyze, a 5700 was averaging between 35-65 FPS in HL2 with DX9.0 forced to 16-bit precision. And there was no IQ difference.
Forget even that, they even screwed up the DX8.1 path's water rendering in order to make it appear that it was DX9.0 that was giving the true water effects. I know because I applied the DX8.1 water patch on my card and voila, there was DX9.0 water with no performance loss whatsoever.
That doesn't absolve Nvidia of the NV3x fiasco. But nor does it absolve Valve of not only sleeping with ATi but producing an illegitimate (but still beautiful, I might add) child called HL2. :tongue:
 
There is no shying away from the fact that the FX series sucked big time. Why do you think ATI was able to take over the majority of the market share? At the end of the FX's lifetime, ATI had a bigger share of the graphics market than Nvidia, whereas 2-3 years back (before the FX), Nvidia had a much larger share.

The main problem for Nvidia was that while they produced a really crappy series, ATI went a step above the average and brought out the Radeon 9600 and 9800 cores. They were far better than the FX. Nvidia tried fighting back with the 5700 and 5900, which were certainly better than the 5600 and 5800 but still not good enough to match the 9800s.

The FX cards were supposed to be DX9, but they were not as good in DX9 as ATI's 9x00 series.

But of course, all that is history... let us move on. Nvidia came back with a bang with the 6xxx series while it was ATI's turn to have problems. But now ATI seems to have returned with a vengeance with the X850 and X800XL series.
 
saumilsingh said:
HL2 needs to reload every 5-10 minutes; Far Cry could fit 10 HL2 levels in a single level and not break a sweat.

And don't forget HL2 doesn't use advanced effects even half as much as Far Cry does.
There was some excellent shader work in the glass panels and water, but those levels especially reloaded every 500 metres or less, leading one to believe that Source cannot handle large maps at all.
I could be horribly wrong, but I base this on HL2 and Bloodlines, which everyone has experienced.

The game only had the most amazing artwork I have seen yet (hi-res photo texture work stuck on every wall/skybox).
It gave 40+ FPS on my 8500 at 1280x960 with 16xAF, everything on high except textures at medium.

Far Cry, on the other hand, has HUGE levels, virtually never-ending 3D vegetation, complex AI pathing and advanced shaders all running in one single map.
In spite of all this, it also ran very well on my 8500 at 1152x864 at medium to high details (water at medium, textures at high), around 40 FPS outdoors and 70-80 indoors.

Of course, who gives a damn how it's done when the end result looks good, but to say Source is the most efficient engine around is wrong.
If its current record is anything to go by, it must be the most inefficient one.
No offence Saumil, but in most of your posts I found you overrating something or the other, just an observation. While opinions can be overrated, one should be careful when stating facts ;). It's not that the Source engine can't handle huge maps or anything; the game is made such that it should run well on a low-spec machine. I know people who have played it on very low specs, even below the required specs. So the maps have been made such that the game loads part by part and doesn't hurt a low-end machine. Another thing is that the Source engine is pretty old :P (we all know that), but the punch it packs is still worthy. The best thing about the Source engine is that the netcode is great, while if you are playing Far Cry online you need a machine with fairly good specs. If you ask me, both engines have their pros and cons and are right up there with each other. That said, I'd like to tell y'all that this thread is deviating from the topic :P
 
saumilsingh said:
HL2 needs to reload every 5-10 minutes; Far Cry could fit 10 HL2 levels in a single level and not break a sweat.

And don't forget HL2 doesn't use advanced effects even half as much as Far Cry does.
There was some excellent shader work in the glass panels and water, but those levels especially reloaded every 500 metres or less, leading one to believe that Source cannot handle large maps at all.
I could be horribly wrong, but I base this on HL2 and Bloodlines, which everyone has experienced.

The game only had the most amazing artwork I have seen yet (hi-res photo texture work stuck on every wall/skybox).
It gave 40+ FPS on my 8500 at 1280x960 with 16xAF, everything on high except textures at medium.

Far Cry, on the other hand, has HUGE levels, virtually never-ending 3D vegetation, complex AI pathing and advanced shaders all running in one single map.
In spite of all this, it also ran very well on my 8500 at 1152x864 at medium to high details (water at medium, textures at high), around 40 FPS outdoors and 70-80 indoors.

Of course, who gives a damn how it's done when the end result looks good, but to say Source is the most efficient engine around is wrong.
If its current record is anything to go by, it must be the most inefficient one.

Well, Bloodlines is at fault here.
It uses the Source engine but is poorly coded.
About Far Cry: well, of course it ran well on the 8500. Without AA and AF it ran pretty well on the FX and GF4 series as well.
It's with all details maxed and 4xAA and 8xAF enabled that the problems arise.
It's playable on a 9700/9800 Pro/XT.
It's perfectly playable on all current-generation cards.

And Far Cry has indoor levels too; on the FX the FPS will be crappy even inside, especially with the flashlight on.

That's not the only game. FS2004 was almost unplayable on the FX 5900U with all details cranked up. It wasn't perfectly smooth even on a 9800XT, but it stayed above 40 FPS while the FX struggled to hit 30.
It's not that the FX was a poor card, it was much better than the GF4 series; it's just that ATI had a much, much better card, and that too 6 months before the FX 5800U.
It's been more than 3 years and the 9700 Pro still performs solidly. You must give ATI credit for that.
They have got things together this time around, and from the looks of it they will have things properly sorted out with the next-gen cards as well.
Nvidia needed that kick to make this market what it is now.
 
If a poll were conducted as to which card has failed the most on this forum (meaning among those people who owned it), it would be the "mighty" 9800 Pro.
As for people who keep harping on about the X850XT being the fastest single card: the answer is that it's the fastest single card because Nvidia doesn't care about that title, it already has SLI. The mobile version of the 6800U already uses some of the fabrication tech used in the G70. So if Nvidia wanted, there could be a 6800U doing 600/1200 tomorrow.
Now people will call me an 'nvidiot' and whatnot, but there is no denying this fact.
And why, my dear friends, did Valve change the water shader output in DX8.1 mode?
 
What does that have to do with anything? More people bought the 9800 Pro, so of course there will be more people with problems, and the brands that we get in India are the problem.
PowerColor and Club3D are not very highly rated brands. PowerColor was rated one of the worst manufacturers of 9800 cards, and these are the cards that are available here.
I had a Sapphire 9800 Pro and a 9800XT that to this date are running fine with the guy I sold them to.
 
undertaker said:
If a poll were conducted as to which card has failed the most on this forum (meaning among those people who owned it), it would be the "mighty" 9800 Pro.
As for people who keep harping on about the X850XT being the fastest single card: the answer is that it's the fastest single card because Nvidia doesn't care about that title, it already has SLI. The mobile version of the 6800U already uses some of the fabrication tech used in the G70. So if Nvidia wanted, there could be a 6800U doing 600/1200 tomorrow.
Now people will call me an 'nvidiot' and whatnot, but there is no denying this fact.
And why, my dear friends, did Valve change the water shader output in DX8.1 mode?
You mean the single most failed brand, isn't it? Because all of them are Club3D cards... it has nothing to do with the card being a 9800 Pro! Also, do you think a single-card solution should be compared to something like SLI? Would that be a true apples-to-apples comparison? We can wait for the multi-VPU solution from ATI and then make the decision.
 
Some people have 'selective reading disorder'. Nvidia have not altered the base clocks of the 6800 series since launch. Unlike ATI's X800 (how many are there: plain, Pro and XL?) -> X800XT -> X800XT PE -> X850 -> X850XT -> X850XT PE (wtf?).
All ATI have done along this line is give the card a little MHz push so that the end result, the X850XT, just gets ahead of the 6800U in some benchies, not all.
 