Graphics Cards: Nvidia beats ATI even with a hand tied behind its back.

Well Source is D3D. Where does OpenGL come into the question?

Yes, nVidia has owned the OpenGL scene for quite a while now.

The X1800XT is faster in Far Cry at higher resolutions.

Guess ATI's high clocks could make up the difference there.

If new drivers do not improve the performance, this card could be quite a flop, really. I really thought the card would be much better!

Also, check this out: http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=9

In Riddick, the 6800 Ultra beats the X1800XT. What's up with that??

Is Chronicles an OpenGL game? :S

Also, http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=7

Doom 3. What's up with the performance? :( The X1800XT is pretty disappointing. :(
 
All the reviews on the net are between a 7800GTX 256MB and an X1800XT 512MB, so I guess when a 7800GTX 512MB card comes out they should pretty much be neck and neck even in D3D games. At resolutions up to 1280x1024 with everything maxed out, both cards are almost the same, but at 1600x1200 I guess the 512MB is helping the X1800XT get a substantial lead in some of the games.

I may be wrong in what I said, but I guess I'll just wait for a 512 vs 512 review, or the 256MB X1800XT review.
 
That's exactly what I mentioned in the earlier thread... I need an apples-to-apples review. And by the look of it, the 512MB GTX will be available by the time the X1800XT is available...
 
Well, Nvidia has been better than ATI in clock-for-clock and pipe-for-pipe performance since the NV40 series came out...

The real competition will start when we have unified-architecture graphics chips... :)
 
Well, Nvidia has been better than ATI in clock-for-clock and pipe-for-pipe performance since the NV40 series came out...

That's not the criterion at all. If I can get an AMD processor rated at 1.6 GHz and a Pentium 4 processor at 2 GHz at the same price or lower (ATI's prices are generally lower), why would I care about clock-for-clock performance?
 
Well, there is no point in those results.

Both cards bank on different parameters for their performance.

The X1800XT banks on its pure GPU speed and especially its memory speed. Both use totally different architectures.
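
To put the memory-speed point in rough numbers, here's a back-of-the-envelope peak bandwidth calculation. The specs are the commonly quoted launch figures (X1800XT: 256-bit bus, 1.5 GHz effective GDDR3; 7800GTX 256MB: 256-bit bus, 1.2 GHz effective), so treat them as assumptions, not verified data:

```python
# Rough peak memory bandwidth: effective clock (Hz) x bus width (bytes).
# Card specs below are the commonly quoted launch figures, not verified here.

def peak_bandwidth_gbps(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s (decimal gigabytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

x1800xt = peak_bandwidth_gbps(1500, 256)  # 1.5 GHz effective GDDR3
gtx_256 = peak_bandwidth_gbps(1200, 256)  # 1.2 GHz effective GDDR3

print(f"X1800XT:       {x1800xt:.1f} GB/s")  # 48.0 GB/s
print(f"7800GTX 256MB: {gtx_256:.1f} GB/s")  # 38.4 GB/s
```

If those clocks are right, the X1800XT has roughly a 25% raw bandwidth edge, which is consistent with it pulling ahead at 1600x1200 where bandwidth matters most.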

BTW goldenfrag, Chronicles of Riddick is indeed an OpenGL game.
 
Well, all that article proves is that clock for clock, the G70 is more efficient than the R520. But it's not really relevant in any sense, as the G70 can't scale to the insane frequencies that R520 cards are hitting. NV will need a 90nm part for that. (Is G70M 90nm? I'm not sure.) I still have a feeling that ATI hasn't optimized the GL driver for the R520 at all, since the R520 seems to be slower than even their older-gen hardware in Riddick :S
 
I can't understand why ATI can't get the right drivers for the X1800 out!

The Cat 6 beta at least should have been released, and with Series 80 coming out soon, ATI needs some serious driver optimisation...
 
Anish said:
I can't understand why ATI can't get the right drivers for the X1800 out!

The Cat 6 beta at least should have been released, and with Series 80 coming out soon, ATI needs some serious driver optimisation...
Cat 6 will come out in 2006 ;). I'm guessing they'll release a beta along with Q4. That should seriously boost OGL performance. It's the same thing they did with the D3 launch.
 
^^ The execs at ATI sure will be praying hard that their dev team doesn't screw up Cat 6 then ;)

OT: Chaos, isn't your sig as big as your post now? Where's Albert :rofl:!
 
goldenfrag said:
Well Source is D3D. Where does OpenGL come into the question?
Yes, nVidia has owned the OpenGL scene for quite a while now.

The X1800XT is faster in Far Cry at higher resolutions.
Guess ATI's high clocks could make up the difference there.
If new drivers do not improve the performance, this card could be quite a flop, really. I really thought the card would be much better!

Also, check this out: http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=9
In Riddick, the 6800 Ultra beats the X1800XT. What's up with that??
Is Chronicles an OpenGL game? :S
Also, http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=7
Doom 3. What's up with the performance? :( The X1800XT is pretty disappointing. :(

It's based on the D3 engine, which runs faster on Nvidia cards... therefore the 6800 Ultra beats the X1800XT in fps... :(
 
^^Riddick is not the Doom 3 engine. It's an in-house engine created by Starbreeze. Strange that it uses OGL, since Riddick was an Xbox game :S.
 
KingKrool said:
Well, as I see it, apples have to be compared by cost.
I think the 7800 is cheaper now, right?
So it would be a better buy?
The 7800GT is the best buy right now. The X1800XL will appeal only to people who want to do some serious overclocking. Seems like not all of them are duds. People are hitting 700+ clocks on both core and memory. It comes with 1.4ns GDDR3 that is clocked at 1GHz :S. Rather strange to say the least!
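
For context on why 1.4ns GDDR3 running at 1GHz effective looks strange: by the usual rule of thumb, a DRAM chip's ns rating implies a rated clock of 1/cycle-time, so 1.4ns parts are nominally good for around 714 MHz (~1.43 GHz effective DDR), well above the 500 MHz (1 GHz effective) the X1800XL ships at. A quick sketch (the ns-to-MHz conversion is a rule of thumb, not a vendor spec):

```python
# Rated clock implied by a DRAM cycle-time rating: f = 1 / t_cycle.
# The "effective" DDR rate is double the actual clock. Rule of thumb only.

def rated_mhz(cycle_time_ns):
    """Nominal rated clock in MHz for a given cycle time in ns."""
    return 1000.0 / cycle_time_ns

print(round(rated_mhz(1.4)))      # ~714 MHz rated clock
print(round(rated_mhz(1.4) * 2))  # ~1429 MHz effective DDR rate
```

If that holds, the chips are shipping well under their rating, which would explain why people are hitting 700+ MHz memory overclocks on the XL.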
 
Chaos said:
The 7800GT is the best buy right now. The X1800XL will appeal only to people who want to do some serious overclocking. Seems like not all of them are duds. People are hitting 700+ clocks on both core and memory. It comes with 1.4ns GDDR3 that is clocked at 1GHz :S. Rather strange to say the least!
Hehe, I think that is why it is costly now. Later on maybe it will come with 2.0ns RAM and the price will also be lower :S
Yes, the 7800GT is the best now. MSI 7800GT at $335. I wish we could get that price here :(
 