muzux2 said: Looks like ATI doesn't consider the GTX 295 a threat to the HD 4870 X2.. had it been, they would have dropped the X2's price even a week earlier. Besides, the GTX 295 is no X2 killer at all.
hyeah: We haven't seen the GTX 295 at full throttle apart from those four hand-picked games Nvidia chose for the preview.
ATI might surprise us with Catalyst 9.1 and boost the X2's performance enough to beat the GTX 295. That could be why ATI hasn't dropped the X2's price yet.
hyeah: And it seems they have something up their sleeve, hope so..
Aces170 said: What is worse is NVDA's DX10.1 denial... they keep harping that it's not worth it, making sure no game developers utilize the tech, and went as far as having it removed from Assassin's Creed...
Nvidia's GTX 2xx was a farce no matter how you look at it. It's certainly no 8800GT in terms of innovation or VFM...
Udit said: What the hell is wrong with you?
Can't you just buy/praise the best instead of making these biased statements, which are quite irritating, dumb & idiotic?
Actually, G200 could be regarded as "G92 on steroids". Just look at the increased number of all functional units: ALUs, TMUs, RBEs, plus a wider 512-bit memory bus. The only significant architectural change is the addition of a third shader processor to each computational cluster, which used to have only two.
The results of preliminary theoretical benchmarks were not very optimistic. Nvidia's new solution lost to the simpler and cheaper ATI RV770 in most synthetic benchmarks, except for the fillrate test and pure texture-sampling performance. Theoretically, G200-based solutions should feel at home in games with lots of high-resolution, complex textures and shaders that work mostly with textures, and should lose to ATI only in games that demand high mathematical performance.
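The texturing-vs-math trade-off described above can be checked with some back-of-the-envelope arithmetic. The sketch below uses the commonly cited launch specs for the GTX 280 (G200) and HD 4870 (RV770); treat the unit counts and clocks as illustrative assumptions rather than authoritative figures:

```python
# Rough theoretical peak-throughput comparison: GTX 280 (G200) vs HD 4870 (RV770).
# All specs below are the commonly cited launch figures, used here as assumptions.

def gflops(shaders, shader_mhz, flops_per_clock):
    """Peak ALU throughput in GFLOPS."""
    return shaders * shader_mhz * flops_per_clock / 1000.0

def gtexels(tmus, core_mhz):
    """Peak bilinear texture fillrate in Gtexels/s."""
    return tmus * core_mhz / 1000.0

def bandwidth_gbs(bus_bits, effective_mts):
    """Peak memory bandwidth in GB/s (bus width in bits, effective rate in MT/s)."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

# GTX 280: 240 SPs @ 1296 MHz (3 flops/clk: MAD + MUL), 80 TMUs @ 602 MHz,
# 512-bit GDDR3 at 2214 MT/s effective.
gtx280 = (gflops(240, 1296, 3), gtexels(80, 602), bandwidth_gbs(512, 2214))

# HD 4870: 800 SPs @ 750 MHz (2 flops/clk: MAD), 40 TMUs @ 750 MHz,
# 256-bit GDDR5 at 3600 MT/s effective.
hd4870 = (gflops(800, 750, 2), gtexels(40, 750), bandwidth_gbs(256, 3600))

for name, (alu, tex, bw) in (("GTX 280", gtx280), ("HD 4870", hd4870)):
    print(f"{name}: {alu:.0f} GFLOPS, {tex:.1f} Gtexels/s, {bw:.1f} GB/s")
```

On these assumed figures the RV770 comes out well ahead in raw ALU throughput (~1200 vs ~933 GFLOPS), while the G200 leads in texture fillrate and memory bandwidth, which is consistent with the synthetic-benchmark pattern the quoted review describes.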
elvesbane said: NVIDIA has good reason to downplay DirectX 10.1's importance. Read up on it on Wikipedia - as the article indicates, the new specification doesn't introduce any earth-shattering features. It's just a minor revision that makes the spec stricter.
Calling the GTX 2xx a farce is really pushing it. As almost every reputed hardware review site has stated, the revised pricing of the GTX 2xx line, especially the 260, puts it on a level playing field with the ATI alternative. Sure, the GTX 280 lost the performance crown when the X2 came out, but you can't expect a graphics card to stay the best forever, can ya?
And you most certainly cannot expect every graphics card generation to be as revolutionary as the 4850/70 was in terms of price/performance. Leaps of that sort are impressive, but it just doesn't work that way (continuously) in the real world.
Though this was made mandatory only with Direct3D 10.1, all Direct3D 10 parts out there already support at least 4x multisampling and 32-bit floating-point filtering, so it is not a new feature per se, just a change in the wording of the specification.
Aces170 said: What is worse is NVDA's DX10.1 denial... they keep harping that it's not worth it, making sure no game developers utilize the tech, and went as far as having it removed from Assassin's Creed...
Nvidia's GTX 2xx was a farce no matter how you look at it. It's certainly no 8800GT in terms of innovation or VFM...
Yep, so even ATI has done it, no denying that, but it couldn't get away with it since games were coming out for SM 3.0. Granted, the GeForce 6xxx's SM 3.0 performance was pathetic, so it was just a checkmark on the feature list. Oh come on, you should know the history: PS 1.3, 1.4, then 2.0, 2.0a, 2.0b, then the famous denial of SM 3.0.
Udit said: What the hell is wrong with you?
Can't you just buy/praise the best instead of making these biased statements, which are quite irritating, dumb & idiotic?
muzux2 said: I am praising the best, man.. the X2 & the GTX are both the best. :rofl: I didn't say the X2 is a GTX 295 killer.. :hap5: If you find my posts irritating, you'd better not reply to them.. It's a fact the GTX 295 isn't an X2 killer.. check the reviews, particularly [H]'s, which I've posted in the other thread.. Calling someone dumb & idiotic only shows that you're an nvidiot.. :rofl:
Udit said: I use an ATI 4870 512MB, dude.
I got it because it beat the GTX 260 Core 192 back in July 2008,
though ATI's drivers screwed me over later.