Graphics Cards Catalyst 5.6 Released ...

Guru3D.com's pricing engine shows the 128 MB 6800 for $189, the 6600 GT for $159, and the 128 MB and 256 MB 6800 GTs for $295 and $339. The X800 XL is at $285. Can't say nVidia is asking a lot more either, can you?
 
The X800 XL is $279 at newegg.com, and the cheapest PCI-E 6800 GT is $350. For AGP, the 6800 GT is all right.

But I would wait for now, as the arrival of the new lineups from the red and green teams will push prices down further.
 
undertaker said:
Those are speculative, dude. 20% in HL2, sure; ATi owns that game, it's ATi's illegitimate child with Valve. 50% in Far Cry? Yeah, right. 5-15% at most.
We'll see in a month ;)

Chaos posted a minute later:

Aces170 said:
HyperMemory, so that's only for the low-end HyperMemory-based PCI-E cards, right?
Well, the optimizations apply to all cards... AGP and PCI-E. So expect a boost on your 9800 Pro as well :)
 
With this information in hand, it looks as if NVIDIA could have a big leg up on ATI for the next several quarters. While the rumored specs of the G70 are not as impressive as that of the R520, NVIDIA looks to have no problems producing G70 chips. This is actually quite reminiscent of the R300/NV30 situation. One company decided to use a new process for a large and complex part, while the other company sacrificed die size and overall clock speed to achieve more sustainable yields (and less risk). My impression is that the R520 is not a dog, and will be a very competent SM 3.0 part, but the ability to adequately cool/power/produce the R520 is in severe doubt at this time. While ATI will most likely respin the design (or already has done so many times) to achieve better yields and lower leakage, their time to market will be severely impacted by the issues that they have encountered so far. If the latest design they have sent off for production is a success, we still will not see the R520 introduced until early Fall, and then we have to question the availability of this product. While the G70 is a huge die on 110 nm (or so the current speculation goes), that is a very well known and mature process that will allow solid yields and speed bins for a product designed for it.
The R520's gonna be ATi's 5800. The Nvidiot in me can't wait to see ATi fall. LOL.
 
Downloaded and installed :D

Let me see how much this speeds up my comp now ;) (been using Cat 5.4 before)
 
Which is the best driver for nVidia? I don't mean the standard nVidia drivers; I'm talking about the modified drivers like NGO, Omega (I don't think those are for nVidia, though), etc.

Which is the best?
 
Radeon X800 XL @ 1600x1200, no AA/AF

Game                 Catalyst 5.5   Catalyst 5.6
Doom 3               50.2           50.2
Half Life 2          104            104.1
Wolf: ET             95.3           96
Splinter Cell: CT    39.9           39.9
So much for perf increases!!!! LOL.
ATi's drivers will never match NV's no matter what they do.
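As a quick sanity check on those numbers: the Catalyst 5.5 to 5.6 deltas in the table work out to well under 1% at this resolution. A minimal sketch (fps figures taken straight from the table above):

```python
# Percent change between Catalyst 5.5 and 5.6 results
# (Radeon X800 XL @ 1600x1200, no AA/AF; fps as quoted in the post).
results = {
    "Doom 3":            (50.2, 50.2),
    "Half Life 2":       (104.0, 104.1),
    "Wolf: ET":          (95.3, 96.0),
    "Splinter Cell: CT": (39.9, 39.9),
}

deltas = {game: (new - old) / old * 100 for game, (old, new) in results.items()}
for game, pct in deltas.items():
    print(f"{game}: {pct:+.2f}%")  # biggest gain is Wolf: ET, still under 1%
```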
 
The improvements are:
# Doom3: A performance gain of up to 7% is noticed at lower resolutions as a result of more efficient memory use
# Chronicles of Riddick: A performance gain of up to 10% is noticed as a result of more efficient storage of vertex data
# Halo: A performance gain of up to 20% is noticed as a result of generic driver Z-optimizations
# 3DMark05: A performance gain of up to 5% is noticed as a result of generic driver Z-optimizations
It never claimed improvements in the games you mentioned. It said Doom 3 at low resolutions, in CPU-limited situations. I was playing Riddick in the evening and there is a definite improvement there. My 3DMark05 score went up from 2593 to 2678. I don't have Halo to test, so I can't comment. These are in line with what they claimed. But the best thing is that CPU utilization for WMV HD video clips is way down to 30% from the earlier 55-60%. Now that is a major improvement.

Edit: Also forgot... AM3 scores went up from 46300 odd to 47800 odd.
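For reference, the gains quoted in this post can be checked with a couple of lines. The "up to 5%" figure is the 3DMark05 claim from the release notes quoted above; the scores are taken from the post:

```python
# Percent gains for the quoted scores: 3DMark05 went 2593 -> 2678,
# AquaMark3 (AM3) roughly 46300 -> 47800 ("odd" in the post, so approximate).
def pct_gain(old: float, new: float) -> float:
    return (new - old) / old * 100

gain_3dmark = pct_gain(2593, 2678)
gain_am3 = pct_gain(46300, 47800)
print(f"3DMark05: +{gain_3dmark:.1f}%")   # about +3.3%, within the claimed "up to 5%"
print(f"AquaMark3: +{gain_am3:.1f}%")
```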
 
It never claimed improvements in the games you mentioned.
LOL. This testing was done by Anandtech; I didn't claim or say anything, Chaos.
But I do feel ATi is obsessed with 3DMark05, always trying to gain something there while all the while accusing nVidia of doing the same. I feel the entire X800 line's 3DMark05 scores are "inflated". But that's just me, a hardcore Nvidiot, if you may.
Let's not forget that at lower resolutions cards like the X800 and 6800 are CPU-limited anyway.
During the reign of the R3XX solutions, ATi needed lower clocks than nVidia to match or even beat its direct rivals. Now ATi needs a whopping 20-30% more clock speed than comparable nVidia cards to stay competitive. That says something.
 
LOL, can you stop bashing anything ATI for a while? In fact, ATI's drivers have improved so much that one site claims they are more stable than nVidia's. Linux is another story altogether.
BTW, nVidia's cards perform horribly in EverQuest 2; check out the [H] review of the X800 XL.
 
DriverHeaven has tested Cats 5.6 here.

Conclusion
Improvements which we found to be beneficial were the Real Time Video Preview, which works surprisingly well and improves the end user experience hugely. WMV acceleration didn’t provide the huge benefits we’d hoped for in all situations, though it did provide some improvements overall, …it should be noted though that this is one item that really needs to be tested with your setup to see what you gain (we’ve no doubt that users with lower spec CPUs are going to see some excellent benefits with WMV acceleration enabled.)

On the gaming front as our tests showed there are also some nice improvements to be had, especially in OpenGL games which is nice to see as many end users have been looking for enhanced OpenGL performance from ATI for a while.
 
In June 2005, ATI commissioned AppLabs, a leading provider of quality assurance and testing, to conduct the test, pitting ATI's Radeon® display adaptors against comparable NVIDIA GeForce products. The objective of the test was to perform advanced software stress testing of the Radeon product line against the GeForce product line. AppLabs used publicly available test applications extracted from Microsoft's latest Windows Hardware Quality Lab test suite 5.3 to conduct the study in its Lindon, Utah facility. Testing was conducted on a variety of graphics adaptors from each of the two manufacturers, and included the repeated execution of a multitude of test cases (for more than 500 times on each card) to mimic a typical, long term, real-world PC usability scenario.
Did anyone not notice the word "commissioned"? :rofl:
 