deathvirus_me
Discoverer
From what NVIDIA did with the FX series .. better wait for the 8900s ...
Don't two X1950s give somewhere near that score?
If they OC that proccy more, and even the card, they can push that score to really insane limits.
How come it's still GDDR3 when ATI has already shifted to GDDR4 with the X1950 series?
SoulFire said:
now all we have to do is wait for retail prices in a few days
Darklord said:
^^ Some SOURCE of mine told me the reason for GDDR3 over GDDR4 is something entirely different, although what you say makes sense too... and might be one of the reasons.
ROFL, this is so lame. As if you can't create a 384-bit interface using GDDR4 :rofl:. They are pin-compatible and the package size is the same, so if the memory controller supports it, the PCB can stay the same. The issue here is latency... GDDR4 runs at significantly higher latencies than GDDR3, and maybe NV's new chip is sensitive to latency.

deathvirus_me said:
Well .. like it or not .. nVidia is quite smart here ... GDDR3 with a high bus interface (384-bit), vs GDDR4 with the old bus interface (256-bit) ... hmm .. seems like the same result .. plus >80 GB/s bandwidth .. man .... hardly any games use up 20 GB/s .. and in that case ... what matters more will be the "core clock" eehh ...
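For anyone wanting to check the ">80 GB/s" claim above: peak memory bandwidth is just bus width (in bytes) times effective data rate, regardless of whether the chips are GDDR3 or GDDR4. A minimal sketch, assuming the commonly cited reference clocks for the two cards being compared (8800 GTX at 1800 MT/s effective, X1950 XTX at 2000 MT/s effective) — treat those figures as illustrative, not authoritative:

```python
# Peak memory bandwidth = (bus width / 8 bytes) x (effective transfer rate).
# Clock figures are assumed reference specs, used only for illustration.

def peak_bandwidth_gbps(bus_width_bits: int, effective_mtps: int) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times mega-transfers/sec."""
    return (bus_width_bits / 8) * effective_mtps / 1000

print(peak_bandwidth_gbps(384, 1800))  # 384-bit GDDR3 -> 86.4 GB/s
print(peak_bandwidth_gbps(256, 2000))  # 256-bit GDDR4 -> 64.0 GB/s
```

So the wider GDDR3 bus wins on raw bandwidth even at a lower per-pin data rate, which is the point deathvirus_me was making.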