GeForce 8800 Ultra is just an overclocked G80

Dark Star

NVIDIA IS BUSY preparing to spoil AMD's homecoming, and we hear that Nvidia has worked hard on improving clock speeds and power consumption in the new revision of the G80 chip, which is going to be used for the spring refresh part.

The GeForce 8800 Ultra comes with 768MB of GDDR-3 memory clocked at a very aggressive 1.175 GHz, or 2.35 GHz effective in DDR mode. The bandwidth has now grown to surpass any upcoming competition, regardless of the width of the memory bus itself. Nvidia will have around 112GB/s at its disposal, pairing nicely with a GPU clocked at 675 MHz, which is also the default clock of the 8600 GTS.
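
As a quick sanity check on that bandwidth figure, here is a rough sketch in Python, assuming the same 384-bit memory interface the G80 uses on the 8800 GTX:

# Back-of-the-envelope memory bandwidth check; assumes a 384-bit bus, as on the 8800 GTX.
bus_width_bits = 384            # assumed G80 memory interface width
effective_clock_hz = 2.35e9     # 1.175 GHz GDDR-3, double data rate
bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_hz / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")   # ~112.8 GB/s, in line with the ~112GB/s quoted above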

Our sources have informed us that the clock of the 128 scalar units inside the GPU didn't change dramatically, so we can expect the shader clock to be somewhere from 1.35 to 1.55 GHz. Anything more would be a surprise at this point, given the manufacturing process. The card comes with a new cooler, just like in the old days of the 7800 GTX and 7800 GTX 512.
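
For a rough sense of what that shader clock range would mean, here is a small sketch, assuming each scalar unit retires one MADD (two flops) per clock, which is how G80 peak throughput is commonly counted:

# Rough peak shader throughput for the rumoured clock range.
scalar_units = 128                      # G80 stream processors
for shader_clock_ghz in (1.35, 1.55):   # rumoured range from the article
    gflops = scalar_units * shader_clock_ghz * 2   # 2 flops per MADD per clock
    print(f"{shader_clock_ghz} GHz -> ~{gflops:.0f} GFLOPS")
# ~346 GFLOPS at 1.35 GHz, ~397 GFLOPS at 1.55 GHz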

Nvidia is now preparing a press event, so the usual suspects will be heading to the airports again, getting their NDA briefings and publishing reviews of the card around May 1st, with products in stores on May 15th.

The only problem is that controlling information floating around usually backfires: forcing people to log on to http://paranoid.nvidia.com when opening PDFs with roadmaps, logging IPs and so on really isn't doing the trick, since Nvidia still manufactures its products in China. The $999 tag is for a limited edition of the product, while real future volumes remain uncertain.

Guys, if you really want to control the information, ditch partners and move production to North America.

News source: THEINQUIRER
 
Maybe they worked on performance, but from what I've heard the power consumption and clocks are insane on the G80. It seems they have to do that because of their unified shader architecture, versus the dedicated shaders in previous generations of cards, which were more efficient at doing either pixel or vertex shading. There may be more to the story, though.
 