Graphic Cards AMD Radeon HD 6990 Press Deck Leaked


Hades.

Galvanizer
Source: http://www.techpowerup.com/141317/AMD-Radeon-HD-6990-Press-Deck-Leaked.html

For More Slides: http://www.donanimhaber.com/ekran-k...ran-karti-Tum-detaylariyla-Radeon-HD-6990.htm

While it is slated for release on the 8th of this month, AMD's Radeon HD 6990 was detailed to sections of the industry. It wasn't long before someone leaked the press deck, revealing all the specifications and AMD performance estimates. The Radeon HD 6990 is a dual-GPU graphics card making use of two AMD Cayman GPUs in internal CrossFire. Cayman is the same GPU that is at the center of Radeon HD 6950 and HD 6970.

In the HD 6990, each Cayman is configured to use all 1536 of its VLIW4 stream processors. The GPU core is clocked at 830 MHz, and memory at 1250 MHz (5.00 GHz GDDR5 effective). The 256-bit wide memory interface of each GPU is populated with 2 GB of memory, bringing total board memory to 4 GB. Display I/O consists of five connectors: one DVI and four mini-DisplayPort 1.2. While the HD 6990 is said to outperform the GeForce GTX 580, AMD is also bracing itself for NVIDIA's dual-GPU GTX 590, with HD 6990 OC variants following an official specification of 880 MHz core and the same 5.00 GHz memory.
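As a quick sanity check, the memory bandwidth implied by those figures can be worked out from the bus width and effective data rate alone (a sketch; the per-GPU and per-board totals follow directly from the stated specs):

```python
# Leaked specs: 256-bit memory bus per GPU, 1250 MHz GDDR5
# (quad-pumped -> 5.00 GT/s effective).
bus_width_bits = 256
effective_rate_gtps = 5.0  # giga-transfers per second

# Bytes per second = (bus width in bytes) * transfer rate.
bandwidth_per_gpu_gbps = bus_width_bits / 8 * effective_rate_gtps
print(bandwidth_per_gpu_gbps)      # 160.0 GB/s per GPU
print(2 * bandwidth_per_gpu_gbps)  # 320.0 GB/s across both GPUs
```

Note that, as with any CrossFire setup, each GPU only addresses its own 2 GB and its own 160 GB/s; the 4 GB / 320 GB/s board totals are not a single shared pool.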

[press deck slide images]


To power two GPUs with over 2.64 billion transistors each, you need a lot of power. AMD has engineered its PCB to deliver up to 450 W. This should give you a rough idea of what the HD 6990 will draw with its PowerTune feature disabled. AMD is known for using high-quality digital PWM components on its high-end reference boards, and that trend continues here: the HD 6990 uses 4+2 phase PWM circuitry per GPU. The inductors and PWM chips are hand-picked for the lowest leakage and highest efficiency. The PowerTune technology throttles power to ensure the best energy efficiency, which is both a boon and a bane for enthusiasts.

[press deck slide images]


Lastly, the business end of the presentation. AMD has relaxed the CCC Overdrive limits to let you crank up the clock speeds. With a typical maximum board power of 375 W and a PCB designed for 450 W, there should be some decent overclocking headroom. In its comparisons with NVIDIA products, AMD used the GeForce GTX 580, the green team's fastest graphics card. When put through batteries of DirectX 9, DirectX 10/10.1, DirectX 11, and OpenGL game tests, AMD claims the HD 6990 is faster than the GTX 580 by 67% on average, and up to 110% faster in some cases. The HD 6990 OC variant is claimed to push that a little higher, with up to an 8% increase in performance over the reference HD 6990.
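For context on the OC variant's claimed gains, the core clock uplift alone works out to about 6% (a rough check using the clocks quoted above; memory stays at 5.00 GHz, so any performance gain beyond this would have to come from elsewhere):

```python
# Reference vs. OC core clocks from the leaked deck.
stock_mhz = 830
oc_mhz = 880

# Relative clock uplift of the OC variant.
uplift = (oc_mhz - stock_mhz) / stock_mhz
print(f"{uplift:.1%}")  # 6.0%
```

That makes the claimed "up to 8%" performance increase look optimistic on paper, though best-case results in GPU-bound titles can scale close to (and occasionally a touch above) the clock bump.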

[press deck slide images]
 
I dunno. 50% over a GTX 580 isn't a big deal if the GTX 590 ends up being GTX 570 SLI. Even that would score around this much, if not 10-15% more!
 
If NVIDIA uses 2x GTX 580 in its 590 card, then I think the HD 6990 will definitely get outperformed by it... :) But if NVIDIA uses 2x 570, then I think the performance difference between the HD 6990 and GTX 590 wouldn't be that huge...
 
rite said:
I dunno. 50% over a GTX 580 isn't a big deal if the GTX 590 ends up being GTX 570 SLI. Even that would score around this much, if not 10-15% more!

67% is with OC and 59% (~60%) at stock speeds. Over 50% on average is impressive.

Hades. said:
If NVIDIA uses 2x GTX 580 in its 590 card, then I think the HD 6990 will definitely get outperformed by it... :) But if NVIDIA uses 2x 570, then I think the performance difference between the HD 6990 and GTX 590 wouldn't be that huge...
The TDP of two GTX 580s would be 488 W @ 772 MHz, and two GTX 570s ~440 W @ 742 MHz. It is highly unlikely that the GTX 590 ends up as GTX 580 SLI or even GTX 570 SLI unless NVIDIA downclocks them to stay under power limits. The HD 6990 at 375 W has broken the PCI-e spec limit, which is 300 W. Now the question is, has AMD caught NVIDIA off guard by breaking the PCI-e limit? NVIDIA might have been trying to stick to the 300 W limit; if that is the case, then I think the HD 6990 will be quite a bit faster than the GTX 590. [just my observation] :)
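Those TDP figures check out against the single-card ratings (assuming NVIDIA's official 244 W TDP for the GTX 580 and 219 W for the GTX 570):

```python
# Official per-card TDP ratings (assumed from NVIDIA's published specs).
gtx580_tdp_w = 244
gtx570_tdp_w = 219

print(2 * gtx580_tdp_w)  # 488 -> two GTX 580s
print(2 * gtx570_tdp_w)  # 438 -> two GTX 570s (~440 W)
```

Either pairing lands well above the 300 W PCI Express ceiling (75 W slot + 75 W 6-pin + 150 W 8-pin), which is why an unthrottled dual-GF110 card at full clocks would need to break spec just like the HD 6990 does at 375 W.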
 
That's why there has been an increase in global warming, with NVIDIA making heaters. Jokes apart, what would matter for these dual-GPU cards is performance per watt, and then the performance-to-money ratio. I am expecting that AMD Radeon would definitely be in a win-win situation.
 