XFX 790i Ultra SLI & XFX 9800GX2 Review

Hi people. The hardware scene has witnessed a series of new releases in the past few months, and the green team in particular has been quite busy since November last year. Today I am having a look at two of their latest and greatest toys.

NVIDIA recently released a new platform in the form of the 790i family of chipsets, and along with it the GeForce 9 series. Today I will be reviewing the XFX 790i Ultra SLI motherboard and the XFX 9800GX2.

With the introduction of the 9800GX2, NVIDIA launched its second dual-GPU card after the 7950GX2, which honestly was a disappointment. So I was apprehensive about what I would be seeing while testing this card on a new platform that, again, follows the disappointing nForce 780i. But the impression this setup left on me was something I honestly did not expect or anticipate. I won't spoil it too much for you guys just yet.

Let's have a look, shall we?

33047f9f46df2791.jpg
33047f9f46e10966.jpg

[BREAK=nForce 790i Platform]
nForce 790i Platform.

The board features the NVIDIA 790i Ultra SLI chipset. The 7-series nForce platform for Intel now consists of four chipsets:

• 790i Ultra SLI
• 790i SLI
• 780i SLI
• 750i SLI

The 790i is replacing the 780i, which was quite frankly a disappointment.

790i Ultra SLI brings DDR3 into the picture; it is the first NVIDIA chipset to support DDR3 memory. It also brings support for 45nm Penryn CPUs to the NVIDIA camp, making this a complete chipset supporting the entire Intel LGA775 range.

nForce 790i Ultra SLI is NVIDIA's top-of-the-line flagship chipset for now, supporting the NVIDIA Tri-SLI feature. 790i SLI is basically the same chipset, but only the 790i Ultra SLI officially supports 2000MHz DDR3 RAM and is supposed to be the top-yield part.

The Southbridge is the good old NF570; nothing has changed in this department from the last generation. There is no separate NF200 chip required now, as its functionality is built into the Northbridge.
Let's have a quick look at the block diagram of the 790i Ultra SLI.

[BREAK=790i Ultra SLI Block Diagram]

790i Ultra SLI Block Diagram

33047f9f4c40468a.jpg

The motherboard provides 2 PCI Express 2.0 slots through the SPP (Northbridge). The 3rd PCI Express slot is provided through the Southbridge and is not PCI Express 2.0, which honestly does not make much of a difference.

The chipset supports a 1600MHz FSB, 10 USB 2.0 ports, DDR3 2000 with the new EPP 2.0, up to 5 PCI slots, Azalia HD audio, 2 native PATA and 6 SATA connectors, and dual Gigabit LAN. So what's special here? Nothing if you just look at the specifications, but the implementation of dual Gigabit LAN on the NVIDIA chipset is unique.
DualNet (that's what NVIDIA calls it) runs through the MCP Southbridge on the NVIDIA motherboard, and you can team the two ports together to work as a single LAN connection, which works really well on this board.
NVIDIA does not seem to stress this exceptional functionality of its chipset, as most home users will never use the feature. But if you are one of those who might, you are in for a treat.

As you can see, it's a fully loaded platform, and you would expect that from a top-of-the-line chipset for which you are asked to pay a premium price.

This board supports the two NVIDIA SLI data-transfer technologies: the PWShort and Broadcast methods.


33047f9f5228ae49.jpg
33047f9f522a3701.jpg

In the new Broadcast mode, instead of having to send the same data and commands to each GPU consecutively, only one data segment is sent across the FSB to the chipset, which replicates it in parallel to all GPUs. This greatly reduces congestion on the FSB and reduces latency for CPU-to-GPU messages, since the CPU has to send the data across the bus only once.
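The saving is simple to quantify. Here is a toy sketch of the counting argument only — not NVIDIA's actual mechanism, which is done in hardware by the Northbridge:

```python
# Toy model of the Broadcast-mode saving described above. Illustrative only:
# the real replication is performed in hardware by the chipset's Northbridge.

def fsb_transactions(num_gpus: int, broadcast: bool) -> int:
    """How many times one shared command buffer crosses the FSB."""
    return 1 if broadcast else num_gpus

for gpus in (2, 3):
    old = fsb_transactions(gpus, broadcast=False)
    new = fsb_transactions(gpus, broadcast=True)
    print(f"{gpus}-way SLI: {old} transfers -> {new} transfer, "
          f"{(1 - new / old) * 100:.0f}% less FSB traffic")
```

For 3-way SLI that works out to two-thirds less command traffic on the bus, which is where the latency reduction comes from.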

The PWShort (Post Write Short) method works by sending small amounts of data among the GPUs to keep them in sync with each other, and this is handled by a controller built into the Northbridge.
With 3-way SLI, though the majority of communication between cards is handled by the SLI connectors, the 3rd slot hangs off the MCP, so it does put extra load on the MCP and the HT link.

The new gimmick with this platform seems to be NVIDIA ESA. We will have a look at it a little later in the review.

Let's take a look at the board itself.

[BREAK=XFX 790i Ultra SLI]

The XFX board is a 100% NVIDIA reference model; basically it's a generic nForce 790i Ultra board under the XFX brand name.

Here is a quick overview of the features.

SATA Speed : 3.0 Gb/s
RAID : 0, 1, 0+1, 5
Socket : Intel Socket 775
Chipset : NVIDIA nForce 790i Ultra MCP
System Memory : Dual Channel 240-pin DDR3 up to 2000MHz w/ EPP 2.0
SLI Technology : 3 x16 (3-Way SLI)
PCI Slots : PCI-E 2.0 (2), PCI-E 1.0 (1), PCI-E x1 (1), PCI (2)
Front Side Bus : 1600 MHz with supporting CPU
Audio : 8-Channel High Definition Audio
Supported CPUs : Intel Penryn, Core 2 Extreme, Core 2 Duo, Core 2 Quad, Pentium
USB : (10) USB 2.0 ports (6 Rear + 2x2 Onboard)
IEEE 1394 (FireWire) : (2) 1394a @ 400 Mb/s (1 Rear + 1 Onboard)
LAN : Dual Onboard LAN, supports 10/100/1000 Mb/s
PCI-E : (2) PCI Express 2.0, (1) PCI Express 1.0
Native Gigabit Ethernet Connections : Dual
SATA/PATA Drives : 6/2
Highlighted Features : Windows Vista Ready, TCP/IP Acceleration, NVIDIA LinkBoost Technology, SLI-Ready Memory with EPP, NVIDIA MediaShield Storage Technology, NVIDIA FirstPacket Tech

[BREAK=What's in the Box]

Let's have a look at what we get.

33047f9f5d14accc.jpg


33047f9f5d183f06.jpg


33047f9f5d1ac663.jpg

The motherboard ships in a huge XFX box. It's really big; I was wondering why in the world it ships in such a box, but the amount of accessories and cables provided with the board justifies the sheer size. Everything is packed in very nicely. You even get the typical XFX "do not disturb" door handle placard. :p

Following is the list of accessories and cables included with the board.

• Six black SATA cables with XFX logos and dust caps
• Three Molex to SATA power adapters (six SATA power connectors in total)
• One black rounded floppy cable
• One black rounded IDE cable
• One PCI bracket with four USB ports
• One PCI bracket with a FireWire port
• Manual and driver CD
• 3-way SLI and 2-way SLI bridge connectors
• Northbridge fan
• I/O shield / backplate
33047f9fb60073c5.jpg

[BREAK=A look at the motherboard design]
A look at the motherboard design.

As I have already stated, it's a 100% reference design.
33047f9fdae04c27.jpg
Overall the design looks good. It’s unusual but functional.

33047f9fdae28aa2.jpg

You get 4 DDR3 slots at the top right of the board. The placement is good, and replacing RAM sticks with a long graphics card in the slot is not a problem. Next to the RAM slots we have the 24-pin ATX power connector.

33047f9fe3da9a39.jpg

The CPU socket is surrounded by the NB and PWM heatsinks and heatpipes on all 4 sides. The board has a 6-phase power supply, and this assembly keeps the MOSFETs nice and cool. But there is plenty of clearance, and even installing a big cooler like the Thermalright Ultra 120 Extreme was not an issue at all.

The NB heatsink is tall. We could not mount the Ultra 120 facing the back of the board, as it got in the way of the Northbridge heatsink.
A fan that clips onto the NB heatsink is provided with the board. Thankfully NVIDIA got the fan orientation right this time around. On previous reference boards, the fan used to blow hot air downwards, basically dumping all the heat onto the back of the graphics card. That has been corrected, and the fan now blows the heat towards the CPU socket.

33047f9fe3d85484.jpg

The Southbridge wears a small heatsink of its own, connected to the Northbridge heatsink assembly by a metal strip.
The Northbridge cooling is sufficient, and it stayed pretty cool even while overclocking. But we did need good airflow across the Southbridge to keep it cool; it got pretty warm, which is not unusual for an NVIDIA MCP. Every nForce board I have used in the recent past, including 650i / 680i boards, had a hot Southbridge.

33047f9fdae4c169.jpg

The placement of the SATA connectors is a little odd: 2 are located at the bottom right of the board and the remaining 4 next to the RAM slots. This is highly unusual, but to NVIDIA's credit, none of these ports created any clearance or obstruction problems.

There is a 2-digit LED display on the board which shows POST error codes. The board also has 3 LEDs of different colours above the RAM slots, indicating main power, RAM power and standby power, which is a really nice touch that can prevent the accidental killing of DDR3 chips. Many times we tend to remove RAM modules while standby and RAM power are still on, which can kill your chips; a bright LED indicating there is residual or standby power on the RAM slots is very useful.

You also get Power and Reset push buttons at the bottom of the board, which is very useful when running the board on a test bench.

33047f9ff176d1ca.jpg

The board has 3 PCI Express x16 slots for 3-way SLI setups. Slot placement is spot on, with plenty of clearance for dual-slot coolers, though you will be left with no usable PCI slots if you go for a 3-way SLI setup with dual-slot cooling solutions. There are also 2 PCI Express x1 slots, which will be useful for other PCI Express cards such as the newly rolled out PCI Express sound cards from ASUS and Creative.

There is 1 eSATA port right next to the top PCI Express x1 slot, which IMHO is pretty useless; NVIDIA should have moved it to the rear I/O.

The placement of the EPS12V connector might also be a headache, especially in cases with a bottom-mounted PSU. Routing cables all the way up there while avoiding the CPU heatsink is a nightmare at times, and reaching it with a big cooler like the TRUE installed can be an issue.

33047f9fe3dd4649.jpg

The rear I/O is pretty standard: 8-channel audio, optical and coaxial digital outputs for sound, 6 USB 2.0 ports, 2 LAN ports, 1 FireWire, 1 eSATA, and PS/2 connectors for keyboard and mouse.

The only real letdown on this board is the use of standard electrolytic capacitors. When you are paying such a heavy premium, I would personally want to see all-solid capacitors on the board.

On the positive side, all heatsinks are held down by screws, which is a good thing; I never liked push-pins holding down motherboard heatsinks and MOSFET sinks.

[BREAK=Motherboard Bios]
Let's move on to the board BIOS.

The motherboard uses a Phoenix AWARD BIOS with the usual, standard layout; most users will feel right at home.

Let's look at all the options in the BIOS. Not much explanation is needed here; there's no point narrating what's visible on screen ;) The screenshots will show you the range of voltages and other options available to you.
Overall, the BIOS has everything you need to overclock this board substantially. Clocking this board was a piece of cake, to be honest.
33047fa00b5838c7.jpg
33047fa00b5a1bc2.jpg
33047fa00b5bdd2b.jpg
33047fa01088d3c5.jpg
33047fa0108aba87.jpg
33047fa0108c9550.jpg
33047fa0202914f5.jpg
33047fa0202afb7e.jpg
33047fa0202d076e.jpg
33047fa02ad9b443.jpg
33047fa02add03a8.jpg
33047fa02adf1eb8.jpg
33047fa02ee36290.jpg
33047fa02ee8ed33.jpg
33047fa02eeeccaa.jpg
33047fa03301ecbb.jpg
33047fa03303f262.jpg
33047fa03306e1fe.jpg


[BREAK=NVIDIA ESA and Control panel]

Now, NVIDIA ESA is an interesting concept. We did not have any ESA-certified products with us to actually test this feature, but the concept looks interesting. How functional it will be, we just can't tell you right now.

ESA basically enables 2-way communication over USB between ESA-certified devices like cases and coolers.

33047fa0394e31c1.jpg

Everything gets integrated into the NVIDIA Control Panel, and you can basically control everything from here. It's a nice tool; you can even check for software and BIOS updates and apply them right from Windows. Very handy.

Now let's move on to the 9800GX2.

[BREAK=9800GX2]
XFX 9800GX2
When NVIDIA announced this card, the sheer number of rumors, previews and leaked benchmarks that followed created a lot of hype around it. This is NVIDIA's return to the dual-GPU scene after a while, and it had to deliver an answer to the ATI 3870X2.

What was surprising about the launch of the GeForce 9 series was that NVIDIA decided to base the 9800 series on essentially the same architecture and core as the 65nm G92 used in the 8800GT and 8800GTS 512MB cards, with few minor tweaks if any.

The problems that have plagued SLI for a while now are cost and its inability to scale well with the addition of a second GPU. That is one of the main reasons many people had written off the 9800GX2 as a likely failure and disappointment.

So what is the picture in reality? We will get to that, but first let's have a look at the card.

[BREAK=XFX 9800GX2]

Like the motherboard, the card shipped in a huge XFX box, nicely packed.
33047fa041a64879.jpg
33047fa041a8ce9c.jpg
33047fa041aafb63.jpg

I was a little disappointed by the bundle, to be honest. It shipped with a Molex to 6-pin PCI Express adapter, driver and game CDs, 2 DVI-to-VGA dongles, and an audio cable for the DVI audio pass-through function of the card, which is very similar to what we have seen so far on ATI cards. This bundle creates a problem which we will look at in a while.

Here is a quick look at the specifications.

Spec.png

[BREAK=XFX 9800GX2 continued]
XFX 9800GX2 continued

Let's have a look at the card itself.

33047fa04ee408c0.jpg

33047fa04ee69c6a.jpg
33047fa04eea0683.jpg
As you can see in the pictures, it's a monster of a card. I had seen photos of it, but I never imagined it would be this big and heavy; it really looks and feels like a large brick.
It's big, bad and mean.

It is one big unit made up of 2 PCBs and a cooling solution enclosed in a plastic shell. Here is NVIDIA's reference image of how the assembly actually looks.

33047fa05408f853.jpg

The card's layout is a little odd: the two GPUs face each other, with the cooling solution sandwiched between the two PCBs. Highly unusual, but at the same time ingenious. Each PCB carries 1 GPU and 512MB of GDDR3 memory, and the two are linked together by an SLI bridge that follows the PCI Express 1.0 standard. So even though the GPUs are PCI Express 2.0 internally, the card uses PCI Express 1.0 for its on-card SLI link, which in itself is not much of a disadvantage and at the same time helps keep costs down for NVIDIA.
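The bandwidth cost of that internal PCI Express 1.0 link is easy to put in numbers. A back-of-the-envelope sketch, assuming the standard per-lane rates (250 MB/s per direction for PCIe 1.x, doubled for 2.0):

```python
# Peak per-direction bandwidth of a PCIe link: 250 MB/s per lane for gen 1,
# doubling with each generation. Encoding overhead and real traffic ignored.

def pcie_bandwidth_gbs(lanes: int, gen: int) -> float:
    """Peak bandwidth in GB/s, one direction."""
    per_lane_mbs = 250 * 2 ** (gen - 1)
    return lanes * per_lane_mbs / 1000

bridge = pcie_bandwidth_gbs(16, 1)  # the on-card link between the two PCBs
native = pcie_bandwidth_gbs(16, 2)  # what the G92 GPUs support natively
print(f"on-card PCIe 1.0 x16 link: {bridge:.0f} GB/s, "
      f"PCIe 2.0 x16: {native:.0f} GB/s")
```

So the internal link runs at half the bandwidth the GPUs could otherwise negotiate, which in practice matters little because, as noted earlier, most inter-GPU traffic goes over the SLI connection anyway.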

33047fa05947b771.jpg
33047fa0594a60a8.jpg

There are two power connectors on the card: one 6-pin PCI Express and one 8-pin PCI Express 2.0 connector. This is what created a big problem; we have dedicated an entire page of this review to it, and to how to solve the problem the ghetto way.

33047fa05948e189.jpg

The card has 2 dual-link DVI connectors as well as an HDMI output, which is really nice to see.

[BREAK=Uh oh!! Look ma, no power.]

As soon as I received the product, I immediately ran off to set up the test bench, plugged the card into the motherboard, and went to attach the PCI Express power connectors.

The 6-pin connector went in nicely. The 8-pin connector, however, would not go in. I was shocked; I thought this must be some mistake that surely would not exist in a retail version of the card, which this is. But it is there.

The slot that allows the power connector's clip to slide in and lock starts 1-2 mm to the left of where it should, which means the connector won't go in. I thought this must be an issue with older power supplies, so I went out and got a Tagan 800W PCI Express 2.0 certified PSU. Its 8-pin connector would not go in either.

So I went out and borrowed a Corsair HX620 with the new 8-pin cables. That would not go in either, and at this point I was angry and frustrated.

I was not going to break my PSU's locking clip, so I thought I would somehow get a 6-pin connector in there instead. I forcefully managed to insert one; it went in a little crooked, but it went in. Every other card in the past that shipped with an 8-pin connector booted just fine with only 6 pins plugged in. So with the 6-pin connector in, I hit the power button with a smile on my face, only to be greeted by a red light from the card and a horrible warning sound from its onboard buzzer. It would not boot with a 6-pin connector, and it would not accept the 8-pin connectors of the 3 PSUs I had with me. After 2 days of hunting I got hold of a Molex to 8-pin adapter.

I thought the problem was solved, but even this adapter would not work for some reason; it greeted me again with the red light and buzzer.

Not ready to give up, I went through the PCI Express 2.0 specification, only to find out that the two extra pins are nothing but ground. I banged my head against the wall at this point, to be honest.

So a simple solution came to mind. I took the pins out of an old female Molex connector and attached wires to them, inserted those two pins into the 7th and 8th positions of the card's socket, and connected the other ends to the ground pins of the 6-pin PCI Express connector.

33047fa063654831.jpg

And voila, we had liftoff. The card gave nice green LED signals and finally booted for the first time.

I do not know what the hell NVIDIA's engineers were thinking designing the connector this way. I dug up some dirt and asked around, and this problem has been confirmed by XFX, MSI, eVGA and BFG card owners; and yes, retail card owners. So the problem is not restricted to review samples. It's a real problem, and XFX did not bundle a 6-pin to 8-pin PCI Express adapter with this card, which would have made life much easier.

I hope they take notice of this and do the needful. Though the mod is simple, it is not really an ideal solution; if you touch or move these pins by mistake, the display goes blank.

[BREAK=Software for the card.]
As you must know, the NVIDIA Forceware driver suite is all you need for this card. So why is this page here?

This page is here to show you something new I noticed in the control panel with this GPU: additional video properties.

33047fa06d13d3f5.jpg

NVIDIA has added contrast adjustment and a Dynamic Contrast feature to Forceware. Upon inquiring, it was confirmed that this feature will also be available on the GeForce 8 series of cards. And it really works well.

Two thumbs up.

Enough of this talk. Let’s now move onto some real benchmark numbers.

[BREAK=Test system and Software used]
Our test system consists of the following hardware.

CPU : Intel C2D E8400
Ram : Kingston HyperX DDR3 1625
PSU : OCZ GameXstream 700W
Operating System : Windows XP SP2 (I still don't consider Vista a gaming platform, so sorry folks, no Vista numbers for now.)
Cooler : Thermalright Ultra 120 Extreme.
Drivers : Forceware 174.74 WHQL
Motherboard Bios : P03R02
Software : HDTach , 3Dmark 2006, Rivatuner 2.08 for card overclocking, GPUZ, CPUZ
Games Used : World In Conflict, Unreal Tournament 3, Bioshock, Call Of Duty 4, Crysis

[BREAK=How we tested.]

This is a high-end gaming platform, so all tests were done in a way that reflects high-resolution gaming. With 20 and 22 inch monitors becoming mainstream, a resolution of 1680x1050 made sense, and this being a high-end GPU, we eliminated the lower resolutions.

Each test was run multiple times to make sure the results were consistent, and an average of the results was taken. The settings used for each game are mentioned on its page.

We pitted the card against an 8800GT clocked to 600/1000, the same as the 9800GX2's clock speeds. It made sense to use it against this card, as this will give you an idea of how well the 9800GX2's dual-GPU solution actually scales in games.
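The arithmetic behind the percentage figures quoted on the following pages is straightforward. Here is a sketch of it, with made-up fps numbers purely for illustration:

```python
# Each benchmark is run several times per card, the runs are averaged, and
# the dual-GPU gain over the clock-matched 8800GT is reported as a percent.
# The fps values below are invented for illustration only.

def scaling_gain(single_fps_runs, dual_fps_runs):
    """Percent improvement of the dual-GPU average over the single-GPU one."""
    single = sum(single_fps_runs) / len(single_fps_runs)
    dual = sum(dual_fps_runs) / len(dual_fps_runs)
    return (dual / single - 1) * 100

print(f"{scaling_gain([40, 42, 41], [78, 80, 79]):.0f}% improvement")
```

A gain near 100% would mean the second GPU is contributing almost fully; a gain near 0% would mean SLI is not scaling at all in that title.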

[BREAK=The DDR3 RAM]

As stated, the RAM used was a Kingston HyperX PC3-13000 DDR3 kit, model no. KHX13000D3LLK2/2G.

The kit we received for testing was rated at 1625MHz. With DDR3, you really need fast RAM like this to see the potential of DDR3 and its advantage over DDR2.

Here is the quick look at the features and specifications.

  • Non-ECC, Unbuffered
  • 2GB kit (2x 128Mx64)
  • DDR3-1633 CL7-7-7-20-1T latency
  • RoHS Compliant
  • Enhanced low power features and thermal design
  • Rated Voltage : 1.9v
  • Lifetime Warranty

The kit is rated at 7-7-7-20 at 1625MHz, but we managed to get it running at 7-6-6-24 at 1.9V, which is indeed very good. The max this kit clocked was 1800MHz at 2.0V with 7-7-7-24 timings.

Here are a couple of snaps of the kit.

33047fcbbf0de37c.jpg


33047fcbbf10774d.jpg


Now let's move on to the actual benchmarks.

[BREAK=Hard Disk Performance.]

Hard disk performance was satisfactory. It took 19 seconds to copy a 1.07GB file (roughly 56MB/s) from one Seagate 7200.10 320GB drive to another identical drive.

The HDTach scan of the drive was again impressive. Nothing much to complain about here.

33047fa071c55c15.jpg

[BREAK=Memory performance.]

As I stated earlier, this board supports DDR3 memory. The beauty of NVIDIA chipsets in the past has been their ability to run memory totally independently of the FSB clock speed: you have the option to run memory unlinked, linked, or synced 1:1.

So how does it affect performance?
Let's have a look.

We used the Everest memory and cache benchmark tool to test.
Memory was kept constant at 1600MHz with timings of 7-6-6-24; we then tried running it in Auto, Sync and totally Unlinked modes.

Here is the first screenshot, with the FSB at the default 1333MHz and memory at the default 1600MHz.

33047fa0784327a9.jpg
The next screenshot shows the FSB and memory running in Sync mode at 1600MHz. As you can see, memory bandwidth jumps substantially.

33047fa078450afa.jpg

Next we ran both the FSB and memory in Unlinked mode, but at the same speed of 1600MHz. Memory bandwidth and latency turned out roughly the same as Sync mode, but if you look closely, the L2 cache bandwidth jumps significantly. I have no rational explanation for this, but it is an interesting observation.

33047fa07846eb08.jpg
Last, we pumped the FSB up to 1860MHz, keeping memory unlinked at 1600MHz. Again memory bandwidth jumped significantly, even though the memory speed and timings were constant.

33047fa07d49d825.jpg

You can safely conclude that FSB and memory bandwidth are directly related: even in Sync and Unlinked modes, the higher the FSB, the better the memory bandwidth.
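One plausible explanation for that relationship, using peak theoretical figures only: on this platform every CPU memory access crosses the FSB, and the FSB's own ceiling sits well below what dual-channel DDR3 can deliver, so raising the FSB raises the ceiling Everest can measure. A quick sketch of the numbers:

```python
# Peak theoretical bandwidth figures. The FSB is 64 bits (8 bytes) wide, so
# at 1333 or 1600 MT/s it caps well below dual-channel DDR3's theoretical peak.

def fsb_peak_gbs(mt_per_s: int, bus_bytes: int = 8) -> float:
    """Peak FSB bandwidth in GB/s for a 64-bit bus at the given megatransfers/s."""
    return mt_per_s * 1e6 * bus_bytes / 1e9

def ddr3_dual_channel_gbs(mt_per_s: int) -> float:
    """Peak dual-channel DDR3 bandwidth in GB/s (two 64-bit channels)."""
    return 2 * fsb_peak_gbs(mt_per_s)

print(f"FSB 1333: {fsb_peak_gbs(1333):.1f} GB/s, "
      f"FSB 1600: {fsb_peak_gbs(1600):.1f} GB/s, "
      f"FSB 1860: {fsb_peak_gbs(1860):.1f} GB/s, "
      f"DDR3-1600 dual channel: {ddr3_dual_channel_gbs(1600):.1f} GB/s")
```

With the FSB topping out at roughly 10.7-14.9 GB/s against DDR3-1600's theoretical 25.6 GB/s, any FSB increase feeds almost straight through into the measured memory bandwidth.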

Now let's move on to the 3D tests.

[BREAK=3DMark 2006 v1.1.0]

I had a chance to test this card directly with the new 174.74 drivers, which are supposed to be optimized for it.
With a moderate overclock on the dual-core E8400 and the card at stock, we got a score in excess of 19000, which quite honestly was a little more than I expected. A good start.
Here is a chart comparing the scores of the 8800GT and the 9800GX2.
3D06.png
As you can see, we get roughly a 43% improvement in score with the 9800GX2.

33047fa084f790e0.jpg

[BREAK=Unreal Tournament 3]
Unreal Tournament 3

33047fa08d159a75.jpg

All in-game settings were set to max; 4xAA and 16xAF were forced from the NVIDIA control panel.
The game was run at 1680x1050. This is one of the few games that really uses multi-core CPUs.

Here are the numbers.
UT3%20Heatray.png


Ut3%20Shangrila.png
Now this is where the card started kicking out some serious numbers. Just look at the score improvement: 81% and 90% in the two maps we tested. That is indeed impressive, and it really shows how well the 2 GPUs are scaling here.

[BREAK=World In Conflict]
World In Conflict

33047fa0b6d9dd06.jpg
This is one of the best-looking RTS games out there: beautiful graphics, very good gameplay and very good online action. This game has it all; RTS graphics were never this good before.

We used the Very High preset from the game options, which forces 4xAA and 4xAF. No manual adjustments were made, and we used the game's built-in benchmark to test.

Let's have a look at the numbers.

WIC.png

Again we see more than a 50% improvement in performance. Not bad at all.
[BREAK=Bioshock]
Bioshock

33047fa0b6dada3d.jpg

This was one of the first games that really made the UT3 engine shine. The game environment is absolutely fantastic.

Benchmarking this game is a little tough, as there is no built-in benchmark, so we relied on FRAPS to do the work.
From the first level's save point, we started the game on each card, moved around the level for a minute, and tried to replicate the same movement on the second card. 4xAA and 16xAF were used.

Here are the numbers.

Bioshock.png
Again, a 94% improvement. It looks like the UT3 engine really scales well with this card.

[BREAK=Call of Duty 4]

Call of Duty 4

33047fa0b6dbcd17.jpg

Arguably the best release of 2007: one of the most popular games out there, with very intense action and very good gameplay. It looks great as well, and given its popularity, it was impossible to omit this game from our tests.

The game was run with everything maxed out, 4xAA and 16xAF; every single in-game option was set to max.

Here are the numbers.

COD4.png
Again the 9800GX2 really shines here, with an 86% improvement; this game really benefits from the card. It's a very fast, intense game with lots of action, and a high FPS in shooters really helps. Playing this game on the 9800GX2 was a heavenly experience.

[BREAK=Crysis]

Crysis

33047fa0bc2759b4.jpg

Ah, so finally the last test. I kept it for last on purpose. This game has been called the nemesis of modern GPUs; it is all about pushing the limits of graphics and pure eye candy.

All in-game settings were set to the maximum possible setting of High; we then forced a fake Very High on XP with the system.cfg file. The AA in this game really looks bad, but the game allows you to force Edge AA using the cfg file, which looks a lot better than the in-game AA setting.

Here is the cfg file we used.

con_restricted=0
r_motionblur=0
r_UseEdgeAA=2
r_UsePOM=1
r_sunshafts=1
e_water_ocean_fft=1
q_Renderer=3
r_colorgrading=1

This gives the best possible results in terms of eye candy on XP.

Let's have a look at the numbers.
Crysis.png

Well, a 60% improvement, and the framerate we got from this card was impressive to say the least. Gameplay was totally smooth. This is one card that will actually let you play this game the way it was meant to be played.

Now that all the 3D tests are over, let's head over to overclocking.

[BREAK=Overclocking.]

Overclocking the processor using this motherboard was a piece of cake; it just kept clocking. We tried the auto overclock setting, and it let us reach an impressive 450MHz FSB without touching a single setting in the BIOS.
The max FSB on air on this motherboard with the dual-core E8500 was 580MHz, which is not bad at all, though we did run into a few crashes at that speed; the max stable FSB was 575MHz without any vmods. I did not get much time to experiment with this board as we were under a time restriction, but a more mature BIOS and some playing with the settings should get it even higher.

The board shipped with a BIOS dated 14th Feb 08 which was actually horrible in memory performance; it clocked well, but the performance simply was not there. The board was then flashed with the new BIOS version P03R02, which added better voltage options and P1/P2 options, and actually brought the performance to this board.
This BIOS, however, restricted the memory overclock on our board: we could not go beyond 1700MHz with the P03 BIOS, whereas on the P02 BIOS we could go as high as 1900MHz. Trying every memory option yielded nothing; however loose the timings we used, we were limited to DDR3 1700.
As far as the 9800GX2 is concerned, I was able to clock it to 720/2400MHz, but the card seems pretty much limited by the dual-core CPU: 3DMark showed no significant gain, with a max score of 19922, less than a 1000-point improvement.

But there is not much to complain about. The board does need a more mature BIOS, and I hope NVIDIA and its partners like XFX work on this and release a good BIOS for the board.

33047fa0c65865f3.jpg

[BREAK=Analysis and Conclusion.]
Analysis.

As a platform, 790i Ultra SLI is looking very good. It's a complete platform for Intel users who want to opt for SLI; it's got everything you can imagine: performance, features, the wow factor.
Its main competition is Intel's X48. NVIDIA has been in the hot seat for a while now: Intel's P35 and X38 platforms matched the memory performance of the 680i and 780i boards, and with DDR3 even outperformed them. So NVIDIA really needed something good to compete against Intel's chipsets if it was to survive in this business.
They came out with a bang. The 790i is impressive, exactly what they needed. What NVIDIA has to work on is the pricing. At the moment they are charging their motherboard partners a premium for this chipset; they have exclusive support for SLI, which is probably why they can charge so much, but they have to understand and learn from their previous mistakes.

This is the costliest motherboard I have ever tested, and it is currently one of the costliest boards to purchase, restricting its reach to very few people.
The 9800GX2 is in the same boat. Performance is stunning: no single card out there will match the 9800GX2, and it does deliver the goods this time around, with fantastic efficiency and dual-GPU scaling; the performance improvement is actually significant. What NVIDIA needs to work on now is supporting these cards with proper drivers for 9800GX2 SLI, and churning out more WHQL drivers. Mind you, I never cared about WHQL certification and have used more beta drivers than official releases over the last year or two, but there are plenty of people out there who do care.

The price/performance ratio of this card is also a concern: two 8800GTs cost less than this card and will surely match it or come very close. But then again, this card gives you a dual-GPU setup on a non-SLI motherboard, so it's a this-or-that situation for customers.

To round it off, both products reviewed today offer great performance, but at a cost. If you can afford them, and fast DDR3 memory to top it off, you have an unmatched configuration on your hands.

If you ask me to score them out of 10, here is how it goes.

XFX 790i Ultra 3 Way SLI Motherboard

Performance : 9/10
Build Quality : 9/10
Features : 10/10
Ease of use : 9/10
Price : 7/10

XFX 9800GX2

Performance : 9/10
Build Quality : 8/10
Features : 8/10
Ease of use : 8/10 (1 mark lost for the annoying power connector problem)
Price : 7/10

Our sincere thanks to Rashi Peripherals and XFX for providing these two products for review.
Special thanks to EXL Public Relations and Kingston for providing the Kingston HyperX DDR3 1625MHz kit.

Reviewed by : Shripad aka Funky.
 
gannu said:
Ah.. SLi Bugger... :p

Drooling over just the mobo- not the card. :D

Will wait and see if and what DFI does with this chipset- and anyways i am looking at DDR3 as my purchases are planned sometime from now. :)

Plus, if i have to go for intel as far as proccies are concerned and no choices there for sometime now, atleast the mobo will be not intel. :p
24x7 OC is not hampered in anyway i think- though extreme runs will be better on the other chipset mobos. :)

Plus CF mobos do not tempt me much coz even ATi is out of the performance scene for sometime now- nVidia is churning out great cards and one can always buy/borrow one more for some benching fun/TP. :eek:hyeah:
And trust me, having an sli mobo helps big time at times- like not having to spend extra for some silly reasons like i had to recently.

I know how helpful it was after having parted with one. :rofl:
 
BIKeINSTEIN said:
Drooling over just the mobo- not the card- and i still love my nf4 mobo- nVidia was the king then. :p
Will wait and see if and what DFI does with this chipset. :eek:hyeah:
Plus, if i have to go for intel as far as proccies are concerned and no choices there for sometime now, atleast the mobo will be not intel. :p
24x7 OC is not hampered in anyway i think- thougfh extreme run will be better on the other chipset mobos. :)
Plus CF mobos do not tempt me much coz even ATi is out of the performance scene for sometime now- nVidia is churning out great cards and one can always buy/borrow one more for some benching fun/TP. :eek:hyeah:
And trust me, having an sli mobo helps big time at times- like not having to spend extra for some silly reasons like i had to recently.
I know how helpful it was after having parted with one. :rofl:

Acha, dats rite.. :)

But this mobo aint good fer ppl who wudn want to opt fer SLi..
Yet some may still buy it but tat'll be unnecessary if not a waste since X38 + DDR2 combos can still compete with a Single good gfx card...

DFI's known to make kickass mobos btw.. Neve had an experience to use one though.. Maybe, if M lucky enuf to sell my present one and acquire another one, but P35 does the job fer me.. :)
 
gannu said:
Acha, dats rite.. :)
But this mobo aint good fer ppl who wudn want to opt fer SLi..

Yet some may still buy it but tat'll be unnecessary if not a waste since X38 + DDR2 combos can still compete with a Single good gfx card...
DFI's known to make kickass mobos btw.. Neve had an experience to use one though.. Maybe, if M lucky enuf to sell my present one and acquire another one, but P35 does the job fer me.. :)

IP-35E/pro + DDR2 is still the best out there as of now- i wouldn't look for anything else if buying one now. :)
 
Funky said:
Thx people :)

@ibz : You can hook up another card for quad SLI. SLI bridge is hidden with dust protection cover. ;)

now im pretty sure everyone here would like a review of a quad sli setup...wat say funky?:clap:

excellent review :eek:hyeah:
 
nice review :) i dont believe your got faster ddr3 ram i wish i need some ddr3 ram for test my p35 board :(
 