Exclusive: Intel Larrabee 5 times faster than Nvidia, AMD

vishwaswish

Disciple
NOTE: Earlier today the story was posted at another site and removed after 2½ hours online.

Intel's upcoming Larrabee chip will pack nearly two billion transistors and 64 tiny x86 processor cores. That's what sources close to Intel's intriguing co-processor project have told TechRadar today.

Factor in an expected operating frequency of at least 2GHz and it's likely Larrabee will have four to five times the raw computational grunt of the fastest current graphics chips.

Now, you might be wondering whether this is just another throwaway rumour, a headline-grabbing stab in the dark. Certainly Larrabee remains a long way out. It won't appear until the end of next year at the absolute earliest. Much can change in that time.

But if you cast your web browser back to last summer, you may remember a little story we broke. We suggested that Larrabee's core architecture would be based on the Pentium MMX CPU of the late 1990s. And whaddya know, that turned out to be true.

So, what else do we know about Larrabee? We're told the first generation chip will be available in both 64 and 32 core flavours (though we're not sure if both will be a go at launch). The chip itself is said to be big, really big.

In fact, it's on a similar scale to Intel's recently released six-core Xeon server chip, codenamed Dunnington. Indeed, Dunnington is built using essentially the same 45nm silicon process as Larrabee will use and packs 1.9 billion transistors. In other words, Intel is already making a Larrabee-scale x86 chip with existing technology.


We also hear the board that Larrabee will be strapped to looks very much like a conventional graphics card complete with a large but not ludicrous cooling solution.

Of course, what we all really want to know is how powerful Larrabee will be. Well, that's something we can also shed some light on, at least in theoretical throughput terms.

Intel has revealed that each Larrabee core will pack a supercharged ALU capable of 16 vector operations per cycle. With 64 cores that works out to no less than 1,024 vector ops per cycle. By comparison, AMD's top GPU, the Radeon HD 4870, can do 800 per cycle while NVIDIA's GeForce GTX 280 knocks out a relatively modest 240.
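For a quick sanity check on that per-cycle arithmetic, here's a trivial Python sketch using only the figures quoted above (the core counts and ALU widths are the article's claims, not confirmed specs):

```python
# Per-cycle vector throughput from the figures quoted in the article.
# These are claimed/estimated numbers, not confirmed specifications.
chips = {
    "Intel Larrabee":         {"units": 64,  "vector_ops_per_unit": 16},  # 64 cores x 16-wide ALU
    "AMD Radeon HD 4870":     {"units": 800, "vector_ops_per_unit": 1},   # 800 stream processors
    "NVIDIA GeForce GTX 280": {"units": 240, "vector_ops_per_unit": 1},   # 240 shader cores
}

for name, c in chips.items():
    print(f"{name}: {c['units'] * c['vector_ops_per_unit']} vector ops per cycle")
```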

The next part of the equation, of course, is operating frequencies, or how many cycles a chip can crank out per second. NVIDIA's GPUs tend to have high shader frequencies of around 1.5 to 1.7GHz, helping to close the performance gap to AMD's typically wider ALU arrays which currently operate well below 1GHz.

But how fast will Larrabee run? That's probably the hardest attribute to pin down since attainable clockspeeds are so sensitive to the quality of final production chips. But expect something in the range of 2GHz to 2.5GHz.

If we roll with the more conservative forecast, here's how Larrabee lines up compared with the best current technology:

The Specs
Intel Larrabee: 1,024 vector ops per cycle at 2GHz
NVIDIA GeForce GTX 280: 240 vector ops per cycle at 1.6GHz
AMD Radeon HD 4870: 800 vector ops per cycle at 725MHz
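Multiplying width by clock gives each chip's theoretical peak. Here's that calculation as a small Python sketch (Larrabee's 2GHz is the article's conservative forecast, not a confirmed spec):

```python
# Theoretical peak = vector ops per cycle x clock frequency.
# Larrabee's clock is the article's conservative forecast.
specs = {
    "Intel Larrabee":         (1024, 2.0e9),   # (ops/cycle, Hz)
    "NVIDIA GeForce GTX 280": (240,  1.6e9),
    "AMD Radeon HD 4870":     (800,  725e6),
}

peak = {name: ops * hz for name, (ops, hz) in specs.items()}
for name, total in sorted(peak.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total / 1e9:,.0f} billion vector ops/sec")
for rival in ("NVIDIA GeForce GTX 280", "AMD Radeon HD 4870"):
    print(f"Larrabee vs {rival}: {peak['Intel Larrabee'] / peak[rival]:.1f}x")
```

On paper that lands at about 5.3x the GTX 280 and 3.5x the HD 4870.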

As things stand, that makes Larrabee around four to five times as powerful as anything available today. Needless to say, AMD and NVIDIA will not be standing still between now and the launch of Larrabee. As NVIDIA's Ben Barraondo says, "NVIDIA has never stood still in the market. We're more than confident we'll be very competitive with Larrabee when it eventually arrives."

There's also a big difference between theoretical throughput and usable performance. That's even more true in the context of Larrabee's revolutionary programmable architecture. With software controlling nearly the entire rendering pipeline, gauging how efficiently Larrabee will use its compute resources is currently impossible.
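To give a feel for what "software controlling the pipeline" means in practice, here's a purely illustrative Python sketch; none of these function names come from Intel, it's just the general shape of a renderer where every stage is ordinary code rather than fixed-function silicon:

```python
# Hypothetical sketch of a fully software-defined rendering pipeline.
# None of this is Intel's actual design; the point is that every stage
# is ordinary code, so efficiency depends entirely on the software.

def edge(a, b, p):
    """Signed area test: which side of edge a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri):
    """Naive bounding-box rasterizer: yields pixel coords inside tri."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    for y in range(int(min(ys)), int(max(ys)) + 1):
        for x in range(int(min(xs)), int(max(xs)) + 1):
            p = (x + 0.5, y + 0.5)
            w0 = edge(tri[1], tri[2], p)
            w1 = edge(tri[2], tri[0], p)
            w2 = edge(tri[0], tri[1], p)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                yield x, y

def render(triangles, vertex_shader, fragment_shader, width=8, height=8):
    fb = [[0] * width for _ in range(height)]
    for tri in triangles:
        screen = [vertex_shader(v) for v in tri]   # vertex stage: a plain function call
        for x, y in rasterize(screen):             # raster stage: a software loop, not silicon
            if 0 <= x < width and 0 <= y < height:
                fb[y][x] = fragment_shader(x, y)   # pixel stage: another function call
    return fb

# Draw one triangle and print the framebuffer as ASCII.
fb = render([[(0, 0), (7, 0), (0, 7)]],
            vertex_shader=lambda v: v,
            fragment_shader=lambda x, y: 1)
print("\n".join("".join("#" if px else "." for px in row) for row in fb))
```

On a conventional GPU the rasterizer and output merger are dedicated hardware; on a Larrabee-style design each of those loops is just code running on the x86 cores, which is exactly why its real-world efficiency is so hard to predict.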

What's more, raw vector throughput is just one, extremely simplified, metric of performance. Larrabee will need to perform well in many other areas. But on paper, at least, it looks like it will be right up there with the best at launch.

So now you know why Intel is so bullish about the prospects of its brand new GPU-killing architecture in the face of AMD and NVIDIA's tried and tested chips. And now you know why the industry is so excited about Larrabee.



Source
 
They still have to write the driver for it.

And after repeated fiascos with the G965, G31, G43 and G45, I'm not sure they can do it. The G45 still doesn't do proper T&L in hardware, and none of the chips can read EDID off the monitor and adjust the resolution to match. It still depends on resolutions hardcoded in the BIOS (system BIOS). Which means that if you move to a monitor with an unsupported resolution, you're SOL.
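For what it's worth, pulling the native resolution out of EDID isn't rocket science. Here's a rough Python sketch of parsing the first detailed timing descriptor from the raw EDID blob a Linux kernel exposes via sysfs (the connector name varies per system, and this skips checksum and extension-block handling):

```python
# Rough sketch: read the panel's native resolution from its EDID.
# The connector name (card0-VGA-1, card0-HDMI-A-1, ...) varies per
# system; adjust EDID_PATH for your setup.
EDID_PATH = "/sys/class/drm/card0-VGA-1/edid"

with open(EDID_PATH, "rb") as f:
    edid = f.read()

# The first detailed timing descriptor starts at byte 54 and usually
# describes the panel's preferred (native) mode.
d = edid[54:72]
h_active = d[2] | ((d[4] & 0xF0) << 4)   # upper nibble of byte 4 = high bits
v_active = d[5] | ((d[7] & 0xF0) << 4)   # upper nibble of byte 7 = high bits
print(f"Native resolution: {h_active}x{v_active}")
```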

No money to Intel for a product that depends on their software. They make killer hardware but can't seem to write a proper driver for anything.
 
If it's true, nothing could be better. Whichever company it is, faster chips always push the whole market to get faster.
In the end, we win.
 
Intel is dangling a non-existent carrot right now.

Unfortunately, theoretical throughput tells very little about actual performance, so we'll probably have to wait about a year before getting any wiser about real numbers. I hope it turns out well though; the more competitors the better. I would be shocked if it performed more than 1.3-1.5x better than ATI/AMD's or NVIDIA's offerings by the time it gets released, though. I'm still in doubt whether it can even handle the most demanding games as well as AMD and NVIDIA's offerings, but I hope it will. Even though it will have lots and lots of small cores, I think using them all efficiently will be a problem, and it will probably take a while before games start running optimally on it.

Theoretical hardware performance is one thing; actually using it via smart software is another. Look how much effort it takes NVIDIA's and AMD's driver teams to manage their GPU business. It will be a daunting task for Intel, starting from scratch.
 
Well guys, we are comparing Larrabee in 2010 to ATI/NVIDIA in 2008. Within two years, i.e. roughly 2-3 generations of graphics cards, NVIDIA and AMD will kick Larrabee's ass!

It's really like comparing a 6800GT with a 9800GTX, or a 7800GT with a GTX 260/280... come on guys, grow up!
 
It's really like comparing a 6800GT with a 9800GTX, or a 7800GT with a GTX 260/280... come on guys, grow up!

If you read it, it's Intel who's boasting about their new GPU/CPU sandwich. Everyone knows things can change dramatically in a matter of weeks. What's mind-boggling is Intel comparing Larrabee with the GTX 280 and HD 4870 (by 2010 the HD 4870 will be in the budget group).

Everyone knows how badly Intel sucks in the graphics scene. Their current G45 at ~6-10k can't catch up with the 780G at 4-6k. My 2 cents.
 
sangram said:
They still have to write the driver for it.

And after repeated fiascos with the G965, G31, G43 and G45, I'm not sure they can do it. The G45 still doesn't do proper T&L in hardware, and none of the chips can read EDID off the monitor and adjust the resolution to match. It still depends on resolutions hardcoded in the BIOS (system BIOS). Which means that if you move to a monitor with an unsupported resolution, you're SOL.

No money to Intel for a product that depends on their software. They make killer hardware but can't seem to write a proper driver for anything.

You hit the nail on the head. Intel's graphics drivers suck big time.
 