Larrabee GPU cancelled!

Intel told Ars today that its long-delayed Larrabee discrete graphics product has suffered yet another delay, so the company has had to "reset" its overall GPU strategy and reposition its plans and expectations for the first version of the Larrabee product.

Specifically, Larrabee v1 is so delayed that, by the time it eventually launches, it just won't be competitive as a discrete graphics part, so Intel plans to wring some value out of it by putting it out as a test bed for multicore graphics and supercomputing development. Intel will eventually put out a GPU, but it may not be the one we've been calling "Larrabee" for the past few years.

If the fact that Larrabee is "launching" not as a GPU, but as a kind of multicore graphics demo unit, sounds like a cancellation to you, that's because it kinda sorta is. It's not a cancellation in the sense that Intel is throwing in the towel on discrete graphics, because that's definitely not the case. The company reiterated that it still plans to launch a many-core discrete graphics product, but won't be saying anything about that future product until sometime next year. Whatever it is, it won't be the hardware/software combination that it previously announced, and that we described in our coverage of Intel's big August 2008 Larrabee reveal. It will be something else, and Intel wouldn't even characterize the relationship of that future something to the current Larrabee product.

The main issue behind the delay, it appears, was the hardware. That's not surprising, because Larrabee is a big, complex part, and it's quite a departure from anything that Intel has done. The hardware delay would have resulted in a software delay, and if Intel were to launch Larrabee with an immature software stack then it would be roadkill as a GPU.

Even though Intel couldn't have the Larrabee software ready on a timeframe that would make it competitive with NVIDIA and ATI (again, Larrabee is really a hardware/software hybrid GPU), the chipmaker can still push out the hardware itself and let others have a go at using it for graphics and HPC. Hence the plan to release it as a development platform for doing multi- and many-core research for HPC and graphics.
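To make that "hardware/software hybrid" point concrete, here is a minimal sketch (ours, not Intel's code) of what a graphics pipeline implemented in software looks like. Even basic triangle rasterization is just an ordinary loop running on the CPU cores, which is why an immature software stack would sink the part as a GPU no matter how good the silicon is.

Code:
// Illustrative only: on a Larrabee-style design the "GPU pipeline" is code
// running on x86 cores, so the maturity of that code decides performance.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Edge function: >= 0 when point p lies on the inner side of edge a->b
// (assumes counter-clockwise triangle winding).
static float edge(Vec2 a, Vec2 b, Vec2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Naive software rasterizer: test every pixel against the triangle's edges.
// A production stack replaces this with tiled, vectorized, multi-core code.
// Assumes fb holds w * h pixels.
void rasterize(std::vector<uint32_t>& fb, int w, int h,
               Vec2 v0, Vec2 v1, Vec2 v2, uint32_t color) {
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 &&
                edge(v2, v0, p) >= 0)
                fb[static_cast<std::size_t>(y) * w + x] = color;
        }
}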

The Larrabee delay is obviously great news for NVIDIA, and even better news for AMD. This leaves both competitors to share the market for discrete GPUs, and quite possibly next-gen game consoles, for the next two years. NVIDIA's underlying long-term problems (the death of its integrated graphics processor market and the ongoing decline in the high-end discrete GPU market) are still there. And AMD still has well-documented challenges of its own as it struggles to regroup and return to growth after a brutally punishing few quarters of layoffs and cost-cutting. But execs at both companies have got to be high-fiving each other right now.

As for Intel's long-term future in the discrete GPU market, we'll have to wait until next year before we know more. Thankfully, that's just around the corner.

Source:
Intel's Larrabee GPU put on ice, more news to come in 2010
 
Intel hasn't failed to deliver in quite a while now. I'm pretty sure it'll put NVIDIA and ATI out of business when it does come out. Judging by their claims, they are planning something very big, so setbacks like these are to be expected. Can't wait for it!

Nice share!
 
Intel never can and never will put ATI/NVIDIA out of business -- first they would need to dent the gamer's thick-headed mentality. There's only room for two, and it ain't big enough for Intel to step in!!! (I would never buy an Intel gfx solution because it won't be comparable to the biggies, ATI/NVIDIA.) And 16 years back, when I was a kid, we had 3dfx Voodoo, Savage, S3, Trident, NVIDIA, ATI, Mirage and loads of other vendors; just two survived -- ATI/NVIDIA (probably because they ate up the other competitors).

I am highly skeptical of Intel's Larrabee -- maybe good for HTPCs or low-end gaming systems, but nothing more, and it would fail! Call me a pessimist if you like!
 
sunny27 said:
Intel never can and never will put ATI/NVIDIA out of business -- first they would need to dent the gamer's thick-headed mentality. There's only room for two, and it ain't big enough for Intel to step in!!! (I would never buy an Intel gfx solution because it won't be comparable to the biggies, ATI/NVIDIA.) And 16 years back, when I was a kid, we had 3dfx Voodoo, Savage, S3, Trident, NVIDIA, ATI, Mirage and loads of other vendors; just two survived -- ATI/NVIDIA (probably because they ate up the other competitors).
I am highly skeptical of Intel's Larrabee -- maybe good for HTPCs or low-end gaming systems, but nothing more, and it would fail! Call me a pessimist if you like!

They can, actually... they've got so many resources... we all know how they can kill AMD... NVIDIA chipsets may be phased out soon too. I was hoping they would launch it... just wanted to see Intel fail big time...
 
sunny27 said:
Intel never can and never will put ATI/NVIDIA out of business -- first they would need to dent the gamer's thick-headed mentality. There's only room for two, and it ain't big enough for Intel to step in!!! (I would never buy an Intel gfx solution because it won't be comparable to the biggies, ATI/NVIDIA.) And 16 years back, when I was a kid, we had 3dfx Voodoo, Savage, S3, Trident, NVIDIA, ATI, Mirage and loads of other vendors; just two survived -- ATI/NVIDIA (probably because they ate up the other competitors).

I am highly skeptical of Intel's Larrabee -- maybe good for HTPCs or low-end gaming systems, but nothing more, and it would fail! Call me a pessimist if you like!

Wow.. those are some pretty heavy words you've used there. Can you justify that by saying something other than
I would never buy an Intel gfx solution because it won't be comparable to the biggies, ATI/NVIDIA
and
I am highly skeptical of Intel's Larrabee

If you can't, I would suggest you go and read up a bit :)

When Intel wants IN on something, they get IN on it :)

Also, just fyi, the i740 from Intel, back in the day before they exited the gfx market, was pretty damn good.
 
I am pretty sure that right now we are all saying Intel can't do it, but at this very instant they are doing it, and they will do it.

They have the monetary resources, the engineers, and the hardware.

I see no reason why Intel can't make a name for itself in the GPU industry.

The more competitors there are, the better it is for consumers.

GPUs may progress even further, and new technologies will be developed thanks to the competition from Intel.

If Intel were the only one in the CPU industry, we would still be stuck with a P4 @ 3.2GHz.
 
muzux2 said:
He was right about Larrabee being a PPT presentation

Actually... NO. Larrabee was, or is, very much real. The only reason Intel wants to axe the current project is that it's past its time. In its current form, Larrabee is not only unfinished, but also uncompetitive with the current crop of GPUs.

It's a very, very vague comparison, but the recently unveiled 48-core processor prototype takes a similar approach to Larrabee: multiple x86 cores, with the OGL and DX pipelines handled in software for graphics processing. x86 instructions are well established, and there isn't much of a learning curve compared to learning and adapting to CUDA, so Intel would very much want to take advantage of its strong, established developer base. Let's just say that Intel was building its own kind of Fermi... if Fermi is a cGPU, then Larrabee would be a gCPU, if you catch my drift.
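A rough sketch of what that "no learning curve" point means in practice (our illustration, not Intel or NVIDIA sample code; the function name and the array-scaling task are made up for the example): on a many-x86-core part, data-parallel work is plain standard C++ spread across threads, while CUDA needs its own kernel syntax and toolchain.

Code:
// Plain C++ on many x86 cores: an idiom existing developers already know.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Scale an array in parallel across however many cores the chip offers
// (2 today, 48 on a many-core prototype -- the same source code).
void scale(std::vector<float>& data, float k) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (data.size() + n - 1) / n;
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t lo = std::min(data.size(), t * chunk);
        std::size_t hi = std::min(data.size(), lo + chunk);
        pool.emplace_back([&data, k, lo, hi] {
            for (std::size_t i = lo; i < hi; ++i) data[i] *= k;
        });
    }
    for (auto& th : pool) th.join();
}

// The CUDA equivalent requires new syntax and a separate toolchain, e.g.:
//   __global__ void scale(float* d, float k) {
//       d[blockIdx.x * blockDim.x + threadIdx.x] *= k;
//   }
//   scale<<<blocks, threads>>>(dev_ptr, k);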

Intel will come back with Larrabee, but not as the current project or under the name we know. GPU-like parallel processing is a promising area for the HPC market, and Intel wouldn't want to give nVidia a comfortable head start there.
 
Anubis said:
I am pretty sure that right now we are all saying Intel can't do it, but at this very instant they are doing it, and they will do it.

They have the monetary resources, the engineers, and the hardware.

And yet, Larrabee is an utter failure right now. All their monetary resources, engineers and hardware couldn't save this disaster from happening. And guess what: with the Fermi GPGPU, NVIDIA is already moving the goalposts even further away from Intel. AMD/ATI will soon follow the same path. Larrabee will remain vaporware :rofl:
 
stalker said:
Wow.. those are some pretty heavy words you've used there. Can you justify that by saying something other than ... and ...

If you can't, I would suggest you go and read up a bit :)
When Intel wants IN on something, they get IN on it :)

Also, just fyi, the i740 from Intel, back in the day before they exited the gfx market, was pretty damn good.

Too optimistic??? The same optimism was shown by certain enthusiasts when Larrabee was announced, and look where it is now, even after so many years. Experts agree on one thing: GPUs are far more complex to make than CPUs. So yes, 10 years down the line they just might have enough experience to build NVIDIA/ATI-beating hardware, but do not expect their 1st or 2nd iteration to be revolutionary.....

And by which standard do you say that the GMA was ever competitive??? It was always the worst compared to NVIDIA/ATI......
 
It was already planned to launch late, and now it's delayed...

At first, I heard that they would produce GPUs with power close to an NVIDIA GTX 275, but only during 2011~2012. That would put them at roughly the IGP performance level of that time, and now it's getting delayed even more.

GOD help Larrabee.
 
iGo said:
Actually... NO. Larrabee was, or is, very much real. The only reason Intel wants to axe the current project is that it's past its time. In its current form, Larrabee is not only unfinished, but also uncompetitive with the current crop of GPUs.

It's a very, very vague comparison, but the recently unveiled 48-core processor prototype takes a similar approach to Larrabee: multiple x86 cores, with the OGL and DX pipelines handled in software for graphics processing. x86 instructions are well established, and there isn't much of a learning curve compared to learning and adapting to CUDA, so Intel would very much want to take advantage of its strong, established developer base. Let's just say that Intel was building its own kind of Fermi... if Fermi is a cGPU, then Larrabee would be a gCPU, if you catch my drift.

Intel will come back with Larrabee, but not as the current project or under the name we know. GPU-like parallel processing is a promising area for the HPC market, and Intel wouldn't want to give nVidia a comfortable head start there.

Yes, LRB exists as a real GPU. Intel showed real LRB boards and even showed a demo of working LRB.. NVIDIA is yet to show off a REAL Fermi.. :rofl:
 
@rajan and the stalker -- if you were to blow 10-20k of your blood, sweat and tears on a GPU, would it be an AMD-ATI/NVIDIA or an Intel?

Intel is a new player in the dedicated GPU market (if they enter it, that is). I would bet you wouldn't buy it -- you might consider it, but not buy it -- you would choose the safer route with ATI/NVIDIA.

Also, they may kill AMD in the CPU wars, but in the gfx and chipset wars Intel loses big time -- if you want an HTPC, would you choose Intel's onboard solution now or an AMD solution?

Hope these questions answer your questions!!!!
 
thebanik said:
Too optimistic??? The same optimism was shown by certain enthusiasts when Larrabee was announced, and look where it is now, even after so many years. Experts agree on one thing: GPUs are far more complex to make than CPUs. So yes, 10 years down the line they just might have enough experience to build NVIDIA/ATI-beating hardware, but do not expect their 1st or 2nd iteration to be revolutionary.....

And by which standard do you say that the GMA was ever competitive??? It was always the worst compared to NVIDIA/ATI......

Mr. OCer, please try not to mix up things that others might have said :p

I never spoke about the GMA IGPs.. I was referring to the i740 GPU

Also, mine isn't exactly misplaced optimism. Sadly, I lack the means to prove it.

What I can say is, Intel is trying to change the game here. Admittedly, it's not an easy task, especially for a firm which has been out of the market for as long as Intel has been.

The very first gen LRB product IS gonna be revolutionary, just not for churning frames in Crysis.

Someone at Intel said pretty much this ->

"First-gen LRB is not going to be a regular consumer product. It's going to be released internally/externally as an HPC SKU"

I'll say this again: Intel is not making just an uber GPU. LRB is something way beyond that :)

PS - you mention 'so many years'. LRB was announced sometime in '07 AFAIK.

sunny27 said:
@rajan and the stalker -- if you were to blow 10-20k of your blood, sweat and tears on a GPU, would it be an AMD-ATI/NVIDIA or an Intel?

Intel is a new player in the dedicated GPU market (if they enter it, that is). I would bet you wouldn't buy it -- you might consider it, but not buy it -- you would choose the safer route with ATI/NVIDIA.

Also, they may kill AMD in the CPU wars, but in the gfx and chipset wars Intel loses big time -- if you want an HTPC, would you choose Intel's onboard solution now or an AMD solution?

Hope these questions answer your questions!!!!

I would be able to answer that once they actually have a product in the market :)

So I'd rather wait and watch than write them off at this stage. Everyone wanted to write off ATI as well, just before they killed everything else with the 9800s... and again with the 48xx series.

Also, do you have any idea how big Intel's chipset and gfx market share is?

Don't jump to conclusions based on completely irrelevant facts, is all I say.
 
sunny27 said:
@rajan and the stalker -- if you were to blow 10-20k of your blood, sweat and tears on a GPU, would it be an AMD-ATI/NVIDIA or an Intel?
Intel is a new player in the dedicated GPU market (if they enter it, that is). I would bet you wouldn't buy it -- you might consider it, but not buy it -- you would choose the safer route with ATI/NVIDIA.
Also, they may kill AMD in the CPU wars, but in the gfx and chipset wars Intel loses big time -- if you want an HTPC, would you choose Intel's onboard solution now or an AMD solution?
Hope these questions answer your questions!!!!

Really? Intel holds 53% of the overall graphics market share, and that too with only IGPs.. :lol: As for chipsets, it's over 80%...
 
Intel's entry could actually hurt gamers... because they always play without price cuts, and they cut only when it's really necessary to make sales, as they have been doing in the CPU business.

If they enter the GPU business, then NVIDIA/ATI would see no reason to reduce prices frequently, as the competitor wouldn't be doing the same.

This is just my perspective. Just look at the i7, selling at almost the same price as when it launched, and look at the PII 955, which is selling for almost 5k less than its initial price. I can't stand to see the same happening with GPUs, with prices staying frozen for too long.
 
@Sunny27: You are thinking along one track and one track only... the gaming-centric GPU solution. Meanwhile, nVidia and Intel (with Larrabee) are looking to diversify. AMD is not openly investing too much in the GPGPU bandwagon, but is betting mostly on OpenCL.

If you're looking to buy a gaming card, then yes, your argument stands... but then again, neither Larrabee nor Fermi is out yet, and it's absolutely stupid to write off, or bet too much on, an unreleased product. You don't know how it performs until it performs.

Hope that makes some sense to you.

@dOm1naTOr: LOL... you do understand how price cuts happen, don't you? Why are Phenom II prices down and Core i7 prices not? Because Intel doesn't have any reason to reduce prices; AMD is not putting enough pressure on it to do so. Just look at HD 5xxx prices... because there is no pressure from nVidia, AMD/ATi can keep prices high. The 58xx series is selling around the $300+ price point. Of course, part of the price squeeze is also attributable to the TSMC f*-up, with AMD/ATi not able to produce enough cards to fulfill demand.

I don't see how, just because Intel is entering the market, it's going to change the price game. To be able to do that, Intel would first need to capture enough market share in discrete GPUs to call the shots on prices. Unlike CPUs... the consumer GPU business is a very, very cut-throat market, and it will be some time before Intel could capture that kind of market share.
 
stalker said:
Mr. OCer, please try not to mix up things that others might have said :p
I never spoke about the GMA IGPs.. I was referring to the i740 GPU

What has OCing got to do with any of this, man??? Do I hurt anyone by OCing and spending my own goddamn money on products and cooling? This is the 3rd time I have been referred to like that on TE, and most of the time it's sarcastic rather than encouraging.......

Oh, I'd never heard of the i740. It was released in 1998, it seems, and from some initial googling it looks like it was one of the major debacles for Intel, and worse than NVIDIA's and 3dfx's offerings of its time....

Top Tech Blunders: 10 Products that Massively Failed [Archive] - nV News Forums
 