The Nvidia Fermi / GTX 4XX Thread

Status
Not open for further replies.
spindoctor said:
Nvidia has confirmed that the card they showed in the pictures was a mockup, but they did have one demo at the conference running on real Fermi silicon.

I am not so sure about that. I'll believe it when they show a working card. I have been doubtful about the card's existence ever since I read the specs. If the specs are correct, then I would say that retail cards are at least 6-8 months away. The launch event was a typical strategy employed under desperate circumstances.
 
Nvidia plans three Fermi cards at launch

High end, dual and performance

We learned that towards the end of 2009, Nvidia plans to launch three Fermi-architecture-based cards. The one that Nvidia talked about is the high-end single-chip card, said to have 16 clusters and a total of 512 shader cores. Let's call it Fermi GTX.

The dual card will naturally have two of these chips, and likely a bit fewer than 2x512 shader cores, simply because it has to operate within a TDP of around 300W. Heat is a big enemy of semiconductors, and once you get up to 3.1 billion transistors, even with small 40nm gates you still end up with a lot of heat. If we had to come up with a codename for easier reference, it would be Fermi GX2.
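A quick back-of-envelope sketch of why the dual card can't simply be two full chips. All figures below are assumptions for illustration only (the single-card TDP and board overhead are hypothetical, not confirmed Fermi specs); only the ~300W ceiling comes from the article:

```python
# Back-of-envelope check of the dual-card power argument above.
# All numbers are assumptions for illustration, not confirmed specs.

SINGLE_CARD_TDP_W = 225    # hypothetical TDP of the full 512-core single card
DUAL_CARD_BUDGET_W = 300   # approximate ceiling for a dual-GPU board (from the article)
FIXED_OVERHEAD_W = 40      # assumed board overhead (memory, VRMs, fan)

# Power left over for the two GPU dies themselves
gpu_budget = DUAL_CARD_BUDGET_W - FIXED_OVERHEAD_W

# Rough power available per GPU on the dual card, vs. the single card
per_gpu = gpu_budget / 2
scale = per_gpu / SINGLE_CARD_TDP_W  # fraction of full-chip power available

print(f"Each GPU on the dual card gets ~{per_gpu:.0f} W, "
      f"about {scale:.0%} of the single card's budget")
```

Under these assumed numbers, each die only gets a bit over half the power of the single-chip card, which is why dual cards ship with units disabled and/or lower clocks rather than two full-speed chips.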

Fudzilla
 
Lord Nemesis said:
I am not so sure about that. I'll believe it when they show a working card. I have been doubtful about the card's existence ever since I read the specs. If the specs are correct, then I would say that retail cards are at least 6-8 months away. The launch event was a typical strategy employed under desperate circumstances.

Apparently there was a CUDA astrophysics demo during the conference that ran on working Fermi silicon. The fact that they don't have a card you can slot into your motherboard right now doesn't mean they don't have anything working. I'm pretty sure they would not lie about the specs.

It's obvious they are in trouble, though. ATI had working cards back in August, I think, while Nvidia is not expected to launch properly for another 3-4 months, so they are definitely losing market share to AMD/ATI. Having said that, let's not forget just how ambitious Fermi actually is. They are not just creating a faster 3D accelerator than ATI; this thing they are making is a computational monster. Fermi might flop, or it might revolutionize the way we think about GPUs. In any case, it is being said that the GT300 will be 20-30% faster than the 5870 (although I believe this is all theoretical speculation). I guess that might be worth the wait for some.
 
^^ Well, that's why I said that retail cards would be at least 6-8 months off. Even if they have working silicon, it does not mean they are ready to ship retail cards (especially if yields are low).

Also, I think there is a shift in nVidia's strategy with regard to how their chips are positioned. I think they are moving away from the GPU market and taking GPGPU more seriously (so more high-performance parallel-computing chips than simple GPUs). That makes sense, given how Intel and AMD have been gradually squeezing nVidia out of the desktop GPU market.

From a business perspective this is a great move to secure nVidia's future, and in fact I was expecting this from nVidia sooner rather than later, but it's not so great as far as the GPU market is concerned. They are adding features that a GPU does not really need, which is going to incur unnecessary cost. This is rather like Sony putting something like the Cell processor in a game console: a great chip with massive processing power, but mostly unused in a game console while adding considerable cost (both financially and in terms of power consumption).

Also, if GT300 is only 30% faster than the 5870, that's really a bad thing for nVidia, considering AMD actually positions its X2 series as the high-end solution to compete with nVidia's high-end GTX x80 series. For example, the 4870 X2 was the competition for the GTX 280.
 
Ray-tracing included

You have to love the Peoples Republic of China for all the leaks that it provides. This time, the good chaps from Pczilla have leaked some "lifelike" images that have supposedly been rendered on Fermi.
We have no idea whether these are pre-rendered or generated in real time, but the human face is definitely a multilayer-rendered image with a thus far unseen level of detail. The beard looks great, and you can even see the fat on the face, which makes for a much more realistic image.

The third image is a ray-traced demo that looks impressive, and even though the average Joe might not notice it, the lighting in the scene looks great. You can easily see multiple light sources scattered throughout the scene, including several point light sources and an area light source with different light temperatures.

The shadows, reflections and highlights look highly realistic, and you basically get that soft, fuzzy global-illumination look of offline rendering. It might take a few years before we get such realism in games, but we're obviously getting there.

Fudzilla - Fermi lifelike rendered images spotted

Pictures are here:

NVIDIA Fermi (GT300) ray-tracing demo - PCZILLA hardware forum (Powered by Discuz!)
 
Nvidia Corp. has confirmed in a brief media interview that it will be able to cut down the Fermi G300 graphics processor in order to address certain specific markets and price points. The move is natural for all graphics chip designers, but this time Nvidia openly admits that many of the implemented capabilities will hardly offer benefits to consumers right away.

“We are paying a bit of a compute tax in that we launched a part where a lot of the consumer compute applications haven’t really taken hold yet. But over time as more consumer computer applications are developed that take advantage of our compute (consumer) features, I think it's going to give us a big leg up,” William Dally, chief scientist at Nvidia, told Cnet News.com.

Nvidia Can Disable Certain Fermi Features on Gaming Graphics Cards - X-bit labs

PS: We might never see the full performance of GT300 on a gaming card. The full-fledged GT300 may be reserved for professional workstation/GPGPU use.
 
All this is boring. Why don't they release gaming benchmarks so that we can see how it matches up against the 5800 series? It's already November, FFS.
 
They have lost the lead; hope they come back with a bang.

March 2010 is the expected arrival date for the performance/mainstream cards.
 
I too was told by very reliable sources that Fermi is up and running Win 7 nicely :p Cannot comment on the performance, but Fermi is for real :p
 
stalker said:
All I can say is that I have seen the real thing working :eek:hyeah:

Does it have so many wires protruding from it that Nvidia can't even show it to the press? That was Nvidia's argument when Jen-Hsun waved that fake Fermi board... please confirm?

Also, according to Charlie, Nvidia has seven working Fermi chips. Did you see one of them or was it the eighth one?

Also, some hint/leak would be nice. For now, even some info about the board, cooling system, or power connectors will do :p
 