More rumors on nVidia G71, G80 & ATi R600 :)

Rahul said:
Dude, increasing die size is not that simple... why hasn't AMD launched a dual core on 110nm? :) As the technology shifts toward smaller manufacturing processes, they will try to integrate two GPUs in a single processor, and maybe multithreading will also come into the gaming scene...

I think AMD didn't go dual core at 110nm because of power restrictions... a 90nm process means lower power consumption. They knew the transistor count would increase plenty once they added a second core... so they waited for 90nm so that both the physical die size and the power requirements would stay under control.

Quad core will happen at 65nm, because a die shrink (considered a simple process) from 90nm to 65nm almost halves the physical die area, which means you can pack in even more transistors... god, those 10xx pins must connect to something :D
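As a rough sanity check of that "almost half" claim: under ideal scaling, die area goes with the square of the feature-size ratio, and (65/90)² ≈ 0.52. A minimal sketch (the 200 mm² die is a hypothetical number, and real shrinks fall short of this ideal because pads and analog blocks don't scale linearly):

```python
# Ideal-scaling estimate: die area scales with the square of the
# feature-size ratio. Real-world shrinks recover somewhat less area.
def shrunk_area(area_mm2, old_nm, new_nm):
    return area_mm2 * (new_nm / old_nm) ** 2

# Hypothetical 200 mm^2 die shrunk from 90 nm to 65 nm:
print(round(shrunk_area(200, 90, 65), 1))  # 104.3 -> close to half
```

So the same design lands at roughly 52% of its old area, leaving room to double the transistor budget in the same footprint.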
 
I don't think the cores would be bandwidth starved if Nvidia uses an on-die memory controller, something like what ATI is doing on its R5xx series.
Also, manufacturing a 512-bit core will be insanely expensive, not to mention coupling it with ultra-fast memory chips. I doubt Nvidia will do that.

In any case, I think ATI has the future outlook right: pixel shaders are going to be the forte of next-generation games. You will see long shaders for practically every pixel 4-5 years down the line.
 
kidoman said:
I think AMD didn't go dual core at 110nm because of power restrictions... a 90nm process means lower power consumption. They knew the transistor count would increase plenty once they added a second core... so they waited for 90nm so that both the physical die size and the power requirements would stay under control.

Quad core will happen at 65nm, because a die shrink (considered a simple process) from 90nm to 65nm almost halves the physical die area, which means you can pack in even more transistors... god, those 10xx pins must connect to something :D

Isn't that what I said? ... When I said 110nm to 90nm, that implied all the benefits that came with 90nm manufacturing... do some googling and you will see... smaller die size was the main factor for going 90nm, because at 110nm the die would have been so big it wasn't practical: from a single wafer they could get very few processors, so it would be too expensive... power dissipation was a reason too, but not the major one... it was icing on the cake :)
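The wafer-economics point above can be sketched with a standard gross-dies-per-wafer approximation (the wafer and die sizes below are illustrative numbers, not actual AMD figures, and this ignores defect yield):

```python
import math

# Common gross-die-per-wafer approximation, with a correction term
# for dies lost at the wafer edge:
#   DPW ~= pi*d^2 / (4*A) - pi*d / sqrt(2*A)
def dies_per_wafer(wafer_mm, die_mm2):
    d = wafer_mm
    return int(math.pi * d**2 / (4 * die_mm2)
               - math.pi * d / math.sqrt(2 * die_mm2))

# Same hypothetical design on a 300 mm wafer, before and after a
# ~0.52x area shrink (e.g. 90 nm -> 65 nm):
print(dies_per_wafer(300, 200))  # 200 mm^2 die -> 306 gross dies
print(dies_per_wafer(300, 105))  # 105 mm^2 die -> 608 gross dies
```

Roughly twice as many candidate chips per wafer, which is exactly why a big die on the older process is so much more expensive per part.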
 
So DX10 and SM4 finally pop their heads up with G80. The ATI guys will have more problems with this. Dual-core GPUs... man, NVIDIA does take risks every now and then. Looking at their track record of getting away with it, I'd say they should have a winner on their hands. If true, they must have had good reasons to go the dual-core way. Also with G80, I guess SLI in its brand new avatar will roll out, which makes sense with AM2 also being around the corner.
 
Aces170 said:
I don't think the cores would be bandwidth starved if Nvidia uses an on-die memory controller, something like what ATI is doing on its R5xx series.
Also, manufacturing a 512-bit core will be insanely expensive, not to mention coupling it with ultra-fast memory chips. I doubt Nvidia will do that.

In any case, I think ATI has the future outlook right: pixel shaders are going to be the forte of next-generation games. You will see long shaders for practically every pixel 4-5 years down the line.

OK, you tell me something... where exactly is the memory controller in current Nvidia cards? :ohyeah:
 