Graphics Card Bottleneck or Not?

Status
Not open for further replies.
antarcticprince said:
Anyway, I finally got a quotation:

Phenom II 550 - 5.1k
Gigabyte 785GMS2H - 4.3k
LG DVD RW - 1.1k
500GB HDD - 2.5k
CM 310 - 1.5k
Corsair 450W - 3.8k
GTX 260 core 216 - 10.8k

I am getting a rebate of 5400 on my old system, so it brings the entire budget to 23k. He he!!

No RAM ?? :O

Cut the graphics card down to an MSI/Sapphire ATI HD 4850 1GB for 7.3k and spend the remaining 3.5k on 3GB of Corsair XMS2 DDR2 800MHz RAM. :) With a 4850, you can play all games at max settings @ 1440x900.
Anubis said:
1. Yes it will be.
Dude, seriously, it lacks L3 cache; it's a huge bottleneck.

2. You gotta be joking.
How can he not feel the difference between 40fps and 60fps?

20fps difference

1. I think there's been a misconception. Lack of L3 cache doesn't cause a bottleneck, not in gaming. A bottleneck is mainly caused by processor speed and a few other things, although it's true that the presence of L3 cache does improve general/non-gaming computing scenarios significantly. And of course I'll suggest a Phenom II X2 550 over an Athlon II X4 620 for gaming until most game engines start to utilize 4 cores, and I don't see that happening in the near future.

2. No, you won't see any difference between 40fps and 60fps. But we like to keep the average fps high for a different reason: the fps value varies significantly in texture-heavy games, and a game that runs at 40fps average will have a few frames where the rate drops to ~20fps (the actual figure depends on the game). Thus the higher the average fps, the higher the lowest fps (mathematically the converse is the TRUTH :P).
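The avg-vs-minimum relationship above is easy to see from frame times. A minimal sketch - the per-frame render times here are invented purely for illustration, not measurements from any real game:

```python
# Hypothetical per-frame render times in milliseconds for two runs of
# the same scene; the numbers are made up for illustration only.
run_low_avg = [25, 25, 25, 50, 25, 25]    # one heavy frame dips to 20 fps
run_high_avg = [16, 16, 16, 33, 16, 16]   # same dip pattern, higher baseline

def avg_fps(frame_times_ms):
    # average fps = frames rendered / total seconds elapsed
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    # instantaneous fps during the single slowest frame
    return 1000.0 / max(frame_times_ms)

# The run with the higher average also has the higher worst-case fps:
# avg_fps(run_low_avg) ≈ 34, min_fps(run_low_avg) = 20
# avg_fps(run_high_avg) ≈ 53, min_fps(run_high_avg) ≈ 30
```

Same dip pattern in both runs, but the faster run's worst frame is still ~30fps, which is the whole point of chasing a higher average.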
 
Guys, he's got a 19" monitor with a maximum resolution of 1440x900.

Spending on anything above an HD4850 512MB GDDR3 HDMI isn't worth it.

I would have agreed with the HD4870 512MB if the OP were planning to move to a 22" soon. Anything above this is USELESS!

EDIT:

@Sabby:

So one IS able to differentiate between 40fps average and 60fps average, right? You said so.

Anyway, I'm even able to easily differentiate between 40fps constant and 60fps constant. It's a HUGE difference, most evident in Race Driver: GRID, Counter-Strike and Unreal Tournament 3, and very evident in other FPS and racing games too, though it doesn't make a difference in RPG/RTS games.

An easy experiment anyone can try:

Play Counter-Strike with V-Sync on, i.e. 60fps constant.

Play Counter-Strike with V-Sync off, i.e. 100fps constant.

Huge difference.
 
sabby said:
No RAM ?? :O

Cut the graphics card down to an MSI/Sapphire ATI HD 4850 1GB for 7.3k and spend the remaining 3.5k on 3GB of Corsair XMS2 DDR2 800MHz RAM. :) With a 4850, you can play all games at max settings @ 1440x900.
1. I think there's been a misconception. Lack of L3 cache doesn't cause a bottleneck, not in gaming. A bottleneck is mainly caused by processor speed and a few other things, although it's true that the presence of L3 cache does improve general/non-gaming computing scenarios significantly. And of course I'll suggest a Phenom II X2 550 over an Athlon II X4 620 for gaming until most game engines start to utilize 4 cores, and I don't see that happening in the near future.

2. No, you won't see any difference between 40fps and 60fps. But we like to keep the average fps high for a different reason: the fps value varies significantly in texture-heavy games, and a game that runs at 40fps average will have a few frames where the rate drops to ~20fps (the actual figure depends on the game). Thus the higher the average fps, the higher the lowest fps (mathematically the converse is the TRUTH :P).

Nicely put.
 
Lots of wild stuff floating about nowadays.

550BE >>>> 620 in games - higher clocks work better than more cores, plus the L3 makes a big difference. The 550BE also clocks higher and more easily, providing a good boost in most games - I have mine running @ 3.8GHz and it runs everything maxed out on my Dell 3008WFP *at full resolution, 2560x1600* (I don't play Crysis).

CUDA for games? Which games use CUDA? The 5770 is currently the best bet. You will miss the PhysX effects in some games where they're locked down (like Batman: AA), but there's no difference in playability. I don't know how many devs are working with PhysX, but a lot are working with DX11, so when those titles arrive you're SOL if you get a GTX260 or anything else from the nVidia camp. Since this is a long-term buy, think carefully about the GPU.

There is no playable difference between 30fps and 60fps as long as there are no framerate drops. Unless you can actually aim and shoot in under 1/30 of a second, you will feel no difference.
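The 1/30-of-a-second point is just frame-time arithmetic; a quick back-of-the-envelope sketch:

```python
def frame_time_ms(fps):
    # how long each frame stays on screen at a steady frame rate
    return 1000.0 / fps

t30 = frame_time_ms(30)   # ~33.3 ms per frame
t60 = frame_time_ms(60)   # ~16.7 ms per frame
gap = t30 - t60           # ~16.7 ms: the extra per-frame delay at 30 fps
```

So the entire difference a player would have to perceive is roughly 17 milliseconds per frame.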

Movies are recorded at 23.976fps, FWIW - roughly the rate at which a series of frames starts to look like continuous motion.

Late as it may be: the 240 would bottleneck anything much beyond onboard graphics. Seriously. If you could clock it past 3.5GHz (and it's not tough), you have a shot at somewhat respectable performance. The 620 lags even the 240 in gaming. It's a nice multicore for a machine that needs it, but for gaming it is pretty average.
 
Where?

Why don't you order one from the TE dealers?

Even a 5850 comes in at 18k, which is far better than the 5770.
 
:eek: You have a good chip; mine does 3.4GHz @ 1.235V vcore (stock), but I run everything at stock volts these days as I torrent almost 24x7. :P
tosh79 said:
+1 for this combo; the E5200 + Gigabyte G31EMS2L are made for each other. I recently bought the same and it does 3.8GHz on stock cooling :hap2:
It will be just enough. To be safe I would've bought the VX550; some feel the VX450 is the KING and can handle anything, rofl! It is good to have some headroom.
antarcticprince said:
Will the Corsair 450W be enough for my PC? Especially with the GTX 260?
 
I don't understand why you guys are saying the X2 240 will "seriously" bottleneck a 4870 or GTX 260 Core 216, and one member even says it's only good for IGP. Agreed that a 550BE will be better for gaming and other stuff, but remember the L3 cache makes only about a 10% difference. That doesn't mean the X2 240 can't run games - in fact it's faster than an E5200 in games. Most importantly, there won't be any scenario where it can't provide comfortable FPS. When OCed, it can easily outperform a stock 550BE.
 
dvijaydev46 said:
I don't understand why you guys are saying the X2 240 will "seriously" bottleneck a 4870 or GTX 260 Core 216, and one member even says it's only good for IGP. Agreed that a 550BE will be better for gaming and other stuff, but remember the L3 cache makes only about a 10% difference. That doesn't mean the X2 240 can't run games - in fact it's faster than an E5200 in games. Most importantly, there won't be any scenario where it can't provide comfortable FPS. When OCed, it can easily outperform a stock 550BE.

+1

And to all those who say the X2 240 would be a heavy bottleneck for a GPU like the GTX 260 or 4870... please go and read some more reviews and don't mislead the OP.

please...

Go through this thread and you will know:

http://www.techenclave.com/graphic-cards/upgrading-gpu-will-cpu-cause-bottle-152488.html

and this: Intel E6850 Bottleneck Investigation | AlienBabelTech

This clearly shows that at high settings, which the GPU can actually handle [with more than enough fps for smooth gameplay], even an old Intel C2D E6850 downclocked to 2GHz isn't much of a bottleneck with a GTX 285. And you are saying an AII 240 @ ~2.8GHz would be a heavy bottleneck for a GTX 260... :rofl:

And don't start saying he will be playing at 1440x900 and the CPU becomes the bottleneck at lower resolutions. Because, you see, the E6850 @ 2GHz was easily able to pull more than enough fps @ 1920x1200; it can do many more fps @ 1440x900, where the GPU load is much lighter and the fps would be much better than @ 1920x1200.

And overclocking would only help further.

And of course it would limit the fps if you play at lower quality levels @ 1440x900, but even then the fps wouldn't be less than 100, where it could be 150 with an i5. But will anyone do that?

And lastly, I would suggest a 4870 512 over a GTX 260, simply because the price difference is huge and the performance difference is just < 8%.

Deltapage is selling the XFX 4870 512 for 8k + shipping, and there is a group order going on TE right now for the same.

And a VX450 should be more than enough for a GTX 260/4870. In fact you can run more peripherals and devices with it too, and it will handle them easily. The Tagan TG500, though it isn't as good as the VX450, is also a good choice and can run a 4870 nice and cool.
 
Thank you guys for all your comments and suggestions; I will post my new rig config in a few hours! :-)

Btw, some of you suggested the 5770. It has a 128-bit bus compared to the 448-bit bus of the GTX, so isn't the GTX with its GDDR3 faster than the 5770 with its GDDR5 and DX11 support? Also, there are still about two years left before DX11 goes mainstream and games actually implement it. Plus, before DX11 dominates the market, DX10 will still be available in games, just as DX9 is still an option despite DX10 having been on the market for around 2 years now. So going for an ATI is not good IMO, as nVidia will soon launch its monsters from the green vault and eat up the red ones. And in 4 years, when DX11 goes mainstream, I'll get a new card then and use my old card for PhysX! So thanks again!
 
Good value config, decent price.

antarcticprince said:
Anyway, I finally got a quotation:

Phenom II 550 - 5.1k
Gigabyte 785GMS2H - 4.3k
LG DVD RW - 1.1k
500GB HDD - 2.5k
CM 310 - 1.5k
Corsair 450W - 3.8k
GTX 260 core 216 - 10.8k

I am getting a rebate of 5400 on my old system, so it brings the entire budget to 23k. He he!!
 
@antarcticprince: you can't say that Fermi will eat up ATI. In fact there is news that Fermi has missed its clock targets. Until the cards hit the market, you can't say for sure which one will be faster. We can't say when DX11 will become mainstream, but it shouldn't take more than 1.5 years. In 4 years, who knows - there could be a DX12 or even 13.
 
antarcticprince said:
Will the GTX 260 be bottlenecked by an X2 240 @ stock?
If yes, then is the 4870 1GB a better buy?

Buddy, just go ahead and buy the 240. It's faster than an E5200, and you can overclock the 240 by a decent amount. The GTX260 and 4870 offer practically the same performance. They will take a hit of some 10-15% at worst. Go ahead with your buy and overclock it a bit; even if you don't, you will not notice much difference in terms of playability. If a PII X2 550 fits your budget, go for that instead.
 
@dominator: when you run a benchmark at 1920x it's very heavily GPU-dependent, so obviously there will be very little framerate drop. In the ABT link you posted, you will see that the size of the reduction is inversely related to the resolution - higher resolutions need more GPU, lower resolutions need more CPU.

In the end it's about balancing a setup, and with the kind of GPU options you see today at ~11k, the 240 will be a laggard (though it would probably be fine with something less than a 4850 and a budget 19" monitor). I kind of buy the point that 1440x900 is too low a resolution for hardware like this; if so, it's probably time to re-examine the setup.

Also remember these tests are for games today, not tomorrow. As horsepower requirements increase, the beefier setup will be able to keep up for a much longer period of time.

@TS: GDDR5 transfers twice as much data per pin per clock as GDDR3. So a 128-bit GDDR5 bus is effectively 256 bits wide, and the higher memory speed makes up for the apparent lack of bandwidth.
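The bandwidth arithmetic behind this can be sketched as follows - the 1000 MHz clock here is illustrative, not any specific card's spec:

```python
def bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
    # bytes/second = clock * transfers per clock * bus width in bytes,
    # then scaled from MB/s to GB/s
    return mem_clock_mhz * transfers_per_clock * (bus_width_bits / 8) / 1000.0

# GDDR3 does 2 data transfers per clock, GDDR5 does 4 - so at the same
# 1000 MHz memory clock, 128-bit GDDR5 matches 256-bit GDDR3:
gddr3_256bit = bandwidth_gb_s(1000, 2, 256)   # 64.0 GB/s
gddr5_128bit = bandwidth_gb_s(1000, 4, 128)   # 64.0 GB/s
```

Actual cards also differ in memory clock, so raw bus width alone tells you very little about real bandwidth.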

The GTX260 is over a year old and heavily outdated. I see you've made up your mind to go nVidia, but I can tell you it's a poor choice. The problem with DX10 was that it was a Vista-only option, and obviously devs were not going to bother with something the majority of the market rejected. Where DX10 was about eye candy, DX11 is about efficiency, and devs are all over it (releases starting Jan 2010), especially given that Win7 is selling well, even on old machines that were running XP. Your choice is a poor one because you're relying on conjecture, not evidence.

Also, about DX11, here is the list of upcoming titles: PC Perspective - AMD's list of DX11 titles is getting some 'company'. Starting next month.

PhysX is for effects; DX11 is core engine code. There's a pretty big difference between the two. Also, PhysX is a closed, proprietary standard, whereas DX11 is open (as open as something from Microsoft can be) - it's an industry standard, and PhysX is *not*. Guess which one developers will choose?

nvidia will soon launch its monsters from their green vault and eat up the red ones

Based on what information, links or facts? This is typical fanboy behaviour, and it will only lead you into a hole of despair - and one in your wallet. I bought my first ATI card last year, and I realised what I had been missing for years by sticking with the green guys. Wishing you all the best in your journey and your upgrade. Good luck.
 
^^ Yeah. But the point was: if a CPU can muster playable FPS at higher resolutions, it can at lower resolutions too. The second thing was, everyone was saying the 240 would "severely" bottleneck the likes of the 4870, GTX260 etc., and I wanted to make it clear that this isn't the case. Obviously it's a good idea to pair a beefier CPU with higher-end GPUs so the cards' full potential can be utilised, but I wanted others to understand that a 240 is no slouch at stock, that when OCed it can outperform more powerful CPUs, and that L3 cache isn't a game changer in AMD CPUs.
 
dvijaydev46 said:
^^ Yeah. But the point was: if a CPU can muster playable FPS at higher resolutions, it can at lower resolutions too. The second thing was, everyone was saying the 240 would "severely" bottleneck the likes of the 4870, GTX260 etc., and I wanted to make it clear that this isn't the case. Obviously it's a good idea to pair a beefier CPU with higher-end GPUs so the cards' full potential can be utilised, but I wanted others to understand that a 240 is no slouch at stock, that when OCed it can outperform more powerful CPUs, and that L3 cache isn't a game changer in AMD CPUs.
^+1

Yes, we all know that at low res it's more or less CPU-dependent. But say you get 70fps with a 240 and 110fps with an i7 975 - in terms of "playability" both are practically the same.
 
You will notice the difference between 40fps and 60fps in a few games like Counter-Strike and UT. But CS and UT3 can easily be handled at more than 100fps even by a single core.

The AII is way better than the older Athlons, with double the cache, higher clocks thanks to 45nm, and greater overclockability. The Intel E5xxx is equally good, but I would suggest the AII for gaming and the E5xxx for other work.

And a PII X2 550 or greater does give some improvement, but for a budget-constrained gamer [here 25k] it's more worthwhile to invest that amount in a better GPU. If he had a 30k budget, surely a better CPU could be suggested. And the AII 240 is really cheap @ 2.9k - you can't ask for more at that price. And you always have the option to upgrade the CPU in a few years to anything available in the AM3 package [and there will be plenty, unlike LGA 775]; the 25k constraint is today's situation, right?
 