Graphics Cards: 2nd Price Cut Takes Effect, GTX 280/260 at $449/$299!

gannu said:
:rofl:

You haven't played it until now??? :rofl:

And getting the card just to play it; good gosh; :lol:

It's 50 bucks here; :|

lol, I have played it, but the "ahem" version :bleh:

I want the original to play on online servers :)

I was going to buy it, but it comes free with a card I already wanted, so it's a better deal :rofl:

Who cares how much the game costs the card manufacturers? If it's that cheap, why doesn't ATI give some good bundle with their cards? :P :rofl:
 
Okay, let's consider prices on Newegg: right now the 260 is selling for $270 after MIR and the 4870 is at $295. I'm taking US prices because they will be reflected here in some time.

LOL, is it, saar? :P

Even after 1 month, or even say 2... bet me on it... the GTX 260 won't even be reaching 20k :rofl:

Even the 9800GTX sells for 20k here... now show me the value in India for Nvidia. About KMD pricing, it will be something like this:

$300 + $8-10 shipping + $30 tax = $340 now :P... comes to almost 18.5k + local shipping or so. With local 4870 cards at around 19.5k, does it make sense?

In India it's still ATI FTW :D... and after the 4870 prices drop locally in 1-2 weeks, where will NV hide? :P

If someone can arrange a GTX 260 for me locally at even the 4870's current price, I will just shut up and rest my case :rofl:... what say, guys? :P

Or even if imax can buy it for me from Newegg, it's done :rofl:

And yeah, I don't want those frigging US (and that too Newegg) prices. Not even most of the people in the US can buy from them because of their crazy CC policies :P.
 
Lol, I wouldn't buy the card at $250 even with bundled crap like COD4 :rofl:
Some decent price drops. Still waiting and watching... just like I did last year before the G80 :ohyeah:
 
We understand your position, imax and robo: you have an Nvidia board with maybe one Nvidia card, so for you the current prices might look attractive. But for others who are on Intel chipset boards, going with ATI cards still makes more sense until there are further price cuts from Nvidia (thinking of SLI would cost us more, whereas we can buy a single ATI card for now, wait for further price drops, and buy another in the future).
 
The upcoming HD 4870 X2 will be better than GTX 280 SLI, or close to it, so even with these price cuts, AMD wins on price/performance! :hap5:
 
Supra said:
Good deal if below 17k.

Can you post the site you are getting the deal from?

He'd have meant the 9800GTX, Supratim ji; :|

I got a quote of 16.5k around a month back;
 
Supra said:
LOL, is it, saar? :P
Even after 1 month, or even say 2... bet me on it... the GTX 260 won't even be reaching 20k :rofl:

Even the 9800GTX sells for 20k here... now show me the value in India for Nvidia. About KMD pricing, it will be something like this:

$300 + $8-10 shipping + $30 tax = $340 now :P... comes to almost 18.5k + local shipping or so. With local 4870 cards at around 19.5k, does it make sense?

In India it's still ATI FTW :D... and after the 4870 prices drop locally in 1-2 weeks, where will NV hide? :P

If someone can arrange a GTX 260 for me locally at even the 4870's current price, I will just shut up and rest my case :rofl:... what say, guys? :P
Or even if imax can buy it for me from Newegg, it's done :rofl:

And yeah, I don't want those frigging US (and that too Newegg) prices. Not even most of the people in the US can buy from them because of their crazy CC policies :P.

Er... I don't get this, but $340 is only around 14-15k, right? And BTW, I got a quote of 16.3k from KMD for the $299 4870. So shouldn't I get around the same price for the 260?

I will be waiting for GTX 280 prices to drop again (hopefully) :hap2:. Maybe by then I will have the monies. :clap:
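
Just to sanity-check the conversion being argued over above, here is a quick back-of-the-envelope sketch. The exchange rate of roughly Rs 43 per USD is my own assumption (it is not stated anywhere in the thread), so plug in whatever the going rate is.

[CODE]
// Back-of-the-envelope landed cost for an imported card, to sanity-check the
// figures in the thread. The exchange rate below is an assumption, not a quote.
#include <cstdio>

int main()
{
    const double card_usd     = 300.0;  // Newegg price mentioned above
    const double shipping_usd = 10.0;   // upper end of the $8-10 shipping estimate
    const double customs_usd  = 30.0;   // the "$30 tax" from the same post
    const double usd_to_inr   = 43.0;   // assumed rate; adjust to taste

    const double landed_usd = card_usd + shipping_usd + customs_usd;  // $340
    const double landed_inr = landed_usd * usd_to_inr;                // ~Rs 14,600

    std::printf("Landed cost: $%.0f ~= Rs %.0f\n", landed_usd, landed_inr);
    std::printf("Rate needed for it to reach Rs 18,500: %.1f INR per USD\n",
                18500.0 / landed_usd);
    return 0;
}
[/CODE]

At that assumed rate the $340 total works out to roughly Rs 14,600, which is where the 14-15k estimate comes from; the 18.5k figure only holds if the effective rate or the extra duties are much higher.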
 
HD 4870 beats GTX 260, agreed... no question about that.

BUT how the heck does 4850 CF beat the GTX 280? :S

Well, in many games 4850 CF fails to kick in its second card in-game... that's why CF fails with Crysis.

No point going for HD 4850 CF over the GTX 280.

And see, the 4870... it's a real power eater... it needs a lot of power, and power requirements have always been an issue with ATI cards...
 
Vandal said:
^^About this I must say something. Do you even know what the GITG and TWIMTBP programs started by ATi and NVIDIA are all about? It's all about garnering developer support, supporting and working with developers in turn, and working with feedback from the developer community. The purpose isn't to bias a game so that it works improperly on a particular piece of hardware while working flawlessly on another. The main aims are support and feedback, and YES, optimisations do occur. But there are exceptions to optimisations too. Remember a superb-looking game called Call of Juarez? Don't? Why don't you read up on it first before 'resting' your case?

If that is the case, why was DX10.1 removed from Assassin's Creed? Is it not because of the superiority the Radeons were enjoying in this title?

It's not only f***ing developer support; it's also a matter of dollars. I have no problem with the TWIMTBP program if it makes a difference for the end user for the better, but why the hell stifle technological advances like DX10.1, when everybody knows it corrects the completely f***ed up DX10 that, in my opinion, was deliberately broken by MS to please Nvidia?

While it's not talked about much, it seems that Nvidia's G80 and the architectures based on it are not suited to the subtleties of DX10.1. Instead, G80 and its descendants rely on brute force, as is evident from the massive shader power and humongous memory requirements of the new GeForce cards. Nvidia will never do DX10.1 on this architecture. It will say this API is not needed or does not bring any benefits to developers, but that's not true.

Now, I am not asking you to believe me. Read it on AnandTech:

I know many people were hoping to see DX10.1 implemented in GT200 hardware, but that is not the case. NVIDIA has opted to skip including some of the features of DX10.1 in this generation of their architecture. We are in a situation as with DX9 where SM2.0 hardware was able to do the same things as SM3.0 hardware albeit at reduced performance or efficiency. DX10.1 does not enable a new class of graphics quality or performance, but does enable more options to developers to simplify their code and it does enhance performance when coding certain effects and features.

It's useful to point out that, in spite of the fact that NVIDIA doesn't support DX10.1 and DX10 offers no caps bits, NVIDIA does enable developers to query their driver on support for a feature. This is how they can support multisample readback and any other DX10.1 feature that they chose to expose in this manner. Sure, part of the point of DX10 was to eliminate the need for developers to worry about varying capabilities, but that doesn't mean hardware vendors can't expose those features in other ways. Supporting DX10.1 is all or nothing, but enabling features beyond DX10 that happen to be part of DX10.1 is possible, and NVIDIA has done this for multisample readback and can do it for other things.

While we would love to see NVIDIA and AMD both adopt the same featureset, just as we wish AMD had picked up SM3.0 in R4xx hardware, we can understand the decision to exclude support for the features DX10.1 requires. NVIDIA is well within reason to decide that the ROI on implementing hardware for DX10.1 is not high enough to warrant it. That's all fine and good.

But then PR, marketing and developer relations get involved and what was a simple engineering decision gets turned into something ridiculous.

We know that G80 and R600 both supported some of the DX10.1 feature set. Our goal at the least has been to determine which, if any, features were added to GT200. We would ideally like to know what DX10.1-specific features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us to wonder why we find ourselves even having to respond to this sort of a statement. Let's start with why NVIDIA's official position holds no water and then we'll get on to the bit about what it could mean.

The statement that multisample readback is the only thing some developers are interested in is untrue: cube map arrays come in quite handy for simplifying and accelerating multiple applications. Necessary? No, but useful? Yes. Separate per-MRT blend modes could become useful as deferred shading continues to evolve, and part of what would be great about supporting these features is that they allow developers and researchers to experiment. I get that not many devs will get up in arms about int16 blends, but some DX10.1 features are interesting and, more to the point, would be even more compelling if both AMD and NVIDIA supported them.


Next, the idea that developers in collusion with ATI would actively try to harm PC gaming and frustrate gamers is false (and reeks of paranoia). Developers are interested in doing the fastest, most efficient thing to get their desired result with as little trouble to themselves as possible. If a technique makes sense, they will either take it or leave it. The goal of a developer is to make the game as enjoyable as possible for as many gamers as possible, and enabling the same experience on both AMD and NVIDIA hardware is vital. Games won't come out with either one of the two major GPU vendors unable to run the game properly, because it is bad for the game and bad for the developer.

Just like NVIDIA made an engineering decision about support for DX10.1 features, every game developer must weigh the ROI of implementing a specific feature or using a certain technique. With NVIDIA not supporting DX10.1, doing anything DX10.1 becomes less attractive to a developer because they need to write a DX10 code path anyway. Unless a DX10.1 code path is trivial to implement, produces the same result as DX10, and provides some benefit on hardware supporting DX10.1, there is no way it will ever make it into games. Unless, that is, there is some sort of marketing deal in place with a publisher to unbalance things, which is a fundamental problem with going beyond developer relations and tech support and designing marketing campaigns based on how many games display a particular hardware vendor's logo.

The idea that NVIDIA is going to somehow hide the capabilities of its hardware from AMD is also naive. The competition, through the use of X-rays, electron microscopes, and other tools of reverse engineering, is going to be the first to discover all the ins and outs of how a piece of silicon works once it hits the market. NVIDIA knows AMD will study GT200, because NVIDIA knows it would be foolish for them not to have an RV670 core on their own chopping block. AMD will know how best to program GT200 before developers do, and independently of any blanket list of features we happen to publish on launch day.

So who really suffers from NVIDIA's flawed policy of silence and deception? The first to feel it are the hardware enthusiasts who love learning about hardware. Next in line are the developers because they don't even know what features NVIDIA is capable of offering. Of course, there is AMD who won't be able to sell developers on support for features that could make their hardware perform better because NVIDIA hardware doesn't support it (even if it does). Finally there are the gamers who can and will never know what could have been if a developer had easy access to just one more tool.

So why would NVIDIA take this less than honorable path? The possibilities are endless, but we're happy to help with a few suggestions. It could just be as simple as preventing AMD from getting code into games that runs well on their hardware (as may have happened with Assassin's Creed). It could be that the features NVIDIA does support are incredibly subpar in performance: just because you can do something doesn't mean you can do it well and admitting support might make them look worse than denying it. It could be that the fundamental architecture is incapable of performing certain basic functions and that reengineering from the ground up would be required for DX10.1 support.

NVIDIA insists that if it reveals its true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD, who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?

And Nvidia hopes that we believe these lies...

AnandTech: NVIDIA's 1.4 Billion Transistor GPU: GT200 Arrives as the GeForce GTX 280 & 260
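
To make the excerpt's point about every title needing a DX10 code path anyway a bit more concrete, here is a minimal sketch of the usual "try 10.1, fall back to 10.0" device setup. It assumes a Windows/MSVC build with the DirectX 10.1 headers; the helper name CreateBestDevice is made up for illustration and is not from the article.

[CODE]
// Minimal sketch: ask the driver for a D3D10.1 device first, then fall back to
// plain D3D10.0. The 10.0 path is the one every game must ship regardless;
// 10.1 extras (multisample readback, cube map arrays, per-RT blends) are opt-in.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateBestDevice()
{
    ID3D10Device1* device = nullptr;
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // preferred: full DX10.1 feature set
        D3D10_FEATURE_LEVEL_10_0    // fallback: the baseline DX10 path
    };

    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                      // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,
            nullptr, 0,                   // no software rasterizer, no flags
            level,
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            break;                        // highest level the driver will give us
    }
    return device;                        // caller checks for nullptr and GetFeatureLevel()
}
[/CODE]

Because the fallback branch has to exist anyway, a 10.1-only optimisation has to be nearly free for a developer to bother with it, which is exactly the ROI argument the article makes.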
 
junkiedogg said:
Er... I don't get this, but $340 is only around 14-15k, right? And BTW, I got a quote of 16.3k from KMD for the $299 4870. So shouldn't I get around the same price for the 260?

:S

You say you got a quote of 16.3k for a $299 card, so how can you expect a $340 card for 14-15k?? :S

Errrm, did you mean those rates are applicable for US peeps?? :)
That's true indeed;

Right now, the GTX 260 may be cheaper than the ATI 4870 in the US due to the MIR;

But that doesn't apply to India; :'(

The 9800GTX still retails at the 18.5k price point, as reiterated so many times by Supra;
 
i_max2k2 said:
Also, for some reason ATI cards are not able to scale well in Crysis. To me it's the most stressful game right now and the best gauge for a video card; if there were more titles like these, the reviews would have been a different story!

lol... It's the worst game to test cards with, because with Crysis you never know where the bottleneck is. IMHO, the game is just horribly coded. Didn't a Crytek exec say a few weeks ago that they can improve Crysis performance on a lot of GFX cards, but that it requires some serious recoding? They said they will take the necessary steps with Crysis: Warhead.

Now as for Crysis quirks, see this:

[benchmark charts not shown: Crysis FPS for the 4870 X2, 4870 CF, 9800GTX SLI and 9800GX2 at 1680x1050 and 1920x1200]

Check out the FPS count for the 4870 X2 and 4870 CF at 1680x1050 and 1920x1200. At the lower res the cards actually have a slightly lower FPS count :P

Also see the FPS count for 9800GTX SLI and compare it with the 9800GX2 (2x 8800GTS) across the three resolutions.

The scaling for all the cards is very uneven... you can't pinpoint a bottleneck in this scenario, can you?
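
Since the charts did not survive, here is a toy illustration of the reasoning above; every FPS number in it is a made-up placeholder, not a figure from the missing benchmarks. The idea is simply that if average FPS barely moves (or even rises) when you step up the resolution, the GPU is not the limiter, so uneven multi-GPU scaling in Crysis tells you very little about the cards themselves.

[CODE]
// Toy bottleneck check: compare FPS at two resolutions. All numbers are
// hypothetical placeholders purely for illustration.
#include <cstdio>

struct Run { const char* setup; double fps_1680; double fps_1920; };

int main()
{
    const Run runs[] = {
        {"hypothetical single card", 34.0, 27.0},  // FPS drops with resolution -> GPU-bound
        {"hypothetical CF/SLI pair", 36.0, 37.0},  // FPS flat or rising -> CPU/engine-bound
    };

    for (const Run& r : runs)
    {
        const double ratio = r.fps_1920 / r.fps_1680;  // near or above 1.0 = not GPU-limited
        std::printf("%s: 1680x1050 = %.0f fps, 1920x1200 = %.0f fps, ratio = %.2f -> %s\n",
                    r.setup, r.fps_1680, r.fps_1920, ratio,
                    ratio > 0.95 ? "likely CPU/engine limited" : "likely GPU limited");
    }
    return 0;
}
[/CODE]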
 
KMD quoted 16,600 for the XFX GTX 260 with 2 free games, COD4 and Assassin's Creed. That's an awesome deal; I am getting it :ohyeah: :bleh:!!
 
@morgoth
The sun is going to shine brighter on ATI's side because Saga, EA & Blizzard are launching DX10.1 games in Q4. Expect some more low blows for the GT200 & NV's fancy DX10. :@:
BattleForge, I've heard, is coming from EA. :D:
 
XFX didn't sell enough 9800GTX cards, so what they are doing now is including the 9800GTX's bundled games with the GTX 260, since the 9800GTX is now too dirt cheap to justify any bundle... What a way to compete with the 4870... Wah wah.
:@:
 