GPU pricing trends

Why not get a 3080Ti at around 50-55k on the used market? Performance is not that much worse for a much lower price; the 4070 is definitely not a 55% performance increase over the 3080Ti. But it's understandable if you are not open to buying used GPUs.
From firsthand use, the main difference I found is power consumption: the card runs very cool compared to 30 series cards, and less heat = longer life. My card, when capped to 4K 60, does it with power consumption varying from 100-202W; the same game on the 3060Ti chewed a constant 180-200W while performing a bit worse.
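(For anyone who wants to check this on their own card: a minimal sketch like the one below, assuming the `nvidia-ml-py` / pynvml package is installed and you're on an NVIDIA card, logs the driver-reported board power once a second so you can watch the frame-cap effect yourself. Exact numbers will of course differ per card and game.)

```python
# Minimal GPU power-draw logger - assumes `pip install nvidia-ml-py` (pynvml)
# and an NVIDIA card with a recent driver. Readings are the driver-reported
# board power, the same figure tools like GPU-Z / HWiNFO show.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0          # mW -> W
        limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"power: {power_w:6.1f} W / limit {limit_w:.0f} W, GPU load {util:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```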
 
Fair point, the 40 series is much more efficient than the 30 series. But I don't think that will affect longevity all that much - at worst a hotter-running 30 series card lasts 10-12 years instead of 15+, since the cooling on most aftermarket 30 series cards is excellent anyway and they're kinda over-engineered (maybe except the 3090 :p). Personally I'd still get the 3080Ti and put the 25-30k difference towards the next GPU, but you do you :D
Enjoy your new GPU!
 
Why not get a 3080Ti at around 50-55k on the used market? Performance is not that much worse for a much lower price; the 4070 is definitely not a 55% performance increase over the 3080Ti. But it's understandable if you are not open to buying used GPUs.
What about a used 3090 non-FE?
 
Fair point, the 40 series is much more efficient than the 30 series. But I don't think that will affect longevity all that much - at worst a hotter-running 30 series card lasts 10-12 years instead of 15+, since the cooling on most aftermarket 30 series cards is excellent anyway and they're kinda over-engineered (maybe except the 3090 :p). Personally I'd still get the 3080Ti and put the 25-30k difference towards the next GPU, but you do you :D
Enjoy your new GPU!
Yeah, for me it was: use a GPU for 5 years, and also look at how much money I will end up paying in bills at month end. The money you end up saving on electricity in the long run works out to be around the amount you would save upfront on the purchase, plus there's less heat. That was my thinking behind the decision.

Edit: the heat factor was major in the decision making. Less heat means I can play longer hours with a quieter system and no need for AC. I did look at a 3080Ti, which I was offered for 44k; in the end it came down to whether I would be able to play for long hours comfortably without spending 2k extra on bills. I can confirm I had stopped playing high-graphics games on my 3060Ti (which I got during peak mining for 70k) due to the need for AC. I have spent around 30 hours gaming since the 4070Ti arrived.
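(Back-of-the-envelope version of the bill math, with every input an assumption - roughly 100W less draw, 4 hours of gaming a day, and a tariff of Rs 8/kWh; swap in your own numbers:)

```python
# Back-of-the-envelope electricity savings from a lower-power GPU.
# All inputs below are assumptions - substitute your own figures.
watt_saving = 100          # W less drawn by the newer card under the same frame cap
hours_per_day = 4          # gaming hours per day
tariff_inr_per_kwh = 8.0   # assumed electricity tariff in Rs/kWh

kwh_per_month = watt_saving / 1000 * hours_per_day * 30
monthly_saving = kwh_per_month * tariff_inr_per_kwh
print(f"~{kwh_per_month:.0f} kWh/month saved, ~Rs {monthly_saving:.0f}/month")
print(f"over 5 years: ~Rs {monthly_saving * 60:.0f}")
# Note: this ignores the AC. Pumping the extra heat back out of the room costs
# roughly another third of it in AC electricity (assuming a COP around 3),
# so the real-world saving in summer is somewhat higher.
```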
 
I had stopped playing high-graphics games on my 3060Ti (which I got during peak mining for 70k) due to the need for AC...
Haha that's exactly the reason I got an AC. Winters are best for gaming - the heat output keeps the room cosy


ZOTAC GEFORCE RTX 3080 AMP HOLO 10GB @ 43.5k
  • 10GB GDDR6X NON LHR
  • *Open Box With 1 Year Warranty
  • *No Return / Refund Allowed
This is a used item. 'Open box' typically refers to an unboxed item that is still new & unused, not one that has spent the last two years in a mining rig.

If this were an RMA'd item, it would have been mentioned.
 
Did anyone you know purchase it at that price? I have seen vendors quote really low prices on YouTube to pull a crowd to their stores, but they won't honor that price. They would say 'that stock is over' or 'the price went up again', blah blah. Same is the case with one such guy in Bangalore.
True, but some have mentioned in the comments that they bought items at the advertised prices. I guess you can only verify in person at the shop.
 

With GST, all-inclusive AFAIK. Even if it's without, still a hell of a deal!

Also, 3060Tis (especially Galax) have dropped to an all-time low.
I was really interested in the 3060Ti (even used ones have dropped by a lot), but the Hogwarts Legacy thing and games nowadays being more memory-hungry make me think that I might regret it later. F.
 
What res do you play at and what is your upgrade cycle?

If you play at 1080p, an 8GB card is fine for at least 2-3 more years. 1440p and above will need more than that soon if you play at higher settings.
 
I would have agreed a few months back, but recent games cannot do maxed-out 1080p on 8GB. You would still be able to get away with it by tuning the settings a bit, and most of the eye candy would be there.
 
There's a difference between allocated memory and actual usage. Games tend to allocate as much VRAM as possible even when it's not required; consider it buffer space that isn't in actual use until needed.
This is done so the game doesn't have to constantly swap data in and out of memory to load new textures when loading, say, a new area or level.
A perfect example of this is COD MW2, which has a slider for target memory allocation.

8GB cards are fine even if they need to swap data to compensate for not having oversized VRAM.

PS: Also, Ultra settings are not for playing games, they're for screenshots.
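(If anyone wants to see what the driver actually reports, a minimal sketch like this, again assuming pynvml is installed, prints total VRAM use and per-process reservations. Keep in mind these figures are allocations - exactly the allocated-vs-actually-used distinction above - so a big number here doesn't by itself mean the card has run out of memory.)

```python
# Show total VRAM and per-process VRAM reservations as reported by the driver.
# Assumes `pip install nvidia-ml-py`. Note: these are *allocations* - a game
# reserving most of the card does not mean it actively needs all of it.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used/total: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")

# Graphics (game) processes; on Windows/WDDM the per-process number may be unavailable.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    used_str = f"{used / 2**20:.0f} MiB" if used is not None else "n/a"
    print(f"pid {proc.pid}: {used_str} reserved")

pynvml.nvmlShutdown()
```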
 
Well, I invest to play at Ultra :) And yes, games allocate more; Forza Horizon 5 kept warning of low memory when maxed out at 1080p on the 3060Ti, to the point that the game started artifacting after 20-30 minutes of gameplay on both the 3060Ti and an RX 6600. With 10-12GB it ran fine up to 4K maxed. When looking for a new card it's always good to understand where the tech is headed before investing.
 
To be fair, then you invested in the wrong card.
To play at Ultra you need something like a 3080 12GB or above, and you have to keep changing the card to the next best thing every year or so.

Anyway, FH5 has some known memory allocation issues on the 3060Ti and 3070 which Nvidia acknowledged in their driver release notes. It's not related to the card or its VRAM, but the fix has to come from Microsoft.

I can't give you links as "proof", not that I need to, but you can Google it if you want.

Developers target midrange hardware because that's where most of the customers are. 8GB is plenty for 1080p high for the next few years, period.
 
I am on a 4070Ti right now; the 3060Ti I bought in 2021 on release is in my secondary rig now, so I put my money where my mouth is :p

It's not just FH5; Marvel's Guardians of the Galaxy, FC6 and a lot of other recent games can't do Ultra on 8GB. I am talking from experience. It may be a bug, but with more memory it goes away. The 4070Ti I got is not seeing max usage in any game at the moment, as I play at 4K 60fps only. It should last me 2 years with ease, with some adjustments needed in the 3rd year, when I would probably change it.
 
My 5700XT is doing fine in all those games at 1080p Ultra, including Hogwarts Legacy. But yes, its days are certainly numbered!
 
Well, I invest to play at Ultra :) And yes, games allocate more; Forza Horizon 5 kept warning of low memory when maxed out at 1080p on the 3060Ti, to the point that the game started artifacting after 20-30 minutes of gameplay on both the 3060Ti and an RX 6600. With 10-12GB it ran fine up to 4K maxed. When looking for a new card it's always good to understand where the tech is headed before investing.
Was your game on an HDD or SSD? Mine was on an HDD, and when I moved it to an SSD it ran butter smooth, and I'm using a 2070 Super for that.
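(Rough way to sanity-check the drive itself: the sketch below times a sequential read of whatever large file you point it at, e.g. one of the game's archive files - the path is up to you. HDDs top out around 100-150 MB/s sequential and much less when seeking, which is where texture-streaming stutter comes from; SATA SSDs manage roughly 500 MB/s and NVMe far more.)

```python
# Rough sequential-read throughput check. Pass the path of a big file on the
# drive you want to test, e.g. a multi-GB game archive. If the file was read
# recently, the OS cache can inflate the result, so use a cold file if possible.
import sys
import time

path = sys.argv[1]           # path to a large file on the drive under test
chunk = 8 * 1024 * 1024      # 8 MiB reads
total = 0
start = time.perf_counter()
with open(path, "rb") as f:
    while True:
        data = f.read(chunk)
        if not data:
            break
        total += len(data)
elapsed = time.perf_counter() - start
print(f"read {total / 2**20:.0f} MiB in {elapsed:.1f} s -> {total / 2**20 / elapsed:.0f} MiB/s")
```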
 