51-70k Will 12 GB VRAM be enough for 1440p for 4-5 years?

Status
Not open for further replies.

ShashaParallax

Beginner
Efficiency is a non-negligible factor for me, hence I am debating between the 4070 and the 6800 XT. One is efficient, the other has more VRAM. My reasoning is that 12GB might last me 4 years or so, as requirements for games are usually dictated by the current generation of consoles, and this console generation should have about 4 years left; yes, games are already hitting 12GB at 1440p, but that might already be the limit for this generation's games.

But in the end I still need opinions from people with more experience with gaming and hardware.
 
games are usually dictated by the consoles of current generation

You already answered your own question. Consoles have 16GB of GDDR6 memory. 12GB is already hitting limits at the highest settings. If you keep your expectations in check and drop down to medium settings when needed, then maybe yes, you can stretch your 12GB card for the next 3-ish years.
16GB would be ideal.
 
I'll take VRAM over efficiency, and if efficiency is a concern, wait for the AMD refresh. Do you want to reduce textures for efficiency in future games? Nvidia labeled most of their 4000-series GPUs a tier above what they should have been, so now they are 'efficient'.
Also, the final 10-15% of performance usually takes much more power. In my case with a 3080, I can reduce my power limit by ~100W (from 320W to 225W, a 70% PL), undervolt a bit, and lose only 10-12% performance vs stock. Or reduce the power limit to 85% (272W) + undervolt with no performance loss vs stock. Or keep the limit at max (320W) + undervolt for about 15% more performance than at the 70% power limit. So basically, 40-45% more power going from 225W to 320W only gives roughly 15% more performance.
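The power-scaling claim above can be sanity-checked with a quick perf-per-watt calculation. This is a rough sketch: the performance numbers are the post's anecdotal estimates indexed to stock = 100, not benchmark results.

```python
# Perf-per-watt for the 3080 power profiles described above.
# Performance is indexed to stock = 100; values are the post's rough estimates.
profiles = [
    ("70% PL (225 W) + undervolt", 225, 89),    # ~11% below stock
    ("85% PL (272 W) + undervolt", 272, 100),   # matches stock
    ("100% PL (320 W) + undervolt", 320, 102),  # ~15% above the 70% profile
]

for name, watts, perf in profiles:
    print(f"{name}: {perf / watts:.3f} perf/W")

extra_power = (320 / 225 - 1) * 100   # ~42% more power...
extra_perf = (102 / 89 - 1) * 100     # ...for ~15% more performance
print(f"{extra_power:.0f}% more power buys only {extra_perf:.0f}% more performance")
```

The perf/W column drops steadily as the power limit rises, which is the usual shape of a GPU's voltage/frequency curve: the last slice of clock speed is by far the most expensive in watts.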

So you have that option, and the 6800 XT will probably be similar. In your place, if I were willing to wait a bit, I might wait for AMD to refresh the 6800 XT. Unfortunately, both Nvidia and AMD have been greedy and have not improved price/perf much this generation yet.
In my case, with a 3080 10GB, I will likely have to reduce textures to High or so for future games at 1440p, and in some cases may have no option to enable RT because of VRAM limits. Right now I think I am OK with that, but I won't buy a new GPU with these kinds of constraints at that kind of price. And I kind of dislike Nvidia now, especially because of the really shitty prices and/or compromised products below the 4090. The only con is that you don't have Zotac with a 5-year warranty for AMD. Gigabyte might have 4 years for some products, but verify.
 
The thing is, it highly depends on the game developers whether or not 12GB will be enough for 1440p. It's the unoptimized games that bring up this question. If we are looking at the current gaming market, then no, it won't be sufficient for the next 4-5 years. At most 2-3 years, but then again, personally I feel this is a bit exaggerated, as you can usually turn down settings without noticing visual differences. It also comes down to your usage. I play e-sports titles only, so 12GB would be sufficient for me for a good 4-5 years, possibly even more, lol, since I have an RX 570 4GB which is almost 5-6 years old now.
 
Nope, 12 GB will barely survive 6 months max. After that you have to use low settings at 1080p. Devs will soon be introducing a new 'potato' option for 12 GB cards, which will be lower than 'low' settings, and a new 'Ultra pro max' setting will be added for the mighty 16 GB cards.

On a serious note, who is stopping us from reducing a few settings? A few years ago, no GPU had the horsepower to run games at ultra settings, and still the GPUs ran fine for many years. The games had beautiful graphics too, such as Far Cry 4, Crysis, Dead Space, Assassin's Creed, etc.
What is the sudden compulsion in 2023 to max out everything?
Drop down a few settings and enjoy your game with zero loss in visuals while gaining 10-20 extra FPS.
 
Eh, 8GB can still run all games at 1440p + DLSS quite easily with everything maxed out. Even TLOU after the 1.04 update is easily playable. The PS5 has 16 gigs of shared memory, with roughly 3-4GB used for the OS, which roughly corresponds to the 12-gig VRAM limit. Honestly? Between those two I would pick the 4070 over the 6800 XT any time. DLSS is way underrated, even without talking about RT performance. In games like RDR2 it's a necessity to get a non-blurry image, and after DLSS 3.x.x (not the Frame Gen thing but the DLSS file itself) and the ability to set presets with it / enable DLAA in non-supported games too, it's a game changer for me, especially in RDR2. As much as I hate Nvidia, I would go with them just for DLSS, and the 4070 itself is a decent card, price notwithstanding.
 
You already answered your own question. Consoles have 16GB of GDDR6 memory. 12GB is already hitting limits at the highest settings. If you keep your expectations in check and drop down to medium settings when needed, then maybe yes, you can stretch your 12GB card for the next 3-ish years.
16GB would be ideal.
But the console's 16GB is shared with the CPU as well. If the OP can lower settings, then it may last 5-ish years.
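The console arithmetic in this exchange can be made explicit. A rough sketch: the PS5 OS reservation figure is this thread's estimate, not an official number.

```python
# Rough PS5 memory budget using the figures quoted in this thread.
TOTAL_GDDR6 = 16.0   # GB, unified pool shared by CPU and GPU
OS_RESERVED = 3.5    # GB, midpoint of the ~3-4 GB estimate (not official)

game_budget = TOTAL_GDDR6 - OS_RESERVED
print(f"Pool left for a game: {game_budget:.1f} GB")

# On PC, the CPU-side share of that pool lives in system RAM instead,
# so the GPU-side need tends to land at or below the 12 GB on a 4070.
```

That is why both "12GB roughly matches the console budget" and "the console budget is shared, so 12GB of dedicated VRAM goes further" can be true at the same time.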
 
On a serious note, who is stopping us from reducing a few settings? A few years ago, no GPU had the horsepower to run games at ultra settings, and still the GPUs ran fine for many years. The games had beautiful graphics too, such as Far Cry 4, Crysis, Dead Space, Assassin's Creed, etc.
What is the sudden compulsion in 2023 to max out everything?
Drop down a few settings and enjoy your game with zero loss in visuals while gaining 10-20 extra FPS.
This makes sense for someone who already has a card. It makes sense for me with a 3080 10GB; I am not going to pay more and buy another GPU until I really need to. I have so many old games to play too, and they clearly have no issue with VRAM. Metro: Last Light, which I am playing right now, uses 2GB.

But why would someone buying a new card go for a compromised GPU? Compromised perhaps so that Nvidia can protect its AI profits (just a wild guess; or perhaps planned obsolescence).
Game after game recently has given VRAM trouble. If even after all that someone wants to roll the dice, trading hardware for some software features, so be it.

Does someone buy a 60-80k GPU to play at lower settings? Even 40k, I guess, with the garbage 4060 Ti.
The market reception of the 4070 and 4060 Ti, from what I have heard, is very poor, and deservedly so.

 
Thank you everyone, I have taken everyone's input into account and will wait it out for now. What's a few more weeks when I've waited nearly 3 years already?
 
Thank you everyone, I have taken everyone's input into account and will wait it out for now. What's a few more weeks when I've waited nearly 3 years already?
If the 7800 XT does not work out: because this gen has been a bit of a flop so far in terms of price/perf and other issues, one could consider used GPUs too, which sell at a decent discount, if you can manage a good-condition card with at least a few years of warranty. A 3060 Ti for 20k, a 3080 Ti for mid-40s maybe (not sure). Recently a Zotac 3080 10GB with 4+ years of warranty sold for 42k (or lower if negotiated, I don't know). Or the AMD equivalents (they don't seem to sell much here). The lower price here can maybe justify the lower VRAM (and AMD cards don't have that issue). I don't have experience buying second-hand, though.

Flipkart had some sales (over now) for the Sapphire 6700 XT at mid-20s. If the warranty is there (which some say it is for these), then at that price I think you are getting maybe 70% of the performance at less than half the price. That becomes an easy buy until we get a better market, but right now I don't see it on FK at these prices.
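As a quick illustration of that value claim: the numbers below are hypothetical (a 60k card as the 100%-performance reference, and the mid-20s 6700 XT price and ~70% performance figure from the post, which are rough guesses, not benchmarks).

```python
# Price/perf comparison for the Flipkart 6700 XT deal described above.
# Prices in INR thousands; performance fractions are rough estimates.
ref_price, ref_perf = 60.0, 1.00    # hypothetical full-price 1440p card
deal_price, deal_perf = 26.0, 0.70  # Sapphire 6700 XT at mid-20s

ref_value = ref_perf / ref_price
deal_value = deal_perf / deal_price
print(f"Perf per 1k spent: reference {ref_value:.4f}, 6700 XT {deal_value:.4f}")
print(f"The deal delivers ~{deal_value / ref_value:.1f}x the perf per rupee")
```

Under these assumptions the 6700 XT delivers roughly 1.6x the performance per rupee, which is why a deal like this reads as an "easy buy" even in a bad market.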
 
But why would someone buying a new card go for a compromised GPU? Compromised perhaps so that Nvidia can protect its AI profits (just a wild guess; or perhaps planned obsolescence).
Game after game recently has given VRAM trouble.
You have to play at lower settings not because you have been deceived but because the GPU at that price point does not have the power to do ultra settings.
The 16 GB 6800/6900 XT or the 24 GB 3090 cannot do 4K Ultra either, because these lack the horsepower.
If even after all that, someone wants to roll the dice, trading hardware for some software features, so be it.
DLSS is awesome. It is what it is. Ray tracing does not look good in all games, but people like it. Also, it is not about software; it is primarily about what is available in the market. Where are the AMD cards from good brands? No, I am not going to buy Sapphire or ASRock just because of VRAM. It is because of pricing, availability and after-sales support that people are buying Nvidia. People prefer Zotac, Gigabyte, ASUS and MSI because the cards are better built and customer support is better. That's it.
Does someone buy a 60k-80k gpu to play with lower settings?
It is not 'lowered settings', it is optimized settings with zero or negligible loss in visual quality. Also, an 80k GPU cannot do 4K Ultra; not even a 4090 can do that in all games.
Another important point: if the game is not pushing cutting-edge visuals, then what is the compulsion to max it out?
I am playing Crysis (2007) these days and it still looks better than 90% of the games of today. It runs on 3 GB of VRAM, it has vast open levels which need a lot of time to cover on foot, and it has a destructible environment too.
Crysis looks photorealistic at Very High (Ultra) settings and shifts to a more stylized look towards Medium.
Crysis is the kind of game where you want to run Ultra, because you are getting photorealism. What is the point of Ultra in modern games? Most modern games look exactly the same at High and Ultra; sometimes even Medium is shockingly close to Ultra.
 
DLSS is awesome
Yeah, I have seen it work very well, and also seen a horribly blurry, messed-up version in two different games. The resolution was not 4K. And yes, FSR is much worse, especially at lower quality, where it's just plain unusable. DLDSR is also a good option in some cases; it worked very nicely with GTA V, which has horrible AA.

Ray tracing does not look good in all games but people like it
Yeah, but the image quality vs frame-rate impact is pretty bad so far from what I have seen. Reflections are the most obvious improvement, and whether it's worth it might be game-dependent. For the other stuff, so far you perhaps need a keen eye for details for the difference to matter. That said, once you don't have enough VRAM, DLSS 3/RT/textures will be compromised and one will have to reduce settings. Perhaps today it's OK, similar to High vs Ultra, but in the future it's always a risk. I saw that in an extreme way with the 1050 2GB vs 4GB. Obviously that was lower end, but the 4GB lasted much better than my 2GB.

Where are the AMD cards from good brands? No, I am not going to buy Sapphire or ASRock just because of VRAM. It is because of pricing, availability and after-sales support that people are buying Nvidia. People prefer Zotac, Gigabyte, ASUS and MSI because the cards are better built and customer support is better. That's it.
Yeah, you know best here.
A 5-year warranty was a major consideration for me too. That said, we have Gigabyte and ASUS for AMD too; the only question is getting one at a good price and with availability. Gigabyte + a 4-year warranty is something I would hope for whenever I purchase next, if I go AMD (probably will). No idea about ASRock, and the Sapphire situation might improve going forward; let's see. People in the US have high praise for Sapphire, so build quality is probably good generally; only the warranty side has to be fixed.

But again, buying an expensive VRAM-limited card is probably just as much of an issue, at least for me. I would probably downgrade and buy cheaper AMD GPUs (say a 6700 XT-6800 XT today) + a 3-year warranty + upgrade more often, rather than buy expensive knowing it has compromised hardware for god knows what reason.

It is not 'lowered settings', it is optimized settings with zero or negligible loss in visual quality. Also, an 80k GPU cannot do 4K Ultra; not even a 4090 can do that in all games.
It's 'optimized' today, and in the future you will have more and more cases where it's degraded rather than optimized. Also, what someone wants can differ; maybe 50-60 FPS is enough for someone. In VRAM-limited scenarios, AMD GPUs will hold up much better vs Nvidia. We can already see that happening for 8GB vs 12GB cards even at lower resolutions. Perhaps faster texture streaming through DirectStorage might mitigate it; who knows.
Anyway, let's agree to disagree. I don't see why I should accept limited VRAM at these price points, and I think many feel like me as well. I won't spend that much money on a new GPU while risking future issues, and I certainly won't accept Nvidia's pricing nonsense for the 4000 series. I don't really mind being a few years behind in terms of games either, so that also reduces the need. So at least demand from me will drop, and I will just downgrade to cheaper/used GPUs.

Another important point: if the game is not pushing cutting-edge visuals, then what is the compulsion to max it out?
I am playing Crysis (2007) these days and it still looks better than 90% of the games of today. It runs on 3 GB of VRAM, it has vast open levels which need a lot of time to cover on foot, and it has a destructible environment too.
Crysis looks photorealistic at Very High (Ultra) settings and shifts to a more stylized look towards Medium.
Crysis is the kind of game where you want to run Ultra, because you are getting photorealism. What is the point of Ultra in modern games? Most modern games look exactly the same at High and Ultra; sometimes even Medium is shockingly close to Ultra.
I agree man, this applies to games today. Future requirements can change; 10-12GB is at the limit today and is ideally suited for cards below 40k, not 80k (my opinion).
Anyway, by the same logic, I don't really need a 60-80k GPU at 1440p for that. I can make do with a 6700 XT or so; it has enough VRAM for its performance and should give 70-75% of the performance for a much lower price, without having to accept any BS from Nvidia.
 
Depends on the price class.

As a 3070 user at 1440p: yes, 12GB of VRAM is fine for 1440p for a few years from now. 8GB VRAM has barely started hitting the limit in some new games, and that will likely get worse with time. But something like a 4070 Ti is not good, as it's priced way too high to have just 12GB of VRAM.

For "budget" users, the 6700 XT is an easy choice. The 4070 is also fine with 12GB VRAM if you don't want to risk it with an ASRock or Sapphire 6800 XT.
 
Can someone here list the games which have started hitting the 8GB VRAM limit? I know Hogwarts Legacy is one of them, but has it been rectified with updates?

Also, for 1080p, how long, according to you guys, would say a 3060 Ti last? If it's not 4-5 years, then is it a plausible choice to get a used 3060 Ti for 20k, or to look for other higher-VRAM options?
 
Can someone here list the games which have started hitting the 8GB VRAM limit? I know Hogwarts Legacy is one of them, but has it been rectified with updates?

Also, for 1080p, how long, according to you guys, would say a 3060 Ti last? If it's not 4-5 years, then is it a plausible choice to get a used 3060 Ti for 20k, or to look for other higher-VRAM options?
Hogwarts Legacy, TLOU Part 1 & FH5 at extreme textures, all at 1440p. My 3070 has the raw performance, but I have to compromise on texture quality in those games. FH5 is fine with lower textures, but the other two had issues like texture pop-in in Hogwarts and poor 1% low FPS in TLOU for <8GB VRAM GPUs. You can see how even at 1080p, the 3060 has similar 1% lows to the otherwise superior 3060 Ti.

For 1080p, a used 3060 Ti at 20k seems solid, unless you get a used 6700 XT for a similar price. I can't predict how the 3060 Ti will perform in 5 years, but with lower textures it will be fine.

[attached benchmark screenshots]
 
Slightly off topic, but 16GB of RAM for 1080p has become insufficient in 2022.
Jedi Survivor uses 22-23GB just in the opening scene.
 

Attachments

  • Screenshot (17).jpg
  • Screenshot (16).jpg
So, one should opt for a minimum of 32GB of RAM when building a PC in 2023? Or is it limited to just 1-2 titles?
Almost every new title will be using more than 16.
Don't get me wrong, they may work with 16 too, but it's better to go for 32GB, especially when RAM (DDR4) is so cheap.
 
Slightly off topic, but 16GB of RAM for 1080p has become insufficient in 2022.
Jedi Survivor uses 22-23GB just in the opening scene.
This is true as well. I can confirm that TLOU was using 12-14GB for the game itself; you need another 4GB+ of RAM for the OS with no background apps. My PC was easily crossing 20+ GB of RAM without Chrome.

So, one should opt for a minimum of 32GB of RAM when building a PC in 2023? Or is it limited to just 1-2 titles?
For mid-range PCs, get 32GB of RAM. DDR5 prices aren't too high vs DDR4 now either. I saw a 32GB DDR5 6000MHz CL30 stick for about 10k.

I opted for 32GB because I want to keep the web browser in the background while playing games. On my older laptop, some games with background apps were consuming ~15GB of RAM. Paying 4-5k extra for 16GB more is not a big deal when spending 80k+, IMO.
 