Graphics Cards
Will GPUs with 8GB of VRAM struggle to run upcoming games?

Bennyboi72

Patron
Recruit
It seems like 8GB isn't enough for 1440p gaming and is borderline even at 1080p. Recent games like Hogwarts Legacy and The Last of Us have had issues with VRAM management. Is this the new trend, or are those games just poorly optimized?
 
I don't know about Hogwarts Legacy, but The Last of Us is indeed poorly optimized.

I think 8GB of VRAM is still okay; it's not like games will stop working anytime soon. Games these days list around 2GB of VRAM as the hard minimum and won't even start below that.

People usually misunderstand VRAM usage. By people I mean those who like to crank every in-game graphics setting to the max (hoping it will look better), then see that their top-of-the-line graphics card doesn't have enough VRAM and the FPS tanks ("I gotta get the card with more VRAM").

I recommend watching this video to get a little idea of how textures work: Link

Of course more is better if you are buying a new card, but anyone who already has an 8GB card shouldn't be worried; it still has plenty of life left. All they have to do is turn settings down a bit over time, and that's how cards live out their full life.
 
8GB GPUs are here to stay for the next 5 years at least, and the newer low-end cards will keep getting 8GB for sure, so there's nothing to worry about.
 
The noise about VRAM is exaggerated because of two games, Hogwarts Legacy and The Last of Us, as if the entire gaming industry were based on these two games and we should suddenly forget about all the games that perform great on good hardware.

Both of these games want to use the same amount of VRAM at 4K and 1080p, which doesn't make any sense because 4K has four times the pixels of 1080p, so obviously the games are not optimized.
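
Just to put numbers on that, a quick back-of-the-envelope check of the pixel counts (plain arithmetic, nothing game-specific):

```python
# Pixel counts for common resolutions (pure arithmetic)
res_1080p = 1920 * 1080    # 2,073,600 pixels
res_1440p = 2560 * 1440    # 3,686,400 pixels
res_4k    = 3840 * 2160    # 8,294,400 pixels

print(res_4k / res_1080p)      # 4.0   -> 4K is 4x the pixels of 1080p
print(res_1440p / res_1080p)   # ~1.78 -> 1440p is ~1.8x the pixels of 1080p
```

To be fair, texture memory doesn't scale with output resolution the way framebuffers and render targets do, so usage won't quadruple at 4K, but some increase is normally expected.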

Regarding performance, The Last of Us was performing poorly because of a RAM issue, not a VRAM issue, which they fixed later, and Hogwarts Legacy has other optimization issues such as frame drops that have nothing to do with VRAM. Its VRAM usage is 11 GB at 4K Ultra.

Some other points:
- According to the Steam hardware survey, 76% of the cards in the market are from Nvidia, of which only about 11% have 12 GB of VRAM. About 80% of cards have 8 GB or less. No game company is foolish enough to ignore this unless they want Nvidia users not to buy their games.
- AMD cards use up to 2 GB more VRAM than Nvidia at the same settings. This is the primary reason AMD offers more VRAM. That gap may increase in the future, which is why AMD gives up to 4 GB more VRAM than a comparable Nvidia GPU.
- If your card has more VRAM, the game allocates and uses more. This is why YouTubers testing games on their 4090s see VRAM usage touching 14-16 GB while the same game runs fine on an 8 GB GPU (with tweaked settings); most overlays report allocation, not what the game strictly needs (see the sketch after this list).
- The 30 series and the 4070/4070 Ti cannot max everything out at 4K, or even 2K Ultra, in all games. You will hit the performance limit before reaching the VRAM limit on a 30-series card or a 4070/4070 Ti.
- What is the compulsion to max out? Maxing out a game reduces performance by up to 40-50% while adding little or, most of the time, nothing to the visual experience.
- Some people think game companies are ignoring optimization and relying on DLSS. Well, DLSS does not reduce VRAM needs in all games, so no, game companies cannot rely on DLSS; they will have to make games that fit in 8 GB of VRAM at 1440p.
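
On the allocated-vs-used point, here is a minimal sketch of how to read what the driver reports, assuming the pynvml bindings (the nvidia-ml-py package); the number it prints is memory allocated on the card by all running processes, the same figure Afterburner or nvidia-smi show, not the minimum a game actually needs:

```python
# Minimal sketch: read total/used VRAM as reported by the NVIDIA driver.
# "Used" here means ALLOCATED on the card, not what a game strictly needs.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU
mem = nvmlDeviceGetMemoryInfo(handle)       # values in bytes
print(f"VRAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
nvmlShutdown()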

There is no need to lose sleep if you have a 3060 Ti or 3070. You have the same amount of VRAM as 80% of the market. A 3080 or 6800 XT would be great for 2K or even 4K, but there is no need to upgrade just because of VRAM. Do not throw away a perfectly working card just for VRAM.
 
For unoptimized games, even 24GB is not enough. But in optimized games, lower VRAM will definitely limit the graphics quality you can use. Pre-ordering a digital game makes no sense: publishers hurry to get a release out the door ASAP to get money flowing in. The games you mentioned were over-hyped to get more sales.

For example, if a game's FPS matches your monitor's refresh rate at, say, 50-60% GPU usage, you can crank up the settings to get better visuals. However, VRAM will then become the bottleneck as the card struggles to swap more detailed textures in from system RAM and disk. Basically you paid for a Ferrari, but it has a 50 km/h speed governor: you are unable to use its full power.
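
If you want to check that kind of headroom yourself, here is a rough sketch (again assuming the pynvml bindings; tools like nvidia-smi or MSI Afterburner show the same utilization figure) that samples GPU usage for about half a minute while a game runs:

```python
# Rough sketch: sample GPU core utilization for ~30 s while a game is running,
# to see whether there is headroom for higher settings or resolution.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetUtilizationRates)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)
samples = []
for _ in range(30):
    samples.append(nvmlDeviceGetUtilizationRates(handle).gpu)  # percent
    time.sleep(1)
nvmlShutdown()

avg = sum(samples) / len(samples)
print(f"Average GPU usage: {avg:.0f}%")
if avg < 70:
    print("Likely headroom to raise settings, as long as VRAM isn't already full.")
```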

Nvidia is creating this problem and offering the solution in the form of DLSS. They want you to play at less than native resolution and then scale it up. DLSS should be used towards the end of your card's life, to prolong its usability, not right off the bat. Game devs can skip optimizing by depending on DLSS; it is a vicious circle.
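
To put rough numbers on "less than native resolution": these are the commonly reported per-axis scale factors for the DLSS 2 quality modes (approximate values quoted from public reviews, not an official spec):

```python
# Approximate internal render resolutions for DLSS 2 modes at a 4K output.
# Per-axis scale factors are the commonly reported values; treat as approximate.
output_w, output_h = 3840, 2160
modes = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}
for name, scale in modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name:<17} ~{w}x{h} internal, upscaled to {output_w}x{output_h}")
```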

That said, it's very hard to tell the difference between Medium and High, or High and Ultra, in most games when you're actually playing and not just watching, so maxing out is mostly to flex on others.

-------------------------------------

ReShade: this is an amazing way to make ANY game look better, even at low settings. It's completely adjustable to your individual taste; check it out.
Edit: forgot to mention it has a negligible performance hit!
 
He is just correcting you on the pixel-count comparison between 4K and 1080p.
What is the compulsion to max out? Maxing out a game reduces performance by up to 40-50% while adding little or, most of the time, nothing to the visual experience.
Very good point. I always keep the settings a little lower than max. Most games run great at 1440p (3070).
 
- AMD cards use up to 2 GB more VRAM than Nvidia at the same settings. This is the primary reason AMD offers more VRAM. That gap may increase in the future, which is why AMD gives up to 4 GB more VRAM than a comparable Nvidia GPU.
Why does AMD use 2 GB more? Just want to understand the technicality here.
 
That will be the norm as texture size and quality keep growing with newer titles.

You wanna enjoy 2K gaming? Do it on medium-high settings with an 8GB card. There may still be newer games that are native PC releases and well optimized rather than ports (sadly, most are ports).
Those might run fine on high settings at 2K, but in general, for the next 3 years or so, 8GB VRAM cards are 1080p high-settings cards.

Do not expect major game publishers to invest time in optimization. That is always the last priority in software development as far as management is concerned. They need to go to market ASAP to keep stakeholders happy, which means bad/broken/buggy launches followed by weeks/months/years of patches, with no guarantee of fixing the half-baked software.
This is the new norm, as gamers keep buying and/or pre-ordering games regardless of performance issues.

2K at highest settings will be a struggle for newer AAA games on 8GB of VRAM in general, period!
 
I can confirm that games easily use 7-8GB of VRAM at 1080p with high/ultra textures, sometimes even more.
Now, whether that is allocated VRAM or actually used VRAM, I have no idea.
 
As a 3070 owner, I will say that how much VRAM is acceptable depends on the price segment.

Even if just a handful of games have issues with 8GB of VRAM, that isn't something you expect from a mid-range or higher GPU. So, considering AMD puts 12GB on the 6700 XT for its price and performance class, 8GB cards from Nvidia don't make sense even if they perform 10-15% better, as the 3070 does. For a 20k GPU, 8GB of VRAM is still fine.

Similarly, 12GB of VRAM on the 4070 Ti is not great, considering people might expect 4K gaming from an 80k GPU. Remember that the 1080 Ti had 11GB of VRAM as a $699 GPU in 2017, so with all the advances in technology, a 4070 Ti with 12GB of VRAM is not fine for 2023 IMO. Nvidia is being greedy and deserves the flak it is getting.
 
- What is the compulsion to max out? Maxing out a game reduces performance by up to 40-50% while adding little or, most of the time, nothing to the visual experience.
This..
And this is one of the reasons I generally end up preferring a console when possible.

I am currently playing HFW on a PS5 and TLOU 1 on the PC.
On the PC, I spent several hours trying permutations of various graphics settings to find a combination that keeps my frame rate at 60.
On the console, it was simply a matter of firing up the game and getting started.

I'm an hour into TLOU and already a few hours into HFW (because I didn't have to waste time tweaking)..
And honestly, I couldn't care less about a missing shadow here or a more detailed rock there.
Not that I would really notice it, and I guess it says a lot about the quality of a title's gameplay if I start caring so much about tiny differences in rendering.

Yet I will do the same useless exercise on the next game I pick on the PC because FOMO :laughing:
 
...currently playing HFW on a PS5 and TLOU 1 on the PC.
On the PC, I spent several hours trying permutations of various graphics settings to find a combination that keeps my frame rate at 60.
On the console, it was simply a matter of firing up the game and getting started.
Both those games were console-centric and ported to PC without optimization. Consoles would struggle too if a game built for PC were ported without being optimized for their architecture.

Also, the PS5 has 16GB of unified memory shared between the CPU and GPU.

Yet I will do the same useless exercise on the next game I pick on the PC because FOMO :laughing:
If you stop watching YouTubers overhyping games, that FOMO will magically disappear, along with the desire for a 4090Ti ;)
 
Well, yes, for upcoming games 8GB will limit you to 1080p, though VRAM isn't the only metric.

I have tested 30 games myself on my 3060 Ti 8GB, and I could run every game at 1440p, or even 4K at times with DLSS, on my 4K TV. The memory kept hitting 7500-8000 MB, but I could get an easy 60 FPS and it was almost stutter-free.

Ultimately it depends on your needs: what resolution, frame rate, and graphics settings you are targeting. If you are buying a new GPU today, avoid 8GB for sure, unless you are on a tight budget and willing to compromise a little on settings and frame rate. I was able to play several AAA games on my 4K TV, adjusting wherever necessary, and still had a wonderful experience.

If you want high frame rates at 1440p or higher resolutions, then 12GB is the bare minimum and 16GB is the sweet spot.

Meanwhile, we will get a true glimpse of what the future holds once UE5 games like Immortals of Aveum come out. Expect that to be the starting point; just check out its requirements.
 
Both those games were console-centric and ported to PC without optimization. Consoles would struggle too if a game built for PC were ported without being optimized for their architecture.

Also, the PS5 has 16GB of unified memory shared between the CPU and GPU.


If you stop watching YouTubers overhyping games, that FOMO will magically disappear, along with the desire for a 4090Ti ;)
Haha, yeah..
Nah, I am sticking to my last-gen card for at least some time to come...

Maybe it's just me, but I don't really notice a difference between even Medium and Ultra while playing..
And additionally, I am perfectly happy with DLSS upscaling as opposed to native 4K..
 
The 4060 Ti 16GB has been announced. If "entry level" cards are getting that much VRAM, it says something about where things are headed.
This looks like another relatively last-minute change, like how the 3060 had 12GB last year while more potent cards like the 3060 Ti and 3070, which needed it, didn't. Nvidia received a lot of flak during the 30-series launch over the VRAM on the 3070 and 3080. If crypto miners hadn't messed up the market, maybe the 3070 Ti would have had 10GB or 12GB of VRAM. Since day 1 the 3070 Ti was a pointless product, not even 10% faster than the 3070; it wouldn't have survived if the market were normal.

Now AMD again has a chance to entice gamers and increase their market share, but as usual, they will likely mess it up.

Without Frame Gen, Nvidia says the RTX 4060 Ti is only 15% faster than the RTX 3060 Ti. Yes, it's the same price, but if you wanted 15% more performance you've had that option with the RTX 3070 for over two years now — the 3070 is 12–16 percent faster than the RTX 3060 Ti, going by our latest GPU benchmarks. Similarly, the RTX 4060 according to Nvidia's data is about 20% faster than the RTX 3060, not including Frame Gen. Of course, it's priced better at $299, but that's two years of waiting to get a card that is, ultimately, not even as fast as an RTX 3060 Ti (which is 30–35 percent faster than a 3060).
If that is indeed true, the 6700 XT is already at an advantage over Nvidia. Currently it is under $350 in the US, so the 4060 Ti 8GB is 15% faster for $50 extra, but with less VRAM.
 
Yeah the 4060Ti is what the 3070Ti should have been.

Looking at the 3000 and 4000 series, it's almost like Nvidia intentionally cripples the 60 and 70 models to make the higher ones seem better value for money.
 