Will GPUs with 8GB of VRAM struggle to run upcoming games?

The 6700 XT is already at an advantage over Nvidia. Currently it is under $350 in the US,
and in the mid-20s (thousand rupees) in India during Flipkart sales. It seems like a much better buy than anything from Nvidia in that range if you don't fall for the 'features' trap. Features are only a bonus; you can't compromise basic hardware for them.

The 4060 Ti 16GB is priced too high to make any sense, and 8GB makes no sense to me for a new GPU priced at $400; at $200-250, perhaps. Who buys a new, not-very-cheap GPU to immediately play at medium-ish settings? Hopefully it goes the way of the 4070. If they want volume sales, the price should be much better.

With the main details of these products covered, there is a clear elephant in the room: the low memory capacity of at least two of these GPUs. The RTX 4060 Ti with just 8GB of VRAM priced at a whopping $400 seems quite problematic, given the increasing number of games that struggle to run at ultra settings on 8GB cards. As we've shown in many examples, this is a clear trend in modern games, and it may relegate the RTX 4060 Ti to medium-ish quality settings depending on the title.
Nvidia stated in their briefing with us that the RTX 4060 Ti isn't designed to be used on ultra settings; being a mid-range card, it's more suited to high-ish settings at modest resolutions. Indeed, many games will run fine on an 8GB card in the short term with dialed-down texture quality. The issue is that VRAM capacity will likely be the only reason full ultra setting gaming will be unachievable (especially with ray tracing enabled), as the core GPU performance of this product is otherwise capable of ultra settings.
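To put rough numbers on why texture quality is the setting that eats VRAM, here is a back-of-envelope sketch. All figures are illustrative assumptions, not measurements from any specific game: it assumes BC7-style block compression at 1 byte per texel and a hypothetical count of 500 resident textures.

```python
def texture_mb(resolution, bytes_per_texel, mip_chain=True):
    """Approximate VRAM footprint of one square texture in MiB.

    A full mip chain adds roughly one third on top of the base level
    (geometric series 1 + 1/4 + 1/16 + ...).
    """
    base = resolution * resolution * bytes_per_texel
    if mip_chain:
        base = base * 4 / 3
    return base / (1024 * 1024)

# Hypothetical texture pools: "ultra" ships 4K textures, "medium" 2K,
# 500 textures resident at once (made-up counts for illustration).
ultra = 500 * texture_mb(4096, 1)
medium = 500 * texture_mb(2048, 1)
print(f"500 ultra textures:  ~{ultra:,.0f} MiB")
print(f"500 medium textures: ~{medium:,.0f} MiB")
```

Even under these toy assumptions, the ultra pool alone blows past an 8GB card while the medium pool fits comfortably, which matches the "dialed-down texture quality" advice above.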

Looking at the 3000 & 4000 series, it's almost like Nvidia intentionally cripples the 60 & 70 models to make the higher ones seem better value for money (VFM).
It is like that. It was very clear from the launch prices of the 4090 and 4080: both absurdly high, but the 4090 had better relative value. Probably nothing will change now that crypto demand has been replaced by AI demand.
 
Yeah, the 4060 Ti is what the 3070 Ti should have been.

Solid $250, $400 & $550 GPUs from AMD would help them a lot. 8/12/16 GB of VRAM would be fine at those price points, like their previous gen had. Hopefully they decide to compete with Nvidia instead of settling for mediocrity.
I have been recommending the 6700 XT over even the 3070 ever since the 6700 XT's price dropped to match the 3060 Ti's. I own a 3070, and when I purchased my GPU, the 6700 XT was about 10% more expensive than the 3070. For nine months or so, it has been stupid to buy a 3070 most of the time.
 
Playing with every setting at max is not very economical. Some settings don't even produce noticeable visual differences while having a significant impact on overall performance.
Hardware Unboxed's YouTube channel has optimization guides for several games that demonstrate this. I usually follow them and get very good performance out of my 3070 at 1440p.
This way we can even set a 60fps frame cap and lower overall GPU usage, which results in power savings too.
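The frame-cap idea above amounts to sleeping out whatever is left of each frame's time budget so the hardware idles instead of racing ahead. A minimal sketch (the `render_frame` callable is a stand-in for real game work):

```python
import time

def run_capped(render_frame, fps_cap=60.0, frames=120):
    """Render `frames` frames, sleeping out the remainder of each
    frame's budget so the GPU/CPU idle instead of producing extra
    frames (which is where the power savings come from)."""
    budget = 1.0 / fps_cap
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - t0
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle for the rest of the frame
    return frames / (time.perf_counter() - start)

# Stand-in for a cheap frame (hardware is not the bottleneck here):
fps = run_capped(lambda: None, fps_cap=60.0, frames=30)
print(f"effective fps: {fps:.1f}")
```

Real limiters (driver-level caps, in-game limiters) do the same thing with better timer precision; `time.sleep` granularity means this sketch lands slightly under the cap.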
 
Yes, that is what people with some of these existing cards will have to do (me too, for future games, with 10GB). But it's not a good thing for new cards. Over time, the image-quality loss will become more noticeable. It's even funny that in a recent game, the best option was to disable RT too, because it used up extra VRAM that 8GB cards don't have.

The same reasoning can be applied to RT: not much image-quality improvement for too much performance cost. With that, it just makes sense to go with AMD for more VRAM and better raster performance at a given price, especially for 'cheaper' cards.

The only pro for me right now is DLSS, and especially DSR, which can be used in all games and can sometimes be helpful (very nice for GTA 5, which has poor AA). This is the only doubt in my mind right now. I'm noticing that AA solutions can sometimes be blurry: FXAA was blurry in GTA 5 (but good in Serious Sam 4), TXAA is blurry in Watch Dogs 1 (playing it right now), and I hear (there is a subreddit for that, r/F***TAA) that TAA can be blurry if badly implemented (I liked it in id games). If so, and if TAA is forced, DSR/DLDSR may be a fix for that. I don't like blurry graphics.
 
What if you're able to max out all settings without hitting 100% GPU load at a 60fps cap? You're having fun, but then suddenly the game starts to stutter. That is what running short on VRAM does: your GPU has some oomph left to produce more fps, but textures and object detail fill up your VRAM and you end up with fewer fps even at less than 100% load.

(I'm showcasing a severely unoptimized game here to highlight the issue, where there is a very noticeable difference between high and medium textures. Most would prefer a stable 60fps to eye candy and have to drop the detail level.)

So in the end it becomes a question of whether a card is only good for 1080p 144fps, or whether it can also do 1440p 60fps at high settings, with VRAM being the only limitation.
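The "oomph left but stuttering anyway" effect can be sketched with a deliberately crude model. Every number here is an assumption: a per-frame working set accessed once per frame, VRAM bandwidth of 288 GB/s, and roughly 16 GB/s of usable PCIe 4.0 x16 bandwidth for anything that spills into system RAM.

```python
def frame_time_ms(bytes_needed, vram_bytes, vram_gbs=288.0, pcie_gbs=16.0):
    """Toy model: data resident in VRAM streams at VRAM bandwidth;
    anything that spills must come over PCIe instead (about 18x slower
    under these assumed numbers)."""
    in_vram = min(bytes_needed, vram_bytes)
    spilled = bytes_needed - in_vram
    seconds = in_vram / (vram_gbs * 1e9) + spilled / (pcie_gbs * 1e9)
    return seconds * 1e3

GiB = 1024 ** 3
fits = frame_time_ms(6 * GiB, 8 * GiB)     # working set fits in 8 GiB
spills = frame_time_ms(10 * GiB, 8 * GiB)  # 2 GiB spills every frame
print(f"fits:   {fits:.1f} ms/frame")
print(f"spills: {spills:.1f} ms/frame")
```

The point is the asymmetry, not the absolute numbers: a modest 2 GiB overflow multiplies the frame time several times over, even though the GPU core itself is nowhere near its limit. In practice drivers stream assets rather than re-fetching everything per frame, which is why the symptom is intermittent stutter rather than a constant low fps.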
 
I don't think the 30 series is that bad even with low VRAM; those cards still have a much wider memory bus to make up for it. The lower-tier 40-series cards, however, have a crippled memory bus width, making them highly inefficient at higher resolutions, which is exactly where more VRAM is needed. I don't know why Nvidia did this this gen. Maybe to make the 4090 the most VFM product or something.
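The bus-width point can be made concrete with peak bandwidth, which is just bus width times per-pin data rate. The bus widths and GDDR6 speeds below are from the public spec sheets for these two cards, as I recall them:

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: number of pins times the
    per-pin data rate, divided by 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

cards = {
    "RTX 3060 Ti (256-bit GDDR6 @ 14 Gbps)": bandwidth_gbs(256, 14),
    "RTX 4060 Ti (128-bit GDDR6 @ 18 Gbps)": bandwidth_gbs(128, 18),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

So the newer card has roughly two thirds the raw bandwidth of its predecessor; Nvidia's answer is a much larger L2 cache, which helps less as resolution (and thus working-set size) grows.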
 
Yeah, this is a terrible card. It's actually slower than the 3060 Ti in some cases!
Nvidia just relabeled lower-tier cards upwards and expects software to make up for the gimped hardware. That $500 card now looks even worse for anyone with a 1440p display, in spite of the extra VRAM. It's much better to go with AMD or a used 3060 Ti (for half the price ...) instead of the 8GB cards, and a 6800 instead of the 16GB 4060 Ti. They can't even compete with previous-gen AMD cards.

Nvidia is the scalper this gen. I think I will buy AMD next time out of spite, as long as AMD is reasonably competitive (as they are today). Open-source Linux drivers will be a plus too. Or Intel.
 
I'm not sure what Nvidia executives are thinking by crippling budget products. Lower-end gamers will just shift to a console, and that benefits AMD, so in a way AMD doesn't lose much with shitty entry-level products like the 6400 & 6500.

We can just hope AMD has a good answer to Nvidia's BS, but AMD might just match Nvidia's BS as well. Time will tell.
 
It's highly likely that AMD will follow in Nvidia's footsteps and do the same, but even then their entry-level products will beat Nvidia's in raw performance. The only reason Nvidia thinks they can get away with this BS is that they believe DLSS 3 is some kind of messiah that will make their shitty product lineup compelling anyway.
AMD doesn't have an answer to Nvidia's DLSS, so they don't have that luxury; they have to provide more raw power on a budget. But like Nvidia, they will surely raise the prices of their X600, X700 and X700 XT relative to last gen.
 
A PS5 makes much more sense than this. And we no longer have a supply issue for consoles, so at least a decent portion of GPU demand will shift to consoles if they keep up with their BS. AMD did at least bring GPUs down to much more reasonable prices last gen; the 6600 at 20k was fine, I think. The lower ones were garbage. We'll have to see what they do now.

As an aside, there is absolutely nothing good in the sub-10k category anymore. I remember buying a 1050 2GB for 10k 5-6 years ago. Now I still see the 1050 Ti at 10k+ and only ancient parts below that. Absolutely no progress. What would anyone buy if they happen to have a CPU with no iGPU and only need desktop usage? Driver support will probably end for some of the old parts still on sale.
 
I'm still not sold on frame generation; check Hardware Unboxed's detailed analysis of it. AMD has FSR 3 in development, and FSR 2 is almost as good as DLSS 2, so I won't give Nvidia the advantage there. RT performance is better on Nvidia even now, but for mid-range gamers it still doesn't matter much.
 
I'm very happy to see the shittiest mid-tier cards this generation. Performance is barely 2-5% better (something you can get by overclocking, LOL), or sometimes worse than their last-gen counterparts.
This just means more usable life (delayed obsolescence) for last-gen mid-tier cards. People who bought 3060 to 3070 Ti cards, rejoice: you have at least two more years now, if not three.
 
I find DLSS and RT gimmicks. Raw raster performance is useful in any game, but these features need the game devs to specifically implement them in each and every title, which takes time. Time that could be better spent optimizing the game itself, which would give more performance for everyone.

You look at reviews that show off DLSS benchmarks, but when you switch to a non-DLSS game, you're in for a rude awakening.
 
DLSS 2.x & FSR 2.x are just upscaling tech, and they actually work well now. But you have to keep the render resolution in mind when using them. For example, both are useless for 1080p gaming, unless you play on a laptop screen where a lower resolution is OK. At 1440p, DLSS/FSR quality mode renders at just below 1080p, so it has enough pixels to work with. 4K is where it really shines, because quality mode renders at 1440p and performance mode at 1080p; both have enough pixels to work with.

RT is just too demanding to be worth it for most people. In Spider-Man, the reflections on buildings were great with RT, but my PC with an i5-12400 + 3070 couldn't maintain a consistent 60+ fps even at 1440p with DLSS quality, so in the end I decided to turn it off and play at 1440p ultra instead, getting 100+ fps. Not sure if some clever algorithm will help with this in the future, or if brute strength is the only way.
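The render-resolution arithmetic above is easy to check. Nvidia's published DLSS scale factors, as I understand them, are roughly 1/1.5 per axis for Quality, about 1/1.724 for Balanced, and 1/2 for Performance (FSR 2 uses similar ratios):

```python
# Per-axis downscale divisors for each upscaler quality mode
# (assumed from Nvidia's published DLSS scaling factors).
MODES = {"Quality": 1.5, "Balanced": 1.724, "Performance": 2.0}

def render_res(out_w, out_h):
    """Internal render resolution for each mode at a given output size."""
    return {m: (round(out_w / f), round(out_h / f)) for m, f in MODES.items()}

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(out, "->", render_res(*out))
```

This shows why 1080p output is the worst case: Quality mode there renders from just 1280x720, which leaves the upscaler too few pixels, while 4K Quality works from a full 1440p image.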
 
I don't think DLSS and FSR are even worth considering when you're playing on a 1080p panel; both look bad, FSR 2 more so than DLSS.
When buying a 1080p card like the 7600, 4060 or 4060 Ti, RT and upscaling tech are just gimmicks.
The only thing I don't understand is who's going to buy a $400-500 card (around 40-50k in India) for 1080p gaming.
 
DLSS 3 is not just upscaling; it includes frame-generation tech too.
So it not only improves performance by upscaling, say, a 720p image to a 1440p output, but also generates intermediate frames between rendered ones to improve the overall FPS output.
 