Will GPUs with 8GB of VRAM struggle to run upcoming games?

Not just upscaling, it's a frame generation tech too.
Hence it not only improves performance by upscaling, say, a 720p image to 1440p, but also predicts the next frames and improves the overall FPS output.
Frame generation is a scam. It actually worsens input latency. So compared to DLSS 2.0 or the native image, you are getting worse input lag, which makes the game feel worse than DLSS 2.0.
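To put a rough number on the latency point, here's a back-of-the-envelope sketch in Python. It assumes an interpolation-style frame generator (like DLSS 3) that has to hold back the newest rendered frame until the generated in-between frame has been shown; the 1 ms generation overhead is a made-up placeholder, so treat the output as illustrative, not measured.

```python
# Rough latency sketch for interpolation-style frame generation.
# Assumption: the generated frame is built from the two most recent rendered
# frames, so the newest real frame reaches the screen roughly half a render
# interval later than it would without frame generation.

def added_latency_ms(render_fps: float, gen_overhead_ms: float = 1.0) -> float:
    render_interval_ms = 1000.0 / render_fps
    return render_interval_ms / 2 + gen_overhead_ms  # overhead value is a placeholder

for fps in (30, 60, 120):
    print(f"{fps} rendered fps -> ~{added_latency_ms(fps):.1f} ms extra input latency")
```

At a 60 fps base frame rate that works out to roughly 9-10 ms of extra latency, before Reflex or any other mitigation is factored in.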
 
DLSS 3 is frame generation, FSR 3 will be frame gen as well. DLSS/FSR 2 are not frame gen.

Generated frames are not equal to true frames.
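To make the "generated frames are not true frames" point concrete, here's a toy NumPy sketch that fakes an in-between frame by simply blending two frames of a moving square. Real DLSS/FSR frame generation uses motion vectors and optical flow rather than a plain blend, so this is purely an illustration of why a generated frame is a guess rather than a rendered frame.

```python
import numpy as np

def frame_with_square(x: int, size: int = 8, width: int = 64) -> np.ndarray:
    """A 64x64 'frame' containing a white square whose left edge is at column x."""
    frame = np.zeros((width, width), dtype=np.float32)
    frame[28:28 + size, x:x + size] = 1.0
    return frame

frame_a = frame_with_square(10)   # rendered frame N
frame_b = frame_with_square(20)   # rendered frame N+1
true_mid = frame_with_square(15)  # what a genuinely rendered in-between frame would show

# Naive "generated" frame: a plain 50/50 blend of the two rendered frames.
generated = 0.5 * frame_a + 0.5 * frame_b

# The blend produces two half-bright ghost squares instead of one square at the
# in-between position, so it clearly differs from the true middle frame.
print("max pixel error vs true frame:", float(np.abs(generated - true_mid).max()))
```

The point is not that DLSS/FSR are this crude; it's that any synthesised frame is an estimate, which is why artifacts tend to show up around fast-moving objects and UI elements.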
 

True, generated frames aren't equal to true frames.
As others indicated, it will add input lag too,
but IMO it shouldn't matter that much in single-player games, at least.
Anyway, I'm yet to experience it myself.
adds latency + artifacts as well

From what I have seen in videos so far, it is barely noticeable, and you only spot those artifacts if you go looking for them.
Otherwise, while gaming, it just blends into the motion.
 
As others indicated, it will add input lag too,
but IMO it shouldn't matter that much in single-player games
It does, for example when you want to parry an attack in Devil May Cry, or when some indie game needs a frame-perfect input.
I always prefer low input lag over high resolution/graphics, but to each their own.
 
I recommend people refrain from praising and recommending tech that they are yet to experience themselves. DLSS 3 is bad and the latency increase is horrible, period.

Don't be fanboys or do free marketing for Nvidia.
Along with VRAM, even 32GB of RAM won't be enough for gaming in 1-2 years.
That's very extreme. Can you provide any examples of games that take more than 20GB of RAM without any mods applied, to establish a trend? Otherwise it's pure FUD.
 
Games releasing in 2023 require a minimum of 16GB. Almost every game I have played that released this year uses anywhere between 10-14GB, while some of the unoptimised ones like The Last of Us require more than 16GB.
This is at 1080p.
 
Along with VRAM, even 32GB of RAM won't be enough for gaming in 1-2 years.
Not gonna happen, at least till the PS6 & next-gen Xbox. Consoles kind of dictate PC hardware requirements, and consoles have 16GB of combined VRAM+RAM right now. Because of poor optimisation, we see games using up 16GB of RAM + 8GB of VRAM.
 
Games don't actually use 16GB all by themselves. What you see in Task Manager is your OS + background apps + the game.
 
True, and not just for VRAM but for CPU & RAM as well, though that overhead can pale in comparison to a game's requirements. The way you measure it is by taking the difference before launching a game and then in the middle of gameplay (models are dynamically loaded into VRAM as and when necessary), so at the beginning of a level it will consume the least amount.

Alternatively, and much more easily, enable "Memory usage \ process" in MSI Afterburner - that will show only the VRAM consumed (not allocated) by the current process.
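For anyone who wants to script the before/during comparison instead of eyeballing Task Manager, here's a rough Python sketch. It assumes `psutil` is installed and uses `nvidia-smi --query-gpu=memory.used`, which reports board-wide VRAM allocation rather than per-process usage, so per-process VRAM is still best read from MSI Afterburner as mentioned above. The game executable name is just a placeholder.

```python
import subprocess
import psutil

GAME_EXE = "game.exe"  # placeholder; set this to your game's process name

def game_ram_gb(name: str) -> float:
    """Sum the working-set (RSS) memory of all processes matching `name`, in GB."""
    total = 0
    for p in psutil.process_iter(["name", "memory_info"]):
        if p.info["name"] and p.info["memory_info"] and p.info["name"].lower() == name.lower():
            total += p.info["memory_info"].rss
    return total / 1024**3

def total_vram_used_mb() -> int:
    """Board-wide VRAM in use as reported by nvidia-smi (allocated, not per-process)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0])

# Take one reading before launching the game and another in the middle of a level;
# the difference approximates what the game itself is consuming.
print(f"game RAM: {game_ram_gb(GAME_EXE):.1f} GB, total VRAM in use: {total_vram_used_mb()} MB")
```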
 
Games don't actually use 16GB all by themselves. What you see in Task Manager is your OS + background apps + the game.
I think in TLOU I saw 12-14GB of RAM being used by the game itself. With all background apps, it was the first game to cross 20GB of RAM for me (without Chrome in the background). I was playing at 1440p, so the 8GB of VRAM was filling up as well. But this is a poorly optimised game.
 
It has been patched well since v1.0.4.1. The latest as of this post is v1.0.5.0 and it is working very well. I've seen total RAM usage of 17-18GB with no crashes, and VRAM sits just under 8GB for High settings with Ultra character and environment details.

The RAM usage is normal for such a PS port. It may be higher when shaders are built at the start; again, that's normal too.

I finished the game with no issues just a week ago, after the Nvidia 531.58 hotfix driver for the random TLOU crashes.
 
Every PS port till now was highly optimised, except maybe Uncharted.
TLOU is the only PS port that runs like this,
and it's not like it looks anything exceptional.
IMO Spider-Man is the best-looking PS game on PC, followed by GoW.
 
I played TLOU a week after launch and had no crashes, but optimisation was poor back then.
 
I recommend people refrain from praising and recommending tech that they are yet to experience themselves. DLSS 3 is bad and the latency increase is horrible, period.

Don't be fanboys or do free marketing for Nvidia.

Praising a new tech doesn't equate to fanboyism or free marketing!

This is the problem with the forum these days. People can't have a mature and healthy discussion without stooping to name-calling and forcing their views onto others.

The discussion was going well till now, with others enlightening me on the differences between DLSS 2.0 and 3.0.
Then others chimed in with their inputs regarding the "input lag", with good examples.

Anyway, further to the actual discussion:
Being from a software background, I personally find it fascinating how GPU designers think up and pull off a technology like this, predicting the next frame using AI.
Technology like this can help older-gen cards stay usable for longer, which in turn helps reduce electronic waste.
I know Nvidia, being greedy, restricted DLSS 3.0 to 4000-series cards and above. But I have hopes for AMD's FSR 3.0, which could let me get a few more years out of my 3070.

P.S. I also recommend people refrain from posting just for the sake of it unless they have something to add to the discussion.
 
Technology like this can help older-gen cards stay usable for longer, which in turn helps reduce electronic waste.
Not older-gen cards; it will help the 4000 series and later stay usable for longer. Theoretically. This will backfire though, as is evident from the growing number of unoptimized releases: devs just raise the minimum/recommended specs, or else the game will be unplayable even on one-generation-old hardware.

The truth of the matter is that you won't be leaning heavily on upscaling + Frame Generation at the end of your GPU's life; you will be dependent on them right out of the box. This trend of throwing more tech instead of solving the underlying problem may actually end up reducing the usable life of a GPU.

I know Nvidia, being greedy, restricted DLSS 3.0 to 4000-series cards and above. But I have hopes for AMD's FSR 3.0, which could let me get a few more years out of my 3070.
Optical flow is a very resource-intensive task. Doing it in (close to) real time demands beefy hardware, in this case Optical Flow Accelerators. I don't think the 3000 series has enough of them to be usable for gaming.
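For anyone curious what "optical flow" actually computes, here's a minimal software sketch using OpenCV's Farneback dense optical flow on two synthetic frames (it assumes `opencv-python` and NumPy are installed). This is a CPU algorithm shown purely for illustration; DLSS 3's Optical Flow Accelerator is dedicated hardware doing the same kind of job (estimating per-pixel motion between frames) fast enough to feed frame generation in real time.

```python
import cv2
import numpy as np

def frame_with_square(x: int, size: int = 20, width: int = 128) -> np.ndarray:
    """A grayscale 'frame' with a bright square whose left edge is at column x."""
    frame = np.zeros((width, width), dtype=np.uint8)
    frame[50:50 + size, x:x + size] = 255
    return frame

prev_frame = frame_with_square(30)
next_frame = frame_with_square(40)  # the square moved 10 px to the right

# Dense optical flow: a per-pixel (dx, dy) motion estimate between the two frames.
# Positional args: flow=None, pyr_scale=0.5, levels=3, winsize=15,
# iterations=3, poly_n=5, poly_sigma=1.2, flags=0.
flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Average estimated motion over the square's original area; it should come out
# around +10 px in x and ~0 px in y, matching the square's actual shift.
region = flow[50:70, 30:50].reshape(-1, 2)
print("estimated motion (dx, dy):", region.mean(axis=0))
```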


Praising a new tech doesn't equate to fanboyism or free marketing!
Isn't blindly praising something without taking the time to learn about it exactly what fanboys and PR do?
 
Optical flow is a very resource-intensive task. Doing it in (close to) real time demands beefy hardware, in this case Optical Flow Accelerators. I don't think the 3000 series has enough of them to be usable for gaming.

Hmm, makes sense. But then how will FSR 3.0 fare? It's not supposed to be restricted to a particular generation, right?

Isn't blindly praising something without taking the time to learn about it exactly what fanboys and PR do?

If you read my posts carefully, you might (or might not) notice that I'm not pro-Nvidia or pro-AMD.
I even clearly mentioned that I'm yet to experience either tech myself.

I'm just impressed by frame generation/prediction tech in general. It's not that I'm banging the drum for DLSS 3.0 alone!
Also, isn't this a forum where one can discuss these topics and learn about them?
 