> Not just upscaling, it's a frame generation tech too.

2.x doesn't generate frames; frame generation is available only in 3.x.

> ...improves overall FPS output.

It adds latency and artifacts as well.
> Not just upscaling, it's a frame generation tech too. Hence it not only helps performance by upscaling, say, a 720p image to 1440p resolution, but also predicts the next frames and improves overall FPS output.

Frame generation is a scam. It actually degrades input latency. So compared to DLSS 2.0 or the native image, you are getting worse input lag, which will make the game feel worse than DLSS 2.0.
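Back-of-envelope arithmetic (my own illustration, not from either commenter) for why upscaling helps performance: rendering internally at 720p and upscaling to 1440p means the GPU shades only a quarter of the output pixels.

```python
# Sketch: ratio of output pixels to internally rendered pixels
# when upscaling from one resolution to another.

def pixel_ratio(render_res, target_res):
    rw, rh = render_res      # internal render resolution (width, height)
    tw, th = target_res      # displayed/output resolution (width, height)
    return (tw * th) / (rw * rh)

# 1280x720 = 921,600 pixels; 2560x1440 = 3,686,400 pixels
print(pixel_ratio((1280, 720), (2560, 1440)))   # -> 4.0: 4x fewer pixels shaded
```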
> Not just upscaling, it's a frame generation tech too. Hence it not only helps performance by upscaling, say, a 720p image to 1440p resolution, but also predicts the next frames and improves overall FPS output.

DLSS 3 is frame generation, and FSR 3 will be frame gen as well. DLSS/FSR 2 are not frame gen. Generated frames are not equal to true frames; frame gen adds latency and artifacts as well.
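A rough sketch (my own illustration, not NVIDIA's implementation) of why generated frames cost latency even while FPS goes up: to show an interpolated frame between two real ones, the newest rendered frame must be held back roughly one render interval before it is presented.

```python
# Toy model of frame interpolation's FPS/latency trade-off,
# assuming one generated frame per real frame and that the newest
# real frame is delayed by ~one render interval.

def frame_gen_effect(rendered_fps: float) -> dict:
    frame_time_ms = 1000.0 / rendered_fps       # time to render one real frame
    return {
        "displayed_fps": rendered_fps * 2,       # generated frames double output
        "added_latency_ms": frame_time_ms,       # newest frame held back ~1 interval
    }

# 60 FPS rendered -> 120 FPS displayed, but ~16.7 ms of extra latency
print(frame_gen_effect(60))
```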
> As others indicated it will add input lag too, but IMO it shouldn't matter that much in single-player games.

It does, for example when you want to parry an attack in Devil May Cry, or when some indie game needs that frame-perfect input.
> Along with VRAM, even 32GB of RAM won't be enough for gaming after 1-2 years.

That's very extreme. Can you provide any game examples that take more than 20GB of RAM without any mods applied, to establish a trend? Otherwise it's pure FUD.
> Along with VRAM, even 32GB of RAM won't be enough for gaming after 1-2 years.

Not gonna happen, at least till the PS6 & next-gen Xbox. Consoles kind of dictate PC hardware requirements, and consoles have 16GB of combined VRAM+RAM right now. Because of poor optimisation, we see games using up 16GB RAM + 8GB VRAM.
> Because of poor optimisation, we see games using up 16GB RAM + 8GB VRAM.

Games don't actually use 16GB all to themselves. The usage you see in Task Manager is your OS + background apps + the game.
> Games don't actually use 16GB all to themselves. The usage you see in Task Manager is your OS + background apps + the game.

True not just for VRAM, but for CPU & RAM as well, though those can pale in comparison to games' requirements. The way to measure it is to take the difference between a reading before launching the game and one in the middle of gameplay (models are dynamically loaded into VRAM as and when necessary), so at the beginning of a level it will consume the least.
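The before/after measurement described above can be sketched as a simple delta (my own illustration; on a real system the readings would come from Task Manager or `nvidia-smi --query-gpu=memory.used --format=csv`):

```python
# Estimate a game's own memory footprint as the difference between
# a reading taken before launch (OS + background apps only) and one
# taken mid-level, when streamed assets are resident.

def game_footprint_mib(before_mib: int, during_mib: int) -> int:
    return during_mib - before_mib

# e.g. ~6.2 GiB used before launch, ~14.7 GiB mid-gameplay
print(game_footprint_mib(6349, 15053))   # -> 8704 MiB attributable to the game
```

Measuring mid-gameplay rather than at the main menu matters because, as noted above, assets are loaded dynamically and the reading at a level's start will understate peak usage.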
> Games don't actually use 16GB all to themselves. The usage you see in Task Manager is your OS + background apps + the game.

I think in TLOU I saw 12-14GB of RAM being used by the game itself. With all background apps, it was the first game to cross 20GB of RAM for me (without Chrome in the background). I was playing at 1440p, so 8GB of VRAM was filling up as well. But this is a poorly optimised game.
> I think in TLOU I saw 12-14GB of RAM being used by the game itself. With all background apps, it was the first game to cross 20GB of RAM for me (without Chrome in the background). I was playing at 1440p, so 8GB of VRAM was filling up as well. But this is a poorly optimised game.

It has been patched well since v1.0.4.1. The latest as of this post is v1.0.5.0 and it's working very well. I've seen total RAM usage of 17-18GB with no crashes, and VRAM sits just under 8GB on high settings with Ultra character and environment details.
> It has been patched well since v1.0.4.1. The latest as of this post is v1.0.5.0 and it's working very well. I've seen total RAM usage of 17-18GB with no crashes, and VRAM sits just under 8GB on high settings with Ultra character and environment details. The RAM usage is normal for such a port from PS. It may be higher when shaders are built at the start; again, that's normal too. I've finished the game with no issues just a week ago, after the Nvidia 531.58 hotfix driver for the random TLOU crashes.

Every PS port till now was highly optimised, except maybe Uncharted.
> It has been patched well since v1.0.4.1. The latest as of this post is v1.0.5.0 and it's working very well. I've seen total RAM usage of 17-18GB with no crashes, and VRAM sits just under 8GB on high settings with Ultra character and environment details. The RAM usage is normal for such a port from PS. It may be higher when shaders are built at the start; again, that's normal too. I've finished the game with no issues just a week ago, after the Nvidia 531.58 hotfix driver for the random TLOU crashes.

I played the game 1 week after launch and had no crashes, but optimisation was poor back then.
> I played the game 1 week after launch and had no crashes, but optimisation was poor back then.

The random crashes only affected Nvidia GPUs, specifically 30-series cards.
I recommend people refrain from praising and recommending tech that they have yet to experience themselves. DLSS 3 is bad and the latency increase is horrible, period.
Don't be fanboys or do free marketing for Nvidia.
> Technology like this can help older-gen cards stay usable for longer, which in turn helps reduce electronic waste.

Not older gen; it will help the 4000 series and later stay usable for longer. Theoretically. This will backfire, though, as evident from the growing number of unoptimised releases: devs just raise the minimum/recommended specs, or else the game is unplayable even on one-generation-old hardware.
> I know Nvidia, being greedy, restricted DLSS 3.0 to 4000-series cards and above. But I have hopes for AMD's FSR 3.0, which could enable me to get a few more years out of my 3070.

Optical flow is a very resource-intensive task. Doing it in (close to) real time demands beefy hardware, in this case Optical Flow Accelerators. I don't think the 3000 series has enough of them to be usable for gaming.
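To make the optical-flow point concrete, here is a toy 1D illustration (my own sketch, not NVIDIA's algorithm): once you know where each pixel moved between two real frames (its flow vector), the in-between frame is synthesized by advecting content halfway along that motion. Estimating those vectors densely, per pixel, per frame, is the expensive part that the dedicated accelerators handle.

```python
# Toy frame interpolation using known per-sample motion (flow) vectors.
# frame_a is a list of (position, value) samples; flow gives how far each
# sample moves between frame A and frame B.

def interpolate_midframe(frame_a, flow):
    # Place each sample halfway along its motion vector to build
    # the synthesized in-between frame.
    return [(pos + dx / 2, val) for (pos, val), dx in zip(frame_a, flow)]

# A bright pixel at x=10 moves to x=18 in the next real frame (flow = +8);
# the generated in-between frame shows it at x=14.
print(interpolate_midframe([(10, 255)], [8]))   # -> [(14.0, 255)]
```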
> Praising a new tech doesn't equate to fanboyism or free marketing!

Isn't blindly praising something without taking the time to learn about it exactly what fanboys and PR do?