News: Nvidia DLSS4, MFG, and full ray tracing tested on RTX 5090 and RTX 5080

bssunilreddy

Keymaster

Nvidia DLSS4, MFG, and full ray tracing tested on RTX 5090 and RTX 5080

This is how Blackwell delivers its biggest "gains."

Nvidia's new GeForce RTX 5090 and GeForce RTX 5080 have arrived, and coupled with a new testbed and a revised test suite, not to mention new drivers and a host of other changes, our initial benchmarks had to gloss over a few areas. One of the biggest selling points for the Blackwell RTX 50-series GPUs — according to Nvidia, at least — is DLSS4 with Multi Frame Generation (MFG), an AI-based technology that offers further "performance" improvements over DLSS3 and framegen. But there are other changes as well.

DLSS4, MFG & RTX tested in 5 games below:

Source: https://www.tomshardware.com/pc-com...l-ray-tracing-tested-on-rtx-5090-and-rtx-5080
 
Hilarious for a 1L-1.6L GPU.

As for the RTX 5080 and 4080 Super, with the current public build and without forcing DLSS4 through the Nvidia App, they both run out of VRAM and effectively fail to work at 4K with quality upscaling and framegen. (I didn't check if DLSS Transformers helped, but I suspect not.) The 5080 just locked up the game completely and we had to kill the process manually, while the 4080 Super dropped to a slideshow-like 18 FPS.

That's another "thanks but no thanks" to frame generation with the current public release of the game, if you're keeping track.
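
If you want to watch this kind of VRAM exhaustion happen on your own machine, here is a minimal monitoring sketch; it assumes the pynvml bindings for Nvidia's NVML library are installed (nvidia-smi reports the same counters):

```python
# Minimal VRAM monitor (assumes: pip install nvidia-ml-py, Nvidia GPU + driver).
# Logs total/used/free memory once per second so you can watch headroom
# shrink while a game runs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0; adjust on multi-GPU rigs

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        gib = 1024 ** 3
        print(f"used {mem.used / gib:5.2f} GiB / {mem.total / gib:5.2f} GiB "
              f"(free {mem.free / gib:5.2f} GiB)")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second window while the game loads; usage jumping to within a few hundred MiB of the total right before a lockup or slideshow is the tell.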

Those who think this is OK and an outlier: see something similar from a few years back (I just found it accidentally). Some people were hoping that future tech would make asset streaming easier, so high bandwidth would compensate for less VRAM, and so on. Similar faulty logic, in hindsight.


And some BS from Nvidia:
We're constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin's Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
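
For perspective on those 4-6GB numbers, a rough back-of-envelope helps. The G-buffer layout below is invented purely for illustration (it is not taken from any of the games Nvidia lists):

```python
# Back-of-envelope render-target math. The layout is a made-up deferred
# G-buffer, purely illustrative, not any specific game's.
W, H = 3840, 2160                        # 4K
targets = {
    "albedo    (RGBA8)":   4,            # bytes per pixel
    "normals   (RGBA16F)": 8,
    "material  (RGBA8)":   4,
    "motion    (RG16F)":   4,
    "HDR color (RGBA16F)": 8,
    "depth     (D32)":     4,
}
for name, bpp in targets.items():
    print(f"{name}: {W * H * bpp / 2**20:6.1f} MiB")
total = sum(W * H * bpp for bpp in targets.values())
print(f"G-buffer total: {total / 2**20:.0f} MiB")   # ~253 MiB
```

Fixed render targets at 4K come to only a few hundred MiB; the multi-gigabyte totals are dominated by streamed textures, shadow maps, and (later) ray-tracing acceleration structures, which is exactly the part of the budget that keeps growing over a card's lifetime.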

Anyway, my warnings are done now.
 
Even after all this, somehow the 3080 outperforms the 6800 XT at 4K to this day. 8GB cards are definitely not feasible anymore, but 10GB and up still seem fine, with a few exceptions. Obviously you won't be maxing out all settings at 4K with a 3080 in the first place; you'd want to run optimized settings, in which scenario the 3080 pulls ahead overall.

There is a lot of fearmongering about VRAM, in some cases rightfully so (like 8GB cards priced at 30k+), but in this scenario I don't think it's valid.

 
RDR2 natively maxed out at 1440p uses 9 GB of VRAM on my 3080. 4K on 4-6 GB, as Nvidia claims? Sure.

The 3080 in recent testing is closer to a 6900 XT in raster performance these days.
 
About that 4-6 GB figure: I just quoted it from that ResetEra link. Searching on this, Nvidia did say it.


Nope, it's not fearmongering. It's the same as having too little RAM: when you run out, you can get into trouble. With excess VRAM, we can increase texture quality, draw distance, and so on (natively or through mods), even if performance has to be scaled down in future games.

This will vary from game to game, but I did face VRAM issues with Witcher 3 next-gen at 1440p and had to use a mod to fix them. I also wanted to increase the draw distance in Blood and Wine (beyond the game's max) but couldn't, because of VRAM.

People keep making this mistake. Anyway, I don't care that much. For now I only play older games, and whenever I buy my next GPU, I won't buy one with just-enough VRAM; I have made that mistake twice. Of course there will be plenty of games that run fine too. KCD 2 seems to run well.
 

I have an Odyssey G9, and RDR2 uses about 6-7 GB of VRAM on my 4080 Super at 5120x1440, everything maxed out.

Can you tell me why it's consuming less VRAM on my system than the 9 GB you see at 1440p?
 
Are you using MSAA? Could be a factor for me, since I have that set to 2x. Could be other factors, as well.
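
For what it's worth, MSAA is a plausible culprit: every multisampled render target is stored once per sample, so 2x MSAA roughly doubles those buffers. A rough sketch of the raw-target arithmetic (the two-buffer layout is invented for illustration, not RDR2's actual one):

```python
# Rough arithmetic on why MSAA changes VRAM use. Assumes a single RGBA8
# color target plus a D24S8 depth/stencil target, each stored per sample;
# real games keep many more buffers, so treat these as lower bounds.
def target_mib(width, height, samples, bytes_per_pixel=4 + 4):
    # color (4 B/px) + depth/stencil (4 B/px), multiplied by MSAA sample count
    return width * height * samples * bytes_per_pixel / 2**20

cases = {
    "2560x1440, no MSAA": (2560, 1440, 1),
    "2560x1440, 2x MSAA": (2560, 1440, 2),
    "5120x1440, no MSAA": (5120, 1440, 1),
}
for label, (w, h, s) in cases.items():
    print(f"{label}: {target_mib(w, h, s):5.1f} MiB")   # 28.1 / 56.2 / 56.2
```

5120x1440 has exactly twice the pixels of 2560x1440, so 2x MSAA at 1440p costs about as much per buffer as the wider monitor does, and in a deferred renderer that multiplier hits the whole G-buffer, not just the final color target. Combined with other settings differences, that could plausibly account for the 2-3 GB gap between the two systems.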
 
In an ideal world, all three companies would offer plenty of VRAM, perfect software implementations, and great pricing. Ideally, a single card would have the 50xx-series software and features, including DLSS and CUDA, plus Intel's encoders/decoders and an additional 4-8GB of VRAM depending on the model.

However, since we have to pick and choose, I believe the better compromise today is a 5070 Ti or 5080 with 16GB rather than an AMD or hypothetical Intel card with 20+GB of VRAM. I really don't see any extra longevity from a 20GB card if the trade-offs in other features outweigh the benefit. There will be exceptions where other cards perform better in specific cases, but if you don't need the added VRAM and the game needs an upscaler because of its current state of optimization, then more VRAM won't help; you'd want the best upscaler available, and I feel 90% of games fall into this category.

Of course, none of this matters if you can afford a 32GB 5090. :p
 
Yeah, agreed, it's just one factor, but an important one. Anyone buying has to look at it and make a choice. This generation will make very few people look to upgrade, though; it's not much different from the 4000 series in performance.

Coming from a 3080, nothing is interesting to me at current prices.

Even if I can afford a 5090, that doesn't mean I want one. The performance, yes, but the heat and price, no. Just because we have money doesn't mean we have to spend it. Money can make future money (that's my profession, too).
 