Nvidia DLSS4, MFG, and full ray tracing tested on RTX 5090 and RTX 5080

This is how Blackwell delivers its biggest “gains.”

Nvidia’s new GeForce RTX 5090 and GeForce RTX 5080 have arrived, and coupled with a new testbed and a revised test suite, not to mention new drivers and a host of other changes, our initial benchmarks had to gloss over a few areas. One of the biggest selling points for the Blackwell RTX 50-series GPUs — according to Nvidia, at least — is DLSS4 with Multi Frame Generation (MFG), an AI-based technology that offers further “performance” improvements over DLSS3 and frame generation. But there are other changes as well.

DLSS4, MFG & RTX tested in 5 games below:


Hilarious for a ₹1-1.6 lakh GPU.

Those who think this is okay, or an outlier, should look at something similar from a few years back (I just found it accidentally).
Back then, some people were hoping that future tech would make asset streaming easier, so high bandwidth would compensate for limited VRAM, etc. Similar faulty logic, in hindsight.

https://linustechtips.com/topic/1439392-do-any-games-currently-use-more-than-10gb-vram-even-at-max-settings/

And some bs from Nvidia.

Anyway, my warnings are done now.

Even after all this, the 3080 somehow still outperforms the 6800 XT at 4K to this day. 8GB cards are definitely not feasible anymore, but 10GB and up still seem fine, with a few exceptions. Obviously you won't be maxing out all settings at 4K with a 3080 in the first place; you'd want to run optimized settings, in which case the 3080 pulls ahead overall.

There is a lot of fearmongering about VRAM, in some cases rightfully so (like 8GB cards priced at 30k+), but in this scenario I don't think it's valid.

RDR2, natively maxed out at 1440p, uses 9 GB of VRAM on my 3080. And at 4K it only needs 4-6 GB? Sure.

In recent testing, the 3080 is closer to a 6900 XT in raster performance these days.

I just quoted from that ResetEra link.
Searching on this, they did say it:

nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered

Nope, no fearmongering. It's the same as having less RAM: when you run out, you can get into trouble. With excess VRAM, we can increase texture quality, draw distance, etc. (natively or with mods), even if performance has to be scaled down in future games.
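A quick way to check whether you're actually running out is to poll `nvidia-smi`. The query flags below are real `nvidia-smi` options, but the sample output here is hardcoded so the sketch runs anywhere, and the parsing helper is my own illustration, not an official API:

```python
import csv
import io

def parse_nvidia_smi(csv_text):
    """Parse the CSV output of:
    nvidia-smi --query-gpu=name,memory.used,memory.total --format=csv
    """
    reader = csv.reader(io.StringIO(csv_text.strip()))
    header = [h.strip() for h in next(reader)]
    gpus = []
    for row in reader:
        rec = dict(zip(header, (c.strip() for c in row)))
        used = int(rec["memory.used [MiB]"].split()[0])    # "9216 MiB" -> 9216
        total = int(rec["memory.total [MiB]"].split()[0])
        gpus.append({"name": rec["name"], "used_mib": used,
                     "total_mib": total, "headroom_mib": total - used})
    return gpus

# Hardcoded sample so this runs without a GPU; the numbers are made up.
sample = """name, memory.used [MiB], memory.total [MiB]
NVIDIA GeForce RTX 3080, 9216 MiB, 10240 MiB"""
print(parse_nvidia_smi(sample))  # headroom_mib comes out to 1024
```

With only ~1 GB of headroom in this made-up example, an allocation spike (streaming, alt-tabbing, overlays) can push you over the edge, which is exactly the "trouble" described above.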

This will vary from game to game, but I did face VRAM issues with The Witcher 3 next-gen at 1440p and had to use a mod to fix that.
I also wanted to increase the draw distance in Blood and Wine (beyond the game's max) but couldn't, because of VRAM.

People keep making this mistake. Anyway, I don't care that much. For now I only play older games, and whenever I buy my next GPU, I won't be buying one with just-enough VRAM.
I have made this mistake twice.
Of course, there will be plenty of games that run fine too. KCD 2 seems to run well.

Sorry, wasn’t directed at you. It was directed at Nvidia’s BS claims in the article.


Yeah, I know, but I thought I'd clarify because I hadn't checked myself.


I have an Odyssey G9, and RDR2 uses about 6-7 GB of VRAM on my 4080 Super at 5120x1440, everything maxed out.

Can you tell me why it's consuming less VRAM on my system compared to yours?

Are you using MSAA? That could be a factor for me, since I have it set to 2x. Could be other factors as well.
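For a rough sense of why MSAA matters here, a back-of-the-envelope sketch (raw color + depth targets only; real engines allocate many more multisampled intermediate buffers, and GPUs compress them, so treat this strictly as a lower-bound illustration):

```python
def framebuffer_bytes(width, height, samples, bytes_per_sample=8):
    """Rough size of one color (RGBA8, 4 B) + depth/stencil (4 B) target.

    Deliberately simplified: ignores compression and the extra
    per-pass multisampled buffers a real renderer allocates.
    """
    return width * height * samples * bytes_per_sample

MIB = 2**20
# 2560x1440 with 2x MSAA vs. 5120x1440 with no MSAA:
print(framebuffer_bytes(2560, 1440, 2) / MIB)  # 56.25 MiB
print(framebuffer_bytes(5120, 1440, 1) / MIB)  # 56.25 MiB
```

The raw targets come out identical, so 2x MSAA at 1440p roughly matches the double-wide 5120x1440 buffers; the multi-GB gap between the two systems would then come from MSAA's extra multisampled intermediate buffers, texture streaming pools, and other settings rather than the main framebuffer alone.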


In an ideal world, all three companies would offer plenty of VRAM, perfect software implementations, and great pricing. Ideally, a card would have 50xx-series software and features, including DLSS, CUDA, Intel’s encoder/decoders, and an additional 4-8GB of VRAM depending on the model.

However, since we have to pick and choose, I believe the better compromise is to get a 5070 Ti or 5080 with 16GB rather than an AMD or hypothetical Intel card with 20+GB of VRAM today. I really don’t see any extra longevity from a 20GB VRAM card if the trade-offs in other features outweigh the benefit. There will be exceptions where other cards perform better in specific cases—if you don’t need the added VRAM and the game requires an upscaler due to the current state of optimization, then more VRAM won’t help. In that scenario, you’d want the best upscaler available and I feel 90% of games come under this category.

Of course, none of this matters if you can afford a 32GB 5090. :face_with_tongue:

Yeah, agreed, it's just one factor, but an important one. Someone who has to buy has to look at it and make a choice.
But this generation will make very few people look for upgrades. It's not much different from the 4000 series in performance.

Coming from a 3080, nothing is interesting to me at current prices.

Even if I can, that doesn't mean I want one. Performance, yes, but heat and price, no. Just because, say, we have money doesn't mean we gotta spend it.
Money can make future money (that's my profession too).


These are my usual figures in RDR2: 1440p, everything maxed out, no upscaling, MSAA 2x.