[…] have arrived, and coupled with a new testbed and a revised test suite, not to mention new drivers and a host of other changes, our initial benchmarks […]

The headline addition — according to Nvidia, at least — is DLSS4 with Multi Frame Generation (MFG), an AI-based technology that offers further “performance” improvements over DLSS3 and framegen. But there are other changes as well.
Those who think this is OK, or an outlier, should look at something similar from a few years back (I just found it accidentally).
Back then I saw people hoping that some future tech would make streaming easier, so high bandwidth would compensate, etc. Similar faulty logic, in hindsight.
Even after all this, the 3080 somehow outperforms the 6800 XT at 4K to this day. 8GB cards are definitely not feasible anymore, but 10GB and up still seem fine, with a few exceptions. Obviously you won’t be maxing out all settings at 4K with a 3080 in the first place; you’d want to run optimized settings, in which scenario the 3080 pulls ahead overall.
There is a lot of fearmongering about VRAM, and in some cases rightfully so (like 8GB cards priced at 30k+), but in this scenario I don’t think it’s valid.
Nope, no fearmongering. It’s the same as having less RAM: when you run out, you can get into trouble. With excess VRAM, we can increase texture quality, draw distance, etc. (natively or via mods), even if performance has to be scaled down in future games.
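A rough illustration of why texture quality eats VRAM so quickly — back-of-the-envelope numbers only, not from any specific game. Assuming uncompressed RGBA8 textures (4 bytes per texel) and a full mip chain (which adds roughly a third on top of the base level):

```python
# Back-of-the-envelope VRAM cost of a single uncompressed texture.
# Assumptions (illustrative only): RGBA8 = 4 bytes per texel;
# a full mip chain adds ~1/3 on top of the base mip level.

def texture_vram_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel
    # Geometric series 1 + 1/4 + 1/16 + ... converges to 4/3.
    return base * 4 // 3 if mipmaps else base

MIB = 1024 * 1024
print(texture_vram_bytes(4096, 4096) / MIB)  # one 4K texture: ~85 MiB
print(texture_vram_bytes(2048, 2048) / MIB)  # half resolution: ~21 MiB
```

So stepping a texture tier up from 2K to 4K quadruples its footprint, and a scene uses hundreds of textures — which is why a few GB of headroom disappears fast when you push texture quality or draw distance. Real games use block compression (BC1–BC7), which cuts these numbers by 4–8x but doesn’t change the scaling.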
This will vary from game to game. But I did face VRAM issues with The Witcher 3 next-gen at 1440p, and had to use a mod to fix it.
I also wanted to increase the draw distance in Blood and Wine (beyond the in-game maximum) but couldn’t, because of VRAM.
People keep making this mistake. Anyway, I don’t care that much. For now I only play older games, and whenever I buy my next GPU, I won’t be buying one with just-enough VRAM.
I have made this mistake twice.
Of course, there will be plenty of games that run fine too. KCD 2 seems to run well.
In an ideal world, all three companies would offer plenty of VRAM, perfect software implementations, and great pricing. Ideally, a card would have 50xx-series software and features, including DLSS, CUDA, Intel’s encoder/decoders, and an additional 4-8GB of VRAM depending on the model.
However, since we have to pick and choose, I believe the better compromise is to get a 5070 Ti or 5080 with 16GB rather than an AMD or hypothetical Intel card with 20+GB of VRAM today. I really don’t see any extra longevity from a 20GB card if the trade-offs in other features outweigh the benefit. There will be exceptions where other cards perform better in specific cases, but if you don’t need the added VRAM and the game requires an upscaler due to the current state of optimization, then more VRAM won’t help. In that scenario, you’d want the best upscaler available, and I feel 90% of games fall into this category.
Of course, none of this matters if you can afford a 32GB 5090.
Yeah, agreed, it’s just one factor, but an important one. Anyone buying now has to weigh it and make a choice.
But this gen will make very few people look for upgrades. It’s not much different from the 4000 series in performance.
Coming from a 3080, nothing is interesting to me at current prices.
Even if I can, that doesn’t mean I want one. Performance, yes; heat and price, no. Just because we have money doesn’t mean we have to spend it.
Money can make future money (that’s my profession, too).