News NVIDIA GeForce RTX 5080 Is 2x Faster Than 4080 At $999, RTX 5070 Ti 2x Faster Than 4070 Ti At $749, RTX 5070 Faster Than 4090 For $549

NVIDIA quality control fail: Redditor’s RTX 5080 Founders Edition arrives with 5090 branding




Source: https://videocardz.com/newz/nvidia-...0-founders-edition-arrives-with-5090-branding
 
16 GB of VRAM is apparently already not enough for even medium path tracing in Indiana Jones. The 4080/5080 already has examples of running out of VRAM, even with DLSS Quality.
Yeah, it's an extreme example (or perhaps not, if even medium PT doesn't work?), but this is always how it starts.


Hell, even at DLSS Performance, as per this.

Basically, in the long run a lot of these features are only useful at (increasingly) lower resolutions, or on the 90 series; everything below it gets crappy VRAM.
With my 3080, I already don't use RT any more. I'd much rather get higher fps.
This is due to the Supreme texture setting, which also increases draw distance; lowering it to Very Ultra makes the game run fine, and there is visually no difference.
 
This is due to the Supreme texture setting, which also increases draw distance; lowering it to Very Ultra makes the game run fine, and there is visually no difference.
Yeah, so this is how it starts creeping in, and soon more and more examples will come, with the same kind of reasoning used to justify that it's enough.
One obvious trigger will be whenever next-gen consoles start showing up.

It depends on price, of course. It would have been enough for, say, a 50k–60k GPU at best.
At the high end, we should have a large enough buffer. And why shouldn't we be able to use the largest LODs? Look at how the 1080 Ti lasted so nicely because of its extra VRAM.

The same thing happened with the 3080. 10 GB was "enough" even at 4K, with only some Far Cry game with stupidly high-res textures causing issues, or something like that.
See some old links here, with people saying that 8 GB is enough, 10 GB is enough, etc.
I fell for similar reasoning, that 10 GB was enough, plus RT, etc. (DLSS ended up being much more important than RT for me.)
I am happy with the 3080 for my own use with older games, but it does lack VRAM, and it's showing often.


Anyway, I have written enough on this now as a warning. People can make up their own minds (most will ignore it, of course).
 
Yeah, so this is how it starts creeping in, and soon more and more examples will come, with the same kind of reasoning used to justify that it's enough.
One obvious trigger will be whenever next-gen consoles start showing up.

It depends on price, of course. It would have been enough for, say, a 50k–60k GPU at best.
At the high end, we should have a large enough buffer. And why shouldn't we be able to use the largest LODs? Look at how the 1080 Ti lasted so nicely because of its extra VRAM.

The same thing happened with the 3080. 10 GB was "enough" even at 4K, with only some Far Cry game with stupidly high-res textures causing issues, or something like that.
See some old links here, with people saying that 8 GB is enough, 10 GB is enough, etc.
I fell for similar reasoning, that 10 GB was enough, plus RT, etc. (DLSS ended up being much more important than RT for me.)
I am happy with the 3080 for my own use with older games, but it does lack VRAM, and it's showing often.
I know, and I agree with you; I was just pointing out why the issue had occurred and how anyone running into it could mitigate it. Rest assured, 16 GB is not at all sufficient for 4K gaming; many games exceed it by quite a lot, like the new Spider-Man 2, The Last of Us remastered, Indiana Jones, etc. It is only going to get worse if the trend continues, and we should probably expect it to for another generation or so, until around 2028 when the PS6 releases, at least if the PS5 follows the PS4's release cycle.

Of course, if you use 10 GB for 4K gaming it's going to fall short; even 12 GB will. For 1440p, 12 GB is at the limit right now and 16 GB is somewhat sufficient; for 4K, 16 GB is at the limit and 20 GB is sufficient.
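Taking those tiers as a rule of thumb, here's a minimal sketch of them as a lookup (the thresholds are the poster's estimates from this thread, not benchmark data; the function and names are mine for illustration):

```python
# Rough VRAM rule of thumb from the post above:
#   1440p: 12 GB at the limit, 16 GB somewhat sufficient
#   4K:    16 GB at the limit, 20 GB sufficient
VRAM_TIERS_GB = {
    "1440p": {"at_limit": 12, "sufficient": 16},
    "4k": {"at_limit": 16, "sufficient": 20},
}

def vram_verdict(resolution: str, vram_gb: int) -> str:
    """Classify a card's VRAM for a target resolution per the tiers above."""
    tier = VRAM_TIERS_GB[resolution.lower()]
    if vram_gb >= tier["sufficient"]:
        return "sufficient"
    if vram_gb >= tier["at_limit"]:
        return "at the limit"
    return "insufficient"

print(vram_verdict("4k", 16))     # a 16 GB 4080/5080 at 4K -> "at the limit"
print(vram_verdict("1440p", 10))  # a 10 GB 3080 at 1440p  -> "insufficient"
```

So by the thread's own numbers, a 16 GB 5080 lands right at the limit for 4K, which matches the Indiana Jones examples above.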