bssunilreddy
Keymaster
NVIDIA quality control fail: Redditor’s RTX 5080 Founders Edition arrives with 5090 branding
Source: https://videocardz.com/newz/nvidia-...0-founders-edition-arrives-with-5090-branding
This is due to the Supreme texture setting, which increases the draw distance; lowering it to Very Ultra makes it run fine, and there is visually no difference. 16GB of VRAM is apparently already not enough for even medium path tracing in Indiana Jones. The 4080/5080 already has examples of running out of VRAM, even with DLSS Quality.
Yeah, it's an extreme example (or perhaps not, if even medium PT doesn't work?), but this is always how it starts.
Hell, even at DLSS Performance, as per this.
Basically, in the long run a lot of these features are only useful at (increasingly) lower resolutions, or for the 90-series, which doesn't get crappy VRAM amounts.
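Since the posts above keep referencing DLSS Quality/Performance at 4K, here is a quick sketch of what those modes actually render internally. This is a rough illustration using the commonly documented DLSS scale factors; the helper name is mine, not from any NVIDIA API:

```python
# Approximate per-axis render scale for each DLSS mode (publicly documented
# values; not an official NVIDIA API, just illustrative arithmetic).
DLSS_SCALES = {
    "quality": 2 / 3,           # ~66.7% per axis
    "balanced": 0.58,           # ~58% per axis
    "performance": 0.5,         # 50% per axis
    "ultra_performance": 1 / 3, # ~33.3% per axis
}

def dlss_internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

# 4K output with DLSS Performance renders internally at roughly 1080p,
# while Quality renders at roughly 1440p:
print(dlss_internal_res(3840, 2160, "performance"))  # → (1920, 1080)
print(dlss_internal_res(3840, 2160, "quality"))      # → (2560, 1440)
```

So "4K with DLSS Performance running out of VRAM" really means a ~1080p internal render still overflows the buffer, which is why it reads as an alarming data point in the thread.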
With the 3080, I already don't use RT anymore. I'd much rather get higher fps.
Yeah, so this is how it starts creeping in, and soon more and more examples will come. It's the same kind of reasoning used to justify that it's enough.
As for the RTX 5080 and 4080 Super, with the current public build and without forcing DLSS4 through the Nvidia App, they both run out of VRAM and effectively fail to work at 4K with quality upscaling and framegen. (I didn't check if DLSS Transformers helped, but I suspect not.) The 5080 just locked up the game completely and we had to kill the process manually, while the 4080 Super dropped to a slideshow-like 18 FPS.
That's another "thanks but no thanks" to frame generation with the current public release of the game, if you're keeping track.
I know, and I agree with you. I was just pointing out why the issue had occurred and, if anyone was running into it, how they were supposed to mitigate it. Rest assured, 16GB is not at all sufficient for 4K gaming; most games exceed that by quite a lot, like the new Spider-Man 2, The Last of Us Remastered, Indiana Jones, etc., and it is only going to get worse if the trend continues. We should probably expect it to continue for another generation or so, until around 2028 when the PS6 will release, at least if the PS5 follows the PS4 release cycle.
One obvious trigger is whenever next gen consoles start showing up.
It depends on price, of course. It would have been enough for, say, a 50k-60k GPU at best.
With high-end cards, we should have a large enough buffer. And why not be able to use the largest LOD? Look at how the 1080 Ti lasted nicely because of its extra VRAM.
The same thing happened with the 3080. 10GB was enough even at 4K, with only some Far Cry game with stupidly high-res textures causing issues, or something like that.
See some old links here, with people saying that 8GB is enough, 10GB is enough, etc.
I fell for similar reasoning that 10GB is enough, plus RT, etc. (DLSS ended up being much more important than RT for me.)
I am happy with the 3080 for my own use with older games, but it does lack VRAM, and it's showing often.
OK, I didn't know about the other two. For now it might be some extreme setting too, but in a few years it will probably become more common. Not good for a 1L+ GPU, IMO.
I don't follow consoles, but it seems logical that Xbox should be in a greater hurry to get a new one since they are falling behind. Anyway, I don't know.
Well, to end your hopes: they are not. In fact, they are probably leaving the major console space in the next generation and going towards handhelds, like the Steam Deck or the Switch.
We already have them; it's just that NVIDIA did not implement them, as they came out after the spec was finalized and were also more expensive.
Haha, OK, good to know.
Mass production is planned for early next year.
"I don't think the 3GB ones are ready yet; I read in a few places that that's next year." Well, the 5090 laptop version is already using 3GB modules:
https://www.tweaktown.com/news/1027...r7-on-512-bit-bus-has-been-spotted/index.html
"We know that the GeForce RTX 5090 Laptop GPU will use 3GB GDDR7 memory modules with its 24GB GDDR7, using the GB203 GPU on a 256-bit memory bus."
Yes, so supply will probably be limited for those chips initially (as you said, higher prices); why would they prioritize the 5080/5070?
Also, NVIDIA has already prepared Blackwell workstation GPUs with 3GB modules too.
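The 24GB laptop figure quoted above follows directly from the bus arithmetic. A back-of-the-envelope sketch (assuming each GDDR chip sits on a standard 32-bit slice of the bus; the helper name is mine):

```python
def vram_capacity_gb(bus_width_bits: int, module_gb: int,
                     chip_interface_bits: int = 32) -> int:
    """Total VRAM = number of chips on the bus x capacity per chip."""
    chips = bus_width_bits // chip_interface_bits
    return chips * module_gb

# GB203 on a 256-bit bus with 3GB modules -> 8 chips x 3GB = 24GB,
# matching the RTX 5090 Laptop figure quoted from TweakTown.
print(vram_capacity_gb(256, 3))  # → 24
# The same 256-bit bus with today's 2GB modules gives the 5080's 16GB:
print(vram_capacity_gb(256, 2))  # → 16
```

This is why 3GB modules matter for the rumored 18GB/24GB variants: a 192-bit card goes from 12GB to 18GB, and a 256-bit card from 16GB to 24GB, without any change to the bus.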
Anyway, I am no expert. There are rumors of 18GB/24GB versions in the future. If they don't release, then AMD will at least have a VRAM and/or price advantage.
I don't think anyone here is performing tests with actual RTX 5090 or RTX 5080 cards. You're giving your take based on what you've read/seen/heard, so I think that's... ok?
Yeah, most of us will do the same: look at reviews and form an opinion based on past experience.
Anyway, I don't really care too much about this gen. The 3080 is enough at 4K for my needs for now.
I haven't played the Fallouts, the Skyrims, the Dragon Ages, Deus Ex, and so many more games I have to catch up on before I bother with the latest stuff.
"I heard the new Dragon Age has received a mixed reaction, or are you referring to Origins?" All of them. I played ME but never got to DA. Currently playing ME: Andromeda (so far it's fun if we manage expectations, as the writing is mediocre; plus very good out-of-the-box HDR).
The Bioshocks and Shadow Warriors too, etc.