News NVIDIA GeForce RTX 5080 Is 2x Faster Than 4080 At $999, RTX 5070 Ti 2x Faster Than 4070 Ti At $769, RTX 5070 Faster Than 4090 For $549

NVIDIA quality control fail: Redditor’s RTX 5080 Founders Edition arrives with 5090 branding




Source: https://videocardz.com/newz/nvidia-...0-founders-edition-arrives-with-5090-branding
 
16GB of VRAM is apparently already not enough for even medium path tracing in Indiana Jones. There are already examples of the 4080/5080 running out of VRAM, even with DLSS Quality.
Yeah, it's an extreme example (or perhaps not, if even medium PT doesn't work?), but this is always how it starts.


Hell, it runs out even at DLSS Performance, as per this.

Basically, in the long run a lot of these features are only useful at (increasingly) lower resolutions, or on the 90-series cards, which don't get skimped on VRAM.
With the 3080, I already don't use RT anymore. I'd much rather get higher fps.
This is due to the Supreme texture setting, which increases the texture draw distance; lowering it to Very Ultra makes it run fine, and there is visually no difference.
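For anyone who wants to check this on their own card rather than argue from screenshots, polling NVML while the game runs shows how close you are to the limit. A minimal sketch, assuming the nvidia-ml-py package is installed; note that NVML reports allocated VRAM, which can overstate what a game strictly needs:

```python
# Poll GPU memory usage once a second via NVML.
# Assumes: pip install nvidia-ml-py (imported as pynvml), one NVIDIA GPU at index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: total / free / used
        print(f"VRAM used: {mem.used / 1024**3:5.1f} / {mem.total / 1024**3:.1f} GB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second terminal while the game is loaded; if usage sits pinned at the card's total, you're likely into texture-streaming and stutter territory.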
 
This is due to the Supreme texture setting, which increases the texture draw distance; lowering it to Very Ultra makes it run fine, and there is visually no difference.
Yeah, so this is how it starts creeping in, and soon more and more examples will come. It's the same kind of reasoning used to justify that it's enough.
One obvious trigger is whenever next-gen consoles start showing up.

It depends on price, of course. It would have been enough for, say, a 50k-60k GPU at best.
With a high-end card, we should have a large enough buffer. And why shouldn't we be able to use the largest LOD? Look at how the 1080 Ti lasted nicely because of its extra VRAM.

The same thing happened with the 3080. 10GB was enough even at 4K, with only some Far Cry game with stupidly high-res textures causing issues, or something like that.
See some old links here, with people saying that 8GB is enough, 10GB is enough, etc.
I fell for similar reasoning, that 10GB is enough, RT matters, etc. (DLSS ended up being much more important than RT for me).
I am happy with the 3080 for my own use with older games, but it does lack VRAM, and it shows often.


Anyway, I have written enough on this now as a warning. People can make up their own minds (most will ignore it, of course).
 
Yeah, so this is how it starts creeping in, and soon more and more examples will come. It's the same kind of reasoning used to justify that it's enough.
One obvious trigger is whenever next-gen consoles start showing up.

It depends on price, of course. It would have been enough for, say, a 50k-60k GPU at best.
With a high-end card, we should have a large enough buffer. And why shouldn't we be able to use the largest LOD? Look at how the 1080 Ti lasted nicely because of its extra VRAM.

The same thing happened with the 3080. 10GB was enough even at 4K, with only some Far Cry game with stupidly high-res textures causing issues, or something like that.
See some old links here, with people saying that 8GB is enough, 10GB is enough, etc.
I fell for similar reasoning, that 10GB is enough, RT matters, etc. (DLSS ended up being much more important than RT for me).
I am happy with the 3080 for my own use with older games, but it does lack VRAM, and it shows often.
I know, and I agree with you; I was just pointing out why the issue occurred and how anyone running into it could mitigate it. Rest assured, 16GB is not at all sufficient for 4K gaming; most games exceed that by quite a lot, like the new Spider-Man 2, The Last of Us Remastered, Indiana Jones, etc. And it is only going to get worse if the trend continues, which we should probably expect for another generation or so, until around 2028 when the PS6 should release, at least if the PS5 follows the PS4's seven-year cycle (PS4 in 2013, PS5 in 2020, so PS6 around 2027-2028).

Of course, if you use 10GB for 4K gaming it's going to fall short, and so will 12GB. For 1440p, 12GB is at the limit right now and 16GB is somewhat sufficient; for 4K, 16GB is at the limit and 20GB is sufficient.
 
Rest assured, 16GB is not at all sufficient for 4K gaming; most games exceed that by quite a lot, like the new Spider-Man 2, The Last of Us Remastered, Indiana Jones, etc.
OK, I didn't know about the other two. For now it is probably some extreme setting too, but in a few years or so it will probably become more common. Not good for a 1L+ GPU, IMO.
There are rumors of a 24GB 5080 in the future, after we get 3GB VRAM chips; that one will hold up much better, plus hopefully prices will come down if AMD can compete (say, close to 4080 raster at half the price, as per rumors).

If people keep picking Nvidia even after AMD gets competitive/better value with FSR4/RT, then the market is asking for bad prices and compromised GPUs like this.

which we should probably expect for another generation or so, until around 2028 when the PS6 should release, at least if the PS5 follows the PS4's seven-year cycle
I don't follow consoles, but it seems logical that Xbox should be in a greater hurry to get a new one out, since they are falling behind. Anyway, I don't know.
 
Xbox should be in a greater hurry to get a new one out, since they are falling behind
Well, to end your hopes: they are not. In fact, they are probably leaving the major console space in the next generation and going towards handhelds, either like the Steam Deck or the Switch.
after we get 3GB VRAM chips
We already have them; it's just that Nvidia did not implement them, as they came out after the spec was finalized and were also more expensive.
 
Well, to end your hopes: they are not. In fact, they are probably leaving the major console space in the next generation and going towards handhelds, either like the Steam Deck or the Switch.

We already have them; it's just that Nvidia did not implement them, as they came out after the spec was finalized and were also more expensive.
Haha, OK, good to know.
I don't care; I don't want to play on consoles at all, and never have, if we don't count NES-type ones in the '90s.

I don't think the 3GB ones are ready yet; I read in a few places that that's next year. From Google:
Mass production is planned for early next year.
 
I don't think the 3GB ones are ready yet; I read in a few places that that's next year. From Google:
Well, the 5090 laptop version is already using 3GB modules:
We know that the GeForce RTX 5090 Laptop GPU will use 3GB GDDR7 memory modules with its 24GB GDDR7, using the GB203 GPU on a 256-bit memory bus.
https://www.tweaktown.com/news/1027...r7-on-512-bit-bus-has-been-spotted/index.html

Also, Nvidia has already prepared Blackwell workstation GPUs with 3GB modules too.
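The capacity jump is simple arithmetic, by the way: each GDDR7 module sits on a 32-bit channel, so a 256-bit bus hosts eight modules, and the per-module density sets the total. A quick sketch of the relationship (the configurations in the comments reflect the rumors discussed above, not confirmed specs):

```python
# VRAM capacity from bus width and per-module density.
# GDDR6/GDDR7 modules each occupy a 32-bit channel.
def vram_capacity_gb(bus_width_bits: int, module_gb: int) -> int:
    modules = bus_width_bits // 32
    return modules * module_gb

print(vram_capacity_gb(256, 2))  # 16 -> today's 5080: 8 x 2GB
print(vram_capacity_gb(256, 3))  # 24 -> 5090 laptop / rumored 5080 refresh: 8 x 3GB
print(vram_capacity_gb(512, 2))  # 32 -> desktop 5090: 16 x 2GB
```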
 
Well, the 5090 laptop version is already using 3GB modules:

https://www.tweaktown.com/news/1027...r7-on-512-bit-bus-has-been-spotted/index.html

Also, Nvidia has already prepared Blackwell workstation GPUs with 3GB modules too.
Yes, so supply will probably be limited for those chips initially (as you said, higher prices), so why would they prioritize the 5080/5070?
Anyway, I am no expert. There are rumors of 18GB/24GB versions in the future. If they don't release, then AMD will at least have a VRAM and/or price advantage.
 
I don't think anyone here is performing tests with actual RTX 5090 or RTX 5080 cards. You're giving your take based on what you've read/seen/heard, so I think that's... OK?
Yeah, most of us will do the same: look at reviews and form an opinion based on past experience :)
Anyway, I don't really care too much about this gen. The 3080 is enough at 4K for my needs for now.
I haven't played the Fallouts, Skyrims, Dragon Ages, and Deus Exes, and there are so many more games to catch up on before I bother with the latest stuff.
 
Yeah, most of us will do the same: look at reviews and form an opinion based on past experience :)
Anyway, I don't really care too much about this gen. The 3080 is enough at 4K for my needs for now.
I haven't played the Fallouts, Skyrims, Dragon Ages, and Deus Exes, and there are so many more games to catch up on before I bother with the latest stuff.

I think you mentioned that once or twice :) Quite a few gamers upgrade every other gen.
Some who upgraded from the 3000 series and managed to get a 5080 at MSRP appear satisfied with the gains so far.
But as I said before, you do you. I heard the new Dragon Age has received a mixed reaction, or are you referring to Origins?
 
User reports of faulty Nvidia GeForce RTX 5090 and 5080 graphics cards come flooding in

I've experienced similar, and while the list of proposed fixes grows ever longer, the root cause remains unidentified.
Early adopters of Nvidia’s latest GeForce RTX 50 Series graphics cards are reporting recurring display issues, with the new hardware delivering unwanted and unexpected blank screens.

Scores of users across social media, forums, and Reddit have reportedly experienced the same dilemma, with the joy of a new graphics card quickly turning to despair.

In the interests of full disclosure, Club386 has witnessed similar behaviour during testing. While the RTX 5090 ran our battery of benchmarks without a hitch for my initial RTX 5090 Founders Edition review, subsequent RTX 5080 cards proved problematic.

Our issues ranged from blank screens to massive Windows stutter, and the intermittent nature of the fault made diagnosis all the more problematic. Uncertain of the root cause, we tried myriad fixes that included clean-installing Windows, switching from PCIe 5 to PCIe 4, and even trying different power supplies.

Today, a fully configured system from a well-respected UK system integrator has arrived at Club386 HQ, outfitted with an AMD Ryzen 7 9800X3D processor and a GeForce RTX 5080 graphics card. At first boot, the system behaves as expected. At second boot, we’re hit by the same blank screen, which leads us to believe the issue is potentially widespread.

Our internal testing is ongoing, but for the time being, forcing a PCIe slot to run at 4.0 x16 appears to be the most successful remedy, though even this doesn’t appear to guarantee success.

Online chatter has led some to suggest the bug could be a result of Nvidia’s reengineered PCIe interface, which is now detached from the main PCB, but it is too early to speculate. Users may recall that RTX 4090 and RTX 4080 faced similar black-screen issues at launch that were addressed through firmware updates for both cards. We can only hope this latest hiccup is no more sinister and can be rectified through a software patch.

Club386 has reached out to Nvidia for comment.

Source: https://www.club386.com/user-report...090-and-5080-graphics-cards-come-flooding-in/

Club386 encounters GeForce RTX 5080 black screen issue, “PCIe 4.0 fix” shows promise

Several users from China, as well as der8auer and Hardware Canucks, have raised the same issues with their RTX 5090/5080s.
No response from Nvidia yet...

Source: https://videocardz.com/newz/club386...lack-screens-issue-pcie-4-0-fix-shows-promise
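If anyone tries the PCIe 4.0 workaround, you can verify whether the BIOS setting actually took effect by reading the negotiated link state, which Linux exposes in sysfs. A minimal sketch; the PCI address below is a placeholder, so find yours with `lspci | grep -i vga` first:

```python
# Print the negotiated vs. maximum PCIe link speed/width for one device.
# PCIe 4.0 negotiates at "16.0 GT/s", PCIe 5.0 at "32.0 GT/s".
from pathlib import Path

gpu = Path("/sys/bus/pci/devices/0000:01:00.0")  # placeholder address, adjust to your GPU

for attr in ("current_link_speed", "max_link_speed",
             "current_link_width", "max_link_width"):
    print(f"{attr}: {(gpu / attr).read_text().strip()}")
```

On Windows, GPU-Z's bus interface readout shows the same information.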
 
LOLWA. The 5080 is only 65% faster than the 3080 (which is garbage for a two-generation gap), only 11% faster than the 4080, and the first 80-series card in four generations to NOT beat the previous gen's flagship; it's actually 17% SLOWER :clown:. The 3080 seems to have had the highest relative uplift of the last four generations.
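The 17% figure is at least internally consistent with the rest of the chart: if the 5080 is ~11% ahead of the 4080, and you take the 4090 as roughly a third faster than the 4080 at 4K (an assumed figure from typical review averages, not a number from this chart), the ratio works out to about 17% behind. A quick check:

```python
# Consistency check on the quoted percentages, indexed to 4080 = 100.
# The 4090-vs-4080 gap of ~33% is an assumption, not a number from the chart.
r4080 = 100.0
r5080 = r4080 * 1.11        # chart: 5080 ~11% faster than 4080
r4090 = r4080 * 1.33        # assumed: 4090 ~33% faster than 4080
r3080 = r5080 / 1.65        # chart: 5080 ~65% faster than 3080

print(f"5080 vs 4090: {r5080 / r4090 - 1:+.1%}")  # about -16.5%, i.e. ~17% slower
print(f"5080 vs 3080: {r5080 / r3080 - 1:+.1%}")  # +65.0% by construction
```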

 
I heard the new Dragon Age has received a mixed reaction, or are you referring to Origins?
All of them. I played ME but never got to DA. Currently playing ME: Andromeda (so far it's fun if you manage expectations, as the writing is mediocre; plus very good out-of-the-box HDR).
The Bioshocks, the Shadow Warriors too, etc.

Compared to a GPU upgrade, a proper OLED with perfect blacks and HDR is so much better, IMO (of course you can do both). Very impactful.
 