News NVIDIA GeForce RTX 50 Series GPUs Already Facing Shortages, Prices For RTX 5090 & RTX 5080 Have Doubled In Some Regions

bssunilreddy

Keymaster

Miscommunications Between NVIDIA & its Board Partners are Reportedly Causing Shortages in GeForce RTX 50 GPUs


The NVIDIA GeForce RTX 5090 and 5080 are reportedly in short supply, causing shipment delays and even price increases in some regions.
Since the reveal of the NVIDIA GeForce RTX 50 series GPUs, several reports have suggested that it could be difficult to secure an RTX 50 GPU at launch.

A few days ago, we reported that, due to the extremely limited availability of higher-end RTX 50 GPUs, particularly the RTX 5090, it would be hard to buy one at MSRP. More reports have now confirmed this, with the GPUs extremely scarce in markets worldwide.

Benchlife reports that there have been miscommunications between NVIDIA and its board partners, which will leave the market with limited stock of RTX 50 series GPUs. This includes the RTX 5090 and RTX 5080, but the RTX 5070 GPUs might also be affected. As per the sources, availability is only going to improve around the Spring Festival; until then, there is no guarantee that customers can get one of these GPUs at MSRP.

A similar report has emerged from another source, UDN, which claims that due to an acute shortage of RTX 5090 and 5080 inventory, prices have surged significantly in some regions (specifically China), with the GPUs now retailing at up to twice their MSRPs. That means the RTX 5090 could retail at up to $4,000 and the RTX 5080 at $2,000. The AIBs, including ASUS, Gigabyte, and MSI, supposedly stand to benefit a lot from this price increase.

This seems like one of the worst-case scenarios for a launch, and it will probably persist for a couple of weeks or even months before these GPUs appear at their official MSRPs. The RTX 5090 already carries a high $1,999 price tag, with AIB custom editions going even higher. Both the RTX 5090 and RTX 5080 launch on 30th January, and some custom editions have already been listed at retailers.

EU GPU prices in particular are going to be higher due to premium taxes and, considering how bad the availability of these GPUs is, don't expect them to return to normal within a few days.
As far as the NA market is concerned, UDN reports that one US dealer received only 20 RTX 5080 units for the first month and no RTX 5090 units at all. This scarce supply could last until the end of Q1 or even into Q2, quite unlike the launch of the RTX 40 series GPUs.

Source: https://wccftech.com/nvidia-geforce...es-ahead-of-launch-rtx-5090-5080-price-surge/
 
Glad to see NVIDIA hasn't changed its business practices: hype up a product, then artificially keep its stock low so that people either buy at high prices (more profit) or resort to 40 series GPUs, letting NVIDIA clear its excess stock.
 
Should be a fun couple of months watching people melt down after selling their 40-series or 30-series card before they get their hands on a 50-series card :snaphappy:
 
Reviews show rasterization performance is not that great, and power consumption vs the 40 series is bad as well. I will continue to use my 4070 Ti till the 60 series offers some actual gains.
 
This, imo, will stabilize much faster than the previous two generations. These cards (a) are not even released yet and (b) don't offer any real fps-per-dollar advantage.
 
This has mainly happened because of this, and it's why EVGA left. Nvidia itself is also to blame, as it gave the AIBs the chance to capitalize on pricing, especially on the 5080, due to the large gap in the lineup's pricing, which I think they left intentionally to fill with Ti/Super variants later down the line.

I have a heavy suspicion that this generation of cards is going to last much longer than any other generation, maybe even over 3 years, mainly because neither TSMC nor Samsung is able to get good yields on their 3nm and 2nm nodes. TSMC is faring much better, but it is still not enough to meet demand, especially given the rising demand from the automotive sector and the AI boom. Samsung is in such a bad spot that it might have to outsource production of its 3nm-based Exynos 2500 to TSMC.


 
Last edited:
I'm not convinced this generation will last longer; in fact, I think the opposite might be true. The 5090 arrived after more than two years and can essentially be seen as a 4090 Ti, given its underwhelming raster performance. There are also rumors that the next GPU, Rubin, might be ahead of schedule with a target release in 2026. While this is purely speculation for now, I’m hoping we’ll get more clarity during Nvidia's GTC in March.
 
Last edited:
I feel the same; it feels like Pascal to Turing all over again.
 
And this is on top of a tepid last gen, which did not move price/perf much at most price points. So, more or less two nothing gens with small gains.
There's no clear upgrade path from a 3080 that combines better price/perf with a meaningful perf jump.
 
I was planning to build my PC after 10 years with the 5090 GPU, but now I’m unsure what to do with this disappointing generation. 4090 owners definitely got lucky, buying at the right time and enjoying a long and worthwhile run with it.

I’ve decided to postpone building my PC until GTC, hoping Nvidia provides some clarity about their next lineup. Given the cost involved, I’m not willing to invest unless it’s for the long term.
 
Last edited:
I am in the same boat. (Couldn't help but reply, that's how pissed off I am!)
 
I spent so much time carefully selecting my computer parts, constantly going back and forth and making countless changes, only for Nvidia to ruin it. Looking back now, I feel like I wasted my time and made a fool of myself.

I’m so frustrated.
 

RTX 5090 vs RTX 4090 tested at same power level: RTX 50 flagship shows efficiency downgrade vs previous gen


Nvidia's latest flagship GPU, the RTX 5090, has a 28% higher TDP than its predecessor, the RTX 4090. This 28% increase in TDP, together with a healthy increase in CUDA cores, makes the RTX 5090 around 27% faster than the RTX 4090. However, the gain is much smaller when both cards use the same amount of power.

Reviews of the RTX 5090 are out and, for folks who want the best performance regardless of cost and power consumption, Nvidia has created a monster in the RTX 5090. From rasterization to ray tracing, the RTX 5090 delivers unmatched performance at 4K. Throw in DLSS 4 and you have a GPU that can do 4K gaming at above 200 FPS in titles that support Nvidia’s Multi-Frame Generation tech.
That said, the RTX 5090 isn't as impressive a jump in performance as the RTX 4090 was over the RTX 3090. In our testing, the GPU is, on average, 27% faster than the RTX 4090. However, this 27% gain comes with a significant increase in power consumption, as the RTX 5090 is rated at 575 W vs the RTX 4090's 450 W.
This begs the question: What does the performance difference between the RTX 5090 and the RTX 4090 look like if both cards are normalized for power? Thankfully, ComputerBase has done the testing, and the results are quite interesting.

RTX 5090 vs RTX 4090 at 450 W


According to ComputerBase, the RTX 5090 manages 17% more FPS on average than the RTX 4090 in rasterization workloads when both cards are limited to 450 W. In other words, the RTX 5090 loses about 8% of its stock performance when drawing 125 W less power.
This 17% performance bump at 450 W is also quite interesting, since the RTX 5090 packs 33% more CUDA cores than the RTX 4090. As such, one would expect the performance difference to be noticeably greater than the measured 17% when the RTX 5090’s power consumption advantage is removed. Granted, the lower GPU clocks of the RTX 5090 (113 MHz less) are undoubtedly negatively affecting the overall performance.
All things considered, it seems that Nvidia is brute-forcing its way to a decent lead for the RTX 5090 over the RTX 4090. However, this doesn't bode well for the GPU's efficiency, as the RTX 5090 appears to scale linearly with TDP (27% more performance for 28% more energy). For reference, the RTX 4090 had a >50% advantage over the RTX 3090 at a 29% higher TDP.
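To make that scaling claim concrete, here is a quick back-of-the-envelope check using only the figures quoted above (the 27%/575 W and >50%/450 W numbers); this is simple ratio arithmetic, not ComputerBase's own data:

```python
# Back-of-the-envelope efficiency check using the figures quoted above.
# Assumption: the relative-performance inputs are the article's averages
# (RTX 5090 = 1.27x an RTX 4090 at stock, RTX 4090 = ~1.50x an RTX 3090).

def perf_per_watt_ratio(rel_perf, tdp_new_w, tdp_old_w):
    """Relative perf-per-watt of the newer card vs. the older one."""
    return rel_perf / (tdp_new_w / tdp_old_w)

# RTX 5090 (575 W, 1.27x) vs RTX 4090 (450 W)
blackwell = perf_per_watt_ratio(1.27, 575, 450)

# RTX 4090 (450 W, ~1.50x) vs RTX 3090 (350 W)
ada = perf_per_watt_ratio(1.50, 450, 350)

print(f"5090 vs 4090 perf/W: {blackwell:.3f}")  # ~0.99: essentially flat
print(f"4090 vs 3090 perf/W: {ada:.3f}")        # ~1.17: a real efficiency gain
```

The first ratio landing at roughly 1.0 is exactly what "scales linearly with TDP" means, while the second shows the genuine efficiency jump the previous generation delivered.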
[Chart: RTX 5090 vs RTX 4090 average FPS at matched power limits (ComputerBase)]


Source: https://www.notebookcheck.net/RTX-5...iency-downgrade-vs-previous-gen.952084.0.html
 
  • Like
Reactions: k660

My response in TL;DR: there's not much point for 30 series (higher-tier) or 40 series holders to upgrade until Nvidia actually moves to a new process node. And old Nvidia stock isn't going to come down in price either. (I swear, so many curse words are coming to mind, that's how frustrated I am!)
 
Because of this, I expect used 40 series cards (and higher-end 30 series) to still hold their value well.
 
RTX 5090 vs RTX 4090 tested at same power level: RTX 50 flagship shows efficiency downgrade vs previous gen
I don't get what the downgrade is, though. Is it performance at iso-power? If so, the 5090 shows better results than the 4090 at both 450 W and 400 W, according to the chart linked above. Is it performance per unit of chip area? That might be a possibility. However, performance does not increase linearly with the number of parallel processing cores: the 4090 has 68% more cores than the 4080 (16384 vs 9728), yet it's "only" 26% faster, while the 5090 packs 33% more cores than the 4090 and is ~30% faster, which is actually better scaling than the 4000 series showed. As to why performance does not scale linearly, a very basic explanation is Amdahl's law (https://en.wikipedia.org/wiki/Amdahl's_law), and possibly more factors such as shared memory access, caches, etc. play a part.
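As a toy illustration of the Amdahl's law point (the parallel fraction here is a made-up figure, not measured from any GPU):

```python
# Toy illustration of Amdahl's law: a non-parallelizable fraction of the
# workload caps the gain you get from adding more execution units.

def amdahl_speedup(parallel_fraction, n):
    """Speedup with n-way parallelism when only `parallel_fraction`
    of the workload benefits from the extra units."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# Treat 33% more CUDA cores as n = 1.33, and assume (purely hypothetically)
# that 90% of a frame's work scales with core count: the expected gain ends
# up noticeably below the naive +33%.
gain = amdahl_speedup(0.90, 1.33)
print(f"speedup with 33% more cores, 90% parallel: {gain:.3f}x")
```

The exact number depends entirely on the assumed parallel fraction; the point is only that sub-linear core scaling is expected, not evidence of a regression.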

While 4090 -> 5090 is not as big an upgrade as the previous gen was, I feel the efficiency losses are overblown and mostly sensationalized headlines for clicks. Of course, if someone runs another test at 350 W (or lower) and at any point the 4090 does exceed the 5090 in performance at iso-power, we can indeed say that power efficiency has gone down. I don't know when that will be possible, though, as Nvidia has locked the power target slider to a minimum of 70% (~400 W).
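For reference, the arithmetic behind those last two sentences, using only the numbers already cited in this thread:

```python
# Quick numbers behind the iso-power argument above.
# Inputs are from figures quoted earlier in the thread, not new measurements.

iso_power_ratio = 1.17        # 5090 vs 4090, both capped at 450 W (ComputerBase)
power_floor_w = 0.70 * 575    # lowest allowed power target: 70% of the 575 W TDP

# At a shared 450 W cap the 5090 is still ahead, so efficiency at that
# iso-power point has not regressed.
print(f"5090 lead at 450 W: {(iso_power_ratio - 1) * 100:.0f}%")

# The ~400 W floor is why a 350 W head-to-head isn't currently possible.
print(f"5090 minimum power target: ~{power_floor_w:.0f} W")
```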