NVIDIA GeForce RTX 50 Series GPUs Already Facing Shortages, Prices For RTX 5090 & RTX 5080 Have Doubled In Some Regions

I don't get what the downgrade is, though. Is it performance at iso-power? If so, the 5090 shows better results than the 4090 at both 450 W and 400 W, according to the chart linked above. Is it performance per unit of chip area? That might be a possibility. However, performance does not increase linearly with the number of parallel processing cores. The 4090 has +68% cores over the 4080 (16384 vs 9728), yet it's "only" 26% faster. The 5090 packs +33% cores over the 4090 and is ~30% faster, which is better scaling than the 4000 series managed. As for why performance does not scale linearly, a very basic explanation is Amdahl's law (https://en.wikipedia.org/wiki/Amdahl's_law), and more factors such as shared memory access, caches, etc. likely play a part as well.
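A quick way to see how Amdahl's law produces this kind of sub-linear scaling, as a sketch: the parallel fraction below (99.99%) is purely illustrative, picked to show the effect, not a measured property of these GPUs.

```python
def amdahl_speedup(p, n):
    """Speedup over one core for a workload whose parallel fraction is p,
    run on n cores (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

def relative_speedup(p, n_base, n_new):
    """Predicted speedup of n_new cores relative to n_base cores."""
    return amdahl_speedup(p, n_new) / amdahl_speedup(p, n_base)

# Core counts from the post: 4080 (9728) -> 4090 (16384), i.e. +68% cores.
# Even with an illustrative parallel fraction of 99.99%, Amdahl's law
# predicts only ~+26% -- close to the observed gap.
print(relative_speedup(0.9999, 9728, 16384))   # ~1.26
# Perfect scaling (p = 1.0) would deliver the full +68%.
print(relative_speedup(1.0, 9728, 16384))      # ~1.68
```

Even a tiny serial fraction eats most of the core-count advantage at these scales, which is why the 5090's smaller +33% core bump can scale comparatively better.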

While 4090 -> 5090 is not as big of an upgrade as the previous generation, I feel the efficiency losses are overblown and mostly sensationalized headlines for clicks. Of course, if someone does another test at 350 W (or lower) and at any point the 4090 exceeds the 5090 in performance at iso-power, we can indeed say that power efficiency has gone down. I don't know when that will be possible, though, as NVIDIA has locked the power target slider to a minimum of 70% (~400 W).
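For reference, the ~400 W figure follows directly from that slider limit, assuming the RTX 5090's rated 575 W power limit (my assumption; check your card's actual TGP):

```python
# Arithmetic behind the "~400 W" minimum, assuming a 575 W rated limit.
rated_power_w = 575     # assumed RTX 5090 total graphics power
min_slider = 0.70       # lowest power target NVIDIA exposes

min_power_w = rated_power_w * min_slider
print(round(min_power_w, 1))   # 402.5 -> still above a 350 W test point
```

So an iso-power comparison at 350 W simply isn't reachable through the stock power target controls.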
The entire point is that it's more of a 4090++.
 
The fact that they stuck with the same process node and slapped on deceptive marketing with fake frames ("5070 = 4090", LOL) is what got most people riled up. This could easily have passed as a 4090 Ti.

Well, they just lost $600 billion in market cap. Karma finally bit them in the ass for their greed.
 
Agreed on most points, which is why I had refrained from commenting before. I was interested in the efficiency regression claims, which just seem like a big nothingburger.
 