News NVIDIA GeForce RTX 5080 Is 2x Faster Than 4080 At $999, RTX 5070 Ti 2x Faster Than 4070 Ti At $769, RTX 5070 Faster Than 4090 For $549

I've previously used framegen on my 4090 and I wasn't able to tell a difference in latency when I was actually playing. Visually, though, I could somewhat notice it if I looked for it.

That is mostly because the base frame rates were also high, so interpolation became easier for the FG model. It'll be a complete s**tshow if you reproduce the same with 5070 MFG. It's demonstrated in the following chart.
[Attached chart: frame-generation latency increase vs. base frame rate]

If your base frame rate is 72 FPS, latency increases by 42%. That's still bad, but you can look the other way for the added smoothness. But if you go from 30 to 120 FPS, latency shoots up by a whopping 425% to 0.11 seconds, leaving the game unplayable. This doesn't even address the artefact issues, which are another topic. This is why people are so mad about the 50-series performance uplift. We'd be lucky to see even a 5080 matching 4090 performance.
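
As a quick sanity check of those figures (using only the numbers quoted above, nothing measured independently):

Code:
# Back-of-the-envelope check using only the figures quoted above; not measured data.
def with_framegen(base_latency_s, pct_increase):
    """Latency after enabling frame generation, given a percentage increase."""
    return base_latency_s * (1 + pct_increase / 100)

# The 30 FPS -> 120 FPS case: ending at ~0.11 s after a 425% increase
# implies a baseline of roughly 21 ms.
implied_base = 0.11 / (1 + 425 / 100)
print(f"Implied 30 FPS baseline: {implied_base * 1000:.0f} ms")               # ~21 ms
print(f"With MFG (+425%): {with_framegen(implied_base, 425) * 1000:.0f} ms")  # ~110 ms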
 
The 5080 will NOT be matching 4090 performance though (and that makes me hella mad, but it is what it is). At best it's going to be about 10% slower than the 4090, but for the price it's not like we have anything better in the market. A 4090, even a used one, will be more expensive than a 5080 (at least the FE).
 
I've previously used framegen on my 4090 and I wasn't able to tell a difference in latency when I was actually playing. Visually, though, I could somewhat notice it if I looked for it.

For example, in Wukong the monkey's hair is spiky, and any time you change the camera angle quickly you can see the fake frames causing artifacts around the monkey's head. Any time there's a lot of fast-paced gameplay you'll see some artifacts here and there.
Interesting! So have you tried the new transformer model for FG? Has it reduced the visual artifacts?
 
Nvidia has removed the "Hot Spot" sensor data from RTX 50 GPUs.
Source: https://videocardz.com/pixel/nvidia-has-removed-hot-spot-sensor-data-from-geforce-rtx-50-gpus

Scalpers are already charging double or triple with no refunds for RTX 5090s. Look at the prices they are quoting on eBay and elsewhere. It seems both kidneys aren't enough; we need to sell the liver by bidding it to the highest bidder. When my wife's brother went for a liver transplant he paid 40 lakhs. Brokers are saying it might go for as high as 5 crores depending upon the buyer's status and demand. By 2030, to buy an RTX GPU I might need to sell my liver, who knows, because the liver will grow back but kidneys won't, right?
Source: https://videocardz.com/newz/scalpers-already-charging-double-with-no-refunds-for-geforce-rtx-5090
 
If you're serious and it's not sarcasm, that's messed up.
Yes, this was in 2014 at KIMS Hospital in Hyderabad. He was about to die and had a 2-year-old daughter, so we collected money through donations from the family and the local MLA, and the surgery was done successfully.

ASUS GeForce RTX 5090 Astral Overclocked With LN2 Over 3 GHz, Results In Almost 1000W Power Consumption

In terms of performance improvement, the ASUS GeForce RTX 5090 ROG Astral overclocked GPU achieved a 25% uplift in 3DMark Port Royal, an 18% uplift in Time Spy Extreme, and a 12% uplift in FireStrike Ultra. This is a nice improvement, but we should remember that this is from a 41% uplift in its boost clock to 3.4 GHz (almost 1 GHz higher than the stock NVIDIA GeForce RTX 5090) and the card itself was consuming around 1000W of power with the whole system's power reaching almost 1800W.

Source: https://wccftech.com/nvidia-blackwe...-astral-overclock-3-4-ghz-ln2-consumes-1000w/
 
The thing to understand is how the frames are generated to begin with. Frame generation uses interpolation, which is a fancy way of saying "guessing". Interpolation in any form reduces the quality of the rendered frame. A frame generated by the game engine is significantly different from a frame generated by the driver, hence it always looks worse, and because the driver has no way to respond to game input, you get fairly severe lag.
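
To make that concrete, here is a deliberately naive sketch of what "guessing" an in-between frame looks like. Real DLSS frame generation uses motion vectors, optical flow and a neural network rather than plain blending, so this is illustrative only, not NVIDIA's actual method:

Code:
import numpy as np

# Naive "in-between" frame from two rendered frames (H x W x 3 arrays).
# Actual frame generation is far more sophisticated, but the key point holds:
# the new frame is inferred from pixels, not rendered from game state, so it
# cannot react to input and can only guess at motion between the two frames.
def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(frame_a.dtype)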

With a 40 series card the difference is already quite noticeable and severe even if the native framerate is high enough. It can cause severe motion sickness as the movement on screen is disjointed from your actual input (it does for me at least). I have three 40 series cards (90, TiS and 60) and all exhibit the same issues (in Hogwarts Legacy, for example, I find it unusable). In slower paced games it's acceptable but not needed if the game is well-optimised.

One of the earliest uses of interpolation was in digital audio, where it was used to render output samples that were not captured during digitisation but were needed to complete the waveform. The method was the same, and early versions of digital audio were quite poor compared to their analog counterparts. FG is no different in its intent, but it is much more lacking because the dataset is larger and there are more variables. If it is tightly integrated into the game engine it can work, but it usually sits closer to the latter half of the pipeline (at least the first FG did).
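
The audio analogy is the same trick in one dimension: missing samples are filled in from their neighbours. A minimal linear-interpolation upsampler (illustrative only; real converters use more elaborate filters):

Code:
import numpy as np

# Double the sample rate of a mono signal by inserting linearly interpolated
# samples between the captured ones. The inserted values are guesses, so any
# detail that existed between two real samples is simply lost - the same
# limitation frame generation has between two rendered frames.
def upsample_2x(samples: np.ndarray) -> np.ndarray:
    midpoints = (samples[:-1] + samples[1:]) / 2.0
    out = np.empty(samples.size * 2 - 1, dtype=samples.dtype)
    out[0::2] = samples
    out[1::2] = midpoints
    return out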

Now that they are moving to 4x FG and higher TOPS, we can also expect better LLM and media generation once Torch gets updated for the new cards. However, the smaller framebuffers will mean lots of models will not fit fully in VRAM, which is why for the AI tests they had to run FP4 on the 5070 - the FP8 model is 11GB (meaning even 16GB cards will have a tough time) and is known to be half the speed of FP4, besides not being able to run on the 5070. This is why they hobbled the 4090 as much as possible for that test. Everybody knows the 5070 will be slower than the 4090. That is a given; the question is by how much. If you lose 25% performance and keep to a 300W power window at half or a third of the price, it will still be a great card as long as you can live with half the framebuffer.
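
A rough way to see why the precision choice matters for VRAM. The 12B parameter count below is my assumption (chosen because it lands near the 11GB FP8 figure above); real usage adds activations, latents and other overhead on top:

Code:
# Rough weights-only VRAM footprint by precision; the parameter count is assumed.
def weights_gib(params_billions: float, bits_per_param: int) -> float:
    return params_billions * 1e9 * bits_per_param / 8 / 1024**3

for name, bits in [("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    print(f"{name}: ~{weights_gib(12, bits):.1f} GiB for an assumed 12B-parameter model")
# FP16 ~22.4 GiB, FP8 ~11.2 GiB, FP4 ~5.6 GiB - which is why a 12-16GB card
# only fits the FP4 variant comfortably once everything else is accounted for.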

All of this does not excuse the fact that the numbers shown at CES were heavily gamed to fit the narrative that marketing chose to go with. A like-for-like comparison would have been better, but billion-dollar companies aren't paragons of honesty to begin with.
Totally agree with what you said.

I know the shortcomings of the generated frames including the hit on visuals and input latency. While still far from perfect, the improvements in both these areas with DLSS 4 can't just be ignored though. For most casual gaming scenarios, some of these stats are perfectly reasonable.

And I know this year the obvious decision was to lean more heavily towards AI because it's a priority elsewhere for them (read datacentres) and as a corporation why wouldn't they leverage economies of scale.

All I'm saying is that reality is somewhere between "fake frames are completely meaningless" and "magic ai that quadruples your framerate and makes your gameplay experience four times better".
 
One more piece of bad news - the RTX 5090 isn't the huge performance jump in native 4K that Greedia advertised.
The 5070/5070Ti are already out, so I'm unsure if we'll see a Super series for the 5070. However, I’m pretty confident we’ll see the 5080Ti or 5080 Super, as well as the 5090Ti.
How much power will the 5090 Ti take? The 5090 is already at 575W.
 
I am using an RTX 4070 for 4K gaming on my C3. Should I go for the RTX 5080 for a good 4K experience?
For 4K it should be the top model if you have the budget. If max graphics isn't what you're after, then you could wait for the AMD ones too. They should give more value IMO, just not top performance.
 
For most casual gaming scenarios, some of these stats are perfectly reasonable.
......
All I'm saying is that reality is somewhere between "fake frames are completely meaningless" and "magic ai that quadruples your framerate and makes your gameplay experience four times better".

For a 2.2 lakh GPU, casual gaming sessions aren't really a valid use case, are they?

The problem with fake frames is that marketing is trying to create an equivalence between an interpolated frame and a rendered frame. This is simply not true. The reality is that any product of interpolation is fake. Any computer scientist will tell you so, and Tim's investigation proves it conclusively. The bigger worry is the lack of efficiency improvements from architecture, with manufacturers only able to achieve gains through node shrinks and increased shader counts. The question is whether we are approaching a real silicon computing limit, or the limit of competence of our scientific computing community.

Back on topic, the additional frames add a new issue: your 'real' framerate is lowered when you hit the max refresh rate of your monitor while using a higher framegen multiplier than you need. For a 180Hz monitor, for example, and a 100FPS real framerate, you cannot run more than 2x framegen anyway because your base framerate will drop below 90FPS. Most folks will not know how to hit the correct target for an optimal experience (most gamers are not on TE or similar forums). There's also the lazy game dev issue, where there's over-dependence on these kinds of hacks to improve gameplay in poorly optimised titles.
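
The refresh-cap interaction from that 180Hz / 100FPS example can be worked out directly (a simple model that ignores Reflex caps and VRR headroom):

Code:
# Effective rendered ("base") frame rate once the display refresh caps total output.
def base_fps_when_capped(native_fps: float, fg_multiplier: int, refresh_hz: float) -> float:
    output_fps = min(native_fps * fg_multiplier, refresh_hz)
    return output_fps / fg_multiplier

for mult in (2, 3, 4):
    base = base_fps_when_capped(100, mult, 180)
    print(f"{mult}x FG on a 180Hz panel: base drops to {base:.0f} FPS "
          f"({mult - 1} generated frame(s) per rendered frame)")
# 2x -> 90 FPS, 3x -> 60 FPS, 4x -> 45 FPS: a higher multiplier quietly lowers
# the real frame rate your inputs are actually sampled at.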

DLSS 4 is overall an improvement, but MFG is *not*. There's a clear distinction we need to draw between the two. It looks a lot worse than 2x framegen because you see more 'fake' frames than real frames, up to three times as many. That's a regression.
 
And you forgot the most important part, which is the 40% uplift in cost over the FE, making it even worse in terms of frames/$.
I don't suppose anyone would be purchasing imported units from MX2. The retail units from partners will probably be a few thousand cheaper.