News NVIDIA GeForce RTX 5080 Is 2x Faster Than 4080 At $999, RTX 5070 Ti 2x Faster Than 4070 Ti At $749, RTX 5070 Faster Than 4090 For $549

I am going to wait for the 5080 reviews to be out, and given the price difference between the GPUs, I am expecting Super-series cards like a 5080 Super/5080 Ti to come along.
 
TBH I don't understand the obsession with raw performance. What matters is the output quality (frame rates, frame pacing, latency and visuals). If all of that is better, then how does it matter if the frames are being rendered "artificially"?

We are already approaching the limits of transistor shrinking - to get more performance there will of course have to be a focus on other aspects of the rendering pipeline.
 
TBH I don't understand the obsession with raw performance. What matters is the output quality (frame rates, frame pacing, latency and visuals). If all of that is better, then how does it matter if the frames are being rendered "artificially"?

We are already approaching the limits of transistor shrinking - to get more performance there will of course have to be a focus on other aspects of the rendering pipeline.
Exactly. When gaming, there is minimal difference whether DLSS is on or off - mostly not noticeable unless you keep looking at the screen to find flaws instead of actually playing the game.
 
TBH I don't understand the obsession with raw performance. What matters is the output quality (frame rates, frame pacing, latency and visuals). If all of that is better, then how does it matter if the frames are being rendered "artificially"?

We are already approaching the limits of transistor shrinking - to get more performance there will of course have to be a focus on other aspects of the rendering pipeline.
The thing with artificial, AI-generated frames is that they increase latency noticeably, which was a big deal for many. With the new-gen Reflex 2, NVIDIA claims the latency is reduced, but let's wait for third-party reviews. Their claim of the 5070 performing similar to the 4090 is with multi frame generation and in specific scenarios, so I wouldn't trust their benchmarks.
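To put rough numbers on the latency point - purely an illustrative sketch with assumed figures, not NVIDIA's data: interpolation has to wait for the next real frame before it can show the generated one, so it adds roughly one native frame-time of delay on top of whatever the pipeline already has.

```python
# Back-of-the-envelope estimate of the extra display latency from
# interpolation-based frame generation. Assumed model: the generated frame
# can only be shown once the *next* real frame exists, so roughly one
# native frame-time of delay is added. Illustrative only.

def added_latency_ms(native_fps: float) -> float:
    """Approximate extra latency (ms) from holding back one native frame."""
    return 1000.0 / native_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps native -> ~{added_latency_ms(fps):.1f} ms added latency")
```

The lower the native frame rate (where frame generation is most tempting), the bigger the penalty, which matches why it felt worst in the heavier games.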
 
TBH I don't understand the obsession with raw performance. What matters is the output quality (frame rates, frame pacing, latency and visuals). If all of that is better, then how does it matter if the frames are being rendered "artificially"?

We are already approaching the limits of transistor shrinking - to get more performance there will of course have to be a focus on other aspects of the rendering pipeline.
1. We need some minimum fps to begin with, else it feels bad. FG won't do much there. It won't be an issue on the 5090, but it will be for most others (especially with max RT).
2. FG, if done well, is nice, but it's not a replacement. Basically we want motion clarity, and there are a number of ways to get that, including FG (see the quick persistence sketch after this list).
60 fps on a CRT screen is apparently smoother/clearer than, say, 240 Hz on an IPS panel, so we don't really need high fps.
BFI (black frame insertion) is another way.
Look at this - I am more excited about this - hope we will be able to run it in games some day.
3. NVIDIA wants to sell software instead of raw hardware. The way they gimp their lower-end cards means this is not about Moore's law.
When they talk about AI, Moore's law is alive, but for gamers it's dead.
4. A 1-lakh GPU with 16 GB VRAM is not good, and neither is 12 GB on the 5070. Better to wait until they are refreshed to 24 GB and 18 GB respectively.

5. Also - just want to say - HDR/OLED is a much bigger improvement in graphics than ray tracing, and it does not cost performance.
It does cost money today, but it's getting cheaper.
Those who don't have it should consider that instead of an expensive GPU.
Older games can look much better this way too.
I am playing Arkham City, and it looks much nicer in HDR.
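Going back to point 2, a rough persistence sketch with assumed numbers (not measurements): on a sample-and-hold panel each frame stays lit for the whole refresh period, so a tracked object smears across your vision, while a CRT or BFI flashes each frame for only a millisecond or two.

```python
# Why persistence matters for motion clarity (illustrative, assumed numbers).
# When the eye tracks a moving object, the perceived smear is roughly
# speed * time-the-frame-stays-visible.

def motion_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Perceived smear, in pixels, while tracking a moving object."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # assumed: object moving 960 px/s across the screen

for label, persistence_ms in [
    ("60 Hz sample-and-hold (IPS)", 1000 / 60),    # ~16.7 ms per frame
    ("240 Hz sample-and-hold (IPS)", 1000 / 240),  # ~4.2 ms per frame
    ("60 Hz CRT / BFI (~1.5 ms flash)", 1.5),
]:
    print(f"{label}: ~{motion_blur_px(speed, persistence_ms):.1f} px of smear")
```

With these assumptions the 60 Hz CRT/BFI case smears less than the 240 Hz sample-and-hold panel, which is the point: short persistence beats raw refresh rate for clarity.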
 
how does it matter if the frames are being rendered "artificially"?
The thing to understand is how the frames are generated to begin with. Frame generation uses interpolation, which is a fancy way of saying "guessing". Interpolation in any form reduces the quality of the rendered frame. A frame generated by the game engine is significantly different from a frame generated by the driver, hence it always looks worse, and because the driver has no way to respond to game input, you get fairly severe lag.
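As a toy illustration of the "guessing" (a naive midpoint blend, not NVIDIA's actual optical-flow method): if an object moves from one edge of the frame to the other, a blended in-between frame shows two faint ghosts instead of the object in the middle - exactly the kind of artefact an engine-rendered frame would never have.

```python
import numpy as np

# Naive frame interpolation as a midpoint blend of two rendered frames.
# Real frame generation uses motion vectors / optical flow, but the point
# stands: the in-between frame is estimated, not rendered by the engine.

h, w = 4, 8
frame_a = np.zeros((h, w)); frame_a[:, 0] = 1.0    # object at the left edge
frame_b = np.zeros((h, w)); frame_b[:, -1] = 1.0   # object at the right edge

guess = 0.5 * (frame_a + frame_b)   # what a naive blend produces

# An engine-rendered midpoint frame would put the object in the centre column;
# the blend instead leaves two half-strength ghosts at both edges.
print(guess)
```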

With a 40-series card the difference is already quite noticeable and severe, even if the native framerate is high enough. It can cause severe motion sickness, as the movement on screen is disjointed from your actual input (it does for me at least). I have three 40-series cards (90, Ti Super and 60) and all exhibit the same issues (in Hogwarts Legacy, for example, I find it unusable). In slower-paced games it's acceptable, but not needed if the game is well optimised.

One of the earliest uses of interpolation was in digital audio, where it was used to render output samples that were not captured during digitisation but were needed to complete the waveform. The method was the same, and early versions of digital audio were quite poor compared to their analog counterparts. FG is no different in its intent, but much more lacking because the dataset is larger and there are more variables. If it is tightly integrated into the game engine it can work, but it usually sits closer to the latter half of the pipeline (at least the first FG did).
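A minimal sketch of that audio analogy, with assumed sample rates and plain linear interpolation standing in for whatever early converters actually did: the in-between samples are guessed from the captured ones, and the guess never quite matches the true waveform.

```python
import numpy as np

# Reconstructing audio samples that were never captured by linearly
# interpolating between the ones that were. Modern resamplers use much
# better filters; this is only the crudest form of the same idea.

t_coarse = np.linspace(0.0, 1.0, 8)                 # 8 captured samples of one cycle
captured = np.sin(2 * np.pi * t_coarse)

t_fine = np.linspace(0.0, 1.0, 32)                  # the samples we wish we had
guessed = np.interp(t_fine, t_coarse, captured)     # linear "guess" in between
true = np.sin(2 * np.pi * t_fine)

print("max reconstruction error:", np.max(np.abs(guessed - true)))
```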

Now that they are moving to 4x FG and higher TOPS, we can also expect better LLM and media generation once PyTorch gets updated for the new cards. However, the smaller framebuffers mean lots of models will not fit fully in VRAM, which is why for the AI tests they had to run FP4 on the 5070 - the FP8 model is 11 GB (which means even 16 GB cards will have a tough time), is known to run at half the speed of FP4, and cannot run on the 5070 at all. This is why they hobbled the 4090 as much as possible for that test. Everybody knows the 5070 will be slower than the 4090; that is a given, the question is by how much. If you lose 25% performance and keep to a 300 W power window at half or a third the price, it will still be a great card, as long as you can live with half the framebuffer.
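Rough weights-only VRAM math behind that FP8/FP4 point (the 11 GB FP8 figure is from above; the parameter count is my assumption chosen to match it, and activations, latents and the rest of the framebuffer are ignored):

```python
# Weights-only footprint at different quantisation levels. The parameter
# count below is an assumption picked to line up with the ~11 GB FP8 figure
# quoted above, not an official model size.

params_billion = 11.5

def weights_gb(bits_per_param: float) -> float:
    """Approximate size of the weights alone, in GiB."""
    return params_billion * 1e9 * (bits_per_param / 8) / 2**30

for name, bits in (("FP16", 16), ("FP8", 8), ("FP4", 4)):
    print(f"{name}: ~{weights_gb(bits):.1f} GB of weights")
```

Under those assumptions FP8 barely squeezes into 12 GB with nothing left for working buffers, while FP4 roughly halves it, which is presumably why the 5070 demo had to be FP4.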

All of this does not excuse the fact that the numbers shown at CES are heavily gamed to fit the narrative that marketing chose to go with. A like-for-like comparison would have been better, but billion-dollar companies aren't paragons of honesty to begin with.
 
I see DLSS as a way of cheating (my take on this tech), and the good news is that used 4090s are already showing up in user listings. I do want the 5080 to beat the 4090.
Something we will see in the coming weeks.
 
I have been looking at some of the 5090 designs... and wow, most of them look hideous. Why can't these companies make a sleek-looking card like the FE? A $2,000 card with RGB and stupid graphics. Especially Zotac: https://www.zotac.com/in/product/graphics_card/zotac-gaming-geforce-rtx-5090-solid-oc

So far only the Gigabyte Aero looks like a decent card.

I'm trying to find a sleek-looking 5090 close to the $2,000 MSRP; I don't want to pay ridiculous amounts of money just for the cooler design. If somebody finds a decent-looking model, please do share. Hoping to buy one.
 
I have been looking at some of the 5090 designs... and wow, most of them look hideous. Why can't these companies make a sleek-looking card like the FE? A $2,000 card with RGB and stupid graphics. Especially Zotac: https://www.zotac.com/in/product/graphics_card/zotac-gaming-geforce-rtx-5090-solid-oc

So far only the Gigabyte Aero looks like a decent card.

I'm trying to find a sleek-looking 5090 close to the $2,000 MSRP; I don't want to pay ridiculous amounts of money just for the cooler design. If somebody finds a decent-looking model, please do share. Hoping to buy one.
That sadly seems to be the case. Wait for the ProArt and MSI Expert designs to show up. The Galax white one looks decent as well for the 5070 Ti. This time the FE looks dull and repetitive; maybe the node change will bring a fresh design.