Is it worth buying the 9070 XT now?

Can anyone who is currently using it share their experience with 1440p gaming and the card's ray tracing capabilities?

Did a quick run of the built-in benchmark in Cyberpunk 2077 at 1440p. Settings:
  • Preset: RT Overdrive (i.e. Path Tracing enabled)
  • Upscaling: FSR 3 Quality -> FSR 4 Quality (using OptiScaler)
  • Film Grain and Chromatic Aberration turned off
Attachment: Cyberpunk 2077_2025.05.15-08.59.jpg (benchmark result screenshot)


Take note that I am running my 9070 XT at -30% total power, so a card at stock power should perform better. ~43 FPS seems okay, and you can turn on frame generation for a smoother presentation.

I personally play at 4K, and this card isn't fast enough for 4K + Path Tracing even at FSR Performance, but regular RT modes work well. I also find 4K FSR Performance to be much more temporally stable than 1440p FSR Quality.
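
For context on that last point, here's a quick sketch (assuming the standard FSR per-axis scale factors of 1.5x for Quality and 2.0x for Performance) showing that 4K Performance actually renders at a higher internal resolution than 1440p Quality, which is likely why it holds up better temporally:

```python
# Per-axis render scale factors for FSR quality modes (standard published values).
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the internal render resolution for a given output resolution and FSR mode."""
    factor = FSR_SCALE[mode]
    return round(width / factor), round(height / factor)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> 4K Performance
print(internal_res(2560, 1440, "Quality"))      # (1707, 960)  -> 1440p Quality
```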

Let me know if there are any other specific benchmarks you'd want me to run.
 
This is what I wanted to see. Thank you so much for taking the time to post this.
 
Check out UltraPlus on Nexus Mods; the mod offers both performance and visual improvements for path tracing and might give you just enough to reach 50-60 FPS.
Never mind, there seems to be an issue for 9070 XT users. Ignore.
 
For those running regular 9070 XTs, what temps do you get for both core and memory? Also, would an 80% power limit (PL) be enough to bring temps down to a manageable level?
 
Seeing some ASUS models at suspiciously low prices on Shweta Computers. A non-XT 9070 for 59.3K, 9070 XT for 66.5K, and even a 5070 Ti for 82K!

There's a couple-of-months-old thread on TE itself with multiple people reporting that the website is legit: https://techenclave.com/threads/is-shwetacomputers-com-pc-shopping-site-trusted.225427/. The prices sound too good to be true, especially as ASUS models are typically more expensive, but perhaps someone interested could message them and check.

Funnily enough, I moved back from a 70% PL to an 81% PL to regain some performance. Here are my temps after a 5-run loop of the Metro Exodus Enhanced Edition benchmark:
  • Core Temp: 64 °C
  • Hot Spot Temp: 80 °C
  • Memory Temp: 88 °C (!)
  • Fan Speed: 1581 RPM
Attached a screenshot of HWiNFO as well; a small log-parsing sketch follows the attachment for anyone who prefers raw numbers. I get the feeling the GPU is programmed to allow memory temps to reach ~90 °C. My ambient temps are ~10 °C lower with the recent rains, but GPU temps seem about the same while the fan speed is lower; I had seen over 2K RPM when benchmarking last month during the peak of summer.
 

Attachment: hwinfo-metro-tpb19.png
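
If anyone wants to compare numbers without squinting at screenshots, HWiNFO can also log sensors to a CSV file. Here's a rough sketch of pulling peak values out of such a log; the file name and column headers below are hypothetical placeholders, so adjust them to whatever your log actually contains.

```python
import csv

LOG_FILE = "hwinfo_log.csv"                  # hypothetical log file name
COLUMNS = [                                  # hypothetical header names:
    "GPU Temperature [°C]",                  # adjust these to match your log
    "GPU Hot Spot Temperature [°C]",
    "GPU Memory Temperature [°C]",
    "GPU Fan Speed [RPM]",
]

peaks: dict[str, float] = {}
with open(LOG_FILE, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        for col in COLUMNS:
            try:
                value = float(row[col])
            except (KeyError, TypeError, ValueError):
                continue                     # column missing or cell not numeric
            peaks[col] = max(peaks.get(col, value), value)

for col, peak in peaks.items():
    print(f"max {col}: {peak:.0f}")
```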
Yes, so you will need to change the fan curve to get lower temps; at 2K RPM you will see lower temps than before (sketch below).
Since fans are easier to replace than the GPU, I run mine slightly higher, ~65-70% in my case, and get lower temps.
Memory temps in the high 80s seem to be normal, so it should be fine.
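
For anyone unsure what "change the fan curve" looks like in practice, here's a minimal sketch of a curve as plain temperature-to-duty interpolation. The points are made-up examples in the spirit of running the fans a bit higher than stock; set the real curve in AMD Software or your vendor tool rather than copying these numbers.

```python
# Hypothetical custom fan curve: (temperature °C, fan duty %) points with
# linear interpolation in between. Tune the points to your own card and case.
CURVE = [(40, 30), (60, 45), (70, 65), (80, 80), (90, 100)]

def fan_duty(temp_c: float) -> float:
    """Return the fan duty (%) for a given GPU temperature, following CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

for t in (55, 64, 75, 88):
    print(f"{t} °C -> {round(fan_duty(t))} % fan")
```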
 
FSR 3.1 is very close to DLSS now, and they keep updating and improving it. If I were playing on an NVIDIA card I would obviously still run DLSS, but for AMD cards FSR is a godsend. And remember, it runs on all GPUs, so if DLSS in a particular game is causing you issues (ghosting/trails, artifacts, etc.), FSR saves the day.
It's not, just not, not since the DLSS 4.0 Transformer model release. AMD still needs a couple of generations to even match NVIDIA on upscaling, but it is indeed a very viable card now.

P.S. Which games are giving you ghosting with DLSS and not with FSR? No, really, I want to know.
If you want RT, go for a second-hand NVIDIA card. Ray Reconstruction alone is essential in Cyberpunk IMO; RT on AMD is just serviceable.
 
I think for me the game with DLSS ghosting was Silent Hill 2, but I'm not sure as it's been some time now; I just remember noticing weird stuff happening while using DLSS that disappeared when I switched to FSR. Here's a video showing ghosting with DLSS in Silent Hill 2, and going by the comments section it seems like it got fixed by a later patch. I think it's the implementation of specific DLSS versions (same for FSR) combined with particular games that can have issues like the one I described; it's not true for all games or for DLSS/FSR in general. It's good that we have multiple options now, including XeSS, which honestly looks alright these days: the differences aren't noticeable while playing, and you only realise them when you take screenshots and compare pixels.

Also, my stance on frame generation is slightly more positive today than it used to be. I tried frame gen in Horizon Forbidden West and was amazed by how smooth it was, without glitches or anything, while giving more FPS (although some fake ones). The only things that are noticeable are the input delay, which is expected, and sometimes inconsistency in the frames as they dip from time to time depending on the area. The dipping can be resolved, or at least mitigated, by using a separate dedicated GPU for frame gen, but the input delay is here to stay (rough numbers below).
If you are satisfied with having slightly more input delay, or are one of those who didn't/couldn't notice it, then frame gen is amazing.
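
To put rough numbers on that trade-off, here's a back-of-the-envelope sketch (my own simplified model, not vendor-measured figures): interpolated frame generation roughly doubles the presented frame rate, but input response stays tied to the base render rate, plus about one held-back frame.

```python
# Rough model of 2x interpolated frame generation (simplified assumption:
# the newest real frame is held back by about one base frame time).

def frame_gen_estimate(base_fps: float, fg_factor: float = 2.0) -> dict:
    base_frame_ms = 1000.0 / base_fps
    return {
        "base_frame_ms": round(base_frame_ms, 1),         # what your inputs still "feel" like
        "presented_fps": round(base_fps * fg_factor, 1),  # what the motion looks like
        "added_latency_ms": round(base_frame_ms, 1),      # crude: one held-back frame
    }

print(frame_gen_estimate(43))  # ~43 FPS base, like the path-traced run earlier in the thread
print(frame_gen_estimate(90))  # at a higher base rate the penalty is much smaller
```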
 
Ah, the one you linked was comparing 3.5 vs 3.7, and 3.7 did solve some of the ghosting issues we were seeing. You don't need to wait for a game patch to upgrade DLSS; simply swap the DLL and you are fine. I have been running Zero Dawn on 4.0 for months now, DLSS Performance easily gives me 150+ FPS, and that game is a treat at that frame rate.
As for frame generation: as long as you are not sensitive to input lag, FG is a game changer for sure. I don't have a 40xx-series card, so I don't have first-hand experience with it, but TBH my 3060 Ti on DLSS Performance is still chugging along pretty nicely. I'll stick with it unless AMD gets up to par or NVIDIA finally gets its pricing right, and I personally prefer less visual fidelity over input lag. I tried FG on a 40xx card once, and while the input lag was barely noticeable, for me it was still there.
 
Yeah, I don't remember the exact DLSS version. It's nice that you can just swap the DLL file and don't have to wait for devs to implement it, but isn't that only true for DLSS versions within the same major version? For example, you can update DLSS 3.5 to 3.7 but not 3.5 to 4.1 (the numbers are just examples). That's why I didn't bother trying to update it.
Horizon Forbidden West is much more demanding than Zero Dawn. The first (tutorial) area is good performance-wise while also looking pretty, but as soon as I reached the next one the FPS started to dip, from 90 FPS to 60-65 FPS. Classic case of only optimising the major areas, lol.
 
Nah, you can. With the 4.x release, NVIDIA added an option in their GeForce Experience app where you can force DLSS 4.0. As for swapping DLLs, you can swap in whatever version you want, be it 3.xx or 4.xx; the only difference is that you need DLSSTweaks if you want to force presets (see the DLL-swap sketch after this post).

As for Forbidden West, it really shouldn't dip that much. What are your specs? Even on my 3060 Ti build with maxed settings I easily get 60-70s with DLSS Quality. As for FPS drops, Forbidden West gets way more demanding as soon as the area opens up; that's expected.
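
For anyone who hasn't done the manual swap before, here's a minimal sketch of what it amounts to: back up the game's nvngx_dlss.dll and drop a newer one in its place. The paths below are hypothetical examples; some games keep the DLL in a subfolder rather than next to the executable.

```python
# Minimal sketch of a manual DLSS DLL swap. Paths are hypothetical examples;
# point them at your actual game folder and at the newer nvngx_dlss.dll you sourced.
from pathlib import Path
import shutil

game_dir = Path(r"C:\Games\HorizonZeroDawn")     # hypothetical install location
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # the newer DLSS DLL to drop in

target = game_dir / "nvngx_dlss.dll"
backup = target.with_name(target.name + ".bak")  # nvngx_dlss.dll.bak

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)                 # keep the original for rollback
shutil.copy2(new_dll, target)                    # swap in the newer version
print(f"Replaced {target} (backup at {backup})")
```

Rolling back is just copying the .bak file over the swapped one; forcing a specific preset is where DLSSTweaks (or the driver-level override mentioned above) comes in.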
 
I should install the app. It's just called the NVIDIA App now, right? As far as I know, GeForce Experience is deprecated now. I previously had it installed but uninstalled it because I didn't like having two pieces of software doing the same thing (NVIDIA Control Panel and the app). Time to give it another try next time I update the driver (I'm kinda cozy with the current older one, ver. 566.36, as I heard the newer drivers are unstable).

My PC has a 5700X3D with an RTX 3070 and 16 GB RAM. Planning to increase the RAM to 32 GB as that's the sweet spot these days, but I'm not really playing games that much anymore; I've played just 2 hours of Forbidden West so far.
 
Download the NVIDIA App and just keep that one.

Yes, you can force DLSS 4 in most games via the app.