> Are Nvidia's advantages actually worth sacrificing 4GB of VRAM?

Nah, unless RT really matters to you (not that the 3060 Ti is good at RT), just get the 6700 XT.
Which CPU + GPU combo would be the best between these? Both used GPUs are going for 18k.
> Yes, the 6700 XT or 6750 XT are an easy pick over the 3060 Ti, 3070, 3070 Ti or 4060 Ti because of the 12GB VRAM and the value they offer at 35k. Pick the Asus model, else Sapphire. My GPU is a 3070, FYI.

I haven't been following the PC market for years now, so I'd be glad if you could answer a few questions. I thought DLSS made these worth picking over AMD cards? Hasn't it become a standard option in the majority of AAA games? I know FSR 3 is coming out (or has it already?), but it still doesn't match DLSS, right?
I picked the i5 12400 for my rig because of the iGPU; I don't want any downtime if the dGPU has issues.
Mention your overall budget & fill the questionnaire.
> I haven't been following the PC market for years now, so I'd be glad if you could answer a few questions. I thought DLSS made these worth picking over AMD cards? […]

Yes, DLSS is indeed superior to FSR 3 (which is already out). However, none of the 30-series cards below the 3080 are worth it, mostly because of VRAM limitations. The 6700 XT/6750 XT are better buys, and better priced than the Nvidia equivalents (unless you're going used). Plus, AMD has also implemented its own version of frame generation (Fluid Motion) across the 6000 series as well, alongside the 7000 series.
> I haven't been following the PC market for years now, so I'd be glad if you could answer a few questions. I thought DLSS made these worth picking over AMD cards? […]

Having used both DLSS 2.x and FSR 2.x in games, I usually can't see a difference. Game to game there can be upscaling artefacts; I've seen them in both. From online comparisons, DLSS surely has slightly better image quality. IMO neither is worth it for a 1080p target resolution, because sub-1080p rendering doesn't have enough pixels for great upscaling. So 1440p DLSS/FSR quality is the lowest you can use (unless you're on a 15" laptop, where the smaller screen makes sub-1080p rendering less of an issue than on a 24"+ monitor).
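To put rough numbers on the sub-1080p point: upscaler presets are defined by a per-axis render scale, so "quality" mode at roughly 2/3 scale renders internally at 720p for a 1080p target. A quick sketch; the exact factors vary a bit between DLSS and FSR versions, so treat them as approximations:

```python
# Approximate per-axis render scale for common upscaler presets.
# Exact values differ slightly between DLSS and FSR releases.
PRESETS = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(target_w, target_h, preset):
    """Internal render size before the upscaler reconstructs to the target."""
    s = PRESETS[preset]
    return round(target_w * s), round(target_h * s)

# A 1080p target on quality mode renders from just 720p...
print(internal_resolution(1920, 1080, "quality"))   # (1280, 720)
# ...while a 1440p target still gives the upscaler ~960p of pixels
print(internal_resolution(2560, 1440, "quality"))   # (1707, 960)
```

Which is the whole argument above in two lines: at 1080p the reconstruction starts from very little data, at 1440p it has noticeably more to work with.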
> Having used both DLSS 2.x and FSR 2.x in games, I usually can't see a difference.

FSR is terrible at lower quality settings, whereas DLSS holds up much better.
> Regarding frame generation, those are not actual frames, so if you are already getting 60+ fps natively, it will be great to make it even smoother with FG and make it seem like 100fps. It is getting better with time, but the RTX 40xx GPUs are terrible value at this price, and DLSS 3.x alone is not enough to justify a terrible-value card like the 4060 or 4060 Ti.

FSR frame gen can work very well. I have only tested it in one game (Witcher 3) with a 3080 through a mod + DLSS upscaling, but it makes things smoother, I don't feel any lag, and there are very few artifacts, only in extreme cases. Well worth the trade-off for me. It also goes against Nvidia's BS about needing extra hardware for frame gen. I was skeptical about this earlier, and yes, frame numbers may mean little here, but it looks to be worth it and hopefully will get better with time.
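The "not actual frames" trade-off can be put in rough numbers. Interpolation-based frame generation (the FSR 3 / DLSS 3 style) has to hold back the newest real frame so it can interpolate between it and the previous one, so it adds roughly one real-frame interval of latency while doubling displayed fps. A toy model, not how the drivers actually schedule frames:

```python
def frame_gen_numbers(real_fps, generated_per_real=1):
    """Rough model of interpolation-based frame generation.

    The interpolator must buffer one real frame before it can insert
    in-between frames, so the added latency is about one real-frame
    interval; the generated frames only raise perceived smoothness.
    """
    displayed_fps = real_fps * (1 + generated_per_real)
    added_latency_ms = 1000 / real_fps
    return displayed_fps, added_latency_ms

fps, lat = frame_gen_numbers(60)
print(fps, round(lat, 1))  # 120 16.7
```

That ~17ms figure at a 60fps base is why FG feels fine when you already have a high native framerate, and why it gets progressively worse the lower the real fps is.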
> idk about you guys, but FSR and DLSS are both terrible at 1080p, and both the 3060 Ti and 6700 XT are 1080p cards. But I'm interested to know if they look alright at 1440p?

Depends on the game. DLSS quality at 1440p is fine vs native, but I instead use DLSS balanced/performance + DLDSR from 4K, which looks better to me.
> Depends on the game. DLSS quality at 1440p is fine vs native, but I instead use DLSS balanced/performance + DLDSR from 4K, which looks better to me.

I normally use the same combination with my 3080 as well.
There is too much blur these days. It was terrible in Metro Exodus until I did this and also used AMD CAS. AMD cards should be able to use CAS without mods, so that's a plus for AMD too.
At 1080p, yeah, I tried Control with DLSS quality and it was very soft. I didn't realize for some time that it was DLSS making the graphics look crappy. But another game looked fine.
> I used it at 1080p exclusively, usually on quality mode. So an internal render resolution of 720p, and it was like magic. With the occasional artefact and ghosting to remind me that it wasn't, but still a very easy choice. FSR, at least from what I've seen, is not remotely in the same league. It just looks like a spatial upscaler. DLSS can actually add detail that is not there. And in certain games like Cyberpunk, where the TAA implementation, at least at launch, was really bad, DLSS 1080p was/is superior to native 1080p.

This is game-dependent. I get that it's good enough in Cyberpunk even at 1080p, but it was horrible in Control: quite blurry, and disabling it worked much better for me at 1080p. The general opinion of many reviewers (who must have played many more games) is that upscaling in general does not work well at 1080p.
> This is game-dependent. I get that it's good enough in Cyberpunk even at 1080p, but it was horrible in Control […]

I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. I believe that was a launch title for DLSS and used an earlier version, which could explain why it was bad. But I know for a fact it works really well in Death Stranding, Guardians of the Galaxy (extremely underrated game and a great DLSS implementation), Baldur's Gate 3, Hogwarts Legacy (took it from barely playable with frequent dips at high 1080p on the 3050 to a smooth 40-60 fps) and Alan Wake II. In fact, I've never had to turn it off since getting a 3000-series card. And with the RTX 3050 being an underpowered card, DLSS is crucial to why it can still be a good choice for 1080p gaming at the right price. Something many reviewers missed when it launched.
Of course, having DLSS as an option if we move up to 1440p is also well worth considering.
Anyway both are decent cards.
Right now, VRAM usage has gone back down in new games. So we could say 8GB is still borderline enough for now, but it's on much shakier ground vs 12GB (or 11GB, if you want to call it that).
I am frequently hitting VRAM limits in Witcher 3 next-gen with how I'm playing on the 3080.
I tried a mod to increase the draw distance of vegetation/detail and crashed soon enough; I had to scale it back a bit.
Reduced texture quality will look worse, too.
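Some back-of-the-envelope numbers on why texture quality is usually the first thing to give: a single 4K texture layer is large, and dropping one mip level quarters its footprint. A rough sketch; real engines use block compression and stream mips, so actual figures vary:

```python
def texture_mib(width, height, bytes_per_texel, full_mip_chain=True):
    """Approximate VRAM footprint of one texture, in MiB.

    A full mip chain adds roughly one third on top of the base level.
    """
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if full_mip_chain else base
    return total / (1024 * 1024)

# Uncompressed RGBA8 at 4K vs BC7 block compression (~1 byte/texel)
print(round(texture_mib(4096, 4096, 4)))  # ~85 MiB
print(round(texture_mib(4096, 4096, 1)))  # ~21 MiB
# Dropping to the 2K mip quarters the footprint
print(round(texture_mib(2048, 2048, 1)))  # ~5 MiB
```

Multiply the per-texture figure by the hundreds of textures resident in a modern open-world scene and the 8GB vs 12GB gap stops looking academic.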
RT performance IMO is not very relevant here, because both cards are slow at it; I think the 3080 is probably the bare minimum. And RT itself can take up more VRAM.
It's unfortunate that Nvidia has been so skimpy with VRAM. Hopefully AMD catches up with RT and an AI-based FSR next generation to get more competitive.
RT is the bottleneck now; that is where the focus should be.
> I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. […]

OK, I have played only two games at 1080p before upgrading my monitor; in the first game it worked well, and in the second it didn't.
> The new AMD Fluid Motion thing is a great example of the AMD mindset. More frames! But only when you're basically standing still?? And not recommended in fps games according to reviewers! It's a gimmick.

Have you tried it? It works very well. I am playing Witcher 3 with it + DLSS via a mod on a 3080.
> OK, I have played only two games at 1080p before upgrading my monitor; in the first game it worked well, and in the second it didn't.

Yeah, DLSS absolutely kicks ass at 1440p. The difference between it and native is even more slight; I usually have to pixel-peep and compare screenshots to even see a difference. Can you imagine playing without it at any high resolution? That's why I can't bring myself to buy an AMD card or even recommend one to people. AI upscaling is absolutely the future, and FSR is just not there yet.
At 1440p, DLSS seems good vs native too. But usually that's because native isn't very good: TAA and similar techniques apparently use a lower resolution and then add detail from nearby frames.
Even with DLSS, I don't really like how blurry games can be these days, almost always in motion and sometimes even when still (Metro Exodus Enhanced).
I played Watch Dogs 1 with some mods and MSAA, and that was nice and sharp.
So I use DLDSR to set a higher initial resolution and then use DLSS, and somehow it works. So yeah, DLSS/DLDSR and the like are almost essential for me in many games.
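For anyone curious what that combination actually renders at: DLDSR factors (1.78x, 2.25x) multiply the pixel count, so each axis scales by the square root of the factor, and DLSS then renders internally at a fraction of that enlarged target. A quick sketch of the 1440p + DLDSR 2.25x + DLSS performance path:

```python
import math

def dldsr_target(display_w, display_h, factor):
    """DLDSR factors are pixel-count multipliers: each axis scales by sqrt."""
    s = math.sqrt(factor)
    return round(display_w * s), round(display_h * s)

def dlss_internal(target_w, target_h, linear_scale):
    """DLSS renders at linear_scale of the target on each axis."""
    return round(target_w * linear_scale), round(target_h * linear_scale)

target = dldsr_target(2560, 1440, 2.25)   # (3840, 2160): a 4K render target
internal = dlss_internal(*target, 0.5)    # (1920, 1080): performance mode
print(target, internal)
```

Net effect: the game renders around 1080p, DLSS reconstructs to 4K, and DLDSR downsamples that to the 1440p panel, which is plausibly why it can look sharper than plain DLSS quality at 1440p (whose internal render is only ~960p).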
On the AMD side there is CAS, which can also be nice. Credit to AMD for open-sourcing it; it's available to Nvidia users through ReShade.
> Have you tried it? It works very well. I am playing Witcher 3 with it + DLSS via a mod on a 3080.

You're talking about FSR 3. I was talking about AFMF, the generic solution that works at the driver level. It does not have access to motion vectors from the game engine, so it's much worse.
I had my doubts, but frame gen is very useful too, and I don't have any particular issues with motion in this game (I've heard the mod works well on Cyberpunk as well).
It's certainly not a gimmick. The UI is clean (I had to use DLDSR 2.25x instead of a lower DLDSR resolution, as it affected the UI), artifacts are rare, and I don't feel any latency issues. Well worth the trade-off for smoother frames.
> I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. […]

I'm gonna be gaming at 1080p, so I don't think DLSS or FSR will matter, since they won't look good there. And Nvidia's 40-series cards are capable of DLSS frame gen, while FSR 3 has come out and works on almost every GPU, even iGPUs, and can produce similar results.
DLSS at 1080p absolutely works. But it depends on what your reference is. Is it better than playing at 720p? Absolutely. Is it better than FSR? Nearly always. FSR never manages to convince you that you're looking at anything more than a clever sharpening filter. DLSS will do things like reconstruct a wire fence texture that wasn't even visible at 720p, and even compensate for low texture resolution in many places. Imagine doing this kind of upscaling with FSR lol!
Nvidia definitely gimped all the 3000 series cards, and the 4000 series even more so, with the low vram. They're pretty greedy and need to be disrupted. I desperately want Intel (XeSS looks better than FSR imo and I'm excited for Battlemage) or AMD to give them some real competition. But AMD has consistently dropped the ball on the software side of things. FSR is not a real competitor to DLSS. It's great if you have an old card, but no one should seriously be comparing FSR to DLSS as if they're equivalent technologies. The new AMD Fluid Motion thing is a great example of the AMD mindset. More frames! But only when you're basically standing still?? And not recommended in fps games according to reviewers! It's a gimmick.
> In the video, he got much worse latency (an extra ~20ms), and his fps dips drastically each time he moves his cursor at a speed appropriate for an fps game. So it's probably a nifty trick for an MMO or an isometric-perspective game, if you don't mind a big latency penalty, but I probably wouldn't even bother, because those games usually don't need the fps boost in the first place. The use case for AFMF is tiny.

It will need time; hopefully there will be use cases where it's useful. These things seem to get released early and fixed over time.
> Coming to FSR 3, I tried the mod for Cyberpunk that uses FSR frame generation combined with DLSS. It made full path tracing at 1440p doable on the 3080, as long as you were okay with lots of ghosting. Not sure how that compares to DLSS 3. Frame generation definitely has potential, but here again you have the classic case of Nvidia locking DLSS 3 to the 40-series cards because they claim the optical flow analysis can't be done on the 30 series, etc. They can get away with that sort of thing because DLSS has no real competitors at the moment. Hopefully FSR 3 gets a lot better with time.

I haven't tried it in Cyberpunk, but it's close to perfect in Witcher 3: no ghosting as such, at least none that I notice. You can get some artifacts and flickering over objects, but rarely.