Confused b/w (3060 ti - 6700xt) and (i5 12400f - Ryzen 5 5600)
[QUOTE="variablevector, post: 2507135, member: 121190"]
I owned an RTX 3050 for more than a year before I upgraded to a 3080. I bought it at the height of the GPU shortage because my old card had failed, so I feel qualified to say that if you can get it, DLSS is absolutely worth it: it can take a 30-40 fps gaming experience to 60-70 with little to no loss of visual fidelity.

I would go so far as to say that I'd pick an Nvidia card over a corresponding AMD card with ~20% better raster performance just for DLSS.

I used it exclusively at 1080p, usually on quality mode, so an internal render resolution of 720p, and it was like magic, with only the occasional artefact and a bit of ghosting to remind me that it wasn't. Still a very easy choice. FSR, at least from what I've seen, is not remotely in the same league; it just looks like a spatial upscaler, whereas DLSS can actually add detail that isn't there. And in certain games like Cyberpunk, where the TAA implementation was really bad at least at launch, DLSS at 1080p was (and still is) superior to native 1080p.

Furthermore, Nvidia has always used a more efficient texture compression algorithm than AMD. So while Nvidia's cards are still behind AMD on VRAM capacity, in some games I've noticed Nvidia cards using up to 1 GB less VRAM at the same settings.
[/QUOTE]
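For reference, the "quality mode → 720p internal" figure above follows from DLSS's per-axis render scale. A minimal sketch (the scale factors are the commonly published per-mode values; the helper function is hypothetical, and exact scales can vary per title):

```python
# Approximate per-axis render scale for each DLSS mode
# (commonly published values; individual games may deviate).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution DLSS upscales from for a given output resolution."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output on Quality mode renders internally at roughly 1280x720,
# i.e. the "720p internal" the post mentions.
print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So at 1080p Quality the GPU is shading only about 44% of the output pixels, which is where the 30-40 fps → 60-70 fps jump comes from.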