DLSS 3 Frame Generation on Turing and Ampere Cards!



But interesting news, sadly Nvidia might still lock us out in a more tightly coupled way.
 



But interesting news, sadly Nvidia might still lock us out in a more tightly coupled way.
Fixed, thanks! ;) And yeah, NVIDIA has to sell those bulky Lovelace cards, so if they care about profits they surely will lock it down. Still, if they repeat what they did when they allowed ray tracing on GTX cards, maybe Frame Generation on Ampere cards can work as a selling proposition for Lovelace cards.
 
Although there are severe FPS drops and instability, as mentioned, the FPS jumped from around 40 to 80!
Still, it's too early to say what will become of those FPS spikes.
FPS drops and an FPS jump sound contradictory in the same sentence. One doesn't doubt that the algorithm will run on older cards, since at the end of the day it is image interpolation; the bigger question is whether the hardware in older cards can manage it consistently in real time, which seems not to be the case.
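To put "consistently in real time" in numbers, here is a quick back-of-the-envelope sketch in Python. The figures are purely illustrative (based on the 40 to 80 FPS claim above) and assume one generated frame inserted between each pair of rendered frames:

```python
# Rough frame-time arithmetic behind a 40 -> 80 FPS interpolation claim.
# Numbers are illustrative, not measured.

base_fps = 40
rendered_frame_time_ms = 1000 / base_fps      # 25.0 ms per rendered frame

target_fps = 80
display_interval_ms = 1000 / target_fps       # 12.5 ms between displayed frames

# With one generated frame between each pair of rendered frames, the
# interpolation work (optical flow + warp + blend) must finish well inside
# that 12.5 ms window, on top of the card's normal rendering load.
print(f"rendered frame every {rendered_frame_time_ms:.1f} ms")
print(f"displayed frame every {display_interval_ms:.1f} ms")
```

If the interpolation step occasionally overruns that window on older hardware, you get exactly the pattern reported here: a higher average FPS with stutters and drops.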
 
One doesn't doubt that the algorithm will run on older cards, since at the end of the day it is image interpolation; the bigger question is whether the hardware in older cards can manage it consistently in real time, which seems not to be the case.
Why lock the feature out in that case through software control? A better alternative would have been to keep it off globally for older-generation cards that can technically support it, and to offer a toggle in the Nvidia settings for users who know what they are doing. If the performance really isn't impressive, people won't use it, and Nvidia would get to be on the good side of gamers for once while still advertising how much better RTX 4xxx cards are at this.
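The opt-in gating proposed above can be sketched in a few lines. This is hypothetical pseudocode to illustrate the policy; none of these names correspond to any real Nvidia driver API:

```python
# Hypothetical sketch of the proposed gating policy.
# Architecture names are illustrative; no real NVIDIA API is used here.

SUPPORTED_ARCHS = {"ada"}                      # officially validated
CAPABLE_ARCHS = {"ada", "ampere", "turing"}    # hardware could technically run it

def frame_generation_enabled(arch: str, user_opt_in: bool) -> bool:
    """On by default where validated; opt-in where merely capable."""
    if arch in SUPPORTED_ARCHS:
        return True           # default-on for the supported generation
    if arch in CAPABLE_ARCHS:
        return user_opt_in    # default-off, user may toggle it on
    return False              # hardware cannot run it at all

print(frame_generation_enabled("ampere", user_opt_in=True))   # True
print(frame_generation_enabled("ampere", user_opt_in=False))  # False
```

The point of the sketch: a capability check plus a user toggle is all the "software control" this would take, rather than a hard generation lock.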

Artificially locking things out makes you question companies like Nvidia, which has a very bad track record of being pro-consumer. Take AMD as an example: FSR 1.0 was really bad at scaling sub-1080p resolutions. Did they lock the feature out? Nope! Do most AMD users use FSR at sub-1080p resolutions? I don't think many do! But AMD is still praised for supporting the feature on older GPUs, and deservedly so. Ultimately FSR 1 has become a really good option for Steam Deck users, so it has found its audience, which is great!

BTW, this is the original comment where a Reddit user claimed to have DLSS 3 running.
 

Not defending Nvidia here, as it is a shi**y company and has always tried to keep DLSS exclusive because it knows it is better than the other upscaling techniques. However, it is also evident that DLSS 3 requires a lot more processing from the tensor cores, and older cards will definitely struggle. Backward compatibility usually limits development, not to mention the support that game developers and Nvidia would have to provide when it doesn't work properly on older hardware. If it actually works with some tinkering, then that is a win-win: no official support has to be provided, and gamers can still try it out.

AMD has always used open standards to compete whenever it was the underdog, which does benefit the industry. At the same time, they are using generic GPU image processing and likely wouldn't have gone with an open standard if they had dedicated hardware for it. Also, FSR 1.0 is a generic algorithm that just needs to be switched on, unlike the more complicated upscaling techniques that require specific integration into the game engine.
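To illustrate what "a generic algorithm that just needs to be switched on" means, here is a minimal spatial upscaler in Python/NumPy. It uses nearest-neighbour resampling, which is far cruder than FSR 1.0's actual EASU/RCAS passes; the point is only that a spatial upscaler consumes nothing but the finished frame:

```python
import numpy as np

def upscale_nearest(frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """Nearest-neighbour spatial upscale of an HxWxC image.

    A spatial upscaler only needs the finished frame as input, which is
    why it can be bolted onto any game without engine integration.
    Temporal techniques (DLSS 2/3) additionally need motion vectors and
    depth from inside the engine, hence per-game integration work.
    """
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

low_res = np.zeros((540, 960, 3), dtype=np.uint8)   # a 960x540 frame
high_res = upscale_nearest(low_res, scale=2)
print(high_res.shape)   # (1080, 1920, 3)
```

That difference in required inputs is also why driver-level features like this can, in principle, be toggled on for any game.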

Frankly, I don't think it is a big loss; most users of older hardware will probably give it a try and realise it is not meant to be. I have an RTX 2060 laptop and know better than to use RT, as all it does is ruin the experience in terms of frame rate and stability. Sometimes you have to make peace with what you have.
 
Frankly, I don't think it is a big loss; most users of older hardware will probably give it a try and realise it is not meant to be.
That's exactly my point. Why lock this feature out if your justification is that performance is too low on older generations? People don't trust Nvidia's word when it comes to such claims, as they have lied about features in the past, RTX Voice being one example. They should let folks experience it on their own GPUs and come to appreciate how much better the optical flow accelerator is in the RTX 4000 series; Nvidia has nothing to lose here if the claims they are making are true.

The expectation here isn't that Nvidia should actively work on supporting it on older generations or optimise around them, but that they should let it work in any capacity, with a disclaimer. Ideally, it should work as per Nvidia's own comment and what we are seeing here in the OP.
 