Is the 3080 Strix 380W? So only 5% more power for the 3090? 10% more performance for 5% more power seems like a good tradeoff, particularly for a card that isn't aimed at gamers anyway. I can see memory bandwidth becoming ever more of a bottleneck until DDR5 comes to desktop. Maybe then PCIe 4.0 will open up a bit of a gap, but we'll see.
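(Back-of-the-envelope, assuming that 380W figure and the 10%/5% deltas above are roughly right, the perf-per-watt picture is:

\[
\frac{1.10 \times \text{perf}}{1.05 \times \text{power}} \approx 1.048 \times \frac{\text{perf}}{\text{power}}
\]

i.e. about 5% better performance per watt for the bigger card at those numbers, so it isn't even an efficiency regression.)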
Proper RT is about five years out anyway. People buying these cards for the RT features are deluding themselves. DLSS 2.0 is handy, but there's no scope for retrofitting it into existing games, and the list of supported titles is too short for it to be a meaningful decider. I worry about DLSS dying out along with the entire RT architecture once games start adding native RT support in a few years' time - maybe two further console generations out. If there's an open standard for it and it gets properly massaged into core DirectX (or whatever rendering API comes next), nVidia will need a way to direct those instructions to its RT cores. Whether they'll be able to patch that in for current-gen cards remains to be seen.
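For context on the "direct those instructions to its RT cores" part: today that routing already happens at the driver level behind D3D12's raytracing tier query. A rough sketch of the check an engine does (the D3D12 names are real API, the rest is just illustration):

```cpp
// Sketch only (Windows / D3D12, link with d3d12.lib) - not from the post above.
// The app queries a raytracing tier; the driver decides how DXR work actually
// lands on whatever RT hardware the card has.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Default adapter, feature level 12.0 as a baseline for current GPUs.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available\n");
        return 1;
    }

    // OPTIONS5 carries the raytracing tier; TIER_1_0 or better means the
    // driver/hardware pair accepts DXR work, whatever silicon sits behind it.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))))
    {
        std::printf("Raytracing supported: %s (tier %d)\n",
                    opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0 ? "yes" : "no",
                    static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}
```

If a future open standard wraps or replaces this, that same driver-level translation is presumably where nVidia would have to do the work for current-gen cards.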