Looking at the thermals, the RTX 4090 runs at 65°C while the RTX 3090 runs at 63°C, a difference of about 3.1%, which is basically negligible.
After going through the specs of the two GPUs, we already knew that the RTX 4090 has a 100W higher TDP. So, at 98% usage, the RTX 4090 draws 420.8W, roughly 14.9% more than the RTX 3090's 366.3W at the same usage.
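If you want to double-check those deltas yourself, here is a minimal Python sketch that recomputes them from the readings quoted above (the `percent_increase` helper is just an illustrative name, not part of any monitoring tool):

```python
# Recompute the percentage deltas from the measured readings in this comparison.

def percent_increase(new: float, old: float) -> float:
    """How much larger `new` is than `old`, as a percentage of `old`."""
    return (new - old) / old * 100

# Thermals at load (°C): RTX 4090 vs RTX 3090.
# ~3.2% relative to the 3090's 63°C (about 3.1% if measured against the 4090's 65°C).
print(f"Temperature delta: {percent_increase(65, 63):.1f}%")

# Power draw at 98% GPU usage (W): RTX 4090 vs RTX 3090 -> ~14.9%.
print(f"Power draw delta: {percent_increase(420.8, 366.3):.1f}%")
```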
All in all, a ~90.6% performance uplift at the cost of ~14.9% higher power consumption seems like a fair deal.