This question comes from helping someone choose a monitor, but it may have more to do with the GPU and CPU driving it.
The friend is getting an RTX 4070, Ryzen 9 7900, 32GB RAM and a 27-inch monitor, mainly for video editing and playing MS Flight Simulator. In general, a higher resolution and higher refresh rate are desirable but budget is limited after the gfx card, CPU and the rest of the system.
A 1440p 100Hz monitor fits the budget, but what I'd like to know is how much it affects game performance, particularly fps, compared to a 1080p 75Hz display.
Based on simple arithmetic, a QHD display has 1.78 times as many pixels as an FHD monitor, and 100Hz is 33% more demanding than 75Hz. Together that's about 2.37 times the work (i.e. 137% more) for the GPU and CPU. How does that affect gameplay? If the relation were linear, the frame rate would drop by a factor of 2.37 - e.g. from 150fps to about 63fps. But I don't think it works out quite that way.
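The back-of-envelope arithmetic above can be sketched as a few lines of Python. Note the stated assumption is that GPU/CPU load scales linearly with pixels per second, which real games do not strictly obey:

```python
# Back-of-envelope scaling estimate.
# Assumption (hypothetical, not how engines actually behave): frame cost
# scales linearly with pixel count, and refresh rate multiplies the load.
fhd = 1920 * 1080        # 2,073,600 pixels (1080p)
qhd = 2560 * 1440        # 3,686,400 pixels (1440p)

pixel_ratio = qhd / fhd                  # ~1.78x the pixels
refresh_ratio = 100 / 75                 # ~1.33x the refresh rate
combined = pixel_ratio * refresh_ratio   # ~2.37x the "work"

print(f"pixel ratio:   {pixel_ratio:.2f}x")
print(f"refresh ratio: {refresh_ratio:.2f}x")
print(f"combined:      {combined:.2f}x")

# If only pixel count mattered (refresh rate is just a cap on output,
# not extra rendering work), the fps drop would be smaller:
print(f"150 fps / {pixel_ratio:.2f} = {150 / pixel_ratio:.0f} fps")
```

One detail worth noting: a higher refresh rate doesn't by itself make the GPU render more expensively per frame; it only raises the ceiling on how many frames can be displayed. So the pixel ratio alone (~1.78x) is arguably the more relevant divisor, though in practice fps rarely scales perfectly linearly with resolution either.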
I don't play games myself, so I need your help estimating the penalty of playing at 1440p 100Hz compared to 1080p 75Hz, all else being equal.