This is what you are looking for.
The key difference with AMD was that it rendered the desktop at a higher resolution (e.g., 1080p), sent that full-resolution signal to your monitor, and simultaneously downscaled the same image to a lower resolution (like 720p) for your second output, all handled at the GPU level. As a result, the TV received a clean 720p signal and didn't need to do any scaling itself; it simply displayed what it was given.
When AMD's GPU downscaled the 1080p image to 720p, it shrank the entire rendered image to fit the 720p frame. This is not the same as setting Windows to 720p, which would render everything natively at that resolution.
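To make the distinction concrete, here is a minimal sketch of whole-image downscaling in Python. This is purely illustrative, not driver code: it uses nearest-neighbor mapping for simplicity, whereas a real GPU scaler uses higher-quality filtering (bilinear or bicubic). The point is that every destination pixel is mapped back into the full source frame, so the whole image shrinks rather than being cropped or re-rendered.

```python
# Illustrative sketch only (assumed example, not AMD/NVIDIA code):
# nearest-neighbor downscaling of a frame from one resolution to another.

def downscale(pixels, src_w, src_h, dst_w, dst_h):
    """Map each destination pixel back to its nearest source pixel."""
    out = []
    for y in range(dst_h):
        src_y = y * src_h // dst_h  # which source row this output row samples
        row = []
        for x in range(dst_w):
            src_x = x * src_w // dst_w  # which source column to sample
            row.append(pixels[src_y][src_x])
        out.append(row)
    return out

# Toy 4x4 "frame" shrunk to 2x2: the mapping spans the whole source
# image, so nothing is cropped; the entire picture is resized.
frame = [[(x, y) for x in range(4)] for y in range(4)]
small = downscale(frame, 4, 4, 2, 2)
```

Rendering natively at 720p, by contrast, would mean the desktop itself is composed at 1280x720 and no such resampling step exists at all.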
NVIDIA, on the other hand, does not support per-display scaling in clone mode, and neither does Windows. So the only workaround is to output a 1080p signal to both your monitor and your 720p TV, and rely on the TV's internal scaler to downscale the image to fit its screen.
However, the TV's downscaling might not automatically behave as expected; you may need to manually adjust settings like aspect ratio, overscan, or underscan in the TV's menu to get a proper fit.
Lastly, keep in mind that the quality of the TV's downscaling likely won't match the GPU-level scaling AMD provided, but if you're using an NVIDIA card, this is currently your only option.