If ML is your only use case, you might want to look into used Quadro GPUs, which could be cheaper for you. And if gaming is not a consideration at all, it might be better to look into cloud solutions for your ML needs.
P.S. if you are just starting out in ML, you don't need 24 gigs of VRAM; even a regular 3060 with 12 gigs will be more than enough for your needs. You would either be implementing small models from scratch in TensorFlow/PyTorch, or at most doing some transfer learning on pre-trained models (where even 24 gigs of VRAM might not suffice for the largest ones).
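To get a feel for why small models fit easily in 12 gigs while fine-tuning a big pre-trained model can blow past 24, here's a rough back-of-the-envelope sketch (my own assumption: fp32 weights, gradients, and Adam's two extra optimizer states per parameter; it ignores activations, so real usage is higher):

```python
def training_vram_gb(n_params, bytes_per_param=4, optimizer_states=2):
    """Rough VRAM estimate for training: weights + gradients
    + optimizer states (Adam keeps 2 extra copies per parameter).
    Ignores activation memory, so treat this as a lower bound."""
    copies = 1 + 1 + optimizer_states  # weights + grads + optimizer states
    return n_params * bytes_per_param * copies / 1e9

# A small from-scratch model (~25M params) vs. a GPT-2-XL-scale
# pre-trained model (~1.5B params) -- illustrative sizes only.
print(training_vram_gb(25e6))   # ~0.4 GB: fits a 3060 easily
print(training_vram_gb(1.5e9))  # ~24 GB: already at the 24-gig limit
```

So a beginner-sized model barely dents 12 gigs, while full fine-tuning of a billion-parameter model saturates 24 gigs before you even count activations, which is why cloud GPUs make sense for that kind of work.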