Hi, may I ask what case you are using to fit the four GPUs?
I don't know how much time it would take, but if you are going with this many GPUs I would say you should move to the Ryzen Threadripper (TRX) platform or Intel Xeon CPUs with good motherboards; those give you more PCIe slots with many more lanes.
Also, wouldn't it be better to sell off one 3080 Ti and, instead of buying another 3060, use that money to get another 3090?
What case are you looking at? I would suggest the Phanteks Enthoo Pro 2 / 719 because it can house two PSUs, so you can run 3 GPUs. But I doubt there are any consumer motherboards that can actually provide the required PCIe lanes and clearance for those 3 massive GPUs and still run at full bandwidth. I think only the MSI Godlike X670E can do x8/x8/x8, and it's expensive, so like the guy above said, you should look into HEDT boards and processors. I'm also searching for the best budget solution for ML.
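As a rough sanity check on the lane question: nvidia-smi can report the PCIe generation and width each card is currently negotiating (idle cards often downshift the link to save power, so read it while a job is running). A minimal Python sketch, assuming the NVIDIA driver is installed and nvidia-smi is on the PATH:

# Query the PCIe link each GPU is currently negotiating vs. what it supports.
import subprocess

fields = "index,name,pcie.link.gen.current,pcie.link.width.current,pcie.link.gen.max,pcie.link.width.max"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in out.stdout.strip().splitlines():
    idx, name, gen_cur, w_cur, gen_max, w_max = [s.strip() for s in line.split(",")]
    print(f"GPU {idx} ({name}): running Gen{gen_cur} x{w_cur}, card supports Gen{gen_max} x{w_max}")

Running this under load shows whether a card is actually stuck at x4 on a given board.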
I have 3 GPUs as of right now, and I don't have a case; they're sitting out in the open.
I'm not sure it would be good to sell the 3080 Tis; they're almost as powerful as a 3090. I just want to know if the 3060 would cause a bottleneck.
I have bought a 3090 and now I'm in a fix. If I buy a 3060 12 GB and train my models on all 4 GPUs (3090 + 2x 3080 Ti + 3060), would the 3060 cause a bottleneck?
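For context on the bottleneck question: with the usual data-parallel setup (e.g. PyTorch DistributedDataParallel), every GPU processes an equal slice of each batch and they all synchronize gradients each step, so the slowest card gates every step and the card with the least VRAM limits the per-GPU batch size. A quick way to see how far a 3060 would lag behind your other cards is to time the same workload on each one; the sketch below is only a rough matmul benchmark (the matrix size and iteration counts are arbitrary placeholders, and time_matmul is just a hypothetical helper), not a definitive test:

# Time the same fp16 matmul workload on each visible CUDA device.
import time
import torch

def time_matmul(index, size=4096, iters=50):
    device = torch.device(f"cuda:{index}")
    a = torch.randn(size, size, device=device, dtype=torch.float16)
    b = torch.randn(size, size, device=device, dtype=torch.float16)
    for _ in range(5):              # warm-up so clocks and caches settle
        a @ b
    torch.cuda.synchronize(device)
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize(device)
    return iters / (time.perf_counter() - start)  # matmuls per second

for i in range(torch.cuda.device_count()):
    print(f"cuda:{i} ({torch.cuda.get_device_name(i)}): {time_matmul(i):.1f} it/s")

If the 3060 does drag the group down, you can still keep it for smaller experiments and leave it out of big runs with CUDA_VISIBLE_DEVICES.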
I feel this is a must-read for anyone looking to buy hardware for this field:
The Best GPUs for Deep Learning in 2023 — An In-depth Analysis
Here, I provide an in-depth analysis of GPUs for deep learning/machine learning and explain what is the best GPU for your use-case and budget. (timdettmers.com)
Great, thanks for the detailed answer. Glad to see this updated for new platforms.