Search results

  1. 2x3080 Ti or 3090 for ML?

    Hi, I have 2x3080 Ti + 3090. The 3080 Tis are connected to the motherboard through risers, so they run at x1; the 3090 is in the x16 slot. My question is: if I get an additional 3090, would that be better than 2x3080 Ti + 3090? That would take less power, and the additional card would fit into my cabinet...
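
    A quick way to confirm what link width each card is actually negotiating (x1 on the risers vs. x16 in the slot) is to query NVML. Below is a minimal sketch using the pynvml bindings; it assumes pynvml (the nvidia-ml-py package) is installed, and the GPU ordering is whatever your system reports.

    ```python
    # Minimal sketch: report current vs. maximum PCIe link width per GPU.
    # Assumes the pynvml bindings are installed (pip install nvidia-ml-py).
    import pynvml

    pynvml.nvmlInit()
    try:
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            cur = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)  # e.g. 1 on an x1 riser
            peak = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)  # e.g. 16 in the x16 slot
            print(f"GPU {i} ({name}): PCIe x{cur} (max x{peak})")
    finally:
        pynvml.nvmlShutdown()
    ```

    An x1 link mainly hurts when batches stream from the host constantly or gradients sync across cards every step; it matters much less if each GPU trains its own model independently.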
  2. DDR4 or DDR5 build for ML

    Yeah, I've heard bad things about 11th gen as well. I'll try for a 12th gen. Thanks
  3. DDR4 or DDR5 build for ML

    Cool. Going with the DDR4 setup. Thanks
  4. DDR4 or DDR5 build for ML

    Hi, I'm looking to buy a new motherboard and processor for ML. I have two options: 1) a DDR4 motherboard (LGA 1200) with an i5-11400F, or 2) a DDR5 motherboard (LGA 1700) with an i5-12400F. The thing is, the DDR5 setup costs me about 20k more than the DDR4 setup, as I already have DDR4 RAM...
  5. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Great, thanks for the detailed answer.
  6. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Hi, I have 3 GPUs as of right now, and I don't have a case; they're running open. I'm not sure if it would be good to sell the 3080 Tis, since they're almost as powerful as the 3090. I just want to know if the 3060 would cause a bottleneck.
  7. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Hi, I have bought a 3090 and now I'm in a fix. If I buy a 3060 12 GB and train my models on all 4 GPUs (3090 + 2x3080 Ti + 3060), would the 3060 cause a bottleneck? Would it be better if I didn't use the 3060 at all? I googled around and saw that 2x3060 take twice as long as a 3090 while training a...
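
    With data-parallel training, every card processes a shard of each batch and they all synchronize every step, so the slowest card (the 3060) sets the pace for the other three. If you want to benchmark with and without it, the simplest approach is to hide it from the framework before it initializes; the index 3 below is only an assumption about where the 3060 shows up in the enumeration.

    ```python
    # Sketch: exclude the slower card (assumed to be index 3) so only the
    # 3090 + 2x3080 Ti are visible. Must be set before CUDA is initialized.
    import os
    os.environ["CUDA_VISIBLE_DEVICES"] = "0,1,2"

    import torch
    print(torch.cuda.device_count())  # expect 3 with the 3060 hidden
    ```

    Timing one epoch with three visible devices and again with all four is the most direct way to see whether the 3060 helps or holds the rest back.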
  8. FS: Cabinets Lian Li 205 Mesh

    Hi, will it support this motherboard? https://mdcomputers.in/msi-meg-z590-ace-gold-edition.html
  9. WTB Complete tower - GPU, PSU, CPU, RAM, Case

    Hi, can you post a link to the motherboard? And do you have a tower? Thanks
  10. WTB 3090 Nvidia

    Looking for a 3090, preferably under warranty
  11. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Oh, sorry, no, it's too costly. I can get my hands on a much cheaper 3090. OK, I'll try it out. Thanks
  12. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Can I not use a riser? Would there be a bottleneck? I was under the impression that there isn't a lot of I/O. Thanks
  13. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Cool, thanks for the heads up. Yes, sure, DM me and we can probably do a project together.
  14. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    Hi, 1) the motherboard is a Gigabyte AB350 Gaming, 2) I have a relatively large room and I'll be using risers to place the GPUs apart, 3) I have yet to see how TensorFlow handles multiple GPUs with different VRAM for distributed training. I already have 2x3080 Ti and both of them have the same VRAM. But...
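
    On the TensorFlow side, tf.distribute.MirroredStrategy splits the global batch evenly across replicas, so a mixed-VRAM pool is effectively capped by the card with the least memory. A minimal sketch, with a placeholder model and a device list that assumes the two 3080 Tis are /gpu:0 and /gpu:1:

    ```python
    # Sketch: mirrored data-parallel training across the two same-VRAM cards.
    # Each replica gets global_batch / num_replicas samples, so the smallest
    # card's free VRAM limits the usable global batch size.
    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy(devices=["/gpu:0", "/gpu:1"])
    print("Replicas in sync:", strategy.num_replicas_in_sync)

    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(32,)),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")

    # model.fit(x, y, batch_size=256)  # placeholder data; 256 is split across replicas
    ```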
  15. Would Corsair RM1000x be enough for 2x3080 Ti and 3090?

    As the title says, would a Corsair RM1000x be enough for 2x3080 Ti and a 3090? Planning to do some AI/ML. Thanks
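
    For a rough sanity check before picking the PSU, summing the stock board power limits already lands above 1000 W. The figures below are back-of-the-envelope assumptions (about 350 W per card at stock, roughly 200 W for the CPU and the rest of the platform), not measurements.

    ```python
    # Back-of-the-envelope power estimate with assumed stock power limits.
    gpus = {"RTX 3090": 350, "RTX 3080 Ti #1": 350, "RTX 3080 Ti #2": 350}  # watts
    cpu_and_platform = 200  # assumed: CPU, motherboard, drives, fans

    total = sum(gpus.values()) + cpu_and_platform
    print(f"Estimated sustained draw: ~{total} W")                        # ~1250 W
    print(f"With ~20% headroom for transients: ~{round(total * 1.2)} W")  # ~1500 W
    ```

    Per-card power limits can be lowered (e.g. `nvidia-smi -i 0 -pl 280`, run with admin rights) to fit under a smaller PSU at a modest throughput cost, but at stock settings the estimate is already above the RM1000x's 1000 W rating.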
  16. Need help with PyTorch... cannot resume training

    @vishalrao I changed the dtype from bfloat16 to float32 and the training time dropped from 60 ms to 20 ms. But I think changing the dtype takes a hit on the quality of the model. I should also let you know I received an out-of-memory error after changing the dtype, so I reduced the batch and block...
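
    If this is a nanoGPT-style script (the block size mention suggests so, but that is an assumption), the dtype setting usually just controls the mixed-precision autocast context while the master weights stay float32, and activation memory grows with precision, batch size, and block size, which is why dropping those relieved the out-of-memory error. A hedged sketch of the knobs involved:

    ```python
    # Sketch of the relevant knobs, assuming a nanoGPT-style training loop.
    # The dtype only controls autocast precision; master weights remain float32.
    import torch

    dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16
    batch_size = 8    # reduce these first when hitting out-of-memory errors
    block_size = 512  # context length; activation memory scales with batch_size * block_size

    ctx = torch.amp.autocast(device_type="cuda", dtype=dtype)

    # inside the training loop:
    # with ctx:
    #     logits, loss = model(x, y)
    ```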
  17. Need help with PyTorch... cannot resume training

    Sure, I'll let you know. I'm running it with compile set to true.
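
    For reference, a compile-true flag in a PyTorch 2.x script typically just wraps the model once with torch.compile before the training loop; a minimal sketch with a placeholder model:

    ```python
    # Sketch: torch.compile wraps the model once; the first iterations are slower
    # while kernels compile, so benchmark step times only after warm-up.
    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Linear(512, 512).to(device)  # placeholder model
    model = torch.compile(model)            # what a compile=True config flag usually maps to

    x = torch.randn(8, 512, device=device)
    y = model(x)  # first call triggers compilation
    ```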
  18. Need help with PyTorch... cannot resume training

    Tokenizing the data is a RAM-intensive process; training is GPU-intensive. When I create the train.bin and val.bin files, the data is read through RAM, and I don't have enough of it. I figured out what the problem was: max_iters was set to 5000, and when training resumed, 5000 steps were already...
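
    A small sketch of why the resume stopped immediately, assuming a nanoGPT-style loop where the checkpoint restores the iteration counter (the variable names below follow that convention but are illustrative):

    ```python
    # Sketch: resuming a run that already reached max_iters exits immediately,
    # because the loop compares the restored counter against the same target.
    iter_num = 5000   # counter restored from the checkpoint after the first run
    max_iters = 5000  # old target: with this value the loop below never executes

    # Fix: raise the target past the restored counter before resuming.
    max_iters = 10000

    while iter_num < max_iters:
        # ... one training step ...
        iter_num += 1

    print("finished at iteration", iter_num)
    ```

    For the RAM side, nanoGPT-style prepare scripts usually stream token ids into np.memmap-backed train.bin/val.bin files in chunks, so peak memory stays around one chunk rather than the whole corpus.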