Laptop : under 1.5L : BTech student : AI/ML

Please suggest a laptop for a BTech AI/ML student under 1.5L. Expecting to train / fine-tune SLMs/LLMs of decent size. For heavier loads, online services will be used.

Yeah, studied AI/ML. Even took the advanced courses.
We never trained a model that needed substantial processing power, and even then a free Google Colab notebook was enough.

Don't expect to train LLMs on a laptop; there's barely enough VRAM.

Don't buy a laptop for AI/ML. Buy a good laptop that works for your use case and use the cloud for AI/ML, because you'll be needing much more than 8 gigs of VRAM.

If you game, buy a decent gaming laptop. Otherwise, a Mac works beautifully in college. The portability and battery life are out of this world.

PS.
AI/ML is more maths and less training LLMs than you think xD They expect to teach you the probabilistic foundations of LLMs, not how to build LLMs themselves. Even my puny 8th-gen i3 was enough to get me through.
People buying fancy laptops for college is purely daddy's money and flex culture.

8 Likes

Thanks @aseem00r

They do have a need to train LLMs, especially when they participate in hackathons and need to train / fine-tune models for relevant use cases.

Buy a good laptop that works for your use case and use the cloud for AI/ML
I totally agree with this; it has been my thought all along. But the challenge is uploading huge datasets, where internet speed plays spoilsport. Using mobile provider data is not a solution.

I can’t think of any laptop that matches the AI/ML price/performance of a Mac mini in that range.

Here’s a crazy idea: I think you can fit a decent laptop plus a Mac mini for occasional hackathon use into that price range; a MacBook by itself may not offer the same benefit for the price, though.

I even thought of an eGPU. However, Thunderbolt cannot match PCIe speeds.

At present they have the likes of the LOQ and Victus in the group, which they use to train basic models.

How’s this ASUS Strix G614FP-DS96? Ryzen 9 9955HX, RTX 5070, 32GB RAM.
I can procure it from the USA, and I guess ASUS does give an international warranty.

AI/ML on a laptop is a losing proposition. You are paying for, and lugging around, hardware that you might use to its full extent only a handful of times.

I can't think of any circumstance where someone hands you gigs of data locally and you need to upload it to Colab or another cloud VM via mobile data. Most datasets are hosted in the cloud anyway and are only a wget away.
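
For example, a minimal sketch in a Colab cell (the URL below is a placeholder, not a real dataset link):

```python
# Run in a Colab / cloud-VM cell: the download happens server-side, over the
# datacenter's pipe, and never touches your home or mobile connection.
# Placeholder URL; swap in the dataset's actual direct link.
!wget -c https://example.com/dataset.tar.gz
!tar -xzf dataset.tar.gz
```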

Spend 80k on a MacBook Air and put the rest of your budget towards cloud VMs.

3 Likes

I bought a gaming laptop for this exact reason, and I don't regret it, but if I had to choose again I would go with a slim laptop with better battery backup. My gaming laptop's battery backup is bad, the charger and laptop weigh a lot, and I don't think I'm using the laptop to any great extent for the reason I bought it. It's needed just once in a while, and that can be managed with the cloud.

1 Like

Got it. Thanks

In a world where internet is dirt cheap? 5G is free. Broadband used to cost 999 for a 20Mbps plan when I first purchased one for my home.

You can always slowly upload your datasets to Google Drive and, whoosh, have them imported into Google Colab in an instant.
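
A minimal sketch of that Drive-to-Colab step, assuming the archive was uploaded to a hypothetical MyDrive/datasets/ folder:

```python
# In a Google Colab cell: mount Drive, then copy the archive onto the VM's
# local disk (training directly off the Drive mount is slow).
from google.colab import drive

drive.mount('/content/drive')  # prompts for Google auth on first run

# Hypothetical path; adjust to wherever you uploaded the archive.
!cp /content/drive/MyDrive/datasets/images.zip /content/
!unzip -q /content/images.zip -d /content/data
```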
Also, I'd really like to see what datasets you're planning to work with at hackathons. I've won an SIH and my dataset was 10k images weighing around 50MB. That took around 9GB of VRAM on a CNN-based model we made.

Unless you can afford a laptop with 16GB of VRAM, you're much better off using cloud services.

As for your eGPU idea, you were on the right path. Both training and inference happen on the GPU itself. Once your dataset has been loaded from RAM into VRAM, it doesn't go anywhere until it has been processed through a couple of neural layers, so you DON'T need a fast PCIe connection. I run an LLM workbench with PCIe Gen4 x4 connections to four Quadros. I have in fact benchmarked inference and training in x4, x8 and x16 modes: x4 falls short of x8 by only 10%, and x8 was the same as x16 on inference but another 7-10% short of x16 on training (iterations/s with 128 NN layers on a CNN-based model and a 15GB dataset on my 4x 16GB Quadros).
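
A rough sketch of how such an iterations/s benchmark can be measured, with a made-up toy CNN and synthetic data rather than the actual setup above; the point is that the batch lives in VRAM for the whole loop, so link width barely shows up:

```python
# Minimal it/s benchmark sketch (PyTorch). Toy model and batch sizes,
# not the 128-layer / 15GB setup from the post.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 64 * 64, 10),
).to(device)

opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic batch created directly in VRAM: after this point the PCIe link
# is out of the hot loop, which is why x4 vs x16 changes so little.
x = torch.randn(64, 3, 64, 64, device=device)
y = torch.randint(0, 10, (64,), device=device)

steps = 100
if device == "cuda":
    torch.cuda.synchronize()
start = time.perf_counter()
for _ in range(steps):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
if device == "cuda":
    torch.cuda.synchronize()
print(f"{steps / (time.perf_counter() - start):.1f} it/s")
if device == "cuda":
    print(f"peak VRAM: {torch.cuda.max_memory_allocated() / 2**30:.2f} GiB")
```

Run the same script with the card in different link modes (or behind a Thunderbolt eGPU) and compare the it/s figures.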

2 Likes

Thank you. I'll go through this and discuss it with the lad.