That's helpful. So a good CPU and enough RAM can make it happen? No need for a discrete GPU if I can be a little patient?
-
Yes, but wouldn't that require local computation? I am looking for a budget setup to make it happen.
-
What is the cheapest setup I can have for running LLMs locally? I do not want to train, just want to do inference. Also won't be running...
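Rough back-of-the-envelope math for the CPU-only route, assuming a quantized model (e.g. a 7B model at 4-bit) and roughly 1 GB of runtime/KV-cache overhead; the overhead figure and the function itself are illustrative, not from any specific tool:

```python
def est_ram_gb(params_billions, bits_per_weight, overhead_gb=1.0):
    """Estimate RAM needed for CPU inference on a quantized model.

    Weight memory = param count * bits per weight / 8 bytes,
    plus an assumed ~1 GB for KV cache and runtime overhead.
    """
    weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization fits comfortably in 8 GB of RAM:
print(est_ram_gb(7, 4))   # ~4.5 GB
# The same model at full fp16 would need far more:
print(est_ram_gb(7, 16))  # ~15.0 GB
```

So with quantization, 16 GB of system RAM covers 7B-class models with room to spare; speed, not memory, becomes the bottleneck without a GPU.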
-
dank.it replied to the thread
WTB 3060 12 gb.
Depends mostly on condition, but ₹15k is a good price.
-
Sold outside the forum. @puns please close the thread.
-
Consider only if you have a balance of ₹100 left after spending ₹10k.
:)
-
In good condition, without any drift issue, and fairly priced.