What is the cheapest setup I can get for running LLMs locally? I don't want to train, just do inference. I also won't be running the heaviest models; I'd be experimenting with smaller versions of models. Fine with used components, a laptop, a mini PC, whatever floats the boat.