Graphics card setup for running LLMs locally

dank.it

What is the cheapest setup I can get for running LLMs locally? I don't want to train, just do inference. I also won't be running the heaviest models; I'd be experimenting with smaller versions of models. Fine with used components, a laptop, a mini PC, whatever floats the boat.
 
A Mac mini is also a great option. The unified memory architecture lets the GPU share system RAM, so you can have 32GB of memory available to the GPU, which gives much better performance.
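
As a rough sizing check, you can estimate whether a model fits in that memory from its parameter count and quantization. This is just a back-of-envelope sketch; the model sizes, bit widths, and flat overhead figure below are illustrative assumptions, not exact requirements:

```python
# Rough estimate of inference memory for a quantized model.
# Numbers are illustrative assumptions, not exact requirements.

def estimate_memory_gb(params_billion: float, bits_per_weight: float,
                       overhead_gb: float = 2.0) -> float:
    """Weight memory at the given quantization, plus a flat
    allowance for KV cache, activations, and runtime overhead."""
    weight_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1e9
    return weight_gb + overhead_gb

for name, params, bits in [("7B @ 4-bit", 7, 4),
                           ("13B @ 4-bit", 13, 4),
                           ("7B @ 16-bit", 7, 16)]:
    print(f"{name}: ~{estimate_memory_gb(params, bits):.1f} GB")
```

By this estimate a 4-bit 7B model needs only ~5.5GB, so 32GB of unified memory leaves plenty of headroom for small and mid-sized models.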
Another thing people do is get one of those mini PCs with expandable RAM and put in 2x48GB sticks.
But a GPU with enough VRAM will beat any of these options.
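
If you go the discrete GPU route, here's a quick way to check how much VRAM you actually have to work with. This sketch assumes PyTorch is installed with CUDA support:

```python
# Query total and free VRAM on the first CUDA GPU.
# Assumes PyTorch was installed with CUDA support.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    free_bytes, total_bytes = torch.cuda.mem_get_info(0)
    print(f"{props.name}: {free_bytes / 1e9:.1f} GB free "
          f"of {total_bytes / 1e9:.1f} GB")
else:
    print("No CUDA GPU detected")
```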