I know a guy who managed to get a used 3090 for 40k, and I have seen older listings on Facebook Marketplace/TechEnclave/OLX at around 50k.
But currently I am not able to find any listings for a 3090 or 4090, at least not below 70k.
I don't think the current RAM shortage should affect used GPU prices, but given the situation, what do you think is a fair price for a used 3090 or 4090 right now? And is it even possible to get one at that price?
A 4090 definitely can't be had below 70k. I had someone willing to sell a 3090 to me for 43k, but I had no use for the extra VRAM. I want it for gaming, and it performs similarly (~5% better) to a 3080 Ti, which can easily be found for 30-35k.
Yeah, I get it. The only reason I am looking at a 3090 is its 24 GB of VRAM for AI and homelab use, though I don't really know much about it and just want to test things out.
I feel like a used 3060 will really be limiting (and might be a waste of money if I want anything usable). I have tried small LLMs, image generation, and basic stuff on my 4060 mobile. I know I don't need an RTX 3090 as a beginner, especially since I am a complete noob, but it's one of those "wants". I am not going to buy it anytime soon at current prices, but I might if I find a steal of a deal (rough VRAM math below on why 12 GB vs 24 GB matters).
I am saying this as a complete beginner, so forgive me if I say something really dumb.
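To put rough numbers on the 12 GB vs 24 GB question, here is a back-of-the-envelope sketch; the ~20% overhead factor and the model sizes are assumptions, and real usage varies with context length and runtime:

```python
# Rough VRAM estimate: parameters * bytes per weight, plus ~20% assumed overhead
# for the KV cache and runtime buffers.
def vram_gib(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

for name, params in [("7B", 7), ("13B", 13), ("33B", 33), ("70B", 70)]:
    print(f"{name} @ 4-bit: ~{vram_gib(params, 4):.1f} GiB")

# Roughly: 7B fits on a 12 GB 3060, 13B is tight, 33B wants the 24 GB of a 3090,
# and 70B does not fit on a single consumer card even at 4-bit.
```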
n8n is a server/app where you can create workflows. A workflow can be about anything, literally anything you want, for example a notification you want sent to your Telegram or Discord based on some event.
TTS is text-to-speech.
I don't have any specific implementation in mind, but I want to create simple, cool workflows for, say, home automation, Discord bots, and Telegram bots.
Most of this stuff can probably be implemented without AI, but I just want to try doing it and maybe learn something (a rough sketch of the notification idea is below).
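n8n itself is configured through a visual editor rather than code, but the notification idea above boils down to something like this minimal Python sketch; the bot token, chat ID, and the "event" check are placeholders you would fill in:

```python
import requests

# Placeholders: create a bot via @BotFather to get a token, then find your chat ID.
BOT_TOKEN = "<your-bot-token>"
CHAT_ID = "<your-chat-id>"

def notify_telegram(text: str) -> None:
    """Send a message to a Telegram chat via the Bot API's sendMessage endpoint."""
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    resp = requests.post(url, json={"chat_id": CHAT_ID, "text": text}, timeout=10)
    resp.raise_for_status()

# Example "event": replace with whatever you actually want to watch,
# e.g. a failed backup or disk usage crossing a threshold.
if True:
    notify_telegram("Homelab alert: something happened")
```

The same idea works for Discord by posting to a webhook URL instead of the Bot API.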
I would love to buy a 3090 for 40K or even a 4090 for 70K, but realistically a 3090 would go for 55-60K and a 4090 for around 1.2L, going by listings I have seen in the past. But yes, these are hard to find.
I personally use a 3060 12 GB (bought used for ₹15K) for my passive AI workloads like Immich, Paperless, n8n, and even some quantized models like DeepSeek Coder. It sits in my NAS/homelab PC.
In my main PC I have a 4070 Ti Super, but I am looking to add another card, probably a 4060 Ti 16 GB or a 3090 24 GB, and try bigger models, leveraging model sharding if possible. So if you find a good deal on either card, let me know too. My budget is around 55K for a 3090 and around 30K for a 4060 Ti 16 GB.
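For reference, the sharding experiment would look roughly like the sketch below using Hugging Face transformers + accelerate; the model ID and per-GPU memory caps are placeholders, not a tested setup:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model ID; swap in whatever you actually want to try.
model_id = "deepseek-ai/deepseek-coder-6.7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" (via accelerate) spreads the layers across all visible GPUs,
# e.g. a 4070 Ti Super plus a second card. max_memory caps what each GPU gets;
# the numbers here are assumptions that leave headroom for the KV cache.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    max_memory={0: "14GiB", 1: "14GiB"},
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```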