5090 or 4090 pricing + ram surge = i give up. cloud it is? (my math)

hi guys,

like many of you, i’ve been holding off on my build, waiting for jan 2026 and hoping the rtx 50-series launch would fix the market. instead, it looks like we just walked into a nightmare.

i’ve been crunching the numbers for a wan 2.2 (14b) rig, and honestly, the “buy local” route is looking financially stupid right now. wanted to check if my math makes sense or if i’m missing something.

1. the 5090 scalping is a joke. i checked elitehubs and others: the 5090 is listed at ₹3.3L - ₹4L+ for the zotac/asus models. that’s not a consumer card anymore. even used 4090s aren’t dropping, because nobody can afford the upgrade.

2. the 4090 trap. i expected 4090 prices to drop, but they’re holding steady at ₹1.7L - ₹2L+. sellers know the 5090 is unaffordable, so they aren’t budging.

3. the “memory crisis” is real. i thought the news about ram shortages was marketing hype, but looking at prices on vedant/mdcomputers, it’s brutal.

  • reports say ddr5 contract prices are up 50-60% in 2026 because of server ai demand.

  • a decent 96gb or 128gb ddr5 kit (which i need for model offloading) is costing a fortune now.

4. my math: build vs. cloud

option a: build local (dual 3090s or single 4090)

  • cost: ~₹2.9 Lakhs minimum (considering the inflated ram/ssd prices).

  • problem: wan 2.2 14b (moe) is heavy. even a 4090 struggles with the full model without quantization or system ram offloading.

option b: cloud (vast.ai / runpod). i checked current spot rates for jan:

  • rtx 3090: ~$0.13/hr (approx ₹11/hr).

  • rtx 4090: ~$0.29/hr (approx ₹25/hr).

conclusion: at ₹25/hr, i’d have to render for ~11,600 hours (that’s over a year of 24/7 usage) on a rented 4090 to match the cost of building a rig right now.
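that break-even math as a quick script (the rates and the rough ₹/USD conversion are the figures from this post, not live prices; plug in your own quotes):

```python
# Break-even: local build cost vs renting a cloud GPU by the hour.
# All figures are the rough estimates from this post, not live prices.
build_cost_inr = 290_000          # ~Rs 2.9L local build at inflated RAM/SSD prices

spot_rates_inr_per_hr = {
    "rented 4090": 25,            # ~$0.29/hr at roughly Rs 86/USD
    "rented 3090": 11,            # ~$0.13/hr
}

for gpu, rate in spot_rates_inr_per_hr.items():
    hours = build_cost_inr / rate
    years = hours / (24 * 365)
    print(f"{gpu}: break-even after ~{hours:,.0f} hours (~{years:.1f} years of 24/7 use)")
```

at ₹25/hr that works out to 11,600 rented hours before the local rig pays for itself, and more than double that on a 3090.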

unless your connection has terrible upload speeds (jio fiber can, and uploading 5gb+ checkpoints is my only worry), does it even make sense to build in 2026?
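for the checkpoint-upload worry, a quick estimate (the 40 Mbps sustained upload is an assumed figure, not a measured jio fiber number; substitute your own speedtest result):

```python
# Time to push a checkpoint to a cloud instance at a given upload speed.
checkpoint_gb = 5        # the 5 GB+ checkpoints mentioned above
upload_mbps = 40         # assumed sustained upload in megabits/s

seconds = checkpoint_gb * 8 * 1000 / upload_mbps   # GB -> megabits -> seconds
print(f"{checkpoint_gb} GB at {upload_mbps} Mbps ~= {seconds / 60:.0f} min")
```

even at a healthy 40 Mbps that’s a ~17 minute wait per checkpoint, so it’s a real cost if you swap models often.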

feels like renting a 48gb a6000 or just spamming 3090 or 4090 instances for ₹11/hr or ₹25/hr is the only way until this ram shortage blows over.

anyone else abandoning their build plans?


I see the RTX 4000 Ada at 1.44L and the 6000 at 4.5L; won’t they be better options for your use case?

i actually looked into those specific workstation cards yesterday thinking the same thing (newer architecture = better?), but the spec sheet tells a scary story when you compare them to the consumer flagships.

1. RTX 4000 Ada. While ₹1.44L sounds ‘okay’ for 20GB, the memory bandwidth is a dealbreaker for local AI compared to a 4090.

  • RTX 4000 Ada: 160-bit bus with ~360 GB/s bandwidth.

  • RTX 4090: 384-bit bus with ~1,008 GB/s (1 TB/s) bandwidth.

Wan 2.2 is an MoE (Mixture of Experts) model. These models are super sensitive to memory bandwidth because they have to constantly swap active experts. The 4000 Ada would likely be 3x slower than a 4090, simply because it can’t feed the data fast enough. You are paying near-4090 prices for 3060-tier bandwidth.

2. RTX 6000 Ada vs. RTX 5090. Where are you seeing the 6000 Ada for ₹4.5L? The lowest I found was ₹6.7L + GST at TPS Tech/Computech. Even if you find one used, look at the bandwidth gap vs. the new 5090:

  • RTX 6000 Ada: ~960 GB/s bandwidth.

  • RTX 5090: ~1,792 GB/s bandwidth (GDDR7 is a monster).

The 5090 is almost 2x faster in raw throughput. Unless you absolutely need 48GB in a single slot for a specific commercial workflow, the 6000 Ada is terrible value per rupee compared to even a scalped 5090.
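putting numbers on the bandwidth argument (spec-sheet values; treating MoE inference as purely bandwidth-bound is a simplification, but it’s a reasonable first-order model):

```python
# Spec-sheet memory bandwidth (GB/s, approximate) for the cards discussed.
bandwidth = {
    "RTX 4000 Ada": 360,
    "RTX 6000 Ada": 960,
    "RTX 4090": 1008,
    "RTX 5090": 1792,
}

# If inference is bandwidth-bound, relative throughput roughly follows these ratios.
print(f"4090 vs 4000 Ada: {bandwidth['RTX 4090'] / bandwidth['RTX 4000 Ada']:.1f}x")
print(f"5090 vs 6000 Ada: {bandwidth['RTX 5090'] / bandwidth['RTX 6000 Ada']:.1f}x")
```

that’s ~2.8x for the 4090 over the 4000 Ada and ~1.9x for the 5090 over the 6000 Ada, which is where the “3x slower” and “almost 2x faster” figures come from.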

i think for a personal rig, these workstation cards are bad value.

My bad, it was the RTX Pro 5000, not the 6000. The 6000 is 6.5L on Amazon.

Some 5090s were available for 2.7-3L on Amazon last month. They were gone the next day, when i wanted to post them here.

all fake listings most likely. All new sellers and all were non-prime listings from what I remember.


On Flipkart they were all fake listings at 2.2L to 2.5L. On Amazon i thought it was from clicktech, but i’m not sure.

Last night on Flipkart I saw a 5070 for 60k; now it’s gone up to 71k.

On Amazon you will likely get a fake/swapped product for anything costly and in demand right now (ddr4/ddr5 ram, ssds, graphics cards), even from their own seller like Clicktech Retail, because of fraudulent buyers and crooked warehouse staff swapping stuff during/after delivery.

All major ISPs in India can have poor upload speeds depending on your area and time of day. The best option is a local server within India that you upload to first, then push from there to whatever the target remote server abroad is.

@moh1t

Found the screenshots: one fake listing and the other a real listing from clicktech. This was from 16th Jan.

Fake listing-

Real listing-


50-50 chances of getting fake/swapped card in that real listing from clicktech too.

I thought that was only for processors and SSDs?? Now even GPUs?!! At least we can make an unboxing video to be safe; I hear Amazon provides refunds even without videos.

If Amazon provided refund without an uncut video then either the user got very lucky or the user scammed amazon.

https://techenclave.com/t/amazon-republic-day-2026-deals/411920/52


Damn, I was referring to the Amazon refund policy: there’s no requirement for an unboxing video. There were also some people who got refunds without any videos; that was for SSDs and processors, i think.

This is scary, i thought GPUs were safe. I can’t believe Amazon still isn’t taking steps to solve this; clearly it’s only getting worse.

Now you know why many ppl were/are getting fake processors & SSDs on Amazon. It clearly shows Amazon India’s management is far behind Flipkart’s when it comes to understanding the Indian psyche.

I remember someone getting FE 5090 from RPTech for around 2.2L, but not sure if that’s still the case.

that listing was open for a couple of hours last month and then went OOS. I couldn’t even register on the STPL website during that time… it was buggy.

Yes, I scaled down my plans heavily last year.

Original intention was to get a Threadripper but then I looked at RDIMM pricing and gave up.

Considered AM5/Z890 but running high amount of memory was not guaranteed. Limited amount of PCIe lanes for multi GPU setups.

Couldn’t find a 3090 locally for a good price although I didn’t look too hard either.

The 5070 Ti didn’t have enough VRAM for my needs: Gemma 3 27B QAT doesn’t fit in 16GB, to say nothing of the space for context. ROCm will get there someday; sticking to Vulkan for now.
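a rough check on why 16GB falls short (the ~4.5 effective bits/weight for a Q4-class/QAT quant is an assumption including format overhead; real runtime usage varies):

```python
# Back-of-envelope VRAM needed just for the weights of a quantized model.
params_billion = 27          # Gemma 3 27B
bits_per_weight = 4.5        # assumed effective rate for Q4-class/QAT quants

weights_gb = params_billion * bits_per_weight / 8   # billions of params -> GB
print(f"~{weights_gb:.1f} GB for weights alone, before KV cache/context")
```

~15.2 GB for weights alone leaves well under a gigabyte on a 16GB card before any context is allocated.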

Got a 7900XT for 63k (Asus Tuf), decently priced 5900X (used) + b550. Poured rest of the money into RAM, 128GB DDR4.

It’s alright.

very true. workstation cards are meant for a different purpose. for local experiments, the raw power offered by consumer cards makes more sense; for full-blown high-end training, features like ECC memory, better thermals and optimised frequencies provide stability, which is best suited to servers or professional workstations.


The economics apparently favour ‘cloud’, but I see a lot of value in having a local workstation with a decent 10/16gb of GPU vram (if not a 4090), even if you rely on cloud for bigger trainings and experiments. I would not suggest the 5090 at this point, as it doesn’t work with CUDA versions lower than 12.8, which is a showstopper, especially if you are a researcher.

I don’t have much to say about the benefits of a local dev machine, but here are some important considerations IMO:

  • coding on a vm is painful for two reasons: latency, and the psychological stress of paying for an unused GPU, especially when you spend endless nights debugging and analysing
  • even if you go hybrid (say, using dev containers in vscode), it’s painful to push to git and spin up a cloud GPU for every small experiment, loss-function test, ablation or debugging session

Ultimately your decision should depend strongly on the specific work you do, so the best way is to give it a try: rent a GPU for a couple of days, see how well it suits your needs, try different workflows and best practices, and give yourself some time.


Reading all this, I feel so good that I bought a new 4090 two years ago for around 163k.
It served me well, but when I decided to add a second card I found it difficult: having paid 163k for the 4090, paying 200k+ for another one doesn’t feel good.

I went ahead and got a 3060 with 12 GB for extra VRAM.
I got the VRAM, but it drags down the 4090 a little bit.

I am stuck—not wanting to buy a new card vs missing the opportunity, and prices are going up.
I also tried looking for a 3090 but couldn’t find one.

but seeing posts like these i am tempted to buy that 4090 new

With everyone moving to cloud to tackle this, cloud costs will also go up in the coming months, especially once providers factor in the increased costs of DRAM and SSDs.
