News NVIDIA GeForce RTX 5080 Is 2x Faster Than 4080 At $999, RTX 5070 Ti 2x Faster Than 4070 Ti At $749, RTX 5070 Faster Than 4090 For $549

This current 5xxx series being shittier than the 4xxx series reminds me of 2003/2004. Remember the infamous GeForce FX 5xxx series? It barely outperformed the GeForce 4 series. That whole generation was a steaming pile of poo.
 
Ah! Tech Jesus Steve, I missed that video. We shouldn't blame the scalpers; we should blame those idiots buying from them just to flaunt to their friends and then play Team Fortress 2 or CS2.
Scalpers gonna scalp scalp scalp scalp...
This current 5xxx series being shittier than the 4xxx series reminds me of 2003/2004. Remember the infamous GeForce FX 5xxx series? It barely outperformed the GeForce 4 series. That whole generation was a steaming pile of poo.
My guy
 
This current 5xxx series being shittier than the 4xxx series reminds me of 2003/2004. Remember the infamous GeForce FX 5xxx series? It barely outperformed the GeForce 4 series. That whole generation was a steaming pile of poo.
Ahh, nostalgia hitting me rn lol! I was like 4-5 years old. First PC. Had an AMD Athlon XP, GeForce FX 5500, 512 MB of RAM and good ol' Windows XP.
 
For what resolution?
I play at 4K, but have seen issues even at 1440p. Depends on the game.

If one is to justify this, then one needs to keep buying the latest models regardless of the VRAM, because games will only get more demanding. And I believe the manufacturers choose the VRAM with a certain resolution in mind. For e.g. the 5070 with 12GB is being marketed for 1440p, not 4K, and the 5070 Ti with 16GB is also marketed for 1440p, seemingly with all the bells and whistles. If you plan to play everything at 4K maxed out, then Nvidia would tell you to go for the 5080 at bare minimum.
So for us end consumers, all we can do is compare this generation with older ones and then decide.
This is Apple-style segmentation at its worst. VRAM isn't expensive from what I have read, not so much that it becomes a barrier.
Nvidia has a long history of skimping on VRAM in 'value' products (these days even in not-so-value products).
They ship just enough that people don't notice (or, for the 8GB cards, not even enough on day one), and then in a couple of years your experience starts degrading.

Yes, all we can do is compare, including cards from the competition, and reduce buying interest in crappy-value cards.
VRAM is something we shouldn't ignore. With enough VRAM for future games, we could still run textures at maximum, or with high-res texture mods, while keeping other settings lower. Sharp textures make a big difference in image quality; no amount of RT is going to make up for blurry textures. And there is no performance impact as long as enough VRAM is available (see the little usage-check sketch at the end of this post).

Also, there is no need to play only the latest, most demanding games. We have an undersupply of graphics cards and an oversupply of games, so we can choose among the many games that will run fine and ignore the FOMO.
This is from an average gamer's POV. Some want only the best, of course, and that's fine; Titan-type cards are for them. Those cards are fine too, it's everything below them that has gotten shitty.
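If you want to sanity-check the "enough VRAM = no performance hit" point on your own card, here is a tiny sketch using the nvidia-ml-py (pynvml) bindings, which is just one way to do it; run it while a game is open to see how close you actually are to the limit before dropping texture quality:

import pynvml  # pip install nvidia-ml-py

# Print how much VRAM the first GPU is using right now.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()

(nvidia-smi in a terminal tells you the same thing; this is just scriptable.)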
 
Also, there is no need to play only the latest, most demanding games. We have an undersupply of graphics cards and an oversupply of games, so we can choose among the many games that will run fine and ignore the FOMO.
This is from an average gamer's POV. Some want only the best, of course, and that's fine; Titan-type cards are for them. Those cards are fine too, it's everything below them that has gotten shitty.
Lol, I just started BF4 again today, at a smooth 200 fps (capped), with 20% usage. Most fun I've had in a while, and it still looks baller, even if my skills have degraded to those of a newborn.
 
Loving these Nvidia shenanigans.

Do you guys think getting a barely used 3090 FE for 55k is still worth it for AI inferencing only?
 
Loving these Nvidia shenanigans.

Do you guys think getting a barely used 3090 FE for 55k is still worth it for AI inferencing only?
Yes, the 3090 is powerful enough to run something like a 32B model pretty well, and I don't think anything around that price can match it. You could go quad 3080 to run a bigger 70B model, but that will cost at least 90k plus the extra for the board. In both cases, though, you need a good amount of system RAM as well, at least 32-64GB.
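For anyone wondering how a 32B model squeezes into 24GB: a 4-bit quant is roughly 17-20GB of weights plus some KV cache, so it just about fits on a single 3090. A minimal sketch with llama-cpp-python; the package choice and the GGUF filename are just placeholders for whatever quant you actually download:

from llama_cpp import Llama  # pip install llama-cpp-python (CUDA build)

# Hypothetical ~4-bit GGUF quant of a 32B instruct model.
llm = Llama(
    model_path="models/some-32b-instruct-q4_k_m.gguf",
    n_gpu_layers=-1,  # offload every layer to the 3090
    n_ctx=4096,       # longer contexts cost extra VRAM for the KV cache
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Are you running fully on the GPU?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])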
 
Yes, the 3090 is powerful enough to run something like a 32B model pretty well, and I don't think anything around that price can match it. You could go quad 3080 to run a bigger 70B model, but that will cost at least 90k plus the extra for the board. In both cases, though, you need a good amount of system RAM as well, at least 32-64GB.
Have you tried AI stuff?
Stable Diffusion? ComfyUI? Image generation?
(I am looking for people who have experience with them,
because I have questions/queries about them so that I won't waste money on the 'wrong' GPU.)
 
Have you tried AI stuff?
Stable Diffusion? ComfyUI? Image generation?
(I am looking for people who have experience with them,
because I have questions/queries about them so that I won't waste money on the 'wrong' GPU.)
I don't have much experience with it, especially anything other than text-generation models, but the 3090 you asked about should be pretty alright for most of those use cases. You can also look up pretty much everything you want on YouTube, different models running on different hardware and whatever other queries you might have, or just create a thread asking for help.
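I haven't touched ComfyUI myself (it's node-based, so no code needed there), but if you just want to gauge what a 24GB card can do for image generation, here is a rough sketch with Hugging Face diffusers and the public SDXL base checkpoint; treat the model choice and settings as my assumptions, not the only way to do it:

import torch
from diffusers import StableDiffusionXLPipeline  # pip install diffusers transformers accelerate

# fp16 keeps SDXL comfortably inside a 3090's 24GB.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe.to("cuda")

image = pipe(
    prompt="a graphics card carved out of ice, studio lighting",
    num_inference_steps=30,
).images[0]
image.save("test.png")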