Nvidia RTX 3XXX Series announced

even x8 and x16 difference is not that great :p

In my case, when I first assembled my PC I was using dual M.2 SSDs and my GPU was running at x8, and there was a noticeable difference - I was confused why it wasn't performing as expected. Only when I removed the second M.2 and the GPU ran at full x16 did I get the expected performance. So there definitely should be a noticeable difference for the 3080 too.
 
3090 ROG Strix power limits are at 400W

Is the 3080 Strix 380W? So only 5% more for the 3090? 10% more performance for 5% more power seems like a good tradeoff, particularly for a card that isn't for gamers anyway. I can see memory bandwidth becoming ever more of a bottleneck till DDR5 comes to desktop. Maybe then PCIe 4.0 will open up a bit of a gap, but we'll see.

It's a pity that third-party post-processing software is heavily optimised for nVidia. Seems a waste, given the raster performance of the AMD cards might be better (per watt).

Still, I might have to replace my aging 1080 soon. I prefer AMD, but might have to go green again unless enbdev/Reshade magically decides to switch optimisation to AMD. If I ever get around to learning Blender, it might be a decent investment anyway.

Proper RT is anyway about 5 years out. People buying these cards for RT features are deluding themselves. DLSS 2.0 is handy, but there's no scope for backward integration and the list of games supporting it is too short for it to be a meaningful decider. I worry about DLSS dying out along with the entire RT architecture when games start adding native RT support in a few years' time - maybe two further console generations out. If there's an open standard for it and it gets properly massaged into core DirectX (or whatever rendering API comes next), nVidia will need a way to direct those instructions to its RT cores. Whether they will be able to have a fix for current-gen cards remains to be seen.
 

Proper RT is anyway about 5 years out. People buying these cards for RT features are deluding themselves.

This is proper RT - it's just not being applied to all surfaces. It uses simplified impostor geometry instead, since the number of rays it can trace is limited. What you're referring to as RT is probably full-scene path tracing plus full Monte Carlo integration for GI, like in movies. That is very hard, and films only started doing it around 10 years ago. Before that, films used REYES (i.e. rasterization) for most scene objects and only ray traced the few that absolutely required it.
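For anyone curious what "Monte Carlo integration for GI" means in practice: you estimate a lighting integral by averaging random samples, each weighted by the inverse of its sampling probability. Here's a minimal illustrative Python sketch (names are mine, not from any real renderer) estimating the hemisphere cosine integral - the Lambertian normalisation term - whose exact value is π:

```python
import math
import random

def estimate_hemisphere_cosine_integral(n_samples=100_000, seed=42):
    """Monte Carlo estimate of the integral of cos(theta) over the
    hemisphere, which is exactly pi. Directions are sampled uniformly
    over the hemisphere, so the pdf is 1/(2*pi) and each sample
    contributes cos(theta) / pdf."""
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)
    total = 0.0
    for _ in range(n_samples):
        # For uniform hemisphere sampling, cos(theta) is uniform in [0, 1].
        cos_theta = rng.random()
        total += cos_theta / pdf
    return total / n_samples
```

A real path tracer does the same averaging, just with the sample being a recursively traced light path instead of a bare cosine - which is why the ray budget matters so much.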

Path tracing is possible, but only for very simple geometry - like the 20-year-old Quake II, which has an RTX spin-off, and Minecraft, which has very simple geometry by its inherent nature. NVIDIA is actually ahead here for the time being, as it provides BVH traversal acceleration, which doesn't exist on the RDNA2 in the consoles and most likely not on the GPUs coming at the end of next month either. RDNA2 only does ray-triangle intersections in hardware. Once the intersections are computed, the shading is done by the traditional shading hardware - there's no difference there. Also, right now there are two variants in terms of implementation: DX12 has its standardized ray tracing API, while other games use a bunch of proprietary NVIDIA extensions on Vulkan.
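To make "ray-triangle intersections in hardware" concrete, here's a minimal software sketch of the standard Möller-Trumbore test that such units accelerate (plain Python, illustrative only - real hardware does this for millions of rays in parallel):

```python
def ray_triangle_intersect(orig, direction, v0, v1, v2, eps=1e-8):
    """Möller-Trumbore ray/triangle intersection.
    Returns the hit distance t along the ray, or None on a miss."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edges
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                  # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det         # distance along the ray
    return t if t > eps else None
```

The shading that happens after a hit is ordinary shader work, which is why it runs on the regular shading hardware on both vendors.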

Future hardware just has to deal with the scalability problem - other issues are mostly sorted.
 
even x8 and x16 difference is not that great :p
Yes but many people like to make a fuss. We've been spoiled I guess.
I have an old GTX 650 and can't remember the last time I played on my own PC. I've been waiting for the new stuff to be revealed, but so far it's good rather than great (as in, nothing in the price segment I'm looking at). I'll reuse my old unused cabinet. Already bought the monitor and SSD; just waiting for these models to be released to finalize the rest of the components.
 
More of a cost-no-object card, meant either for super-rich folks or for those who need it for training models in TensorFlow.

Exactly - that 24 GB of VRAM is just for training models. It was never meant for gaming, just like the Titan cards never performed much better than the xx80 Ti cards but simply had far more VRAM. But seeing 4K VRAM requirements, maybe the gaming industry will need more VRAM too; hence Nvidia is going to release a 3080 20GB variant, which might make the 3090 (formerly the Titan) look idiotic. Maybe there will even be a 48GB variant, I'm not sure.
BVH traversal acceleration
So if BVH traversal acceleration is not present in RDNA2, then there's no point in expecting great RT performance on RDNA2. Period.
Maybe it's more reliant on rasterization.
I have seen programs pay a heavy cost doing multithreaded ray-triangle intersections (even on high-end CPUs) without the acceleration of BVH nodes and bounding boxes.
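The point of BVH nodes and bounding boxes is that a cheap axis-aligned bounding-box "slab" test lets a tracer reject whole groups of triangles before ever running per-triangle intersections. A minimal sketch (names are mine; it assumes the inverse ray direction components are precomputed and non-zero):

```python
def ray_aabb_hit(orig, inv_dir, box_min, box_max):
    """Slab test: True if the ray hits the axis-aligned bounding box.
    inv_dir holds 1/d for each ray direction component (precomputed,
    assumed finite and non-zero here)."""
    t_near, t_far = float("-inf"), float("inf")
    for axis in range(3):
        # Entry/exit distances for this pair of parallel slab planes.
        t1 = (box_min[axis] - orig[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - orig[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    # Hit only if the ray is inside all three slabs simultaneously,
    # somewhere in front of the origin.
    return t_near <= t_far and t_far >= 0.0
```

On a miss, an entire BVH subtree (potentially thousands of triangles) is skipped with just a handful of multiplies and compares - which is exactly the traversal work the dedicated hardware accelerates.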
 
Me: discussing high-end graphics cards in 2020... haven't played many games since 2013... very picky with games... in fact I only like FPS war games - CoD, MoH, Far Cry [2004] only lol, the Crysis series... using my office 19" 5:4 LCD and discussing a 70k GPU.
 
I bought my 2060 Super FE in February this year. Using it with a Ryzen 7 and a 60Hz monitor, which is the most widely used display setup, no game that I play now even utilizes 5-6GB of the card's VRAM. I'm content with it and won't upgrade for at least 4 years. Any performance gains / RT candy offered by the 30 series are superficial and just a temptation for high-end impulse buyers. Genuine use cases for professionals aside, this is the age of consumerism, and companies milk consumers like cows.
 
Any performance gains / RT candy offered by the 30 series are superficial and just a temptation for the high end impulse buyers.
I'd like to agree hehe
 
 
can anyone tell me if I'll be able to buy a PS5 using bots? Nvidia cards sold out in 1 sec. idk if my BSNL 3G speed is fast enough for that.

have a jailbroken PS4 Pro.
heavily confused between building a Ryzen PC or getting a PS5 for my online FPS cravings.
pc parts have become so costly, I'm thinking about buying a PS5 & playing its free-to-play games. have never paid for games, unless it's gold like The Witcher 3.

my salary is merely 25k. I don't want to spend more than 80k on a PC. wish I had relatives in foreign countries.

sorry for this rant, I'd been planning to buy a PC since Sept '19, but only got a job in Jan '20. Corona pissed all over my dreams of mid-range PC gaming.
 
my salary is merely 25k. i dont want to spend more than 80k on a pc.
It's crazy you're willing to spend 80k on a pc with 25k salary. Pure fps love eh ❤️. All the best
 