Nvidia RTX 3XXX Series announced

If these are the actual numbers, then the 3090 is only 10-12% faster than the 3080 for more than double the price

Some in-game benchmarks do show 24 GB of RAM, so if true it's awful. How could they claim 8K?
I don't expect it to be more than 15% faster in any case, since cores don't scale linearly.

The difference will come when both GPUs are bandwidth limited, or when the 10 GB frame buffer becomes a bottleneck.
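The "cores don't scale linearly" point can be sanity-checked with quick arithmetic. A hypothetical back-of-the-envelope estimate, using the published CUDA core counts and reference boost clocks of the two cards — treat this as a rough theoretical ceiling, not a benchmark:

```python
# Published specs: 3080 = 8704 cores @ 1710 MHz boost, 3090 = 10496 @ 1695 MHz.
# Theoretical shader throughput scales with cores x clock at best.
cores_3080, cores_3090 = 8704, 10496
boost_3080, boost_3090 = 1710, 1695  # MHz, reference boost clocks

ratio = (cores_3090 * boost_3090) / (cores_3080 * boost_3080)
print(f"Theoretical max uplift: {(ratio - 1) * 100:.1f}%")
# -> Theoretical max uplift: 19.5%
```

Real games land well below this ~20% ceiling because performance rarely scales 1:1 with core count, which is consistent with the 10-15% figures being discussed.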

This card is primarily for developers and deep learning guys. There's no point in pure gamers buying this.
 
The 3090 at 152K... for the 80K premium over the 3080, one could buy a new CPU + mobo and upgrade the entire system.

I have a feeling these numbers are not correct right now. Let's wait for dependable reviews to come out.
This 10% increase leaves no room for a 3080 20GB, 3080 Ti or Super... now does it?
But nonetheless, I have had Titan cards before, and they were also almost double the cost for only a ~25% increase or so over the Ti...
One does pay a heavy premium for those last few extra frames...
Cheeky Nvidia removed NVLink from the 3080s :/
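The premium works out like this (a quick sketch; the 72K figure for the 3080 is just 152K minus the 80K gap mentioned above, and the ~11% uplift is the midpoint of the rumoured numbers):

```python
# Hypothetical figures from the thread: 3090 at 152K INR, 80K over the 3080.
price_3090 = 152_000
price_3080 = price_3090 - 80_000  # implied 3080 price: 72K
uplift_pct = 11                   # midpoint of the rumoured 10-12% gap

premium_pct = (price_3090 - price_3080) / price_3080 * 100
print(f"Price premium: +{premium_pct:.0f}%")                   # +111%
print(f"INR per extra % of perf: {80_000 / uplift_pct:,.0f}")  # ~7,273
```

In other words, a ~111% price premium buys a ~11% frame-rate bump, if the leaked numbers hold.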
 
...the boost clock of the 3080 ROG Strix is a crazy-high 1935 MHz! Stock Nvidia is only 1725 MHz. That should be close to those bench numbers of the 3090 posted earlier...
 
Those Titan-class cards are not for gaming. There are models with high memory requirements, e.g. BERT and GPT; the 3090/Titans are meant for those, so comparing their gaming performance is kind of idiotic. I guess Nvidia named it the 3090 because the Titan naming stack was messed up (X, XP, RTX, blah blah...). For gaming, don't bet on high performance gains from these cards; they are totally meant for researchers and AI engineers.
 

What it actually boosts to will depend entirely on the power limit. These cards seem to be heavily power limited. As of now, the EVGA FTW3 seems to be the most generous at 440 W; all the others max out around 380 W or so.
 
I hope that AMD has made some enhancements and pushed the power limits, and that this time they have a GPU that can run 4K games at 60 FPS and above.
 

AMD won't have as many issues as Nvidia, since their chips will be manufactured on TSMC 7nm, a more mature and better process with lower leakage than Samsung 8nm. RDNA is pretty efficient, and I'd expect RDNA2 to be significantly more efficient. Nvidia, on the other hand, has pushed beyond the sweet spot of the power/efficiency curve this generation in search of higher clocks.
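The "beyond the sweet spot" intuition follows from how dynamic power scales: P ≈ C·V²·f, and near the top of the voltage/frequency curve the voltage required rises roughly with frequency, so power grows roughly with the cube of the clock. A toy illustration (the linear V-f assumption is a simplification for illustration, not measured GPU data):

```python
# Dynamic power scales as P ~ C * V^2 * f. Near the top of the V/f curve,
# assume voltage must rise in proportion to frequency (worst case), so
# relative power goes as f^3. Numbers here only show the shape of the curve.
def relative_power(f_ratio):
    """Relative dynamic power when clocks scale by f_ratio and V tracks f."""
    v_ratio = f_ratio
    return v_ratio**2 * f_ratio  # P ~ V^2 * f  =>  ~ f^3

for f in (1.05, 1.10):
    print(f"+{(f - 1) * 100:.0f}% clocks -> +{(relative_power(f) - 1) * 100:.0f}% power")
# -> +5% clocks -> +16% power
# -> +10% clocks -> +33% power
```

That cubic growth is why squeezing out the last few hundred MHz costs a disproportionate amount of the 320-350 W board power.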
 
Yepp, better node this time. They launched the 7nm 5700 XT, and by now they will surely have improved it, if not mastered it. I'm rooting for AMD so that it starts a price war.
 
I'm from the COD/MOHA era, rofl.

You can still play COD; MW 2019 was the best COD in ages.



 
Real-world performance difference between PCIe 3.0 and PCIe 4.0. This ought to shut up all the Nvidia fanboys who claimed that only with PCIe 4.0 can you run an RTX 30-series card at its full potential. A lot of bias and marketing gimmickry going on.

[Attachment 90795: PCIe 3.0 vs. 4.0 benchmark chart]


As I've been saying for years, AMD/Nvidia don't give a shit about the Indian market.
At first they will be like "YEAH WE'LL DO THIS AND THAT FOR THE INDIAN MARKET AND CONSUMERS AND HOPE TO GROW THE BUSINESS % MORE BY <INSERT RANDOM DATE AND YEAR HERE>"
Then later they will go "Oh meh. We've got bigger markets and regions to be concerned with."
No corporation is your friend, guys. They just want your money, on their own terms and at their convenience. So don't set your expectations too high.

Even the difference between x8 and x16 is not that great :p
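The spec numbers back this up: a PCIe 4.0 x8 link has the same per-direction bandwidth as PCIe 3.0 x16, and few games saturate even that. A quick calculation from the raw transfer rates (8 GT/s per lane for 3.0, 16 GT/s for 4.0, 128b/130b encoding):

```python
# Per-direction link bandwidth: GT/s per lane * 128/130 encoding / 8 bits per byte.
# These are spec ceilings; real-world throughput is slightly lower.
def gbps(gts_per_lane, lanes):
    return gts_per_lane * (128 / 130) / 8 * lanes  # GB/s per direction

print(f"PCIe 3.0 x16: {gbps(8, 16):.2f} GB/s")   # 15.75 GB/s
print(f"PCIe 4.0 x16: {gbps(16, 16):.2f} GB/s")  # 31.51 GB/s
print(f"PCIe 4.0 x8 : {gbps(16, 8):.2f} GB/s")   # 15.75 GB/s, same as 3.0 x16
```

Doubling the link rate only matters once a card actually streams more than ~16 GB/s over the bus, which current titles rarely do.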

Could just be an OC'd 3080, although I don't think any reviewer was able to get a 10-12% increase over stock.



I care more about the price they are charging: around 60 percent more, or higher, for that 10% and 14 GB of extra RAM :/
 