AMD 6000 series graphics cards

 
RX 6800 4K ray tracing performance in COD Cold War is very low.
It's only 25 FPS at 4K ultra with RT on.
AMD has clearly said that 4K is not their target audience, and we all know that AMD's RT is first gen, so that is expected. We will most likely get some bump from software support in the next few months. Also, we don't have a DLSS alternative as of yet, but one is coming soon. Let's see...
 
But they said the 6000 series (Big Navi) is meant for a 4K gaming experience.
 
I still don't think either GPU is fully capable of true 4K gaming, except maybe the 3090 and 6900 XT once the benchmarks come out. 1440p/2K is still the maximum gaming experience one can get at this point in time, IMO, and turning on ray tracing will bring it down to crawling FPS. Not to forget these are tested against current-gen games; give it a few more months or years and newer games will demand more, so we should expect a bigger performance hit. 1440p/2K might be the sweet spot for the next couple of years... CMIIW.
 
Ray tracing performance in the 6000 series is suboptimal. There is absolutely no denying that.

Even in native resolution, the 6800 XT lags the 3070 in ray tracing performance. On top of that, NVIDIA has DLSS for better performance.
 
I am thinking this will be partly fixed in drivers thanks to the competition, and RDNA3 will be rock solid, since RDNA2 is already just below Ampere, at around 75% I suppose.
 
Drivers can't do anything when the hardware itself is not capable enough. To AMD, ray tracing was a checkbox feature this gen.

Performance is well below the top Turing part, let alone Ampere.
 
??? It's above the 2080 Ti as far as I know. Maybe it depends on the card or the test or something. Anyway, I don't know how many people will use RT even on a 3080 :p
 
I'm curious, how has the extra VRAM (which was widely said to be a big bonus that might destroy the 30 series at higher resolutions) helped, if even the 3080 does better than the 6800 XT at 4K?
 
That'll only be beneficial once games actually start using more VRAM. Right now most games top out at around 8-10 GB at 4K. The extra framebuffer will only deliver gains once that extra buffer actually starts getting used, if at all.

IMHO I wouldn't count on it being a game changer anytime soon, maybe 1-2 years down the line? The RTX 30 series is better at 4K gaming right now due to its higher CUDA core count, which is beneficial at higher resolutions, and I feel it will keep its performance lead in the future too. Even if the extra VRAM becomes usable in the future, you would generally get better performance at 4K on the RTX 30 series, as long as you keep VRAM usage within the framebuffer, due to the higher raw compute. I feel that by the time the extra VRAM becomes somewhat relevant, you would end up bottlenecked in some other area of the graphics pipeline and would have better alternatives available.

There is also the question of whether VRAM requirements will increase that aggressively in the future, as some of the next-gen DX12 APIs should ideally enable better streaming performance with the help of SSDs and better allocation of resources in VRAM.

As is always said, your buying decisions should be based on what is available today as opposed to what might happen tomorrow.
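
For anyone who wants to see where their own system sits relative to that budget, here is a minimal sketch (my own illustration, nothing AMD or NVIDIA specific, just plain DXGI on Windows, error handling trimmed) that reads the local VRAM budget and the amount currently committed via IDXGIAdapter3::QueryVideoMemoryInfo:

// Minimal sketch: read the local VRAM budget and current usage through DXGI 1.4.
// Assumes Windows 10 or newer; error handling is trimmed for brevity.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // adapter 0 = default GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    // Budget       = how much local VRAM the OS currently lets this process use.
    // CurrentUsage = how much the process has actually committed right now.
    std::printf("Local VRAM budget : %.2f GB\n", info.Budget / (1024.0 * 1024.0 * 1024.0));
    std::printf("Local VRAM in use : %.2f GB\n", info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    return 0;
}

If CurrentUsage starts creeping up towards Budget in the games you play, that's roughly the point where the extra 16 GB would actually start to matter.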
 
I'm curious, how has the extra VRAM (which was widely said to be a big bonus that might destroy the 30 series at higher resolutions) helped, if even the 3080 does better than the 6800 XT at 4K?
It will be useful for new games; even the max VRAM usage tested so far was around 9 GB, on Godfall.
 
The moment I mentioned extra VRAM I knew that someone would bring up Godfall lmao.
I'm just thinking it's a shame that the huge VRAM has only added to the unnecessarily high cost for little to no reason. I mean, imagine how much more people would like it if the cost were lower without the extra flab. Even AMD is still capable of making such "mistakes" (I know they will use it as marketing BS), huh. Disappointed.
 
Think of it the other way round: since they knew their performance was not up to the mark, they wanted to compensate with VRAM at least, giving hope that down the line it MIGHT get used...
 
RX 6000 is very good on the rasterization side. To the people babbling about DXR: RTX is proprietary, so not all games will be able to support it, though yes, a few games are actually giving good FPS with DXR on NVIDIA. AMD and MSFT are coming up with DirectML, and it will be open source. This will again play out like G-Sync and FreeSync; in the end, due to cost, people will move towards the open option. Regarding VRAM, future games will have more textures, hence 16 GB of VRAM future-proofs the card; as of now the RX 580 still rocks due to its higher VRAM.
 
I need your kind of optimism in my life lol.
 