Graphic Cards RDNA 4 Speculations

Arigaytor

Beginner
Isn't it weird that there has been zero news aside from the 9070 and 9070XT from AMD's side? While the 9070XT looks like it's shaping up to be a stellar release, I'm more concerned about the lower end of the market where budget gamers shop for GPUs in the sub 20-25k bracket. Aside from the RX 6600 and the RTX 3050, there's just been nothing new announced in that 15-25k price range. While I hope that the new generation gets us 7700XT performance at close to 30k, I can't help wondering whether the budget GPU market has been all but relegated to last-gen GPUs at this point.
 
Isn't it weird that there has been zero news aside from the 9070 and 9070XT from AMD's side? While the 9070XT looks like it's shaping up to be a stellar release, I'm more concerned about the lower end of the market where budget gamers shop for GPUs in the sub 20-25k bracket.
Update: The press conference might be happening in late Feb now instead of March

They have also said that this time the focus will be on FSR4 and optimization.
Aside from the RX 6600 and the RTX 3050, there's just been nothing new announced in that 15-25k price range.
Well, there was the launch of the B570 and the B580, but Intel wasn't prepared with enough stock, which is why those haven't reached here either. Also, in that price bracket it's usually better value to go used, which is what a lot of people are doing.
hope that the new generation gets us 7700XT performance at close to 30k
That is probably going to be the case with the 9060, and it might even cost less, around 25-28k, to compete with the B580's pricing.
I can't help wondering whether the budget GPU market has been all but relegated to last gen gpus at this point.
See, the problem with the market is that Nvidia basically owns it. They work with game developers directly and sponsor them, and in doing so push them to build around proprietary technologies such as DLSS and Frame Generation, to the point that games rely on these features just to look somewhat decent and play relatively okay; they have been caught doing this sort of thing in the past, too. Because of this, other companies like AMD and Intel have to follow suit with ML-based upscaling and frame generation of their own in order to compete at all. In the middle of it all, pre-existing hardware that cannot run these hardware-assisted software technologies becomes obsolete, and hence more money for Jensen's shiny jacket.

Until and unless the competition innovates, nothing is going to change. AMD was on an alright path with its software-based upscaling solution, which only failed because of borked implementations in Nvidia-sponsored titles like Black Myth: Wukong, Cyberpunk, and Stalker 2; in games with a proper implementation, like Ghost of Tsushima, it didn't look bad at all. But they had to throw it all away in order to sort of compete, or at least make it look like they are competing, with Nvidia. Each generation, AMD releases second-class hardware, unoptimized and buggy, for the people who can't afford the more expensive Nvidia counterparts, and touts it as just as good when it just isn't.

AMD's graphics division doesn't rely on selling consumer GPUs but on its console business with Sony and Microsoft. Lisa Su herself has admitted as much, and it shows in their sales and profit figures. That is the only reason they have been able to keep the GPU division alive until now, and it is why we never see them leading with new technologies: most of the software side that games are built on is already locked down by Nvidia's proprietary technologies, and they don't want to dump money into research that will ultimately be a failure.
 
Last edited:
  • Like
Reactions: meizul
Yes. New-gen GPUs, AMD or Nvidia, both get scalped and bought up at 150% of MSRP or more by big tech. I don't expect new GPUs to be VFM until the AI bubble bursts.
AI itself is not a bubble, though. It is a viable and incredibly transformative tech. Will GPUs still be what's used in a couple of years, rather than custom silicon? That's up for debate and may indeed 'burst' the heavily skewed supply/demand curve for Nvidia products. But being fairly deep into it, with so much research work and so many ongoing projects reliant on tools that use CUDA, I don't see that happening anytime soon. So though stock valuations may fluctuate, it's not going to be easy to get an Nvidia GPU for cheap anytime soon. I know people who thought they could wait out the 'temporary' pricing on 3090s and 4090s... they're still waiting.
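To make that CUDA reliance concrete, here's a minimal sketch (my own illustration, not from any particular project) of the pattern sitting at the top of most research repos. The code assumes an Nvidia GPU as the first-class target, and everything downstream inherits that assumption:

import torch

# Typical pattern in research code: CUDA is the first-class target and
# everything else is a fallback. Moving off Nvidia hardware means
# auditing every call site that bakes in this assumption.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)

# Mixed-precision paths, fused kernels, etc. are tuned for CUDA first,
# which is a big part of why projects stay on Nvidia hardware.
amp_dtype = torch.float16 if device.type == "cuda" else torch.bfloat16
with torch.autocast(device_type=device.type, dtype=amp_dtype):
    y = model(x)

print(y.shape)  # torch.Size([8, 1024])

Multiply that by thousands of repos, papers, and internal pipelines and you can see why nobody migrates off CUDA overnight.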
 
Absolutely, the markup is due to the current supply/demand setup... basically water in the desert. And yes, there is overhype by some, but seeing what's already happening, how quickly huge improvements are coming, and the potential for achievable tech in a very short period (in historical terms, 2 to 5 years instead of decades), the hype may actually be understating the full reality of what's to come.
 
Absolutely, the markup is due to the current supply/demand setup... basically water in the desert. And yes, there is overhype by some, but seeing what's already happening, how quickly huge improvements are coming, and the potential for achievable tech in a very short period (in historical terms, 2 to 5 years instead of decades), the hype may actually be understating the full reality of what's to come.
I hope new techniques come out that make AI training faster without needing better hardware, because let's be real: even the RTX 5090 is hitting physical limits on transistor size. The only major improvements left are tensor core optimization or, soon, dual-die GPUs.
 
Isn't it weird that there has been zero news aside from the 9070 and 9070XT from AMD's side? While the 9070XT looks like it's shaping up to be a stellar release, I'm more concerned about the lower end of the market where budget gamers shop for GPUs in the sub 20-25k bracket. Aside from the RX 6600 and the RTX 3050, there's just been nothing new announced in that 15-25k price range. While I hope that the new generation gets us 7700XT performance at close to 30k, I can't help wondering whether the budget GPU market has been all but relegated to last-gen GPUs at this point.
The only reason they would be doing this is to improve the drivers (and FSR4), because I don't think supply was that bad, considering retailers have had the cards since December. I think they learnt from their CPU launch that initial reviews based on price and performance can be pretty damning over the long term. At the same time, they probably have the best opportunity in a long time to price it right for the performance and challenge Nvidia's 70 and 60 series for market share.
 
I hope new techniques come out that make AI training faster without needing better hardware, because let's be real: even the RTX 5090 is hitting physical limits on transistor size. The only major improvements left are tensor core optimization or, soon, dual-die GPUs.
There is still a lot of low-hanging fruit for optimization and improvement, plus the possibility of dedicated hardware that is orders of magnitude faster without needing further transistor shrinkage. Basically along the lines of the Antminers etc. for Bitcoin, which made GPU mining obsolete by being orders of magnitude better VFM. The same will likely happen here (Cerebras, SambaNova, and Groq are all sort of headed in that direction), but the initial steam will have to settle so that there are known, standard methods and approaches. For now, we don't even know whether transformers are the architecture that will survive the next couple of years.
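As one illustration of that software-side headroom (a minimal sketch of mine, assuming PyTorch 2.x, not tied to any specific project): compiling a model can buy a real speedup on the exact same silicon, no new transistors required.

import time
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(2048, 2048), torch.nn.ReLU(),
    torch.nn.Linear(2048, 2048),
)
x = torch.randn(64, 2048)

# torch.compile traces the model and emits fused, optimized kernels;
# same hardware, often noticeably faster in steady state.
compiled = torch.compile(model)
compiled(x)  # first call triggers compilation, so warm it up

for fn, label in [(model, "eager"), (compiled, "compiled")]:
    start = time.perf_counter()
    for _ in range(50):
        fn(x)
    print(label, round(time.perf_counter() - start, 3), "s")

Gains vary by model and backend, but it's exactly the kind of 'free' improvement that doesn't depend on a new process node.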