AMD Ryzen 5 9600X and Ryzen 7 9700X review roundup: Confusing and all over the place

It just looks like they will launch Arrow Lake and Lunar Lake without any pomp. It would be a soft launch.
I think they will postpone. With the current cloud hanging over 13th and 14th gen, it's not the right set of conditions for an Arrow Lake or Lunar Lake launch. It's better to wait till this settles.
 
If AMD sets decent prices for new and even older processors in India, then by the time Intel settles the 13th/14th Gen issue, AMD will be in the position Intel once held in India.

For most buyers, "computer" just means Intel or AMD.
 

This is priced very high.
 
Do you know that Intel cramming in more and more cores and pushing more voltage to power those cores is what is causing the 13th and 14th gen failures? The two are at opposite ends of the spectrum. I don't want 23984623984732 cores in my CPU, but I also don't like paying this much for an 8-core CPU. What we need is something in between. Still wondering why AMD did not ditch the 6-core chiplet and go with 8-core and 12-core chiplets. Probably they wanted to keep the power consumption in check? A 24-core Ryzen would be epic though.
The highest core count for a 14th gen part is 24 cores with 32 threads, so I don't know why you're exaggerating it so much. But if you've played any Sony console port, or even a modern management/building game like Anno that's designed for higher core counts, you'll see why we want more cores. Any "AI" you see in games, whether it's simple pedestrians strolling down the street or enemies, runs its behaviour/pathing on the CPU, and this is where games that take advantage of CPU multi-processing shine. I don't know what AMD is smoking releasing 8-core CPUs as mainstream, but I'm not well versed in CPUs, and AMD must be doing something right with their "X3D" CPUs, since no reviewer is trolling them for low core counts. And forget gaming: since you run so many VMs, are you really fine with an 8C/16T CPU?
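As a toy illustration only (not taken from any real engine, the workload and numbers are made up): each NPC's pathing/behaviour update is an independent CPU-bound job, so an engine can fan them out across a worker pool, and more physical cores directly means more NPC "brains" updated within the same frame budget.

```python
# Toy sketch: pretend plan_path() is an NPC's A*/behaviour-tree update.
from concurrent.futures import ProcessPoolExecutor
import math
import os
import time

def plan_path(npc_id: int) -> float:
    # Stand-in for real pathfinding: purely CPU-bound busywork.
    acc = 0.0
    for i in range(1, 200_000):
        acc += math.sqrt(i) * math.sin(i)
    return acc

if __name__ == "__main__":
    npcs = range(256)
    start = time.perf_counter()
    # One worker per core; on a 6-core part the same batch simply takes longer.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(plan_path, npcs, chunksize=16))
    print(f"updated {len(results)} NPCs in {time.perf_counter() - start:.2f}s")
```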

Also, Intel FAFO'd for the first time in a decade and is suffering not because they increased core counts, but because they got greedy again for easy money and just pandered to the gaming crowd by increasing voltage for minor gains and slapping a new-gen sticker on it.
When I turn my desktop on (most evenings and weekends), it's either gaming or working on VMs/pods. This includes two hours of daily game time for the kids (each one gets 1 hour). So yes, it is at heavy utilization whenever it is on. We'll see where desktop CPUs/GPUs are heading. My RTX 3090 is working perfectly; if I face any issue with it, I may ditch the desktop and go for a console, since I have a MacBook.
Yeah, you are not even going to breach the ₹50/month mark in energy savings, assuming the max your PC runs every day is 5 hours (2 hours of kiddo time and a couple of hours at most for you).
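Quick back-of-the-envelope check of that figure; the ~40 W draw difference and ₹8/kWh tariff are assumptions, not measurements:

```python
# Rough monthly cost of an assumed ~40 W difference in CPU draw at 5 h/day.
delta_watts = 40        # assumed extra package power of the hungrier chip
hours_per_day = 5       # 2 h for the kids + a couple of hours for the adult
tariff_inr_per_kwh = 8  # assumed residential tariff
kwh_per_month = delta_watts / 1000 * hours_per_day * 30
print(f"{kwh_per_month:.1f} kWh/month ≈ ₹{kwh_per_month * tariff_inr_per_kwh:.0f}/month")
# prints: 6.0 kWh/month ≈ ₹48/month
```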
This is priced very high.
Way too high lol, AMD is going the Nvidia way in the CPU market.

I feel like the HUB guys are a bit dumbstruck by the lack of basic numeracy among other tech reviewers.
HUB is always like this. Look at any of their reviews, even for GPUs: they refuse to consider vendor-specific enhancements, be it something like DLSS or any updates/enhancements that may come later. They are really into "apples to apples" comparisons, but most of us need a lot more nuance, which they don't entertain in the name of "fairness". That's why it's a love-hate relationship for me with them, because they do a lot of thorough testing and don't put me to sleep like Tech Jesus does.


That's why I say it's a love-hate relationship with HUB lol
 
Right, but HUB is not being too cynical or extreme in this case. They're just reporting the facts and protecting consumers. Most tech YouTubers' viewer bases are gamers and content creators, not Linux sysadmins or a tiny minority of AVX-512 SIMD aficionados. In truth, the enthusiast CPU market is essentially propped up by PC gamers; they buy the vast majority of these higher-end CPUs. Datacenters and enterprises are definitely not buying desktop chips, let alone a 9600X or a 9700X. Saying Zen 5 is a complete disappointment for its target market is just objective fact and honest reporting.

Here's someone else saying the very same things:

It's crazy how much fanboying and how many accusations of 'clickbait' they got for being among the few to notice that the existence of the 7000 series non-X chips basically destroys any argument about Zen 5's efficiency gains under normal workloads. If we set expectations this low for newer generations, we're going right back to Intel's tick-tock era of 5% gains every generation.
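The methodology point is easy to write down. A sketch of the metric only, with no numbers, since those would have to come from a review's own measurements:

```python
# Perf/W is what the efficiency claim rests on, and the fair baseline is a
# same-TDP Zen 4 part (7700 non-X or a 7700X in eco mode), not the 105 W 7700X.
def perf_per_watt(score: float, package_power_w: float) -> float:
    """Benchmark score divided by measured package power."""
    return score / package_power_w

def efficiency_gain(new: tuple[float, float], baseline: tuple[float, float]) -> float:
    """Relative perf/W gain of `new` over `baseline`; each is (score, watts)."""
    return perf_per_watt(*new) / perf_per_watt(*baseline) - 1.0
```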
 
These CPUs are not made for data centers; they mostly go for Threadrippers (or their modern equivalent) for the high core count to run multiple VMs. But I do agree with HUB in this case. Try saying that to AMD fanboys though, lol. I swear they are more rabid than others. I got downvoted to hell because I said DLSS alone makes NVIDIA better than AMD by far and that I won't consider AMD until they release a DLSS equivalent.


EDIT: Intel was way worse than this lol. They used to remove features and then re-introduce them a couple of generations later as "new". Intel was shameless af back in the day, and that's why they get no sympathy from me.
 
Ah yeah, there's still a group of AMD lifers who haven't tried DLSS at 1440p/4K and think it is just a slightly better FSR. But DLSS is (or at least used to be) leagues above FSR2; I'm not sure how much that has changed with FSR3, but I suspect not very much. DLSS looks better than native TAA in many games to this day. But AMD's rabid fanbase kept insisting on raster performance over anything else.
To be fair to AMD, this isn't a vulnerability in their branch prediction, unlike many of the Intel ones such as Spectre. That article says it has to do with a compatibility feature of System Management Mode. That sounds like it would be much easier to patch without any performance hit. You also need a kernel level exploit in-place on the target computer to be able to infect it with this exploit. If someone has kernel level access on your system, you're already ****ed.
 
There is a vulnerability in every piece of hardware and software. I'm saying this as someone who works closely with SecOps. They say there should be zero vulnerabilities, but that can never be the case. We upgrade vendor software to fix one, and within a week a few more are identified.

Coming to this one: AMD announced it and has already plugged it. And to get to that, one needs access to the kernel itself. So, chill. You should see what kind of vulnerabilities the Chrome browser brings. It's so much that our company banned it and asked us to use Edge. Fun fact: though Edge is based on Chromium, it is far more secure.
 
You also need a kernel level exploit in-place on the target computer to be able to infect it with this exploit. If someone has kernel level access on your system, you're already ****ed.

Coming to this one: AMD announced it and has already plugged it. And to get to that, one needs access to the kernel itself. So, chill.
No, even if a kernel-level exploit is needed to infect the target system, this is neither a "chill" nor an "already ****ed" situation, because this infection survives reboots.

So gone are the days of disconnecting your SSDs/hard drives and booting potentially compromised images to analyse them. Even lightweight virtualization like Docker will be vulnerable until this is fixed, and maybe VMware/VirtualBox as well.

An infection surviving reboots fundamentally changes the usage scenario. Now once infected, always infected even if you nuke your storage.
 
This kind of firmware infection could be new to you, but it is becoming more and more common. Router firmware infections are the most common attacks these days.

Prevention is rather easy: do not use an admin or high-privilege account, and protect your router and other vulnerable devices.

You need not discard your storage or hardware. Detection is hard, but fixing is not as scary as the news makes it look. I will try to catch hold of an expert from our CISO team next week and get more details on how they identify and fix firmware infections.
 
Not this one. You need an SPI flasher and an engineer who would have to go through the SMM code and figure out the malicious part; the cheapest way is literally to just trash the entire CPU.
 
An infection surviving reboots fundamentally changes the usage scenario. Now once infected, always infected even if you nuke your storage.
You need not discard your storage or hardware. Detection is hard, but fixing is not as scary as the news makes it look. I will try to catch hold of an expert from our CISO team next week and get more details on how they identify and fix firmware infections.

You just need to reflash the SPI BIOS chip with a patched/clean image using an 800-buck flasher, which will remove the infected code. The hard part was attaching the clip to the IC legs properly.
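Roughly like this, assuming the usual CH341A-type programmer and the open-source flashrom tool driven from a second machine; the tool choice and the file names are my assumptions, not something stated above:

```python
# Sketch of the reflash, run on another machine with the clip attached.
import subprocess

PROGRAMMER = "ch341a_spi"  # the cheap USB SPI flasher people usually mean

def flashrom(*args: str) -> None:
    subprocess.run(["flashrom", "-p", PROGRAMMER, *args], check=True)

flashrom("-r", "suspect_dump.rom")   # back up whatever is currently on the chip
flashrom("-w", "vendor_clean.rom")   # write a known-good vendor BIOS image
flashrom("-v", "vendor_clean.rom")   # verify the chip matches the clean image
```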

But you're missing the point: if your kernel is compromised, then you're already in deep trouble. It's like complaining about the smoke hindering your vision when your whole house is on fire.

Plot twist: Asus mobos have a rootkit called Armoury Crate that lives in the BIOS and can download + install software without user knowledge/input.
 
Ah yeah, there's still a group of AMD lifers who haven't tried DLSS at 1440p/4K and think it is just a slightly better FSR. But DLSS is (or at least used to be) leagues above FSR2; I'm not sure how much that has changed with FSR3, but I suspect not very much. DLSS looks better than native TAA in many games to this day. But AMD's rabid fanbase kept insisting on raster performance over anything else.
My reasoning is straightforward. Can we all agree that unless we have a 4090 at the bare minimum, we need upscaling? Yes? Is DLSS miles above FSR? Yes? Do you give a shit about image quality and graphics rather than just raw FPS? Yes? Then Nvidia, that's it. It's beyond pathetic that Intel poached a few Nvidia engineers and got a better upscaling solution than FSR.

Btw, coming to fans: don't join a certain Indian hardware server, because speaking of rabid fans, that server's admins come pretty close XD
But you're missing the point: if your kernel is compromised, then you're already in deep trouble. It's like complaining about the smoke hindering your vision when your whole house is on fire.

Plot twist: Asus mobos have a rootkit called Armoury Crate that lives in the BIOS and can download + install software without user knowledge/input.
So basically if you play Valorant or use OEM software, you're at their mercy? *cough* CrowdStrike *cough* I knew there was a reason I despise both ASUS and Valorant.
 
Prevention is rather easy: do not use an admin or high-privilege account, and protect your router and other vulnerable devices.
So you agree with what you replied to: "gone are the days" and "fundamentally changes the usage scenario".
But you're missing the point: if your kernel is compromised, then you're already in deep trouble.
No. A kernel compromise was (and is) the prerequisite for the infection, and you used to be able to analyse such infections from a separate boot or on virtual hardware. That is no longer possible.
 