It just looks like they will launch Arrow Lake and Lunar Lake without any fanfare. It would be a soft launch.
I think they will postpone. With the controversy currently surrounding 13th and 14th gen, it's not the right set of conditions for an Arrow Lake or Lunar Lake launch. It's better to wait till this settles.
If AMD sets decent prices for new and even older processors in India, then, until Intel settles the 13th/14th Gen issue, AMD will be in the position Intel once held in India.
Do you know that Intel pushing more and more cores, and more voltage to power those cores, is what is causing the 13th and 14th gen failures? Both are at opposite ends of the spectrum. I don't want 23984623984732 cores in my CPU, but I don't like paying so much for an 8-core CPU either. What we need is something in between. I'm still wondering why AMD did not ditch the 6-core chiplet and go with 8-core and 12-core chiplets. Probably they wanted to keep power consumption in check? A 24-core Ryzen would be epic, though.

The highest core count for 14th gen is 24 cores with 32 threads; I don't know why you are exaggerating it so much. But if you have played any Sony console port, or a modern management/building game like Anno that is designed for higher core counts, you'll see why we want more cores. Any "AI" you see in games, whether it's simple pedestrians strolling on the street or enemies, runs on the CPU for its behaviour and pathing, and this is where games that take advantage of CPU multi-processing shine. I don't know what AMD is smoking, releasing 8-core CPUs as mainstream, but I'm not well versed in CPUs, and AMD must be doing something right with their "X3D" CPUs, since no reviewer is trolling them for low core counts. And forget gaming: since you run so many VMs, are you really fine with an 8C/16T CPU?
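To make the "per-agent AI work scales with cores" point concrete, here is a toy sketch (entirely hypothetical, not taken from any game engine) that fans independent pathfinding queries out across CPU cores with Python's multiprocessing pool. The grid, the wall, and the agent list are invented purely for the example.

```python
# Toy illustration: fanning per-agent pathfinding out across CPU cores.
# Grid, wall, and agents are made up; real engines use job systems and far
# more elaborate navigation, but the scaling idea is the same.
from collections import deque
from multiprocessing import Pool

GRID_W, GRID_H = 64, 64
BLOCKED = {(10, y) for y in range(5, 60)}  # a wall the agents must route around

def bfs_path_length(args):
    """Breadth-first search from start to goal; returns the number of steps."""
    start, goal = args
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        (x, y), dist = frontier.popleft()
        if (x, y) == goal:
            return dist
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < GRID_W and 0 <= ny < GRID_H \
                    and (nx, ny) not in BLOCKED and (nx, ny) not in seen:
                seen.add((nx, ny))
                frontier.append(((nx, ny), dist + 1))
    return -1  # unreachable

if __name__ == "__main__":
    # One path query per "pedestrian"; more cores -> more agents per frame budget.
    agents = [((0, i % GRID_H), (GRID_W - 1, (i * 7) % GRID_H)) for i in range(256)]
    with Pool() as pool:  # defaults to one worker per CPU core
        lengths = pool.map(bfs_path_length, agents)
    print(sum(l >= 0 for l in lengths), "agents found a path")
```

With truly independent queries like these, throughput scales roughly with the number of workers, which is the sense in which games with many simulated agents can put extra cores to use.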
When I turn my desktop on (most evenings and weekends), it's either for gaming or for working on VMs/pods. That includes two hours of daily game time for the kids (each one gets an hour). So yes, it is at heavy utilization whenever it is on. We'll see where desktop CPUs/GPUs are heading. My RTX 3090 is working perfectly; if I ever face an issue with it, I may ditch the desktop and go for a console, since I have a MacBook.

Yeah, you are not even going to breach the Rs. 50/month mark in energy savings, assuming your PC runs at most 5 hours a day (2 hours of kiddo time and a couple of hours at most for you).
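For reference, a quick back-of-the-envelope version of that Rs. 50/month figure. The extra wattage and the electricity tariff below are assumed values chosen for illustration, not numbers from this thread.

```python
# Rough check on the "under Rs. 50/month" claim.
# The wattage delta and tariff are assumptions, not measured figures.
extra_watts = 40          # assumed average extra draw of the heavier setup (W)
hours_per_day = 5         # ~2 h of kids' game time + a couple of hours of your own
tariff_inr_per_kwh = 8.0  # assumed domestic tariff (Rs./kWh); varies by state and slab

kwh_per_month = extra_watts * hours_per_day * 30 / 1000
cost_per_month = kwh_per_month * tariff_inr_per_kwh
print(f"{kwh_per_month:.1f} kWh/month ~= Rs. {cost_per_month:.0f}/month")
# -> 6.0 kWh/month ~= Rs. 48/month with these numbers
```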
This is priced very high.

Way too high, lol. AMD is going the Nvidia way in the CPU market.
HUB is always like this. Look at any of their reviews, even for GPUs: they refuse to consider vendor-specific enhancements, be it something like DLSS or any updates/enhancements that come later. They are really into "apples to apples" comparisons, but most of us need a lot more nuance, which they don't entertain in the name of "fairness". That's why it's a love-hate relationship for me: they do very thorough testing, and they don't put me to sleep like Tech Jesus does.
I feel like the HUB guys are a bit dumbstruck by the lack of basic numeracy among other tech reviewers.
Right, but HUB is not being too cynical or extreme in this case. They're just reporting the facts and protecting consumers. Most tech YouTubers' viewer bases are gamers and content creators, not Linux sysadmins or some tiny minority of AVX-512 SIMD aficionados. In truth, the enthusiast CPU market is essentially propped up by PC gamers; they buy the vast majority of these higher-end CPUs. Datacenters and enterprises are definitely not buying desktop chips, let alone a 9600X or a 9700X. Saying Zen 5 is a complete disappointment for its target market is just objective fact and honest reporting.
That's why I say it's a love-hate relationship with HUB, lol.
These CPUs are not made for data centers; they mostly go for Threadripper (or its modern equivalent) for the high core counts needed to run multiple VMs. But I do agree with HUB in this case. Try saying that to AMD fanboys, though, lol; I swear they are more rabid than most. I got downvoted to hell because I said DLSS alone makes Nvidia better than AMD by far, and that I won't consider AMD until they release a DLSS equivalent.
Ah yeah, there's still a group of AMD lifers who haven't tried DLSS at 1440p/4K and think it is just a slightly better FSR. But DLSS is (or at least used to be) leagues above FSR 2; I'm not sure how much that has changed with FSR 3, but I suspect not very much. DLSS looks better than native TAA in many games to this day. Yet AMD's rabid fanbase kept insisting on raster performance over everything else.
EDIT: Intel was way worse than this, lol. They used to remove features and then re-introduce them a couple of generations later as "new". Intel was shameless back in the day, and that's why they get no sympathy from me.
AMD can't score even with an open goal, it seems.
AMD's 'Sinkclose' vulnerability affects hundreds of millions of processors, enables data theft — AMD begins patching issue in critical chip lines, more to follow
Almost cannot be fixed, say researchers. (www.tomshardware.com)

To be fair to AMD, this isn't a vulnerability in their branch prediction, unlike many of the Intel ones such as Spectre. The article says it has to do with a compatibility feature of System Management Mode, which sounds like it would be much easier to patch without any performance hit. You also need a kernel-level exploit already in place on the target computer to be able to infect it with this one. If someone has kernel-level access to your system, you're already ****ed.
There is a vulnerability in every piece of hardware and software. I'm saying this as someone who works closely with SecOps. They say there should be zero vulnerabilities, but that can never be the case: we upgrade vendor software to fix one, and within a week a few more are identified.
You also need a kernel-level exploit in place on the target computer to be able to infect it with this exploit. If someone has kernel-level access to your system, you're already ****ed.
Coming to this one: AMD announced it and plugged it already. And to get to it, one needs access to the kernel itself. So, chill.

No. Even if a kernel-level exploit is needed to infect the target system, this is neither a "chill" nor an "already ****ed" situation, because the infection survives reboots.
This kind of firmware infection may be new to you, but it is becoming more and more common. Router firmware infections are among the most common attacks these days.
So gone are the days of disconnecting your SSDs/hard drives and booting potentially compromised images to analyse them. Even lightweight virtualization like Docker will be vulnerable until this is fixed, and maybe VMware/VirtualBox as well.
An infection surviving reboots fundamentally changes the usage scenario. Now it's once infected, always infected, even if you nuke your storage.
You need not discard your storage or hardware. Detection is hard, but fixing is not as scary as the news makes it sound. I will try to catch hold of a CISO expert at my office next week and get more details on how they identify and fix firmware infections.

Not this one: you need an SPI flasher and an engineer willing to go through the SMM code and figure out which parts are malicious. The cheapest way is literally to just trash the entire CPU.
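As a minimal sketch of the "dump and compare" idea behind such checks: hash a dump taken with an external SPI flasher against a known-good image. The file names are placeholders, and a real comparison would have to mask NVRAM and other board-specific regions rather than expect a byte-identical match.

```python
# Minimal "dump and compare" sketch under the assumptions above: hash an SPI
# flash dump taken with an external programmer and compare it against a
# known-good reference image supplied by the user.
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 so large firmware images stay cheap to hash."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # usage: python check_dump.py spi_dump.bin vendor_image.bin  (placeholder names)
    dump, reference = sys.argv[1], sys.argv[2]
    if sha256_of(dump) == sha256_of(reference):
        print("Dump matches the reference image.")
    else:
        print("Dump differs from the reference image - inspect or reflash.")
```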
My reasoning is straightforward. Can we all agree that unless you have at least a 4090, you need upscaling? Yes? Is DLSS miles above FSR? Yes? Do you care about image quality and graphics rather than just FPS? Yes? Then Nvidia, that's it. It's beyond pathetic that Intel poached a few Nvidia engineers and ended up with a better upscaling solution than FSR.
But you're missing the point: if your kernel is compromised, then you're already in deep trouble. Like complaining about the smoke hindering your vision when your whole house is on fire.

So basically, if you play Valorant or use OEM software, you're at their mercy? *cough* CrowdStrike *cough*. I knew there was a reason I despise both ASUS and Valorant.
Plot twist: ASUS mobos have a rootkit called Armoury Crate that lives in the BIOS and can download and install software without the user's knowledge or input.
Prevention is rather easy: do not use an admin or high-privilege account, and protect your router and other vulnerable devices.

So you agree with what you replied to: "gone are the days" and "fundamentally changes the usage scenario".
But you're missing the point: if your kernel is compromised, then you're already in deep trouble.

No. A (potentially) compromised kernel is the very reason we analyse infections from a separate boot or on virtual hardware, and that is no longer possible.
Not this one: you need an SPI flasher and an engineer willing to go through the SMM code and figure out which parts are malicious. The cheapest way is literally to just trash the entire CPU.

Why? You just need to erase the entire firmware and flash a new, clean image.