Intel's upcoming Core Ultra 9 285K is reportedly (slightly) slower than the i9-14900K in gaming

The chip is not very exciting. It's a new socket, so people can't just buy the chip; there is the board cost involved too. Things don't look good for Intel. However, I believe that if, and it's a big IF, Intel can make this new socket last a bit longer (say 3-4 generations) and sell the new chips a bit cheaper, they can make this situation work in their favor. They are not the fastest or most efficient, but they can be the best value. I would happily buy an Intel chip now; Intel knows they suck, we know they suck, and that can bring the prices down.

Reminds me of first gen ryzen.

This isn't even remotely comparable to Ryzen 1st Gen.

When Ryzen 1xxx launched, AMD offered consumers the benefit of 'high core count at value pricing'. It was a dream getting a 12-core R5 1600 at a $219 launch price, when Intel was selling the 6-core i5 8400 at $170-180.

Couple that with cheap B350 motherboards, a platform that would last 3-4 generations, beefy stock coolers (much better than the shit Intel was providing in the name of 'stock coolers'), and the potential to OC, and you had a winner. Yes, Ryzen 1st Gen was lacking in terms of gaming performance and the memory support was weird, but the pros more than made up for the cons.

Fast-forward to the tail end of 2024, and the CPU market isn't the same anymore. There are much better offerings at much lower prices than this sheep's turd Intel came up with. Hell, even Intel's own previous generations (13th and 14th) offer better value, not that I would suggest anyone buy those.

Then why should anyone even be the guinea pig and purchase these current shit offerings from Intel, even if the prices are slashed (hypothetically) and Intel doesn't change the chipset (there's a higher chance of seeing Halley's Comet than seeing this)?

They deserve to get annihilated right now, so that they go back to the drawing board and rise from the ashes with a better plan. Similar to Apple and AMD, both of which are currently leaders in their respective industries and were a few steps away from bankruptcy back in the day.
 
Lol, I've had 3 Nvidias and 1 AMD. Sticking to a particular brand in spite of lower VFM is the definition of fanboyism. If you think 40 fps is good, clearly your brain is slow enough to not process any more fps.
Bro here, if he had even a smidge more reading comprehension and, more importantly, enough processing power in his brain, would have seen that's my personal preference and not the card's capability. Or maybe, if someone had gone through their own links, they would have seen that a 4070 Ti is more than capable of giving 60+ fps with just DLSS and RT at 1440p+ as long as path tracing is not used, or that with frame gen, even path tracing is more than doable in the 60s.
Titles like Cyberpunk are the epitome of what RT/PT can do: it does look really pretty with PT
Bad example; I can share way better than this, and if you want proper comps, Digital Foundry is the best there is.
Edit: this is how I see you:
Maybe if you worked on your English skills and reading comprehension, you would have noticed this is what I've been saying ever since this idiotic debate started: either you go all in on visuals or compromise and get the better-VFM AMD.
Nope, CAS is very good, and at default settings via ReShade I don't see many artifacts at all. Only if I crank it higher.
I had tried NIS earlier and got artifacts, but it's possible that I did not use it properly. Googled a bit later and many seem to like CAS too.
Anyway..
Both are crap, but that honestly depends on peeps' preferences; if CAS works for you, then use it.
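For context, the core trick in CAS is adaptive sharpening: it sharpens less where local contrast is already high, which is why default settings rarely produce halos. A very simplified Python/NumPy sketch of that idea (not AMD's actual FidelityFX shader, just an illustration of the adaptive weight):

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.2) -> np.ndarray:
    """Toy contrast-adaptive sharpen for a grayscale float image in [0, 1].

    Illustrative only; real FidelityFX CAS is a shader with different weights.
    """
    p = np.pad(img, 1, mode="edge")              # give every pixel a full neighbourhood
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    centre = p[1:-1, 1:-1]

    local_min = np.minimum.reduce([centre, up, down, left, right])
    local_max = np.maximum.reduce([centre, up, down, left, right])

    # headroom before clipping at 0 or 1; high-contrast areas get a small weight
    headroom = np.minimum(local_min, 1.0 - local_max)
    weight = strength * np.clip(headroom / (local_max - local_min + 1e-5), 0.0, 1.0)

    # unsharp-mask style blend driven by the adaptive weight
    neighbour_avg = (up + down + left + right) / 4.0
    return np.clip(centre + weight * (centre - neighbour_avg), 0.0, 1.0)
```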
Boss, I am playing at 4K. DLDSR from 4K is going to be a multiple of 4K, and that's not feasible for my 3080. That is what I meant by performance cost: 4K vs 4K*x, or 4K + DLSS vs 4K*x + DLSS.
You don't need DLDSR then, lol. DLAA is the way for you; DLDSR is for peeps like me who game at 1440p and want to increase visual fidelity with a dirty hack.
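To put rough numbers on the "multiple of 4K" point: DLDSR's 1.78x and 2.25x factors apply to total pixel count, so from 1440p you land around 4K's worth of pixels, while from 4K you're pushing close to 19 megapixels. A quick sketch (display resolutions assumed, swap in your own):

```python
# DLDSR factors (1.78x / 2.25x) scale total pixel count; per-axis scale is the square root
def dldsr_render_res(width: int, height: int, factor: float) -> tuple[int, int]:
    per_axis = factor ** 0.5
    return round(width * per_axis), round(height * per_axis)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    for factor in (1.78, 2.25):
        rw, rh = dldsr_render_res(w, h, factor)
        print(f"{name} @ DLDSR {factor}x -> {rw}x{rh} (~{rw * rh / 1e6:.1f} MP)")
```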
This has become a disease ever since temporal stuff started being used; it has nothing to do with RT.
I did not need to sharpen Watch Dogs 1 with SMAA, which was nice and crisp. The same game has a TAA-type AA and it's horrendous.
Not all games have bad TAA; id generally seem to do it well, for example.
And 4K should be better, but RDR2 is in a league of its own in terms of blurriness.
Any game that uses TAA and has a lot of foliage will suffer from this issue (check out HZD to see what I mean); in city-esque landscapes, TAA's shortcomings are hidden quite easily.
The best RT I have seen so far has been in Metro Exodus. And it performed well too, even with multiple bounces that added colour nicely. Fire looked nice.
Doom Eternal is nice too (really love this game).
Hardware Unboxed are really good, but what they are not good at is comps; they'll show you the worst-case comparisons to prove their point. If you really want comparisons/perf cost, Digital Foundry is the best there is. I would rather sit through the games that HUB showed and then look at DF's video on them to see whether RT is worth it or not.

Besides, if you want a proper RT experience, download Cyberpunk and enable Overdrive/Ray Reconstruction; the visual quality is just chef's kiss, especially Ray Reconstruction in the base game, where, when driving past, let's say, a billboard, you can see even the reflection colors changing on the car in real time (before, they didn't), or the clarity in reflections. Path tracing/Overdrive is mostly noticeable in global illumination unless you are looking for it, and it's especially visible in Dogtown, where there are many localised light sources at night and you can see how natural the bounce lighting is from each singular source of light.

I managed to get path tracing working on a 3060 Ti with DLSSTweaks. I think I used preset C, with the scaling factor set to 0.51ish; the game had a lot of softer edges and barely gave me 30s, but damn it looked good.
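For reference, DLSS scaling ratios are per axis, so 0.51 means the internal render is only about a quarter of the output pixels, which explains the soft edges. A quick back-of-the-envelope sketch (assuming a 1440p output like mine; swap in your own resolution):

```python
# DLSS ratio is per axis, so pixel count scales with ratio squared
def dlss_internal_res(out_w: int, out_h: int, ratio: float) -> tuple[int, int]:
    return round(out_w * ratio), round(out_h * ratio)

out_w, out_h = 2560, 1440          # assumed 1440p output
w, h = dlss_internal_res(out_w, out_h, 0.51)
print(f"internal render: {w}x{h} ({w * h / (out_w * out_h):.0%} of output pixels)")
```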
 
It was a dream getting a 12-core R5 1600 at a $219 launch price, when Intel was selling the 6-core i5 8400 at $170-180.

Couple that with cheap B350 motherboards, a platform that would last 3-4 generations, beefy stock coolers

The R5 1600 was 12-core? It wasn't. At that point AMD didn't promise any platform longevity either; AMD even refused support after they released a new generation, and only after pressure did they extend support to older motherboards.
 
This isn't even remotely comparable to Ryzen 1st Gen.

When Ryzen 1xxx launched, AMD offered consumers the benefit of 'high core count at value pricing'. It was a dream getting a 12-core R5 1600 at a $219 launch price, when Intel was selling the 6-core i5 8400 at $170-180.

Couple that with cheap B350 motherboards, a platform that would last 3-4 generations, beefy stock coolers (much better than the shit Intel was providing in the name of 'stock coolers'), and the potential to OC, and you had a winner. Yes, Ryzen 1st Gen was lacking in terms of gaming performance and the memory support was weird, but the pros more than made up for the cons.

Fast-forward to the tail end of 2024, and the CPU market isn't the same anymore. There are much better offerings at much lower prices than this sheep's turd Intel came up with. Hell, even Intel's own previous generations (13th and 14th) offer better value, not that I would suggest anyone buy those.

Then why should anyone even be the guinea pig and purchase these current shit offerings from Intel, even if the prices are slashed (hypothetically) and Intel doesn't change the chipset (there's a higher chance of seeing Halley's Comet than seeing this)?

They deserve to get annihilated right now, so that they go back to the drawing board and rise from the ashes with a better plan. Similar to Apple and AMD, both of which are currently leaders in their respective industries and were a few steps away from bankruptcy back in the day.

12 Cores.... you got your threads and cores confused.

The point is not that it's like first-gen Ryzen. The point is that the "situation" is like first-gen Ryzen. Intel can make this work in their favor. Remember, there is no bad product, just a bad price.

I would happily buy Intel if they promise platform longevity and slash prices below (well below) their AMD counterparts'. I bought Ryzen when everyone was Intel-Intel-Intel.
 
Why would anyone pay 589 USD for a 250W TDP, 8 P-core + 16 E-core CPU in 2024 when you can buy AMD's 7700X 8C/16T for a lot cheaper? :p

 
Why would anyone pay 589 USD for a 250W TDP, 8 P-core + 16 E-core CPU in 2024 when you can buy AMD's 7700X 8C/16T for a lot cheaper? :p
Doesn't make sense for most gamers currently, but when more mainstream games start taking advantage of more cores/threads, it will. Like in the case of RDR2 when it was first released: performance scaled almost linearly as the number of cores/threads increased, with lower-end processors bottlenecking the GPUs.

And if you are into RTS/strat/management games, more CPU cores are a necessity; all that NPC AI and all those moving elements need CPU time to simulate.
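Something like this toy sketch shows why: the per-NPC work is pure CPU load, and if it can be split cleanly across cores, wall time drops roughly with core count. All names and numbers here are made up for illustration; real game AI rarely parallelises this cleanly because NPCs share state.

```python
# Toy illustration: a CPU-bound "NPC update" pass split across cores.
from concurrent.futures import ProcessPoolExecutor
import math
import os
import time

def update_npc(seed: int) -> float:
    # stand-in for pathfinding/decision logic: pure CPU work per NPC
    x = float(seed)
    for _ in range(100_000):
        x = math.sin(x) * math.cos(x) + 1.0001
    return x

if __name__ == "__main__":
    npcs = list(range(1_000))
    for workers in (1, os.cpu_count() or 1):
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(update_npc, npcs, chunksize=50))
        print(f"{workers:>2} worker(s): {time.perf_counter() - start:.2f}s")
```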
 
Doesn't make sense for most gamers currently, but when more mainstream games start taking advantage of more cores/threads, it will. Like in the case of RDR2 when it was first released: performance scaled almost linearly as the number of cores/threads increased, with lower-end processors bottlenecking the GPUs.

And if you are into RTS/strat/management games, more CPU cores are a necessity; all that NPC AI and all those moving elements need CPU time to simulate.
Agreed, but we already have AMD CPUs for that, right? More cores/threads that are faster in real-world applications & games and more power-efficient than Intel's offerings.
 
Why would anyone pay 589 USD for a 250W TDP, 8 P-core + 16 E-core CPU in 2024 when you can buy AMD's 7700X 8C/16T for a lot cheaper? :p

Yeah, exactly, no one would pay 589 for that CPU. Intel should be smart enough at this point to understand this basic fact. However, if the prices get low enough, they can make something of this situation. They moved to a chiplet design, so they are saving money (albeit not as much as AMD). Also, AMD prices will increase, so there is that. Either way, this "could be" the beginning of something good for Intel.

Also, it doesn't even beat the 5800X3D in all scenarios.
 
Agreed, but we already have AMD CPUs for that, right? More cores/threads that are faster in real-world applications & games and more power-efficient than Intel's offerings.
I haven't used AMD CPUs, so I don't know much about them and was just going through the # of cores/threads on paper; does 8C/16T not bottleneck in strat/RTS games?
 
I haven't used AMD CPUs, so I don't know much about them and was just going through the # of cores/threads on paper; does 8C/16T not bottleneck in strat/RTS games?
Even if 8C/16T does bottleneck, AMD will have a CPU that eventually costs less for more cores and threads. Don't forget the platform savings. AMD is still releasing AM4 CPUs (with the 5600T and XT on the roadmap), and given how stable AM4 is, I don't think it's gonna die soon. I haven't bought an Intel CPU in 7 years. The only compelling CPU (in my opinion) they have produced in the last 5-6 years is the 13600K, and given that the platform is just 2 generations long, I didn't buy that either.
 
I haven't used AMD CPUs, so I don't know much about them and was just going through the # of cores/threads on paper; does 8C/16T not bottleneck in strat/RTS games?
I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won the 2024 CPU battle despite releasing an unimpressive non-X3D 9000 series.
 
Just curious, how do you use both AMD & Nvidia graphics tech together in a PC?
FSR/CAS/any other AMD tech is hardware-agnostic and software-based, and they mostly release it for competitors' hardware too.
I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won this battle despite showing an unimpressive 9000 series.
Ah yeah, makes sense; I was looking at it through a perf-only lens.
Even if 8C/16T does bottleneck, AMD will have a CPU that eventually costs less for more cores and threads. Don't forget the platform savings. AMD is still releasing AM4 CPUs (with the 5600T and XT on the roadmap), and given how stable AM4 is, I don't think it's gonna die soon. I haven't bought an Intel CPU in 7 years. The only compelling CPU (in my opinion) they have produced in the last 5-6 years is the 13600K, and given that the platform is just 2 generations long, I didn't buy that either.
Yeah, that is a good point; you save on mobo costs on AMD, which can be significant, and you can then invest the savings in a better processor.
 
I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won the 2024 CPU battle despite releasing an unimpressive 9000 series.
And for the sake of all that's holy, CPU manufacturers should switch to a BIENNIAL refresh. They can release a mid-gen refresh for enthusiasts, of course... the point is, it would extend platform life and give these guys time to REALLY improve the tech instead of fudging numbers to look good (goes for both AMD and Intel). Remember, Intel not being good is gonna hurt AMD (and us) eventually, cuz they will have no competition.

One good (nay, great) thing, however, is AMD and Intel (and other tech groups) collaborating on the x86 advisory group (with Linus Torvalds as a luminary). We do not need proprietary sh**t in the PC industry. RISC CPUs were once the bread and butter of workstations; there is a reason the PC went with x86.

Edit: By proprietary sh*t I mean ARM; I don't need soldered RAM, custom boot, and locked devices. They have started creeping into laptops; let's hope that remains a small, uneducated part of the customer base. Keep that sh*t away from PCs and laptops.
 
Just curious, how do you use both AMD & Nvidia graphics tech together in a PC?
Nvidia because I have an Nvidia GPU. AMD because AMD is nice and open-sources stuff, which modders can then use.
In this case I use ReShade to do this. The perf impact is minimal, maybe less than 5%.
It makes things clearer without any downside if you use default settings - sort of like putting on glasses vs taking them off. I use it very frequently.

------------------------
What a waste of a CPU. Intel seems to be in trouble; nothing is working well for them these days.
Seems like this is what happens when you put a finance guy as CEO of a tech company. He takes money out of the company, does not invest enough, and does buybacks...
 
I haven't used AMD CPUs, so I don't know much about them and was just going through the # of cores/threads on paper; does 8C/16T not bottleneck in strat/RTS games?
No, it does not. I have a 5800X (non-3D) that I have paired with a 3090 (I used a 1650 earlier). I played AOE and Civilization 6 when I had the 1650. In the current setup, every game is lag-free. There are a few games, like Cyberpunk, that show signs of slowness if I put RT at max without frame generation.
Yeah, exactly, no one would pay 589 for that CPU. Intel should be smart enough at this point to understand this basic fact. However, if the prices get low enough, they can make something of this situation. They moved to a chiplet design, so they are saving money (albeit not as much as AMD). Also, AMD prices will increase, so there is that. Either way, this "could be" the beginning of something good for Intel.
The problem for Intel is their quantity strategy. Their margins are wafer-thin as they are cramming more and more cores/transistors onto their CPUs. I don't know if they can compete with AMD on pricing this time around, as they had to use TSMC fabs, and that cost them more money since their own fabs are not yet operational. AMD rubbed salt in their wounds by doing price cuts for their Zen 5 CPUs.

I am expecting these CPUs to get better with time, as Intel is also seeing issues with Windows, just like AMD did when they launched Zen 5. It's a real shame that Arrow Lake cannot beat 14th gen in gaming.
 
No, it does not. I have a 5800X (non-3D) that I have paired with a 3090 (I used a 1650 earlier). I played AOE and Civilization 6 when I had the 1650. In the current setup, every game is lag-free. There are a few games, like Cyberpunk, that show signs of slowness if I put RT at max without frame generation.
Did you reach the endgame? I haven't played Civ much and am mainly an Anno player, but in 1800, at the start I easily clear 80s, yet by the time I am 400-500 hours in, I barely get 30s. No RAM bottleneck from what I can tell; usage is around 25-26 gigs in Anno.

I fill up my 32 gigs when I am 800-900 hours in, with all islands almost maxed out.