Intel's upcoming Core Ultra 9 285K is reportedly (slightly) slower than the i9 14900K in gaming

Hey, as long as you're happy with 40 fps after paying 60k+ for a GPU, all's well and good. Most games don't even have RT or DLSS implemented by the developer, so for the majority, smooth, stable gameplay is preferable to eye-candy RT.

But my point was that Nvidia knows you want RT, for which you have to use DLSS to get playable fps, so why not cut the VRAM and make more profit, all while making the customer think they got a better deal?
 
but were massively improved by DLSS (since it uses its own AA) *cough* RDR2 *cough*,
DLSS in RDR2 is quite blurry in motion. I had to jump from DLSS Performance to DLSS Quality at 4K on a 32" monitor. Even then it has artifacts, and even after all that the game is so blurry that I had to use AMD CAS on top of it to make it reasonable.
But yeah, DLSS is good, especially at lower internal resolutions, but there are plenty of times when it doesn't do that well. DLSS Quality at 1080p in Control was very blurry too. It made the game look ugly; I only realized it later.
Lower VRAM is gonna hurt in the long run once we have to tone down texture quality and other things (even RT/FG) to reduce VRAM consumption.
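Rough numbers on why textures are usually the first thing to go. This is a back-of-the-envelope sketch, assuming BC7-style block compression (about 1 byte per texel) and a full mip chain; real games vary a lot, so treat it as illustrative only.

```python
# Back-of-the-envelope texture math. Assumptions (not measured data): BC7-style
# block compression at ~1 byte per texel and ~33% extra for the mip chain.

def texture_mb(size_px: int, bytes_per_texel: float = 1.0, mip_overhead: float = 1.33) -> float:
    """Approximate VRAM footprint of one square texture, in MB."""
    return size_px * size_px * bytes_per_texel * mip_overhead / (1024 ** 2)

budget_gb = 12                                  # hypothetical 12 GB card
t4k, t2k = texture_mb(4096), texture_mb(2048)   # ~21 MB and ~5 MB each
print(f"4K texture ~{t4k:.0f} MB, 2K texture ~{t2k:.0f} MB")
print(f"{budget_gb} GB holds ~{budget_gb * 1024 / t4k:.0f} 4K textures "
      "before geometry, render targets, the RT BVH and FG buffers take their cut")
```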

Nvidia isn't that interested in selling to retail; they make their money from AI these days. So we get crappy, compromised specs at stupid prices.
 
But my point was that Nvidia knows you want RT, for which you have to use DLSS to get playable fps, so why not cut the VRAM and make more profit, all while making the customer think they got a better deal?
Let's remove RT from the equation and do basic raster comparisons. If you are gonna buy a 7900XT or even a 7800XT, I think it's safe to assume you'll want 4K res, and maybe just a little RT, since almost all AAA games (aside from older PS game ports) use RT?

A 7800XT in Forbidden West at 4K cannot maintain even a steady 60, and even a 7900XT can only hold mid-60s at native. The point is, you need upscaling even on what is arguably an AMD flagship to get good fps; it's not like you can skip upscaling even in pure raster in a game like Forbidden West. And before you say it's because the game is not optimized: Forbidden West is a Nixxes port, and a damn good one too.

And if you are using upscaling even for raster, why not pay more (or go down a tier) and get an Nvidia card with a far better upscaling solution in DLSS? FSR is a blurry, shimmery mess with atrocious ghosting even at Quality; Digital Foundry can give you as many examples of that as you need. And DLSS isn't alone: DLAA, when you have the GPU power, really improves the game's presentation, especially on a decent monitor. Then you get flexibility with DLSSTweaks, experimenting with different presets, changing scaling ratios, even forcing DLAA in games where it's not supported, and really upping picture quality. And I won't even get into DLSS FG, as that's a whole other level compared to FSR3; folks with a 40xx GPU can chime in here.
Hey, as long as you're happy with 40 fps after paying 60k+ for a GPU, all's well and good. Most games don't even have RT or DLSS implemented by the developer, so for the majority, smooth, stable gameplay is preferable to eye-candy RT.
Also, unless you are maxing out path tracing, there are no games where a 4070 Ti can't manage 60s at DLSS Balanced.
DLSS in RDR2 is quite blurry in motion. I had to jump from DLSS Performance to DLSS Quality at 4K on a 32" monitor. Even then it has artifacts, and even after all that the game is so blurry...
Your scaling ratio was set at 0.55 for Performance, what'd you expect? XD As for RDR2, compare the image quality to native and tell me DLSS doesn't eliminate almost all of it. As for CAS, why do you want to smear shit on your screen? Just add DLSSTweaks to RDR2 and force DLAA for a crystal-clear picture, or use DLDSR manually.
Nvidia isn't that interested in selling to retail; they make their money from AI these days. So we get crappy, compromised specs at stupid prices.
yep, sadly
 
Your scaling ratio was set at 0.55 for Performance, what'd you expect?
Isn't that the point of using upscaling? Otherwise the benefits shrink.
The Digital Foundry guys recommend DLSS Performance at 4K. Sometimes that can work, but not always, at least for me on a monitor at arm's length.

compare the image quality to native and tell me DLSS doesn't eliminate almost all of it.
Didn't try much; I played for a short time with TAA, I think, and it wasn't bad from what I remember, at least not at 4K. TAA tends to be shit at lower resolutions.
DLSS also introduces artifacts: noise in the moon, weird stuff around horse hair.
FSR Quality is also usable at 4K. I didn't compare much, but I assume DLSS is better, and it had better fps anyway.
FSR tends to be shit at lower base resolutions, and when pushed to extremes DLSS is much better than FSR.

None of this is native though. TAA also uses lower-resolution sampling or something, and that's why we sometimes take an image-quality hit in modern games versus the clean, sharp image in something like WD 1, for example. See the f**kTAA subreddit.

I get that DLSS is nice; I use it. But I will 100% not buy a VRAM-compromised GPU ever again (I have a 3080 10GB). 12GB for 60k+ GPUs is a joke.
I have seen what happens when VRAM is limited with a 1050 2GB. Somehow I did not realize the same was happening with the 3080 10GB.

Hopefully once we have 3GB VRAM chips, Nvidia will stop being an a**hole. Hopefully they don't compromise on bandwidth.
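For anyone curious about the bandwidth worry: capacity comes from module density while bandwidth comes from bus width times data rate, so denser 3GB modules raise capacity without needing a wider bus. The configurations below are hypothetical illustrations, not actual product specs.

```python
# Capacity vs bandwidth: one GDDR chip per 32-bit channel, bandwidth = bus width
# x per-pin data rate / 8. Configs below are made-up examples, not product specs.

def gpu_memory(bus_bits: int, chip_gb: int, gbps_per_pin: float) -> tuple[float, float]:
    chips = bus_bits // 32
    return chips * chip_gb, bus_bits * gbps_per_pin / 8   # (GB, GB/s)

for bus, chip, rate in [(192, 2, 21), (192, 3, 28), (256, 2, 21), (256, 3, 28)]:
    cap, bw = gpu_memory(bus, chip, rate)
    print(f"{bus}-bit bus, {chip} GB chips @ {rate} Gbps -> {cap:.0f} GB, {bw:.0f} GB/s")
```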

As for CAS, why do you want to smear shit on your screen? Just add DLSSTweaks to RDR2 and force DLAA for a crystal-clear picture, or use DLDSR manually.
Clearly you haven't used it. CAS is a great tool. DLDSR at 4K is not gonna work well on a 3080, DLAA does not solve the same issues as CAS, and both take a large perf hit compared to CAS.
I like CAS just as much as DLSS and have used it a lot in the last few years. Thankfully, AMD made it open source versus the proprietary shittiness from Nvidia. Hopefully the next AI-based FSR makes it competitive, and they fix RT perf.

CAS makes a still image sharper beyond what DLSS/TAA can manage. The default settings (in ReShade) are conservative, and I generally don't see noise unless I crank it higher. DLSS sharpening might do the same thing, but I liked CAS better when I first tested it, so I stick with it. Only with Doom Eternal did I not bother, as the default DLSS sharpening looks great.
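For anyone wondering what "contrast adaptive" actually means, here is a toy sketch of the idea in NumPy. It is not AMD's FidelityFX implementation or its exact math, just the core trick: sharpen flat areas more and strong edges less, which is why it rings less than a plain unsharp mask.

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.3) -> np.ndarray:
    """Toy contrast-adaptive sharpen on a 2D grayscale image in [0, 1].

    Not AMD's exact math: we estimate local contrast from the 3x3 min/max
    around each pixel and sharpen *less* where contrast is already high.
    """
    p = np.pad(img, 1, mode="edge")
    # Stack the 9 shifted copies that form each pixel's 3x3 neighborhood.
    neigh = np.stack([p[y:y + img.shape[0], x:x + img.shape[1]]
                      for y in range(3) for x in range(3)])
    lo, hi = neigh.min(axis=0), neigh.max(axis=0)
    local_contrast = hi - lo                      # 0 = flat area, 1 = full-range edge
    amount = strength * (1.0 - local_contrast)    # adapt: flat areas get more sharpening
    blurred = neigh.mean(axis=0)                  # cheap 3x3 box blur
    out = img + amount * (img - blurred)          # locally weighted unsharp-mask step
    return np.clip(out, 0.0, 1.0)

# e.g. sharpened = cas_like_sharpen(frame_gray)
```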
 
Isn't that the point of using upscaling? Otherwise the benefits shrink.
The Digital Foundry guys recommend DLSS Performance at 4K. Sometimes that can work, but not always, at least for me on a monitor at arm's length.
Not in every game. DLSS, just like any other upscaling solution, works better in some games and less so in others.
None of this is native though. TAA also uses lower-resolution sampling or something, and that's why we sometimes take an image-quality hit in modern games versus the clean, sharp image in something like WD 1, for example. See the f**kTAA subreddit.
Nope. TAA just uses previous frames as a reference; it doesn't downsample by itself. But the game itself, especially on consoles, might be running at a lower internal res, which exacerbates the problems caused by TAA. TAA issues shrink at higher resolutions simply because there is more information per frame, which is why you didn't find many issues when playing at 4K.
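For reference, the core of TAA really is just a running blend of the current frame into a reprojected history buffer. The sketch below is a heavily simplified toy (no jitter, motion vectors, or history clamping), just to show the "previous frames as reference" part.

```python
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """One TAA accumulation step on already-reprojected frames (toy version).

    Real TAA jitters the camera each frame, reprojects the history buffer with
    motion vectors, and clamps history against the current frame's neighborhood
    to fight ghosting; this only shows the core exponential blend.
    """
    return (1.0 - alpha) * history + alpha * current

# At higher output resolutions each frame carries more detail per screen area,
# so the blend has better data to work with, which matches the point above
# about TAA looking much better at 4K than at 1080p.
```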
I get that DLSS is nice; I use it. But I will 100% not buy a VRAM-compromised GPU ever again (I have a 3080 10GB). 12GB for 60k+ GPUs is a joke.
I have seen what happens when VRAM is limited with a 1050 2GB. Somehow I did not realize the same was happening with the 3080 10GB.
Yeah, it's definitely not for everyone, but I play at 1440p, and at least until next-gen consoles hit mainstream you'll be fine, since a PS5 on average has 12-13 GB usable as VRAM.
Hopefully once we have 3GB VRAM chips, Nvidia will stop being an a**hole. Hopefully they don't compromise on bandwidth.
yeah, not gonna count on that XD
Clearly you haven't used it. CAS is a great tool. DLDSR at 4K is not gonna work well on a 3080, DLAA does not solve the same issues as CAS, and both take a large perf hit compared to CAS.
I like CAS just as much as DLSS and have used it a lot in the last few years. Thankfully, AMD made it open source versus the proprietary shittiness from Nvidia. Hopefully the next AI-based FSR makes it competitive, and they fix RT perf.
Unless you mean something else by CAS other than AMD FidelityFX CAS, aka Contrast Adaptive Sharpening, then yes, I have used it, and you clearly haven't used either DLDSR or DLAA if you think CAS is better or that they both have a larger perf penalty compared to CAS.

DLDSR is nothing but setting the render resolution higher than native, like 4K (preferably an integer-scaled factor), and then using DLSS Performance to reach the higher res you set in the control panel, which is then downsampled back to native res. It can even improve performance somewhat: doing it manually without DLDSR has some issues, since there's overhead in converting the higher-res frame to native res, and DLDSR eliminates that bottleneck.

And assuming it's a 1080p monitor, you are essentially upscaling from native res and then downsampling it again, since DLSS Performance has a scaling factor of 2x per axis, so at a 4K target the internal res would be 1080p. There's virtually no performance penalty aside from losing a few frames and frametimes increasing a little when using DLDSR, and in fact at 1440p you'll get a minor boost in perf, since you'll be upscaling from 1080p instead of 1440p.

And DLAA further improves on this process by abstracting it all away: no need to set a resolution or anything, it's all done internally. The reason DLAA is not always recommended is that it runs at native res without any upscaling by design, so you don't get any perf boost, just a straight-up upgrade to image quality.
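To put numbers on the resolution arithmetic above, here is a rough sketch using the commonly cited per-axis DLSS ratios and Nvidia's 1.78x/2.25x DLDSR pixel multipliers; the exact Balanced ratio may differ slightly by version.

```python
# Per-axis DLSS render scales (commonly cited; Balanced is approximate) and
# Nvidia's DLDSR total-pixel factors. Treat these as illustrative, not gospel.

DLSS_PER_AXIS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def dlss_internal(w: int, h: int, mode: str) -> tuple[int, int]:
    s = DLSS_PER_AXIS[mode]
    return round(w * s), round(h * s)

def dldsr_target(w: int, h: int, pixel_factor: float) -> tuple[int, int]:
    s = pixel_factor ** 0.5          # per-axis scale from a total-pixel multiplier
    return round(w * s), round(h * s)

print(dlss_internal(3840, 2160, "Performance"))   # (1920, 1080): "4K Perf renders at 1080p"
print(dldsr_target(2560, 1440, 2.25))             # (3840, 2160): 1440p monitor, 2.25x DLDSR
print(dlss_internal(3840, 2160, "Quality"))       # (2560, 1440)
# 1440p monitor + DLDSR 2.25x + DLSS Performance => internal 1080p, hence the
# small perf uplift over rendering native 1440p described above.
```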

EDIT: just to clarify, next gen for me would be the PS6, since the PS5/XSX can now arguably be called current-gen consoles.
 
We disagree on a few things, but I don't want to make this a long debate, so this is probably my last reply.

Yeah, it's definitely not for everyone, but I play at 1440p, and at least until next-gen consoles hit mainstream you'll be fine, since a PS5 on average has 12-13 GB usable as VRAM.
Yeah, but there will be games that go beyond consoles. And then there are mods. I had trouble increasing LOD in W3 at 1440p because of VRAM limits. Things like that will keep getting worse over time.
12GB is at the absolute limit, so we basically get a timer on the GPU irrespective of its performance.
And then if you move to 4K in the future (4K 32" is nice, or a large TV, etc.) or ultrawide, because why not at this price range, there are more issues.

It's certainly a tradeoff these days: RT/DLSS or VRAM, or spend crazy money, or buy used previous gen.

Unless you mean something else by CAS other than AMD FidelityFX CAS, aka Contrast Adaptive Sharpening, then yes, I have used it, and you clearly haven't used either DLDSR or DLAA if you think CAS is better or that they both have a larger perf penalty compared to CAS.

DLDSR is nothing but setting the render resolution higher than native, like 4K (preferably an integer-scaled factor), and then using DLSS Performance to reach the higher res you set in the control panel, which is then downsampled back to native res. It can even improve performance somewhat: doing it manually without DLDSR has some issues, since there's overhead in converting the higher-res frame to native res, and DLDSR eliminates that bottleneck.

And assuming it's a 1080p monitor, you are essentially upscaling from native res and then downsampling it again, since DLSS Performance has a scaling factor of 2x per axis, so at a 4K target the internal res would be 1080p. There's virtually no performance penalty aside from losing a few frames and frametimes increasing a little when using DLDSR, and in fact at 1440p you'll get a minor boost in perf, since you'll be upscaling from 1080p instead of 1440p.

And DLAA further improves on this process by abstracting it all away: no need to set a resolution or anything, it's all done internally. The reason DLAA is not always recommended is that it runs at native res without any upscaling by design, so you don't get any perf boost, just a straight-up upgrade to image quality.
Yeah, I disagree quite a bit. You said CAS will smear shit, which makes no sense.
DLSS and CAS are different things; CAS is more comparable to Nvidia's sharpening filters. I use both CAS and DLSS and like both.

Some background: I bought the 3080 two years back. First played on 1080p, then upgraded to an Acer 1440p and then to 4K 32".

1) I have used DLDSR a lot at 1080p and 1440p. It's great for those resolutions because games look like shit these days at lower res. Also used it to play GTA 5, which has shit AA.
Digital Hardware had a video about how things are made for 4K and lower res is extra blurry. So this helped, and yeah, I used it with DLSS to get some perf back.
At 4K it's generally not needed, and anyway the performance cost would be too high.
DLDSR is not free; it costs a lot, and we use DLSS to get some of it back. But it's still costly compared to DLSS alone.
Also, you don't need integer scaling for it, which is how it differs from the older DSR (plus the machine-learning stuff).

2) DLAA does not improve upon DLDSR; they are different things. E.g., see here.

3) DLDSR/DLSS does not remove the benefits we can get from sharpening through CAS/NIS.
Some people like a soft image, so maybe they don't care. To me, image quality improves substantially when things get less blurry.
The tradeoff is that you can get artifacts (especially on far-away things) and some flickering in places.
CAS at default settings via ReShade does not seem to have much downside, and with very little perf impact; I love it.

In Witcher 3 next gen, I played at 1440p with DLDSR + DLSS (Balanced; Performance was blurry in motion), and on top of this I used CAS, and it looked better than DLDSR + DLSS alone.
Metro Exodus at 1440p looked very bad in motion, which DLDSR fixed (I used DLSS with it and it still fixed it). And I added CAS on top of that, which made things sharper in stills (e.g., blurry weapons).

As an aside -
So much nonsense we have to do these days because game developers have gone lazy.
WD 1 was nice and sharp. It's crazy to see older games looking cleaner than new ones sometimes.
Doom Eternal at 4K is beautiful and sharp with the default DLSS sharpening. No need for extra stuff, plus it has extremely good HDR.
 
Let's remove RT from the equation and do basic raster comparisons. If you are gonna buy a 7900XT or even a 7800XT, I think it's safe to assume you'll want 4K res, and maybe just a little RT, since almost all AAA games (aside from older PS game ports) use RT?

A 7800XT in Forbidden West at 4K cannot maintain even a steady 60, and even a 7900XT can only hold mid-60s at native.
Your GPU can't even do 1080p native @ 60 fps with RT enabled and you're trying to compare it with 4k native + RT performance on AMD. RIP logic. Whatever helps you cope and sleep at night.
 
We disagree on a few things, but I don't want to make this a long debate, so this is probably my last reply.


Yeah, but there will be games that go beyond consoles. And then there are mods. I had trouble increasing LOD in W3 at 1440p because of VRAM limits. Things like that will keep getting worse over time.
12GB is at the absolute limit, so we basically get a timer on the GPU irrespective of its performance.
And then if you move to 4K in the future (4K 32" is nice, or a large TV, etc.) or ultrawide, because why not at this price range, there are more issues.

It's certainly a tradeoff these days: RT/DLSS or VRAM, or spend crazy money, or buy used previous gen.
Yeah, that I can agree with: unless you throw money at it, you'll have to compromise quite a lot.
Yeah, I disagree quite a bit. You said CAS will smear shit, which makes no sense.
I mean, if you don't notice the oversharpened image that gets shimmery as heck, then all the more power to you. I meant shit as in literal shit, not blurriness.
DLSS and CAS are different things; CAS is more comparable to Nvidia's sharpening filters. I use both CAS and DLSS and like both.

Some background: I bought the 3080 two years back. First played on 1080p, then upgraded to an Acer 1440p and then to 4K 32".

1) I have used DLDSR a lot at 1080p and 1440p. It's great for those resolutions because games look like shit these days at lower res. Also used it to play GTA 5, which has shit AA.
Digital Hardware had a video about how things are made for 4K and lower res is extra blurry. So this helped, and yeah, I used it with DLSS to get some perf back.
At 4K it's generally not needed, and anyway the performance cost would be too high.
DLDSR is not free; it costs a lot, and we use DLSS to get some of it back. But it's still costly compared to DLSS alone.
Also, you don't need integer scaling for it, which is how it differs from the older DSR (plus the machine-learning stuff).

2) DLAA does not improve upon DLDSR; they are different things. E.g., see here.

3) DLDSR/DLSS does not remove the benefits we can get from sharpening through CAS/NIS.
Some people like a soft image, so maybe they don't care. To me, image quality improves substantially when things get less blurry.
The tradeoff is that you can get artifacts (especially on far-away things) and some flickering in places.
CAS at default settings via ReShade does not seem to have much downside, and with very little perf impact; I love it.
No offense man, but read up on what you are saying. Of course DLDSR (I'll just call it DSR, it's a mouthful) reduces performance by itself; you are setting your render res to 4K, so of course it's gonna cost frames. That's why you use DLSS with it, to essentially play at native res while getting a superior image. Saying DLDSR is costly in that setup just isn't true: you are free to look up 4K native benchmarks for your GPU and compare them to 1440p/1080p upscaled to 4K with DLDSR, and you'll see just how little difference there is. Integer scaling was never required even for vanilla DSR, just recommended; you can find countless threads on the reasons, so I won't bother detailing them here.

2. I never said DLAA and DLDSR were similar. I said you can use DLDSR with DLSS to simulate DLAA in games where it's not supported and you can't use DLSSTweaks.


In Witcher 3 next gen, I played at 1440p with DLDSR + DLSS (Balanced; Performance was blurry in motion), and on top of this I used CAS, and it looked better than DLDSR + DLSS alone.
Metro Exodus at 1440p looked very bad in motion, which DLDSR fixed (I used DLSS with it and it still fixed it). And I added CAS on top of that, which made things sharper in stills (e.g., blurry weapons).
Then you prefer oversharpened images, which is absolutely fine.
As an aside -
So much nonsense we have to do these days because game developers have gone lazy.
WD 1 was nice and sharp. It's crazy to see older games looking cleaner than new ones sometimes.
Doom Eternal at 4K is beautiful and sharp with the default DLSS sharpening. No need for extra stuff, plus it has extremely good HDR.
Well, I think it's more that with RT, most devs don't know how to optimise for it, plus publishers rush games out and force releases despite them not being nearly ready. There was a very good video on this topic, not sure if it was DF, where they discussed this: RT is so new that most devs don't have an established convention for how to optimize for it, and with bigger leaps in AI, many just skip optimization since it's cheaper and newer GPUs can power through it. And studios go all in with RT even in games that don't strictly need that much. I mean, unless you are Cyberpunk or Alan Wake 2, you don't need the full RT treatment; even a single ray bounce can do a lot of good for visuals while preserving performance. I'll try to find the video; they discussed a lot of nuances around this.
Your GPU can't even do 1080p native @ 60 fps with RT enabled and you're trying to compare it with 4k native + RT performance on AMD. RIP logic. Whatever helps you cope and sleep at night.
Have you looked at the benchmarks for a 7800XT with RT on? Even without RT, a 7800XT can't manage 60s in HZD. Anyone with even half a mind would agree the 7800XT is supposed to be a 4K card, but no.

Also,
This is LTT's review of the 4070 Ti; maybe try watching it and see how it manages to give 60+ fps native with RT in most games.
 
I mean, if you don't notice the oversharpened image that gets shimmery as heck, then all the more power to you. I meant shit as in literal shit, not blurriness.
Nope, CAS is very good, and at default settings via ReShade I don't see many artifacts at all, only if I crank it higher.
I had tried NIS earlier and got artifacts, but it's possible I didn't use it properly. I googled a bit later, and many people seem to like CAS too.
Anyway..
No offense man, but read up on what you are saying. Of course DLDSR (I'll just call it DSR, it's a mouthful) reduces performance by itself; you are setting your render res to 4K, so of course it's gonna cost frames. That's why you use DLSS with it, to essentially play at native res while getting a superior image. Saying DLDSR is costly in that setup just isn't true: you are free to look up 4K native benchmarks for your GPU and compare them to 1440p/1080p upscaled to 4K with DLDSR, and you'll see just how little difference there is. Integer scaling was never required even for vanilla DSR, just recommended; you can find countless threads on the reasons, so I won't bother detailing them here.
Boss, I am playing at 4K. DLDSR from 4K is going to be a multiple of 4K, and that's not feasible for my 3080. That is what I meant by performance cost: 4K vs 4K × x, or 4K + DLSS vs 4K × x + DLSS.
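Worked numbers for that point, using the same DLDSR factors as earlier: from a 4K base the upscale target balloons well past native even before DLSS claws anything back.

```python
# Same DLDSR factors applied to a 4K panel: the render target becomes a
# multiple of an already-large native pixel count.
native_mp = 3840 * 2160 / 1e6                     # ~8.3 MP
for factor in (1.78, 2.25):
    s = factor ** 0.5                             # per-axis scale
    w, h = round(3840 * s), round(2160 * s)
    print(f"DLDSR {factor}x from 4K -> {w}x{h} (~{w * h / 1e6:.1f} MP vs ~{native_mp:.1f} MP native)")
```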

Then you prefer oversharpened images, which is absolutely fine.
This is subjective of course, but I don't think I am oversharpening, although I do have the option to. Rather, I don't like soft/blurry images.
And yeah, I might prefer to oversharpen parts of the image if it's very blurry; this happened in Dying Light 2 at 1440p.
I had to accept far-away things being slightly oversharpened to get nearby stuff clearer.

This has become a disease ever since temporal stuff started being used; it has nothing to do with RT.
I did not need to sharpen Watch Dogs 1 with SMAA, which was nice and crisp. The same game has a TAA-type AA and it's horrendous.
Not all games have bad TAA; id generally seems to do it well, for example.
And 4K should be better, but RDR2 is in a league of its own in terms of blurriness.


Well, I think it's more that with RT, most devs don't know how to optimise for it, plus publishers rush games out and force releases despite them not being nearly ready. There was a very good video on this topic, not sure if it was DF, where they discussed this: RT is so new that most devs don't have an established convention for how to optimize for it...
The best RT I have seen so far has been in Metro Exodus. And it performed well too, even with multiple bounces that added colour nicely. Fire looked great.
Doom Eternal is nice too (really love this game).

 
Ah, another brainless AMD fanboy spotted; I guess this is what helps you sleep at night. Has your literal shit-for-brains mind ever looked at the benchmarks for a 7800XT with RT on? ****, even without RT, a ****ing 7800XT can't manage 60s in HZD. Anyone with even half a mind would agree the 7800XT is supposed to be a 4K card, but no, you idiots won't ever look that way.
Lol, I've had 3 Nvidia cards and 1 AMD. Sticking to a particular brand in spite of lower VFM is the definition of fanboyism. If you think 40 fps is good, clearly your brain is too slow to process any more fps.

Also,
This is LTT's review of the 4070 Ti, and if your shit-for-brains can't read, maybe try watching it and see how it manages to give 60+ fps native with RT in most games?
I don't need to watch that. You yourself said you get only 40 fps. So should we trust your words, or ignore everything you say since you're an Nvidia fanboy?

Titles like Cyberpunk are the epitome of what RT/PT can do: it does look really pretty with PT
[attachment: CP RT.jpg, Cyberpunk path-tracing screenshot]


Here's a video from TODAY:

TLDW: RT is pretty, but takes a pretty huge performance hit. Not really worth it on current GPUs. Future GPUs might give smoother gameplay with better RT performance.

Edit: this is how I see you:
[image attachment]


------------------------------------------------------------------------
Reviews are out!



 
Thermals have been great (from the reviews that I have read), but the performance just hasn't been what I expected. This was the case with the Ryzen 9xxx series as well: decent temps but underwhelming performance. The last thing I'm waiting on now is the Ryzen 9xxx X3D. If that also disappoints, I won't be upgrading from my current Intel 14700KF.

Power consumption - https://www.techpowerup.com/review/intel-core-ultra-9-285k/24.html
Thermals - https://www.techpowerup.com/review/intel-core-ultra-9-285k/28.html

[chart from the TechPowerUp review]

This one at the bottom is at load:

[chart from the TechPowerUp review, at load]
 
https://cdn.mos.cms.futurecdn.net/Fdo5prUyCsvKbjrjNpDVCo-1200-80.png.webp


https://cdn.mos.cms.futurecdn.net/pi7qoJspny7iUxEjFncCLo-1200-80.png.webp


The X3D still gives more fps and is more power efficient. But the new Ultras seem more efficient than previous-gen Intel chips, even if fps is slightly lower.
 
Garbage chips. The 285 is just...sad. For gaming, the 7800X3D takes a giant shit on it from the top of the ISS.

Even for productivity, it's meh. GN's conclusion is bang on, and it even throws any potential cost/efficiency savings out the window.
 
Gamers Nexus:


Now, before everyone starts dunking on Intel, hear me out. The chip is not very exciting, and it's a new socket, so people can't just buy the chip; there's board cost involved too. Things don't look good for Intel. However, I believe that if (and it's a big if) Intel can make this new socket last a bit longer (say 3-4 generations) and sell the new chips a bit cheaper, they can make this situation work in their favor. They are not the fastest or the most efficient, but they can be the best value. I would happily buy an Intel chip now; Intel knows they suck, we know they suck, and that can bring prices down.

Reminds me of first-gen Ryzen.
 