The chip is not very exciting. It's a new socket, so people can't just buy the chip; there's the board cost involved too. Things don't look good for Intel. However, I believe that if (and it's a big IF) Intel can make this new socket last a bit longer (say 3-4 generations) and sell the new chips a bit cheaper, they can make this situation work in their favor. They are not the fastest or most efficient, but they can be the best value. I would happily buy an Intel chip now; Intel knows they suck, we know they suck, and this can bring the prices down.
Reminds me of first-gen Ryzen.
Lol, I've had 3 Nvidias and 1 AMD. Sticking to a particular brand in spite of lower VFM is the definition of fanboyism. If you think 40 fps is good, clearly your brain is too slow to process any more fps.
Bad example, I can share way better than this, and if you want proper comps, Digital Foundry is the best there is.

> Titles like Cyberpunk are the epitome of what RT/PT can do: it does look really pretty with PT.
Maybe if you worked on your English speaking skills and reading comprehension, you would have noticed this is what I've been saying ever since this idiotic debate started: either you go all in on visuals, or compromise and get the more-VFM AMD.

Edit: this is how I see you:
Both are crap, but that honestly depends on peeps' preferences; if CAS works for you, then use it.

> Nope, CAS is very good, and at default settings via ReShade I don't see many artifacts at all. Only if I crank it higher.
I had tried NIS earlier and got artifacts, but it's possible that I did not use it properly. Googled a bit later, and many seem to like CAS too.
Anyway..
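Since CAS keeps coming up: below is a minimal sketch of the contrast-adaptive sharpening idea (roughly what AMD's FidelityFX CAS and the ReShade CAS.fx port do), assuming a grayscale numpy image in [0, 1]. The exact weights in the real shader differ; this only shows why it sharpens flat areas more than already-contrasty edges, and why cranking it higher brings out artifacts.

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Simplified contrast-adaptive sharpening on a 2D grayscale image in [0, 1].

    The local min/max of the 3x3 cross neighbourhood estimates how much headroom
    each pixel has; low-contrast areas get sharpened more, already-high-contrast
    edges get sharpened less (that is the 'adaptive' part).
    """
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                      # center
    n = p[:-2, 1:-1]; s = p[2:, 1:-1]      # up / down neighbours
    w = p[1:-1, :-2]; e = p[1:-1, 2:]      # left / right neighbours

    lo = np.minimum.reduce([c, n, s, w, e])
    hi = np.maximum.reduce([c, n, s, w, e])

    # Per-pixel sharpening amount: more where local contrast is low.
    amount = np.sqrt(np.minimum(lo, 1.0 - hi).clip(0.0, 1.0))
    # Negative-lobe weight for the 4 neighbours, scaled by the user sharpness.
    w_neg = -amount * (0.125 + 0.075 * sharpness)

    out = (c + w_neg * (n + s + w + e)) / (1.0 + 4.0 * w_neg)
    return out.clip(0.0, 1.0)

# Example call on random data, just to show the shape of the API.
frame = np.random.rand(8, 8)
print(cas_sharpen(frame, sharpness=0.8).shape)
```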
You don't need DLDSR then lol, DLAA is the way for you; DLDSR is for peeps like me who game at 1440p and want to increase visual fidelity with a dirty hack.

> Boss, I am playing at 4K. DLDSR from 4K is going to be a multiple of 4K, and that's not feasible for my 3080. That is what I meant by performance cost: 4K vs 4K * x, or 4K + DLSS vs 4K * x + DLSS.
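To make the "4K * x" point concrete, here is the back-of-the-envelope arithmetic, assuming DLDSR's usual 1.78x and 2.25x total-pixel factors (a rough sketch, not a benchmark):

```python
# How many pixels DLDSR actually renders from a given base resolution.
BASES = {"1440p": (2560, 1440), "4K": (3840, 2160)}
DLDSR_FACTORS = [1.78, 2.25]   # total-pixel multipliers exposed by the driver

for name, (w, h) in BASES.items():
    native = w * h
    print(f"{name}: native {native / 1e6:.1f} MP")
    for f in DLDSR_FACTORS:
        scale = f ** 0.5                      # per-axis scale from the pixel-count factor
        rw, rh = round(w * scale), round(h * scale)
        print(f"  DLDSR {f}x -> ~{rw}x{rh} ({rw * rh / 1e6:.1f} MP rendered)")
```

From 1440p, 2.25x lands at roughly 4K's pixel count; from a 4K base it is closer to 19 MP per frame, which is the part that is not feasible on a 3080.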
Any game that uses TAA and has a lot of foliage will suffer from this issue (check out HZD to see what I mean); in city-esque landscapes, TAA's shortcomings are hidden quite easily.

> This has become a disease ever since temporal stuff started being used; nothing to do with RT.
I did not need to sharpen Watch Dogs 1 with SMAA, which was nice and crisp. The same game has a TAA-type AA and it's horrendous.
Not all games have bad TAA; id generally seems to do it well, for example.
And 4K should be better, but RDR2 is in a league of its own in terms of blurriness.
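For anyone wondering where the softness comes from in the first place: a toy sketch of the temporal accumulation behind TAA, with a made-up blend weight and jitter pattern (real implementations reproject and clamp the history buffer, which this skips):

```python
# Why TAA can look soft: each frame is blended into an accumulated history buffer.
# Toy single-pixel version; alpha is the per-frame blend weight (engines use small values).
def taa_accumulate(samples, alpha=0.1):
    history = samples[0]
    for s in samples[1:]:
        history = (1.0 - alpha) * history + alpha * s   # exponential moving average
    return history

# A hard edge that flips between 0 and 1 under sub-pixel jitter converges to grey,
# which is the softness people then try to claw back with CAS/sharpening.
print(taa_accumulate([0.0, 1.0] * 20, alpha=0.1))
```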
Hardware Unboxed are really good, but what they are not good at is comps; they'll show you the worst-case comparisons to prove their point. If you really want comparisons/perf cost, Digital Foundry is the best there is. I would rather sit through the games that HUB showed, and then look at DF's video on them to see whether RT is worth it or not.

> Best RT I have seen so far has been in Metro Exodus. And it performed well too, even with multiple bounces that added colour nicely. Fire looked nice.
Doom Eternal is nice too (really love this game).
This isn't even remotely comparable to Ryzen 1st Gen.
When Ryzen 1xxx launched, AMD offered consumers the benefit of 'high core count at value pricing'. It was a dream getting a 6-core/12-thread R5 1600 at its $219 launch price, when Intel was selling the 6-core i5 8400 at $170-180.
Couple that with cheap B350 motherboards, a platform that would last 3-4 generations, beefy stock coolers (much better than the shit Intel was providing in the name of 'stock coolers'), and the potential to OC, and you had a winner. Yes, 1st-gen Ryzen was lacking in gaming performance and the memory support was weird, but the pros more than made up for the cons.
Fast-forward to the tail end of 2024, and the CPU market isn't the same anymore. There are much better offerings at much lower prices than this sheep's turd Intel came up with. Hell, even Intel's own previous generations (13th and 14th) offer better value, not that I would suggest anyone buy those.
Then why should anyone be the guinea pig and purchase these current shit offerings from Intel, even if the prices are (hypothetically) slashed and Intel doesn't change the chipset (there's a higher chance of seeing Halley's Comet than of seeing that)?
They deserve to get annihilated right now, so that they go back to the drawing board and rise from the ashes with a better plan. Similar to Apple and AMD, both of which are currently leaders in their respective industries and were a few steps away from bankruptcy back in the day.
Doesn't make sense for most gamers currently, but when more mainstream games start taking advantage of more cores/threads, then it will; like in the case of RDR2 when it was first released, performance scaled almost linearly as the number of cores/threads increased, with lesser processors bottlenecking the GPUs.

> Why would anyone pay 589 USD for a 250W TDP 8 P-core / 16 E-core CPU in 2024 when you can buy AMD's 7700X 8C/16T for a lot cheaper?
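The "scaled almost linearly" bit only holds when most of the per-frame work is actually parallel; a quick Amdahl's-law style sketch (the fractions below are purely illustrative) shows how fast core scaling flattens otherwise:

```python
# Amdahl's-law style speedup for a game loop where only a fraction of the
# per-frame work can be spread across cores. Fractions are hypothetical.
def speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for frac in (0.5, 0.8, 0.95):
    row = ", ".join(f"{c}c: {speedup(frac, c):.2f}x" for c in (4, 6, 8, 12, 16))
    print(f"parallel fraction {frac:.0%} -> {row}")
```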
Agreed, but we already have AMD CPUs for that, right? More cores/threads that are faster in real-world applications and games, and more power efficient than Intel's offerings.

> Doesn't make sense for most gamers currently, but when more mainstream games start taking advantage of more cores/threads, then it will; like in the case of RDR2 when it was first released, performance scaled almost linearly as the number of cores/threads increased, with lesser processors bottlenecking the GPUs.
And if you are into RTS/strategy/management games, more CPU cores are a necessity; all that NPC AI and all those moving elements need CPU time to keep running.
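A toy illustration of that point: when NPC/agent updates are independent, they can be spread across cores, which is why sim-heavy late game benefits from having more of them. The workload and counts below are made up, and real engines use job systems rather than Python processes; this just shows the shape of the idea.

```python
# Toy illustration of NPC/agent updates spreading across cores.
from concurrent.futures import ProcessPoolExecutor
import math, time

def update_npc(seed: int) -> float:
    # Stand-in for pathfinding/AI work for one agent (arbitrary busy-work).
    x = float(seed % 97) + 1.0
    for _ in range(20_000):
        x = math.sqrt(x * 1.0001) + 1.0
    return x

if __name__ == "__main__":
    npcs = list(range(1_000))
    for workers in (1, 4, 8):
        t0 = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(update_npc, npcs, chunksize=64))
        print(f"{workers} worker(s): {time.perf_counter() - t0:.2f}s")
```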
Yeah, exactly, no one would pay $589 for that CPU. Intel should be smart enough at this point to understand this basic fact. However, if the prices get low enough, they can make something of this situation. They moved to a chiplet design, so they are saving money (albeit not as much as AMD). Also, AMD prices will increase, so there is that. Either way, this "could be" the beginning of something good for Intel.

> Why would anyone pay 589 USD for a 250W TDP 8 P-core / 16 E-core CPU in 2024 when you can buy AMD's 7700X 8C/16T for a lot cheaper?
I haven't used AMD CPUs, so I don't know much about them and was just going by the number of cores/threads on paper; does the 8C/16T not bottleneck in strat/RTS games?

> Agreed, but we already have AMD CPUs for that, right? More cores/threads that are faster in real-world applications and games, and more power efficient than Intel's offerings.
Just curious, how do you use both AMD and Nvidia graphics techs together in a PC?

> I had to jump from DLSS Performance to DLSS Quality at 4K 32. Even then it has artifacts. Even after all that, the game is so blurry I had to use AMD CAS on top of it to make it reasonable.
Even if 8C/16T does bottleneck, AMD will have a CPU that costs less for more cores and threads. Don't forget the platform savings: AMD is still releasing AM4 CPUs (with the 5600T and XT on the roadmap), and given how stable AM4 is, I don't think it's gonna die soon. I haven't bought an Intel CPU in 7 years. The only compelling CPU (in my opinion) they have produced in the last 5-6 years is the 13600K, and given that the platform is just 2 generations long, I didn't buy that either.

> I haven't used AMD CPUs, so I don't know much about them and was just going by the number of cores/threads on paper; does the 8C/16T not bottleneck in strat/RTS games?
I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won the 2024 CPU battle despite releasing an unimpressive non-X3D 9000 series.

> I haven't used AMD CPUs, so I don't know much about them and was just going by the number of cores/threads on paper; does the 8C/16T not bottleneck in strat/RTS games?
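"Performance per watt" here is just average fps divided by package power; a tiny sketch with obviously hypothetical numbers, to show how a chip can be a touch faster yet far worse on that metric:

```python
# Illustrative perf-per-watt comparison; fps and power numbers are placeholders,
# not measurements of any real CPU.
def perf_per_watt(avg_fps: float, package_watts: float) -> float:
    return avg_fps / package_watts

chips = {
    "CPU A (hypothetical)": (140.0, 65.0),    # avg fps, gaming package power in W
    "CPU B (hypothetical)": (145.0, 130.0),
}
for name, (fps, watts) in chips.items():
    print(f"{name}: {fps} fps @ {watts} W -> {perf_per_watt(fps, watts):.2f} fps/W")
```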
FSR/CAS/any other AMD tech is hardware agnostic and software based, and they mostly release it for competitors' hardware too.

> Just curious, how do you use both AMD and Nvidia graphics techs together in a PC?
Ah yeah, makes sense; I was looking at it from a perf-only lens.

> I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won this battle despite showing an unimpressive 9000 series.
Yeah, that is a good point; you save on mobo costs on AMD, which can be significant, and you can then put that toward a better processor.

> Even if 8C/16T does bottleneck, AMD will have a CPU that costs less for more cores and threads. Don't forget the platform savings: AMD is still releasing AM4 CPUs (with the 5600T and XT on the roadmap), and given how stable AM4 is, I don't think it's gonna die soon. I haven't bought an Intel CPU in 7 years. The only compelling CPU (in my opinion) they have produced in the last 5-6 years is the 13600K, and given that the platform is just 2 generations long, I didn't buy that either.
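The platform-savings argument is just total-cost arithmetic; a tiny sketch, with all prices as hypothetical placeholders rather than real listings:

```python
# Total platform cost = CPU + motherboard (+ cooler or other extras).
# All prices are hypothetical; only the arithmetic is the point.
def platform_cost(cpu_price: float, board_price: float, extras: float = 0.0) -> float:
    return cpu_price + board_price + extras

option_a = platform_cost(cpu_price=300, board_price=120)              # cheap board, reuse cooler
option_b = platform_cost(cpu_price=280, board_price=250, extras=40)   # pricier board + cooler
print(f"Option A: ${option_a:.0f}, Option B: ${option_b:.0f}, "
      f"difference: ${option_b - option_a:.0f} that could go toward a better CPU or GPU")
```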
And for the sake of all that's holy, CPU manufacturers should switch to a biennial refresh. They can release a mid-gen refresh for enthusiasts, of course... the point is, it would extend platform life and give these guys time to REALLY improve the tech instead of fudging numbers to look good (goes for both AMD and Intel). Remember, Intel not being good is gonna hurt AMD (and us) eventually, cuz they will have no competition.

> I was being rhetorical. My point is, both offer high core/thread counts, but AMD's X3D CPUs (8C & 16C) offer better performance per watt in almost all games, so it's hard to recommend Intel's current offerings at their price. AMD won the 2024 CPU battle despite releasing an unimpressive 9000 series.
Nvidia because I have an Nvidia GPU. AMD because AMD is nice and open-sources stuff which modders can then use.

> Just curious, how do you use both AMD and Nvidia graphics techs together in a PC?
+1 for ReShade. Can completely change the look of a game.

> In this case I use ReShade to do this.
No, it does not. I have a 5800X (non-3D) that I have paired with a 3090 (used a 1650 earlier). I played AOE and Civilization 6 when I had the 1650. In the current setup, every game is lag-free. There are a few games like Cyberpunk that show signs of slowness if I put RT at max without frame generation.

> I haven't used AMD CPUs, so I don't know much about them and was just going by the number of cores/threads on paper; does the 8C/16T not bottleneck in strat/RTS games?
The problem for Intel is their quantity strategy. Their margins are wafer-thin as they cram more and more cores/transistors onto their CPUs. I don't know if they can compete with AMD on pricing this time around, as they had to use TSMC fabs, and that cost them more money since their own fabs are not yet operational. AMD was rubbing salt in the wound by cutting prices on their Zen 5 CPUs.

> Yeah, exactly, no one would pay $589 for that CPU. Intel should be smart enough at this point to understand this basic fact. However, if the prices get low enough, they can make something of this situation. They moved to a chiplet design, so they are saving money (albeit not as much as AMD). Also, AMD prices will increase, so there is that. Either way, this "could be" the beginning of something good for Intel.
Did you reach the endgame? I haven't played Civ much and am mainly an Anno player, but in 1800, at the start I easily clear 80s; by the time I am 400-500 hours in, I barely get 30s. No RAM bottleneck from what I can tell; usage is around 25-26 GB in Anno.

> No, it does not. I have a 5800X (non-3D) that I have paired with a 3090 (used a 1650 earlier). I played AOE and Civilization 6 when I had the 1650. In the current setup, every game is lag-free. There are a few games like Cyberpunk that show signs of slowness if I put RT at max without frame generation.