I don't recall those old slides. Do you have a link? I only remember the ones where it was compared against a 5900X and AMD claimed it would be 15% faster in gaming. I'd say they more or less matched this 15% claim based on the reviews and data available.
It is us, the enthusiasts and forum...
If you plan on keeping a CPU for 5 years, then that's definitely the right move. I only recently upgraded from an 8700K and it too went through 3 GPU upgrades, so I know exactly what you mean. In your case, buying the best you can afford makes the most sense with 5 years in mind.
For a lot of...
I meant the 5600, but the 5600X is only $30 more so the same argument applies.
The 5600 is widely available for $285 AUD where I live, and after conversion it's ~$211 USD, so close enough to the MSRP. We do generally have to pay more for hardware here in Australia thanks to shipping costs and...
Price? Even for Zen 1/2 owners, you have to weigh up whether it's worth the asking price, or if a regular Zen 3 chip - which is still a fantastic upgrade from Zen 1/2 - will be enough.
Case in point: upgrading from a 1600 to a 5600 will net you 1.5x-2.0x the gaming performance (when CPU bound)...
With mid range GPUs, you don't need a 5800X3D.
A $200 12400 / 5600 is more than enough to max out a 3060 and, most of the time, a 3070: https://www.techspot.com/review/2448-amd-ryzen-5600-vs-intel-core-i5-12400f/
3080 and up will see benefits from a 5800X3D, though that will also depend on your...
What kind of GPU will you be running with a 450W PSU? Even with a 5800X3D only drawing ~70W, that really doesn't leave a lot of room for anything much higher than, say, a 3060 without pushing the limits of the PSU. Don't forget you have to power your mobo/RAM/storage/peripherals as well.
Would it...
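The 450W PSU headroom argument a couple of posts up can be sketched as a quick back-of-the-envelope check. The ~70W CPU figure is from that post; the 20% safety margin and the platform/GPU wattages are my own rough assumptions, not figures from the thread:

```python
# Back-of-the-envelope GPU budget for a 450W PSU (rough sketch, assumed figures).
psu_w = 450
margin_w = psu_w // 5      # ~20% safety margin, a common rule of thumb (my assumption)
cpu_gaming_w = 70          # ~5800X3D draw in games, per the post above
platform_w = 60            # mobo/RAM/storage/peripherals, rough guess

usable_w = psu_w - margin_w            # 360W left after the safety margin
gpu_budget_w = usable_w - cpu_gaming_w - platform_w
print(gpu_budget_w)  # 230W - roughly RTX 3060 territory (~170W board power), not much beyond
```

Obviously real transient spikes and PSU quality matter more than this simple subtraction, but it shows why a 3060-class card is about the sensible ceiling.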
Yeah, I don't even want to think about the latencies of DDR5-4400. Geez, I honestly think my B-Die 3600s might be able to clock close to 4400 with loose timings and extra voltage, they easily do 4000 @ CL18. Honestly defeats the purpose of DDR5 if you're gonna go that low in clockspeed, you lose...
These value comparisons are a bit disingenuous since we are ignoring the fact that there are cheaper CPUs like the i7 12700F that, when paired with DDR5, could well be a match for a 5800X3D in gaming. Of course, if you insist on *only* comparing the 5800X3D to the 12900K, then sure, you can get...
I'd agree. We need one of those 30 game comparisons that HWUB is known for to truly grasp how different games react to the cache. Right now, the results (and averages) can swing so wildly depending on the games tested.
Why would you even bother with high end DDR4-3800 when it's literally 1-2% faster than 3200 but costs way more?
I would say a better argument in favour of the 5800X3D is that it can achieve near parity with a 12900K/DDR5 combo while *only* using DDR4-3200. So the overall platform cost is much...
Yeah, I'd agree. IMO the 5800X3D is a good 'proof of concept' CPU. I'm far more excited about what AMD can do with this tech going forward with Zen 4 and DDR5. Imagine Intel with the DDR5 advantage nullified!
There is no 12900KS in those comparisons though lol. I'm guessing you mean the 12900K. Close enough.
Again, I have to question those charts when the 12600K is beating the 12900K in gaming. I just can't take those results seriously.
Impressive! But why is the 12600K faster than the 12900K?! Something is off there, surely.
But OK, since you said that, I'll bite. I can't, in all honesty, call the 5800X3D the undisputed 'gaming king' - and no, that's not because I own a 12900K and don't want to relinquish the crown! ;) I was...
Are we just going to pretend no other Intel CPU exists that uses less than 186W? ;) My 12900K feels neglected...
Seriously though, those are some great efficiency figures. The problem is that high end GPUs are soaring past 400W these days... saving a few watts off the CPU is hardly going to...
HWUB review is live!
OK, not gonna lie... this is unexpected. 12900K with DDR5 outperforms the 5800X3D in games. The gains from DDR4-3200 to 3800 are also negligible, though I guess the larger cache somewhat negates the benefits of faster RAM.
Things just got slightly more awkward with AMD's...
I'm curious if anyone here is actually planning to get a 5800X3D? I'm unsure who would be the most suitable target for this kind of chip. Yes, I realise it's targeted at high end gamers, but high end gamers generally already own a Zen 3 or ADL CPU and I don't think the 5800X3D is *that* much...
Yeah, I'll give AMD the benefit of the doubt here. I'm sure they have a 12900K/KS system in house they can compare against. It would be silly to make such a claim and have reviewers come to a different conclusion.
Expected better results for the 5800X3D there, since this was a DDR4-only test bench; if anything the 12900K seems a couple of percent faster in all the games?
Tbh, I wouldn't put too much faith in those results. Brand new channel, literally less than 24 hours old, no subs lol.
Yeah, I'd agree with that. Zen 4 is close enough that it makes sense to wait, plus DDR5 should (hopefully) be ready for prime time then as well.
5800X3D is a nice last hurrah for AM4, but in the grand scheme of things it will be a mere footnote for AMD historians in the future.
I wasn't really comparing it to the 12900K, more to its Zen 3 counterparts. AMD claimed on average a 15% increase at 1080P over a 5900X, which I thought was a reasonable and realistic claim. The TPU results show a ~7.5% increase, which is actually a fair bit lower than what I expected, and as...
To be fair, the crowd that AMD is trying to target this at aren't in it for the 'bang for buck'. That's not what the 'halo gaming' market is about. Think 3090 Ti owners here. They just spent $1500 on their GPU to gain 5% over a vanilla 3090, so spending a bit more (or less) on a CPU doesn't...
Is the 5800X3D underperforming in the TPU review?
https://www.techpowerup.com/review/amd-ryzen-7-5800x3d/15.html
At 720P its ~10% faster than Zen 3, at 1080P that shrinks to ~7.5%, literally half of what AMD claimed the v-cache gains to be for 1080P. Granted, TPU is 'only' using an RTX 3080...
For current gen Intel CPUs, I'd say the 12400F strikes the ideal balance between price/performance.
Anything above that and you quickly run into diminishing returns, especially if you game at higher than 1080P or use mid range GPUs:
Now, if you are talking previous gen, then...
So, in theory, the games where the 5800X3D has the big wins would also be games where ADL would make significant gains with DDR5? That makes sense to me. I don't think whatever price differences between the 2 combos you mentioned would be a huge factor for anyone looking at these for the...
426W is system power consumption though, not the CPU alone. I am still surprised at how much more power the 12900KS is pulling relative to the 12900K though - don't they both have a 241W PL2 limit?
I'm pretty sure it's sufficient as long as you aren't overclocking both the CPU and GPU.
Realistically speaking, only seasoned overclockers would have the means to push a 12900KS upwards of 400W (and have it run stable without throttling!), since you'll be running some sort of high end waterblock...
It's pretty unrealistic to have both pegged at 100% though, unless you are imposing some sort of artificial stress test. My 12900K doesn't even use 100W in gaming, granted it's not overclocked to 5.5GHz all core or whatever. If that's your thing, then yeah, maybe those 1500W PSUs might be a...
How so? 240W + 450W = 690W. Let's be generous and say 100W for memory and storage, that's still around 800W peak usage assuming *both* CPU and GPU can be pegged at 100% utilisation, which wouldn't ever be possible in an actual gaming scenario as no game can come close to maxing out a 12900K.
Now...
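The peak-vs-realistic distinction in that post works out like this. The 240W (PL2) and 450W figures are from the thread; the 100W "rest of system" allowance and the ~100W gaming CPU draw are the estimates mentioned above:

```python
# Worst-case vs realistic system draw for a 12900K + 450W-class GPU (rough sketch).
cpu_peak_w = 240        # 12900K PL2 limit, only hit in all-core stress tests
gpu_peak_w = 450        # high-end GPU board power
rest_of_system_w = 100  # memory, storage, fans, etc. - a generous allowance

# Theoretical peak: both CPU and GPU pegged at 100% simultaneously
peak_w = cpu_peak_w + gpu_peak_w + rest_of_system_w
print(peak_w)  # 790

# Realistic gaming draw: the CPU rarely exceeds ~100W in games
cpu_gaming_w = 100
gaming_w = cpu_gaming_w + gpu_peak_w + rest_of_system_w
print(gaming_w)  # 650
```

So the ~800W figure is a synthetic worst case; actual gaming load sits well under it.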
I just don't recall such big differences between DDR4 and DDR5 at launch. I always thought DDR5 was just a few percent ahead here and there, not worth paying twice the price for. I guess once you really give ADL the bandwidth to stretch its legs at 6000+ speeds it's in a different league to DDR4...
The DDR4-3200 is CL14 G.Skill, so not exactly budget stuff. Sure, 3600 would be faster, but there is no way that performance gulf is gonna be bridged even with DDR4-4000 memory.
I agree with you that once DDR5 is a mature standard, the fun really begins. Should coincide with Zen 4/RPL/RTX 4000/RX...
The video review is here (same content, video format)
They use Trident Z 6400 CL32. Basically the fastest DDR5 memory you can buy right now, but still. Some of those gains are insane over DDR4 3200.
Has it crept up on us how much faster ADL is with DDR5 in certain games? The jump over DDR4 is nuts on a few titles. Making me seriously consider switching to DDR5 for my 12900K once prices settle.
https://www.techspot.com/review/2443-intel-core-i9-12900ks/
PS. Oh yeah... the 12900KS...
Would looking at how ADL responds to increasing L3 sizes be a more accurate way to estimate any potential uplift?
20MB L3 -> 30MB L3 nets an ~8% gain, though that is only on 4 cores, so I'm not sure how that would translate to 6/8P core configs...
So a 60% increase in L2 cache from 10MB to 16MB is supposed to have little impact on gaming performance? What do you base this on?
Comet Lake showed significant gains in many games from an additional 8MB / 67% increase in L3 cache going from 12MB on a 10600K to 20MB on the 10900K, with...
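One crude way to compare these cache data points is gain per percent of extra L3. This assumes gaming uplift scales roughly with the relative cache increase, which real workloads won't follow exactly - it's just a sanity-check sketch using the 20MB -> 30MB / ~8% figure quoted above:

```python
def cache_gain_per_pct(l3_before_mb, l3_after_mb, observed_gain_pct):
    """Observed gaming gain per percent of extra L3 (crude linear model)."""
    cache_increase_pct = (l3_after_mb - l3_before_mb) / l3_before_mb * 100
    return observed_gain_pct / cache_increase_pct

# ADL simulation from this thread: 20MB -> 30MB L3 (+50%) gave ~8% on a 4-core config
print(round(cache_gain_per_pct(20, 30, 8), 3))  # 0.16

# By the same crude model, Comet Lake's 12MB -> 20MB (+67%) jump would predict
# a gain in the same ballpark - consistent with the "significant gains" noted above.
```

Cache hit rates don't scale linearly with size, so treat this as a ballpark, not a prediction.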