***OFFICIAL*** Ryzen 5000 / Zen 3 Launch Thread REVIEWS BEGIN PAGE 39


Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
I don't think anyone said that cache does not affect gaming performance, because it clearly does. A CPU with more cache can avoid lengthy trips to system memory more often, which accelerates processing and feeds the GPU faster. The contention is whether cache affects performance at various resolutions, and the answer to that is no, because the CPU does not directly involve itself in graphical processing.

A Core i7 5775C and a 9700K will have similar or equal performance at 4K assuming the game is GPU bound, despite the 5775C having much more cache.
That can't be true because @720p the cache will be hit significantly faster and that must surely have an impact on performance. So, my statement that a 720p test is more of "a cache and memory benchmark" is not false, because of the speed factor. At 4k, because of the delay from gpu, the cache is not hit as hard so even cpus with less than stellar cache and memory subsystems do as well as those with superior subsystems.
 
Reactions: spursindonesia

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
That can't be true because @720p the cache will be hit significantly faster and that must surely have an impact on performance. So, my statement that a 720p test is more of "a cache and memory benchmark" is not false, because of the speed factor.

Don't you think that the execution units will be hit as hard, if not harder? What do you think accesses the cache?
 
Reactions: Tlh97 and Elfear

coercitiv

Diamond Member
Jan 24, 2014
6,261
12,221
136
So, my statement that a 720p test is more of "a cache and memory benchmark" is not false, because of the speed factor.
Your statement is not false but deceiving: low resolution testing is not just a cache and memory benchmark, it's a wider bottleneck check. Whether it's memory subsystem performance or simply compute performance, eliminating the GPU as initial bottleneck will expose the next weakness in the chain. That may be ST performance, MT performance, cache size and/or memory latency/bw.

That is why many argue Anandtech's CPU benchmarking in games is to be used with caution, since using JEDEC-compliant memory in gaming rigs is not a benchmark for CPU performance but rather a benchmark for average user know-how. Anandtech tries to cater to both average users and enthusiasts, and the result is sometimes an abomination like the Broadwell-C analysis, in which they write content exclusively aimed at enthusiasts while using system specs specifically aimed at a lay audience.

Let's take a look at one of their 1080p charts for RDR2.

Now let's check out another RDR test, this time on Techspot; afaik both tests were done on a 2080 Ti:

Now granted, the tests are not a true apples-to-apples comparison since the detail settings are different, but they can be used to observe one immediate difference: there's a 25% performance delta between the 3700X and 6600K in the Anandtech chart, with half the CPUs operating within 5% of each other in the higher-performing segment, while the Techspot chart shows over a 50% difference between the 3700X and i3 9100F, with an obviously more linear performance drop-off from top to bottom. In the Techspot chart all CPUs appear to be differentiated by clockspeed and/or thread count. In Anandtech's chart they don't always follow this rule.

Why is that happening? Moving from slower memory to fast memory couldn't have possibly exacerbated the importance of the L3 cache, it stands to reason better latency and bandwidth only lowers cache importance. It could be we're GPU limited in the Anandtech test, but that would be mighty odd since some CPUs perform better there than they do in the Techspot test. The last item on the checklist would be raw compute performance, whether ST, MT or a combination of both. As soon as one bottleneck was removed, the next ones were revealed.
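The delta comparisons above are just ratios of chart results. A minimal sketch of the math, using made-up fps figures for illustration only (not the actual numbers from either review):

```python
def percent_delta(faster_fps: float, slower_fps: float) -> float:
    """Percentage by which the faster CPU leads the slower one."""
    return (faster_fps / slower_fps - 1) * 100

# Hypothetical fps figures, purely illustrative -- not the review data
fps = {"3700X": 125.0, "6600K": 100.0}
delta = percent_delta(fps["3700X"], fps["6600K"])
print(f"{delta:.0f}%")  # 25%
```

The same ratio applied to each chart is what makes the flat top of the Anandtech chart versus the linear drop-off on Techspot visible at a glance.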
 
Last edited:

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Your statement is not false but deceiving: low resolution testing is not just a cache and memory benchmark, it's a wider bottleneck check. Whether it's memory subsystem performance or simply compute performance, eliminating the GPU as initial bottleneck will expose the next weakness in the chain. That may be ST performance, MT performance, cache size and/or memory latency/bw.
You're not saying anything I haven't said or implied. Here's why: in the context of the Anandtech 720p tests, simply upping the memory frequency from 2933 to 3200 for the Intel system would more than likely have eliminated the gap. This is a memory bottleneck. Period. No need to wander off into any other irrelevant areas. The bottleneck is artificial. That is why I said, for this particular test (as tested by Anandtech), Zen 3's cache (an inherent strength) and RAM speed advantage (267MHz) gave it the edge over Comet Lake-S.
 
Reactions: spursindonesia

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I seriously doubt it. You base your opinion on the TPU review and their 'explanation'. It's really not that hard to find reviews where the results are totally different from TPU's, even with slower memory.
Here is a test of The Witcher 3 with a memory speed of 3200 on both platforms, with not-that-great timings of CL 16-18-18-36.

As a bonus, you know they tested in Novigrad, so a CPU-bound scenario. In the TPU review you don't actually know what they tested. They could have tested in some anonymous forest where the GPU is the bottleneck.
You see the difference? Ryzen 5000 wins without question. The same processors that lose in the TPU review.
And seeing that in the TPU review the results between the 1800X and 5800X are basically within the margin of error (not a big difference), I suspect they are just incompetent and tested a GPU-bound scenario, not a CPU-bound one.
So their review is practically worthless.

Another one. This time with RAM at 3200 MHz for Ryzen and 2933 MHz for Intel (officially supported).
Ryzen 5000 still wins.

I don't trust TPU reviews anymore. Why? Look at the comparison between GPUs on this page: 5700 XT vs 3080.

It should be a 78% difference, right? So I go to the 3080 review:

65% vs 100% in 1080p, so only about 54%.
OK, so maybe 4K? 50% vs 100%, so a 100% difference.

So let's take the 2080 Super vs 3080 comparison. According to the main page, for the 2080 Super it's a 46% difference.

We go to the same 3080 review and here we are:
In 1080p, 77% vs 100%, so about 30%.
In 4K, 64% vs 100%, so about 56%.

Nothing is correct. So I thought maybe these are data from 1440p. OK, once again:
5700 XT
57% vs 100%, so a 75-76% difference.
2080 Super
70% vs 100%, so 43%.
Still no luck.

So you look at the data and wonder: where the hell do these numbers come from?

Oh, and you have a remark at the bottom: "Based on TPU review data: 'Performance Summary' at 1920x1080, 4K for 2080 Ti and faster."
So for both the 5700 XT and the 2080 Super it should be 1080p data. It's clearly not.
That's why I don't trust them.
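The cross-check above is simple arithmetic: a relative-performance summary (fastest card at 100%) implies a specific percentage difference. A quick sketch that reproduces the numbers quoted above:

```python
def implied_difference(rel_slower: float, rel_faster: float = 100.0) -> float:
    """How much faster the 100% card is, given the slower card's
    relative-performance percentage from the same chart."""
    return (rel_faster / rel_slower - 1) * 100

# Relative-performance figures quoted above (3080 = 100%)
print(round(implied_difference(65)))   # 54  (1080p, 5700 XT) -- not 78
print(round(implied_difference(50)))   # 100 (4K, 5700 XT)
print(round(implied_difference(77)))   # 30  (1080p, 2080 Super) -- not 46
print(round(implied_difference(57)))   # 75  (1440p, 5700 XT)
```

None of these match TPU's main-page summary figures, which is the whole complaint.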

And personally, I don't think you should test anything on games older than a year, with the exception of still-popular multiplayer games.
You buy a processor or GPU for future games, not past games.
And people still test processors on GTA V, a game whose engine breaks at high fps, as Wendell from Level1Techs already proved.
Thanks for taking the time for this post, I haven't had the patience for it. TPU's been like this for a decade now.

They're not biased or shilling, they just happen to be wrong almost all the time. I always cringe when someone builds up a 3-page long dissertation about multi-generational correlations of GPUs and then they insert 2-3 charts from TPU that always have like some totally unrealistic percentage data in them, making the whole fuss pointless.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
So is everyone going to get mad that Hardware Unboxed called the 5600X a 50% price bump over the 3600 in their review?

I won't get mad, I just don't agree. So what now, next they will say in the 5600 non-X review that it's a 50% price bump over the 3300X?
Or will they go the even dumber direction and say 'WOW, this is so much stronger than the 3600X for the same price'?

Same goes for Gamersnexus and everyone else who pretends that AMD is either A: 100% surely won't launch a non-X version soon, or B: obliged to launch precisely corresponding SKUs every generation on day one by some unwritten, unspoken law of the Great Reviewer Council.

I say this with all the respect in the world, because they're both super hard working channels with reliable testing methodology and an accountable attitude.
 

coercitiv

Diamond Member
Jan 24, 2014
6,261
12,221
136
So is everyone going to get mad that Hardware Unboxed called the 5600X a 50% price bump over the 3600 in their review?
Did you also hear Tim and Steve talk about how they expect a cheaper 6-core part, a cheaper 8-core part, and maybe even lower prices as supply improves?

Last but not least, a quote from Steve:
So, if you have the option between a 5600X and a 3700X at around $300, in my opinion you should absolutely get the 5600X. It's a lot better in games right now, and I expect that to still be the case in a few years' time, and really I think it's likely gonna be the case indefinitely. Or you could save your money and get the 3600 [...], or you could also wait and hold out for what we think will be a Ryzen 5 5600
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,980
5,895
136
I won't get mad, I just don't agree. So what now, next they will say in the 5600 non-X review that it's a 50% price bump over the 3300X?
Or will they go the even dumber direction and say 'WOW, this is so much stronger than the 3600X for the same price'?

Same goes for Gamersnexus and everyone else who pretends that AMD is either A: 100% surely won't launch a non-X version soon, or B: obliged to launch precisely corresponding SKUs every generation on day one by some unwritten, unspoken law of the Great Reviewer Council.

I say this with all the respect in the world, because they're both super hard working channels with reliable testing methodology and an accountable attitude.

If 5600 is a quadcore like 3300X then you could probably consider it a 3300X replacement. Man I hope not. Don't understand the excitement when 5600X just barely moves the price to performance needle from the 10600k + cheap cooler. If I was buying on the high end of the market 5900X would excite the hell out of me since it's basically the same IPC plus two extra cores for the price of the 10900k, that's a huge gain in price to performance. Anyone in that price bracket should be absolutely thrilled. But I'm just a midrange gamer so that's the bracket I care most about.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,980
5,895
136

Hope so. If I upgrade soon it'll probably be to either a 3600 or a hypothetical 5600, next month or maybe early next year, depending on how CPU-demanding Cyberpunk is and after I take care of Christmas shopping. So it would be really nice to have the $220 5600 Hardware Unboxed kept talking about, if it's not hugely cut down from the 5600X.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
If 5600 is a quadcore like 3300X then you could probably consider it a 3300X replacement. Man I hope not. Don't understand the excitement when 5600X just barely moves the price to performance needle from the 10600k + cheap cooler. If I was buying on the high end of the market 5900X would excite the hell out of me since it's basically the same IPC plus two extra cores for the price of the 10900k, that's a huge gain in price to performance. Anyone in that price bracket should be absolutely thrilled. But I'm just a midrange gamer so that's the bracket I care most about.
That's really not what I meant. What I meant is that yes, unfortunately AMD has introduced a flat price bump which, being flat, obviously affects lower tiers more. However, it is not 50%: the 5600X is not a replacement for the 3600 non-X; that is still yet to come.

Also, not everything gets everyone excited. Just ask Buzz Killington.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Don't understand the excitement when 5600X just barely moves the price to performance needle from the 10600k + cheap cooler.
Wait... what?

Taking lowest resolution in AnandTech's games tested (that makes it more CPU-bound, right?), the 5600X beats the 10600K by at least 6.5%, and on average 42%, for 13% more money. In CPU testing, the 5600X beats the 10600K by 21%, again, for 13% more money. (In CPU tests I removed the Open SSL sha256 benchmark because Intel's chips are absurdly bad at that... subbed in the 10700K benchmark for GB5, and removed Crysis low because 10600K didn't appear on the chart.)

So how does it "barely" move the price to performance needle? Can you explain that?
 

ondma

Platinum Member
Mar 18, 2018
2,728
1,297
136
Wait... what?

Taking lowest resolution in AnandTech's games tested (that makes it more CPU-bound, right?), the 5600X beats the 10600K by at least 6.5%, and on average 42%, for 13% more money. In CPU testing, the 5600X beats the 10600K by 21%, again, for 13% more money. (In CPU tests I removed the Open SSL sha256 benchmark because Intel's chips are absurdly bad at that... subbed in the 10700K benchmark for GB5, and removed Crysis low because 10600K didn't appear on the chart.)

So how does it "barely" move the price to performance needle? Can you explain that?
You mean those 360p benchmarks that correspond to what everyone games at?
 

Dave3000

Golden Member
Jan 10, 2011
1,359
91
91
It looks like the 5900X is slightly faster in overall gaming than the 5800X, based on the reviews I read. Once games start heavily using 7-8 cores, will the 5800X perform better than a 5900X in those games, since cores 7-12 are on another CCX on the 5900X? I'm leaning towards the 5900X instead of the 5800X since it's only $100 more for 4 more cores, and it runs a bit cooler than the 5800X based on a few reviews I read. Plus, it does not seem like dual 6-core CCXs are slowing down games compared to a single 8-core CCX. But is that because of the larger L3 cache of the 5900X, because most games just don't use 7-8 cores right now, or because the higher memory-write bandwidth of a dual-CCD chip compensates for the latency of communicating through the Infinity Fabric between the two CCDs?
 
Last edited:

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
It looks like the 5900X is slightly faster in overall gaming than the 5800X, based on the reviews I read. Once games start heavily using 7-8 cores, will the 5800X perform better than a 5900X in those games, since cores 7-12 are on another CCX on the 5900X? I'm leaning towards the 5900X instead of the 5800X since it's only $100 more for 4 more cores, and it runs a bit cooler than the 5800X based on a few reviews I read. Plus, it does not seem like dual 6-core CCXs are slowing down games compared to a single 8-core CCX. But is that because of the larger L3 cache of the 5900X, because most games just don't use 7-8 cores right now, or because the higher memory-write bandwidth of a dual-CCD chip compensates for the latency of communicating through the Infinity Fabric between the two CCDs?
Right now the 5900X seems an obvious and easy choice over the 5800X, especially if you want to hang on to your system for a while.

If AMD releases a 5700X that is $100 cheaper than the 5800X, it becomes a much more difficult choice, but then I'd probably go for the 5700X (assuming a performance gap to the 5800X similar to what we saw with the 3800X vs 3700X).
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
It looks like the 5900X is slightly faster in overall gaming than the 5800X, based on the reviews I read. Once games start heavily using 7-8 cores, will the 5800X perform better than a 5900X in those games, since cores 7-12 are on another CCX on the 5900X? I'm leaning towards the 5900X instead of the 5800X since it's only $100 more for 4 more cores, and it runs a bit cooler than the 5800X based on a few reviews I read. Plus, it does not seem like dual 6-core CCXs are slowing down games compared to a single 8-core CCX. But is that because of the larger L3 cache of the 5900X, because most games just don't use 7-8 cores right now, or because the higher memory-write bandwidth of a dual-CCD chip compensates for the latency of communicating through the Infinity Fabric between the two CCDs?
It's not that black and white; it comes down to whether the game engine prefers less off-die communication or 64 MB of L3 instead of 32 MB.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,980
5,895
136
Wait... what?

Taking lowest resolution in AnandTech's games tested (that makes it more CPU-bound, right?), the 5600X beats the 10600K by at least 6.5%, and on average 42%, for 13% more money. In CPU testing, the 5600X beats the 10600K by 21%, again, for 13% more money. (In CPU tests I removed the Open SSL sha256 benchmark because Intel's chips are absurdly bad at that... subbed in the 10700K benchmark for GB5, and removed Crysis low because 10600K didn't appear on the chart.)

So how does it "barely" move the price to performance needle? Can you explain that?

Haven't seen that review. In TechPowerUp's test suite it's about 4% faster at 720p ultra using a 2080 Ti. I have not seen any other reviewer claiming 40% gains vs the 10600K.

 
Reactions: spursindonesia

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Haven't seen that review. In techpowerup's testsuite it's about 4% faster at 720p ultra using a 2080 Ti. I have not seen any other reviewer claiming 40% gains vs 10600k.

I'm just running through stuff blindly because you haven't specified a use case for your claim that the 5600X doesn't move the price to performance needle.

So you're looking for the best chip to game at 720p with a 2080 Ti?
 