***OFFICIAL*** Ryzen 5000 / Zen 3 Launch Thread REVIEWS BEGIN PAGE 39


yeshua

Member
Aug 7, 2019
166
134
86
If the only people who read reviews and post on forums are primarily in the group that buys the high-end products for the type of situations described in the reviews, it's hard to argue that the reviewers are doing a bad job. They know who their readers are and what they want. If anyone wanted to read reviews of low-end part combinations for the masses, the internet would be full of sites doing it. I'm sure there are a few out there, but I have a feeling the readership is about as big as that of people doing reviews of 20-year-old technology for the sake of nostalgia.

I'm not so sure about that. A year or so ago I saw a poll on a fellow tech website about the hardware being used by its readers, and it turned out that less than 30% of people had top-end GPUs and CPUs. So it looks like there are lots of people with modest/average hardware who do read reviews, participate in forums and leave comments; they're just not always as vocal as those who spend thousands of dollars to get the absolute best.

Someone could run a similar poll (or polls) here on AT.
 
Reactions: spursindonesia

inf64

Diamond Member
Mar 11, 2011
3,753
4,191
136
Want me to add to the craziness that is Zen 3?

Apparently the pipeline is shortened, despite the clocks going way up.
Was this found in some of the low level tests in official reviews?
 

uzzi38

Platinum Member
Oct 16, 2019
2,690
6,345
146
Was this found in some of the low level tests in official reviews?

The Software Optimisation Guide is out, and it states:

The branch misprediction penalty is in the range from 11 to 18 cycles, depending on the type of mispredicted branch and whether or not the instructions are being fed from the op cache. The common case penalty is 13 cycles.

EDIT: Just to add some context, Zen 2 is 12-18 cycles with 16 typical.
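
Just to illustrate what that penalty means in practice, here's a minimal, self-contained C sketch (entirely my own construction, not from the guide): the same data-dependent branch runs over random data, where it mispredicts roughly half the time, and over sorted data, where it predicts almost perfectly.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)

/* Sum the elements above a threshold; the if() is the branch under test.
 * On random data it's taken ~50% of the time and mispredicts heavily;
 * on sorted data the predictor locks on almost immediately. */
static long long sum_above(const int *a, int n, int threshold) {
    long long sum = 0;
    for (int i = 0; i < n; i++)
        if (a[i] > threshold)
            sum += a[i];
    return sum;
}

static int cmp_int(const void *x, const void *y) {
    int a = *(const int *)x, b = *(const int *)y;
    return (a > b) - (a < b);
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (int i = 0; i < N; i++)
        a[i] = rand() % 256;

    clock_t t0 = clock();
    long long s1 = sum_above(a, N, 128);    /* random order: many mispredicts */
    clock_t t1 = clock();

    qsort(a, N, sizeof *a, cmp_int);

    clock_t t2 = clock();
    long long s2 = sum_above(a, N, 128);    /* sorted order: near-perfect prediction */
    clock_t t3 = clock();

    printf("random: %.3fs (sum=%lld)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, s1);
    printf("sorted: %.3fs (sum=%lld)\n", (double)(t3 - t2) / CLOCKS_PER_SEC, s2);
    free(a);
    return 0;
}
```

Compile at an optimization level that keeps the branch (e.g. -O1); at higher levels the compiler may turn it branchless, which is its own demonstration of how expensive mispredicts are. The time gap per element, divided by clock speed, approximates the mispredict rate times that 13-cycle common-case penalty.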
 

RickITA

Junior Member
Nov 8, 2020
3
0
11
Thanks for the replies. Yes, I will wait. It could be that I will need the PC ASAP; in that case I will go for the 5600X.

Just out of curiosity, can someone explain this:

For me it's amazing that at 4450MHz a single 5600X core requires 10.2W, while a 5800X core requires 14.6W on average. Shouldn't it be the same core? Why such a large difference?
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
I'm not so sure about that. A year or so ago I saw a poll on a fellow tech website about the hardware being used by its readers, and it turned out that less than 30% of people had top-end GPUs and CPUs. So it looks like there are lots of people with modest/average hardware who do read reviews, participate in forums and leave comments; they're just not always as vocal as those who spend thousands of dollars to get the absolute best.

Someone could run a similar poll (or polls) here on AT.


Would probably have to put it on the main page; people who visit the site rarely post. In fact, the average time someone spends on a page at AnandTech, according to SimilarWeb, is 1 minute 45 seconds, and on average they visit 1.75 pages. 72% visit only one page.

So what you guys are asking for is probably better served by CNET. To put it in perspective, per SimilarWeb, CNET has very nearly 20X the traffic of AnandTech, more pages per visit (>2), and a lower bounce rate (62%).
 

KompuKare

Golden Member
Jul 28, 2009
1,048
1,053
136
And personally, I don't think you should test anything on games older than 1 year, with the exception of still super popular multiplayer games.
You buy a processor or GPU for future games, not past games.
How about when an older-than-1-year game is still severely CPU limited, like Fallout 4 or Skyrim SE, and still has lots of players due to thousands of mods?
 

DrMrLordX

Lifer
Apr 27, 2000
21,766
11,087
136
Well, I don't game at 720p, so.......

Edit: Strange (not really) that low-res gaming was so strongly denigrated ("niche within a niche" comes to mind from one poster) when Intel led more at low res, but now it is somehow becoming the holy grail since that appears to be where AMD has the lead.

It's not like Zen 3 is going to lose gaming benchmarks at 1440p or 4K. Low-res gaming was Intel's last bastion of true performance, and they've been thrown out of that too. That's why people are talking about it.

I don't even trust for a moment all these claims that Comet Lake can somehow still beat Vermeer in gaming at any resolution I'd give a darn about (1440p or higher), especially considering how close Matisse was to Comet Lake.

TSMC doesn't even have enough capacity for AMD alone.

Oh they have it. They just have other customers, that's all.

TimeSpy results

Yay TimeSpy physics results. Got some actual game benchmarks to back that up?

3. Look at the retail prices now. If they'd lower the MSRP, the retailers would just pocket the difference.

Um, that's what the scalpers are doing. NV's latest cards are in short supply, so you are going to pay out the arse to get one. I admire NV's attempt at lowering prices, but they didn't supply the market with enough product. Also, you're a bit late to the party when it comes to complaining about AMD's price hikes. I smelled that coming months ago with the XT launch; it was all about positioning product so AMD could charge more for Vermeer. Intel sold us quad cores with anemic IPC improvements and (in some cases) regressed clock speeds for years, so don't act like they were doing us any favors either.
 

maddie

Diamond Member
Jul 18, 2010
4,786
4,771
136
Thanks for the replies. Yes, I will wait. It could be that I will need the PC ASAP; in that case I will go for the 5600X.

Just out of curiosity, can someone explain this:

For me it's amazing that at 4450MHz a single 5600X core requires 10.2W, while a 5800X core requires 14.6W on average. Shouldn't it be the same core? Why such a large difference?
One possibility is that all the full 8C high-quality dies are going to Milan, leaving the leakier ones for the 5800X; a 64C Zen 3 CPU should be in high demand.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
3. Like I said before, with tuned, equalised memory across both Intel and AMD platforms you have both LTT and GN showing that even at a higher resolution Zen 3 takes a very noticeable lead. (EDIT: and at 1080p) So your point about the cache is even more useless.
WTF? The i7 5775C with a 128MB L4 cache is a gaming monster. Here we have Zen 3 with a 64MB L3 cache that AMD have aptly named "Game Cache", and you're just going to sit there and say it has no influence in gaming? Really?
 
Reactions: spursindonesia

uzzi38

Platinum Member
Oct 16, 2019
2,690
6,345
146
WTF? The i7 5775C with a 128MB L4 cache is a gaming monster. Here we have Zen 3 with a 64MB L3 cache that AMD have aptly named "Game Cache", and you're just going to sit there and say it has no influence in gaming? Really?
...no, I'm saying the cache does not have a differing effect at different resolutions. The effect will be the same regardless of resolution.

Did you not read point 2 where I spelled that out?
 

Kryohi

Member
Nov 12, 2019
27
48
91
Thanks for the replies. Yes, I will wait. It could be that I will need the PC ASAP; in that case I will go for the 5600X.

Just out of curiosity, can someone explain this:

For me it's amazing that at 4450MHz a single 5600X core requires 10.2W, while a 5800X core requires 14.6W on average. Shouldn't it be the same core? Why such a large difference?
I might be wrong, but I think the relationship between voltage and temperature goes in both directions, although governed by very different laws: the higher the temperature, the higher the voltage must be for the CPU to be stable at high clocks.
At "normal" temperatures it shouldn't make a big difference, but this effect might play a part in that power difference, since the 5800X has a higher heat density with all eight cores on the die active.
Did they control for the temperature of hotspots and the CPU package where you saw those numbers?
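
To put rough numbers on that intuition: dynamic power scales roughly with V²·f, and leakage grows with both voltage and temperature. Here's a toy model in C where every constant is invented for illustration (none of these are measured values for either chip), showing how a modest voltage bump plus extra leakage can plausibly span a 10.2W-to-14.6W gap at the same clock:

```c
#include <stdio.h>

/* Toy model: P_total = P_dyn + P_leak, with P_dyn ~ k * V^2 * f.
 * Every constant below is invented for illustration; these are NOT
 * measured values for the 5600X or 5800X. */
int main(void) {
    double f = 4.45;                       /* GHz, same clock for both parts */
    double k = 1.45;                       /* arbitrary scale factor */
    double v_lo = 1.20, v_hi = 1.35;       /* hypothetical per-core voltages */
    double leak_lo = 0.90, leak_hi = 2.85; /* hypothetical leakage (W), rises with V and temp */

    printf("cooler, lower-V core:  %.1f W\n", k * v_lo * v_lo * f + leak_lo); /* ~10.2 W */
    printf("hotter, higher-V core: %.1f W\n", k * v_hi * v_hi * f + leak_hi); /* ~14.6 W */
    return 0;
}
```

The point isn't the specific constants but the shape: because of the V² term plus temperature-dependent leakage, a core that runs hotter and needs a bit more voltage for the same clock can easily draw 40% more power.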
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
So is everyone going to get mad that Hardware Unboxed called the 5600X a 50% price bump over the 3600 in their review?

I think he's wrong to do so, especially since he highlighted in his 10600K review that it was basically an 8700K at a 22% discount (rather than as a 9400F with SMT enabled, unlocked/high clocks, and an $80 price hike), though he did remark that a 22% discount over a 2.5-year period isn't really all that stellar.

In AMD's case, it's really difficult to compare the 5600X to the 3600 in a similar way because it's so vastly different from the prior gen, let alone the original Zen chips. It's not just a rebranding or mature silicon; it's a top-to-bottom redesign.

The 10600K beats the 8700K by 5-10% at a 22% discount, yes, but if that's how he's going to frame it, he should have framed the 5600X in a similar fashion: the 5600X beats the 3800XT by 1% in CPU tests and by 13% in 720p gaming at a $100 discount.

Similarly, the 5800X beats the 3900XT by 4% in CPU tests and by 12% in 720p gaming at a $50 discount.

It's all about how you frame it. No matter how you slice it, AMD are offering more performance for less money than the prior gen when you compare the 5600X and 5800X to more expensive prior-generation parts released 16 months ago, or even 4 months ago.
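
To make that framing concrete, here's a quick perf-per-dollar sketch; the gaming deltas are the 720p numbers quoted above, and the prices are the launch MSRPs:

```c
#include <stdio.h>

/* Perf-per-dollar framing. Gaming deltas are the 720p numbers quoted above;
 * prices are launch MSRPs. */
static void compare(const char *newer, double new_price, double gain_pct,
                    const char *older, double old_price) {
    double rel_perf = 1.0 + gain_pct / 100.0;
    double perf_per_dollar = rel_perf / (new_price / old_price);
    printf("%s vs %s: +%.0f%% perf at %.0f%% of the price -> %.2fx perf/$\n",
           newer, older, gain_pct, 100.0 * new_price / old_price, perf_per_dollar);
}

int main(void) {
    compare("5600X ($299)", 299.0, 13.0, "3800XT ($399)", 399.0); /* -> 1.51x perf/$ */
    compare("5800X ($449)", 449.0, 12.0, "3900XT ($499)", 499.0); /* -> 1.24x perf/$ */
    return 0;
}
```

By that measure both new parts come out well ahead of the XTs they displace, whatever one thinks of the 3600 comparison.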
 
Last edited:

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
...no, I'm saying the cache does not have a differing effect at different resolutions. The effect will be the same regardless of resolution.

Did you not read point 2 where I spelled that out?
If what you're saying is correct, then it means that the data size in cache is the same regardless of whether the game is running at 720p or 4K?
 

ondma

Platinum Member
Mar 18, 2018
2,745
1,319
136
I think he's wrong to do so, especially since he highlighted in his 10600K review that it was basically an 8700K at a 22% discount (rather than as a 9400F with SMT enabled, unlocked/high clocks, and an $80 price hike), though he did remark that a 22% discount over a 2.5-year period isn't really all that stellar.

In AMD's case, it's really difficult to compare the 5600X to the 3600 in a similar way because it's so vastly different from the prior gen, let alone the original Zen chips. It's not just a rebranding or mature silicon; it's a top-to-bottom redesign.

The 10600K beats the 8700K by 5-10% at a 22% discount, yes, but if that's how he's going to frame it, he should have framed the 5600X in a similar fashion: the 5600X beats the 3800XT by 1% in CPU tests and by 13% in 720p gaming at a $100 discount.

Similarly, the 5800X beats the 3900XT by 4% in CPU tests and by 12% in 720p gaming at a $50 discount.

It's all about how you frame it. No matter how you slice it, AMD are offering more performance for less money than the prior gen when you compare the 5600X and 5800X to more expensive prior-generation parts released 16 months ago, or even 4 months ago.
I think the problem is that the higher-clocked premium model (5600X) was released first and no 5600 is available. That really makes the comparison to the previous gen difficult, since previously one had the choice of the "base" or the higher-clocked model at launch. Personally, I think the price is reasonable, but not the outstanding bargain of past releases (probably justified by the improved gaming performance).
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
If what you're saying is correct, then it means that the data size in cache is the same regardless of whether the game is running at 720p or 4K?

What he's saying is correct, because it's the GPU that is responsible for processing graphics-related data like pixels, shaders, textures, etcetera, so increasing the resolution does not burden the CPU. Reducing the resolution, however, does increase the burden on the CPU, because the CPU has to send data to the GPU faster: the framerate typically increases at low resolutions since the GPU has less work to do per frame.

There are other things the CPU is usually responsible for processing, however, like draw calls, animations, game logic, AI and physics, if I'm not mistaken.
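
A simple way to see why resolution changes what the results reflect is the usual frame-time bottleneck model: each frame costs roughly max(CPU time, GPU time). A toy sketch in C, with all per-frame timings invented for illustration:

```c
#include <stdio.h>

/* Toy bottleneck model: frame time ~= max(cpu_ms, gpu_ms).
 * All per-frame timings are invented for illustration, not benchmark data. */
static double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / (cpu_ms > gpu_ms ? cpu_ms : gpu_ms);
}

int main(void) {
    double cpu_fast = 4.0, cpu_slow = 5.0;   /* e.g. bigger-cache vs smaller-cache CPU */
    double gpu_720p = 3.0, gpu_4k = 16.0;    /* GPU cost scales with resolution; CPU cost mostly doesn't */

    printf("720p: fast CPU %.1f fps, slow CPU %.1f fps\n",
           fps(cpu_fast, gpu_720p), fps(cpu_slow, gpu_720p)); /* 250.0 vs 200.0: CPU visible */
    printf("4K:   fast CPU %.1f fps, slow CPU %.1f fps\n",
           fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k));     /* 62.5 vs 62.5: GPU bound */
    return 0;
}
```

The cache helps whenever the CPU side is the limiter; at high resolution the GPU term dominates the max(), so the benefit is still there, it just stops being visible in the framerate.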
 
Last edited:

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
axaxaxaxa AnandTech tested at 600-720p for the gaming benchmarks???? wtf axaxaxaxa

edit: just found they also tested at 384p

axaxaxaxa

wtf, Zen 3 has so much potential. If we had a bigger GPU the gap would be even wider; Zen 3 is even GPU limited in those 384p games.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
There are other things the CPU is usually responsible for processing, however, like draw calls, animations, game logic, AI and physics, if I'm not mistaken.
I wish a game dev/programmer would shed more light on this, because if cache size is not significant to gaming, then how does one explain the performance of the quad-core i7 5775C, with a 3.3GHz base, 3.7GHz boost and 128MB L4 cache:
Civilization VI, 1080p Max
Strange Brigade DX12 - 1080p Ultra
F1 2019 - 1080p Ultra

With its lowly clocks, the only thing the 5775C and 5675C have that Intel's other CPUs don't is that L4 cache. Certainly, a big L3 cache would have even more impact, thanks to lower access latency (the L4 cache on these chips is on a separate die). One could make the case that some games benefit less from huge caches, but to dismiss the thought outright as "useless" is bordering on ignorance and arrogance, especially when the chip architects at AMD have clearly explained to marketing that the huge L3 cache is going to be great for gaming, hence "game cache." Even if one's in doubt about this, the Intel examples above should leave no doubt.

Core i7 5775c vs Core i7 4790k Anand Bench Comparison
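
For what it's worth, the cache effect is easy to see outside of games too. Below is a minimal pointer-chasing probe in C (the sizes, iteration counts, and structure are my own arbitrary choices): latency per access steps up each time the working set outgrows a cache level, which is exactly the cliff a big L3 or L4 pushes further out.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Pointer-chasing latency probe. Each slot stores the index of the next slot
 * in one big cycle (Sattolo's algorithm), so every access is serialized and
 * pays the real latency of whichever cache level the working set fits in. */
static double chase_ns(size_t n, size_t steps) {
    size_t *next = malloc(n * sizeof *next);
    if (!next) return -1.0;
    for (size_t i = 0; i < n; i++)
        next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {      /* Sattolo: single-cycle permutation */
        size_t j = (size_t)rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }
    size_t p = 0;
    clock_t t0 = clock();
    for (size_t s = 0; s < steps; s++)
        p = next[p];
    clock_t t1 = clock();
    volatile size_t sink = p; (void)sink;     /* keep the chase from being optimized out */
    free(next);
    return (double)(t1 - t0) / CLOCKS_PER_SEC * 1e9 / (double)steps;
}

int main(void) {
    /* Working sets from 32KB to 512MB: watch ns/access step up past L1, L2, L3. */
    for (size_t kb = 32; kb <= 512 * 1024; kb *= 4)
        printf("%8zu KB: %6.1f ns/access\n",
               kb, chase_ns(kb * 1024 / sizeof(size_t), 20u * 1000 * 1000));
    return 0;
}
```

On a chip like the 5775C the eDRAM adds an extra plateau between L3 and DRAM; a game whose hot working set lands on that plateau avoids full memory latency, which fits the results above.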
 
Reactions: Space Tyrant

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I wish a game dev/programmer would share more light on this because if cache size is not significant to gaming, then how does one explain this performance from the Quad Core i7 5775C with a 3.3GHz base, 3.7GHz boost and 128MB L4 Cache:

I don't think anyone said that cache does not affect gaming performance, because it clearly does. A CPU with more cache can avoid lengthy trips to system memory, which accelerates processing and lets the CPU feed the GPU faster. The contention is whether cache affects performance differently at different resolutions, and the answer to that is no, because the CPU does not directly involve itself in graphical processing.

A Core i7 5775C and a 9700K will have similar or equal performance at 4K, assuming the game is GPU bound, despite the 5775C having much more cache.
 