Vishera Review Up - AnandTech


Mallibu

Senior member
Jun 20, 2011
243
0
0
I'm actually glad they didn't puss out and try to reduce power consumption. I'd rather have the performance than save what amounts to pennies a year on the electricity bill.

If they did that, there goes the 7-8% advantage they gained. They would have released a processor that performed exactly like the previous one (aka bad), with power consumption improving only from ultra bad (Bulldozer) to merely bad.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I am calm. It's not me. People are so threatened that AMD is going out of business that they look at these underperforming power eaters and say "not bad". It's amusing. This is AMD's best 8-core CPU. They're in deep doo-doo. Comparing to Intel's 4 cores is fun and all, based on price. But what exactly is the reason for a price comparison to begin with? Because WHY exactly? It has nothing to do with price. A 6-core Intel wipes the floor with these. You guys might think based on PRICE AMD has a nice CPU. That's laughable. I understand the price comparison is good... same as CPU efficiency is a big deal. Bottom line: Intel is cheaper to own through the lifetime of the product by a large factor. But then the same people who sing the praises of the cheaper CPU avoid the fact that Intel's 8-threaded 4-core is cheaper to own and a better bargain. AMD competes with the 3570, so why even show the 8-thread model in the reviews? In so doing, the reality is hidden even deeper in this tangled story of Intel vs. AMD, confusing the cost of ownership. If fans think this will save AMD, they are wrong. Next quarter tells the story. I just hate the fact that I will have to listen to 3 more months of how great this CPU is, while the earnings report will show something altogether different. Strange world.

Choking on your whamburger? Shall I call a whambulance for you?

whambulance

The imaginary rescue vehicle that will rescue someone when they are incessantly whining over a trivial matter. Used mockingly, but in good humor.
 

angevil

Member
Sep 15, 2012
29
0
0
I'm actually glad they didn't puss out and try to reduce power consumption. I'd rather have the performance than save what amounts to pennies a year on the electricity bill.

Some people pay more for electricity than in the United States. Where I live, the price after taxes is around $0.12 per kWh, without any off-peak reductions, while wages are over 10x smaller.

A $20 saving in electricity here, adjusted for buying power, would be over $200 there. In other words, $20 is worth to me what $200 is worth to you. Actually, I have the computer on 12 hours every day with both heavy and light use. If you do the math it would be more than $20 per year in my case, but I gave that as a typical scenario.

It may sound like peanuts to you, but I am upgrading to an i3-3220 and I love the 55W power consumption.
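To make that arithmetic concrete, here's a minimal sketch in Python. The rate, hours, and wage ratio come from the post above; the 40W average delta between CPUs is purely an illustrative assumption:

```python
# Back-of-the-envelope: extra yearly electricity cost, adjusted for buying power.
PRICE_PER_KWH = 0.12    # $/kWh after taxes, no off-peak reduction (from the post)
HOURS_PER_DAY = 12      # machine on 12 h/day (from the post)
WAGE_RATIO = 10         # local wages ~10x lower than in the US (from the post)
DELTA_WATTS = 40        # assumed average extra draw of the hungrier CPU

kwh_per_year = DELTA_WATTS / 1000 * HOURS_PER_DAY * 365
extra = kwh_per_year * PRICE_PER_KWH
print(f"extra: ${extra:.2f}/yr, felt like ${extra * WAGE_RATIO:.2f}/yr in US terms")
# -> extra: $21.02/yr, felt like $210.24/yr in US terms
```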
 

Ketchup

Elite Member
Sep 1, 2002
14,559
248
106
I think it was a mistake offering chips that overclock so well. I mean, why didn't they just come clocked higher, considering the price point they are going for? It would sure make them look better compared to Intel. We, as enthusiasts, make too small of a target audience for such decisions to be based solely upon us. And as said before, their target audience doesn't care much about a few extra degrees to get there.

In my eyes, AMD was that company that wasn't trusted as much as Intel, but we all wanted one because they offered better performance at a lower price.

They took away both, so what is left? Do they think they have elevated themselves that much in the trust part?

Certainly not.

Every server I have put together for our customers has had an Intel processor. Every controller I have worked with has an Intel processor. I am not trying to speak for everyone, but I believe our customers use similar technology to other companies.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
The gaming argument fails since nobody games at low resolution (unless they have to due to a weak GPU). Once you get to 1080p, almost all titles will show no difference, since gaming at 100fps vs. 130fps is basically the same. There are those few titles like Civ and Skyrim that for some reason take a performance hit with FX, but they are still more than playable at 1080p. Those who want the absolute highest FPS in their games will buy Intel for sure.

Those who want a relatively cheap system to do everything one can think of (gaming included) may pick up an 8350 or even an 8320 (and OC it).

Sorry for quoting a post so early in the thread, but this bothers me. Yes, I don't much care either when I'm playing Sims 3 and I get 400fps with my i5 instead of 300fps. However, I do care that when playing Guild Wars 2 I get 60+fps instead of 40, because it's such a CPU-heavy game and doesn't take advantage of 8 cores. I do care when I'm late-game in a Starcraft 2 mod and I get 30fps instead of 20. I also care when I'm playing Mario Galaxy 2 in Dolphin that it stays pegged at 60fps instead of dropping into the 40s when there's a lot going on.

These are all places where Bulldozer's (and to a lesser extent, Piledriver's) gaming performance fails me. It's not always a case of 300 vs. 400fps; a fair number of games (that I play) are actually CPU-bound below 60fps.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I think it was a mistake offering chips that overclock so well. I mean, why didn't they just come clocked higher, considering the price point they are going for? It would sure make them look better compared to Intel. We, as enthusiasts, make too small of a target audience for such decisions to be based solely upon us. And as said before, their target audience doesn't care much about a few extra degrees to get there.

In my eyes, AMD was that company that wasn't trusted as much as Intel, but we all wanted one because they offered better performance at a lower price.

They took away both, so what is left? Do they think they have elevated themselves that much in the trust part?

Certainly not.

Every server I have put together for our customers has had an Intel processor. Every controller I have worked with has an Intel processor. I am not trying to speak for everyone, but I believe our customers use similar technology to other companies.

The stock voltage is already pretty high.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Sorry for quoting a post so early in the thread, but this bothers me. Yes, I don't much care either when I'm playing Sims 3 and I get 400fps with my i5 instead of 300fps. However, I do care that when playing Guild Wars 2 I get 60+fps instead of 40, because it's such a CPU-heavy game and doesn't take advantage of 8 cores. I do care when I'm late-game in a Starcraft 2 mod and I get 30fps instead of 20. I also care when I'm playing Mario Galaxy 2 in Dolphin that it stays pegged at 60fps instead of dropping into the 40s when there's a lot going on.

These are all places where Bulldozer's (and to a lesser extent, Piledriver's) gaming performance fails me. It's not always a case of 300 vs. 400fps; a fair number of games (that I play) are actually CPU-bound below 60fps.




The misconception that comes from listing only the average is that the top-end fps was just lower and thus it doesn't matter. In actuality it could be the exact opposite: the minimums tank, the game stutters, and you would have no idea from a review that showed only the average.
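To illustrate with made-up frame times: two runs can share the same average fps while one of them hitches badly, which is exactly what an average-only chart hides:

```python
# Two synthetic runs with identical average fps but very different minimums.
# Frame times are invented for illustration (milliseconds per frame).
smooth  = [10.0] * 100                 # a steady 100 fps
stutter = [8.0] * 99 + [208.0]         # mostly 125 fps, with one huge hitch

def fps_stats(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    worst_fps = 1000.0 / max(frametimes_ms)   # fps during the slowest frame
    return avg_fps, worst_fps

for name, run in (("smooth", smooth), ("stutter", stutter)):
    avg, worst = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, worst frame {worst:.1f} fps")
# smooth: avg 100 fps, worst frame 100.0 fps
# stutter: avg 100 fps, worst frame 4.8 fps
```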
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
I think it was a mistake offering chips that overclock so well. I mean, why didn't they just come clocked higher, considering the price point they are going for? It would sure make them look better compared to Intel. We, as enthusiasts, make too small of a target audience for such decisions to be based solely upon us. And as said before, their target audience doesn't care much about a few extra degrees to get there.

Given that the FX-8350 turbos from 4.0 GHz to only 4.2 GHz, I think they pushed Vishera about as high as they can get and still guarantee a 125W TDP. If they clocked Vishera any higher and added a 140W CPU, some people might get the impression that Vishera is suddenly an "alarmingly hot" CPU.
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
I am calm. It's not me. People are so threatened that AMD is going out of business that they look at these underperforming power eaters and say "not bad". It's amusing. This is AMD's best 8-core CPU. They're in deep doo-doo. Comparing to Intel's 4 cores is fun and all, based on price. But what exactly is the reason for a price comparison to begin with? Because WHY exactly? It has nothing to do with price. A 6-core Intel wipes the floor with these. You guys might think based on PRICE AMD has a nice CPU. That's laughable. I understand the price comparison is good... same as CPU efficiency is a big deal. Bottom line: Intel is cheaper to own through the lifetime of the product by a large factor. But then the same people who sing the praises of the cheaper CPU avoid the fact that Intel's 8-threaded 4-core is cheaper to own and a better bargain. AMD competes with the 3570, so why even show the 8-thread model in the reviews? In so doing, the reality is hidden even deeper in this tangled story of Intel vs. AMD, confusing the cost of ownership. If fans think this will save AMD, they are wrong. Next quarter tells the story. I just hate the fact that I will have to listen to 3 more months of how great this CPU is, while the earnings report will show something altogether different. Strange world.

:biggrin:
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
A $20 saving in electricity here, adjusted for buying power, would be over $200 there. In other words, $20 is worth to me what $200 is worth to you. Actually, I have the computer on 12 hours every day with both heavy and light use. If you do the math it would be more than $20 per year in my case, but I gave that as a typical scenario.

It may sound like peanuts to you, but I am upgrading to an i3-3220 and I love the 55W power consumption.

Two things:

1 - The TDP isn't the power consumption. Sort of related, but still different. The TDP is the watts of heat a cooler is asked to dissipate. Obviously, the hotter the chip, the more power it's consuming (and the heat actually bumps that up as well. Thanks IDC)

2 - In general usage, and I'd imagine your usage as well, idle power consumption matters far more than max load during Prime95 or LINPACK. Those are pretty much the CPU equivalent of FurMark for the GPU: they aren't realistic power consumption figures even under load, but rather represent a worst-case scenario you're unlikely to ever encounter. Processors spend an overwhelming amount of time at idle, rarely kicking up to higher P-states or out of their idle C-states, and very, very rarely running at full load.


Imo, AMD's biggest loss here as far as power consumption goes is the chipset. Without an integrated NB/SB, you're looking at ~13-20W TDP for the NB alone. That doesn't seem like much, but it's unnecessary and shows they're further behind Intel in reducing overall system power (and heat, in a mobile form factor). Though their idle power reduction is more aggressive, the chipset is a big hurdle. I really wouldn't worry about load power consumption that much, other than perhaps buying a PSU that can supply about 20% more than the max system load.
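As a sanity check on the idle-dominates point, here's a duty-cycle-weighted average with invented (but plausible) whole-system numbers; none of these watt figures are measurements:

```python
# Duty-cycle-weighted average power draw. All watt figures are assumptions
# for illustration, not measurements of any particular system.
idle_w, load_w = 70.0, 200.0         # assumed whole-system idle / full load
hours_idle, hours_load = 11.5, 0.5   # split of a 12 h/day session

avg_w = (idle_w * hours_idle + load_w * hours_load) / (hours_idle + hours_load)
print(f"average draw: {avg_w:.1f} W")  # -> 75.4 W, barely above idle
```

Even a 130W gap at full load moves the average by only a few watts when the machine spends ~96% of its on-time idling.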
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I think it was a mistake offering chips that overclock so well. I mean, why didn't they just come clocked higher, considering the price point they are going for? It would sure make them look better compared to Intel. We, as enthusiasts, make too small of a target audience for such decisions to be based solely upon us. And as said before, their target audience doesn't care much about a few extra degrees to get there.

In my eyes, AMD was that company that wasn't trusted as much as Intel, but we all wanted one because they offered better performance at a lower price.

They took away both, so what is left? Do they think they have elevated themselves that much in the trust part?

Certainly not.

Every server I have put together for our customers has had an Intel processor. Every controller I have worked with has an Intel processor. I am not trying to speak for everyone, but I believe our customers use similar technology to other companies.


TDP and yields, maybe. As others have said, this is probably as fast as they can go with a 125W TDP. And who knows, maybe the number of chips they get from a wafer that can operate at 4.2GHz with, say, a 4.5GHz turbo falls off pretty steeply.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
TDP and yields, maybe. As others have said, this is probably as fast as they can go with a 125W TDP. And who knows, maybe the number of chips they get from a wafer that can operate at 4.2GHz with, say, a 4.5GHz turbo falls off pretty steeply.

I remember reading an article stating that the resonant clock mesh, the 3rd-party IP AMD was licensing for the Piledriver cores, was pretty ineffective above ~4GHz.

Bear in mind they missed their clock speed goals too; the cores were initially designed to hit ~4.5GHz (they missed IPC by >10% too, but that's another story).
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Wouldn't that introduce problems too? Say, bumping up another thread's stores to the LLC, which isn't going to get any better, judging from AMD's last statements regarding Steamroller? For power consumption that would do wonders, though. With that much cache, you need a way of toggling it on and off.

What are you basing that on? I've always thought of caches as being a much smaller portion of the power consumption than cores. Is any data available on cache vs. core power consumption?

From TechPowerUp it seems the clock-for-clock improvement ranges from 3% in 3DMark 11, to 15% in Cinebench, to 25% in WinRAR.

Did any of the reviewers determine how much of the performance uplift comes from differences in the amount of time spent in boost states? I didn't see that when I skimmed a couple, but that would be useful for determining how much performance uplift comes from IPC improvements vs. power savings (or improvements in the algorithm that determines when to turbo).
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I remember reading an article stating that the resonant clock mesh, the 3rd-party IP AMD was licensing for the Piledriver cores, was pretty ineffective above ~4GHz.

Bear in mind they missed their clock speed goals too; the cores were initially designed to hit ~4.5GHz (they missed IPC by >10% too, but that's another story).


Well, a few of the reviews in the OP show 5+GHz. Until these chips are out in the wild, who knows if that'll be the norm or not. But to sell them at $200 I would think AMD has to be able to get good yields. They're moderately large chips; I imagine the clocks were pushed about as high as they can go while still selling at a price where they can compete.
 

crazymonkeyzero

Senior member
Feb 25, 2012
363
0
0
I have a general question. Does the FX-8350 use or have Hyper-Threading at all, or is it actually 8 physical cores? If it is actually 8 physical cores, the chip can have potential uses in workstations, as there are many kinds of software running old code that support only physical cores, and thus HT would be pointless for these users, and an i7 would essentially just be running as an i5, making AMD's offering more palatable.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
What are you basing that on? I've always thought of caches as being a much smaller portion of the power consumption than cores. Is any data available on cache vs. core power consumption?

The caches tend to be the most transistor-dense portions of a chip outside of on-die GPUs.

I have a general question. Does the FX-8350 use or have Hyper-Threading at all, or is it actually 8 physical cores? If it is actually 8 physical cores, the chip can have potential uses in workstations, as there are many kinds of software running old code that support only physical cores, and thus HT would be pointless for these users, and an i7 would essentially just be running as an i5, making AMD's offering more palatable.

It'll depend on the workload. Technically, the FX parts are 4-module parts and not 8 full cores. Each module has 2 separate integer cores but they share a single FPU, so if your workload is FP-heavy you're effectively using a 4-core processor. If your workload is integer-heavy then it'll likely outpace an Intel quad with HT.

The Win7/Win8 scheduler sees them as 8 individual cores rather than Intel-like SMT.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I have a general question. Does the FX-8350 use or have Hyper-Threading at all, or is it actually 8 physical cores? If it is actually 8 physical cores, the chip can have potential uses in workstations, as there are many kinds of software running old code that support only physical cores, and thus HT would be pointless for these users, and an i7 would essentially just be running as an i5, making AMD's offering more palatable.

At a software level, a program can't tell the difference between a physical core and a Hyper-Threaded logical core (at least that's what I've read).

Oversimplified version:

AMD = 8 cores that suffer a potential ~20% single-threaded performance loss when all cores are loaded, due to shared resources
= 6.4 effective cores

i5s = 4 cores that each perform ~50% faster
= doing the work of 6 AMD cores

i7s = those 4 cores with ~1.3x throughput from Hyper-Threading
= 7.8 effective cores

This explains why in multithreaded workloads the FX chips tend to beat the i5s (which are only roughly equal to 6 AMD cores) but tend to lose to the i7s, except in specific (FP-light) workloads. Unfortunately, AMD's 8 cores also draw the power of 8 cores, and both i5s and i7s are ~50% faster (on average) in single-threaded workloads.
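Spelling that arithmetic out in Python (same rough factors as the post above, not benchmark data):

```python
# Yuriman's "effective cores" model, using the assumed factors from the post.
AMD_CORES, SHARED_PENALTY = 8, 0.20     # ~20% loss with all cores loaded
I5_CORES, PER_CORE_ADVANTAGE = 4, 1.5   # i5 cores ~50% faster each
HT_SCALING = 1.3                        # ~30% extra throughput from HT

amd_eff = AMD_CORES * (1 - SHARED_PENALTY)   # 6.4 effective cores
i5_eff  = I5_CORES * PER_CORE_ADVANTAGE      # worth 6.0 AMD cores
i7_eff  = i5_eff * HT_SCALING                # worth 7.8 AMD cores
print(amd_eff, i5_eff, round(i7_eff, 1))     # -> 6.4 6.0 7.8
```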
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I have a general question. Does the FX-8350 use or have Hyper-Threading at all, or is it actually 8 physical cores? If it is actually 8 physical cores, the chip can have potential uses in workstations, as there are many kinds of software running old code that support only physical cores, and thus HT would be pointless for these users, and an i7 would essentially just be running as an i5, making AMD's offering more palatable.


Each module has two cores. But, some resources are shared by those cores.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
The caches tend to be the most transistor-dense portions of a chip outside of on-die GPUs.

That's true, but it doesn't necessarily result in higher power consumption in the caches. There are two major reasons why caches tend to be lower power than their transistor density might lead you to expect: 1) activity factor, and 2) transistor selection. Lower-level caches are generally not accessed very often, and even when they are accessed, only a very small portion of the cache actually does anything. Since the vast majority of transistors aren't switching, there's very little dynamic power consumption. Designers know this, and that's why the second part comes into play: they specifically design caches to leak less. Bit cells (the 6-8 transistor circuits that store data in caches) can use longer channel lengths and/or higher threshold voltages for their transistors, reducing the static (leakage) power consumption relative to the transistors used for more aggressively-designed logic inside a core.

TL;DR: Yes, there are more transistors, but that doesn't directly imply they burn a lot of power.
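The activity-factor point falls straight out of the first-order CMOS dynamic power equation, P = alpha * C * V^2 * f. A toy comparison, with every number invented purely for illustration:

```python
# First-order dynamic power: P = alpha * C_switched * V^2 * f.
# All transistor counts and capacitances here are invented for illustration.
V, F = 1.2, 4.0e9                      # supply volts, clock Hz
C_PER_XTOR = 1.0e-15                   # assumed ~1 fF of switched C per device

def dyn_power_w(n_transistors, alpha):
    # alpha = fraction of transistors switching in an average cycle
    return alpha * (n_transistors * C_PER_XTOR) * V**2 * F

core  = dyn_power_w(2e8, alpha=0.10)    # fewer transistors, busy logic
cache = dyn_power_w(1e9, alpha=0.001)   # 5x the transistors, mostly idle
print(f"core ~{core:.0f} W, cache ~{cache:.1f} W")  # core ~115 W, cache ~5.8 W
```

Five times the transistors, a twentieth of the dynamic power, because almost none of the cache switches in any given cycle; and that's before the lower-leakage bit-cell transistors CTho9305 describes.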
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
While I agree that ideally you want the highest numbers possible, how about you show us a setup that can sustain those 120fps in BF3, Metro 2033, and similar games running 1920x1080, highest settings, 8xMSAA, 16xAF? In fact, many newer games, even on lowered detail, won't reach 120fps at 1080p. And if you tell me that you can lower the resolution, well, how could you complain of a "massive difference" when nothing looks uglier than an LCD not running at native res?

Edit: My comment was to highlight that no matter the setup, we are not there yet for 120fps at 1080p.

AA is not required, at least for multiplayer games; there it's a good idea to sacrifice AA for FPS. Smoother/higher FPS will make you a better player and give you an edge; AA will not.

And most decent high-end setups will push 100+ fps with no AA. If you are spending $500+ on a panel, you are probably spending $500+ on the GPU(s) to drive it.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
That's true, but it doesn't necessarily result in higher power consumption in the caches. There are two major reasons why caches tend to be lower power than their transistor density might lead you to expect: 1) activity factor, and 2) transistor selection. Lower-level caches are generally not accessed very often, and even when they are accessed, only a very small portion of the cache actually does anything. Since the vast majority of transistors aren't switching, there's very little dynamic power consumption. Designers know this, and that's why the second part comes into play: they specifically design caches to leak less. Bit cells (the 6-8 transistor circuits that store data in caches) can use longer channel lengths and/or higher threshold voltages for their transistors, reducing the static (leakage) power consumption relative to the transistors used for more aggressively-designed logic inside a core.

TL;DR: Yes, there are more transistors, but that doesn't directly imply they burn a lot of power.

I've never really pulled out pen and paper to crunch the numbers, but I've always assumed that because the xtor width in SRAM is physically limited (so that the cell size itself can be so small), the net drive current from the cache xtors on a per-bit basis ends up rather small.

Since W/L is a smaller number than, say, the W/L for logic circuits, the current itself is W-limited, making the power W-limited. Or to say it differently, you just can't push enough current through the teeny xtors used in an SRAM cell to make the power usage become large (or so I have always presumed).
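For reference, that intuition matches the textbook long-channel square-law model, where drive current scales directly with W/L (modern short-channel devices are velocity-saturated, so the gate-overdrive exponent is nearer 1, but the W dependence survives):

```latex
I_D \;=\; \frac{1}{2}\,\mu_n C_{ox}\,\frac{W}{L}\,\left(V_{GS}-V_t\right)^2
```

A minimum-width SRAM pull-down simply can't source the current, and hence burn the power, of the wide, fast transistors used in core logic.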
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Some people pay more for electricity than in the United States. Where I live, the price after taxes is around $0.12 per kWh, without any off-peak reductions, while wages are over 10x smaller.

A $20 saving in electricity here, adjusted for buying power, would be over $200 there. In other words, $20 is worth to me what $200 is worth to you. Actually, I have the computer on 12 hours every day with both heavy and light use. If you do the math it would be more than $20 per year in my case, but I gave that as a typical scenario.

It may sound like peanuts to you, but I am upgrading to an i3-3220 and I love the 55W power consumption.

You don't understand how little the difference is.

A good rule of thumb for US energy costs:

Running an extra 100W around the clock for an entire year costs about $100 more in your utility bills.

However, when you are looking at things like CPU power usage, you need to realize you don't run it at peak for a year. You probably don't even run it at peak for one minute per day, unless you regularly use multi-threaded encoding apps or run a distributed computing project that loads all cores. Of course, if you are effectively paying 10x the US rate relative to your income, either of those activities is probably a dumb thing to do.

In reality, you probably run your computer at idle, powered off, or near idle for 23.5 hours a day. For the remaining 30 minutes you might run it at full load on 1-2 cores, but almost never on all 8 cores.

For a usage pattern like this, you will find the extra power usage of a Bulldozer CPU averages out to about 2 watts, or $2 extra per year, or $0.16 extra on each monthly utility bill.

For those of you in foreign countries effectively paying outrageous electricity prices, this is an insane extra $1.60 per month (in buying-power terms) for using a power-hungry FX-8150. Of course, you probably saved $200 or more by buying an AMD CPU and board in the first place, so in a mere 10 years the extra electricity cost will make the 8150 the worse deal.
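Checking that rule of thumb with a quick sketch (assuming a flat $0.12/kWh US rate):

```python
# Rule-of-thumb check: yearly cost of a continuous average power draw.
RATE = 0.12                 # $/kWh, assumed flat US residential rate
HOURS_PER_YEAR = 24 * 365

def yearly_cost(avg_watts):
    return avg_watts / 1000.0 * HOURS_PER_YEAR * RATE

print(f"${yearly_cost(100):.2f}")  # $105.12 -> '100W for a year is ~$100'
print(f"${yearly_cost(2):.2f}")    # $2.10   -> the ~2 W average-delta case
```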
 

iCyborg

Golden Member
Aug 8, 2008
1,344
61
91
The Win7/Win8 scheduler sees them as 8 individual cores rather than Intel-like SMT.
Wasn't this addressed by those Bulldozer patches?

At a software level, a program can't tell the difference between a physical core and a hyperthreaded core (at least that's what I've read).
It doesn't, because it normally doesn't care, but in principle, it could know if it wanted.
 