Phenom II > i7


Viditor

Diamond Member
Oct 25, 1999
Originally posted by: ShawnD1
Many of us don't care about practical tests because many of the games we'll be playing don't exist yet. A $100 processor and a $300 processor will have very similar frame rates in today's games (Perseus Mandate), so the question is really how well they'll play tomorrow's games. Since we can't test games that don't exist, the best we can do is run impractical 800x600 tests to see how fast the processors are in a CPU-bound scenario. 2-3 years from now, buying a new video card for your old i7 or Phenom II system will create the same CPU limitations as an 800x600 test done today.

Huh?
New video cards are becoming LESS dependent on CPUs, not MORE so...
This trend will only become more pronounced as the continued development of OpenCL begins to bear fruit. That's why both Intel and AMD are so focused on their respective "Fusion" scenario releases...
800x600 has absolutely no bearing on anything except some hypothetical discussion that will never affect the user (e-penis stuff...).
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: Viditor
AMD are so focused on their respective "Fusion" scenario releases...

I don't know why, but for some peculiar reason I can't help but laugh every time I read an AMD logo with the "The Future is Fusion" slogan in it...it just strikes me as marketing saying "we think the future is going to be stuff we can't sell you today, but please, oh dear for the love of god please, buy our un-fit-for-the-future hardware today so we can pay our bills, but remember, whatever you buy today is complete crap because the future is fusion".

Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?
 

MODEL3

Senior member
Jul 22, 2009
Originally posted by: Idontcare
Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?

No, actually it makes sense: the marketing company AMD used is secretly being bribed by Intel to make an ad for Clarkdales! (not exactly Fusion, but anyway)

 

alkemyst

No Lifer
Feb 13, 2001
it's got a fucking dragon as a mascot; Intel has some bunny man.

that's all there is to that.
 

Viditor

Diamond Member
Oct 25, 1999
Originally posted by: Idontcare


Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?

You speak of the famed "Osborne effect"...though according to Osborne himself the story was actually a myth, it's one that persists to this day.
Quoting the wiki:

"In 1983, inventor Adam Osborne pre-announced several next-generation computer models (the "Executive" and "Vixen" models), which had not yet been built, highlighting the fact that they would outperform the existing model. According to the myth, sales of the Osborne 1 immediately plummeted as customers opted to wait for the more advanced systems, leading to a sales decline from which Osborne Computer was unable to recover."
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Idontcare
Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?

Well, how the hell does Apple survive then?

- they are built on hype
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: apoppin

Well, how the hell does Apple survive then?

- they are built on hype

They hype their current shipping products but are very, very secretive about unreleased, in-development products. Perhaps the most so of anyone. In fact I don't think you could have picked a better business example to precisely highlight my point.
 

ShawnD1

Lifer
May 24, 2003
Originally posted by: Viditor

Huh?
New video cards are becoming LESS dependent on CPUs, not MORE

If this is true, we should be able to look at random games and see next to no correlation between how much you pay for a modern processor and how well it runs modern games.
frame rates for some ubisoft RTS
$76 Athlon 250 - 18.7 fps
$119 Phenom X3 720 - 23.3
$189 Phenom X4 940 - 28.6
$245 Phenom X4 965 - 32.1

Now what we do is take those prices and frame rates and we put them into Microsoft Excel. Excel will show us if there's a correlation and more importantly it will show if there's a point of "diminishing returns" where it's no longer worth getting a faster CPU.
shawn's CPU cost and frame rate correlation graph of science
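The Excel step described above is easy to reproduce. Here is a minimal Python sketch of the same least-squares fit and correlation, using only the four price/frame-rate pairs quoted above (the fit coefficients are my own calculation, not figures from the original post):

```python
# Reproducing the Excel check: least-squares fit of frame rate vs. CPU price.
# Prices and frame rates are the figures quoted above (PCGH's RTS bench).
prices = [76, 119, 189, 245]   # USD: Athlon 250, X3 720, X4 940, X4 965
fps    = [18.7, 23.3, 28.6, 32.1]

n = len(prices)
mean_p = sum(prices) / n
mean_f = sum(fps) / n

# slope b and intercept a of the least-squares line: fps = a + b * price
cov = sum((p - mean_p) * (f - mean_f) for p, f in zip(prices, fps))
var = sum((p - mean_p) ** 2 for p in prices)
b = cov / var
a = mean_f - b * mean_p

# Pearson correlation coefficient: how close to a straight line the data is
var_f = sum((f - mean_f) ** 2 for f in fps)
r = cov / (var * var_f) ** 0.5

print(f"fps = {a:.1f} + {b:.3f} * price, r = {r:.3f}")
```

With these four points the correlation coefficient comes out around 0.99, which is what "uncanny how linear" looks like numerically.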

It's uncanny how linear it is, and that shows how the CPU really is a bottleneck in modern games. That bottleneck gets more and more apparent when you expect the processor to last a few years. For example, looking at anandtech bench for FC2, the E6550 and E2140 came out the same year, but one has a 65% higher frame rate than the other. That right there is the definition of a CPU bottleneck in a modern game. If you're one of the suckers who bought the E2140 because games were not CPU bottlenecked at that time, you'd end up buying a new system after maybe 1-2 years. Not because you want to, but because you need to.

You really should care about the possibility of CPU bottlenecking when you buy a new system. People talk about graphics bottlenecks but I can always get around that by turning the graphics down. Fallout 3 will kill a GTX 280 if all the settings are maxed out, but it runs great on a 7950GT if I turn the settings down. CPU is not like that; there's no way out. If the game wants to run this much stuff, that's what it's going to do. The only CPU setting I've seen in any recent game is draw distance in WoW and GTA. If the game still runs like garbage after I've turned that down, I'm totally screwed, I need a whole new system.

 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: ShawnD1
If this is true, we should be able to look at random games and see next to no correlation between how much you pay for a modern processor and how well it runs modern games. [...] It's uncanny how linear it is, and that shows how CPU really is a bottleneck in modern games. [...]

let's see

they are testing ONE game with a GTX 285 and they are only using 16x10 with no AA / no AF??

- it looks like a stupid test to me with a cherry-picked game
- --- what am i missing out on?


look at the review in my sig; it is a HELL of a lot closer when you shift the balance to the GPU with a higher res and 4xAA/16xAF - hell, i tested 14 games with the Athlon II 250 X2 vs Phenom II 550 X2 vs Ph II 720 X3 vs Q9550s .. and i overclocked all of them and also tested at stock


when modern CPUs are overclocked - with a GTX 280 and/or a 4870-X2 - there is less practical difference at 16x10 and 19x12 in modern PC games than that article, which generalizes from one game, suggests


EDIT: i found ONE *other* RTS - World in Conflict - that also NEEDS a superfast quad when paired with 4870 Crossfire or Tri-Fire
- there is very little practical difference playing most modern PC games between a Phenom II at 3.5 GHz and a Q9550s at 4.0 GHz with a single GPU at 19x12 with maxed-out details and with filtering applied.
 

ShawnD1

Lifer
May 24, 2003
Originally posted by: apoppin
look at the review in my sig; it is a HELL of a lot closer when you shift the balance to the GPU with a higher res and 4xAA/16xAF

If I'm trying to play Far Cry 2 on a horrible E2140 that I bought because I was too cheap to buy a real processor, will the game run faster or slower when I increase the game's details?
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: ShawnD1
Originally posted by: apoppin
look at the review in my sig; it is a HELL of a lot closer when you shift the balance to the GPU with a higher res and 4xAA/16xAF

If I'm trying to play Far Cry 2 on a horrible E2140 that I bought because I was too cheap to buy a real processor, will the game run faster or slower when I increase the game's details?

it depends

if you leave that poor little e2140 stock, you will have a very unsatisfactory experience

if you overclock it to 3.4 GHz, you can pair it with a much faster GPU and get a good gaming experience


i am saying you are GENERALIZING from ONE *RTS*

Please look at the benches from the 14 games i presented - you tell me what conclusion you derive from them

- they are all modern games - mostly with min/avg/max framerates charted - with 4 CPUs (stock/overclocked), and two GPUs (4870-X2/GTX 280) at 2 resolutions (16x10/19x12) with 4xAA/16xAF - the way we like to play (DX10/Vista/4GB RAM).


 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: ShawnD1
If this is true, we should be able to look at random games and see next to no correlation between how much you pay for a modern processor and how well it runs modern games. [...] Now what we do is take those prices and frame rates and we put them into Microsoft Excel. Excel will show us if there's a correlation and more importantly it will show if there's a point of "diminishing returns" where it's no longer worth getting a faster CPU. [...]

I like what you did there with the data. If I could make one unsolicited suggestion, with the hope of offending no one, it would be that you really can't/shouldn't compare the performance of the system to the varying cost of just a single underlying component of that system.

Your cost vs. fps graph should be for the cost of the system used to generate the fps.

As an example, a hypothetical $200 cpu does not need to double the performance of a $100 cpu in order to make the performance/cost of the system compelling in favor of the $200 cpu.

In your case the linear correlation will be maintained after adjusting for total system cost (which is a fixed adder to the cpu prices): the slope will remain unchanged, but the y-intercept will shift down by slope times the fixed cost.
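A quick numerical check of the fixed-adder effect. The $900 platform cost here is a made-up number purely for illustration; the point is that adding a constant to every CPU price leaves the least-squares slope untouched and only moves the intercept:

```python
# Sketch: adding a fixed platform cost to every CPU price leaves the
# least-squares slope unchanged and shifts only the intercept.
cpu_prices = [76, 119, 189, 245]
fps        = [18.7, 23.3, 28.6, 32.1]
platform   = 900.0   # hypothetical fixed cost of the rest of the system

def fit(xs, ys):
    """Return (intercept, slope) of the least-squares line ys = a + b*xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

a_cpu, b_cpu = fit(cpu_prices, fps)
a_sys, b_sys = fit([p + platform for p in cpu_prices], fps)

print(f"slope: {b_cpu:.4f} -> {b_sys:.4f} (unchanged)")
print(f"intercept: {a_cpu:.2f} -> {a_sys:.2f} (shifted by -slope * platform)")
```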
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Idontcare
I like what you did there with the data [...] Your cost vs. fps graph should be for the cost of the system used to generate the fps.

well, in *this case* the data is skewed because of being from one very non-standard bench that was evidently cherry picked to show *eXtreme differences* between CPUs in a single RTS at a middle resolution without AA/AF.

You also have to factor in the hours spent playing PC games over the course of the HW
- 1 hour a week and it is not a gaming PC at all

What makes an upgrade compelling or worthwhile is your dissatisfaction with your current HW
- then, if it is satisfactory, it is priceless .. there is no way to factor in frustration with inferior HW as a cost

 

ShawnD1

Lifer
May 24, 2003
Originally posted by: Idontcare
I like what you did there with the data. If I could make one unsolicited suggestion, with the hope of offending no one, it would be that you really can't/shouldn't compare the performance of the system to the varying cost of just a single underlying component of that system.

Your cost vs. fps graph should be for the cost of the system used to generate the fps.
That's a good idea. PCGH used a GTX 285 video card and 4GB of DDR3 RAM. The Phenom 940 is the wrong socket and they're not even sold in Canada, so I'll use the 945 for the sake of comparison.

dvd burner
ATX tower
1TB hard drive
AM3 motherboard
GTX 285 (includes a free copy of batman!)
500W PSU
4GB DDR3
SSD

total cost after shipping - CPU name - frame rate in RTS game
$997 - Athlon II X2 250 - 18.7
$1060 - Phenom II X3 720 - 23.3
$1102 - Phenom II X4 945 - 28.6
$1202 - Phenom II X4 965 - 32.1

The i7 used was with 6GB DDR3 so it's a bit more expensive; I picked the cheapest motherboard for this one too:
$1364 - i7 920 - 46.9

graph with a bad title

I don't want to draw a trend line on this one, but what you can do is put a piece of paper on the graph to see which processor is the best value. Put the left side of the paper on top of the 0,0 point and put the right side of the paper on one of the data points. The point with the greatest slope is the best value. As expected, the i7 is the best value, then Phenom 965, 945, 720, and Athlon 250 in that order. Better processor = better value? Of course. The more expensive processors (generally) get the best frame rate relative to the cost of the entire platform, and you won't need to upgrade as often; that i7 should last at least 3-4 years but an Athlon II will be lucky to get even 2 years before becoming totally useless.
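The piece-of-paper trick is just comparing the slope of the line from the origin to each point, i.e. frame rate per dollar. A short sketch using the system totals listed above:

```python
# Ranking systems by fps per dollar: the slope of the line from the
# origin (0,0) to each data point. Costs/fps are the figures listed above.
systems = [
    ("Athlon II X2 250",  997, 18.7),
    ("Phenom II X3 720", 1060, 23.3),
    ("Phenom II X4 945", 1102, 28.6),
    ("Phenom II X4 965", 1202, 32.1),
    ("Core i7 920",      1364, 46.9),
]

# steepest origin-slope first = best value first
ranked = sorted(systems, key=lambda s: s[2] / s[1], reverse=True)
for name, cost, frames in ranked:
    print(f"{name:18s} {1000 * frames / cost:5.1f} fps per $1000")
```

Running the sort reproduces the ordering described above: i7 920 first, then the 965, 945, 720, and Athlon 250.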

Of course you can only see this trend if you look at a game under severely CPU-bottlenecked conditions. You can do that with one of the following:
1 - benchmarking an RTS game, which always uses more CPU than GPU
2 - using 800x600 resolution to force the game into using more CPU than GPU
 

ShawnD1

Lifer
May 24, 2003
Originally posted by: apoppin
well, in *this case* the data is skewed because of being from one very non-standard bench that was evidently cherry picked to show *eXtreme differences* between CPUs in a single RTS at a middle resolution without AA/AF.
That link has 4 separate games tested and they all show the exact same thing. Anno is a real-time strategy game, Far Cry 2 is a first-person shooter, GTA 4 is a sandbox game, and Grid is a racing game. These are 4 completely different types of PC game.

You also have to factor in the hours spent playing PC games over the course of the HW
- 1 hour a week and it is not a gaming PC at all
I don't understand what you mean by this.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: apoppin

well, in *this case* the data is skewed because of being from one very non-standard bench that was evidently cherry picked to show *eXtreme differences* between CPUs in a single RTS at a middle resolution without AA/AF.

You also have to factor in the hours spent playing PC games over the course of the HW
- 1 hour a week and it is not a gaming PC at all

What makes an upgrade compelling or worthwhile is your dissatisfaction with your current HW
- then, if it is satisfactory, it is priceless .. there is no way to factor in frustration with inferior HW as a cost

Don't get off-message on me there apoppin, I'm not making any judgment calls on the conclusions drawn from the data, nor am I making any judgment calls on the validity of the method by which the data were derived...I am just saying I personally like to see "cost of the entire system" versus "performance derived from the entire system" when performance/cost numbers are being discussed.

I would like to see GPUs evaluated this way too. In addition to the usual screen size vs. fps scaling graphs, it would be nice to see system cost with varying GPU SKUs versus the fps you get from that gpu-enabled rig (pick a beefy cpu like a PhII 965 or i7 975 so you know you are always gpu-limited, if you like) while holding all other components the same.

Then draw a horizontal line somewhere above 60fps, plot min-fps, and tell people "below this line you will notice your system being slow; anywhere above this line the benchmark differences are imperceptible to most users". That way, even though an IGP-based system will turn out impressive fps/$ stats, the graphs will highlight the very obvious fact that it fails to meet minimum fps requirements at any reasonable screen rez or AA/AF/etc.
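The 60 fps cutoff idea is easy to sketch; the configurations and minimum-fps numbers below are purely hypothetical, just to show the filtering step such a chart would encode:

```python
# Sketch: flag configurations whose minimum frame rate falls below a
# playability threshold. All names, costs, and fps values are hypothetical.
THRESHOLD = 60.0

configs = [
    ("IGP rig",        310, 24.0),   # (name, system cost in $, min fps)
    ("Mid-range GPU",  780, 58.0),
    ("GTX 285 rig",   1100, 72.0),
]

for name, cost, min_fps in configs:
    verdict = "smooth" if min_fps >= THRESHOLD else "noticeably slow"
    print(f"{name}: {min_fps:.0f} fps min at ${cost} -> {verdict}")
```

Even though the cheap IGP rig would post impressive fps/$ numbers, it lands below the line, which is exactly the point being made.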
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: ShawnD1
The more expensive processors (generally) get the best frame rate relative to the cost of the entire platform, and you won't need to upgrade as often; that i7 should last at least 3-4 years but an Athlon II will be lucky to get even 2 years before becoming totally useless.

This message I think gets lost in the pages and pages of gpu and cpu review data that get churned out when hot hardware is released. I like how you state it: spot on and easy to understand.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Idontcare
Originally posted by: apoppin
Originally posted by: Idontcare
Originally posted by: ShawnD1
Originally posted by: Viditor
Originally posted by: ShawnD1
Many of us don't care about practical tests because many of the games we'll be playing don't exist yet. A $100 processor and $300 processor will have very similar frame rates in today's games (Perseus Mandate), so the question is really how well they'll play tomorrow's games. Since we can't test games that don't exist, the best we can do is make impractical 800x600 tests to see how the fast the processors are in a CPU bound scenario. 2-3 years from now, buying a new video card for your old i7 or Phenom II system will create the same CPU limitations as an 800x600 test done today.

Huh?
New video cards are becoming LESS dependent on CPUs, not MORE

If this is true, we should be able to look at random games and see next to no correlation between how much you pay for a modern processor and how well it runs modern games.
frame rates for some ubisoft RTS
$76 Athlon 250 - 18.7 fps
$119 Phenom X3 720 - 23.3
$189 Phenom X4 940 - 28.6
$245 Phenom X4 965 - 32.1

Now what we do is take those prices and frame rates and we put them into Microsoft Excel. Excel will show us if there's a correlation and more importantly it will show if there's a point of "diminishing returns" where it's no longer worth getting a faster CPU.
shawn's CPU cost and frame rate correlation graph of science

It's uncanny how linear it is, and that shows how CPU really is a bottleneck in modern games. That bottleneck gets more and more apparent when you expect the processor to last a few years. For example, looking at anandtech bench for FC2, E6550 and E2140 came out the same year but one has a 65% higher frame rate than the other. That right there is the definition of CPU bottleneck in a modern game. If you're one of the suckers who bought the E2140 because games were not CPU bottlenecked at that time, you'd end up buying a new system after maybe 1-2 years. Not because you want to, but because you need to.

You really should care about the possibility of CPU bottlenecking when you buy a new system. People talk about graphics bottlenecks but I can always get around that by turning the graphics down. Fallout 3 will kill a GTX 280 if all the settings are maxed out, but it runs great on a 7950GT if I turn the settings down. CPU is not like that; there's no way out. If the game wants to run this much stuff, that's what it's going to do. The only CPU setting I've seen in any recent game is draw distance in WoW and GTA. If the game still runs like garbage after I've turned that down, I'm totally screwed, I need a whole new system.

I like what you did there with the data. If I could make one unsolicited suggestion, with the hope of offending no one, it would be that you really can't/shouldn't compare the performance of the system to the varying cost of just a single underlying component of that system.

Your cost vs. fps graph should be for the cost of the system used to generate the fps.

As an example, a hypothetical $200 cpu does not need to double performance over that of a $100 cpu in order to make the performance/cost of the system compelling in favor of the $200 cpu.

In your case the linear correlation will be maintained after adjusting for total system cost (which is just a fixed adder to the cpu prices): the slope of the fitted line stays the same and only the y-intercept shifts, but the effect on the performance-per-dollar ratios is where it really shows up.
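To illustrate with Shawn's numbers: a rough Python sketch, assuming a made-up $400 for everything in the box besides the CPU, shows the fitted slope is untouched by the fixed adder while the per-dollar ratios shift in favor of the pricier chips:

```python
# Fold a fixed platform cost (mobo, RAM, PSU, etc. -- the $400 here
# is a made-up figure, not from the thread) into the CPU prices and
# compare the fits and the fps-per-dollar ratios.
BASE = 400  # hypothetical cost of everything except the CPU

cpu_prices = [76, 119, 189, 245]
fps        = [18.7, 23.3, 28.6, 32.1]
sys_prices = [p + BASE for p in cpu_prices]

def fit(xs, ys):
    """Least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

s1, b1 = fit(cpu_prices, fps)
s2, b2 = fit(sys_prices, fps)
# Slope (extra fps per extra dollar) is identical; only the intercept
# shifts, by slope * BASE.
print(f"cpu-only fit:  slope={s1:.4f}, intercept={b1:.2f}")
print(f"whole-system:  slope={s2:.4f}, intercept={b2:.2f}")

# fps per dollar of *system*: the cheap CPU's apparent advantage shrinks.
for p, f in zip(sys_prices, fps):
    print(f"${p}: {f / p * 100:.1f} fps per $100")
```

With the assumed $400 base, the most expensive CPU actually delivers more fps per system dollar than the cheapest one, which is exactly why a $200 cpu doesn't need to double the performance of a $100 cpu to be the better buy.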

well, in *this case* the data is skewed because of being from one very non-standard bench that was evidently cherry picked to show *eXtreme differences* between CPUs in a single RTS at a middle resolution without AA/AF.

You also have to factor in the hours spent playing PC games over the life of the HW
- 1 hour a week and it is not a gaming PC at all

What makes an upgrade compelling or worthwhile is your dissatisfaction with your current HW
- and if it is satisfactory, it is priceless ... there is no way to factor in frustration with inferior HW as a cost

Don't get off-message on me there apoppin, I'm not making any judgment calls on the conclusions drawn from the data, nor am I making any judgment calls on the validity of the method by which the data were derived...I am just saying I personally like to see "cost of the entire system" versus "performance derived from the entire system" when performance/cost numbers are being discussed.

I would like to see GPUs evaluated this way too: in addition to the usual screen size vs. fps scaling graphs, it would be nice to see system cost with varying GPU SKUs versus the fps you get from that GPU-enabled rig (pick a beefy CPU like a PhII 965 or i7 975 so you know you are always GPU-limited, if you like) while holding all other components the same.

Then draw a horizontal line at somewhere above 60fps and plot min-fps and tell people "below this line you will notice your system being slow, anywhere above this line and the benchmark differences are imperceptible to most users". That way even though an IGP-based system will turn out impressive fps/$ stats the graphs will highlight the very obvious fact that they fail to meet minimum fps requirements at any reasonable screen rez or AA/AF/etc.

You are not talking about reviews, you are talking about creating an interactive database that is regularly updated to factor in changing price
- i'd love one too

i am just saying what you are asking for is unrealistic; no "off message"
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Originally posted by: Idontcare
Originally posted by: ShawnD1
The more expensive processors (generally) get the best frame rate relative to the cost of the entire platform, and you won't need to upgrade as often; that i7 should last at least 3-4 years but an Athlon II will be lucky to get even 2 years before becoming totally useless.

This message I think gets lost in the pages and pages of gpu and cpu review data that gets churned out when hot hardware is released. I like how you state it, spot on and clear to understand.

It's sensible, but really only applicable to gamers / bleeding-edge types. A 4-year-old AXP 3200+, Opteron 144, or ~3GHz P4 with 1-2GB of RAM or so is still more than enough for a lot of people. Good for reports, email, MS Office, YouTube, MP3s, most video (depending on the card, obviously), and light gaming.

I would certainly hope and expect an i7 system will still be viable in 3-4 years, but it probably won't be pushing top games at that time. It will certainly outlast current PhII's like mine though.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
Dude... the i7 has a lower clock speed and a lower transistor count; if it comes out ahead in a game benchmark, it's because the benchmark is rigged lol! By the way, architecture is not a magic wand that conjures speed out of thin air lol... maybe it helps lol... but it is not magical...
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Viditor
Originally posted by: Idontcare


Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?

You speak of the famed "Osborne effect"...though according to Osborne it was actually a myth, it's a myth that persists to this day.
Quoting the wiki:

"In 1983, inventor Adam Osborne pre-announced several next-generation computer models (the "Executive" and "Vixen" models), which had not yet been built, highlighting the fact that they would outperform the existing model. According to the myth, sales of the Osborne 1 immediately plummeted as customers opted to wait for the more advanced systems, leading to a sales decline from which Osborne Computer was unable to recover."


And then there's the counter-example of Microsoft in the era of OS/2. Back when the as-yet unreleased Microsoft Windows 3 was a paper competitor to the already-shipping OS/2, Microsoft did all it could to convince prospective purchasers to delay by promising pure awesome. When the product was finally ready to sell, the customers were more than ready to buy.

Oracle was another company taking pages out of the same playbook in the early days. Sales guys were instructed to sell the sky (coming in the next version) if that's what it took to keep the customer from buying Ingres today.

I'm not sure how this relates to AMD in 2009, but there you go. Cannibalizing your current sales makes sense if you view your product as unsaleable when compared to the competition.
 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
I had a Phenom II, I went i7. I don't regret it one bit.
Better in gaming or not, the i7 is an all-around better CPU/platform. It costs more, but that's why it's better, duh.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: v8envy
Originally posted by: Viditor
Originally posted by: Idontcare


Isn't it some marketing golden rule that you never actively discredit or undermine the viability of your own current product-line by touting and hyping up the stuff you won't/can't sell for months or years to come?

You speak of the famed "Osborne effect"...though according to Osborne it was actually a myth, it's a myth that persists to this day.
Quoting the wiki:

"In 1983, inventor Adam Osborne pre-announced several next-generation computer models (the "Executive" and "Vixen" models), which had not yet been built, highlighting the fact that they would outperform the existing model. According to the myth, sales of the Osborne 1 immediately plummeted as customers opted to wait for the more advanced systems, leading to a sales decline from which Osborne Computer was unable to recover."


And then there's the counter-example of Microsoft in the era of OS/2. Back when the as-yet unreleased Microsoft Windows 3 was a paper competitor to the already-shipping OS/2, Microsoft did all it could to convince prospective purchasers to delay by promising pure awesome. When the product was finally ready to sell, the customers were more than ready to buy.

Oracle was another company taking pages out of the same playbook in the early days. Sales guys were instructed to sell the sky (coming in the next version) if that's what it took to keep the customer from buying Ingres today.

I'm not sure how this relates to AMD in 2009, but there you go. Cannibalizing your current sales makes sense if you view your product as unsaleable when compared to the competition.

The premise of your argument relies on the company hyping its next-gen product having little sales opportunity from its current product, hence it is trying to steal sales from the competition by delaying the customer from making a purchase.

(and that delay at best will be a few months, not years as AMD fusion is headed)

I agree this is a common sales tactic, but to reiterate: this tactic implicitly does not cannibalize sales at the company doing the hyping, because the very reason it is invoked is that sales of their current product are already in the dumper due to some intrinsic inferiority relative to the competitor's product, which is selling well.

There is no "cannibalizing your current sales" when your choices are (1) have no sales because the competition is stealing them already with superior product already in the market, or (2) have no sales because you hype next gen product in an attempt to reduce your competition's sales.

But it seems weird to have 20% market share and non-zero sales, and yet your marketing department's first foot forward is that everything you buy from them today is nothing, because the future is a product you can't buy for 2 years yet, unless you want to buy it from their competitor, in which case you can get it in about 3 months.

I'm not a marketing type person though, so I am sure it makes perfect sense at some level once the rationale is thoroughly explained. At the moment it's just beyond me, but it's humorous, and I continue to laugh every time I see that logo with "The Future is Fusion" proudly displayed at the bottom of a marketing slide hyping the current non-fusion product line-up.
 