I think the true conclusion from this article is that games are STILL not multi-threaded enough.
Being the PC gamer that I am, I couldn't care less about not having the fastest CPU around (as long as it's fast enough), but not having a fast GPU setup is another story. Besides, it's much faster and easier (not to mention more rewarding for a gamer) to switch a video card than it is to switch a CPU...
Being able to play two or more games at once is also nice.
What games are those? Chess? Checkers? lol
That is one of the downsides to the "synthetic" core-count tests that THG does.
I love the trick they do; it's crafty and downright easy, but yes, unfortunately it does give you an oranges-to-apples comparison because of all the uncore stuff that doesn't scale correctly with the core-count manipulation they employ.
Don't forget multiboxing. A quad core with 8GB of RAM is a must-have for anyone running multiple current MMORPG instances on 1 comp.
Lol, I play Flash-based games on my other monitor while I'm stuck in a raid group. You know how raids are...
Therefore, in that selection of games at those settings, a dual-core + GTX480 would provide far better performance overall than a quad-core + GTX460, indicating yet again that the graphics card is the most important part of the gaming equation.
This is what I’ve been saying for quite some time, and mirrors my own findings.
Read the whole article before posting, Russian.
Furthermore, most people keep 1 CPU for every 2-3 GPU generation swaps. As a result, it's even more important that, at the very least, your CPU will suffice for 2-3 of these swaps. That's why it makes more sense to spend another $50 to get a quad-core today than to settle for a dual-core on the grounds that it's sufficient. Remember what happened to A64 4800+ users vs. A64 X2 3800+ users? The former became all but worthless.
What is being claimed is that a mid-range CPU + high-end GPU is better for gaming overall than a high-end CPU + mid-range GPU.
I used a real dual-core (E6850) and I saw similar results against my i5 750.
This is what I've been saying for a while as well. People who insist that going from a setup such as an E6600 / 8800 GTS 320MB to an i5-750 / 8800 GTS 320MB would be more beneficial than going to an E6600 / GTX 460 are mostly mistaken. CPUs tend to last a lot longer than graphics cards, especially considering their overclockability as of late.
Patrick, that's the most worthless article I've ever read. They are using 1920x1080 8AA on a GTX 460 768MB card with a Core i5 @ 4.0GHz to show that no CPU limitation exists. Why didn't they just use a GTS 450 or an ATI 5750? :hmm:
CPU limitation articles should be done with a wide range of videocards, including the fastest ones. TH also failed to recognize that AMD and NV videocards don't react the same way to CPU limitations.
For example, http://www.xbitlabs.com/articles/cpu/display/cpus-and-games-2010_4.html
1920x1200 4AA
Civilization 5
Starcraft 2
Splinter Cell Conviction
I'll explain the flaw in the argument presented.
Let's take GTA IV at 1280x1024 0AA with a GTX 280:
C2Q @ 3.6ghz = 38 avg / 31 min
E6850 @ 3.0ghz = 25 avg / 21 min
Now BFG would always argue that no one will ever use 1280x1024 0AA. Ok fair enough, but this is what happens in the real world:
Ok now, let's say I add a GTX480 into the 2 systems. I can increase AA, resolution, and still get the same 38 avg if I wanted to on the C2Q @ 3.6ghz, since I'll be transferring the load to the GPU. My CPU can still support 38 fps avg. Therefore, I'll be able to increase image quality and still maintain decent playability, or I will get faster framerates than 38 fps on a faster CPU at the same image quality (if I don't have a CPU limitation). Now if I add a GTX480 into the E6850 rig, it's still choppy and unplayable. I am already at < 30 fps without AA, with minimums at 21!
The 2nd system is so slow it will only gain "free AA," not any more playability. I'm not going to get faster frames at the same image quality settings either, because I am CPU limited. Does it matter that you could crank 4AA/8AA on the E6850 with a GTX 480? Not really, since the frames are too low.
You can always reduce a GPU limitation by reducing some AA or in-game quality settings. The minute you become CPU limited, you are done. There is nothing you can do to improve playability.
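The reasoning above can be sketched as a toy bottleneck model. This is my own simplification, not anything from the article: treat effective framerate as the minimum of a CPU-side cap and a GPU-side cap, where raising resolution/AA lowers the GPU cap but leaves the CPU cap untouched. The specific cap numbers below are just the GTA IV figures quoted earlier, used for illustration.

```python
# Toy model (an assumption for illustration, not the article's method):
# effective fps ~= min(cpu_cap, gpu_cap). Cranking AA/resolution lowers
# gpu_cap, but cpu_cap stays fixed no matter what GPU you install.

def effective_fps(cpu_cap, gpu_cap):
    """Framerate is limited by whichever component hits its ceiling first."""
    return min(cpu_cap, gpu_cap)

# C2Q @ 3.6GHz (CPU cap ~38 fps) + a much faster GPU: still 38 avg,
# so the extra GPU headroom can be spent on image quality.
print(effective_fps(38, 60))  # prints 38

# E6850 @ 3.0GHz (CPU cap ~25 fps) + the same faster GPU: the CPU wall
# means "free AA" only -- playability doesn't improve.
print(effective_fps(25, 60))  # prints 25
```

The asymmetry is the whole point: lowering settings raises the GPU cap and can rescue a GPU-limited system, but nothing you do in-game raises the CPU cap.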
This isn't a CPU limitation article per se. It's a core-count limitation article.
The thing is, they probably test in a "best-case scenario," controlled environment with all background applications disabled. For example, let's say you have anti-virus always on, a couple of Excel/Word documents open that you've been working on, and a couple of browsers open with multiple tabs (like 20-30). Then you decide to take a break and play a game.
What I got out of the article is that more cores are not necessary... if you are on a limited budget and making the choice of "do I upgrade GPU or CPU," then these kinds of critical reviews are invaluable in helping make decisions.
I found that interesting to digest.
So, for the CPU "size matters" crowd, a scenario. At what point will the CPU make a GTX-460 run as fast in games as a GTX-480?
1. If I, for example, have an E8200 @ 3.6GHz running 1280*1024, would the 460 be just as fast as the 480 for me? I realize that the 460 is all I'd need, but that's not the question. Would it be just as fast? Or would the 480 still run faster?
2. What about @ 1920*1200? Would an i7-930 @ 3.6GHz + GTX 460 be faster than the E8200 @ 3.6GHz + GTX 480?
3. At what point does the CPU just become too crappy to justify a top card?
The article doesn't really help anyone who plans on upgrading, imo. Most people who are looking to upgrade the CPU are already using dual cores (the C2D 1.86 - 3.0GHz variety, E6300 - E8400) or low-end quads at stock speeds (such as the Q6600/Q6700). These people are wondering whether upgrading the GPU from their 4850/4870/GTX 260 is worth it, or whether they'll be bottlenecked by their slower CPUs. In other words, it would be completely wasteful to get a GTX 480 for a stock E6600/Q6600, since such systems would produce almost identical framerates with a slower GTX 460/5850 videocard.
From that perspective, the article did little to help these users decide what to upgrade. It would have been far better to see various systems such as C2D 1.86, C2D 3.0ghz, Core i3/i5 @ 4.0ghz, Athlon X4s + 4850 compared to the same CPUs with GTX460/480/5870 and then tested SLI/CF setups too. Then we would have seen which GPUs are wasteful for which CPUs and what's the minimum modern CPU clock speed/core count for modern games (not just FPS variety either).
Plus, you can't compare an i5 dual-core processor to a dual-core Phenom or C2D processor due to the differences in performance per clock and the effects of the shared 8MB cache. And like I said, they didn't include minimums in most of their graphs - the CPU plays a large role in minimum framerates.
This article would have been great if Xbitlabs, LegionHardware, PCgameshardware and Techspot hadn't already produced far superior CPU/GPU articles. However, since the results from these websites constantly contradict the predominant view on our forum that CPU speed is not important, I only see Toyota, myself, and a handful of others linking to them (with BFG on many occasions ignoring results from all 4 of those websites because they show both CPU frequency and core-count dependence in a large variety of games, and because they focus on minimum framerates - a metric BFG largely dismisses as 'inaccurate').
So we have 4 independent sources which continue to show that CPU speed is important and 1 source that shows that it isn't (on top of that using a $130 videocard paired with a $200 CPU to prove their point). It's almost the same as Wreckage trying to find 1-2 outlier benchmarks where a stock GTX460 beat an HD5870 and then claiming that GTX460 is as fast as an HD5870. Bottom line is, every game is different. The games one plays should be tested separately in order for us to answer if CPU or GPU is more important for a particular game.
3DVagabond said: So, for the CPU "size matters" crowd, a scenario. At what point will the CPU make a GTX-460 run as fast in games as a GTX-480?
1. If I, for example, have an E8200 @ 3.6GHz running 1280*1024, would the 460 be just as fast as the 480 for me? I realize that the 460 is all I'd need, but that's not the question. Would it be just as fast? Or would the 480 still run faster?
2. What about @ 1920*1200? Would an i7-930 @ 3.6GHz + GTX 460 be faster than the E8200 @ 3.6GHz + GTX 480?
3. At what point does the CPU just become too crappy to justify a top card?
The CPU utilization numbers shown in the games are also quite telling and informative.