Tom's Hardware: CPU/GPU Bottlenecks


Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
I think the true conclusion from this article is that games are STILL not multi-threaded enough.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I think the true conclusion from this article is that games are STILL not multi-threaded enough.

Or that the CPU portion of gaming itself is adequately handled by the processing power of a 3+GHz dual-core CPU, and increasing the multi-threaded aspects of the game would still not net a tangible improvement in gameplay because the GPU is the bottleneck.

That was my take-away when looking at the CPU vs GPU utilization rates.

If the GPU utilization rate is 95-100% then it doesn't really matter whether the game is single-threaded, dual-threaded, or can take advantage of a Magny-Cours CPU...it needs more GPU before the CPU side of the equation becomes the bottleneck.
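That utilization-based reasoning can be written down as a simple heuristic. This is only a sketch; the threshold and the readings are made-up illustrations, not output from any real monitoring tool:

```python
def likely_bottleneck(gpu_util, cpu_util_per_core, threshold=95):
    """Rough heuristic: whichever component is pegged near 100% is the limiter."""
    if gpu_util >= threshold:
        return "GPU"
    if max(cpu_util_per_core) >= threshold:
        return "CPU"  # at least one core saturated, e.g. a single-threaded game
    return "neither"  # frame cap, vsync, or some other stall

# A 95-100% GPU reading means a faster CPU (or more cores) won't help:
print(likely_bottleneck(98, [60, 40, 15, 10]))  # GPU
```

The per-core list matters: a lightly threaded game can be CPU-limited while total CPU usage looks low, because one core is maxed while the rest idle.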
 

solofly

Banned
May 25, 2003
1,421
0
0
Being the PC gamer that I am, I couldn't care less about not having the fastest CPU around (as long as it's fast enough), but not having a fast GPU setup is another story. Besides, it's much faster and easier (not to mention more rewarding for a gamer) to swap a video card than it is to swap a CPU...
 
Last edited:

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Being the PC gamer that I am, I couldn't care less about not having the fastest CPU around (as long as it's fast enough), but not having a fast GPU setup is another story. Besides, it's much faster and easier (not to mention more rewarding for a gamer) to swap a video card than it is to swap a CPU...

Are you sure about that?

I love being able to run many VMs and game at the same time. Being able to play two or more games at once is also nice.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
That is one of the downsides to the "synthetic" core-count tests that THG does.

I love the trick they do; it's crafty and downright easy, but unfortunately it does give you an apples-to-oranges comparison because of all the uncore stuff that doesn't scale correctly with the core-count manipulation they employ.

I agree using actual CPUs would be more realistic and helpful to potential buyers. But the point of the article is to keep all variables the same and look solely at core count and how it affects performance.

Though some of their results were stupid. Who's going to play JC2 at an average of 34 fps? They should've thrown in 5870 + CF results. And why not report minimums for all games?
 
Last edited:

faxon

Platinum Member
May 23, 2008
2,109
1
81
Lol, I play Flash-based games on my other monitor while I'm stuck in a raid group. You know how raids are...
Don't forget multiboxing. A quad core with 8GB of RAM is a must-have for anyone running multiple current MMORPG instances on one comp.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Therefore, in that selection of games at those settings, a dual-core + GTX480 would provide far better performance overall than a quad-core + GTX460, indicating yet again that the graphics card is the most important part of the gaming equation.

This is what I’ve been saying for quite some time, and mirrors my own findings.

No one argued that the graphics card is not the most important part with a balanced core i5/i7/Phenom 2 system.

But if you play games like Civ5, SC2, Dragon Age: Origins, Far Cry 2, World in Conflict, Arma2, or RE5 and get a Q6600 + GTX480, then a Core i7 860/920 + GTX470 system will destroy it. That's the whole discussion about CPU limitation - just because you have the fastest graphics card doesn't mean it can beat a more balanced system. Sorry man, most people here don't use super-sampling to force a 100% GPU limitation in games. Since all you play are FPS games, I can understand your point of view. However, anyone who plays strategy games or flight sims will disagree that the CPU doesn't matter. It depends on the game.



Wow!! I want 5 min of my life back.

They tested AvP at 1920x1080 8AF and Anno1404 at 1920x1080 8AA! on a GTX460 768mb and then measured CPU limitation on a Core i5 4.0ghz lol! Are they nuts? A 768mb GTX460 can't play AvP with tessellation smoothly regardless of what CPU it's paired with. That would be the same as using an 8800GT at 1920x1080 Enthusiast settings in Crysis and then arguing that changing CPU speed from 4.0ghz to 2.0ghz made no difference - we already know the conclusion before testing!

Plus no minimum framerates, which makes any CPU limitation article worthless.

Let me see: we have Tom's Hardware with the worst CPU limitation article vs.

TechSpot, LegionHardware, Xbitlabs, PCGameshardware - all showing that CPU limitations exist (each single one of them using CPUs with the Fastest Graphics cards and showing minimum framerates).

Once they test a Core i5 @ 2.0 --> 4.0ghz and Core i3 dual core @ 2.0 --> 4.0ghz paired with a 5870/GTX480 and 5870 CF/GTX480 SLI setups, then we can have a discussion. Testing a $130 videocard for a CPU limitation article is just wrong.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Read the whole article before posting, Russian.

Patrick, that's the most worthless article I've ever read. They are using 1920x1080 8AA on a GTX460 768mb card with a Core i5 @ 4.0ghz to show that no CPU limitation exists. Why didn't they just use a GTS450 or an ATI 5750? :hmm:

CPU limitation articles should be done with a wide range of videocards, including the fastest ones. TH also failed to recognize that AMD and NV videocards don't react the same way to CPU limitations.

For example, http://www.xbitlabs.com/articles/cpu/display/cpus-and-games-2010_4.html

1920x1200 4AA
Civilization 5
Starcraft 2
Splinter Cell Conviction

I'll explain the flaw in argument presented.

Let's take GTAiv at 1280x1024 0AA with GTX280
C2Q @ 3.6ghz = 38 avg / 31 min
E6850 @ 3.0ghz = 25 avg / 21 min

Now BFG would always argue that no one will ever use 1280x1024 0AA. Ok fair enough, but this is what happens in the real world:

Ok now, let's say I add a GTX480 into the 2 systems. I can increase AA, resolution, and still get the same 38 avg if I wanted to on the C2Q @ 3.6ghz, since I'll be transferring the load to the GPU. My CPU can still support 38 fps avg. Therefore, I'll be able to increase image quality and still maintain decent playability, or I will get faster framerates than 38 fps on a faster CPU at the same image quality (if I don't have a CPU limitation). Now if I add a GTX480 into the E6850 rig, it's still choppy and unplayable. I am already at < 30 fps without AA, with minimums at 21!

The 2nd system is so slow, it will only gain "free AA" but not any more playability. I am not going to get faster frames at the same image quality settings either, because I am CPU limited. Does it matter that you could crank 4AA/8AA on the E6850 with a GTX480? Not really, since the frames are too low.

You can always reduce a GPU limitation by reducing some AA or in-game quality settings. The minute you become CPU limited, you are done. There is nothing you can do to improve playability.
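The asymmetry described here, where GPU load is tunable but a CPU limit is a hard cap, can be shown with a toy frame-time model. The numbers are purely illustrative; the only assumption is that each frame must be both simulated on the CPU and rendered on the GPU:

```python
def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # The slower of the two stages sets the pace of the whole frame.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A CPU needing 40 ms per frame caps the system at 25 fps:
print(effective_fps(cpu_ms_per_frame=40, gpu_ms_per_frame=20))  # 25.0

# Lowering AA halves the (made-up) GPU time, but the cap doesn't move:
print(effective_fps(cpu_ms_per_frame=40, gpu_ms_per_frame=10))  # 25.0

# A faster GPU only helps while the GPU is the slower stage:
print(effective_fps(cpu_ms_per_frame=10, gpu_ms_per_frame=20))  # 50.0
```

This is exactly the GTX480-in-the-E6850-rig scenario: the extra GPU headroom buys "free AA" but not a single extra frame.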

Furthermore, most people keep 1 CPU for every 2-3 GPU generation swaps. As a result, this makes it even more important that at the very least your CPU will suffice for 2-3 of these swaps. That's why it makes sense to spend another $50 to get a Quad-core today than to state that a dual-core is sufficient. Remember what happened to A64 4800+ users vs. A64 X2 3800+ users? The former became all but worthless.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
This is a bit off topic, but I recall reading that Windows 7 does a better job than Windows XP of taking advantage of multicore CPUs. (I'm not sure where Vista fits in.) Does anyone have a link to an article about how much that affects gaming applications specifically?

Furthermore, most people keep 1 CPU for every 2-3 GPU generation swaps. As a result, this makes it even more important that at the very least your CPU will suffice for 2-3 of these swaps. That's why it makes sense to spend another $50 to get a Quad-core today than to state that a dual-core is sufficient. Remember what happened to A64 4800+ users vs. A64 X2 3800+ users? The former became all but worthless.

I completely agree with Russian as to CPU importance due to upgrade cycles. Like I said earlier in the thread, it's a lot more hassle to upgrade CPU/mobo/RAM than GPU, especially with Intel's rapid switching from socket to socket--and now even AMD is moving on to a different socket for Bulldozer.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What is being claimed is that a mid-range CPU + high-end GPU is better for gaming overall than a high-end CPU + mid-range GPU.

Who claimed that? I have never seen anyone on this forum who ever recommended a Core i7 960 + GTX460 over a Core i5 750 (mid range $200 CPU) + GTX480 (high-end GPU). Have you?

I used a real dual-core (E6850) and I saw similar results against my i5 750.

That's because most of the games you play are FPS @ 2560x1600 + 8x MSAA or 16xQ CSAA to the moon! Basically the type of settings maybe 2% of gamers with a single GPU use. However, the most important factor in all of your conclusions is the types of games you test your hardware with:

Quake 1-4
Doom 3
Far Cry 1
Serious Sam 2
Fear 1
Quake Wars
HL2
Unreal Tournament 99, 2004
Call of Duty 1, 2
Return to Castle Wolfenstein
Medal of Honor Pacific Assault

Can you tell me a single one of these games which supports more than 2 CPU cores? Of course the only differences you will see come from per-clock CPU performance/CPU frequency, not # of cores. None of these games has a modern engine that can take advantage of multiple cores.


Also, who besides you buys a GTX480 for these games? Just like you are a huge fan of ancient FPS games, I am a huge fan of ancient strategy games. Let's say I ran another article where I tested Starcraft 1, Age of Empires 1-3, Age of Mythology, Warcraft 2-3, and Brood War on my laptop with an X4500HD integrated GPU. At the end of my article I could safely conclude that GPU performance above Intel integrated doesn't matter for gamers. That wouldn't contradict my findings, based on the games I tested.

Anyone who has ever played other games such as Arma2, GTAIV, Supreme Commander 2, Starcraft 2, Civ5, DA:O at a more realistic 4AA 1920x1080 will attest to the fact that E6850+GTX480 can't provide the same playability as a Core i5 @ 3.8+ghz + GTX480.

Even Far Cry 2, Resident Evil 5, and GTAIV show massive differences. My Q6600 @ 3.4ghz would max out at about 55 fps at 1920x1080 8AA, while the i7 860 @ 3.9ghz does 80-90 fps with ease in RE5.
 
Last edited:

CurseTheSky

Diamond Member
Oct 21, 2006
5,401
2
0
This is what I've been saying for a while as well. People who insist that going from a setup such as E6600 / 8800 GTS 320MB to i5-750 / 8800 GTS 320MB would be more beneficial than going to E6600 / GTX 460 are mostly mistaken. CPUs tend to last a lot longer than graphics cards, especially considering their overclockability as of late.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is what I've been saying for a while as well. People who insist that going from a setup such as E6600 / 8800 GTS 320MB to i5-750 / 8800 GTS 320MB would be more beneficial than going to E6600 / GTX 460 are mostly mistaken. CPUs tend to last a lot longer than graphics cards, especially considering their overclockability as of late.

As far back as I can remember, no one ever disagreed that the GPU is the most important component for gaming for the majority of games.

Even in that 2005 thread, most of us came to the conclusion that as long as you have a modern CPU, you are good to go for games, and that the videocard is more important.

While the GPU is still more important today, with today's more modern game engines a well-balanced CPU+GPU system can make a world of difference in some games.
 
Last edited:

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
Patrick, that's the most worthless article I've ever read. They are using 1920x1080 8AA on a GTX460 768mb card with a Core i5 @ 4.0ghz to show that no CPU limitation exists. Why didn't they just use a GTS450 or an ATI 5750? :hmm:

CPU limitation articles should be done with a wide range of videocards, including the fastest ones. TH also failed to recognize that AMD and NV videocards don't react the same way to CPU limitations.

For example, http://www.xbitlabs.com/articles/cpu/display/cpus-and-games-2010_4.html

1920x1200 4AA
Civilization 5
Starcraft 2
Splinter Cell Conviction

I'll explain the flaw in argument presented.

Let's take GTAiv at 1280x1024 0AA with GTX280
C2Q @ 3.6ghz = 38 avg / 31 min
E6850 @ 3.0ghz = 25 avg / 21 min

Now BFG would always argue that no one will ever use 1280x1024 0AA. Ok fair enough, but this is what happens in the real world:

Ok now, let's say I add a GTX480 into the 2 systems. I can increase AA, resolution, and still get the same 38 avg if I wanted to on the C2Q @ 3.6ghz, since I'll be transferring the load to the GPU. My CPU can still support 38 fps avg. Therefore, I'll be able to increase image quality and still maintain decent playability, or I will get faster framerates than 38 fps on a faster CPU at the same image quality (if I don't have a CPU limitation). Now if I add a GTX480 into the E6850 rig, it's still choppy and unplayable. I am already at < 30 fps without AA, with minimums at 21!

The 2nd system is so slow, it will only gain "free AA" but not any more playability. I am not going to get faster frames at the same image quality settings either, because I am CPU limited. Does it matter that you could crank 4AA/8AA on the E6850 with a GTX480? Not really, since the frames are too low.

You can always reduce a GPU limitation by reducing some AA or in-game quality settings. The minute you become CPU limited, you are done. There is nothing you can do to improve playability.

Furthermore, most people keep 1 CPU for every 2-3 GPU generation swaps. As a result, this makes it even more important that at the very least your CPU will suffice for 2-3 of these swaps. That's why it makes sense to spend another $50 to get a Quad-core today than to state that a dual-core is sufficient. Remember what happened to A64 4800+ users vs. A64 X2 3800+ users? The former became all but worthless.

This isn't a CPU limitation article per se. It's a core-count limitation article.

Also, it's probably not fair to cherry-pick GTAIV from that article, since it's old; it was done before all those performance patches were released.

I didn't look through the whole Xbit article, but it looks like NV cards tend to scale better with higher frequencies, which doesn't help much here since TH's article was about core count, not frequency. However, one could draw the conclusion that because NV cards scale better with higher frequencies (on Intel CPUs anyway), that makes their cards the optimal choice for a core-count comparison.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This isn't a CPU limitation article per se. It's a core-count limitation article.

The thing is, they probably test in a "best-case scenario", "controlled" environment with all applications disabled. For example, let's say you have anti-virus always on, a couple of Excel/Word documents open that you have been working on, and a couple of browsers open with multiple tabs (like 20-30). Then you decide to take a break and play a game.

You launch the game while keeping everything you were working on still open. Now that's a more realistic real-world gaming comparison for a dual-core vs. quad-core! :awe:

Also, these articles only test at a point in time. IMO, these types of articles may lead less experienced system builders to save $40-50 by skipping a quad-core CPU, which will bite them hard down the line. Just some food for thought.

Take a closer look at what happens when you run out of cores.

A64 4000+ 2.4ghz = 26 fps
A64 FX60 2.0ghz = 48 fps
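Core scaling like this can be reasoned about with Amdahl's law: a second core only helps with the fraction of the frame work that threads well. A quick sketch, where the parallel fractions are assumed, illustrative values rather than measurements of any real engine:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial portion of the work never speeds up,
    # no matter how many cores you throw at the rest.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# If ~90% of the frame work threads well, two cores give ~1.8x,
# enough to outweigh a 2.0GHz vs. 2.4GHz clock deficit:
print(round(amdahl_speedup(0.9, 2), 2))  # 1.82

# But a mostly serial game barely benefits from extra cores:
print(round(amdahl_speedup(0.2, 4), 2))  # 1.18
```

The flip side is the same argument in reverse: when a game's engine is mostly serial, the raw clock speed and per-clock performance dominate, which is why dual vs. quad results vary so much by title.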

I just think that while a dual-core i5 may be satisfactory for the majority of games out there, it may not be so smart to recommend a $100 i3 over a $100 AMD X4 940 or $140 955 on the basis that most games tested ran fine on a dual-core i5 system in that article.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
The thing is they probably test in a "best-case scenario", "controlled" environment with all applications disabled. For example, let's say you have anti-virus always on, have a couple excel/word documents open that you have been working on, a couple browsers open with multiple tabs (like 20-30). Then you decide to take a break and play a game.

Don't you think you are being just a little over-critical of the article, the data, and the purpose it intended to serve?

Of course if you task your computer with a bajillion background tasks then more cores will never hurt your performance.

What I got out of the article is that more cores are not necessary...if you are on a limited budget and making the choice of "do I upgrade GPU or CPU", then these kinds of critical reviews are invaluable in helping make decisions.

I found that interesting to digest.

Personally I'd like to see a comparable article devoted to CUDA vs. CPU with transcoding. When using TMPGEnc should I buy a GTX480 to go with my Q6600 or should I pair my GTX460 with a i7 950?

Anything that involves identifying a compute bottleneck and mitigating it is helpful. The same goes with "SSD and slower CPU or faster CPU with a spindle-drive"...the answer is always app-dependent.

At least THG took a significant stab at adding data to the CPU/GPU balance debate.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So, for the CPU "size matters" crowd, a scenario. At what point will the CPU make a GTX-460 run as fast in games as a GTX-480?

If I, for example, have an E8200 @ 3.6GHz, running 1280*1024 would the 460 be just as fast as the 480 for me? I realize that the 460 is all I'd need, but that's not the question. Would it be just as fast? Or, would the 480 still run faster?

What about @ 1920*1200? Would an i7930 @ 3.6GHz + gtx-460 be faster than the E8200 @ 3.6Ghz + gtx-480? At what point does the CPU just become too crappy to justify a top card?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What I got out of the article is that more cores are not necessary...if you are on a limited budget and making the choice of "do I upgrade GPU or CPU", then these kinds of critical reviews are invaluable in helping make decisions.

I found that interesting to digest.

The article doesn't really help anyone who plans on upgrading, imo. Most people who are looking to upgrade the CPU are already using dual cores (of the C2D 1.86 - 3.0ghz variety - E6300 - E8400) or low-end quads at stock speeds (such as Q6600/6700). These people are wondering if upgrading the GPU from their 4850/4870/GTX260 is worth it, or if they are going to be bottlenecked by their slower CPUs. In other words, it would be completely wasteful to get a GTX480 for a stock E6600/Q6600, as such systems would produce almost identical framerates with a slower GTX460/5850 videocard.

From that perspective, the article did little to help these users decide what to upgrade. It would have been far better to see various systems such as C2D 1.86, C2D 3.0ghz, Core i3/i5 @ 4.0ghz, Athlon X4s + 4850 compared to the same CPUs with GTX460/480/5870 and then tested SLI/CF setups too. Then we would have seen which GPUs are wasteful for which CPUs and what's the minimum modern CPU clock speed/core count for modern games (not just FPS variety either).

Plus, you can't compare an i5 dual-core processor to a dual-core Phenom or C2D processor due to the differences in performance per clock and the effects of the shared 8MB cache. And like I said, they didn't include minimums in most of their graphs - the CPU plays a large role in minimum framerates.

This article would have been great if Xbitlabs, LegionHardware, PCGamesHardware and Techspot hadn't already produced far superior CPU/GPU articles. However, since the results of these websites constantly contradict the predominant view on our forum that CPU speed is not important, I only see Toyota, myself and a handful of others linking to them (with BFG on many occasions ignoring results from all 4 of those websites because they show both CPU frequency and core count dependence in a large variety of games, and they focus on minimum framerates - a metric BFG largely dismisses as 'inaccurate').

So we have 4 independent sources which continue to show that CPU speed is important and 1 source that shows that it isn't (on top of that using a $130 videocard paired with a $200 CPU to prove their point). It's almost the same as Wreckage trying to find 1-2 outlier benchmarks where a stock GTX460 beat an HD5870 and then claiming that GTX460 is as fast as an HD5870. Bottom line is, every game is different. The games one plays should be tested separately in order for us to answer if CPU or GPU is more important for a particular game.

Again, what is so surprising about a GTX460 768mb being the bottleneck of a Core i5 3.0ghz system in games tested at 1920x1080 4/8AA? They mysteriously omitted testing a GTX480/5970, which would have shown the importance of the CPU.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It really depends on the game engine and the resolution. Some engines need CPU power and multiple cores, like Unreal Engine 3, Civ5, and most RTS games and simulators, but for FPS games at those high resolutions with filters on, you mostly need the faster GPU.

Even if you don't see a rise in avg fps, you can see higher minimum fps with a faster CPU, though not all of the time. Below 1920x1080 without filters, the CPU plays an important role and having a faster CPU will always bring more fps.
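The average vs. minimum distinction matters because a single long frame barely moves the average. A small sketch with made-up frame times shows why minimums are worth reporting:

```python
def fps_stats(frame_times_ms):
    # Average fps over the whole run, and the worst-case (minimum)
    # instantaneous fps from the single slowest frame.
    avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    minimum = 1000.0 / max(frame_times_ms)
    return avg, minimum

# Three smooth 10 ms frames plus one 50 ms stutter:
print(fps_stats([10, 10, 10, 50]))  # (50.0, 20.0): a "50 fps avg" run that dips to 20
```

This is the scenario where a faster CPU can lift the minimums (shortening the occasional long, CPU-heavy frame) without changing the headline average much.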
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
So, for the CPU "size matters" crowd, a scenario. At what point will the CPU make a GTX-460 run as fast in games as a GTX-480?

1.If I, for example, have an E8200 @ 3.6GHz, running 1280*1024 would the 460 be just as fast as the 480 for me? I realize that the 460 is all I'd need, but that's not the question. Would it be just as fast? Or, would the 480 still run faster?

2. What about @ 1920*1200? Would an i7930 @ 3.6GHz + gtx-460 be faster than the E8200 @ 3.6Ghz + gtx-480?
3.At what point does the CPU just become too crappy to justify a top card?

1. Wouldn't play any smoother. The 480 would just net you more FPS, unless the game has a cap.
2. E8200 + 480 is faster, unless the game benefits drastically from 3+ cores which is very unlikely.
3. Depends on the game. A better question would be: at what point will developers start optimizing their engines for quad/hex-core CPUs? Xbox 3, PS4 era maybe?
 
Last edited:

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
The article doesn't really help anyone who plans on upgrading, imo. Most people who are looking to upgrade the CPU are already using dual cores (of the C2D 1.86 - 3.0ghz variety - E6300 - E8400) or low-end quads at stock speeds (such as Q6600/6700). These people are wondering if upgrading the GPU from their 4850/4870/GTX260 is worth it, or if they are going to be bottlenecked by their slower CPUs. In other words, it would be completely wasteful to get a GTX480 for a stock E6600/Q6600, as such systems would produce almost identical framerates with a slower GTX460/5850 videocard.

From that perspective, the article did little to help these users decide what to upgrade. It would have been far better to see various systems such as C2D 1.86, C2D 3.0ghz, Core i3/i5 @ 4.0ghz, Athlon X4s + 4850 compared to the same CPUs with GTX460/480/5870 and then tested SLI/CF setups too. Then we would have seen which GPUs are wasteful for which CPUs and what's the minimum modern CPU clock speed/core count for modern games (not just FPS variety either).

Plus, you can't compare an i5 dual-core processor to a dual-core Phenom or C2D processor due to the differences in performance per clock and the effects of the shared 8MB cache. And like I said, they didn't include minimums in most of their graphs - the CPU plays a large role in minimum framerates.

This article would have been great if Xbitlabs, LegionHardware, PCGamesHardware and Techspot hadn't already produced far superior CPU/GPU articles. However, since the results of these websites constantly contradict the predominant view on our forum that CPU speed is not important, I only see Toyota, myself and a handful of others linking to them (with BFG on many occasions ignoring results from all 4 of those websites because they show both CPU frequency and core count dependence in a large variety of games, and they focus on minimum framerates - a metric BFG largely dismisses as 'inaccurate').

So we have 4 independent sources which continue to show that CPU speed is important and 1 source that shows that it isn't (on top of that using a $130 videocard paired with a $200 CPU to prove their point). It's almost the same as Wreckage trying to find 1-2 outlier benchmarks where a stock GTX460 beat an HD5870 and then claiming that GTX460 is as fast as an HD5870. Bottom line is, every game is different. The games one plays should be tested separately in order for us to answer if CPU or GPU is more important for a particular game.

Actually you can look at the article and conclude that CPU speed/core count isn't particularly important for a GTX 460 class video card. That seems like valuable information to me. This means there should be room to upgrade the video card for users running such systems.

The CPU utilization numbers shown in the games are also quite telling and informative.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
3DVagabond said:
So, for the CPU "size matters" crowd, a scenario. At what point will the CPU make a GTX-460 run as fast in games as a GTX-480?

1.If I, for example, have an E8200 @ 3.6GHz, running 1280*1024 would the 460 be just as fast as the 480 for me? I realize that the 460 is all I'd need, but that's not the question. Would it be just as fast? Or, would the 480 still run faster?

2. What about @ 1920*1200? Would an i7930 @ 3.6GHz + gtx-460 be faster than the E8200 @ 3.6Ghz + gtx-480?
3.At what point does the CPU just become too crappy to justify a top card?




1: If we take AvP in DX11 with tessellation enabled, the faster GPU will bring more frames.

2: First of all, both CPUs at 3.6GHz are more than enough for today's games; at that high resolution the GTX480 will score more frames.

3: If you play at low resolutions (1280 and below) with DX9/10 games you need a fast CPU, but in DX11 games (with tessellation enabled) you need the faster GPU.
 