The total failure of SLI


nRollo

Banned
Jan 11, 2002
10,460
0
0
You can waffle and misdirect all you like Bar81.

The fact of the matter is that the developer of Splinter Cell Chaos Theory is not using PS2 at all, PS1.1 for you. If I had told you a month ago that a big game like that would ignore your X800XT PE, you would have barked, "No way!" Yet here it is.

AFAIK, there's no official support of HDR for Far Cry with ATI, and you saying "Without AA it's unacceptable" is a matter of your opinion only. If you're comfortable spending a fair amount of money on a card you can't see high end effects with, that's fine.

PS2 is NOT the standard, and MS determines that, not you. :roll:

Just because stencil shadows cause a frame rate hit is no reason to say they're worthless. Another high end feature you'll never see at ANY framerate because you put your money on 9700Pro rev.2.

Like I've said: you can say the 2002 feature set of the X800s is "good enough" but more and more we're hearing of games coming out with nV40-only features because that is the way of the future, and 2002 was a long time ago. You can bet the rent ATI wishes they had an SM3, SLI-ready part on the market, because they are scrambling and spending millions as I type this trying to get there. If it weren't important, they wouldn't bother.
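For readers keeping score in the PS1.1/PS2.0/PS3.0 argument: the decision being described comes down to which shader paths a title ships and which one it picks at startup from the Direct3D 9 caps. Below is a minimal, hypothetical sketch of that kind of check; it is not code from Splinter Cell Chaos Theory or any other game mentioned in this thread.

```cpp
// Hypothetical D3D9 startup check: pick the highest pixel shader path the
// card reports. A game that only ships PS1.1 and PS3.0 paths would simply
// have no case for PS2.0 hardware here.
#include <d3d9.h>

enum class ShaderPath { FixedFunction, PS11, PS20, PS30 };

ShaderPath PickShaderPath(IDirect3D9* d3d, UINT adapter) {
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return ShaderPath::FixedFunction;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) return ShaderPath::PS30;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) return ShaderPath::PS20;
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) return ShaderPath::PS11;
    return ShaderPath::FixedFunction;
}
```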

 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Keep on hoping and living on that wing and a prayer, Rollo. Okay, so Splinter Cell Chaos Theory supports SM3.0 but not SM2.0. Way to prove your point with that fine isolated example from the hundreds of games released since PS3.0 hit the market.

Nobody said you could do HDR with an ATI card in FarCry. Do you have a reading comprehension problem? Okay, I guess it's cool to have new features that disable useful features that everyone with a high end card uses.

SM2.0 IS the standard because, outside of ONE game, every title that includes shader support includes SM2.0 rather than SM3.0. Get a grip on reality.

Yeah, I'm heartbroken I can't play a practical slide show with that *sweet* feature. I got my card to play at framerates that don't simulate a strobe light thank you very much.

Way to prove your point about the "way of the future": the whole THREE upcoming games that support SM3.0.

Rollo, you really make one convincing argument... wait, no you don't.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Bar81
That's a BIG assumption. If that's the route SCCT's developers go, I highly doubt the entire industry would follow one asinine programming team's decision. In fact, besides SCCT there is NO other game on the horizon that is excluding SM2.0 and including SM3.0
I agree. I don't know if they are: 1) trying to make a political statement, 2) trying to conserve developer resources/costs/time-to-market, 3) lazy, or 4) paid off.

Originally posted by: Rollo
The fact is developers are coding specifically for nV40s, and they are not for R420s. If you throw out HL2, ATI's GITG logo is as rare as Bigfoot, as are their vendor relations.
Sorry, that's the way it is now.
While I'll readily admit that NV's dev-relations are probably better than ATI's - aren't you forgetting something? 3Dc? Normal-map compression would seem to me to be a very useful feature, especially considering what you can do with it. And ignoring ATI's support for Geometry Instancing, only because their cards aren't "full SM3.0 spec", in those cases in which it would be useful, would be a totally asinine dev decision, IMHO.

But there is something to be said for the convenience and economy-of-scale that standards provide, and if the industry moves to adopt SM3.0 as a baseline standard, then ATI will simply have to adapt to that and deal. But that would also mean cutting off nearly the entire installed base of existing consumer gaming PC display hardware, save for those NV early adopters/enthusiasts.

Personally, I think that DX10/WGF 1.0/whateverMScallsIT, may make both of them obsolete in one fell swoop. Wouldn't that be a laugh riot?
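For context on the geometry-instancing point above, this is roughly how D3D9 exposes instancing through SetStreamSourceFreq; a hypothetical sketch, with the vertex and index buffers assumed to be created elsewhere. ATI reportedly exposed the same call path on its SM2.0b parts through a driver capability check, which is why ignoring it on "not full SM3.0" grounds would be an odd call.

```cpp
// Hypothetical D3D9 instancing setup: stream 0 holds the shared mesh,
// stream 1 holds per-instance data, and one draw call renders all copies.
#include <d3d9.h>

void DrawInstanced(IDirect3DDevice9* dev,
                   IDirect3DVertexBuffer9* mesh, UINT meshStride,
                   IDirect3DVertexBuffer9* perInstance, UINT instanceStride,
                   IDirect3DIndexBuffer9* indices,
                   UINT numVertices, UINT numTriangles, UINT numInstances) {
    dev->SetStreamSource(0, mesh, 0, meshStride);
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    dev->SetStreamSource(1, perInstance, 0, instanceStride);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    dev->SetIndices(indices);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVertices, 0, numTriangles);

    // Restore default stream frequencies so later draws behave normally.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
```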
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Bar81
(2) IN FARCRY you must have SM3.0 to implement HDR.
... which was completely a developer decision; there is nothing inherent in SM3.0 that I am currently aware of that is absolutely required to implement HDR, as there are some HDR demos that work on ATI SM2.0(b) cards. (IOW, I'm agreeing with you.)
Originally posted by: Bar81
The blatantly ignorant comment is claiming that no game takes a nose dive when implementing SM3.0 features. Purchase a clue and take a look at Riddick 2.0++ numbers.
I know that you were addressing Gamingphreek here, but ... I'm curious myself why the performance nose-dives like that. Could it be that they started playing with branching in shaders, and used it too much, to the detriment of performance, because it causes pipeline bubbles/stalls/etc.? (Kind of like branches in the P4's NetBurst pipeline?)

I wonder if NV's next part, the one with the rumored 32/24 (virtual/real) pipelines or whatever, really is going to implement a form of SMT in the GPU pipelines, in order to overcome that deficit that heavy use of SM3.0/branching causes?

You know what would really make me LOL? Much like with NV's FX 5900-series parts and their relatively-poor DX9 shader performance - what if the current crop of 6600/6800 cards' SM3.0 performance is actually similar, that when SM3.0 features (like branching shaders) start to get used heavily, the performance just simply tanks, because the cards can't handle the performance necessary to truly use that feature? IOW, the current crop of NV high-end cards may actually be just as useless running "real" SM3.0 code as the NV3x series was when running DX9 shaders. Wouldn't that be a hoot... Oh Rollo...
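To make the speculation above a little more concrete, here is a toy cost model (pure assumption, not measured NV4x behaviour): if a GPU evaluates dynamic branches over whole batches of pixels, a batch whose pixels disagree about the branch ends up paying for both sides.

```cpp
// Toy model of branch divergence cost in a pixel shader batch.
// Assumption: a divergent batch executes both branch paths; a coherent
// batch only pays for the path it actually takes.
#include <cstdio>

int BatchCost(int pixelsOnPathA, int pixelsOnPathB, int costA, int costB) {
    if (pixelsOnPathA == 0) return costB;   // whole batch took path B
    if (pixelsOnPathB == 0) return costA;   // whole batch took path A
    return costA + costB;                   // divergent: pay for both paths
}

int main() {
    // Branching only saves work when whole batches agree; a fine-grained
    // lit/shadowed pattern would make every batch pay the full price.
    std::printf("coherent batch:  %d cycles\n", BatchCost(64, 0, 20, 80));   // 80
    std::printf("divergent batch: %d cycles\n", BatchCost(32, 32, 20, 80));  // 100
}
```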
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Rollo
You can waffle and misdirect all you like Bar81.
The fact of the matter is that the developer of Splinter Cell Chaos Theory is not using PS2 at all, PS1.1 for you. If I had told you a month ago that a big game like that would ignore your X800XT PE, you would have barked, "No way!" Yet here it is.
And Ben thought I was whacko for suggesting that these sorts of things might happen, due to overly-intimate "developer relations" with NV. Hmm. L@@k Ben! "Conspiracy"!
Originally posted by: Rollo
AFAIK, there's no official support of HDR for Far Cry with ATI
But there could be, if CryTek coded it that way. It's not as though the card (ATI) lacks the raw features to be able to do so. They would likely simply have to implement a second code-path to deal with precision and overflow issues.
Originally posted by: Rollo
Just because stencil shadows cause a frame rate hit is no reason to say they're worthless. Another high end feature you'll never see at ANY framerate because you put your money on 9700Pro rev.2.
You do realise that there is a workaround for lacking stencil shadows in hardware, right? It just cuts your frame-rates way down, AFAIK, because you have to render the same scene in multiple passes.
Originally posted by: Rollo
Like I've said: you can say the 2002 feature set of the X800s is "good enough" but more and more we're hearing of games coming out with nV40-only features because that is the way of the future, and 2002 was a long time ago. You can bet the rent ATI wishes they had an SM3, SLI-ready part on the market, because they are scrambling and spending millions as I type this trying to get there. If it weren't important, they wouldn't bother.
If my theory about pipeline bubbles/stalls on NV4x parts under SM3.0 branching is true, then ATI doesn't really have much to worry about, so long as they've got that issue covered in their next-gen part, which will (hopefully) be released (non-paper) by the time games actively using those features are commonplace. It would be a total replay of DX9 shaders and ATI's 24-bit vs. NV's 16/32-bit precision issue. (If you own an NV3x-based card, do you choose banding and graphical anomalies, or deficient performance? Hey, that's the way it's meant to be played - with NVidia...) ATI actually got something right, hitting the market on-target at that point, and I have no doubt that they will be able to do it again, even if they are taking a bit of a PR hit over the lack of SM3.0 in the meantime. (But no real technological hit - other than in the few cases that are an artifact of NV's marketing plans and nothing else.)
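A minimal sketch of the sort of capability check a second HDR code path would hinge on, assuming a D3D9 FP16 render target. Whether the result is actually usable also depends on floating-point blending and filtering support (the query flag in the second function), which is where the R4xx and NV4x parts genuinely differed.

```cpp
// Hypothetical check: can we render into an FP16 surface, and can the
// hardware also blend into it (needed by many HDR implementations)?
#include <d3d9.h>

bool SupportsFp16Target(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFmt) {
    return SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, displayFmt,
                                            D3DUSAGE_RENDERTARGET,
                                            D3DRTYPE_TEXTURE,
                                            D3DFMT_A16B16G16R16F));
}

bool SupportsFp16Blending(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFmt) {
    return SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, displayFmt,
                                            D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
                                            D3DRTYPE_TEXTURE,
                                            D3DFMT_A16B16G16R16F));
}
```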
 

Grit

Member
Nov 9, 2002
130
0
76
I read every post, and I still have tearing in HL2 (and WoW and SW:KotOR, but NOT in CoD:UO) on my SLI system, and NO clue how to fix it or if other people see it. Only one other person (the thread starter) reported the problem.

- Can anyone else see it?

- Can anyone tell me how to fix it without buying a new monitor?

Please, no criticism of my hardware or questions as to why I selected it. If it won't work, please point out the defective part and what I should purchase to fix the problem.

Thanks all.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: VirtualLarry


You know what would really make me LOL? Much like with NV's FX 5900-series parts and their relatively-poor DX9 shader performance - what if the current crop of 6600/6800 cards' SM3.0 performance is actually similar, that when SM3.0 features (like branching shaders) start to get used heavily, the performance just simply tanks, because the cards can't handle the performance necessary to truly use that feature? IOW, the current crop of NV high-end cards may actually be just as useless running "real" SM3.0 code as the NV3x series was when running DX9 shaders. Wouldn't that be a hoot... Oh Rollo...

That's exactly what I am starting to think may happen.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: Grit
I read every post, and I still have tearing in HL2 (and WoW and SW:KotOR, but NOT in CoD:UO) on my SLI system, and NO clue how to fix it or if other people see it. Only one other person (the thread starter) reported the problem.

- Can anyone else see it?

- Can anyone tell me how to fix it without buying a new monitor?

Please, no criticism of my hardware or questions as to why I selected it. If it won't work, please point out the defective part and what I should purchase to fix the problem.

Thanks all.


Did you enable v-sync?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Bar81
Originally posted by: VirtualLarry


You know what would really make me LOL? Much like with NV's FX 5900-series parts and their relatively-poor DX9 shader performance - what if the current crop of 6600/6800 cards' SM3.0 performance is actually similar, that when SM3.0 features (like branching shaders) start to get used heavily, the performance just simply tanks, because the cards can't handle the performance necessary to truly use that feature? IOW, the current crop of NV high-end cards may actually be just as useless running "real" SM3.0 code as the NV3x series was when running DX9 shaders. Wouldn't that be a hoot... Oh Rollo...

That's exactly what I am starting to think may happen.


And the fantasy ATI team takes another long pull on the hookah........
:roll:

Fan 1: cough "Dude, I have a theory that because nVidia offered lower performance at something else in the past, they'll fail at something totally unrelated in the future!" cough cough cough

Fan 2: "Riiighhhtt onnn dude! Our X800s will be da bomb on that SM3 stuff, because the cards the games were developed on probably won't run it, because I hope I'm not wrong!" cough cough "I am so high......"
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Keep on dreaming, Rollo. Unlike you, I deal in reality. Like I've said MANY times, you're just guessing. Unlike you, I know that nobody knows, and I don't try to pretend I can tell the future.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Rollo
And the fantasy ATI team takes another long pull on the hookah........
:roll:
At least I'm not hallucinating about a non-functional chunk of 22 million transistors and telling people that they work fine, absent any real proof. (At least I think that I've been quite clear that I'm speculating about the SM3.0 performance on current parts, not stating it as a fact, pending more "evidence", although what data is out there so far doesn't exactly look like good news for current high-end NV part owners.)

Btw, this is some good sh*t, you want a drag? *inhales* Ruby told us all that we could stop by her place later, too, if we saved some for her. Mwhawhaha.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: VirtualLarry
Originally posted by: Rollo
And the fantasy ATI team takes another long pull on the hookah........
:roll:
At least I'm not hallucinating about a non-functional chunk of 22 million transistors and telling people that they work fine, absent any real proof. (At least I think that I've been quite clear that I'm speculating about the SM3.0 performance on current parts, not stating it as a fact, pending more "evidence", although what data is out there so far doesn't exactly look like good news for current high-end NV part owners.)

Btw, this is some good sh*t, you want a drag? *inhales* Ruby told us all that we could stop by her place later, too, if we saved some for her. Mwhawhaha.

Yeah, well, whatever we get out of PureVideo is a complete freebie over what you got out of your fast R300.
The X800s don't have any PureVideo DLLs awaiting release by MS... nothing at all.

And less performance headroom to be pulled out of the Radeons as well.

While the NV40 generation has a lot to look forward to (all while at least matching ATI on performance, or beating it with SLI), a die-hard group such as yourself only has the SM3 core from ATI to look forward to.

We have that already here, and more.

Quit acting like 22 million transistors that might be useless is some kind of detriment... you couldn't even get your hands on the "NEW" ATI for a long time, and even then it was outdated in comparison.
It's not like ATI cut you a killer deal on the X800s because they "saved" 22 million transistors.

Still paid out the arse like a fanATIc with maple leaf underpants for less capable hardware.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: ribbon13
Not true. I imagine an x16 PCIe RAID card with enough solid state drives could use it all, and would be useful in servers. Too bad none exist on the market. Yet. Heh.

As for graphics, the new Wildcats will push 8x slots to their limits. But for mainstream use, it will be a few years before that bandwidth is actually useful. I'll give you that. That is the very reason why I want a K8WE so bad. I wonder how many years it will be before I actually NEED to upgrade. Heh.

With an x16 PCI-E RAID card, you couldn't use SLI, so that makes no sense.

Graphics cards that are not for professional 3D apps do not even saturate AGP 8x, let alone PCI-E x16.

Edit: changed "could" to "couldn't".
 
Feb 3, 2001
5,156
0
0
Originally posted by: Insomniak
This is why I didn't jump into SLI off the bat. I want to see the problems people will find and what fixes/attention they get.

Yeah, I dunno what reviews he's reading that rave about SLI. Pretty much all the reviews I've seen show it to be minimally effective in most cases, and WORSE in others. Sure, *editorials* scream about it being the end-all be-all of performance, but the numbers seem to indicate it's more the end-all of your bank account than anything else.

I doubt SLI will be worth even *thinking about* for another YEAR.

Jason
 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
Having vsync on is very important to the enjoyability of a game. I find no slowdown whatsoever with vsync enabled. I've found that if you benchmark a game with vsync off and are averaging 113+ fps, then when you enable vsync your game will run just like a movie: nice and smooth.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
These arguments remind me of the R300 vs NV30/35 flames.

These rules will apply in these kinds of situations.

Today's top hardware will not play tomorrow's games very well. The R300 isn't what I would call an SM2.0 performance champion. We haven't seen a real shader-heavy game yet, and it crumbles pretty good under Far Cry, which is probably one of the most shader-heavy games out there. In the end it ran DX8 games great, just like the FX series.

Likewise, today's 6800s will run SM2 and DX8 series games just great. But we will have to wait and see on games like Stalker and the Unreal 3 engine. The Unreal 3 engine, from what I can tell, will be the real benchmark for SM3 shaders. And from what I can gather on the internets, the 6800 Ultra was running it at speeds that are considered not playable, but usable.

So in the end, chances are the 6800s will be the R300s of their time. They won't be able to play SM3 shader-heavy games very well.

But that doesn't mean you shouldn't consider the feature set when purchasing. There will be games using SM3, and they will be playable on the 6800s. But I don't know how intensive they will be, so don't consider it a validation of the hardware.

The next generation cards will probably be decent SM3 performers.

My own opinion is ATI is beating a dead horse. It sounds like their next card will just be another revision of the R300 with a higher clock. I can't reward companies for sitting on their arse and not innovating. Nvidia pushed the envelope with the NV40. I expect their next GPU to push it further. Thus my next purchase in about 12-18 months will probably be Nvidia.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Genx87
These arguments remind me of the R300 vs NV30/35 flames.

These rules will apply in these kinds of situations.

Today's top hardware will not play tomorrow's games very well. The R300 isn't what I would call an SM2.0 performance champion. We haven't seen a real shader-heavy game yet, and it crumbles pretty good under Far Cry, which is probably one of the most shader-heavy games out there. In the end it ran DX8 games great, just like the FX series.

Likewise, today's 6800s will run SM2 and DX8 series games just great. But we will have to wait and see on games like Stalker and the Unreal 3 engine. The Unreal 3 engine, from what I can tell, will be the real benchmark for SM3 shaders. And from what I can gather on the internets, the 6800 Ultra was running it at speeds that are considered not playable, but usable.

So in the end, chances are the 6800s will be the R300s of their time. They won't be able to play SM3 shader-heavy games very well.

But that doesn't mean you shouldn't consider the feature set when purchasing. There will be games using SM3, and they will be playable on the 6800s. But I don't know how intensive they will be, so don't consider it a validation of the hardware.

The next generation cards will probably be decent SM3 performers.

My own opinion is ATI is beating a dead horse. It sounds like their next card will just be another revision of the R300 with a higher clock. I can't reward companies for sitting on their arse and not innovating. Nvidia pushed the envelope with the NV40. I expect their next GPU to push it further. Thus my next purchase in about 12-18 months will probably be Nvidia.

This is exactly the reason I don't spend $500 for top-of-the-line cards, and I pity the fool who spends twice that much on SLI.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
The Unreal 3 engine, from what I can tell, will be the real benchmark for SM3 shaders.

While that uses SM3, it is not indicative of all SM3 games. Remember, you are looking at a game that is not supposed to be released for another 1-2 years, so it is going to be slow.

fool who spends twice that much on SLI

While that is your opinion, I would have a hard time believing that people who buy/bought SLI are fools.

It sounds like their next card will just be another revision of the R300 with a higher clock.

Where have you been the past ~3 months? Both ATI and Nvidia are releasing completely new architectures for next gen. Nothing will be the same. What you might be referring to is the 512MB texture memory GPUs, which are simply this generation's cards with another 256MB tacked on.

-Kevin

 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: redmyst
I've been struggling for weeks trying to resolve the problem of enabling Vsync in my SLI configuration (specifically for HL2). This weekend I figured out how, but more importantly, I learned that SLI is a COMPLETE AND TOTAL FAILURE in its current offering. Read on.

I found a way to enable Vsync in SLI mode. Here's the 'trick', and it doesn't always work: I set up an 'application profile' for HL2 and make the vsync-on setting (inside the Nvidia 'performance' settings), plus enable Vsync within the HL2 video options. Doing this I can get Vsync to hold for SLI mode.

That's the good news.

So here is the test:

1) I pick a point on the HL2 map which is 'tough' to render
2) Save the game
3) Collect the data using cl_showfps from the 'console' window.

Data (for same map point):
With non-SLI mode
vsync off -> 40fps
vsync on -> 32fps

With SLI enabled
vsync off -> 60fps (cool, SLI rocks just like all the reviews say)
vsync on -> 30fps (WHAT THE ....???)

So basically w/ Vsync turned ON and SLI enabled I get the EXACT SAME PERFORMANCE as I do without SLI. Nice. I'm so happy...

I tried this experiment w/ 4 of the last Nvidia beta drivers, starting w/ 67.03 and finishing with 71.24. All the same results (+/- 3fps).

This is a HUGE flaw w/ SLI. I HOPE it is a driver issue, and not a hardware one. All I can do is pray for a fix or wait for some poor sap on Ebay to buy my two 'working' 6600 GTs.

But this entire thing begs the question -> WHY HASN'T ANYONE ELSE REPORTED ON THIS ISSUE? I see very little chatter online about this Vsync issue. The image tearing exists on my HDTV, LCD, and CRT. And it looks terrible. So why am I the only guy out here screaming about this?

Granted. The image tearing does look a lot worse on my 50" TV running at 16x9. That is a WIDE range of tearing to look at... but still?!?

Seeing as this thread has sort of strayed OT: just out of curiosity, what refresh rate are you using? This is not a flaw or a bug, simply VSYNC. Ever wonder why virtually no one uses it?

-Kevin
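For what it's worth, the numbers quoted above line up with a simple model of double-buffered vsync (a simplification that ignores triple buffering and driver frame queuing): a finished frame has to wait for the next refresh, so the displayed rate snaps to the refresh rate divided by a whole number. On a 60 Hz display, an SLI rig averaging just under 60 fps snaps straight to 30, which would explain the "60 off / 30 on" result; enabling triple buffering, where the game or driver allows it, is the usual way around this.

```cpp
// Simplified model of double-buffered vsync: if a frame takes longer than
// one refresh interval, it waits for the next vblank, so the effective rate
// becomes refresh / 2, refresh / 3, and so on.
#include <cmath>
#include <cstdio>

double VsyncedFps(double rawFps, double refreshHz) {
    if (rawFps >= refreshHz) return refreshHz;                 // capped at refresh
    double refreshesPerFrame = std::ceil(refreshHz / rawFps);  // whole vblanks waited
    return refreshHz / refreshesPerFrame;
}

int main() {
    std::printf("raw 60 fps -> %.0f fps\n", VsyncedFps(60.0, 60.0)); // 60
    std::printf("raw 59 fps -> %.0f fps\n", VsyncedFps(59.0, 60.0)); // 30
    std::printf("raw 40 fps -> %.0f fps\n", VsyncedFps(40.0, 60.0)); // 30
}
```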
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Grit
I read every post, and I still have tearing in HL2 (and WoW and SW:KotOR, but NOT in CoD:UO) on my SLI system, and NO clue how to fix it or if other people see it. Only one other person (the thread starter) reported the problem.

- Can anyone else see it?

- Can anyone tell me how to fix it without buying a new monitor?

Please, no criticism of my hardware or questions as to why I selected it. If it won't work, please point out the defective part and what I should purchase to fix the problem.

Thanks all.

The solution? Don't run $1000 worth of graphics cards on a $200 mobo and expect perfect image quality on your $150 display.
 

eastvillager

Senior member
Mar 27, 2003
519
0
0
Looks like he does have an issue. V-synch in the SLI system is actually capping framerate at HALF of the refresh, instead of the same number. Perhaps it is the FPS counter itself being confused by SLI and reporting only for one card?

IMHO, 3dfx proved that SLI was a waste of effort in any situation where a single-board solution yielded acceptable performance. Nvidia buys the tech... and here we go again, lol. Those who don't remember history are doomed to repeat it, isn't that how the saying goes?

For the record, I've got a 6800gt in one system and an 800pro in another, and both deliver the performance I require as a gamer.

Oh, and I usually enable v-synch by default. Too many games I've played show visible tearing, which I can't stand.
 

cryptonomicon

Senior member
Oct 20, 2004
467
0
0
I never thought much of SLI from the start because of the cost-value issues that arose.
Solo video cards increase in performance and decrease in price too fast for SLI to stay competitive. Why buy two cards and consume more power when you can buy one that is fast enough, and still have some cash left for the next generation?

Of course, you're allowed to disagree.


On topic:
I couldn't play without vsync. I would smash the video card to bits / throw it out the window, so I understand how severe this problem is to some of you. Also, I should warn you: HL2 isn't really the best game to test the whole FPS thing with. It's hard to explain, but I have seen cl_showfps doing some funky things, like displaying my framerate at a set amount and then randomly cutting it in half, on and off. And I'm not using SLI or a 6800.
 

grit621

Member
Jun 14, 2001
46
0
0
Originally posted by: Acanthus
Originally posted by: Grit
I read every post, and I still have tearing in HL2 (and WoW and SW:KotOR, but NOT in CoD:UO) on my SLI system, and NO clue how to fix it or if other people see it. Only one other person (the thread starter) reported the problem.

- Can anyone else see it?

- Can anyone tell me how to fix it without buying a new monitor?

Please, no criticism of my hardware or questions as to why I selected it. If it won't work, please point out the defective part and what I should purchase to fix the problem.

Thanks all.

The solution? Don't run $1000 worth of graphics cards on a $200 mobo and expect perfect image quality on your $150 display.

So much for not criticizing my hardware. The monitor is a Viewsonic VP201b, which commonly retails for about $700.00.
 

Scoobyd00

Golden Member
Sep 11, 2002
1,386
14
81
I play a lot of Ravenshield, where you have the choice (albeit in the .INI file) to run vsync on or off.

I play with a 6800GT; with Vsync off I can run about 180-230 FPS depending on the map and location. The only problem is that when looking scoped through gas or smoke you get mouse lag and can end up dead.

So I keep Vsync locked on in the card drivers for much smoother gameplay (IMO), but it always stays at 120 FPS (the refresh rate of my monitor).

So, to me, Vsync serves a purpose.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: grit621
Originally posted by: Acanthus
Originally posted by: Grit
I read every post, and I still have tearing in HL2 (and WoW and SW:KotOR, but NOT in CoD:UO) on my SLI system, and NO clue how to fix it or if other people see it. Only one other person (the thread starter) reported the problem.

- Can anyone else see it?

- Can anyone tell me how to fix it without buying a new monitor?

Please, no criticism of my hardware or questions as to why I selected it. If it won't work, please point out the defective part and what I should purchase to fix the problem.

Thanks all.

The solution? Don't run $1000 worth of graphics cards on a $200 mobo and expect perfect image quality on your $150 display.

So much for not criticizing my hardware. The monitor is a Viewsonic VP201b, which commonly retails for about $700.00.

Since my P95f+, which costs less than half of that, can do 1600x1200 @ 85Hz, you mustn't have your refresh rate properly configured, or it's an LCD.
 