X800XT PE or BFG 6800 Ultra

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: DAPUNISHER
Originally posted by: Rollo
Errr, like I said, Doom3 will probably be the most licensed fps engine this year and since it's not out yet, I don't see how it can be obsolete.

As far as ATI being faster in today's "most advanced D3D games", ATI can't even PLAY today's most advanced D3D games. DX9c is due next month, and ATI doesn't have it.

Watch and learn, there is a reason ATI is selling X800XTs below MSRP at launch. Times are changing, and people are going to want to have SM3 this year. All ATI users will be able to say is "Well, we have fast brilinear".
(that does produce image degradation, according to owners and reviewers)
That's either prophetic vision, or famous last words brudda


I have been going out on a limb a bit lately, haven't I?
I just can't think of any other reason X800XTs would be less than MSRP at launch, and no one else has suggested any.
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
MSRP = Manufacturer's Suggested Retail Price

I hope to god everything sells for under that because they are always too high.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: ZimZum
If you need a PCIE card I would go with the X800XT. It's a true PCIE card, whereas nVidia's is just an AGP card with a PCIE bridge slapped onto it. Kind of defeats the purpose of PCIE.

The 6800 series is still slower than the X800 series at PS2.0. The question is whether nVidia will be able to run SM3.0 code faster than ATI can run PS2.0 code, since there isn't really anything devs can do with SM3.0 that they can't do with 2.0.

I could've sworn they ditched the bridged design on the 45's; if not... oh well. Not one person here can fully say which is faster than which, as no one has either card.
Would be nice if everyone would stop speculating and wait for both cards to be out in full force with mature drivers before pitting them against each other.
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Originally posted by: Rollo
Originally posted by: EngenZerO
I was actually in the same situation you were in.

I had an X800XT on order from CompUSA and then for some odd reason, I decided I would rather have the nVidia card this go around. I cancelled my order from CompUSA and ended up purchasing the BFG 6800 Ultra. Granted I spent an extra 100 dollars, I personally believe nVidia will win this round in the long run. It will take them a few months to iron out the driver issues, but I am a hoe for new technology. Dunno, I could have made the biggest mistake, but what is done is done.


Err, I don't know how you could have made "the biggest mistake":
1. There are no games where a 6800U performs poorly enough that you would notice the X800XT's performance advantage
2. If you turn off the brilinear filtering that compromises IQ in some situations, the cards should be about equal
3. The impact of SM3 over the life of the card for you remains to be seen, and you wouldn't have it at all with the X800XT. ATI couldn't figure it out and is stuck with the limited features of DX9b, which they've had for 2 years.


Rollo, you are sounding like more of an nvidiot every time I see one of your posts. I remember that you once wrote something like "I'm bored with the R300 core. Why would I buy a core that's 2 years old?" or something like that. Anyways, how the hell can you get bored with a core? wtf? And both of you, you're going to spend 100 dollars more just to get a few fps here, and a few fps less there? I wish I had that kind of money right now.


Oh ya and I almost forgot :

All ATI users will be able to say is "Well, we have fast brilinear".
(that does produce image degradation, according to owners and reviewers)

So you're saying that you can actually see, while you're playing games, those optimizations at work? The site that found out about ATi's optimizations said that they could barely see it. Unless you play with a giant 6x magnifying glass between you and your monitor, I don't think you'll see anything wrong. And, if those optimizations are so bad and so noticeable, why didn't you see them before? Also, Nvidia has their "brilinear", which does degrade IQ noticeably. ATi's "trylinear" (as I've seen it called) was only visible in comparisons of coloured mipmaps and real in-game "footage" (for lack of a better term).


Lastly
As far as ATI being faster in today's "most advanced D3D games", ATI can't even PLAY today's most advanced D3D games. DX9c is due next month, and ATI doesn't have it.

Please tell me where all of today's most advanced 3D games are that ATi can't play. Can you predict the future? Exactly. So don't talk about something that hasn't happened yet.
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
Originally posted by: VisableAssassin
Originally posted by: ZimZum
If you need a PCIE card I would go with the X800XT. It's a true PCIE card, whereas nVidia's is just an AGP card with a PCIE bridge slapped onto it. Kind of defeats the purpose of PCIE.

The 6800 series is still slower than the X800 series at PS2.0. The question is whether nVidia will be able to run SM3.0 code faster than ATI can run PS2.0 code, since there isn't really anything devs can do with SM3.0 that they can't do with 2.0.

I could've sworn they ditched the bridged design on the 45's; if not... oh well. Not one person here can fully say which is faster than which, as no one has either card.
Would be nice if everyone would stop speculating and wait for both cards to be out in full force with mature drivers before pitting them against each other.

Pretty sure that the per-pixel dynamic self-lighting shown in the Unreal3 demo is FP32 only.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: VisableAssassin
Originally posted by: ZimZum
If you need a PCIE card I would go with the X800XT. It's a true PCIE card, whereas nVidia's is just an AGP card with a PCIE bridge slapped onto it. Kind of defeats the purpose of PCIE.

The 6800 series is still slower than the X800 series at PS2.0. The question is whether nVidia will be able to run SM3.0 code faster than ATI can run PS2.0 code, since there isn't really anything devs can do with SM3.0 that they can't do with 2.0.

I could've sworn they ditched the bridged design on the 45's; if not... oh well.

No, it's still there.

http://www.hardwareanalysis.com/content/article/1720/
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
Originally posted by: stnicralisk
Originally posted by: VisableAssassin
Originally posted by: ZimZum
If you need a PCIE card I would go with the X800XT. It's a true PCIE card, whereas nVidia's is just an AGP card with a PCIE bridge slapped onto it. Kind of defeats the purpose of PCIE.

The 6800 series is still slower than the X800 series at PS2.0. The question is whether nVidia will be able to run SM3.0 code faster than ATI can run PS2.0 code, since there isn't really anything devs can do with SM3.0 that they can't do with 2.0.

I could've sworn they ditched the bridged design on the 45's; if not... oh well. Not one person here can fully say which is faster than which, as no one has either card.
Would be nice if everyone would stop speculating and wait for both cards to be out in full force with mature drivers before pitting them against each other.

Pretty sure that the per-pixel dynamic self-lighting shown in the Unreal3 demo is FP32 only.

You're probably right; then again, I doubt we'll see it before late 2006.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: futuristicmonkey
Rollo, you are sounding like more of an nvidiot every time I see one of your posts. I remember that you once wrote something like "I'm bored with the R300 core. Why would I buy a core that's 2 years old?" or something like that. Anyways, how the hell can you get bored with a core? wtf? And both of you, you're going to spend 100 dollars more just to get a few fps here, and a few fps less there? I wish I had that kind of money right now.
1. You're entitled to your opinion.
2. Let's say 3dfx put out the Voodoo 1 and it was the greatest thing since sliced bread. Then the next year they put out the V1a with a faster chip and RAM, and you bought that too. Then the next year they put out the same chip, no new features, just more pipes so it ran the games faster. How many times are you going to buy that same core for $400 while other companies are putting out more advanced chips? Would you get bored with it? Or would you keep happily paying them $400 year after year? If you can honestly say you would buy the same core over and over forever, you can honestly say it's not possible to get bored with a core. If you can't, and you'd buy something else sooner or later, all we differ on is how soon we get bored.
3. You don't see me paying over MSRP for a 6800. My 5800U runs everything fine; I'll buy a 6800U when it's under MSRP.
4. If you wish you had that kind of money, go earn it. If you're a kid, get a part-time job. If you're a man, learn a new skill and increase your salary.


Oh ya and I almost forgot :

All ATI users will be able to say is "Well, we have fast brilinear".
(that does produce image degradation, according to owners and reviewers)

So you're saying that you can actually see, while you're playing games, those optimizations at work? The site that found out about ATi's optimizations said that they could barely see it.
Of course, you realize you just posted that the review sites said they can see it, which sort of negates your point?

Unless you play with a giant 6x magnifying glass between you and your monitor, I don't think you'll see anything wrong.
That's not what Cainam says. BTW- I have a 22" monitor, so I have more real estate to see flaws in a picture.

And, if those optimizations are so bad and so noticeable, why didn't you see them before? Also, Nvidia has their "brilinear", which does degrade IQ noticeably. ATi's "trylinear" (as I've seen it called) was only visible in comparisons of coloured mipmaps and real in-game "footage" (for lack of a better term).

That's not true. The X800's IQ compared to the 9800 or 6800 (optimizations off) has been called worse. nVidia's brilinear may not be as good as ATI's brilinear, but at least they give you the ability to turn it off.

As far as ATI being faster in today's "most advanced D3D games", ATI can't even PLAY today's most advanced D3D games. DX9c is due next month, and ATI doesn't have it.

Please tell me where all of today's most advanced 3D games are that ATi can't play. Can you predict the future? Exactly. So don't talk about something that hasn't happened yet.

I knew someone would misunderstand this. What I meant by ATI not being able to play the most advanced D3D games is that they can't play them in SM3, the most advanced version of DX9. As for not being able to predict the future, you're right about that, but I don't have to in order to know ATI won't be able to play anything in SM3. Nor do I consider the E3 Far Cry demo a valid indication of all of the capabilities of DX9c SM3. One patched, mostly PS1.1 game does not a vision of the future make. It's barely a taste, and you can bet there are games coming out in the next year that will show us more.

Make sense?
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Yes, it does make sense, but I did understand what you meant by advanced 3D. I knew you were talking about SM3. I was going to write another thing about how ATi could put out a new GPU that supports SM3 before SM3 games come out, but that's the kind of crap fanboys write. I am a fanboy of ATi, but not to the point where I'm blinded by my brand loyalty.
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
ATI can simply run SM3 games under the SM2 path. SM3 doesn't really bring much to the table besides speed, which the XT already has plenty of.
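To make the "SM2 path" idea concrete, here's a minimal C++ sketch of how a DX9 title typically picks a shader path from the device caps at startup. It's a hypothetical illustration (the function and enum names are invented, and this is nobody's actual engine code), but it's the mechanism that lets an SM2.0-only card run a game that also ships SM3.0 shaders: it simply gets handed the ps_2_0 versions instead.

#include <d3d9.h>

enum ShaderPath { PATH_PS20, PATH_PS30 };

// Hypothetical helper: ask Direct3D what the installed card supports and
// decide which set of compiled shaders to load.
ShaderPath ChooseShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return PATH_PS20;                            // be conservative on failure

    // NV40/6800-class hardware reports ps_3_0; the R420/X800 reports a
    // 2.0-level pixel shader version, so it falls through to the SM2.0 path.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return PATH_PS30;
    return PATH_PS20;
}

Whether the SM2.0 fallback ends up slower, or just subtly different looking, is exactly the open question being argued in this thread.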
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
So you're saying that you can actually see, while you're playing games, those optimizations at work?

yes, it's easily visible in many areas. i posted a "still" shot where it's easily seen. it's more pronounced when moving. depending on the type of game and/or the textures it may not be as obvious, but it's there.

The site that found out about ATi's optimizations said that they could barely see it. Unless you play with a giant 6x magnifying glass between you and your monitor, I don't think you'll see anything wrong.

you think poorly, then. i have the card and it's easily seen - again, more prominent in some situations than others, but again, it's there.

And, if those optimizations are so bad and so noticeable, why didn't you see them before?

who didn't see them before?

Also, Nvidia has their "brilinear", which does degrade IQ noticeably. ATi's "trylinear" (as I've seen it called) was only visible in comparisons of coloured mipmaps and real in-game "footage" (for lack of a better term).

really not much more so than ati's optimization. just depends on the situation. regardless, ati needs to allow the "opts" to be turned off where desired, as nvidia has now done.

as for how ati and nvidia compete, i won't take sides as i can't get an nv40 yet, but it's pretty clear that they're pretty even, and that a final decision really can't be made for some time - both sides need some driver refinements.
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Originally posted by: ZimZum
If you need a PCIE card I would go with the X800xt. Its a true PCIE card where as nVidias is just an agp card with a PCIE Bridge slapped onto it. Kind of defeats the purpose of PCIE.

A video card of any type "kind of defeats the purpose of PCIE."

A video card uses bandwidth in one direction, from the CPU to the card, and the AGP bus is nowhere near its limit. Therefore, there is no benefit in going to PCIE from a video card/gaming standpoint.
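For a rough sense of the numbers behind that argument, here's a tiny back-of-the-envelope C++ snippet with the theoretical peak figures, assuming AGP 8x and first-generation PCI Express x16 (these are bus maximums, not measured game traffic):

#include <cstdio>

int main()
{
    // AGP 8x: 66.67 MHz strobe, 8 transfers per clock, 32-bit (4-byte) bus,
    // with the bandwidth shared between both directions.
    double agp8x_mbs = 66.67e6 * 8 * 4 / 1e6;   // ~2133 MB/s total

    // PCIe 1.0 x16: 2.5 GT/s per lane with 8b/10b encoding, ~250 MB/s per
    // lane per direction, times 16 lanes, usable in both directions at once.
    double pcie_x16_mbs = 250.0 * 16;           // ~4000 MB/s each way

    std::printf("AGP 8x peak:   ~%.0f MB/s total\n", agp8x_mbs);
    std::printf("PCIe x16 peak: ~%.0f MB/s per direction\n", pcie_x16_mbs);
    return 0;
}

Since games rarely come close to saturating even the AGP figure going downstream, the genuinely new part is the upstream (card-to-CPU) direction, which is the point being made above.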

However, PCIE will be good for people who have GbE or high speed drives on the bus, since there will be more bandwidth available than on the PCI bus. Also, I've heard that video/audio editing/encoding might be improved if someone makes a special PCIE card that can do the editing and encoding in hardware on the card. It's really for instances that require data transfer in both directions that PCIE will help.

For gaming though, from everything I've read, there would be a negligible difference in the performance between a card on PCI-E and the same card on AGP.

If you are waiting for a PCIE card, either the 6800 series or the X800 series would be fine. You should make your choice based on other factors (price, performance, IQ, etc.).

-D'oh!
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Cainam, I agree with the idea that ATi should allow their optimizations to be disabled. However, let me ask you this: if the website that found out about these optimizations hadn't, would you be so sure of yourself right now, or would you instead dismiss it as application-specific, or maybe a driver problem? Why didn't you say anything about them if you did, in fact, see them?

And what did you mean by "who didn't see them before?"? Oh wait, never mind, you might've misunderstood what I meant. I meant "And if these optimizations are so bad and noticeable, why didn't you say anything about them before?"
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: futuristicmonkey
Cainam, I agree with the idea that ATi should allow their optimizations to be disabled. However, let me ask you this: if the website that found out about these optimizations hadn't, would you be so sure of yourself right now, or would you instead dismiss it as application-specific, or maybe a driver problem?

i would be sure there was an issue. i would be sure it wasn't "application specific", as it appears in more than a single application. i would not have conclusively known exactly what the cause of it was.

Why didn't you say anything about them if you did, in fact, see them?

because this issue was known before the card was available to the general public, and yes, i did speak out earlier after i received the card a few weeks ago. in the beginning, we just didn't know the cause. the effect was clearly there and questioned.

And what did you mean by "who didn't see them before?"? Oh wait, never mind, you might've misunderstood what I meant. I meant "And if these optimizations are so bad and noticeable, why didn't you say anything about them before?"

i did

there are several other threads here, as well as b3d & r3d, discussing this issue.
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
i would be sure there was an issue. i would be sure it wasn't "application specific", as it appears in more than a single application. i would not have conclusively known exactly what the cause of it was.

That last sentence really says what I've tried to say. You wouldn't have known what the cause of it was, but now that you know what the cause of it is, you are able to say that they are optimizations because there are people who have proven it.

Think of it like this: A teacher at medical school asks her students to identify a virus by the way it looks. So, one student tries to id it, but isn't sure. The other students then walk over and confirm what the first student said. The first student now feels confident that he/she is correct.

My point is this: if you saw those optimizations (but didn't know what they were) and tried to figure them out, you would examine them but start to think they may be driver-related. You could come over to these forums and say what you think (that they're illegitimate optimizations). The ATi fanboys will disagree with you and the Nvidia fans will jump all over it. So everyone will take a look and find what you did. So, instead of just you contacting ATi and asking about this (and getting put on hold, or being told that it's application specific), it's now almost everyone who visits the AT Video forum, along with people from other forums. Now you can feel confident that you might be right. So you all ask ATi and they come out with the truth.

I'm not even sure if I made my point there but let me ask you this: If you got your X800 before the optimizations were exposed, what would you have done?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: futuristicmonkey
i would be sure there was an issue. i would be sure it wasn't "application specific", as it appears in more than a single application. i would not have conclusively known exactly what the cause of it was.

That last sentence really says what I've tried to say. You wouldn't have known what the cause of it was, but now that you know what the cause of it is, you are able to say that they are optimizations because there are people who have proven it.

Think of it like this: A teacher at medical school asks her students to identify a virus by the way it looks. So, one student tries to id it, but isn't sure. The other students then walk over and confirm what the first student said. The first student now feels confident that he/she is correct.

My point is this: if you saw those optimizations (but didn't know what they were) and tried to figure them out, you would examine them but start to think they may be driver-related. You could come over to these forums and say what you think (that they're illegitimate optimizations). The ATi fanboys will disagree with you and the Nvidia fans will jump all over it. So everyone will take a look and find what you did. So, instead of just you contacting ATi and asking about this (and getting put on hold, or being told that it's application specific), it's now almost everyone who visits the AT Video forum, along with people from other forums. Now you can feel confident that you might be right. So you all ask ATi and they come out with the truth.

I'm not even sure if I made my point there but let me ask you this: If you got your X800 before the optimizations were exposed, what would you have done?

i'm still not sure what your point is....

the bottom line is this: the card is capable of full trilinear filtering. you are not getting full trilinear filtering. this causes the mip transitions to be clearly visible.

what difference does it make whether we knew the actual cause or not? either way, it's not right, and it needs to be fixed.

that it ended up being an optimization which ati was clearly deceptive about is another issue in itself, and didn't seem to fall under the topic of this discussion.
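For anyone wondering what "full trilinear" versus the shortcut actually means, here's a rough, hypothetical C++ sketch. It has nothing to do with either vendor's actual hardware or drivers, and every name and constant in it is made up, but it shows why blending between mip levels hides the transition while cutting the blend short leaves a visible band:

#include <cmath>

struct Color { float r, g, b; };

// Stand-in bilinear fetch so the sketch stands on its own: returns a flat
// shade per mip level (real hardware filters four texels from that level).
Color SampleBilinear(int mipLevel, float /*u*/, float /*v*/)
{
    float shade = 1.0f / (1 + mipLevel);
    return { shade, shade, shade };
}

Color Lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Full trilinear: always blend the two nearest mip levels, so the change from
// one level to the next is gradual and no seam shows up across a floor texture.
Color SampleTrilinear(float u, float v, float lod)
{
    int   mip  = (int)std::floor(lod);
    float frac = lod - mip;                      // 0..1 between mip and mip+1
    return Lerp(SampleBilinear(mip, u, v), SampleBilinear(mip + 1, u, v), frac);
}

// "Brilinear"-style shortcut: plain bilinear over most of the range, with the
// blend squeezed into a narrow window around the switch point. Cheaper, but
// the abrupt change is exactly the visible mip transition being described.
Color SampleBrilinear(float u, float v, float lod, float window = 0.15f)
{
    int   mip  = (int)std::floor(lod);
    float frac = lod - mip;
    if (frac < 0.5f - window) return SampleBilinear(mip, u, v);
    if (frac > 0.5f + window) return SampleBilinear(mip + 1, u, v);
    float t = (frac - (0.5f - window)) / (2.0f * window);   // remap to 0..1
    return Lerp(SampleBilinear(mip, u, v), SampleBilinear(mip + 1, u, v), t);
}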
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
1. You're entitled to your opinion.
2. Let's say 3dfx put out the Voodoo 1 and it was the greatest thing since sliced bread. Then the next year they put out the V1a with a faster chip and RAM, and you bought that too. Then the next year they put out the same chip, no new features, just more pipes so it ran the games faster. How many times are you going to buy that same core for $400 while other companies are putting out more advanced chips? Would you get bored with it? Or would you keep happily paying them $400 year after year? If you can honestly say you would buy the same core over and over forever, you can honestly say it's not possible to get bored with a core. If you can't, and you'd buy something else sooner or later, all we differ on is how soon we get bored.
3. You don't see me paying over MSRP for a 6800. My 5800U runs everything fine; I'll buy a 6800U when it's under MSRP.
4. If you wish you had that kind of money, go earn it. If you're a kid, get a part-time job. If you're a man, learn a new skill and increase your salary.

I didn't fully read this, I sorta skimmed over it, but you did make a good point. But if they did just make a faster core with no new features, I doubt that anyone would buy it. Companies like ATi and Nvidia know that if they ever did this, they'd be out of business. Now, that is an example, but an unrealistic one. Can you give me an example of any time a graphics chipset company (like ATi) has ever put out the same chip three years in a row, without any new features? And how big do the features have to be? Big, like SM3, or small-ish, like ATi's F-Buffer?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
IIRC, the BFG cards have higher clock speeds than the other companies' cards.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Let's say 3dfx put out the Voodoo 1 and it was the greatest thing since sliced bread. Then the next year they put out the V1a with a faster chip and RAM, and you bought that too. Then the next year they put out the same chip, no new features, just more pipes so it ran the games faster. How many times are you going to buy that same core for $400 while other companies are putting out more advanced chips? Would you get bored with it? Or would you keep happily paying them $400 year after year? If you can honestly say you would buy the same core over and over forever, you can honestly say it's not possible to get bored with a core. If you can't, and you'd buy something else sooner or later, all we differ on is how soon we get bored.

for the most part this has happened many times over.

of all people, rollo, i didn't expect you to say something like this - you bought 3 different dx9 cards (and at least one even twice, after stomping on it (and did you drive over it also? i don't remember) lol) with no core "features" differences whatsoever in just the last year and a half or so
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Near the middle of this thread, I was annoyed with Rollo for saying that ATi didn't know how to implement the DX9c spec in their hardware. So, as an ATi fan, I was going to defend them, saying the things that Nvidia fans said when the 9700 Pro came out with its SM2, but I didn't and then went off on a tangent. Anyway, I will say those things right now. It's useless. (we don't have it so it doesn't matter -lol). But Nvidia does have it, so they are ahead in that respect. When it comes into use, we will see how important it really is. And, IIRC, I could've sworn that I read somewhere (mighta been here) that ATi thought they didn't need it, or it would slow down performance, or something. Well, now I'm arguing with Cainam that ATi's optimizations aren't that bad.

As of right now, I'm dropping my fanboy attitude.

Rollo, ATi's newest card is equal with the 6800 Ultra in today's games. That may change later on in next-gen games.

Cainam, you are right. ATi's optimizations are bad, and we should be able to disable them. Both companies should cut the crap and not have to resort to cheating.

C'mon, argue with this now, I dare you!
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
Rollo, ATi's newest card is equal with the 6800 Ultra in today's games. That may change later on in next-gen games.
It's actually better than the 6800 U. More or less every review site puts the XT ahead of the Ultra; the only thing that differs is by how much. Take off the 10-15% the XT gains from its brilinear and the XT is still faster. It probably overclocks better on 1 molex connector as well.

AnandTech had really conservative benchmarks for the XT. Sites like X-bit Labs and Tom's Hardware had the XT trouncing the 6800 left and right.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: GeneralGrievous
Rollo, ATi's newest card is equal with the 6800 Ultra in today's games. That may change later on in next-gen games.
It's actually better than the 6800 U. More or less every review site puts the XT ahead of the Ultra; the only thing that differs is by how much. Take off the 10-15% the XT gains from its brilinear and the XT is still faster. It probably overclocks better on 1 molex connector as well.

AnandTech had really conservative benchmarks for the XT. Sites like X-bit Labs and Tom's Hardware had the XT trouncing the 6800 left and right.
Depends on whose benches you look at. They are dead even right now.
With PS2 having the same actual features as PS3 (just not as efficient), it should be easy to drop down to it, and if they can keep the speeds up, they should be fine.
However, their out-engineering NVidia is what got them where they are. They shouldn't be sitting and waiting.
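The "same features, just not as efficient" point usually comes down to dynamic branching. Here's a toy CPU-side C++ illustration, not real shader code, with invented placeholder lighting functions: without real per-pixel flow control you evaluate both sides of a decision and select the result, with it you skip the work you don't need, so the difference is efficiency rather than a new capability.

#include <algorithm>

// Invented stand-ins for a cheap and an expensive lighting calculation.
float CheapLighting(float ndotl)     { return std::max(ndotl, 0.0f); }
float ExpensiveLighting(float ndotl) { return std::max(ndotl, 0.0f) * 0.9f + 0.1f; }

// "SM2.0 style": no dynamic flow control, so both paths are evaluated for
// every pixel and a compare/select keeps one of the results.
float ShadePixel_Flattened(float ndotl, bool inShadow)
{
    float lit      = ExpensiveLighting(ndotl);       // always paid for
    float shadowed = CheapLighting(ndotl) * 0.2f;    // also always paid for
    return inShadow ? shadowed : lit;                // select, cmp/lrp style
}

// "SM3.0 style": a real branch, so shadowed pixels skip the expensive path
// entirely. The image is the same either way; the difference is wasted work.
float ShadePixel_Branched(float ndotl, bool inShadow)
{
    if (inShadow)
        return CheapLighting(ndotl) * 0.2f;
    return ExpensiveLighting(ndotl);
}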
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: CaiNaM
Let's say 3dfx put out the Voodoo 1 and it was the greatest thing since sliced bread. Then the next year they put out the V1a with a faster chip and RAM, and you bought that too. Then the next year they put out the same chip, no new features, just more pipes so it ran the games faster. How many times are you going to buy that same core for $400 while other companies are putting out more advanced chips? Would you get bored with it? Or would you keep happily paying them $400 year after year? If you can honestly say you would buy the same core over and over forever, you can honestly say it's not possible to get bored with a core. If you can't, and you'd buy something else sooner or later, all we differ on is how soon we get bored.

for the most part this has happened many times over.

of all people, rollo, i didn't expect you to say something like this - you bought 3 different dx9 cards (and at least one even twice, after stomping on it (and did you drive over it also? i don't remember) lol) with no core "features" differences whatsoever in just the last year and a half or so

You're sort of right Cainam. I've actually purchased 5 DX9 cards since Sept 2002. (9700Pro>5800NU>9800Pro>5800OTES>traded for 5800U.)
I only stomped the OTES, although I've been considering making a keychain out of the chip.

I suppose you're right; if I hadn't just refurnished my living room and gone on vacation this month, I'd probably have an X800 something just to play with till the prices drop on 6800s.
As it is, I'll probably stick with the 5800U till I can get a 6800U for a reasonable (read not $100 over MSRP) amount.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: futuristicmonkey
Near the middle of this thread, I was annoyed with Rollo for saying that ATi didn't know how to implement the DX9c spec in their hardware. So, as an ATi fan, I was going to defend them, saying the things that Nvidia fans said when the 9700 Pro came out with its SM2, but I didn't and then went off on a tangent. Anyway, I will say those things right now. It's useless. (we don't have it so it doesn't matter -lol). But Nvidia does have it, so they are ahead in that respect. When it comes into use, we will see how important it really is. And, IIRC, I could've sworn that I read somewhere (mighta been here) that ATi thought they didn't need it, or it would slow down performance, or something. Well, now I'm arguing with Cainam that ATi's optimizations aren't that bad.

As of right now, I'm dropping my fanboy attitude.

Rollo, ATi's newest card is equal with the 6800 Ultra in today's games. That may change later on in next-gen games.

Cainam, you are right. ATi's optimizations are bad, and we should be able to disable them. Both companies should cut the crap and not have to resort to cheating.

C'mon, argue with this now, I dare you!

I can't argue with this, it seems pretty reasonable.
 