PhysX worthless with ATI?


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Learn when to say, "When". Agree to disagree. No need for 15 pages of getting nowhere.

so what do you think, Keys

. . . will PhysX make it in the face of the opposition it faces?

that is the key question, i believe
[no pun intended, sorry]

and if we did not have 15 pages of getting nowhere, AT video would be like B3D

*cough*


very polite


very educational


and somewhat boring



http://www.amazon.com/BFG-BFGE...=8-4#productPromotions

BFG {lifetime warranty} GTX280 at Amazon.com for $399.99 shipped [with the $20 cash back promo] .. i need to get me one; that is the cheapest i've seen yet; down $250 since release
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: keysplayr2003
Learn when to say, "When". Agree to disagree. No need for 15 pages of getting nowhere.

I agree. Besides, if I talked like Apoppin did, Virge would put me on "vacation".

I'm happy my card supports Physx. I look forward to using it. I will stir this pot again when the second NVIDIA Physx pack comes out (or the next major announcement at least).

Otherwise there is not much more to say.
 
Apr 20, 2008
10,067
989
126
Originally posted by: keysplayr2003
Originally posted by: hohyss
So if I buy 4870x2 card...

will there be any chance that I can take advantage of PhysX?

Looks like motherloads of developers are on the PhysX bandwagon, but
it looks like PhysX only works on cards that actually support it.

Even if Nvidia and ATI cooperate, do you guys think it will be possible?
I'm not sure whether ATI incorporated a CUDA solution into their cards.

I think it's kind of possible since those developers are developing games for the Xbox 360 (ATI) and PS3 (Nvidia), which don't seem to have PhysX support. But that's my theory.

So what do you guys think?

Doesn't look that way. At least for current gen cards. I suppose it's possible, but like Pic said, AMD seems to be the stubborn one.

Many developers are already incorporating CUDA based PhysX into their game engines. Yes PhysX works on all 8 series and above Nvidia cards.

Anything is possible, and Nvidia is cooperating, but AMD isn't. CUDA was designed specifically for Nvidia's unified shader architecture, which is very different from AMD/ATI's shader architecture. Even if PhysX is made to run on AMD/ATI hardware, there isn't any guarantee that it will perform as well. Hey, it's always possible, but CUDA was designed alongside the hardware from NV.

Once again, anything is possible. As of right now, if you want any opportunity to run PhysX on your PC, AMD/ATI currently is not the way to go. That may change in the future, but I don't have any solid info that AMD is hammering away at this ability.

Here is a bit-tech article that may help explain things.

Nvidia helping to bring PhysX to ATi Cards

Another from EnGadget.

PhysX on ATI effort gets helping hand from NVIDIA

So are you saying PhysX will work on my 8800GTS 320?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I'm tempted to lock this thread, but I won't. Behave yourselves and stop bickering back & forth, or I may change my mind.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Wreckage
No game "warrants" AA/AF or a resolution higher than 640x480. Basically that's what you are advocating. I would say that game physics affects gameplay far more than AA/AF or resolution.

This is the key issue, although i partially disagree about the resolution

In order of importance:
1. Going from a very low to a medium resolution
2. Going from medium to high resolution
3. Adding AF
4. Adding PhysX
5. Going from high to super high resolution
6. Adding AA

The thing is... SOME newer games don't support AA (you can force it in the driver... some will work, others will either cause the game to crash/BSOD, or give you single digit FPS).
Some older games don't support AF.
Almost every game out there does not support physX, only a select few do.

So, PhysX is pretty awesome when supported, but it is very rarely supported. It is a feature, just not a huge one.

DX10.1 supposedly adds FPS on games running in it. Could be somewhat useful, but the picture quality is the same. So a faster card without DX10.1 beats a slower one with.

Honestly, rather than making absolute statements, how about you just figure out for yourself how much it is worth to you? Both Nvidia and AMD offer cards from $20 (passively cooled, DDR2, tiny die) to multi-card solutions OVER $1000.
Most people are not going to buy those.

Give a $ or % value to features like physX and DX10.1, and then use that to calculate performance/$ for individual cards you are checking. And buy the best deal.

See that GTX260SC in my signature? I got that for $225 (after a $30 MIR), because it was cheaper than the 4870. The 4870 being slightly faster on average, having HDMI with audio output, superior video decode, and DX10.1 was balanced out by the superior customer service, PhysX, and lower power consumption. I considered them equal in value to me. So I bought the cheaper one.

If you don't value PhysX, customer service, and power consumption at ALL, and you LOVE the idea of DX10.1 etc., then by how much more would you value the 4870? $10? $20? $30?
Set such a value, and then see how much you can buy them each for.
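
If you want a concrete way to do that, here's a rough back-of-the-envelope sketch. The card names, prices, performance numbers, and feature values below are made-up placeholders, not real data; plug in actual street prices, your own benchmark average, and whatever $ value you personally put on the extras:

#include <iostream>
#include <string>
#include <vector>

// Placeholder numbers only -- substitute real street prices, your own
// benchmark index, and the $ value YOU assign to extras (PhysX, DX10.1,
// warranty, power draw, memory size...).
struct Card {
    std::string name;
    double price;         // what you can actually buy it for, in $
    double perf;          // relative performance index (e.g. average FPS)
    double featureValue;  // $ you personally assign to its extra features
};

int main() {
    std::vector<Card> cards = {
        {"GTX 260 (after MIR)", 225.0, 100.0, 20.0},
        {"HD 4870 512MB",       218.0, 105.0, 20.0},
    };

    for (const auto& c : cards) {
        // Treat the features like a rebate: effective price = price - feature value.
        double effective = c.price - c.featureValue;
        std::cout << c.name << ": "
                  << (c.perf / effective) * 100.0
                  << " perf per $100 effective\n";
    }
    return 0;
}

Treating the feature value as a flat rebate is the simplest model; you could just as easily weight it as a percentage of the price instead. Either way, whichever card comes out with the better perf-per-effective-dollar is the better deal for you.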
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@taltamir - The point of this thread is whether or not PhysX is worthless, and what ATI owners would be missing. It's not about why you bought your GTX260, though it's a little strange you didn't list having more memory as a plus. Currently if I want to buy a Visiontek 4870 I can get one for $218, but the 512MB is a turn-off.

What makes having GPU Physx so much better than having DX10.1 currently?
Where's a real list of any recommended AAA GPU PhysX titles?
What can we expect from supported GPU Physx titles?

These are things I would expect key or nrollo to be able to answer, as they're marketing for Nvidia. Yet there's no answer ever coming from them. Then you've got Wreckage, who goes and lists a whole bunch of old PhysX titles, which don't add anything new to gameplay.

I've already pulled up the list of most upcoming PhysX titles and guess what, they're mostly multiplatform titles (Bionic Commando). Somehow I doubt there will be many added GPU PhysX effects in multiplatform titles.

The point is PhysX shouldn't be on anyone's list for buying a new card as of now. Later, when the games come and the CPU / GPU get more powerful, it will be another story.

Back to the HD-DVD vs Blu-ray war: who won? DVD.
The same can be said about PhysX and DX10.1: DX9 and DX10 win (mostly DX9).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
1. they are not marketing for nvidia.
2. there are many lists of titles for physX, including in this very thread.

Heh... yea, DVD won, and so did DX9... but when DX11 starts rolling around and a few titles for it come out that barely play on $2000+ machines, then DX10 games will be the norm...
You can already buy holographic disks that hold 300GB per disk.. only $18k for the drive (the size of a shuttle PC) and $180 per disk... they are gonna release an 800GB model soon, and a 1.6TB model of the disk is slated for 2010.
Labs have demonstrated disks with larger capacity, but they are not being sold to the public yet...

When a 300GB holodisk drive costs $1000 and disks are $10 each, Blu-ray will be the standard (since HD-DVD threw in the towel)...
http://www.inphase-technologie...cts/media.asp?subn=3_2

So what does that have to do with the topic? Well, you say the question is whether PhysX has any worth. It does. It improves the graphics and gameplay much, MUCH more than AA or some other techniques do.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
DX10.1 is far more promising than PhysX from nVidia. DX10.1 gives cube map arrays for improved global illumination effects, something not possible in real time before; separate blend modes for MRT, which improves deferred shading performance; more inputs/outputs for vertex shaders; Gather4 to improve performance when rendering shadows (nVidia does the same with PCF); LOD instructions to improve the performance and quality of texture filtering; multi-sample buffer reads and writes for improved performance and quality with anti-aliasing across all scenarios; a pixel coverage mask to program anti-aliasing in shaders; and increased precision for floating point operations. There are some other features, but they aren't that important since most current hardware can already do them, like FP32, Int16, programmable sample AA patterns, etc. I was able to play Assassin's Creed with 4x FSAA and the performance impact was nonexistent since I'm CPU limited in that game, and it was very playable, but I didn't like the game much anyway; it's so sickeningly repetitive.

Better graphics and detail while improving performance is more immersive than a ghost feature which will not take off and will not change your gameplay. I do not think that being able to throw everything around or seeing dust everywhere will change my gameplay experience, especially in those multiplatform titles which rarely take advantage of PC hardware capability thanks to incompetent publishers/developers like Ubiport (Ubisoft).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
yet there are so many games with physX and not ONE game with DX10.1
Assassin's Creed had it, but it was bugged and was just skipping the rendering of some things, as said on every review site, including Anand. And the end result was that Ubi disabled it and said it's not worthwhile to bother fixing.

You can call foul play until your tongue falls out, but the fact of the matter is, there is no such thing as a DX10.1 title.
Sure, you could blame conspiracies and the like, but game developers, the ones who know this stuff, chose not to bother with it. Not a single game. NOT ONE!

Even if you are right and it IS vastly superior on paper and the evil nvidia and stupid incompetent developers are the reason it doesn't exist in any games... well... so what? what does it give YOU, the customer, to pay extra for a part with a great feature that nobody is using or going to use due to "incompetence" and "bribes"?

Look ma, I bought this awesome future car, it runs on trash and produces fuel and diamonds. Too bad the cops wont let me drive it cause the evil corpo-oilio-conspirico is making it illigalz.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: evolucion8
DX10.1 is far more promising than PhysX from nVidia. DX10.1 gives cube map arrays for improved global illumination effects, something not possible in real time before; separate blend modes for MRT, which improves deferred shading performance; more inputs/outputs for vertex shaders; Gather4 to improve performance when rendering shadows (nVidia does the same with PCF); LOD instructions to improve the performance and quality of texture filtering; multi-sample buffer reads and writes for improved performance and quality with anti-aliasing across all scenarios; a pixel coverage mask to program anti-aliasing in shaders; and increased precision for floating point operations. There are some other features, but they aren't that important since most current hardware can already do them, like FP32, Int16, programmable sample AA patterns, etc. I was able to play Assassin's Creed with 4x FSAA and the performance impact was nonexistent since I'm CPU limited in that game, and it was very playable, but I didn't like the game much anyway; it's so sickeningly repetitive.

Better graphics and detail while improving performance is more immersive than a ghost feature which will not take off and will not change your gameplay. I do not think that being able to throw everything around or seeing dust everywhere will change my gameplay experience, especially in those multiplatform titles which rarely take advantage of PC hardware capability thanks to incompetent publishers/developers like Ubiport (Ubisoft).

But it is a check box feature also

UNLESS it becomes universally adopted - very very quickly - it is pretty meaningless as it will get *encompassed* in DX11 next year

i find it hard to recommend either brand based on having DX10.1 or not or PhysX or not. I think it will become completely clear next year - and neither "camp" should feel left out

i really haven't explored PhysX .. sigh .. i am sure it does not count running very slowly on an 8800GTX .. so i will probably be looking to get a GTX 260+ and see what i really think. That should be fine for the primary card and the 8800 can still be useful as a PhysX card.

i cannot say how impressed i will be or not. Not yet. i need experience. And i intend to find out for myself .. right around the Big Bang II drivers, i guess, is the timing to see if my HD4870x3 is "missing out" - or not



but yeah, to answer the OP, PhysX is worthless with ATi graphics and it will take some major shift in thinking by the big players to adopt it universally. Just like DX10.1 is worthless on Nvidia cards.

i just don't see either as a big deal. But then i haven't really seen DX10.1 on an AMD card either



 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: taltamir
yet there are so many games with physX and not ONE game with DX10.1
Assassin's Creed had it, but it was bugged and was just skipping the rendering of some things, as said on every review site, including Anand. And the end result was that Ubi disabled it and said it's not worthwhile to bother fixing.

You can call foul play until your tongue falls out, but the fact of the matter is, there is no such thing as a DX10.1 title.
Sure, you could blame conspiracies and the like, but game developers, the ones who know this stuff, chose not to bother with it. Not a single game. NOT ONE!

Even if you are right and it IS vastly superior on paper and the evil nvidia and stupid incompetent developers are the reason it doesn't exist in any games... well... so what? what does it give YOU, the customer, to pay extra for a part with a great feature that nobody is using or going to use due to "incompetence" and "bribes"?

Look ma, I bought this awesome future car, it runs on trash and produces fuel. Too bad the cops wont let me drive it cause the evil corpo-oilio-conspirico is making it illigalz.

Yeah, there are so many PhysX games, so can you name a AAA game using it? Can all those games use GPU-accelerated PhysX? The only rendering issue in AC was some of the dust on the floor, and it was solved with later revisions of CCC; when I played the game I never saw a single rendering issue or missing dust, actually it was full of it and was a bit bothersome. If it was really an issue, why do hardware reviewers keep using Assassin's Creed UNPATCHED? It was a CONSPIRACY, because nVidia doesn't have DX10.1 hardware and is stagnating technology and its evolution. There are games announced to use DX10.1; it's true that its adoption is moving slowly, but now that ATi has the performance crown, its adoption should move faster. Thanks to nVidia, they tried to sell high-end cards of over $600 (8800 Ultra, GTX 280), rehashing the same old GPU in newer products (G92 in the 8800GT and 9800GT/GTX) while being as fast as or even slower than the previous generation (8800GTX vs 9800GTX). It's true that the HD 4870 has the same features as its previous generation, but it is at least twice as fast. ATi is incompetent at promoting new technology, but I don't see myself buying a GTX 280 GPU with the same features as the almost 3 year old 8800GTX.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: taltamir
1. they are not marketing for nvidia.
2. there are many lists of titles for physX, including in this very thread.

Heh... yea, DVD won, and so did DX9... but when DX11 starts rolling around and a few titles for it come out that barely play on $2000+ machines, then DX10 games will be the norm...
You can already buy holographic disks that hold 300GB per disk.. only $18k for the drive (the size of a shuttle PC) and $180 per disk... they are gonna release an 800GB model soon, and a 1.6TB model of the disk is slated for 2010.
Labs have demonstrated disks with larger capacity, but they are not being sold to the public yet...

When a 300GB holodisk drive costs $1000 and disks are $10 each, Blu-ray will be the standard (since HD-DVD threw in the towel)...
http://www.inphase-technologie...cts/media.asp?subn=3_2

So what does that have to do with the topic? Well, you say the question is whether PhysX has any worth. It does. It improves the graphics and gameplay much, MUCH more than AA or some other techniques do.
1) Being part of a focus group has a lot to do with marketing.
2) The same list of old titles that don't benefit from having GPU PhysX. Did your FPS change in Mass Effect from using GPU PhysX? Do your regular UT3 maps benefit from having GPU PhysX? No they don't!

When DX11 comes, Vista will have support for it, unlike XP, which never got DX10 support; that's why DX9 won. Same goes for PhysX: developers want to make money $$$. Console titles are the big ticket for developers, and currently the 360 and PS3 aren't supporting GPU PhysX.

I'm not going to even talk about storage drives with you. The example I gave was on topic. It clearly showed that a physics war right now would not benefit anyone; not sure what you were trying for.

Regarding AC, at least it's a AAA title that benefits from having DX10.1. Bionic Commando will be out early 2009 and should hopefully answer some questions about GPU PhysX titles.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Define AAA title for me (it's triple A now? not just A titles?)... you throw that around a lot.
Mass Effect PhysX on GPU supposedly improves the visual effects (like object destruction); I did not test this myself.
UT3 is as AAA as it gets, but it does take a mod to give you PhysX, rather than making it mandatory to play. But it is available if you choose to use it on some maps.
GRAW is the first really impressive display of physX as far as I can tell...

Assassin's Creed looked the same, only it was 20% faster due to not rendering some stuff... I was not able to see the non-rendered water that was broken in that nvidia driver in Crysis either, but it was fixed because it is supposed to be rendered. The benefit to FPS in Assassin's Creed was not due to DX10.1.
If it looks the same, then they should disable whatever function it broke in DX10 as well, and get the FPS bonus without doing some useless, non-observable calculation.

And consoles are only "the big thing" according to some analysts who are full of it; gamers are still buying PC games. Consoles are at their worst ever. Back in the day a console could do things you could not do with a PC... nowadays, the video card is about to be replaced by arrays of in-order CPUs (Larrabee and Fusion), and consoles push "video playback" and "multimedia functionality" as features, both of which are neutered PC functions.

Anyways, PhysX is scalable and runs on both the CPU and GPU, so any title that uses it would get an FPS increase when it's done on the GPU. Generally speaking, for most of the titles listed by nvidia, the amount of effects is so small that GPU acceleration is not needed (aka, increasing FPS further above 60fps is meaningless since you are monitor limited), which is why I think most of the titles listed on the nvidia site are BS...
but people listed high budget, non indie titles here, which are modern. Both in development and already on the market.
(some people also listed a bit too many titles here, I am not saying every one of them fits this category)

EDIT: interesting, I am looking back and I can't find that list of titles in this thread, I just see a full copy of nvidia's list, that includes BS titles like "darkphysics demo" and "dragon shard" (CPU physX, no PPU/GPU stuff... aka, something that competes with havok, and has nothing to do with physX on GPU which has no competitor at the moment)
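
On the "runs on both the CPU and GPU" point above: as I remember the PhysX 2.x SDK (this is a rough sketch from memory, not code taken from any shipping title, so treat the exact names as approximate), a game asks for a hardware scene and falls back to software if no PPU/GPU acceleration is available, which is why the same title can scale across both:

#include "NxPhysics.h"  // PhysX 2.x SDK header

// Minimal sketch: ask for a hardware (PPU/GPU) scene, fall back to the CPU.
int main() {
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;  // runtime not installed / wrong version

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW;       // try hardware acceleration first
    NxScene* scene = sdk->createScene(sceneDesc);

    if (!scene) {
        sceneDesc.simType = NX_SIMULATION_SW;   // no PPU/GPU available: simulate on the CPU
        scene = sdk->createScene(sceneDesc);
    }

    // ... create actors, step the scene each frame, etc. ...

    NxReleasePhysicsSDK(sdk);
    return 0;
}

That fallback is why a title can list PhysX support without requiring an Nvidia card at all; without the hardware you just get the CPU path, with fewer effects or lower FPS.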
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
The golden nugget said within this thread was-

evolucion8:
"I will just shut up, and so you should, and stop spreading misinformation and a smokescreen to hide the fact that ATi has the fastest video card on the market, with the most features (DX10.1, 7.1 audio, etc), higher double precision performance, best Video Processor, and best performance per dollar on the market, otherwise we would still having $800 8800 Ultra or GTX 280."


Most of the fair-minded consumers here who lean towards the obvious winner (ATI) of this generation are being far too diplomatic. The truth of the matter is that DirectX will continue to dominate; it's not a gamble to have advanced DX hardware compliance.

It is a gamble to have "physx" support. The clear reasons why it won't mean anything have already been stated.
You're gaming on Microsoft's platform, they make the standards, not Nvidia.

DX10.1 support matters, and likely will in games that will be played for a decade or more into the future, such as Starcraft 2 and Diablo 3.

This is the truth of the matter. The ATI-leaning crowd here is far too diplomatic (as in avoiding the reality of the situation in favor of attempting to appear fair-minded), and the Nvidia crowd and marketers are smokescreening.
Calling it as it is, Nvidia's PhysX = Truform. DX10.1, DX11, and DX12 are all inevitable and not such an extreme moot point as proprietary PhysX support, even if DX11 eclipses DX10.1.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
DX10.1 support matters, and likely will in games that will be played for a decade or more into the future, such as Starcraft 2 and Diablo 3.
Yes... I intend to use my video card for the next DECADE... mmmm hmmm, no upgrading here...
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Obsoleet, why do folks say, "The fact of the matter", and then follow it up with BS? Smokescreening? Which do you think is more of a smokescreen? DX10.1 with a whopping 0 games supporting it? Or PhysX with 3 games having PhysX content?
Turn on your ceiling fan. I think there is some of your own smokescreen you have to work through. The only thing you said that was accurate, was that ATI had the fastest card on the market. They did a great job on that 4870X2. I give Kudos where it's due. Can you?

"You're gaming on Microsoft's platform, they make the standards, not Nvidia."

Exactly which standard of Microsoft is Nvidia going against? Are they going against Direct X? No. PhysX is not in direct competition with DX nor was it ever designed to be.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
In order of importance:
1. Going from a very low to a medium resolution
2. Going from medium to high resolution
3. Adding AF
4. Adding PhysX
5. Going from high to super high resolution
6. Adding AA

I don't agree... The problem is that you can't just create a list of importance like that. Each person has their own preferences, not to mention that many of those things should be considered 'standard'. Personally, AA is more important to me than AF, and AA is more important than going from high to super high resolution. Unless the resolution increase can produce the effect of AA (it can't, at least not substantially), going from high to super high is a waste, IMO. In other words, I'd gladly take 1920x1200 with 4xAA over 2560x1600 with no AA. But to each their own.

PhysX shouldn't even be in that list, because it is game specific... If the game does not call for advanced physics, I would hardly consider it important on my list. Especially if the game itself only requires basic physics that the CPU can handle just fine.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I'd agree with that, except we are not talking basic physics here. You have an 8800GT so you can test for yourself how complex the physics are, exactly how much is going on at one time, and how trying to run them on the CPU totally crushes framerates to single digits.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Me saying it makes it, by definition, my opinion.
Also by definition, other people will have an opinion that will differ from mine.
Neither means "I can't" create a list of importance... And by can't I hope you mean shouldn't, because I obviously was able to do so.
 

Klinky1984

Member
Nov 21, 2007
48
0
66
Both HW Physics & DX 10.1/11 are moot right now. In the future they will matter. For the current generation of games it's not going to make much difference. If I were going to buy a video card today I'd probably go with a 4850. I have an 8800GT right now and really haven't felt compelled to install the PhysX demos as it seems rather gimmicky right now. I do believe that HW Physics will be important at some point in the future and more solutions/support for it will help it become a staple of games. Without all video cards supporting it, it will be harder to gain acceptance. So I think it's great nVidia is launching forward with HW Physics on their 8+ series cards.

It sounds like ATi/AMD is cooking up something as well with Havok, but that is still a wait-and-see situation. At this point in time though, HW Physics is a bullet point on a list of features; it's nice to have but not really usable. When it does become a bigger deal I doubt my 8800GT is going to keep up and I'll need to invest in a new card anyways. So buying a card now based off of PhysX support alone would be pretty silly. :/
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yea... by the time those things become staples, you would need a new video card anyway because games would be too intensive.
I am sure my DX11 video card will need to run cube map arrays (DX10.1) and PhysX apps often enough. Not so much my current one.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Obsoleet, why do folks say, "The fact of the matter", and then follow it up with BS? Smokescreening? Which do you think is more of a smokescreen? DX10.1 with a whopping 0 games supporting it? Or PhysX with 3 games having PhysX content?

And does either of those credentials lend itself to saying "Purchase card A over card B"? Certain Nvidia fans are hyping PhysX as the next best thing to sliced bread. Right now, however, neither PhysX nor DirectX 10.1 is being utilized to any meaningful extent. You can point out the future titles that may or may not use PhysX in some manner as a reason to purchase an Nvidia card. In response, it could also be said that DirectX 10.1 titles are inevitably going to be released as a reason to purchase ATI cards instead. But until either system reaches a decent market penetration, they can be considered only a checkmark feature.


Originally posted by: keysplayr2003
Originally posted by: Obsoleet
You're gaming on Microsoft's platform, they make the standards, not Nvidia.
Exactly which standard of Microsoft is Nvidia going against? Are they going against Direct X? No. PhysX is not in direct competition with DX nor was it ever designed to be.

I believe he was referring to DX11. When DX11 is released, PhysX will be in direct competition with it, as DX11 will contain its own shader-based physics calculation feature. And what will happen to PhysX or Havok FX then? I would imagine that DX11 will become the dominant physics system and any proprietary hardware-based physics systems such as PhysX or Havok FX will disappear. No developer is going to want to have to code for two separate proprietary physics systems since any card that wishes to be DX11 compliant will have to allow DX11 physics as well.
 