Nvidia are on a roll


Gstanford

Junior Member
Nov 12, 2006
3
0
0
Originally posted by: Creig
Originally posted by: Gstanfor
Do you think microsoft comes up with directx ideas sitting in a vacuum? They don't - they go ask the graphics companies what they can realistically bring to market and base Direct3D around that. DirectX7 & DirectX8.0 were nvidia led all the way.
Ati were furious at the time that nvidia's shader technology was used pretty much verbatim, with no input from them - that is why DirectX8.1 came about with pixel shader 1.4 to appease them.

And guess which graphics company led the development of DX9 and DX10? I'll give you a hint.... It wasn't Nvidia. In fact, Nvidia wasn't even CONSULTED when the DX9 specifications were drawn up. And, obviously, ATI has worked closely with Microsoft on DX10.

Oh really?...

Care to explain why DX9 has partial precision then?

Who developed HLSL for m$? (hint: it wasn't ATi...)

Care to explain why G80 was used as the reference GPU for developing and certifying DirectX 10?
 

Gstanford

Junior Member
Nov 12, 2006
3
0
0
Originally posted by: Fox5
Originally posted by: Gstanfor
Of course features get licenced to microsoft fox5. Look no further than S3's S3TC - licenced as DXTC in DirectX (and as such able to be used by anyone developing a DX GPU).

I don't believe that was my comment.

What I'm questioning is whether nvidia licensed their pixel shader tech to Microsoft. Can you pull up any documents showing the licensing agreement; it's pretty easy to find documents on S3TC being licensed.

Then why didn't we see the fancy new AA mode in NV3x or NV4x? nvidia owns all of 3dfx's IP...

Huh? NV3x and NV4x did implement MSAA (and introduced new variations of it from the previous generations), but they did not implement any 3dfx 3d tech, there's no T-buffer or M-buffer in either product line.

You misread what I was saying. You said 3dfx were supposedly working on different AA (frankly I don't think they worked on a thing post VSA-100). If that were so, nvidia would have at some point implemented said new AA, especially around the time of NV3x/NV4x when they were under AA pressure from ATi. They certainly didn't have time to put it into GF3 (GF3 launched 3 months after nvidia purchased 3dfx's IP).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanford
Originally posted by: Creig
Originally posted by: Gstanfor
Do you think microsoft comes up with directx ideas sitting in a vacuum? They don't - they go ask the graphics companies what they can realistically bring to market and base Direct3D around that. DirectX7 & DirectX8.0 were nvidia led all the way.
Ati were furious at the time that nvidia's shader technology was used pretty much verbatim, with no input from them - that is why DirectX8.1 came about with pixel shader 1.4 to appease them.

And guess which graphics company led the development of DX9 and DX10? I'll give you a hint.... It wasn't Nvidia. In fact, Nvidia wasn't even CONSULTED when the DX9 specifications were drawn up. And, obviously, ATI has worked closely with Microsoft on DX10.

Oh really?...

Care to explain why DX9 has partial precision then?

Who developed HLSL for m$? (hint: it wasn't ATi...)

Care to explain why G80 was used as the reference GPU for developing and certifying DirectX 10?

Yes, oh really. What, do you think throwing out a few irrelevant questions is going to somehow change the facts?

Nice try.


DX9

Shortcomings of FX series
At this point NVIDIA's market position looked unassailable, and industry observers began to refer to NVIDIA as the Intel of the graphics industry. However while the next generation FX chips were being developed, many of NVIDIA's best engineers were working on the Xbox contract, developing the SoundStorm audio chip, and a motherboard solution.

It is also worth noting Microsoft paid NVIDIA for the chips themselves, and the contract did not allow for falling manufacturing costs, as process technology improved. Microsoft eventually realized its mistake, but NVIDIA refused to renegotiate the terms of the contract. As a result, NVIDIA and Microsoft relations, which had previously been very good, deteriorated. NVIDIA was not consulted when the DirectX 9 specification was drawn up. Apparently as a result, ATI designed the Radeon 9700 to fit the DirectX specifications. Rendering color support was limited to 24-bits floating point, and shader performance had been emphasized throughout development, since this was to be the main focus of DirectX 9. The Shader compiler was also built using the Radeon 9700 as the base card.

http://en.wikipedia.org/wiki/Nvidia
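
For readers wondering what the "partial precision" jab above actually refers to: DX9's Shader Model 2.0 lets a shader hint that reduced precision is acceptable, which mattered because NV3x ran FP16 much faster than FP32, while R300 computed everything at FP24. A rough sketch of why the precision gap is visible, using Python/numpy float16 and float32 as stand-ins for the hardware formats (illustrative only, not how a real shader compiler works):

import numpy as np

def accumulate(dtype, steps=10000, step=1e-4):
    # Sum a small value many times at the given precision, the way a long
    # shader chain accumulates rounding error when run at reduced precision.
    acc = dtype(0)
    for _ in range(steps):
        acc = dtype(acc + dtype(step))
    return float(acc)

full = accumulate(np.float32)     # "full precision" stand-in, lands near 1.0
partial = accumulate(np.float16)  # "partial precision" stand-in, stalls well short of 1.0
print(full, partial, abs(full - partial))

The numbers are contrived, but they show why developers were given an explicit opt-in (the _pp hint) rather than having the runtime silently drop precision.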


DX10
Unified Architecture

Now here's where ATI step in. They have worked closely with Microsoft on this version of their DirectX. The API is very much targeted towards a unified architecture and way of working.

http://www.overclock3d.net/articles.php...page=1&desc=ati_talks_about_directx_10

ATI is working closely with Microsoft to make sure the DirectX 10 API and their GPU programmability is accessible to game developers.

http://enthusiast.hardocp.com/article.html?art=MTA0NSw0LCxoZW50aHVzaWFzdA==
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gstanford
Originally posted by: Creig
Originally posted by: Gstanfor
Do you think microsoft comes up with directx ideas sitting in a vacuum? They don't - they go ask the graphics companies what they can realistically bring to market and base Direct3D around that. DirectX7 & DirectX8.0 were nvidia led all the way.
Ati were furious at the time that nvidia's shader technology was used pretty much verbatim, with no input from them - that is why DirectX8.1 came about with pixel shader 1.4 to appease them.

And guess which graphics company led the development of DX9 and DX10? I'll give you a hint.... It wasn't Nvidia. In fact, Nvidia wasn't even CONSULTED when the DX9 specifications were drawn up. And, obviously, ATI has worked closely with Microsoft on DX10.

Oh really?...

Care to explain why DX9 has partial precision then?

Who developed HLSL for m$? (hint: it wasn't ATi...)

Care to explain why G80 was used as the reference GPU for developing and certifying DirectX 10?
Dude, did you make another account? You used to be "Gstanfor" with 2516 posts. Now you're "Gstanford" with a Junior Member level of 3 posts. What's the deal?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: coldpower27
Originally posted by: Matt2
I'd be wary of waiting for R600 because ATI's probably gonna release R620 (or whatever it is) 7 weeks after R600

Doubtful; typically graphics card companies don't release a new GPU 2-3 months later unless the prior one was already late and the latter one is on time/early.

R600 is supposed to come in around mid-March-ish, if ATI's past high-end launches from the year 2000 onwards are to be used as a guideline.

Considering the timeline I keep hearing is February, I think they are on time for R600.

As well, considering when R600 is arriving, it should be mildly faster than the GeForce 8800 GTX. I don't expect anything higher than 10-20% at the most; more would be icing on the cake.

10-20% faster than an 8800 GTX is still a MAJOR improvement on ATI's chips. The only problem is that it won't look like as big of a jump in GPU tech because Nvidia has already stolen the thunder.

I wonder how hard it would be for Nvidia to put a 512-bit bus and 1GB of GDDR4 onto the 8800 GTX? I wouldn't think that would be too hard for them to manage since G80 is supposed to support them. Perhaps that would be a more likely outcome for the new G80 derivative rumored to come out in Feb?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: josh6079
Dude, did you make another account? You used to be "Gstanfor" with 2516 posts. Now you're "Gstanford" with a Junior Member level of 3 posts. What's the deal?

It's proof Gstanfor and Rollo are the same person! See!

"Gstanford" & Son
"Rollo"

It's proof I tell you! PROOF!!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: josh6079
Originally posted by: Gstanford
Originally posted by: Creig
Originally posted by: Gstanfor
Do you think microsoft comes up with directx ideas sitting in a vacuum? They don't - they go ask the graphics companies what they can realistically bring to market and base Direct3D around that. DirectX7 & DirectX8.0 were nvidia led all the way.
Ati were furious at the time that nvidia's shader technology was used pretty much verbatim, with no input from them - that is why DirectX8.1 came about with pixel shader 1.4 to appease them.

And guess which graphics company led the development of DX9 and DX10? I'll give you a hint.... It wasn't Nvidia. In fact, Nvidia wasn't even CONSULTED when the DX9 specifications were drawn up. And, obviously, ATI has worked closely with Microsoft on DX10.

Oh really?...

Care to explain why DX9 has partial precision then?

Who developed HLSL for m$? (hint: it wasn't ATi...)

Care to explain why G80 was used as the reference GPU for developing and certifying DirectX 10?
Dude, did you make another account? You used to be "Gstanfor" with 2516 posts. Now you're "Gstanford" with a Junior Member level of 3 posts. What's the deal?

I'm back. password/email issues (I swear this forum must be water wheel powered) & the gstanford account is no more.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Nvidia was consulted extensively during DX9 development, until the DirectX chief left for ATi and the specification suddenly changed, very late in the day, from what it had been.

The B3D forums used to be full of ATi engineers crowing about R400/R600. That abruptly stopped a year and a bit ago. I wonder why?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: redbox
Originally posted by: Wreckage
Originally posted by: redbox

QFT! I don't know where people are getting the idea that Nvidia is going to be bringing out the super version of the G80 around the time of R600. To me it just seems like wishful thinking.

It could also be said that it's wishful thinking that the R600 will top the current G80. NVIDIA may have blindsided them this time and they will not have what it takes. I have no idea, but several people around here talk like they know the future with nothing to back it up.

As far as I'm concerned the R600 is just a rumor and will remain such until there is product on the shelf.

Wait so you are saying you don't even believe that there is an R600 coming at all? Wow...just wow.

No, he's saying there is no R600 until it actually launches.

And of course nvidia would attempt to launch another card at the same time R600 launches - it's called "nailing the coffin shut" (it happened to 3dfx on a larger time scale with nv10 & nv15).
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Gstanfor
Originally posted by: redbox
Originally posted by: Wreckage
Originally posted by: redbox

QFT! I don't know where people are getting the idea that Nvidia is going to be bringing out the super version of the G80 around the time of R600. To me it just seems like wishful thinking.

It could also be said that it's wishful thinking that the R600 will top the current G80. NVIDIA may have blindsided them this time and they will not have what it takes. I have no idea, but several people around here talk like they know the future with nothing to back it up.

As far as I'm concerned the R600 is just a rumor and will remain such until there is product on the shelf.

Wait so you are saying you don't even believe that there is an R600 coming at all? Wow...just wow.

No, he's saying there is no R600 until it actually launches.

And of course nvidia would attempt to launch another card at the same time R600 launches - it's called "nailing the coffin shut" (it happened to 3dfx on a larger time scale with nv10 & nv15).

Hmmm, that didn't seem to keep some people from talking about the G80 even though it wasn't in retail yet. If you're going to take that road, then this super "nail driving" G80 doesn't exist either until it actually launches. You can't have it both ways. I do believe Nvidia will release a new GPU around the time R600 launches - call it nailing the coffin shut or covering your rear, it's all the same in the end. What I don't believe is that it will be a GPU more powerful than the 8800 GTX. I could be wrong though; that's what's great about speculating. If Nvidia tops the 8800 GTX this coming Feb, then that's just another feather in their cap.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: redbox
Originally posted by: coldpower27
Originally posted by: Matt2
I'd be wary of waiting for R600 because ATI's probably gonna release R620 (or whatever it is) 7 weeks after R600

Doubtful; typically graphics card companies don't release a new GPU 2-3 months later unless the prior one was already late and the latter one is on time/early.

R600 is supposed to come in around mid-March-ish, if ATI's past high-end launches from the year 2000 onwards are to be used as a guideline.

Considering the timeline I keep hearing is February, I think they are on time for R600.

As well, considering when R600 is arriving, it should be mildly faster than the GeForce 8800 GTX. I don't expect anything higher than 10-20% at the most; more would be icing on the cake.

10-20% faster than an 8800 GTX is still a MAJOR improvement on ATI's chips. The only problem is that it won't look like as big of a jump in GPU tech because Nvidia has already stolen the thunder.

I wonder how hard it would be for Nvidia to put a 512-bit bus and 1GB of GDDR4 onto the 8800 GTX? I wouldn't think that would be too hard for them to manage since G80 is supposed to support them. Perhaps that would be a more likely outcome for the new G80 derivative rumored to come out in Feb?

That is one of the issues with launching later than your competition: you're second, and a lot of the hoopla has died down already. Since you have also had more time, expectations are greater.

I am not sure Nvidia should be going to 512-bit this generation. I know the current die size can easily support it; we're talking a current die in the 480-500mm2 range. However, Nvidia will want to shrink this die in the future to more economical levels. You see, there is a reason why we don't see 256-bit cards paired with any die smaller than ~200mm2 or so.

I am also uncertain whether it's necessary. Is G80 really bandwidth limited? I think it would be more prudent to switch over to GDDR4 and keep the 384-bit interface than to increase complexity further with a 512-bit memory interface, which would necessitate the use of 16 memory chips. How much performance would it gain? I think increasing shader power is more important than bandwidth after the massive leap we have had with the 256-bit to 384-bit transition.

I currently seriously doubt Nvidia will release a 512-bit card in February, or for that matter any high-end release that supersedes the 8800 GTX at that time. The next time we should see something from them is around April, if you're going by the average time frame. Maybe the GeForce 8800 GT will launch then and Nvidia will lower their pricing?

I would be really impressed if R600 did indeed have a 512-bit external memory interface. Not that I feel that is the limiting factor, mind you; I am more interested in what the shader power is than in the memory bandwidth. A 512-bit interface won't guarantee a win.
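
For what it's worth, the bandwidth side of that trade-off is easy to put rough numbers on. A back-of-envelope sketch in Python (the shipping 8800 GTX figures are public; the GDDR4 and 512-bit clocks below are assumptions for illustration, not leaked specs):

def bandwidth_gbs(bus_width_bits, effective_mhz):
    # Theoretical peak = bus width in bytes * effective (DDR) data rate.
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(384, 1800))  # shipping 8800 GTX: 384-bit, 900 MHz GDDR3 -> ~86.4 GB/s
print(bandwidth_gbs(384, 2200))  # hypothetical 384-bit + faster GDDR4       -> ~105.6 GB/s
print(bandwidth_gbs(512, 1800))  # hypothetical 512-bit + same GDDR3         -> ~115.2 GB/s

Either route buys a healthy chunk of bandwidth; the GDDR4 route does it without the extra board complexity and the 16 memory chips a 512-bit bus needs.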
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanfor
Nvidia was consulted extensively during DX9 development, until the DirectX chief left for ATi and the specification suddenly changed, very late in the day, from what it had been.

Wrong. AGAIN.

Microsoft and DX9

NVIDIA was instrumental in the development of the DX7 and DX8 spec, and their initial products always fell in line with these specs. This changed with DX9. When Microsoft first convened many of the graphics players throughout the world to develop the DX9 specification, one player was absent. NVIDIA did not rejoin the talks in developing DX9 until quite some time after much of the groundwork for DX9 was laid.

http://www.penstarsys.com/editor/tech/graphics/nv_ati_dx9/index.html
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: redbox

Hmmm that didn't seam to keep some people from talking about the G80 even though it wasn't in retail yet. If your going to take that road then this super "nail driving" G80 doesn't exist either untill it actually launches. You can't have it both ways. I do believe Nvidia will release a new GPU around the time R600 launches -call it nailing the coffin shut, or covering your rear it's all the same in the end. What I don't believe is that it will be a GPU more powerful than the 8800gtx. I could be wrong though, that's what's great about speculating. If Nvidia top the 8800gtx this coming Feb then that's just another feather in their cap.

I considered the rumors regarding the G80 to be BS (and they were). I am giving the same consideration to R600 rumors. I never said I knew what was coming out, yet several people seem to have an R600 in their computer and know exactly how well it will perform.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Creig
Originally posted by: Gstanfor
Nvidia was consulted extensively during DX9 development, until the DirectX chief left for ATi and the specification suddenly changed, very late in the day, from what it had been.

Wrong. AGAIN.

Microsoft and DX9

NVIDIA was instrumental in the development of the DX7 and DX8 spec, and their initial products always fell in line with these specs. This changed with DX9. When Microsoft first convened many of the graphics players throughout the world to develop the DX9 specification, one player was absent. NVIDIA did not rejoin the talks in developing DX9 until quite some time after much of the groundwork for DX9 was laid.

http://www.penstarsys.com/editor/tech/graphics/nv_ati_dx9/index.html


Wasn't that around the time Nvidia tried to create their own Cg programming standard or something like that? Microsoft was going along with it, and I remember reading how they changed their collective minds and pursued the full DX9 spec, kind of leaving Nvidia out in the cold. That's probably why NV30 was not as good at DX9 performance.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I certainly would not take everything online and linkable as gospel truth if I were you. It seems to me that HLSL is a pretty fundamental part of DirectX9 (without it shaders would not be especially useful).

You do realize Josh helps out at the Inquirer nowadays, don't you?

And redbox, try reading what I wrote again. I stated "No, he's saying there is no R600 until it actually launches," with "he's" meaning Wreckage. I didn't say I agreed with him, although personally a product doesn't have much significance to me until it launches. That is a lesson everyone should have learned from nv30.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Gstanford
Originally posted by: Fox5
Originally posted by: Gstanfor
Of course features get licenced to microsoft fox5. Look no further than S3's S3TC - licenced as DXTC in DirectX (and as such able to be used by anyone developing a DX GPU).

I don't believe that was my comment.

What I'm questioning is whether nvidia licensed their pixel shader tech to Microsoft. Can you pull up any documents showing the licensing agreement; it's pretty easy to find documents on S3TC being licensed.

Then why didn't we see the fancy new AA mode in NV3x or NV4x? nvidia owns all of 3dfx's IP...

Huh? NV3x and NV4x did implement MSAA (and introduced new variations of it from the previous generations), but they did not implement any 3dfx 3d tech, there's no T-buffer or M-buffer in either product line.

You misread what I was saying. You said 3dfx were supposedly working on different AA (frankly I don't think they worked on a thing post VSA-100). If that were so, nvidia would have at some point implemented said new AA, especially around the time of NV3x/NV4x when they were under AA pressure from ATi. They certainly didn't have time to put it into GF3 (GF3 launched 3 months after nvidia purchased 3dfx's IP).

I think you're misreading what I'm saying.

The new form of FSAA 3dfx was working on was MSAA! The supersampling T-buffer of the VSA-100 was to be replaced with a multisampling M-buffer in Rampage. Rampage was a finished chip, and it did use MSAA.
Nvidia did implement MSAA as well starting with the geforce 3 line, and introduced new forms of it with every card since then, offering substantial increases in quality over what the geforce 3 had. MSAA, like most modern 3d techniques, was not invented by any of the bad boys we know now, not 3dfx, ati, nor nvidia. ATI was actually the last one to the party with an MSAA implementation.
I cannot say if nvidia ever used 3dfx's msaa implementation, but I doubt it, since I believe they use their pixel shaders to accomplish what the m-buffer would have, though it's possible they use the same algorithm.
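
The practical difference being argued about here is shading cost: supersampling (the VSA-100/T-buffer style) runs the pixel pipeline once per sample, while multisampling shades once per pixel and only keeps per-sample coverage/depth for the edges. A crude cost sketch in Python (the resolution and sample count are just illustrative numbers):

def shader_invocations(width, height, samples, mode):
    pixels = width * height
    if mode == "ssaa":   # supersampling: full shading work for every sub-sample
        return pixels * samples
    if mode == "msaa":   # multisampling: shade once per pixel, resolve samples only at edges
        return pixels
    raise ValueError("unknown AA mode: " + mode)

print(shader_invocations(1600, 1200, 4, "ssaa"))  # 7,680,000 shaded samples
print(shader_invocations(1600, 1200, 4, "msaa"))  # 1,920,000 shaded pixels

Which is the main reason multisampling became the default approach on later hardware, whoever first shipped it.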
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Gstanfor
I certainly would not take everything online and linkable as gospel truth if I were you. It seems to me that HLSL is a pretty fundamental part of DirectX9 (without it shaders would not be especially useful).

You do realize Josh helps out at the Inquirer nowadays, don't you?

And redbox, try reading what I wrote again. I stated "No, he's saying there is no R600 until it actually launches," with "he's" meaning Wreckage. I didn't say I agreed with him, although personally a product doesn't have much significance to me until it launches. That is a lesson everyone should have learned from nv30.

So now you speak for Wreckage? Not only that, but you should practice what you preach. You say "personally a product doesn't have much significance to you until it launches", but you were still very active in discussions over G80 before it launched.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Yes, redbox, I was active in G80 discussions, but I think you'll find I was quite conservative about how it would turn out compared to everyone else, who were flying off on crazy tangents.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Fox5
Originally posted by: Gstanford
Originally posted by: Fox5
Originally posted by: Gstanfor
Of course features get licenced to microsoft fox5. Look no further than S3's S3TC - licenced as DXTC in DirectX (and as such able to be used by anyone developing a DX GPU).

I don't believe that was my comment.

What I'm questioning is whether nvidia licensed their pixel shader tech to Microsoft. Can you pull up any documents showing the licensing agreement; it's pretty easy to find documents on S3TC being licensed.

Then why didn't we see the fancy new AA mode in NV3x or NV4x? nvidia owns all of 3dfx's IP...

Huh? NV3x and NV4x did implement MSAA (and introduced new variations of it from the previous generations), but they did not implement any 3dfx 3d tech, there's no T-buffer or M-buffer in either product line.

You misread what I was saying. You said 3dfx were supposedly working on different AA (frankly I don't think they worked on a thing post VSA-100). If that were so, nvidia would have at some point implemented said new AA, especially around the time of NV3x/NV4x when they were under AA pressure from ATi. They certainly didn't have time to put it into GF3 (GF3 launched 3 months after nvidia purchased 3dfx's IP).

I think you're misreading what I'm saying.

The new form of FSAA 3dfx was working on was MSAA! The supersampling T-buffer of the VSA-100 was to be replaced with a multisampling M-buffer in Rampage. Rampage was a finished chip, and it did use MSAA.
Nvidia did implement MSAA as well starting with the geforce 3 line, and introduced new forms of it with every card since then, offering substantial increases in quality over what the geforce 3 had. MSAA, like most modern 3d techniques, was not invented by any of the bad boys we know now, not 3dfx, ati, nor nvidia. ATI was actually the last one to the party with an MSAA implementation.
I cannot say if nvidia ever used 3dfx's msaa implementation, but I doubt it, since I believe they use their pixel shaders to accomplish what the m-buffer would have, though it's possible they use the same algorithm.

I'll rush out and buy a Rampage tomorrow, fox5! Oh wait.... it was NEVER RELEASED!!!! and outside of engineering prototypes it doesn't exist.

As for "everybody" using MSAA: S3 doesn't use it despite some DeltaChromes theoretically being capable (like all things S3, more bugs than a termite colony), and Matrox does not use it -- only nvidia and ATi use it, and ATi not until R300 (SiS and XGi *may* use it, but, frankly, who cares???).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanfor
I certainly would not take everything online and linkable as gospel truth if I were you.

Oh, I definitely don't. But when it comes right down to it, at least I have SOME form of proof to back up my statements. You can't do this because you distort everything to fit your pro-Nv/Anti-ATI agenda.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Creig
Only in your little world, Greg.

Don't you worry about "my little world" creig (which I happen to share with the "silent majority" of PC gamers) - it's quite plush and comfortable.

ATi fans can only wish for such comforts. I expect their world is akin to being stuck in the Sahara desert at mid-day in the middle of a sand storm at the moment...
 