GeForce GTX 295 previews


Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: evolucion8


You talk as if the RV770's power consumption were miles ahead of the GT200's

Where did I say that? Besides, the 4870 does not compete with the 280, and the 4850 does not compete with the 260. Not even close. So maybe those extra transistors are helping. Not to mention that if you clock a 2xx so it runs as hot as a 48xx, the performance gap gets even wider.

Then, when you look at things like video transcoding, physics, and Folding@home, the gap is light-years across.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
I agree with the cookie. If in a few handpicked games it can't win at high res/AA, claiming the performance crown will be tough. Likely things will change when the vapour turns into actual product.

Quote from the final page of Guru3D's preview:

"Also despite the fact we were limited to testing a handful of games, we internally of course did run the majority of benchmarks with other games already. And the performance widespread is consistent and the card worked with any game we threw at it."

I don't think it'll be tough at all.

At what resolution and AA?

This is a joke right?

Why would you think it would be different from their test methods and settings on the games they "did" show? Did they hint that they used different settings? No. I'd say it's a fair bet that they used the highest settings possible in every game they tested. Why wouldn't they, with cards as powerful as these?

 

nosfe

Senior member
Aug 8, 2007
424
0
0
Originally posted by: Wreckage

Then, when you look at things like video transcoding, physics, and Folding@home, the gap is light-years across.

And that's exactly why NVIDIA is losing market share right now: instead of focusing on gaming, they went with the jack-of-all-trades approach. If they had stayed focused on gaming, maybe the G200 die wouldn't be as big.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Not focused on gaming? Is that why they still have the most powerful single GPU available? And from the previews and internal testing, it looks like they have the most powerful multi-GPU card as well. Yeah, they must not focus much on gaming.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
You mean you weren't expecting more from the GTX 280 when it came out, considering its price and transistor count? Especially after the 4870 came out? I know I was.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: nosfe
You mean you weren't expecting more from the GTX 280 when it came out, considering its price and transistor count? Especially after the 4870 came out? I know I was.


Should I have? What was the launch price of the G80 8800GTX? How did the transistor count differ from the previous-gen G71 7900GTX? It went from 278 million to 681 million, and the 8800GTX launched at $649.

For gaming, it was pretty much what I expected: just about double the performance of last gen's high end, or just about equal to last gen's high end in SLI. Some people, like yourself, expected three times or better the performance of the previous-gen high end, which is pretty unrealistic.

The 4870 (RV770), by comparison, seems to be a very simple GPU. Geared towards gaming, but not much else. It's programmable to an extent to take advantage of its very high shader count, but it can't seem to keep up with even a 64-shader 9600GT in something like F@H. At stock clocks, a 9600GT pushes 3200 PPD, while an overclocked 800-shader 4850 chugs out about 2400 PPD. Don't even ask what a stock-clock GTX280 does (over 8000 PPD). G80 through GT200 are geared for gaming, and a whole lot more. To not be impressed by these GPUs is another conversation all on its own. And to keep on topic in this thread, I should get back to my testing.
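Quick back-of-the-envelope on those numbers (a hypothetical Python sketch; the PPD figures are the ballpark numbers quoted above, and the shader counts are each card's stock configuration):

```python
# Rough Folding@home points-per-day (PPD) per shader, using the
# ballpark figures quoted in this post, not measured values.
cards = {
    "9600GT (stock)":     {"shaders": 64,  "ppd": 3200},
    "4850 (overclocked)": {"shaders": 800, "ppd": 2400},
    "GTX280 (stock)":     {"shaders": 240, "ppd": 8000},
}

for name, c in cards.items():
    print(f"{name:<20} {c['ppd'] / c['shaders']:5.1f} PPD per shader")
```

By that rough measure, the gap really is lopsided: about 50 PPD per shader for the 9600GT versus about 3 for the 4850, with the GTX280 at roughly 33.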
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nosfe
You mean you weren't expecting more from the GTX 280 when it came out, considering its price and transistor count? Especially after the 4870 came out? I know I was.

Last I checked, this thread was about the GTX295, not the GTX280 (and the "what did we expect from the GTX280" topic was beaten to death last summer).

From what I've seen so far in personal use and the reviews, it looks to be an excellent addition to the high end.

Everyone wins here; I don't see how anyone can disparage this launch at all.

Has less 8X AA performance in some games? So what? It has more in others, a pretty much across-the-board advantage at 4XAA, and better transparency AA, and it offers 8X CSAA for higher performance than 8XMSAA and higher IQ than 4XMSAA.

Has no DX10.1? OK, but the competition has no stereoscopic 3D (rumored to launch 1/9 with support for 350+ games), no PhysX (at least two more full games launching in the next two months), and no CUDA (better transcoding, folding for a cure).

Not on a single PCB? Who cares? There are no aftermarket air coolers for the 4870X2 either, history has shown water-block support for both, and both use two slots.

A big fan of ATi who doesn't want NVIDIA to have market-leading parts? Oh well, the launch of this part will do something for you that the last few months haven't: make X2s cheaper to buy.

Last thing to consider: this card will offer users the ability to create and edit their own profiles (the competition doesn't), and more games launch with SLI support out of the box.

Any way you cut it, the launch of this product is good news for all of us. It also represents NVIDIA doing something a whole lot of you said could not be done: putting out a dual-GPU GT200 product. I'd say that's a testament to their engineering skill.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
If the architecture is so good, why haven't we seen an 8800GT version of the G200? If you think the G200 was a step in the right direction, are you expecting the G300 to be more of the same? Something with a die area of about 700mm² and 3 billion transistors? It's not sustainable; the G200 was already on the edge.

@nRollo
I wasn't talking about the GTX 280 so much as the G200 architecture as a whole.
Why are some people around here so quick to brand others as fanboys whenever they criticize one company?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: nosfe
If the architecture is so good, why haven't we seen an 8800GT version of the G200? If you think the G200 was a step in the right direction, are you expecting the G300 to be more of the same? Something with a die area of about 700mm² and 3 billion transistors? It's not sustainable; the G200 was already on the edge.

@nRollo
I wasn't talking about the GTX 280 so much as the G200 architecture as a whole.
Why are some people around here so quick to brand others as fanboys whenever they criticize one company?

It was so "on the edge" that they put two of them in one card. C'mon, nosfe, sit there and tell me that the architecture isn't a good one. And you are really trying to take this convo all over the place. Let's just concentrate on the topic at hand, shall we? If you wish to create a thread about the inferior G200 architecture, please feel free. I'll participate.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
They didn't put two of them in one card; they put two of them in one slot. And they didn't do it until they had a 55nm version, so my point still stands: the original 65nm version was on the edge of what's doable. If the GTX 295 were what you said, two GTX 280s on one card, then hats off, but it's not; it's not even two 55nm GTX 280s on two separate boards. That's not to say it's a bad card. It's a great card, depending on the price; not everyone lives in USA-land.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: nosfe
They didn't put two of them in one card; they put two of them in one slot. And they didn't do it until they had a 55nm version, so my point still stands: the original 65nm version was on the edge of what's doable. If the GTX 295 were what you said, two GTX 280s on one card, then hats off, but it's not; it's not even two 55nm GTX 280s on two separate boards. That's not to say it's a bad card. It's a great card, depending on the price; not everyone lives in USA-land.

You know as well as I do that the orientation of the GPUs really doesn't matter. It uses a single slot connection and is virtually identical in size to a GTX260/280. AND, they kept the wider 448-bit bus. But if you had a 4870X2, you could sit there all day and say, "My card might be slower, but it's on one PCB!!!" How exactly does this affect you or make your X2 superior in any way, mechanically, physically, or emotionally? It doesn't. Even if you say it does, it doesn't, because it can't.

And your "point" does not stand. Current events, my friend. Current events. With 55nm now available, the old 65nm arguments are moot. It's done and over.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nosfe
They didn't put two of them in one card; they put two of them in one slot. And they didn't do it until they had a 55nm version, so my point still stands: the original 65nm version was on the edge of what's doable. If the GTX 295 were what you said, two GTX 280s on one card, then hats off, but it's not; it's not even two 55nm GTX 280s on two separate boards. That's not to say it's a bad card. It's a great card, depending on the price; not everyone lives in USA-land.

As long as the cards fill the same number of slots and have similar aftermarket cooling, I don't get the whole single-PCB "argument".

Last I checked, performance, features, thermals, power consumption, image quality, and drivers are what we buy video cards for.

"Single PCB" is so far outside the realm of what matters it doesn't even warrant the effort to type the words.

Given the domination of the 9800GX2 over the 3870X2, and what looks to be a pretty much across-the-board sweep of the GTX295 over the 4870X2, maybe it's time for people to start asking, "Why doesn't ATi put their multi-GPU cards on two PCBs like NVIDIA, since NVIDIA's cards seem to lead in all the categories that matter?"

My $.02
 

nosfe

Senior member
Aug 8, 2007
424
0
0
What do I care about the 4870X2? Can't you stay on the subject? I'm talking about the G200 architecture. You started the "one card" thing; you said that NVIDIA can place two G200s on one card, but it's not on one card, and it's not a G200 either. Why is the G200 architecture suddenly moot? Because it's pointless to talk about a "past" architecture? Well, it's moot to talk about anything on this forum; it's not like we're changing the world here. So my point about the inefficiency of the G200 still stands.

I don't think NVIDIA was price-gouging us (not me, I don't have that kind of money) when the GTX280 came on the scene; I think it has more to do with the fact that the die was too big, so they had to ask that much for it. I hope NVIDIA will learn from this mistake and focus in the future on smaller dies, for those of us who can't afford $500 video cards. Going with smaller dies and multi-GPU solutions keeps the failure rate small, and that helps prices stay low. Sure, multi-GPU solutions aren't the best right now, but they'll evolve, because two of ATi's next-gen cards will probably (I don't know, just saying) beat out one big chip from NVIDIA, which will "force" NVIDIA to make its own multi-GPU "single card" solution. So whether NVIDIA likes it or not, multi-GPU is the future (whether it's SLI, CrossFire, or Lucid Hydra).
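Just to put toy numbers on the die-size point, here's a minimal sketch (hypothetical Python; the wafer cost and defect density are made-up round figures plugged into the standard exponential yield model, not real 65nm data):

```python
import math

# Toy model: cost per good die on a 300mm wafer, using the classic
# Poisson yield approximation Y = exp(-defect_density * die_area).
WAFER_COST = 5000.0        # $ per processed wafer (made up)
DEFECTS_PER_CM2 = 0.4      # defect density (made up)
WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2

def cost_per_good_die(die_area_mm2):
    gross_dies = WAFER_AREA_MM2 / die_area_mm2            # ignores edge loss
    yield_frac = math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100.0)
    return WAFER_COST / (gross_dies * yield_frac)

for area in (256, 576):    # roughly RV770-sized vs. GT200-sized dies
    print(f"{area} mm^2 die: ~${cost_per_good_die(area):.0f} per good die")
```

With those made-up inputs, the big die costs around eight times as much per good chip even though it's only 2.25 times the area, because yield falls off exponentially as the die grows.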

@nRollo
I wasn't talking about single cards vs. sandwich cards and which is better; it's just that when someone says "single card" in a discussion about architectures, you'd think they mean exactly that: a single card, not multiple cards. From an architecture point of view, making a single card is better than a sandwich of two (yes, it's useless for consumers, but most of the conversations on this forum are useless for them, and without them we'd be bored out of our skulls).

The reason NVIDIA's solutions are higher-performing is that a) they came second, so they knew what performance they had to hit to beat ATi's cards, and b) each of their single chips is stronger. If given the choice between a 9800GX2 made of two boards and one made of a single board, I'd say NVIDIA would choose the one-board approach, as it's usually cheaper to produce. The problem is that it takes more research, and it's impossible with a chip like the G200, because its 512-bit memory interface is too complex for the PCB.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nosfe
What do I care about the 4870X2? Can't you stay on the subject? I'm talking about the G200 architecture. You started the "one card" thing; you said that NVIDIA can place two G200s on one card, but it's not on one card, and it's not a G200 either.

It's difficult, and somewhat useless, to talk about a card without the context of how it performs against other products in the same price range or position in the competitor's hierarchy. What does it matter that the GTX295 can run a game at 60fps if that isn't taken in the context of a competing product doing it at 50fps? Or perhaps 70fps? Knowledge without comparison is of very little use.

Originally posted by: nosfe
Why is the G200 architecture suddenly moot? Because it's pointless to talk about a "past" architecture? Well, it's moot to talk about anything on this forum; it's not like we're changing the world here. So my point about the inefficiency of the G200 still stands.

However, to all of us, the performance-per-transistor line of debate means about as much as the day-to-day life of an amoeba on Venus. I don't count my transistors; I set the resolution, AA, and AF and see how fast it runs and what the IQ is. If I paid for cards, the cost of each would factor in as well.

I can honestly say that if NVIDIA figured out how to cram four GTX280 cards into one double-slot package and it performed 20% higher than a 4870X2 with similar thermals and power, I'd pay 20% more for that card.

I'd never once think about efficiency, what it cost NVIDIA, or how many PCBs are in there, because I'm a gamer, not a guy making design decisions for NVIDIA.

Originally posted by: nosfe
I don't think NVIDIA was price-gouging us (not me, I don't have that kind of money) when the GTX280 came on the scene; I think it has more to do with the fact that the die was too big, so they had to ask that much for it.
Obviously they weren't, as they lowered the price substantially and kept churning out sales.
Companies will make what margin they can, and those margins pay for the things people value about NVIDIA cards: TWIMTBP compatibility out of the box, better warranties, step-up programs, PhysX, CUDA, etc.

Originally posted by: nosfe
I hope NVIDIA will learn from this mistake and focus in the future on smaller dies, for those of us who can't afford $500 video cards.
That's sort of selfish, don't you think? I can afford $1000 video cards, and I want companies to give me the performance I can afford, not cater to the mainstream. Why don't people who can afford high-end tech deserve to be developed for?

Originally posted by: nosfe
Going with smaller dies and multi-GPU solutions keeps the failure rate small, and that helps prices stay low.
I'd rather pay for the lower yields and get higher-end parts. I don't want my options limited by others' budgets. I bought an 18-foot, 150hp boat last year; I didn't post on the boat forums, "I wish boat manufacturers would concentrate more on what I can afford and stop wasting their time on those inefficient 300hp monstrosities."

Originally posted by: nosfe
Sure, multi-GPU solutions aren't the best right now, but they'll evolve, because two of ATi's next-gen cards will probably (I don't know, just saying) beat out one big chip from NVIDIA, which will "force" NVIDIA to make its own multi-GPU "single card" solution. So whether NVIDIA likes it or not, multi-GPU is the future (whether it's SLI, CrossFire, or Lucid Hydra).

Lots and lots of people will never care about multi-GPU till it's as seamless as a single GPU. (I'm not one of them.)

Originally posted by: nosfe
@nRollo
I wasn't talking about single cards vs. sandwich cards and which is better; it's just that when someone says "single card" in a discussion about architectures, you'd think they mean exactly that: a single card, not multiple cards. From an architecture point of view, making a single card is better than a sandwich of two (yes, it's useless for consumers, but most of the conversations on this forum are useless for them, and without them we'd be bored out of our skulls).
How is a single card "better" if it doesn't offer better thermals, noise, power, performance, or image quality?

Originally posted by: nosfe
The reason NVIDIA's solutions are higher-performing is that a) they came second, so they knew what performance they had to hit to beat ATi's cards, and b) each of their single chips is stronger.
OK

Originally posted by: nosfe
If given the choice between a 9800GX2 made of two boards and one made of a single board, I'd say NVIDIA would choose the one-board approach, as it's usually cheaper to produce. The problem is that it takes more research, and it's impossible with a chip like the G200, because its 512-bit memory interface is too complex for the PCB.

The G200 isn't defined by its 512-bit interface.

 

nosfe

Senior member
Aug 8, 2007
424
0
0
True, at the end of the day, when you're looking to buy, performance is the important part. I was talking about performance per transistor in light of the GTX 280's high launch price. My point is that history has a habit of repeating itself, and I hope NVIDIA learned its lesson, will stop making huge chips, and will stop ignoring us little folk. It doesn't help the consumer when a company comes out with a $500+ graphics card and relies on older tech for the lower end. I think we can agree it would have been better if there had been something from NVIDIA based on the new technology (G200 in this case) in the sub-GTX 260 market. The problem is, if it takes three years to develop a new architecture, can they still change course in time? Or will we get another huge chip in NVIDIA's next generation and then wait who knows how long for cards based on that technology, for those of us who don't have the money or desire to spend $300+ on a video card?

Yes, they reduced the price, but at what cost? I don't know how much profit they still make on those cards, if any, given what the rumor mill says about the GTX 260. What I don't want to see is the company with the most money winning just because it can lower the price of its products. Let's not forget that most of NVIDIA's profit margin comes from the workstation market, where they dominate and where GTX 280-class hardware sells for $1k+.

I'm not saying NVIDIA shouldn't make high-end parts; I'm saying high-end parts would be cheaper if they were based on a couple of smaller dies instead of one huge chip, because smaller dies are cheaper to produce.

A single board is usually better because a sandwich adds another PCB and a duplicate of all the supporting circuitry, so it's more expensive to produce, which means it will be more expensive to buy.

The G200 isn't defined by its memory interface, but I think (don't know, just think) that it would be very hard to place two of them on one PCB, because the 512-bit interface would require too many board layers. That's why I said it would be impossible to make a single-board, dual-chip card out of the G200.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Nosfe
You seem very concerned about NVIDIA's profits and costs.

As an end user, I only care about things that actually make a difference to me.

I hope NVIDIA stays on the current path: offering the single-chip performance leader, and offering dual-chip performance leaders like the 295 where necessary.

The market will set the price, and we all win when better parts are available. People who can't afford them should buy cheaper parts, not hope companies can somehow magically sell better parts for less.

Hey, I wish I could drive a Ferrari to work every day. The answer is to learn how to do something that pays enough to buy a Ferrari, not ask Ferrari to focus on cheaper cars like Dodge.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
Actually, I'm more worried about AMD's profits and costs; they're the ones getting trashed.

I care about cost to manufacture because I think (maybe foolishly) that if a company can make products cheaper, it can sell them cheaper too. I don't have the money for the expensive stuff, so yes, I've learned to live with what I've got: no AA, PhysX, or other eye candy for me. The funny thing is that even when I find a game where I can turn AA/AF on, I hardly see any difference; I guess I'm just used to crappy graphics cards, which comes with the territory (my last video card upgrade was about four years ago).

It's weird, though, that it takes so much effort to make multi-GPU work. I mean, GPUs are basically a lot of processors wired in parallel, so why can't they make it work when they wire even more together?
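One rough way to see the difficulty (a toy Amdahl's-law sketch; the 10% serial fraction is just a guess for illustration): the shaders inside one GPU share a single memory and scheduler, but each extra GPU brings its own memory, so the work has to be split and synchronized, and whatever can't be split caps the scaling.

```python
# Amdahl's-law sketch of multi-GPU scaling: the serial fraction is the
# share of frame time that can't be split across GPUs (sync, transfers,
# CPU-side work). The 10% figure is an illustrative guess.
def speedup(n_gpus, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_gpus)

for n in (1, 2, 3, 4):
    print(f"{n} GPU(s): {speedup(n):.2f}x")
```

Even with only 10% of the work stuck in the non-splittable part, two GPUs give about 1.8x and four barely 3x.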
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nRollo


I hope NVIDIA stays on the current path: offering the single-chip performance leader, and offering dual-chip performance leaders like the 295 where necessary.

I agree. While cards like the 295 and 4870X2 are interesting, I still think a single-GPU solution that can handle just about everything is the best way to go. The dual-GPU cards are for the select few who run 30" monitors or can tell the difference between 8xAA and 16xAA (or at least think they can).

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: nosfe
Actually, I'm more worried about AMD's profits and costs; they're the ones getting trashed.

I care about cost to manufacture because I think (maybe foolishly) that if a company can make products cheaper, it can sell them cheaper too. I don't have the money for the expensive stuff, so yes, I've learned to live with what I've got: no AA, PhysX, or other eye candy for me. The funny thing is that even when I find a game where I can turn AA/AF on, I hardly see any difference; I guess I'm just used to crappy graphics cards, which comes with the territory (my last video card upgrade was about four years ago).

It's weird, though, that it takes so much effort to make multi-GPU work. I mean, GPUs are basically a lot of processors wired in parallel, so why can't they make it work when they wire even more together?

Actually, the RV770 moved AMD in the right direction. They used to lose over $300 million a quarter because of transition costs, debts, whatever, etc., and they lost only $125 million this last time, which is quite an accomplishment, especially for ATi, since it's currently the only profitable part of AMD. I also understand your point that smaller dies are cheaper for us. People here claim, "we're end users, we don't care how big the die is, we just care about cranking the FSAA," but the end user is the one who pays for bigger dies and higher production costs. I don't see how nVidia benefits from selling those huge GPUs at the current price, and if price isn't that important, why don't they sell the GTX 280/295 for less than $200? As a matter of fact, why haven't we seen a midrange or lower-end GT200-based part? It seems the complex GT200 architecture isn't scalable at all. nVidia's route is fat shaders, which need very little software optimization to work at their fullest, while AMD's "cheaper approach" is lots of simpler shaders, which require more software optimization, and that costs money. Which approach will be better in the future? Only time will tell...
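A toy sketch of that tradeoff (hypothetical Python; the unit counts and clocks are the commonly cited stock figures, while the packing ratios are pure assumptions for illustration):

```python
# "Fat shaders vs. many simple shaders": GT200 issues to 240 scalar
# units, while RV770 groups its 800 ALUs into 160 five-wide units, so
# the compiler must pack up to five independent ops per instruction.
# Fill rates below are illustrative assumptions, not measurements.
def gflops(units, width, clock_ghz, fill_rate):
    return units * width * fill_rate * 2 * clock_ghz  # 2 FLOPs per MAD

print(f"GT200, scalar code:        {gflops(240, 1, 1.296, 1.0):6.0f} GFLOPS")
print(f"RV770, well-packed code:   {gflops(160, 5, 0.750, 0.8):6.0f} GFLOPS")
print(f"RV770, poorly packed code: {gflops(160, 5, 0.750, 0.4):6.0f} GFLOPS")
```

On paper the RV770 has almost double the peak throughput, but if the compiler fills only two of the five slots per unit, it drops below the GT200, which is exactly the software-optimization dependence described above.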
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Let me just say I can afford the expensive stuff. However, with a new daughter born 12/10/08, I want to save some of my money for more important things, like my weekly $500 football money (+$5k for the year). Now, my younger bro bought a 4870 X2, but if the GTX 295 had been out at the same time at the same price point, he would have bought it instead. He came from a pair of 8800 GTXs in SLI. I personally don't game as much anymore, so my needs are a bit simpler.

The 295 seems to be a solid update to the nVidia line of video cards. A 20% performance boost over its main competitor is, in most cases, nothing to sneeze at. For a beta product with likely immature drivers, this is darned good. I have no reason to believe the roughly 20% boost doesn't hold (mostly) across the board, barring the few games that run better on one platform or the other.

With all that said, the main issue is running at high settings and higher resolutions, where performance seems to even out a lot between the 295 and the 4870 X2. Whether this is CPU-related or video-memory-bandwidth-related is irrelevant. At lower resolutions such as 1680 x 1050, the 20% boost is very, very good. However, when the 4870 X2 is already averaging well over 60 FPS and the minimum FPS is in the upper 50s, I have to wonder whether that has any real effect on gaming.

Some people have better eyesight than others, but I have to wonder whether most people will notice any difference if the minimum FPS is at or close to 60. At 2560 x 1600, performance seems to level off against the 4870 X2, again probably due to video memory bandwidth. The bottom line is performance, though, and the 4870 X2 is competitive at those levels.
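A quick frame-time calculation shows why (hypothetical Python; the 20% figure is the boost discussed above):

```python
# How many milliseconds a 20% fps boost actually saves per frame. The
# gain shrinks as the baseline rises, and past a 60Hz display's refresh
# the extra frames may never even be shown.
def frame_time_ms(fps):
    return 1000.0 / fps

for base_fps in (30, 60, 90):
    boosted = base_fps * 1.2
    saved = frame_time_ms(base_fps) - frame_time_ms(boosted)
    print(f"{base_fps:3d} -> {boosted:5.1f} fps: {saved:4.1f} ms saved per frame")
```

The same 20% that saves 5.6 ms per frame at 30 fps saves only 2.8 ms at 60 fps, so the lead matters less once both cards are already past the display's refresh rate.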

If I were in the market in January for a $500 video card, it'd be the 295, no question about it. nVidia seems to have a winner here. The ball is now in ATI's court, and we'll see what they can do with it. ATI needs a solid update to their flagship 4870 X2 that can either level the playing field or, better yet, surpass the 295. If ATI really wants to make a serious dent in nVidia, they need to come out with better performance than the 295.

The 295 is still not released, and there is plenty of time to tweak performance, so we can expect at minimum what was seen in the previews, and likely slightly better as things get tweaked. There are games where the 295 kills the 4870 X2, and that gives it a big edge. The edge is blunted by the 4870 X2 performing very well (for the most part) at lower resolutions and the 295 performing "poorly" at higher resolutions. And that gives ATI a chance, especially if they can release a decent successor to the 4870 X2.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
The impression I got from that AT article about the RV770's design was that ATi stopped caring about the "halo effect," so I doubt they'll come up with anything to beat the GTX 295.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: evolucion8
Actually, the RV770 moved AMD in the right direction. They used to lose over $300 million a quarter because of transition costs, debts, whatever, etc., and they lost only $125 million this last time, which is quite an accomplishment, especially for ATi, since it's currently the only profitable part of AMD. I also understand your point that smaller dies are cheaper for us. People here claim, "we're end users, we don't care how big the die is, we just care about cranking the FSAA," but the end user is the one who pays for bigger dies and higher production costs. I don't see how nVidia benefits from selling those huge GPUs at the current price, and if price isn't that important, why don't they sell the GTX 280/295 for less than $200? As a matter of fact, why haven't we seen a midrange or lower-end GT200-based part? It seems the complex GT200 architecture isn't scalable at all. nVidia's route is fat shaders, which need very little software optimization to work at their fullest, while AMD's "cheaper approach" is lots of simpler shaders, which require more software optimization, and that costs money. Which approach will be better in the future? Only time will tell...

It isn't the price of production people need to worry about (not sure why we would in the first place), but the price of the final retail product. Sure, if the competition were stale your point might make sense, but in this case it's moot. Unless you're an investor worried about the financial side of nVIDIA, I just see no reason why anyone would even care about it.

nVIDIA doesn't need a mid-range GT200 solution yet. That segment is perfectly filled by the 9800GTX+, 9800GTX, and 9800GT, and further down, the 9600GT. Most of the time, the mid-range part comes about 9 months after the high-end release of that generation. I'm guessing it's the 40nm GT212 that will start replacing the G92b derivatives in Q1 or Q2 '09.

Edit: what ATi did was a risky move, probably considered a miracle in the tech world. I mean, a delay or shortage of GDDR5 modules, for example, would have hurt the products pretty badly.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
I dunno; when I googled "manufacturing cost of the GT200 die," I found an article citing $100-110 per die (and a ton of other articles that linked to that one).
 