8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing:
In Company of Heroes the 3060 has a higher minimum framerate than the 3070 Ti:

10GB / 12GB

Reasons why still shipping 8GB since 2014 isn't NV's fault.
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Ranulf

Platinum Member
Jul 18, 2001
2,379
1,257
136
I've probably said this before, but all this makes the GTX 960-era debates over 2GB vs 4GB and 6GB vs 8GB seem quaint. It is one thing to argue over the price/perf of the GTX 960 at $200 MSRP and the 960 4GB variant at $230 vs, say, the AMD R9 290 4GB cards at $250 (or less). Now we're debating RAM size again, yet it's over $500-800 cards. Actually $900 cards, since the 4070 Ti was originally the 4080 12GB for $900. At least with the GTX 1060 6GB it was a harsh reality that the 960 was replaced within 1.5 years by a card with 1.5-3x more VRAM, but we were only talking about $200-250. With the Super variants since Turing there is no stability or longevity. You pays your money and takes your chances.

Saying your $800 4070 Ti will be a 1080p card before long, but that lower-end cards with 12-16GB of VRAM will not survive, correct settings or otherwise, is not the win you think it is. At least my 2060 Super 4 years ago was about $315 with incentives, and my 6700 XT last summer was $300 with its free FSR-exclusive game (valued at $100 MSRP).
 

Aapje

Golden Member
Mar 21, 2022
1,418
1,927
106
Indeed. If UE5 kills the longevity of even $800 cards, then it makes the most sense to wait out this transition with a relatively cheap card. After all, UE5 should logically have an early sprint in how demanding it is, and then settle into more of a jog. Then Nvidia will need to actually start giving us some decent boosts during that sprint, or PC gaming will start to fall behind console gaming.

One of the reasons that I got my 6700 XT is that I can wait out Nvidia and AMD now for a while, certainly with the backlog in games I have.
 

psolord

Golden Member
Sep 16, 2009
1,958
1,201
136
Saying your $800 4070 Ti will be a 1080p card before long, but that lower-end cards with 12-16GB of VRAM will not survive, correct settings or otherwise, is not the win you think it is. At least my 2060 Super 4 years ago was about $315 with incentives, and my 6700 XT last summer was $300 with its free FSR-exclusive game (valued at $100 MSRP).

I am just showing what the situation is, my friend. I did not apply a win or loss attribute to it. People who are willing to follow the trend just will. People who are not will tune down their settings, as I have said many times and have been mocked for. That's why the freakin' settings are there: to fine-tune your gaming experience according to your hardware. Cranking everything over 9000 will set you up for failure, and no amount of VRAM will help.

Since you have a 6700 XT, can you please test Deus Ex: Mankind Divided, which is free on Epic until tomorrow? I'd like a benchmark run at Ultra, 1080p + 4x MSAA.
 
Reactions: Ranulf

psolord

Golden Member
Sep 16, 2009
1,958
1,201
136
Indeed. If UE5 kills the longevity of even $800 cards, then it makes the most sense to wait out this transition with a relatively cheap card. After all, UE5 should logically have an early sprint in how demanding it is, and then settle into more of a jog. Then Nvidia will need to actually start giving us some decent boosts during that sprint, or PC gaming will start to fall behind console gaming.

One of the reasons that I got my 6700 XT is that I can wait out Nvidia and AMD now for a while, certainly with the backlog in games I have.
While your strategy is sound, it's not just UE5, though yes, it's especially UE5. All big titles on next-gen engines have shown this behavior; it's just that people in this thread choose to ignore it. They just like 29fps vs 5fps examples and TLOU cranked over 9000 for nothing, really.

The request for the Deus Ex benchmark stands for you as well, if you will...

Regarding UE5, I don't think it will settle into a jog. I mean, maybe some of the devs will. The engine itself is designed for the sky to be the limit. What I said the other day, about a 4090 with 12GB being more useful than the 7600 XT 16GB, was not a joke. You will be required to have that kind of processing power, even for mere 1080p, for 2025-2028 games at ultra settings. "Mere" being an unfair term; 1080p is pretty good and always will be, as long as you can keep 1:1 pixel mapping or at least have a very good upscaler.
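For context on what "a very good upscaler" actually renders internally, here is a minimal sketch of the resolution arithmetic, assuming the commonly published FSR 2 / DLSS 2 preset scale factors (Quality roughly 0.667 per axis, Balanced roughly 0.59, Performance 0.5); exact factors vary by upscaler and version.

```python
# Rough internal render resolutions for common upscaler presets.
# Scale factors are assumptions based on commonly published FSR 2 / DLSS 2
# values; actual numbers differ per upscaler and version.
MODES = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.59, "Performance": 0.5}

def internal_res(width, height):
    for mode, s in MODES.items():
        w, h = round(width * s), round(height * s)
        print(f"{width}x{height} {mode:<12} -> renders at ~{w}x{h}")

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    internal_res(w, h)
    print()
```

Run as-is, this just prints that 1080p Quality mode renders around 1280x720 internally, which is why upscaler quality matters so much at lower output resolutions.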

Now, regarding the consoles: they are too weak to provide anything similar to the PC experience. One of the reasons I am debating this thread itself is that it gives the wrong idea about PC gaming. You are taking a 4060 Ti, for example, cranking everything over 9000 and then complaining, while the console settings are way, way lower than that. So an individual will increasingly be exposed to a "PC gaming = bad/expensive, better get a console" narrative, without knowing that you are NOT comparing the same things. It's not Nvidia or AMD that are hurting the PC. It's the users, who aren't getting their facts straight and are always bitching about everything, especially Nvidia-related stuff in this forum.

To put into perspective what PC gaming is, I will give as an example some testing I am doing these days on my old GTX 970. You can find these tests on my channel (non-monetized).


The GTX 970 is a pure Gen 8 product; it came out in the same era as the Gen 8 consoles. It also has less than half the VRAM (3.5GB) of what the consoles had back then.

Still, you will see games like Tekken 8, Atlas Fallen, Palworld, Armored Core VI, Last Epoch, The Finals and many others being quite playable, while they are nowhere to be found on Gen 8 consoles, not even the PS4 Pro or Xbox One X. Some of them are UE5 titles, too.

You will see Baldur's Gate 3 running at 60fps, yes with correct settings, while it's capped at 30fps on the vastly newer Series S, which has more VRAM.


You will see games like Remnant II, another super-heavy UE5 game, running at 60fps, yes with correct settings AND the addition of AMD's frame generation, which was given away very graciously; I've never disputed that.


And many, many more. All that on a 9.5-year-old card with 3.5GB of VRAM. And you guys are worried about 8GB cards, which are also 3-4 times more powerful? Pfff...
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
Yes, this is one of the main arcs my arguments are based on. You will not be able to play future games as you like just because of more VRAM.

Along with those 4 vs 8GB came a considerable performance uplift from the GPU.

The 1060 3GB and RX 580 4GB would like to have a word with you. As I said, they got left in the dust, but the same models with more VRAM stayed valid for years longer.

Please stop ignoring datapoints that don't fit your predetermined conclusion. At this point, it doesn't feel like you're debating in good faith.
 

Timorous

Golden Member
Oct 27, 2008
1,668
2,922
136
I am just showing what the situation is, my friend. I did not apply a win or loss attribute to it. People who are willing to follow the trend just will. People who are not will tune down their settings, as I have said many times and have been mocked for. That's why the freakin' settings are there: to fine-tune your gaming experience according to your hardware. Cranking everything over 9000 will set you up for failure, and no amount of VRAM will help.

Since you have a 6700 XT, can you please test Deus Ex: Mankind Divided, which is free on Epic until tomorrow? I'd like a benchmark run at Ultra, 1080p + 4x MSAA.

I cannot believe how badly you have missed the point of this thread.

The whole point is that the 4060 Ti 16GB, for instance, is going to have far longer legs than the 4060 Ti 8GB, the 3070, the 3070 Ti, etc. It will have that not because it has more compute performance (the 3070 Ti has a bit more compute power) but because all the assets will fit in the VRAM buffer. It also shows that the 16GB variant will be capable of running higher texture settings, which have far more impact on visual quality than turning some of the compute settings down.

The 6500 XT 4GB vs 8GB video/written article shows what happens when you run out of VRAM: performance tanks or IQ tanks, and 1% lows especially suffer. We have cases now where the 4060 Ti 16GB can run a game at 1080p + RT just as smoothly as the 8GB variant can manage at 1080p with no RT. Those cases will become more frequent, to the point that in 6 years or so there will be a similar article comparing the 4060 Ti 16GB to the 8GB in then-modern games, where the 16GB can offer a playable experience with decent texture quality and the 8GB card falls short.
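Since "1% lows" carries a lot of weight in these comparisons, here is a minimal sketch of how that metric is commonly derived from a frametime capture; the "average of the slowest 1% of frames" convention is an assumption, as tools differ slightly in how they define it, and the capture below is hypothetical.

```python
# Minimal sketch: derive average FPS and "1% low" FPS from a list of
# frame times in milliseconds. Uses the common "average of the slowest
# 1% of frames" convention; review tools vary slightly in their definitions.
def fps_metrics(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    one_pct_low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, one_pct_low_fps

# Hypothetical capture: mostly 16.7 ms frames plus occasional long stalls,
# the kind of pattern VRAM spill-over tends to produce.
frames = [16.7] * 990 + [120.0] * 10
avg, low = fps_metrics(frames)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

With a capture like that, the average still looks like roughly 56 fps while the 1% low collapses to about 8 fps, which is exactly the pattern being described when the buffer overflows.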

If you want a card to last that long while making fewer IQ sacrifices, then getting more VRAM is the way to go. Yes, you will still need to turn down compute settings as games get more and more demanding; that is trivially obvious to all.

This is where console anchoring is important. The 4060 Ti has enough of an advantage in compute, RT and VRAM over the consoles that, until devs stop making games with the PS5/Series X in mind, the 4060 Ti 16GB will always be able to find a combination of settings that works. The 8GB variant and other cards with that amount of VRAM will not have the same guarantee, because there will be cases where even on the lowest texture settings the VRAM is just not big enough, like we see with the 4GB 6500 XT now.

The 1080 Ti is a great example of what I mean. At launch it was $700 MSRP, and today it hangs around the 3060 12GB in performance, which in titles like TLOU and Halo Infinite offers a better experience than cards with 8GB of VRAM, especially at 1440p, which someone who went 1080 Ti may still be using. Sure, you may need to turn down settings in the very latest games to get 60 fps playable at 1440p, but for a 6-year-old card I don't think that is surprising.

Inflation adjusted, that is $900 in today's money, or around $800 by the end of 2022, which is 4070 Ti launch money. I think that in 6 years the 4070 Ti will be having issues in a fair number of games. It won't totally crash like 8GB cards are going to, but there will be a noticeable difference between it and the 4070 Ti Super, and the Super version is going to grow that lead. You see this with the 2060 6GB at the moment: it suffers far more than the 8GB 2070 does in a few titles. In FH5, for example, the 2070 can get a 1440p 60 fps experience at max settings, pretty good for the age of the card. The 2060 can't at all, and even dropping to 1080p does not make up much of the difference, because it runs out of VRAM. The 4070 Ti will be in that halfway house where it hangs on better than 8GB cards but starts to see significant deficits against comparable 16GB+ cards. There are already cases where the 4070 Ti is getting caught by the 7800 XT / 7900 GRE with 16GB of VRAM.
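For reference, the inflation adjustment above is simple arithmetic; a quick sketch assuming roughly 28% cumulative US inflation between the 1080 Ti's 2017 launch and 2023 (the exact factor depends on which CPI series and end date you use):

```python
# Back-of-envelope inflation adjustment for the 1080 Ti's launch MSRP.
# The cumulative factor (~1.28 for 2017 -> 2023) is an assumption; pick
# your own CPI series and end date for a precise figure.
msrp_2017 = 700
cpi_factor = 1.28
print(f"${msrp_2017} in 2017 ~= ${msrp_2017 * cpi_factor:.0f} today")
```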

The whole reason you get mocked for 'correct settings' is because you are applying it to cards where it should not apply. The 4070 Ti occupies the same price point as the 1080 Ti, and while the game has moved on feature-wise and there is even higher-tier pricing available now, at the same tier the card should offer comparable longevity. 12GB cards at that price point won't, and 8GB cards at $400 price points are equally rubbish for the same reasons.
 

Aapje

Golden Member
Mar 21, 2022
1,418
1,927
106
@psolord

Now, regarding the consoles: they are too weak to provide anything similar to the PC experience.
Sorry, but this is just nonsense. Obviously someone who buys an x090 or x080 every generation will have a much different experience, but they will also spend way more.

The proper comparison is with a PC buyer who has about the same upgrade cycle and buys mainstream cards. The PS5 is about a 2070, which was $500 in 2020, the year the PS5 came out. The PS5 is also $500. And the PS4 lasted 3-4 GPU generations, so the PS5 will probably last about as long.

So the question is whether the mainstream buyer is better off with a $500 card every 3-4 generations or with a $250-$300 card every 2 generations, rather than getting a console.

And the console has the big advantage that developers cannot afford to write off an entire console generation, so they will pretty much always optimize and tune to the point where the console can run the game at a level that most budget gamers consider acceptable. On a PC you get worse tuning and optimization, so you need more performance in the first place. And if the console has more (V)RAM, the people porting the game over can implement some really bad tuning or hacks to make it work, like we've seen with various games.

And on the PC you tend to have more challenges figuring out the 'correct settings' or other issues. A lot of people are willing to give up some quality/performance for a smoother experience.

If the value of PC gaming deteriorates, then more people will pick consoles, and it will especially be kids and young people with less money who then don't become PC gamers. So this can result in them being lost to PC gaming forever, even if they get richer and could afford a big gaming PC.
 

psolord

Golden Member
Sep 16, 2009
1,958
1,201
136
The 1060 3GB and RX 580 4GB would like to have a word with you. As I said, they got left in the dust, but the same models with more VRAM stayed valid for years longer.

Please stop ignoring datapoints that don't fit your predetermined conclusion. At this point, it doesn't feel like you're debating in good faith.
It was not the same thing; I explained why. The higher you go in data sizes, the more diminishing returns you get, while resolutions stay roughly the same. 1GB vs 2GB will not be the same as 32GB vs 64GB just because both are a 100% difference.

Also, we have DirectStorage now and data being streamed from super-fast NVMe drives. The playing field is not exactly the same. Also, I do have a feeling Nvidia's compression algorithms are more efficient than AMD's.
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
It was not the same thing; I explained why. The higher you go in data sizes, the more diminishing returns you get, while resolutions stay roughly the same. 1GB vs 2GB will not be the same as 32GB vs 64GB just because both are a 100% difference.

Also, we have DirectStorage now and data being streamed from super-fast NVMe drives. The playing field is not exactly the same. Also, I do have a feeling Nvidia's compression algorithms are more efficient than AMD's.

What? That makes no sense.

That's like saying that the 4 vs 8 GB system RAM argument ~15 years ago is different than the 16 vs 32 GB argument today. You realize that requirements don't tend to grow linearly, right?

And regarding compression and DirectStorage: those can be mitigating factors, but they don't trump adding more VRAM (higher-VRAM cards can still use those tricks too, you know).
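A rough way to see why those factors only mitigate is to compare the bandwidths involved. The figures below are ballpark assumptions for a 4060 Ti-class system (on-card GDDR6 around 288 GB/s, a PCIe 4.0 x8 link around 16 GB/s, a fast NVMe drive around 7 GB/s raw); exact numbers vary by card and platform, and the 2 GB "spilled" working set is a hypothetical example.

```python
# Ballpark bandwidth comparison: why streaming assets over PCIe/NVMe is a
# mitigation, not a substitute for keeping them resident in VRAM.
# All figures are rough assumptions for a 4060 Ti-class system.
BANDWIDTH_GBS = {
    "GDDR6 VRAM (on-card)": 288,
    "PCIe 4.0 x8 (GPU <-> system RAM)": 16,
    "NVMe SSD (raw sequential read)": 7,
}

spill_gb = 2.0  # hypothetical working set that no longer fits in VRAM
for path, bw in BANDWIDTH_GBS.items():
    ms = 1000 * spill_gb / bw
    print(f"{path:<34} ~{bw:>3} GB/s -> moving {spill_gb} GB takes ~{ms:.0f} ms")
```

Against a roughly 16 ms frame budget at 60 fps, anything that has to come across PCIe or from disk mid-frame shows up as exactly the kind of 1% low stutter discussed earlier in the thread.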

You keep saying things are different than they used to be, but your arguments for that are weak, at best. You need to prove the assertion that you keep repeating, and so far, you haven't.
 
Reactions: Tlh97 and Mopetar

psolord

Golden Member
Sep 16, 2009
1,958
1,201
136
I cannot believe how badly you have missed the point of this thread.

The whole point is that the 4060 Ti 16GB, for instance, is going to have far longer legs than the 4060 Ti 8GB, the 3070, the 3070 Ti, etc. It will have that not because it has more compute performance (the 3070 Ti has a bit more compute power) but because all the assets will fit in the VRAM buffer. It also shows that the 16GB variant will be capable of running higher texture settings, which have far more impact on visual quality than turning some of the compute settings down.
The 16GB variant of the 4060 Ti will be just as dead in the water as the 8GB variant when games do not even run properly at 1080p. I showed you what happens in TLOU if you go to Ultra textures at 1080p: you get 1.5GB of additional overhead for nothing.

The 6500 XT 4GB vs 8GB video/written article shows what happens when you run out of VRAM: performance tanks or IQ tanks, and 1% lows especially suffer. We have cases now where the 4060 Ti 16GB can run a game at 1080p + RT just as smoothly as the 8GB variant can manage at 1080p with no RT. Those cases will become more frequent, to the point that in 6 years or so there will be a similar article comparing the 4060 Ti 16GB to the 8GB in then-modern games, where the 16GB can offer a playable experience with decent texture quality and the 8GB card falls short.
In 6 years' time, the 4060 Ti will be forced to use such low settings that VRAM will not matter at all. The 8GB variant will be able to provide roughly the same result.

I'd like to see this example of the 4060 Ti 16GB giving the same performance with RT as the 4060 Ti 8GB with no RT, please. Valid examples please, not 5fps vs 10fps, thank you. And then show me how both run the no-RT path and how much faster it is.


This is where console anchoring is important. The 4060 Ti has enough of an advantage in compute, RT and VRAM over the consoles that, until devs stop making games with the PS5/Series X in mind, the 4060 Ti 16GB will always be able to find a combination of settings that works. The 8GB variant and other cards with that amount of VRAM will not have the same guarantee, because there will be cases where even on the lowest texture settings the VRAM is just not big enough, like we see with the 4GB 6500 XT now.
How will the lowest texture settings require more than 8GB of VRAM, and why? We already have fantastic Quixel textures that fit just fine in an 8GB framebuffer. How will they get worse? Data is data. And again with the 4GB argument; it's not the same thing.
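For a sense of scale on the texture question, here is a back-of-envelope sketch assuming BC7-style block compression (about 1 byte per texel) and a full mip chain (roughly 33% overhead); the three maps per material and 150 resident materials are hypothetical numbers, not taken from any particular game.

```python
# Back-of-envelope texture VRAM math. Assumes BC7-class block compression
# (~1 byte per texel) and a full mip chain (~1.33x overhead). The maps per
# material and material count are hypothetical illustrative values.
BYTES_PER_TEXEL = 1.0      # BC7-class compression
MIP_OVERHEAD = 4.0 / 3.0   # full mip chain adds ~33%

def texture_mb(res):
    return res * res * BYTES_PER_TEXEL * MIP_OVERHEAD / 2**20

for res in (1024, 2048, 4096):
    per_map = texture_mb(res)
    per_material = per_map * 3            # e.g. albedo + normal + packed mask
    budget_gb = per_material * 150 / 1024  # 150 unique materials resident
    print(f"{res}x{res}: {per_map:5.1f} MB/map, {per_material:6.1f} MB/material, "
          f"~{budget_gb:.1f} GB for 150 resident materials")
```

Under those assumptions a 2K-textured set sits in a couple of GB while the same count of 4K materials climbs toward 10GB, which is roughly why texture quality is the first setting to scale with VRAM in either direction.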

The 1080 Ti is a great example of what I mean. At launch it was $700 MSRP, and today it hangs around the 3060 12GB in performance, which in titles like TLOU and Halo Infinite offers a better experience than cards with 8GB of VRAM, especially at 1440p, which someone who went 1080 Ti may still be using. Sure, you may need to turn down settings in the very latest games to get 60 fps playable at 1440p, but for a 6-year-old card I don't think that is surprising.
The 1080 Ti came with considerable enhancements on the GPU side and the bus too, though. It's not just the VRAM size. Maybe it helps in TLOU. You can use High textures and be fine though, no visual handicap whatsoever. I don't know about Halo Infinite. What's wrong with it?

Inflation adjusted, that is $900 in today's money, or around $800 by the end of 2022, which is 4070 Ti launch money. I think that in 6 years the 4070 Ti will be having issues in a fair number of games. It won't totally crash like 8GB cards are going to, but there will be a noticeable difference between it and the 4070 Ti Super, and the Super version is going to grow that lead. You see this with the 2060 6GB at the moment: it suffers far more than the 8GB 2070 does in a few titles. In FH5, for example, the 2070 can get a 1440p 60 fps experience at max settings, pretty good for the age of the card. The 2060 can't at all, and even dropping to 1080p does not make up much of the difference, because it runs out of VRAM. The 4070 Ti will be in that halfway house where it hangs on better than 8GB cards but starts to see significant deficits against comparable 16GB+ cards. There are already cases where the 4070 Ti is getting caught by the 7800 XT / 7900 GRE with 16GB of VRAM.
The 4070 Ti already has performance issues due to GPU power in some super-heavy games. That worries me more than its VRAM buffer does. The worries are academic, of course; I am not wedded to it. With proper settings, this card too will have a long life.

The comparisons with the GRE will be interesting indeed. We'll see.
The whole reason you get mocked for 'correct settings' is because you are applying it to cards where it should not apply.
Well, I am talking about my own 8GB cards mostly, and the correct settings do apply to all of them.

The 4070 Ti occupies the same price point as the 1080 Ti, and while the game has moved on feature-wise and there is even higher-tier pricing available now, at the same tier the card should offer comparable longevity. 12GB cards at that price point won't, and 8GB cards at $400 price points are equally rubbish for the same reasons.
If the 4070 Ti does not have the same longevity as the 1080 Ti, it will certainly not be because of VRAM. UE5, next-gen Snowdrop, next-gen Northlight, next-gen Creation Engine and the like will be the death of numerous cards, VRAM size notwithstanding.
 

Timorous

Golden Member
Oct 27, 2008
1,668
2,922
136
The 16GB variant of the 4060 Ti will be just as dead in the water as the 8GB variant when games do not even run properly at 1080p. I showed you what happens in TLOU if you go to Ultra textures at 1080p: you get 1.5GB of additional overhead for nothing.

In 6 years' time, the 4060 Ti will be forced to use such low settings that VRAM will not matter at all. The 8GB variant will be able to provide roughly the same result.

I'd like to see this example of the 4060 Ti 16GB giving the same performance with RT as the 4060 Ti 8GB with no RT, please. Valid examples please, not 5fps vs 10fps, thank you. And then show me how both run the no-RT path and how much faster it is.

How will the lowest texture settings require more than 8GB of VRAM, and why? We already have fantastic Quixel textures that fit just fine in an 8GB framebuffer. How will they get worse? Data is data. And again with the 4GB argument; it's not the same thing.

The 1080 Ti came with considerable enhancements on the GPU side and the bus too, though. It's not just the VRAM size. Maybe it helps in TLOU. You can use High textures and be fine though, no visual handicap whatsoever. I don't know about Halo Infinite. What's wrong with it?

The 4070 Ti already has performance issues due to GPU power in some super-heavy games. That worries me more than its VRAM buffer does. The worries are academic, of course; I am not wedded to it. With proper settings, this card too will have a long life.

The comparisons with the GRE will be interesting indeed. We'll see.

Well, I am talking about my own 8GB cards mostly, and the correct settings do apply to all of them.

If the 4070 Ti does not have the same longevity as the 1080 Ti, it will certainly not be because of VRAM. UE5, next-gen Snowdrop, next-gen Northlight, next-gen Creation Engine and the like will be the death of numerous cards, VRAM size notwithstanding.

The 4GB vs 8GB cards show you will be wrong.

The 4060 Ti 16GB managing RT at over 60fps:



The 4060 Ti 8GB is less than 10% faster without RT than the 16GB is with RT:



In 6 years' time the 16GB will be able to offer a better-than-PS5 Pro experience; the 8GB won't.
 

psolord

Golden Member
Sep 16, 2009
1,958
1,201
136
Horizon Forbidden West benchmarks are in.



OK, it favors AMD cards; that was to be expected. But let's look at the results from the VRAM viewpoint.

I mean, the "super bad" 4060 Ti with its 128-bit bus and 8GB is at 70fps? It is beating the 11GB 2080 Ti and trouncing the 12GB 3060? VRAM ain't helping? And that's at 1080p? lol
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
I can pick cherries too:





How many times do you need to be told that it's not just about today's games, it's about the next few years' worth? Or that fps graphs often miss things like textures that are downgraded or don't load properly?

All of these charts you keep posting are missing the point.
 
Jul 27, 2020
16,640
10,635
106

This reviewer has something against showing the V-cache advantage in games. He has the newer 8700G in the CPU results but not even a 5800X3D!!!

We can guess pretty easily who's paying him or where his allegiance lies...
 
Jul 27, 2020
16,640
10,635
106
Another thing: look closely at the CPU core utilization graph. He shows only older CPUs, and the Zen 2 chip is showing an imbalance of workload between its physical and logical cores. Anyone seeing that will conclude that HT in games on AMD CPUs sucks! He should've also compared the newest CPUs from both camps.
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126

This reviewer has something against showing the V-cache advantage in games. He has the newer 8700G in the CPU results but not even a 5800X3D!!!

We can guess pretty easily who's paying him or where his allegiance lies...
That was the first thing I noticed. One of the most popular/powerful (but still affordable) gaming CPUs around for a fair while now, and it's not included? And that's not even mentioning the lack of a 7800X3D.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126

System RAM consumption. At 1440p, the 4070 consumes less because it fits more of the data in its VRAM, whereas the 3070 has to waste time and energy shuffling data between its VRAM and system RAM.
Even more interesting, the 3070 will hit the page file on a 16GB system. I guess a certain individual in this thread thinks there's absolutely no problem with that.
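For anyone who wants to watch this happen on their own system, here is a minimal sketch that polls dedicated VRAM usage through nvidia-smi while a game runs (NVIDIA-only; assumes nvidia-smi is on the PATH and a single GPU). A card sitting pinned at its capacity while system RAM and page file use climb is the classic spill pattern.

```python
# Minimal sketch: poll dedicated VRAM usage via nvidia-smi while a game runs.
# NVIDIA-only; requires nvidia-smi on PATH. Assumes a single GPU (with more
# than one, nvidia-smi prints one line per GPU and this reads the first).
import subprocess, time

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"], text=True)
    used, total = (int(x) for x in out.strip().splitlines()[0].split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_mib()
        print(f"VRAM: {used}/{total} MiB ({100 * used / total:.0f}%)")
        time.sleep(2)
```

Pair it with a system RAM monitor (Task Manager is enough) to see the shuffling the chart above describes.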

8GB VRAM from ten years ago is totally fine yo, but it's perfectly acceptable to demand 32 GB system RAM, LMAO.

It's absolutely hilarious witnessing the pretzel-twisting, corporate boot-licking arguments we're seeing in this thread.
 