Discussion Apple Silicon SoC thread

Page 398

Eug

Lifer
Mar 11, 2000
24,024
1,644
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 teraflops
82 gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:



M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes

M3 Family discussion here:


M4 Family discussion here:

 
Last edited:
Jul 27, 2020
25,155
17,492
146
A walled garden has to be even more concerned about survival since the walls don’t contain the human consumers, they can all step out as they please.
Without going into an unnecessary debate that I'm not interested in, I'll just say that Apple consumers are not typical consumers. The majority of them are fiercely loyal, and they were that way even when Apple had Intel chips in their hardware. No one smashed their MacBook on the pavement because they were frustrated with the laptop's performance or less-than-ideal battery life. I just fail to see what urgency Apple faced (did they envision some other lifestyle company stepping into the fray with their own ARM chips?). To me, the Apple Silicon project was simply one of vanity. They did it because they could. And I don't mind it because the CPU is an awesome data cruncher, but had they enlisted AMD's help, we could still have gotten better performance AND Bootcamp dual boot at the same time.
 

Eug

Lifer
Mar 11, 2000
24,024
1,644
126
Without going into an unnecessary debate that I'm not interested in, I'll just say that Apple consumers are not typical consumers. The majority of them are fiercely loyal, and they were that way even when Apple had Intel chips in their hardware. No one smashed their MacBook on the pavement because they were frustrated with the laptop's performance or less-than-ideal battery life. I just fail to see what urgency Apple faced (did they envision some other lifestyle company stepping into the fray with their own ARM chips?). To me, the Apple Silicon project was simply one of vanity. They did it because they could. And I don't mind it because the CPU is an awesome data cruncher, but had they enlisted AMD's help, we could still have gotten better performance AND Bootcamp dual boot at the same time.
It seems you've already forgotten the #1 reason, which is the almighty $. This IBM exec estimated that in the first year of M1, Apple would save billions of $.


They no longer have to pay Intel (or AMD) their cut for chip profits. And that's just for the Macs. That doesn't even factor in the iPhone/iPad chips. Sure, there are big startup costs for the chip development, but that can be amortized over subsequent generations. Furthermore, they also are no longer dependent upon general purpose chips that miss out on functionality that Apple wants, and Apple no longer has to pay for features it has no use for but which are there to appease other buyers of Intel chips.
 
Last edited:

Doug S

Diamond Member
Feb 8, 2020
3,203
5,498
136
Without going into an unnecessary debate that I'm not interested in, I'll just say that Apple consumers are not typical consumers. The majority of them are fiercely loyal, and they were that way even when Apple had Intel chips in their hardware. No one smashed their MacBook on the pavement because they were frustrated with the laptop's performance or less-than-ideal battery life. I just fail to see what urgency Apple faced (did they envision some other lifestyle company stepping into the fray with their own ARM chips?). To me, the Apple Silicon project was simply one of vanity. They did it because they could. And I don't mind it because the CPU is an awesome data cruncher, but had they enlisted AMD's help, we could still have gotten better performance AND Bootcamp dual boot at the same time.

What part about "Apple's CPUs beat anything AMD makes in both performance AND power" do you not get?

A "vanity" project would be one where Apple replaced x86 with something that had similar performance, similar power, and similar cost to Apple. Apple had huge wins in all three, PLUS they fully control the future roadmap in a way not possible if someone else is designing the CPUs and all Apple gets to pick are which SKUs they want to buy.

Your position here is absolutely ludicrous.
 

poke01

Diamond Member
Mar 8, 2022
3,525
4,856
106
Why would Apple want to be beholden to AMD? Looking at AMD’s CPUs currently there is nothing that even matches the M4 in terms of single core for laptops especially when it comes to performance per watt.


Very few were buying MacBooks for Windows anyway.
 
Reactions: mvprod123

Doug S

Diamond Member
Feb 8, 2020
3,203
5,498
136
Very few were buying MacBooks for Windows anyway.

And dual boot is only necessary if you are buying a MacBook SOLELY for Windows. That may have happened in the past, when no PC laptops could compete on the "thin and light" front, but the LG Gram and probably others are now more than competitive on that front and also on the build quality front (yes, also on the price front, but you don't get all that stuff for free).

Parallels will take care of pretty much everything except some games. If you want to run Windows games get a Windows laptop.
 

The Hardcard

Senior member
Oct 19, 2021
317
399
106
Without going into an unnecessary debate that I'm not interested in, I'll just say that Apple consumers are not typical consumers. The majority of them are fiercely loyal, and they were that way even when Apple had Intel chips in their hardware. No one smashed their MacBook on the pavement because they were frustrated with the laptop's performance or less-than-ideal battery life. I just fail to see what urgency Apple faced (did they envision some other lifestyle company stepping into the fray with their own ARM chips?). To me, the Apple Silicon project was simply one of vanity. They did it because they could. And I don't mind it because the CPU is an awesome data cruncher, but had they enlisted AMD's help, we could still have gotten better performance AND Bootcamp dual boot at the same time.
Ah, but you can't avoid getting into the debate, because it guides your view and comments: the notion that Apple can do or not do whatever they want because their consumers will buy whatever Apple puts in front of them.

That narrative is wrong. Apple consumers are typical consumers. Jobs started Apple making the claim that he was going to always walk onto the stage with a computer that can do things other computers can’t do. The current Apple Silicon moment has been the goal for decades, and Apple developed its fan base from this persistent attempt to be out in front. That’s where the consumer loyalty comes from, Apple’s longstanding attempt to outpace other computer makers.

Apple has been trying to drive an aggressive CPU and GPU roadmap since the 1980s, demanding Motorola push harder with the 68000 series. Then they formed the AIM alliance to enlist IBM's RISC expertise to pick up the pace, but neither Motorola nor IBM saw the benefit of the investment. Go back and look at what Apple wanted from the 68040 and 68060, and then the discussion around the PowerPC 604, G3, and G4 with AltiVec.

Steve Jobs had wanted to dictate processor roadmaps since the 1980s and had employed CPU architects since then. A significant portion of the PowerPC CPU architects were Apple employees, and it was in fact Keith Diefendorff, an Apple CPU architect (hired from Motorola), who led the development of AltiVec. At launch it was the most capable SIMD implementation, leading Apple to call the G4 the first desktop supercomputer, capable of video, audio, and 3D processing that Windows processors couldn't match.

Apple consumers expect not just the promise but the aggressive attempt to achieve superior capabilities, based on 50 years of Apple's commitment to trying to be out front.

Apple Silicon is just a continuation of that. Apple bought PA Semi and Intrinsity with the intention of pushing Samsung on phone SoCs and Intel on Mac chips, but after talking with Dan Dobberpuhl about making those requests, Jobs decided to have Apple do its own phone chips instead. Then, after hitting the same frustrations with Intel that he had with Motorola and IBM, Jobs decided to have Apple Silicon power Macs, where they now offer SoC metrics no one else can match.

They have to continue to try this to keep their customer base. That's why they chose to go it alone with Apple Silicon; there was no hope of AMD allowing them to offer what current Macs and iPads can do.
 
Reactions: mvprod123

johnsonwax

Member
Jun 27, 2024
165
281
96
Not true though. They had AMD if their plan failed and they wanted to move away from Intel. He's just being dramatic.

Monumental software effort? The company with so much cash on hand, they could have kept going as-is for decades. Again. Being overly dramatic like Apple's survival depended on the ARM effort. A walled garden monopoly does not need to be so concerned about its ongoing survival. It's superfluous but kudos to them for being paranoid. It paid off handsomely.
I think you're really downplaying how much investment Apple had made over the preceding years to shifting MacOS not only to ARM, not only to preferentially ARM, but to preferentially Apple Silicon.

You're acting as though the whole endeavor was silicon design, and not lifting the entire OS not only to that new instruction set but also to one that was co-designed with the silicon, plus the lift to make it as easy as possible for developers to do the same. And most of that work had to be done before they had production silicon.
 
Jul 27, 2020
25,155
17,492
146
I'm just saying that it was not a do-or-die thing for them. I still think that it would be a good idea for them to put out a Strix Halo Macbook or Mac Mini. Give people the flexibility they desire. Maintaining a PC and a Mac is a cumbersome exercise for many people.
 
Reactions: Io Magnesso

johnsonwax

Member
Jun 27, 2024
165
281
96
It seems you've already forgotten the #1 reason, which is the almighty $. This IBM exec estimated that in the first year of M1, Apple would save billions of $.
That's not the reason. From the Cook Doctrine:

We believe that we need to own and control the primary technologies behind the products we make, and participate only in markets where we can make a significant contribution.

The trauma that Apple still carries in guys like Cook and a number of the other executives was the near-death of the company in the 90s, largely because Apple didn't have that control. IBM and Motorola weren't interested in making the investments that Apple needed. Metrowerks, the most-used compiler, was leaving. Microsoft pulled IE and there were questions around Office.

Apple was being torn apart by core technologies and products that they didn't control, and that control was the thing Jobs re-established, with Tim alongside. What they decided is that the best use for Apple's profits was to secure that control, which they've done. The places where they don't have that control are parts of the supply chain where there are a fair number of options, like assembly, or things they probably aren't sure they can make work competitively, like fabs.

Yeah, from a cost perspective dropping Intel was a benefit for Apple, but losing a supplier of a critical tech, and the potential for lost sales, was a much bigger risk than what they're saving.
 
Reactions: moinmoin

LightningZ71

Platinum Member
Mar 10, 2017
2,246
2,764
136
I wouldn't be completely shocked if Apple chose to abandon ARM completely one day and design a processor from the ground up with a fully custom instruction set that is optimized for their targeted use cases.
 
Reactions: Io Magnesso

poke01

Diamond Member
Mar 8, 2022
3,525
4,856
106
I wouldn't be completely shocked if Apple chose to abandon ARM completely one day and design a processor from the ground up with a fully custom instruction set that is optimized for their targeted use cases.

Those are called ASICs; you don't need a separate ISA for that. Apple already has ProRes ASICs.
 

Io Magnesso

Member
Jun 12, 2025
71
25
46
I wouldn't be completely shocked if Apple chose to abandon ARM completely one day and design a processor from the ground up with a fully custom instruction set that is optimized for their targeted use cases.
Personally, I think Apple might use RISC-V too.
Apple products may already contain chips running the RISC-V ISA, perhaps as microcontrollers.
With RISC-V, you can extend the instruction set as you like.
There are limits, though: custom instructions should only be added to the extent that they don't cause trouble for the RISC-V ecosystem itself.
 

Doug S

Diamond Member
Feb 8, 2020
3,203
5,498
136
I think you're really downplaying how much investment Apple had made over the preceding years to shifting MacOS not only to ARM, not only to preferentially ARM, but to preferentially Apple Silicon.

You're acting as though the whole endeavor was silicon design, and not lifting the entire OS not only to that new instruction set but also to one that was co-designed with the silicon, plus the lift to make it as easy as possible for developers to do the same. And most of that work had to be done before they had production silicon.


They had production silicon the entire time they were doing that work. They were putting iPhone SoCs into other form factors long before the world first saw one in the form of that developer Mac Mini with the A12X/A12Z SoC.

I don't think the rumors of ARM Macs starting shortly after the launch of the iPad (and gaining in volume after A9 brought a 70% performance leap) were all mere speculation. I have to imagine a few were based on real leaks - Apple may have even encouraged the speculation behind the scenes so that any actual leaked information was lost in the noise.

They did seed the ground for a lot of this in the intervening years, by obsoleting old APIs where necessary to force developers towards improved APIs that would allow for seamless transition to ARM. Given how well it worked they had clearly been working on Rosetta 2 for a long time - long enough that they were able to make changes to Apple Silicon to support it (i.e. x86's strong memory ordering) which requires a lead time of 3-4 years from "on paper" to shipping silicon. I'm willing to bet that while A12Z was the first Apple design that officially included it, it had been present for at least a couple revs before that but since it wasn't something iOS/iPadOS could use it was not something the world at large ever knew about.

The transition was clearly a decade long effort with major investment as you say.
 

Doug S

Diamond Member
Feb 8, 2020
3,203
5,498
136
The MacBook Air is the 16/512 GB model, which also comes with the non-binned M4. The base 256 GB model has a binned 8-core GPU instead of the 10-core GPU. We don't actually need the non-binned SoC, and 256 GB would be fine for now, but I suspect we'll keep this Mac for >5 years, so 512 GB may come in handy. My purchases in the past year so far:

A18 6-core CPU / 4-core GPU (binned) with 8 GB / 128 GB - iPhone 16e
M4 9-core CPU / 10-core GPU (binned) with 8 GB / 256 GB - 11" iPad Pro
M4 10-core CPU / 10-core GPU (non-binned) with 16 GB / 512 GB - 13" MacBook Air
M4 10-core CPU / 10-core GPU (non-binned) with 24 GB / 512 GB - Mac mini

It's been an expensive year, but luckily I can write some of that off.


Care to run a small experiment? I posted the below back in November and now someone here has both an M4 MacBook Air and Mac Mini. It would have been better if yours was a Macbook Pro (since I think that more expensive model is more likely to get the "best" M4s) but it is safe to say that if Apple is binning on power they'd put the "worst" in Mac Mini. But hey beggars can't be choosers - and maybe if you run this experiment someone else with an M4 Macbook Pro will be inspired to try the same to compare with your results.


I think there's no doubt M4 could easily exceed 5 GHz if Apple binned for frequency. The problem is that Apple hates having a proliferation of SKUs, plus there is the inherent unpredictability of yields in different processes. You'd get yourself sort of stuck if you had say M2 yielding well enough to have a halo model at 18% above nominal but M3 yielding only 6% above nominal and then the halo M2 would beat the halo M3! One might suggest Intel is having this exact issue with ARL right now - they are not only using a problematic process but one targeted towards low power which may not work quite as well as Intel 7 for the 250W that modern desktop CPUs can draw.

Over on RWT we've recently been discussing the possibility that Apple Silicon is binned on power. It would make sense, for example, to select M4/M4P dies able to operate on less power for Macbook Pro and use ones that require more in Mac Mini since it isn't running on battery and has (or at least should have, I guess we haven't seen the inside of the new one yet) better cooling.

I have no idea if they do this, but it would be interesting if someone had access to a Mac Mini and Macbook Pro with the same model of M4 and compare power usage when running the same tasks.
 
Reactions: igor_kavinski

Eug

Lifer
Mar 11, 2000
24,024
1,644
126
Care to run a small experiment? I posted the below back in November and now someone here has both an M4 MacBook Air and Mac Mini. It would have been better if yours was a Macbook Pro (since I think that more expensive model is more likely to get the "best" M4s) but it is safe to say that if Apple is binning on power they'd put the "worst" in Mac Mini. But hey beggars can't be choosers - and maybe if you run this experiment someone else with an M4 Macbook Pro will be inspired to try the same to compare with your results.
That makes sense and I've wondered that in the past myself too, but how would you actually test this? For example, the power draw from the wall for a battery-endowed MBA may not actually correspond to the power draw increase needed to say run Cinebench. Another issue is the MBA is fanless and will throttle under load, whereas the Mac mini has a robust fan and won't throttle under load.

BTW, my largest USB-C charger is only 35 Watts.
 

Eug

Lifer
Mar 11, 2000
24,024
1,644
126
Interestingly, although the MacBook Air did the encode at a lower wattage and used less total energy overall, it was slower. The iMac did the same encode in less time.



The fanless M4 MacBook Air took 7% longer to complete the task than the M4 iMac which has a fan.
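The distinction being drawn here is worth making explicit: what matters for efficiency is energy (average power multiplied by run time), not instantaneous power, so a lower-wattage machine can still lose on energy if it runs long enough. A toy sketch with purely illustrative numbers (nothing below is measured from the linked article):

```python
def energy_joules(avg_power_watts: float, seconds: float) -> float:
    """Total energy consumed at a given average power over a run."""
    return avg_power_watts * seconds

# Hypothetical figures: the fanless machine draws less power but,
# throttling, takes 7% longer to finish the same encode.
fanless = energy_joules(4.0, 107.0)   # 428.0 J
with_fan = energy_joules(5.0, 100.0)  # 500.0 J
print(fanless < with_fan)  # True: slower, yet still cheaper in energy
```

With these numbers the fanless machine wins on energy despite losing on time; with a smaller power gap or a bigger slowdown, the ordering flips, which is why both power and duration have to be logged.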
 

Doug S

Diamond Member
Feb 8, 2020
3,203
5,498
136
That makes sense and I've wondered that in the past myself too, but how would you actually test this? For example, the power draw from the wall for a battery-endowed MBA may not actually correspond to the power draw increase needed to say run Cinebench. Another issue is the MBA is fanless and will throttle under load, whereas the Mac mini has a robust fan and won't throttle under load.

BTW, my largest USB-C charger is only 35 Watts.

You're right, measuring it from the wall wouldn't tell you anything useful; you'd have to measure just the SoC itself. Apple has a CLI tool for exactly that, called powermetrics. It can break power down to the CPU alone, so I would think with the appropriate powermetrics calls and running some representative stuff while logged in to the Mac remotely (so the screen is in power save to minimize GUI-related CPU usage, and with as many processes killed as possible to avoid contaminating the data) this data could be teased out.

I think you'd want to run fairly short tasks repeatedly with pauses in between to minimize any impact of the MBA's lack of fans - i.e. find something that runs in the exact same time on the Mini. Because obviously when the MBA gets too hot and clocks down it will run more efficiently at that lower clock so you need something that runs at full speed but not long enough to overheat. Then powermetrics will tell how much power the CPU used to run it.

Sorry I can't provide examples - if I had a Mac I could come up with something you could just run. So I'm kind of handicapped as far as helping beyond mere suggestion. Maybe if @name99 sees this and is interested in the results he can be of more help.
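Since no concrete example was on offer, here is a minimal sketch of the analysis side. It assumes you capture samples on each machine with something like `sudo powermetrics --samplers cpu_power -i 1000 -n 30 > mini.log` and that the cpu_power sampler prints lines like `CPU Power: 469 mW`; the flags and the line format are assumptions from memory of the powermetrics man page, so adjust the regex to whatever your macOS version actually emits:

```python
import re

def average_cpu_power_mw(powermetrics_output: str) -> float:
    """Average the 'CPU Power: NNN mW' samples in captured powermetrics output.

    Assumes the cpu_power sampler's line format; tweak the regex if your
    macOS version formats the line differently.
    """
    samples = [int(mw) for mw in re.findall(r"CPU Power:\s*(\d+)\s*mW", powermetrics_output)]
    if not samples:
        raise ValueError("no CPU power samples found in input")
    return sum(samples) / len(samples)

# Example with a captured log (numbers are made up):
example_log = "CPU Power: 4100 mW\nCPU Power: 3950 mW\nCPU Power: 4050 mW\n"
print(average_cpu_power_mw(example_log))  # ~4033.3 mW
```

Running the same short task on both machines and comparing the two averages (taken only over the samples during the task, per the suggestion above about short runs with pauses) would give a first-order answer to the binning question.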
 

Eug

Lifer
Mar 11, 2000
24,024
1,644
126
You're right, measuring it from the wall wouldn't tell you anything useful; you'd have to measure just the SoC itself. Apple has a CLI tool for exactly that, called powermetrics. It can break power down to the CPU alone, so I would think with the appropriate powermetrics calls and running some representative stuff while logged in to the Mac remotely (so the screen is in power save to minimize GUI-related CPU usage, and with as many processes killed as possible to avoid contaminating the data) this data could be teased out.

I think you'd want to run fairly short tasks repeatedly with pauses in between to minimize any impact of the MBA's lack of fans - i.e. find something that runs in the exact same time on the Mini. Because obviously when the MBA gets too hot and clocks down it will run more efficiently at that lower clock so you need something that runs at full speed but not long enough to overheat. Then powermetrics will tell how much power the CPU used to run it.

Sorry I can't provide examples - if I had a Mac I could come up with something you could just run. So I'm kind of handicapped as far as helping beyond mere suggestion. Maybe if @name99 sees this and is interested in the results he can be of more help.
The article I posted above has some of the stuff you wanted, but the M4 MBA is throttling so that screws things up.

Somebody out there must have an M4 MacBook Pro to compare against the M4 Mac mini or M4 iMac. That would make for a better test, since all those have fans.
 

johnsonwax

Member
Jun 27, 2024
165
281
96
I wouldn't be completely shocked if Apple chose to abandon ARM completely one day and design a processor from the ground up with a fully custom instruction set that is optimized for their targeted use cases.
I don't think Apple feels any pressure to abandon ARM. They co-founded the company and have tremendous influence as a result, and the technology is open to be licensed, so it's not a situation where ARM would say 'oh, we're not interested in what Apple is doing': ARM isn't a potential competitor and isn't likely to pivot to other markets. Licensing to companies like Apple is central to their business model.

Apple is easy to understand in the sense that they want to own all of their gatekeeping IP. ARM license their IP to anyone (unlike Intel) and TSMC will fab for anyone (unlike Intel). That's all the information you needed to see that Apple would have sought to leave Intel for ARM/TSMC (or Samsung, etc.) back in 2007. The only impediment was their ability to afford it - to have the scale, the expertise, and so on. But predicting that Apple would be investing in getting there was a free money bet.

So yeah, Apple could do that, but there would need to be significant benefits. Essentially, Apple is already doing that but maintaining ARM compatibility, probably because it benefits them to be able to run most open source software, and almost everything out there has an ARM branch being maintained. That's not something Apple can just replace, and there's no reason to believe those maintainers would add an Apple ISA branch quickly, which would almost certainly require buying Apple hardware/software to compile, at least for a while. It's a nasty chicken/egg situation. But a week after Apple Silicon was announced, I tipped up an Amazon ARM instance and moved my entire data science operation over to it in about an hour just to see what kind of hassle it would be, and there was no effort involved. Apple got a lot of the AS transition for free from that perspective and that's a benefit they can't easily replace.

But when Apple wants to do bespoke things, they just do bespoke things. Some gets offered back to ARM, but much doesn't. You can see the whole AMX-SME arc. Apple has their own ML instructions. They have their own GPU stuff too. They don't have to nudge ARM to design and license those things; they can build them themselves while they work through the standard, and then switch on their terms. And that was always the point - Apple maintains control. It's not that the M1 was cheaper than an i5, it's that Apple wanted a particular thermal budget and Intel wasn't interested in making that. Sure there's a cost savings, but there's also increased risk. You can save money not having health insurance, but there's a risk, so the financial calculation isn't as cut and dried as it appears, and quite often not buying that insurance bites you in the ass financially. So given Intel's current state, the move to AS seems genius, but if Apple had failed to maintain a performance advantage, or had tripped on a generation, I don't think people would be so quick to say 'oh, yeah, that was the right decision'. It's worked out because it's worked out. It's not guaranteed to always work out.
 