Discussion Intel Meteor, Arrow, Lunar & Panther Lakes Discussion Threads


Tigerick

Senior member
Apr 1, 2022
664
541
106






As Hot Chips 34 starts this week, Intel will unveil technical information on the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the new generation of platforms after Raptor Lake. Both MTL and ARL represent a new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile based on the Intel 4 process, which uses EUV lithography, a first for Intel. Intel expects to ship the MTL mobile SoC in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024; that is what Intel's roadmap is telling us. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first to use GAA transistors, which it calls RibbonFET.



Comparison of upcoming Intel's U-series CPU: Core Ultra 100U, Lunar Lake and Panther Lake

| Model | Code Name | Date | TDP | Node | Tiles | Main Tile | CPU | LP E-Cores | LLC | GPU | Xe-cores |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Core Ultra 100U | Meteor Lake | Q4 2023 | 15 - 57 W | Intel 4 + N5 + N6 | 4 | tCPU | 2P + 8E | 2 | 12 MB | Intel Graphics | 4 |
| ? | Lunar Lake | Q4 2024 | 17 - 30 W | N3B + N6 | 2 | CPU + GPU & IMC | 4P + 4E | 0 | 8 MB | Arc | 8 |
| ? | Panther Lake | Q1 2026 ? | ? | Intel 18A + N3E | 3 | CPU + MC | 4P + 8E | 4 | ? | Arc | 12 |



Comparison of die size of Each Tile of Meteor Lake, Arrow Lake, Lunar Lake and Panther Lake

| | Meteor Lake | Arrow Lake (20A) | Arrow Lake (N3B) | Arrow Lake Refresh (N3B) | Lunar Lake | Panther Lake |
|---|---|---|---|---|---|---|
| Platform | Mobile H/U only | Desktop only | Desktop & Mobile H&HX | Desktop only | Mobile U only | Mobile H |
| Process Node | Intel 4 | Intel 20A | TSMC N3B | TSMC N3B | TSMC N3B | Intel 18A |
| Date | Q4 2023 | Q1 2025 ? | Desktop: Q4 2024; H&HX: Q1 2025 | Q4 2025 ? | Q4 2024 | Q1 2026 ? |
| Full Die | 6P + 8E | 6P + 8E ? | 8P + 16E | 8P + 32E | 4P + 4E | 4P + 8E |
| LLC | 24 MB | 24 MB ? | 36 MB ? | ? | 8 MB | ? |
| tCPU (mm²) | 66.48 | | | | | |
| tGPU (mm²) | 44.45 | | | | | |
| SoC (mm²) | 96.77 | | | | | |
| IOE (mm²) | 44.45 | | | | | |
| Total (mm²) | 252.15 | | | | | |



Intel Core Ultra 100 - Meteor Lake



As mentioned by Tomshardware, TSMC will manufacture the I/O, SoC, and GPU tiles. That means Intel will manufacture only the CPU and Foveros tiles. (Notably, Intel calls the I/O tile an 'I/O Expander,' hence the IOE moniker.)

 

Attachments

  • PantherLake.png
    283.5 KB · Views: 23,961
  • LNL.png
    881.8 KB · Views: 25,431

adroc_thurston

Platinum Member
Jul 2, 2023
2,219
2,936
96
> This can't be real right?

Lmao.

> If the focus is on efficiency it could be

Intrinsically every core focuses on efficiency.

> seeing how bad the inefficiency Raptor Lake is at the moment

It's ok unless you consider KS parts real.
Raptor-HX is genuinely alright.

> Could be better in MT than Zen 5 but worse in ST and edges out in efficiency kind of deal

It doesn't.
Skymont is pretty good but it's no Zen5.
 

S'renne

Member
Oct 30, 2022
129
88
61
> Lmao.
>
> Intrinsically every core focuses on efficiency.
>
> It's ok unless you consider KS parts real.
> Raptor-HX is genuinely alright.
>
> It doesn't.
> Skymont is pretty good but it's no Zen5.
Just saying the "efficiency emphasis" idea out loud only because of the rumour of LNL having a 1.5x higher MT GB (? forgot which software) score than MTL, or something like that.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,776
3,164
136
> What's the drawback if it doesn't do that after every op?

If you want to do GEMM, nothing; that's why they exist. If you want to do something other than GEMM, well, you're doing your math in the wrong unit.

GEMM units are specialised for a specific workload; AVX/SIMD units are general-purpose for any work that can be batched together.
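The distinction above can be sketched in a few lines of numpy (a software stand-in for the hardware units, not anyone's actual implementation): a GEMM is one fused multiply-accumulate over whole matrices, while SIMD-style work is any elementwise math that can be batched.

```python
import numpy as np

# GEMM: the one workload a matrix engine is built for --
# C = A @ B, a dense multiply-accumulate over whole tiles.
A = np.random.rand(64, 64).astype(np.float32)
B = np.random.rand(64, 64).astype(np.float32)
C = A @ B  # this is what a GEMM unit accelerates

# AVX/SIMD-style work: any elementwise math that batches,
# not just matrix products.
x = np.random.rand(64).astype(np.float32)
y = np.sqrt(x) * 2.0 + 1.0  # general vectorised math, no GEMM involved

# sqrt() has no matrix-product formulation, so it cannot be
# expressed as a GEMM -- hence "doing your math in the wrong unit".
```

The point is that only linear operations map onto a matrix engine; everything else belongs on the general SIMD units.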
 

Ghostsonplanets

Senior member
Mar 1, 2024
290
420
91
> Skymont is pretty good but it's no Zen5.

It's interesting that while there are a lot of rumors about ARC not being a good jump over GLC, the rumors around Skymont are pretty positive. It's almost as if the Atom team has been able to deliver a consistent cadence of performance improvements while the Core team takes its sweet time.

Tremont to Gracemont was a substantial jump, Crestmont was a nice update over Gracemont, and Skymont is rumored to be around GLC performance, which is really good for what are meant to be area-focused cores. One might wonder whether the Atom team will surpass the Core team someday if they keep up this impressive cadence.
 
Reactions: Tlh97 and Geddagod

dullard

Elite Member
May 21, 2001
25,069
3,420
126
> ?
> They do bf/fp8|16 math just fine.
>
> Most support FP16/BF16 just fine.

They can support different data types, but they often do not. The link below is an Intel graphic, so it is obviously geared to put Intel in the best light, but you can see that not all NPUs ran FP16 at the time the image was created. If anyone can find AMD's list of supported NPU data types, I'll link that as well (my Google skills are failing me; the latest information I found says the list of supported data types has not yet been released).
 

naukkis

Senior member
Jun 5, 2002
706
578
136
> ?
> They do bf/fp8|16 math just fine.
>
> Most support FP16/BF16 just fine.

The speculation was about replacing the FPU with an NPU. The FPU usually doesn't even support FP16 math: its single precision is 32-bit and its double precision is 64-bit (or more). OK, today's FPU SIMD does also support FP16, but that's an outlier for AI-style work, which is what the NPU is aimed at. So no, you can't replace the FPU with an NPU; they are designed for totally different kinds of precision.
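The precision gap described above is easy to demonstrate: FP16 keeps only about 10 mantissa bits, so small increments that FP32/FP64 preserve simply round away. A quick sketch using numpy dtypes as stand-ins for the hardware formats:

```python
import numpy as np

# A small increment a general-purpose FP workload might care about.
x = 1.0 + 1e-4

# FP64 and FP32 -- the FPU's native precisions -- preserve it.
assert np.float64(x) != np.float64(1.0)
assert np.float32(x) != np.float32(1.0)

# FP16, the NPU-class format: spacing near 1.0 is 2**-10 (~0.00098),
# so the 1e-4 increment rounds back to exactly 1.0 and is lost.
assert np.float16(x) == np.float16(1.0)
```

That lost increment is exactly why an NPU's low-precision formats cannot substitute for the FPU's FP32/FP64.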
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,573
14,526
136
> They can support different data types. But, they often do not. This link below is an Intel graphic, so it is obvious that it is geared to put Intel in the best light, but you can see that not all NPUs ran FP16 at the time of the image creation. If anyone can find AMD's list of supported NPU data types, then I'll link that as well (my Google skills are failing me, the latest information I found is that the list of supported data types has not yet been released).

I get a "forbidden" error on that link.
 

dullard

Elite Member
May 21, 2001
25,069
3,420
126
> Speculation was about replacing FPU with NPU. FPU doesn't usually even support FP16 math, it's single precision is 32 bit and double precision 64 bit(or more). OK, FPU SIMD today do support also FP16 but that's just outlier for something like AI where NPU is aimed at. So no, you can't replace FPU with NPU as they are designed to totally different kind of precision.

Using different data types is theoretically possible with NPUs. But the main issues are memory, bandwidth, and power. If you only need 4 bits (and AI often needs only 4 or 8 bits), then using something set up for 512 bits is quite a waste: it requires 128x more memory, has to move 128x more data around, and has to process 128x more of that data, using much more power, all while being limited to much smaller AI models due to those limits. So it isn't really efficient to use something set up for 512 bits on 4-bit data.

The reverse is true too. If you have an NPU optimized and designed for, say, 4-bit math and you need 16-bit data, then you have to transfer that data around in four chunks, which takes more time, and you have memory to store only a quarter of the data. It can work, but it just won't be as performant as you want.
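The chunking cost described above can be sketched in a few lines: a hypothetical 4-bit datapath handling 16-bit data must split each value into four nibbles and reassemble them (the widths and helper names here are illustrative, not any vendor's design).

```python
# Toy sketch: feeding a 16-bit value through a hypothetical 4-bit datapath.

def split_nibbles(value16):
    """Split a 16-bit value into four 4-bit chunks, low nibble first."""
    return [(value16 >> (4 * i)) & 0xF for i in range(4)]

def join_nibbles(nibbles):
    """Reassemble four 4-bit chunks into the original 16-bit value."""
    result = 0
    for i, n in enumerate(nibbles):
        result |= n << (4 * i)
    return result

v = 0xBEEF
chunks = split_nibbles(v)          # four transfers instead of one
assert join_nibbles(chunks) == v   # correct, but 4x the operations

# The memory argument from the post, in numbers:
bits_needed = 4      # what a 4-bit AI model actually requires per value
bits_used = 512      # a wide 512-bit path holding one value wastefully
print(bits_used // bits_needed)    # 128 -- the 128x factor mentioned above
```

Correctness survives, but every value now costs four transfers and four partial operations, which is the performance penalty the post describes.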
 


adroc_thurston

Platinum Member
Jul 2, 2023
2,219
2,936
96
> Just saying the "efficiency emphasis" idea out loud only because of the rumour of LNL having 1.5x MT GB(? Forgot which software) score than MTL or something like that

That's compared to (non-existent) MTL-U, which is one of the worst parts Intel has ever made. Not a real achievement per se.

> Again, if someone has a list of AMD supported NPU data types, I'll include the official list since I prefer my posts to be neutral in sources.

This really says nothing about the hardware, just DirectML support. AIE-ML in PHX supports FP16/BF16 just fine. See the docs.
 
Reactions: lightmanek