Can probably base things off this; not much will have changed except for mild increases from all sides:
https://www.anandtech.com/show/17024/apple-m1-max-performance-review
(Then all scaled up for the Ultra, of course.)
There do seem to be some workloads where it really is simply spectacular...
Wasn't it massively up over the last year or two, though? Looks more like a reversion towards the long-term mean than anything actually very worrying.
(Some combination of crypto & Covid, you have to presume.)
Unless/until the AI people stop demanding the biggest dies they can get, there's no way that NV will stop making the biggest dies they can.
Whether those can then still be priced at levels gamers are willing to pay, I have no idea.
Well, how well did the 12400/12600 sell vs the K chips last time? Those were 'only' 4+0/6+0 but also had hugely more manageable power draws than the K chips, which must surely rate as a nice win for mainstream customers.
(Definitely why I got a 12600 - no Zen as I wanted an iGPU as backup.)...
To be fair, it should work much better than the more basic extant algorithms for doing this.
They're doing a lot of computational work of a sort that NNs are brilliant at. With the images on both sides to work with, the results should presumably be very good in terms of image quality. They...
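For a rough sense of what the "more basic extant algorithms" amount to, here's a minimal sketch (my own illustration, hypothetical function names, not anything from the actual product): generating an in-between frame by simply cross-fading the two neighbouring frames, with no motion estimation at all. This is exactly the sort of baseline an NN-based interpolator should beat handily on image quality.

```python
# Naive frame "interpolation": a plain linear blend of the two neighbouring
# frames. No motion compensation, so moving objects ghost/double-expose.
# NN-based interpolators effectively replace this with learned motion
# estimation between the two source frames.
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Blend two frames (H x W x 3, uint8) at position t between them."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.clip(0, 255).astype(np.uint8)

# Example with two dummy 1080p frames; a real interpolator would warp pixels
# along estimated motion vectors rather than just cross-fading like this.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
mid = naive_interpolate(a, b)
```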
Zen4 will be more efficient, I presume, even quite a lot more so. Just need to wait for the versions limited to lower power draws. Like with the recent GPUs, some people need/want all the power they can get. Process tech isn't keeping up, so the top parts end up with higher power draws.
99% of consumers...
It's not insane at all. There's a huge market for these cards in AI these days, and that market wants the biggest, most powerful things it can get.
The power-draw reductions from new processes aren't remotely keeping up with the number of transistors on offer, so you get this. Happy with the...
Machine learning stuff, broadly, I think. Hence building some of these chips up so big/expensive. Perhaps AR/VR at some future point?
You can definitely see why they started focusing on RT. Without it, even at 4K, there just wouldn't be a remotely sane reason to have the top-end cards...
Well, if they'd really pushed perf/W but not mentioned it, then I'll choose to be disappointed on those grounds :) I also just don't like the way the top few SKUs are having their TDPs raised for really quite marginal gains.
(Obviously Intel have been much worse at this recently!)
The...
Well, actually yes it is, isn't it? Clockwork. I mean, they could in theory have run some series on an 18-month cycle, the odd random one on 12, some on 30 months, etc.
As it is, it's two years, with perhaps slight changes in when the cards actually appear, between July and September.
iirc the...
This sort of thing always amuses me when I think of people trying to train models to predict the stock market based on the news :)
(Some reasons for it, of course, I guess, but still!)
As per the initial question, it's just not that comparable - Intel's big cores are very, very big and power-hungry.
Somewhat undesirably so, to be honest. Apple (and AMD) have got theirs rather more under control, so it makes sense to load up on them instead.