Has anyone posted the Semiaccurate article about Intel's driver teams in Russia? It has a date of Sept. 2nd but I only read it today.
Giving up the consumer dGPU business isn't a simple binary argument. Even a mediocre dGPU business adds foundry volume, which in turn helps their momentum. The same applies to their iGPU efforts: the increased mind share and R&D from dGPU may push Intel's consumer mobile segment in the medium and long term. Whatever they choose between dividends and Arc, I think there's potential for unintended consequences down the road.

I think you guys are underestimating how hard it's going to be for Intel to even pretend to continue to be a leading-edge foundry while being cash-flow negative, even with the free government money. And I'm sure they really, really, really want to keep the dividend.
On Twitter:
John Ospanov: There are rumours of Intel dropping its video card business. Any update on that?
Dr. Ian Cutress: They're not.
John Ospanov: Are you sure? Moore's Law Is Dead claims so.
Dr. Ian Cutress: Sorry, I forgot MLID was the authority.
Hopefully that takes MLID down a notch, or ten!
Quite frankly, being a long-time PC (x86/x64) enthusiast, it frightens the crap out of me that, if the winds blow where they may, we might just see "Windows-on-ARM" PC desktops on the budget PC shelf at Walmart, for example. There would be a split between the average Joe's PC and our enthusiast PC, and not just a matter of part class, but a wholly different taxonomy of parts.

But the bigger issue is it might push OEMs to really start moving their volume shipments to ARM. And the timing is poor for Intel, as I think the deal between Microsoft and Qualcomm for Windows on ARM is over (or just about). Intel might get lucky, as there's a rift in the ARM space: Qualcomm seemingly wants to make high-performance ARM designs targeting PCs, and Arm is suing over the licensing deal behind them (which might itself be due to the Qualcomm/Microsoft situation that has stifled ARM there), but it's also possible that gets resolved without too much issue. But there are plenty of other players (including both Nvidia and AMD), so things could change pretty quickly.
Same here. There will be compatibility sacrifices in going with WoA; it wouldn't be ideal. But then, what if we get 25-hour battery life from a 99 Wh battery? That might make the trade-off worth it. But it might kill budget x86, and in the future, x86 itself for good. That is not something I want to see in my lifetime.

Quite frankly, being a long-time PC (x86/x64) enthusiast, it frightens the crap out of me that, if the winds blow where they may, we might just see "Windows-on-ARM" PC desktops on the budget PC shelf at Walmart, for example.
God, anyone remember Intel's complete fail of a purchase of McAfee for $7.5 billion back in 2011? That money could have been used to better effect in almost any other product field. Imagine if Intel had slowly and steadily developed a dGPU business for datacenters with that money. Imagine if it had invested that money in mobile, or spent it on starting IFS a decade early.

Remember when they took a $10-billion-plus loss trying to penetrate the mobile market, then failed and laid off 12,000 employees? After blowing golden opportunities the decade before by selling XScale and passing up a chance to power the iPhone. That's probably what trying to supplant AMD would be like this late in the game. They missed their window and would have to play catch-up. All to win a market with low margins? Unlikely.
Stupid? Yes, it is.

Also, isn't that the cost over 5 years? So $700 million/year is not that big a deal for Intel. Especially considering they still need to design the core architectures, the drivers, and the UI, so killing discrete GPUs probably wouldn't even save half of that. Maybe $200 million.
200 million/year savings isn't make or break for Intel. It would be stupid to can it over that.
But Intel has a bad habit of over-emphasizing what would make investors happy at the expense of its long-term vision.
Yeah, in those excess years Intel absolutely dominated the market in the core business but had zero idea how to expand beyond it.

It would seem Intel habitually spends a great many excess years on side projects.
It's the curse of success (and also the Innovator's Dilemma). Many successful, dominant companies have died of this hubris and willful blindness to the future.

Yeah, in those excess years Intel absolutely dominated the market in the core business but had zero idea how to expand beyond it.
How many years on modems?
Intel made a ton of money off of the modem.
Where? After how many years of losses? It seems mostly to have been bundled with their massively money-losing smartphone/tablet chip group.
WSJ said selling iPhone modems was losing money:
Intel Finally Makes the Right Call on Modems
Intel’s news that it was effectively exiting the business of making modem chips for smartphones was not so surprising. Its iPhone business was losing money and straining capacity—and likely to go away soon anyway. (www.wsj.com)
Quite frankly, being a long-time PC (x86/x64) enthusiast, it frightens the crap out of me that, if the winds blow where they may, we might just see "Windows-on-ARM" PC desktops on the budget PC shelf at Walmart, for example.
Granted, I know that Intel experimented with Atom-based Celerons and, later, "Pentium Silver". But they (or OEMs) have been pretty good about it lately, keeping abominations like Pentium Silver in laptops and the occasional budget AIO PC, where low power suits the slim profile and near-absent cooling better than your average 65W-TDP desktop CPU would.
Now, with ARM on the desktop, if you want to install a GPU, you no longer need "Nvidia drivers" but "ARM-specific Nvidia drivers" for that PCI device ID. It adds an additional layer of complexity to servicing those machines.
But if MS does things right, then Windows-on-ARM should be as familiar as ever.
I think you misunderstood my take; I never intended to imply Intel didn't try or gave up too quickly. That wasn't the case. Intel just didn't know how to expand from its core business: it didn't know which expansions were worth pursuing (I mean, McAfee?), how to approach them, or at what point to cut losses (Optane is the most recent example of that).

Just pointing out that a claim that Intel gives up easily doesn't really track with history.
You could claim we are in uncharted territory, but that's different than holding up history as evidence of things that really didn't happen.