News Intel GPUs - waiting for B770

Page 87

LightningZ71

Platinum Member
Mar 10, 2017
2,210
2,694
136
Eh? During the long period of core count stagnation, the iGPU size was actually what Intel "improved" between generations. Which of course was completely superficial on the desktop, considering essentially everybody gaming used a dGPU anyway. The fact that Intel has been doing iGPUs for so long doesn't seem to help it push out dGPUs.
"Improved" is a very generous assessment of their iGPU activities between Broadwell and Comet Lake. There were minor additions to the media engine and modest feature adds to the EUs, but, aside from a few "Iris" products that almost exclusively went to Macs and an abortive MCM with AMD in Kaby Lake G, it's barely a blip with respect to performance... That level of effort certainly wasn't taking anyone away from even the worst of dGPUs, with the GT 1030 roundly beating every one of those standard iGPU solutions by a country mile.
 

LightningZ71

Platinum Member
Mar 10, 2017
2,210
2,694
136
Their chips already consume too much power and run too hot. Pushing that up even higher isn't really an option.
In the mobile space, that can be customized at the firmware and software level. For that additional performance and power draw, you get to eliminate low-end dGPU provisions from many designs, allowing the added cooling to make the higher total thermal dissipation more manageable. Concentrating the power in one chip, usually on a more refined process, would only serve to increase the power efficiency of the whole system at similar performance levels.
 

DrMrLordX

Lifer
Apr 27, 2000
22,577
12,441
136
Concentrating the power in one chip, usually on a more refined process, would only serve to increase the power efficiency of the whole system at similar performance levels.

That also makes cooling potentially more expensive. OEMs are cheapskates.
 

LightningZ71

Platinum Member
Mar 10, 2017
2,210
2,694
136
Oh, OEMs will certainly find a way to mess it up. However, it could also make materials usage a lot more efficient.
 

Panino Manino

Golden Member
Jan 28, 2017
1,091
1,330
136
I should have thought harder to express myself better and written something like "a moment of weakness", but SHAME ON ALL OF YOU! For even joking that I'm taking Intel's side.

Back on topic: even if these GPUs aren't that great, doesn't Intel still have an opportunity? If AMD really steps up its APU game and focuses on the high end, it seems that Intel will have an open niche to fill. They don't need more than that, just some price segment where they can start their long run.
I still believe Intel has an important part to play fighting Nvidia alongside AMD.
 
Jul 27, 2020
24,640
17,127
146
If AMD really steps up its APU game and focuses on the high end, it seems that Intel will have an open niche to fill. They don't need more than that, just some price segment where they can start their long run.
Yeah, if AMD's bright idea for future low-end GPUs is anything like the 6400/6500 XT, then sure. Intel gets to carve out its own slightly better low-end niche.
 

moinmoin

Diamond Member
Jun 1, 2017
5,212
8,378
136
doesn't Intel have an opportunity?
Intel had and still has plenty of opportunities, but hasn't made use of a single one yet. Which is why this thread turned out the way it did. Not because Intel's dGPUs are bad (or not, we still don't know) but because the only thing Intel's dGPUs have managed to do so far is miss great opportunities.

Note that this thread started in 2017 and is huge at 87 pages so far, and we are still waiting for the first real dGPU resulting from this whole effort.
 
Reactions: Tlh97

NTMBK

Lifer
Nov 14, 2011
10,405
5,651
136
I should have thought harder to express myself better and written something like "a moment of weakness", but SHAME ON ALL OF YOU! For even joking that I'm taking Intel's side.

Back on topic: even if these GPUs aren't that great, doesn't Intel still have an opportunity? If AMD really steps up its APU game and focuses on the high end, it seems that Intel will have an open niche to fill. They don't need more than that, just some price segment where they can start their long run.
I still believe Intel has an important part to play fighting Nvidia alongside AMD.

In all seriousness, yes, Intel has a shot. It will require long term commitment and a willingness to eat big losses in the medium term, but hopefully their new leadership is willing to do that.

In my view, this generation was always meant more to lay the groundwork, and start building up the ecosystem, the driver teams, the distribution channels, etc. It's built on TSMC! Intel really wants to be building GPUs in their own fabs, getting the benefits of vertical integration, and putting more money into their own process R&D instead of funding their competitor. They need a wider product range in order to sustain the R&D costs of fabs in the Angstrom era. They want GPU market share for the same reason they want a foundry business.

It's in their long term strategic interests to be building huge numbers of Intel GPUs in Intel fabs. I think they are willing to ride out a bad generation, to play the pricing games they need to be competitive, in order to achieve that long term.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
In all seriousness, yes, Intel has a shot. It will require long term commitment and a willingness to eat big losses in the medium term, but hopefully their new leadership is willing to do that.

In my view, this generation was always meant more to lay the groundwork, and start building up the ecosystem, the driver teams, the distribution channels, etc. It's built on TSMC! Intel really wants to be building GPUs in their own fabs, getting the benefits of vertical integration, and putting more money into their own process R&D instead of funding their competitor. They need a wider product range in order to sustain the R&D costs of fabs in the Angstrom era. They want GPU market share for the same reason they want a foundry business.

It's in their long term strategic interests to be building huge numbers of Intel GPUs in Intel fabs. I think they are willing to ride out a bad generation, to play the pricing games they need to be competitive, in order to achieve that long term.

Or they'll just fold like a cheap card table and throw the whole thing out.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
In the mobile space, that can be customized at the firmware and software level. For that additional performance and power draw, you get to eliminate low-end dGPU provisions from many designs, allowing the added cooling to make the higher total thermal dissipation more manageable. Concentrating the power in one chip, usually on a more refined process, would only serve to increase the power efficiency of the whole system at similar performance levels.

Not always. In a system with a discrete GPU, the discrete GPU is often powered down, and the much lower power integrated GPU handles basic screen drawing and video rendering. If the CPU has a big GPU on it, Intel would need to be able to really ramp the GPU's power down when it's doing basic work.

Also, it's typically much easier to cool two chips putting out 100W each than a single chip putting out 200W (random numbers, not actual ones). You need a lot more heatsink material to draw 200W away from a single point, and laptops being height-limited puts a cap on this.

Now, if Intel can find a way to make their chips consume far less power, then yes, having a big integrated GPU would work fine. This is how AMD does it: they have a pretty decent integrated GPU, but the CPU side is lower power.
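The single-hotspot argument above boils down to the steady-state relation T_die = T_ambient + P × R_th. A minimal Python sketch of it, where the ambient temperature and both thermal resistance values are made-up illustrative assumptions, not measurements of any real laptop:

```python
# Toy model: one 200 W chip under a single heatsink vs two 100 W chips,
# each with its own smaller heatsink. All numbers are assumptions.

T_AMBIENT_C = 35.0  # assumed air temperature inside the chassis

def die_temp_c(power_w: float, r_th_c_per_w: float) -> float:
    """Steady-state die temperature: T = T_ambient + P * R_th."""
    return T_AMBIENT_C + power_w * r_th_c_per_w

# Assumed: the single big heatsink has lower thermal resistance than
# each small one, but not half of it, so the 200 W die still runs hotter.
single_chip = die_temp_c(200, 0.375)  # one 200 W hotspot
split_chip = die_temp_c(100, 0.5)     # each of two 100 W chips

print(f"single 200 W die: {single_chip:.0f} C")  # prints 110 C
print(f"each 100 W die:   {split_chip:.0f} C")   # prints 85 C
```

With these assumed resistances the concentrated die lands 25 C hotter at the same total package power, which is the cooling penalty being described.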
 

maddie

Diamond Member
Jul 18, 2010
5,108
5,447
136
Not always. In a system with a discrete GPU, the discrete GPU is often powered down, and the much lower power integrated GPU handles basic screen drawing and video rendering. If the CPU has a big GPU on it, Intel would need to be able to really ramp the GPU's power down when it's doing basic work.

Also, it's typically much easier to cool two chips putting out 100W each than a single chip putting out 200W (random numbers, not actual ones). You need a lot more heatsink material to draw 200W away from a single point, and laptops being height-limited puts a cap on this.

Now, if Intel can find a way to make their chips consume far less power, then yes, having a big integrated GPU would work fine. This is how AMD does it: they have a pretty decent integrated GPU, but the CPU side is lower power.
You're forgetting the selective shutdown of unneeded logic. Everyone is moving to very fine-grained power gating. I can see this being extended to individual shader units and other component sub-blocks.
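The bookkeeping behind that kind of fine-grained power gating can be sketched as a toy model: gated-off sub-blocks burn only a small residual leakage instead of full dynamic power. The block names, wattages, and leakage fraction below are all made-up assumptions, purely for illustration:

```python
# Toy power-gating model: blocks in the `active` set draw full power;
# everything else is gated and leaks only a small residual fraction.

GATED_LEAKAGE_FRACTION = 0.05  # assumed residual leakage when gated off

def package_power(blocks: dict[str, float], active: set[str]) -> float:
    """Sum full power for active blocks, residual leakage for gated ones."""
    return sum(
        p if name in active else p * GATED_LEAKAGE_FRACTION
        for name, p in blocks.items()
    )

# Hypothetical GPU sub-blocks and their full-power draws (watts).
gpu_blocks = {"shader_0": 20.0, "shader_1": 20.0,
              "shader_2": 20.0, "shader_3": 20.0,
              "media_engine": 5.0, "display": 3.0}

# Desktop idle: only display output and the media engine are awake.
idle_w = package_power(gpu_blocks, {"display", "media_engine"})
full_w = package_power(gpu_blocks, set(gpu_blocks))
print(f"idle: {idle_w:.1f} W, full load: {full_w:.1f} W")
```

Under these assumptions the package idles at roughly 12 W instead of 88 W, which is the payoff of gating at the sub-block level rather than only at the whole-GPU level.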
 

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
In my view, this generation was always meant more to lay the groundwork, and start building up the ecosystem, the driver teams, the distribution channels, etc. It's built on TSMC! Intel really wants to be building GPUs in their own fabs, getting the benefits of vertical integration, and putting more money into their own process R&D instead of funding their competitor. They need a wider product range in order to sustain the R&D costs of fabs in the Angstrom era. They want GPU market share for the same reason they want a foundry business.

My prediction is that we are heading into a golden era for gaming in 2025+ when all three competitors are firing on all cylinders, boosted by huge war chests. Intel coasted for a decade on the CPU side and will really have to compete now, while Nvidia also needs to step up its game. Intel's new strategies of entering the dGPU and foundry markets will shake the dust out of the company, as they have no past successes to coast on in those markets. And AMD finally seems to have figured out how to compete sustainably (at least as long as they have the console income).

DDR5 will most likely go to the moon in the next few years (while the price will rapidly drop once it is no longer optional when using newer CPUs). Add in the breakthroughs we've had and will have for chiplets. Exciting times ahead. We'll just have to get used to undervolting and underclocking...

We also need Intel and Samsung fabs to improve. Can't just let TSMC make all the top-tier stuff.
 
Reactions: Lodix and Tlh97

LightningZ71

Platinum Member
Mar 10, 2017
2,210
2,694
136
Intel can easily find footing in the market by exploiting the vast pricing gaps Nvidia and AMD keep creating as they push MSRPs into the stratosphere with each generation. This has opened up many spaces for Intel to offer better performance per dollar while still being at worst revenue neutral until they have established market momentum. They just need something they have routinely failed to provide to the market: stable, high-performance drivers for 3D gaming. Oh, they've made "playable" drivers for many games over the years, but they did poorly with Xe. As for their last high-performance "i"GPU product, Kaby Lake G, they literally walked away from supporting it while products were still in warehouses waiting to be sold, then pointed fingers at the supplier they refused to pay.

Prove it Intel.
 

eek2121

Diamond Member
Aug 2, 2005
3,341
4,920
136
Or they'll just fold like a cheap card table and throw the whole thing out.

That will never happen. Intel needs the GPU segment to compete with NVIDIA and AMD in enterprise/datacenter/cloud markets.

People keep saying Intel is delayed, but I'm not seeing a delay. The roadmap I saw showed laptops getting a release, followed by desktops. I believe the first official desktop cards were due for release at the end of May, though I may be mistaken. I saw this roadmap in January or February. If I find it, I'll post a link.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
That will never happen. Intel needs the GPU segment to compete with NVIDIA and AMD in enterprise/datacenter/cloud markets.

People keep saying Intel is delayed, but I'm not seeing a delay. The roadmap I saw showed laptops getting a release, followed by desktops. I believe the first official desktop cards were due for release at the end of May, though I may be mistaken. I saw this roadmap in January or February. If I find it, I'll post a link.

I can't imagine Intel ever taking the big performance jump needed to be relevant in the high-end or even mid-range gaming market. I think they will always be way too conservative in terms of raw performance and they will always come too late. I think their heads are stuck in mobile garbage GPU mode and they can't get unstuck. They will make some low-end garbage and pop champagne bottles and say "that'll do" then stick them in $800 garbage OEM builds sold at Walmart. They will champion their all Intel sticker PCs as being capable of full HD gaming with high-details and 60+ fps. These GPUs will be invisible to the enthusiast and high-end gaming crowd probably forever, and that's if they even exist.
 
Reactions: VirtualLarry

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
People keep saying Intel is delayed, but I'm not seeing a delay. The roadmap I saw showed laptops getting a release, followed by desktops. I believe the first official desktop cards were due for release at the end of May, though I may be mistaken. I saw this roadmap in January or February. If I find it, I'll post a link.

They put Q1 in their slides and clearly missed it. They then claimed to have released the first products in Q2, but only a handful of laptops of a single make have been released, and only in Korea. I don't see how you can call this anything other than a paper launch. No laptops were provided to reviewers either, which is unlike a normal release and highly indicative of a launch meant only to technically meet their promises, not one most of us would recognize as a launch (a non-negligible quantity available in all regions).
 

moinmoin

Diamond Member
Jun 1, 2017
5,212
8,378
136
Weirdly, other sites are reporting a "partnership" while PCMag reports it as an acquisition.
It's an acquisition.
 

eek2121

Diamond Member
Aug 2, 2005
3,341
4,920
136
I can't imagine Intel ever taking the big performance jump needed to be relevant in the high-end or even mid-range gaming market. I think they will always be way too conservative in terms of raw performance and they will always come too late. I think their heads are stuck in mobile garbage GPU mode and they can't get unstuck. They will make some low-end garbage and pop champagne bottles and say "that'll do" then stick them in $800 garbage OEM builds sold at Walmart. They will champion their all Intel sticker PCs as being capable of full HD gaming with high-details and 60+ fps. These GPUs will be invisible to the enthusiast and high-end gaming crowd probably forever, and that's if they even exist.

It's not like these cards are weak. It will come down to price and driver quality. Intel technically does not NEED a high-end card. Far more 3060s and 3070s have sold than 3080s and 3090s. Even if next gen from AMD/NVIDIA is significantly faster in the midrange, Intel need only tweak the price a bit. Since these things are on N6, they are going to be cheap to make, so although Intel will have to sacrifice margins, they could potentially price the cards competitively with next gen.

Also, next-gen cards from NVIDIA and AMD are almost certainly Q4. Intel will likely have launched desktop and mobile parts by then. Note that OEMs will 'prefer' Intel GPUs for Intel laptops and desktops, since they'll likely get huge discounts for going all-Intel. That means you are likely to see a LOT of laptops that currently use Intel + NVIDIA switch to Arc.
 

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
It's not like these cards are weak. It will come down to price and driver quality. Intel technically does not NEED a high-end card. Far more 3060s and 3070s have sold than 3080s and 3090s. Even if next gen from AMD/NVIDIA is significantly faster in the midrange, Intel need only tweak the price a bit.

The driver quality seems to suck really hard, though. I'd rather pay more for a card that actually works than for one that theoretically has better performance for the price but is horrible to use in practice due to crashes, weird bugs, uneven performance, etc.

Intel has a lot of fanboys, but selling them crappy cards seems like a good way to cure those fanboys.
 