DrMrLordX
Is Intel joining AMD in the open source movement?
Arguably they're already there (see: Intel's Linux distro, their SYCL efforts, and a lot of other stuff).
Most data-science-related libraries now come with a "warning" along the lines of "this build is optimized with Intel oneAPI for AVX/AVX2". So yeah, they are already putting oneAPI into things, where AMD after a decade is still absent.
It's kind of ironic, as Intel in general is also a pretty big software house, while AMD is traditionally a hardware company. So I would have expected Intel to be able to deliver good drivers. (But we don't know if the chips are so buggy that driver development becomes a chore, or if the team simply didn't get enough budget, time, and heads-up.) Intel should have poached at least one senior person from the NVIDIA or AMD driver teams, also for that person's connections to game developers for optimizations.
What I find odd is that the driver team could have started getting their drivers up to speed back when Intel announced discrete GPUs almost five years ago. The current troubles make it seem Intel started to care about drivers way later.

They simply didn't care, or felt no need to, since these were iGPUs. Now they will have to, because the problem is much closer to them when people complain.
But more on that in a moment. First of all, in this context, I would like to announce our official “launch” of the Gunnir Arc A380 for Wednesday 07/20/2022 at 12:00.
One sample went to Computerbase, one to Golem, one to a forum member from the 3D Center and one thankfully to me. And since we all want to test collegially with each other and not against each other, we have agreed on a common time when we will all go online at the same time. And we also agreed on the content in order to complement each other and not to overlap in a big way. Everyone does what they do best. That’s why you should also read the other two articles on Wednesday to get a really full picture.
Apparently on Wednesday we will see A380 reviews from ComputerBase, Golem, and Igor's Lab.
Ha, ha, ha!

No Anandtech review?
What's weird is that a roughly 100 MHz GPU OC is adding almost 10-30 fps to the average. Is the RAM also overclocked? The Doom Eternal OC improvement is pretty good.

The OC A380 is quite a bit faster than the default A380.
I am looking forward to seeing how it all plays out. I have systems for older games. So I would not hesitate to buy an Intel ARC for new ones, if they are the best bang for buck.
Or they seem to have focused on 3DMark and synthetics way too much. Once they got those running fine, they must have expected everything to just start working. It did work, but it was slow because shaders in games target AMD/NVIDIA hardware. Hopefully we will get a game developer's opinion on Intel's dGPU at some point. It should be very interesting to hear how it differs from AMD/NVIDIA GPUs and how those differences impact performance.
I find this very weird. Power consumption OC vs. no OC.
View attachment 64693
Left is stock A380, center is A380 OC and right is GTX 1650.
And Cyberpunk...
View attachment 64694
Yeah.... it looks like A380 might not be too bad after all. Their default power limits are extremely weird by the looks of it.
Didn't the DG1 have some weird power issues? There are clearly some weird power-related issues going on.

How are they getting a 23%-40% increase in performance from a 6% increase in clock frequency? The power increase for the shown clock increase also seems... weird.
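A quick back-of-envelope check of that question (a sketch only: it assumes fps scales roughly linearly with sustained GPU clock, and the 2450 MHz figure is an assumed nominal boost clock, not a measured one): if a nominal +6% clock bump yields +23-40% fps, the stock card cannot have been sustaining anywhere near its nominal clock.

```python
# Back-of-envelope: if fps scaled ~linearly with sustained GPU clock
# (a rough simplification), what sustained stock clock is implied by
# the reported fps gains at the +6% OC clock?

def implied_stock_clock(oc_clock_mhz: float, fps_gain: float) -> float:
    """Sustained stock clock implied by fps_gain (e.g. 0.23 for +23%)."""
    return oc_clock_mhz / (1.0 + fps_gain)

NOMINAL_MHZ = 2450.0            # assumed nominal boost clock (hypothetical)
oc_clock = NOMINAL_MHZ * 1.06   # the +6% OC reported in the thread

for gain in (0.23, 0.40):
    sustained = implied_stock_clock(oc_clock, gain)
    print(f"+{gain:.0%} fps implies a sustained stock clock of ~{sustained:.0f} MHz "
          f"({sustained / NOMINAL_MHZ:.0%} of nominal)")
```

Either the stock card was throttling well below its rated clock, which points at power management rather than the clock bump itself, or the gains aren't really coming from the extra 100 MHz.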
That shows the non-OC ARC GPU at 100MHz. Something is definitely amiss.
The OC GPU is also placing more demands on the CPU.
What he did change were the power targets and voltage offsets in Intel's own graphics utility, with a setting of 55% on the "GPU Performance Boost" slider and a +0.255 mV voltage offset.
So at default the power is too low. That's weird, because it's under 40 W a lot of the time. Why did Intel make that decision?
Doesn't Intel normally design their desktop platforms to be laptop-first, desktop-second, most of the time?
The A380's reliance on Resizable BAR to perform is pretty disappointing from an old-system-upgrade point of view. Overall performance is not that terrible, though.
The ReBAR "issue" also looks power-related. With ReBAR off, the card is not ramping up as it should. So logically the solution is to increase the power limits to match ReBAR on.
Watch Dogs
Rebar off: 30-35W
Rebar on: 35-40W
Rebar on + OC: 50W
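For scale, here is a small sketch comparing those readings against the A380's 75 W reference board power (an assumption on my part; the Gunnir card's own limit may be higher): even the ReBAR + OC run sits around two-thirds of that budget, which fits the theory that the stock behavior is overly conservative power management rather than a hard limit.

```python
# Sketch: the thread's Watch Dogs power readings as a fraction of an
# assumed 75 W reference board power for the A380.

def budget_fraction(lo_w: float, hi_w: float, budget_w: float = 75.0) -> float:
    """Midpoint of a measured power range as a fraction of the power budget."""
    return ((lo_w + hi_w) / 2.0) / budget_w

readings = {
    "ReBAR off":     (30.0, 35.0),
    "ReBAR on":      (35.0, 40.0),
    "ReBAR on + OC": (50.0, 50.0),
}

for config, (lo, hi) in readings.items():
    frac = budget_fraction(lo, hi)
    print(f"{config:14s} ~{(lo + hi) / 2:.0f} W -> {frac:.0%} of 75 W")
```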
Both the hardware and the drivers are based on the iGPUs, which have existed for well over a decade already. Improving the drivers is something Intel could have started from the announcement. At first, the opposite happened instead.

How would they? At a minimum, they couldn't start until the initial design was complete.