Intel Broadwell Thread


antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Gaming can also refer to Angry Birds, though...

Ok, then let me correct myself. DCC does not, as far as I can tell, come into play with any kind of workload that would be relevant for the kind of gaming I assume posters on this forum are actually interested in (i.e. the kind that AnandTech uses in its benchmarks).
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Yes, that's probably true. But even then I can see cases where the igpu might sleep for a few milliseconds here and there, for example A.I. turns in civ 5.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
No one disagrees with that, but you said "Now obviously Vmin is not quite the same as threshold voltage, but even then Vmin should still only come into play when idling."
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Yes, that's probably true. But even then I can see cases where the igpu might sleep for a few milliseconds here and there, for example A.I. turns in civ 5.

I very much doubt that the iGPU would sleep in that case actually, since you still have to display the world map with all that that includes.

No one disagrees with that, but you said "Now obviously Vmin is not quite the same as threshold voltage, but even then Vmin should still only come into play when idling."

Actually what I posted that IntelUser2000 first replied to was: "It is my impression from Intels slides that DCC only comes into play when you've already gone down to threshold voltage, which I would imagine would only occur at idle (or very close to it)." (IntelUser2000's highlighting)

Now I'll admit that I forgot to put in the "(or very close to it)" in my reply to him, but it was clearly there in the first post (and he even highlighted it).
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Ok, then let me correct myself. DCC does not, as far as I can tell, come into play with any kind of workload that would be relevant for the kind of gaming I assume posters on this forum are actually interested in (i.e. the kind that AnandTech uses in its benchmarks).

I wouldn't doubt that it would need to use DCC for the games that most AT Forum users are expecting.

But that's not necessarily what I am talking about.

-Warcraft III: Frozen Throne
-Logging on my Ivy Bridge Ultrabook(Dell XPS 12, Core i7, 8GB RAM, HD 4000)
-Tools used: HWINFO64 and GPU-Z
-Balanced Power Mode used on Battery
-1366x768 High settings

The min clock for the HD 4000 on the Core i7 3517U is 350MHz and the max is 1.15GHz. GPU-Z shows that it ramps up to 650MHz. HWINFO64 shows that it changes between 350MHz and 650MHz every few seconds or so.
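The sampling described above can be reproduced with any tool that exports clock logs to CSV. A minimal sketch in Python, assuming a HWINFO64-style CSV export with a "GPU Clock [MHz]" column (the column name and file layout are assumptions; adjust to match your actual log):

```python
# Sketch: summarize how often the iGPU sits at each clock, from a CSV
# log of periodic samples. The "GPU Clock [MHz]" column name is an
# assumption about the export format, not a documented HWINFO64 schema.
import csv
from collections import Counter

def clock_residency(csv_path, column="GPU Clock [MHz]"):
    """Return the fraction of samples spent at each GPU clock (MHz)."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                counts[round(float(row[column]))] += 1
            except (KeyError, ValueError):
                continue  # skip malformed or non-numeric rows
    total = sum(counts.values()) or 1
    return {mhz: n / total for mhz, n in sorted(counts.items())}
```

For a log that alternates between 350MHz and 650MHz as described, this would show roughly what share of samples sat at each state.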

The HD 4400 is faster than this, and Broadwell graphics faster still. If the HD 4000 only needs to run between 350-650MHz, I doubt Broadwell needs anywhere near that, and I doubt the GT3 version (specifically the U variant with the same TDP as the GT2) needs anything above the 350MHz base. Nvidia is using frame-rate capping technology to improve battery life in 3D gaming, and Intel has published research papers doing the same. They aren't going to mindlessly aim for frame rates that are unrealistic (100+) on a battery-capped device like tablets and laptops.

(Anecdotally, I remember playing League of Legends a couple of years ago on a fully charged battery just for fun. While Anand's tests show the Ivy Bridge Ultrabook lasting only 107 minutes in 3DMark06, I played the game for 2.5+ hours (150+ minutes) before running out of battery. League of Legends is much more demanding than WC3, but noticeably less so than running 3DMark06. It's a popular enough game to be relevant to the discussion.)

Broadwell doesn't just target PC applications, but tablet ones too. The 3D requirements of tablets are far below even "mainstream" PC titles. Furthermore, Intel even wants Core-powered Android devices.
 
Last edited:

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I wouldn't doubt that it would need to use DCC for the games that most AT Forum users are expecting.

But that's not necessarily what I am talking about.

-Warcraft III: Frozen Throne
-Logging on my Ivy Bridge Ultrabook(Dell XPS 12, Core i7, 8GB RAM, HD 4000)
-Tools used: HWINFO64 and GPU-Z
-Balanced Power Mode used on Battery
-1366x768 High settings

The min clock for the HD 4000 on the Core i7 3517U is 350MHz and the max is 1.15GHz. GPU-Z shows that it ramps up to 650MHz. HWINFO64 shows that it changes between 350MHz and 650MHz every few seconds or so.

The HD 4400 is faster than this, and Broadwell graphics faster still. If the HD 4000 only needs to run between 350-650MHz, I doubt Broadwell needs anywhere near that, and I doubt the GT3 version (specifically the U variant with the same TDP as the GT2) needs anything above the 350MHz base. Nvidia is using frame-rate capping technology to improve battery life in 3D gaming, and Intel has published research papers doing the same. They aren't going to mindlessly aim for frame rates that are unrealistic (100+) on a battery-capped device like tablets and laptops.

(Anecdotally, I remember playing League of Legends a couple of years ago on a fully charged battery just for fun. While Anand's tests show the Ivy Bridge Ultrabook lasting only 107 minutes in 3DMark06, I played the game for 2.5+ hours (150+ minutes) before running out of battery. League of Legends is much more demanding than WC3, but noticeably less so than running 3DMark06. It's a popular enough game to be relevant to the discussion.)

The simple fact is that we just don't know where Fmax@Vmin falls on the frequency scale (although we can probably agree that it's not equal to the max clock).

If we assume that Fmax@Vmin is equal to the base clock (i.e. 350 MHz in your case), then it is obvious that WC3 requires an average level of performance above this level; otherwise it wouldn't be jumping up to 650 MHz half the time.
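The duty-cycle reasoning above can be made concrete: the average performance a workload demands sits at the time-weighted mean of the clock states it occupies. A small sketch (the 50/50 split is illustrative, not a measured residency):

```python
# Sketch: the effective average clock implied by duty-cycling between
# two frequency states. frac_high is the fraction of time spent at the
# higher clock; the 0.5 value below is an illustrative assumption.
def effective_clock(f_low, f_high, frac_high):
    """Time-weighted average frequency for a two-state duty cycle."""
    return f_low * (1 - frac_high) + f_high * frac_high

# If the HD 4000 spends half its time at 650 MHz and half at 350 MHz,
# the workload demands roughly a 500 MHz sustained clock:
avg = effective_clock(350, 650, 0.5)  # 500.0
```

So any workload that forces excursions above the base clock is, on average, demanding more than Fmax@Vmin can deliver under the base-clock assumption.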

Now obviously the performance at Fmax@Vmin goes up with an increase in functional units (i.e. Broadwell), but one would hope that by the time Broadwell comes out, 1366x768 will be largely extinct and you would be looking at 1080p at a minimum, so the load would also go up. Plus, as time goes by, more and more people will probably move on from 12-year-old games like WC3 (to something like League of Legends, for instance), which would also increase performance demands.

Broadwell doesn't just target PC applications, but tablet ones too. The 3D requirements of tablets are far below even "mainstream" PC titles. Furthermore, Intel even wants Core-powered Android devices.

If the tablet in question has a Broadwell chip in it, then it will almost certainly be running Windows, and as such the stuff you'll be running on it is precisely PC applications.

Either way, tablets don't have 3D performance requirements; software does. And yes, you could run Angry Birds on your Broadwell-equipped tablet if you felt like it and probably get some extra battery life from DCC, but again I kinda doubt most people on this forum (or in general, for that matter) buy a Broadwell tablet to play Angry Birds.

Also, if Broadwell goes into an Android tablet, it would probably be a SKU with only 8 EUs, or half of what your 3517U has, and even with an increase in IPC/frequency it would probably still be a fair bit slower than the HD 4000.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The simple fact is that we just don't know where Fmax@Vmin falls on the frequency scale (although we can probably agree that it's not equal to the max clock).

If we assume that Fmax@Vmin is equal to the base clock (i.e. 350 MHz in your case), then it is obvious that WC3 requires an average level of performance above this level; otherwise it wouldn't be jumping up to 650 MHz half the time.

Now obviously the performance at Fmax@Vmin goes up with an increase in functional units (i.e. Broadwell), but one would hope that by the time Broadwell comes out, 1366x768 will be largely extinct and you would be looking at 1080p at a minimum, so the load would also go up. Plus, as time goes by, more and more people will probably move on from 12-year-old games like WC3 (to something like League of Legends, for instance), which would also increase performance demands.



If the tablet in question has a Broadwell chip in it, then it will almost certainly be running Windows, and as such the stuff you'll be running on it is precisely PC applications.

Either way, tablets don't have 3D performance requirements; software does. And yes, you could run Angry Birds on your Broadwell-equipped tablet if you felt like it and probably get some extra battery life from DCC, but again I kinda doubt most people on this forum (or in general, for that matter) buy a Broadwell tablet to play Angry Birds.

Also, if Broadwell goes into an Android tablet, it would probably be a SKU with only 8 EUs, or half of what your 3517U has, and even with an increase in IPC/frequency it would probably still be a fair bit slower than the HD 4000.

768p is going to be here for a long time with budget notebooks.

Cherry Trail has 16 EUs, but low-end SKUs may have 8 disabled.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
768p is going to be here for a long time with budget notebooks.

Cherry Trail has 16 EUs, but low-end SKUs may have 8 disabled.

But would those budget notebooks have Broadwell and not Cherry Trail (or an AMD chip, for that matter)? Obviously there will be some models with Broadwell and 768p, but I would hope (perhaps naively) that the vast majority would be above 768p.

Also, as far as I can tell, Intel's notebook aspirations for Broadwell seem to be mainly 2-in-1 convertibles, which generally belong to a segment above budget and thus would hopefully have something above 768p.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
But would those budget notebooks have Broadwell and not Cherry Trail (or an AMD chip, for that matter)? Obviously there will be some models with Broadwell and 768p, but I would hope (perhaps naively) that the vast majority would be above 768p.

Also, as far as I can tell, Intel's notebook aspirations for Broadwell seem to be mainly 2-in-1 convertibles, which generally belong to a segment above budget and thus would hopefully have something above 768p.

I think the vast majority will have 768p. Especially for budget (>$400).

For Haswell, the vast majority of budget notebooks use ULV CPUs and are 768p at around $450-700. You start hitting 1080p at around $750.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I think the vast majority will have 768p. Especially for budget (>$400).

For Haswell, the vast majority of budget notebooks use ULV CPUs and are 768p at around $450-700. You start hitting 1080p at around $750.

Honestly I'll gladly admit that I'm probably being overly optimistic here, but I do believe that we will (slowly) start to see an increase in resolution even at the budget end.

First of all, if Intel (and the OEMs) are serious about the whole 2-in-1 convertible concept, then they will have to deal with the fact that the average consumer will be comparing the 2-in-1 laptop with a high-resolution $500 tablet from Samsung or Apple. And no, of course an Android tablet won't be close to an x86 Windows machine capability-wise, but again your average consumer might not realise that (or even care, given their usage pattern).

Secondly, the really low-budget Windows laptops (i.e. $250-$400) will inevitably have to compete with Chromebooks, where we are already seeing 1080p machines for $300 (the new K1 Acer).

Thirdly, with Microsoft now offering a free OEM version of Windows, OEMs will hopefully use those savings to become more competitive in other areas (i.e. resolution).
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Either way, tablets don't have 3D performance requirements; software does. And yes, you could run Angry Birds on your Broadwell-equipped tablet if you felt like it and probably get some extra battery life from DCC, but again I kinda doubt most people on this forum (or in general, for that matter) buy a Broadwell tablet to play Angry Birds.
Android applications are far less demanding in 3D because of the devices the OS matured on. You do have a few companies trying to run Android on Core already.

Angry Birds isn't relevant anyway; it's not even a 3D application. By the way, my XPS's resolution is 19x12. WC3 just ran at 1366.

Also, if Broadwell goes into an Android tablet, it would probably be a SKU with only 8 EUs, or half of what your 3517U has, and even with an increase in IPC/frequency it would probably still be a fair bit slower than the HD 4000.
Ideally you should have tablet-level power when running tablet-level applications and high-end system power when running very demanding applications.

Unfortunately, while the latter is true, the former is not for Core-based devices. As the TDP goes lower and lower (4.5W with Core M), you get scenarios where it's not worth lowering TDP any further, since there is a fixed low-level power floor that takes up a large share of the budget.

If you look at very low power Core devices like Haswell-Y and take a look at some power numbers, you'll notice that it's exactly so. Currently, that limit is the GPU (Intel has mentioned a power-efficient GPU as the limiter for low TDP a few times).

22nm allows significant gains exactly at very low voltages; at the high end, not so much. The point of diminishing returns from lowering voltage has moved much higher, and probably further so at 14nm.
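The diminishing-returns point can be illustrated with a toy power model: dynamic power scales roughly as C·V²·f, while leakage shrinks only about linearly with voltage, so leakage dominates as voltage drops. A sketch with made-up coefficients (real silicon curves differ considerably):

```python
# Sketch: toy dynamic-plus-leakage power model. c_dyn and i_leak are
# illustrative constants, not measured values for any Intel part.
def power(v, f, c_dyn=1.0, i_leak=0.2):
    """Dynamic power ~ C*V^2*f; leakage modeled as roughly linear in V."""
    return c_dyn * v * v * f + i_leak * v

# Halving voltage at fixed frequency cuts the dynamic term 4x, but the
# leakage term only 2x, so leakage's share of total power grows:
p_high = power(1.0, 1.0)  # dynamic 1.0 + leakage 0.2
p_low = power(0.5, 1.0)   # dynamic 0.25 + leakage 0.1
```

In this toy model leakage is ~17% of power at 1.0V but ~29% at 0.5V, which is the sense in which further voltage reduction stops paying off.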

Why would they care about lowering 3D clocks/power when 2D applications have a separate clock?
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
14nm, if delivered as promised (Q4 2014), will be impressive: 15hr battery life on a 13" notebook.
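As a rough sanity check of that claim, a 15-hour runtime implies a very low average platform draw. A quick sketch, assuming a typical ~50 Wh battery for a 13" notebook (the capacity figure is an assumption, not a quoted spec):

```python
# Sketch: average platform power implied by a runtime claim.
# The 50 Wh battery capacity is an assumed typical value.
def average_draw_watts(battery_wh, hours):
    """Average power draw needed to drain battery_wh over hours."""
    return battery_wh / hours

draw = average_draw_watts(50, 15)  # ~3.3 W for the entire platform
```

An average of roughly 3.3 W has to cover the display, SoC, memory, and storage combined, which shows how aggressive the 15hr claim is.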
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Does Broadwell-Y use any method to reduce clock distribution power consumption? Not using a metal grid could significantly reduce power consumption like Atom (by 30%), but I haven't heard anything about it.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Try not to read into it too much. Intel held this press briefing at their Oregon campus (as opposed to their Santa Clara campus), so it made a lot more sense to send me since I live in the area.
You were not lying, were you?
 
Last edited:

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I understand the second sentence, but you can't deny that the first one unambiguously states that we shouldn't expect crazy news like Anand's retirement.

Would the Anand Shimpi from 2012 deliberately miss out on a chance to get to know Intel's next-gen architecture, platform and manufacturing process? Of course not.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
(Anecdotally, I remember playing League of Legends a couple of years ago on a fully charged battery just for fun. While Anand's tests show the Ivy Bridge Ultrabook lasting only 107 minutes in 3DMark06, I played the game for 2.5+ hours (150+ minutes) before running out of battery.

Anecdotally, I played a game of LoL after I built my latest G3258 system, but before I installed my video card. I played one game on that Intel iGPU, and I will never play another. The display driver crashed 20 times. Sure, the FPS and IQ were OK (after I fiddled with the quality settings), but crash after crash is just not acceptable. And that is going to be a problem 5 years from now even if Intel's iGPU somehow becomes powerful enough to play any game.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I feared this before, but next year is looking like a yawnfest for the mainstream Intel desktop CPU platform. I didn't quite understand why Intel would launch Broadwell-K only to later introduce the superior Skylake-K. Well, it looks like my fears are coming true: Intel will gimp Skylake in 2015 and launch the slower Skylake-S, while Broadwell-K is what will replace the i7 4790K. If so, I don't foresee any worthwhile Skylake-K in 2015 at all. Perhaps Intel is shifting the performance aspect to the X99 platform, with Broadwell-E taking over from HW-E in 2015 as the performance offering.

http://www.fudzilla.com/home/item/35645-broadwell-to-be-faster-than-skylake-s-in-desktop

I guess Intel wants to recoup the R&D costs on Broadwell, which more or less means Skylake-K is likely going to be pushed into 2016. I am now just counting down until an article pops up stating just that.
 

SAAA

Senior member
May 14, 2014
541
126
116
I feared this before, but next year is looking like a yawnfest for the mainstream Intel desktop CPU platform. I didn't quite understand why Intel would launch Broadwell-K only to later introduce the superior Skylake-K. Well, it looks like my fears are coming true: Intel will gimp Skylake in 2015 and launch the slower Skylake-S, while Broadwell-K is what will replace the i7 4790K. If so, I don't foresee any worthwhile Skylake-K in 2015 at all. Perhaps Intel is shifting the performance aspect to the X99 platform, with Broadwell-E taking over from HW-E in 2015 as the performance offering.

http://www.fudzilla.com/home/item/35645-broadwell-to-be-faster-than-skylake-s-in-desktop

I guess Intel wants to recoup the R&D costs on Broadwell, which more or less means Skylake-K is likely going to be pushed into 2016. I am now just counting down until an article pops up stating just that.

So it seems in 2015 there will be an incredible three architectures out at the same time. What a mess.
Plus, I'm 99% certain that they will name the locked Skylake CPUs ix-5xxx and give the unlocked -K part a higher number, just to confuse consumers even more.

The sad thing is the improvement may be small, as you say: ~5% IPC and possibly more clockspeed, if they are really kind.
I hope it's not only 4.1-4.5 GHz stock... maybe OK if it overclocks as high as Sandy, though.
Can they reach that in a 65W TDP? Probably, but temperatures are really a problem now with the die almost halved...
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I don't understand the fuss. We know that Devil's Canyon's successor will be released in Q2. So unlocked Skylake will be released in 2016, no surprises here.
 
Mar 10, 2006
11,715
2,012
126
I feared this before, but next year is looking like a yawnfest for the mainstream Intel desktop CPU platform. I didn't quite understand why Intel would launch Broadwell-K only to later introduce the superior Skylake-K. Well, it looks like my fears are coming true: Intel will gimp Skylake in 2015 and launch the slower Skylake-S, while Broadwell-K is what will replace the i7 4790K. If so, I don't foresee any worthwhile Skylake-K in 2015 at all. Perhaps Intel is shifting the performance aspect to the X99 platform, with Broadwell-E taking over from HW-E in 2015 as the performance offering.

http://www.fudzilla.com/home/item/35645-broadwell-to-be-faster-than-skylake-s-in-desktop

I guess Intel wants to recoup the R&D costs on Broadwell, which more or less means Skylake-K is likely going to be pushed into 2016. I am now just counting down until an article pops up stating just that.

My bet is that Intel wants to push "enthusiasts" to X99. It works better for customers, since everybody has always been saying "get rid of the iGPU and give us more cores."

What I would like to see, though, is Intel filling out its HEDT lineup with smaller core variants, like an even cheaper quad-core.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
My bet is that Intel wants to push "enthusiasts" to X99. It works better for customers, since everybody has always been saying "get rid of the iGPU and give us more cores."

It would make a lot more sense to do that if the removed iGPU were replaced with CPU transistors/cores and the price remained basically the same. As it is, we are getting quad-channel memory and cache architectures that are more suited to server workloads than the desktop, and a considerable price increase along with them. If this is Intel's strategy, it will fail. It's not wrong to want -E-like parts, but we don't really want to be paying $1000 for 8 cores when we have had 6 cores for years and years already.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't understand the fuss. We know that Devil's Canyon's successor will be released in Q2. So unlocked Skylake will be released in 2016, no surprises here.

Umm... nope. It is only recently that we found out Intel would release the gimped S version and remain silent on the status of Skylake-K for 2015. If you look at the roadmap, Skylake-K should be a Summer/Fall 2015 product.

The i7 860 launched September 2009.
New architecture: 2600K launched January 2011.
New architecture: 4770K launched June 2013.

Intel vowed to release a new architecture every 2 years or so. That means if Skylake-K launches in 2016, it's 1 year late. And actually, it was always meant that Skylake-K would replace Broadwell-K in 2015, which aligns with the 2-year cadence. Intel said the delay of Broadwell wouldn't impact Skylake, which is now total BS, because if what they said were true, we would have seen Skylake-K between June 2015 and December 2015 at the latest!

Broadwell's delays clearly impacted Skylake-K; otherwise Intel wouldn't be launching Broadwell-K and holding back Skylake-K in 2015.
---

I wouldn't mind if Intel pushed enthusiasts to the X99 line and its subsequent replacements. For me, a $340 quad-core seems like a big rip-off now that Intel is selling an underclocked hexa-core for barely more. Right now the X99 platform isn't as attractive as it could be due to slightly higher DDR4 prices. However, if Intel can bring out the latest architecture 6-8 months after the mainstream platform in the form of a $390 6-core, the mainstream i7 is dead to me, because sooner or later DDR4 will reach pricing parity with DDR3.

I think Intel will move the mainstream i7 below $300 in a generation or two, OR they will have to keep this 500MHz+ clock delta between the i5s and i7s. Once DDR4 reaches pricing parity, it will cost just $120-140 more to go full hexa-core over the Z170 platform.
 
Last edited: