Gaming can also refer to Angry Birds, though...
Yes, that's probably true. But even then I can see cases where the iGPU might sleep for a few milliseconds here and there, for example during A.I. turns in Civ 5.
No one disagrees with that, but you said "Now obviously Vmin is not quite the same as threshold voltage, but even then Vmin should still only come into play when idling."
Ok, then let me correct myself. DCC does not, as far as I can tell, come into play with any kind of workload that would be relevant to the kind of gaming I would assume posters on this forum are actually interested in (i.e. the kind that Anandtech uses in their benchmarks).
I wouldn't doubt that it would need to use DCC for the games that most AT Forum users are expecting.
But that's not necessarily what I am talking about.
-Warcraft III: Frozen Throne
-Logging on my Ivy Bridge Ultrabook (Dell XPS 12, Core i7, 8GB RAM, HD 4000)
-Tools used: HWINFO64 and GPU-Z
-Balanced Power Mode used on Battery
-1366x768 High settings
The min clock for the HD 4000 on the Core i7-3517U is 350MHz and the max is 1.15GHz. GPU-Z shows that it ramps up to 650MHz. HWINFO64 shows that it changes between 350MHz and 650MHz every few seconds or so.
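On Linux the same kind of clock trace can be captured without HWINFO64 or GPU-Z by polling the i915 driver's sysfs node. A minimal sketch, assuming a single Intel GPU at card0 (the path and the polling interval are illustrative, not universal):

```python
import time

# The i915 driver exposes the current GPU clock in sysfs; this path
# assumes the Intel GPU is card0 (it may differ on multi-GPU systems).
FREQ_PATH = "/sys/class/drm/card0/gt_cur_freq_mhz"

def read_mhz(path=FREQ_PATH):
    """Read the current GPU clock in MHz from a sysfs-style file."""
    with open(path) as f:
        return int(f.read().strip())

def sample(seconds=10, interval=1.0, path=FREQ_PATH):
    """Poll the clock once per interval, like watching HWINFO64's log."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        samples.append(read_mhz(path))
        time.sleep(interval)
    return samples
```

Logging min/max over a gaming session with something like this would show the same 350-650MHz ping-pong described above.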
The HD 4400 is faster than this, and Broadwell graphics will be faster still. If the HD 4000 only needs to run between 350-650MHz, I doubt Broadwell needs anywhere near that, and I doubt the GT3 version (specifically the U-series variant with the same TDP as the GT2) needs anything more than the 350MHz base clock. Nvidia is using frame-rate capping technology to improve battery life in 3D gaming, and Intel has published research papers doing the same. They aren't going to mindlessly aim for frame rates that are unrealistic (100+) on battery-constrained devices like tablets and laptops.
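The frame-rate capping idea is simple to sketch: finish the frame's work, then sleep off the rest of the frame budget so the GPU can fall to a lower clock/power state instead of racing ahead. A minimal, hypothetical loop (TARGET_FPS and render_frame are placeholders, not any vendor's actual API):

```python
import time

TARGET_FPS = 30               # cap well below the panel refresh to save power
FRAME_TIME = 1.0 / TARGET_FPS # per-frame time budget in seconds

def render_frame():
    # placeholder for the actual game/render work
    pass

def run(frames):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep off the remainder of the frame budget; the idle time is
        # what lets the GPU drop to a lower power state between frames.
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)
```

The real implementations (driver-level caps like Nvidia's) work below the application, but the power-saving mechanism is the same: enforced idle time per frame.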
(Anecdotally, I remember playing League of Legends a couple of years ago on a fully charged battery just for fun. While Anand's tests show the Ivy Bridge Ultrabook lasting only 107 minutes in 3DMark06, I played the game for 2.5+ hours (150+ minutes) before running out of battery. League of Legends is much more demanding than WC3, but noticeably less demanding than running 3DMark06. But it's a popular enough game to be relevant to the discussion.)
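The runtime gap translates directly into average power draw: same battery, so average power scales with the inverse of runtime. A back-of-the-envelope check (the 47Wh capacity is an assumption for illustration, not a measured spec):

```python
BATTERY_WH = 47.0   # assumed battery capacity for illustration
mins_3dmark = 107   # Anand's 3DMark06 runtime
mins_lol = 150      # observed League of Legends runtime (lower bound)

# average system draw in watts = capacity / runtime in hours
p_3dmark = BATTERY_WH / (mins_3dmark / 60)
p_lol = BATTERY_WH / (mins_lol / 60)
print(round(p_3dmark, 1), round(p_lol, 1))  # ~26.4 W vs ~18.8 W
```

Whatever the actual capacity, the ratio holds: League of Legends was drawing only about 71% (107/150) of the 3DMark06 power, consistent with the GPU spending time at lower clocks.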
Broadwell doesn't just target PC applications, but Tablet ones. The 3D requirements of Tablets are far less than even the "mainstream" PC titles. Furthermore, Intel even wants Core powered Android devices.
The simple fact is that we just don't know where Fmax@Vmin falls on the frequency scale (although we can probably agree that it's not equal to the max clock).
If we assume that Fmax@Vmin is equal to the base clock (i.e. 350 MHz in your case), then it is obvious that WC3 requires an average level of performance that is above this level, otherwise it wouldn't be jumping up to 650 MHz half the time.
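That "jumping up to 650 MHz half the time" claim can be made concrete with a time-weighted average: if the GPU spends a fraction p of its time at 650 MHz and the rest at 350 MHz, the effective clock is p·650 + (1−p)·350, and anything above 350 MHz means the average performance demand exceeds the assumed Fmax@Vmin. A quick sketch of the arithmetic:

```python
def effective_mhz(high_frac, f_low=350, f_high=650):
    """Time-weighted average clock; high_frac is the share of time
    spent at the boosted clock."""
    return high_frac * f_high + (1 - high_frac) * f_low

# An even 50/50 split between the two clocks averages out to 500 MHz,
# well above the assumed 350 MHz Fmax@Vmin.
print(effective_mhz(0.5))  # 500.0
```

So even a modest duty cycle at the boosted clock implies the workload's average demand sits above the base clock.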
Now obviously the performance at Fmax@Vmin goes up with an increase in functional units (i.e. Broadwell), but one would hope that by the time Broadwell comes out, 1366x768 will be largely extinct and you'd be looking at 1080p at a minimum, so the load would also go up. Plus, as time goes by, more and more people will probably move on from 12-year-old games like WC3 (to something like League of Legends, for instance), which would also increase performance demands.
If the tablet in question has a Broadwell chip in it, then it will almost certainly be running windows, and as such the stuff you'll be running on it is in fact precisely PC applications.
Either way tablets don't have 3D performance requirements, software does, and yes you could run Angry Birds on your Broadwell-equipped tablet if you felt like it and probably get some extra battery life from DCC, but again I kinda doubt most people on this forum (or in general for that matter) buy a Broadwell tablet to play Angry Birds.
Also, if Broadwell goes into an Android tablet, it would probably be an SKU with only 8 EUs, or half of what your 3517U has, and even with an increase in IPC/frequency it would probably still be a fair bit slower than the HD 4000.
768p is going to be here for a long time with budget notebooks.
Cherry Trail has 16 EUs, but low-end SKUs may have 8 disabled.
But would those budget notebooks have Broadwell and not Cherry Trail (or an AMD chip for that matter) though? Obviously there will be some models with Broadwell and 768p, but I would hope (perhaps naively) that the vast majority would be above 768p.
Also, as far as I can tell, Intel's notebook aspirations for Broadwell seem to be mainly 2-in-1 convertibles, which generally belong to a segment above budget and thus would hopefully have something above 768p.
I think the vast majority will have 768p, especially budget models (>$400).
For haswell, the vast majority of budget notebooks are using ULV CPUs and are 768p around $450-700. You start hitting 1080p at around $750.
Android applications are far less demanding in 3D because of the devices the OS matured on. You do have a few companies trying to run Android on Core already.
Ideally you should have tablet-level power when running tablet-level applications and high-end system power when running very demanding applications.
"(Anecdotally, I remember playing League of Legends a couple of years ago on a fully charged battery just for fun. While Anand's tests show the Ivy Bridge Ultrabook lasting only 107 minutes in 3DMark06, I played the game for 2.5+ hours (150+ minutes) before running out of battery.)"
You were not lying, were you?
I feared this before, but next year is looking like a yawnfest for the mainstream Intel desktop CPU platform. I didn't quite understand why Intel would launch Broadwell-K only to later introduce the superior Skylake-K. Well, it looks like my fears are coming true: Intel will gimp Skylake in 2015 and launch the slower Skylake-S, while Broadwell-K is what will replace the i7-4790K. If so, I don't foresee any worthwhile Skylake-K in 2015 at all. Perhaps Intel is shifting the performance aspect to the X99 platform, with Broadwell-E taking over from HW-E in 2015 as the performance offering.
http://www.fudzilla.com/home/item/35645-broadwell-to-be-faster-than-skylake-s-in-desktop
I guess Intel wants to recoup the R&D costs on Broadwell which more or less means Skylake-K is likely going to be pushed into 2016. I am now just counting until an article pops up stating just that.
My bet is that Intel wants to push "enthusiasts" to X99. It works better for customers, since everybody has always been saying "get rid of the iGPU and give us more cores."
I don't understand the fuss. We know that Devil's Canyon's successor will be released in Q2. So unlocked Skylake will be released in 2016, no surprises here.