Nemesis 1 (Lifer):
That's the same as today's Trinity test. AMD was saying 50%+ improvements. It was true in one benchmark. I don't get your chart; I don't see any game benchmarks.
So you concede your previous point, then, that Intel's estimate wasn't off on HD 2000 vs. HD 4000? We know the Haswell iGPU is supposed to have 40 EUs, so based on what we have now it doesn't sound unreasonable.
That's the same as today's Trinity test. AMD was saying 50%+ improvements. It was true in one benchmark. I don't get your chart; I don't see any game benchmarks.
Yeah, but you never said anything about a 45W Trinity chip before pulling out that slide and claiming I hadn't read the article.

A 45W Trinity would be able to have higher CPU and iGPU frequencies. That would translate into higher performance. It would be able to switch to a higher-frequency Turbo mode, increasing performance even more.
The max frequencies of the iGPUs are already pretty well known, and all the 35W dual cores have a max boost of 1200MHz or 1250MHz. There is obviously a difference in L3, but how much performance you'd lose going from the 6MB 3720QM that was tested to a 35W 4MB 3520M is questionable.

A 35W IB Core i5 has less LLC (L3) and perhaps lower iGPU frequencies (no more than 100-150MHz lower) than a 45W Core i7. That translates into lower iGPU performance even though it has the same iGPU shader count.
As Inteluser2000 has shown, a 35W SB is 10-20% slower in games than a 45W SB.
We have to wait and see if the same happens with IB. I believe we will see the same behavior.
So to recap, a 35W Trinity could be 30-50% or more faster in games than a 35W IB on average.
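A back-of-the-envelope sketch of that estimate, assuming the 10-20% TDP penalty measured on SB carries over to IB. All numbers below are placeholders, not benchmark results:

```python
# Hypothetical sketch: scale a 45W IB result down by the 10-20% TDP
# penalty seen on SB, then compare a 35W Trinity against the estimate.

ib_45w_fps = 30.0        # assumed 45W IB Core i7 result in some game
tdp_penalty = 0.15       # midpoint of the 10-20% loss going 45W -> 35W
ib_35w_fps = ib_45w_fps * (1 - tdp_penalty)

trinity_35w_fps = 33.0   # assumed 35W Trinity result in the same game

advantage = trinity_35w_fps / ib_35w_fps - 1
print(f"Estimated 35W IB: {ib_35w_fps:.1f} fps")   # 25.5 fps
print(f"Trinity advantage: {advantage:.0%}")       # ~29%
# This is how a ~10% gap at 45W can widen to 30%+ at 35W.
```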
You do the same by bringing Haswell into a Trinity vs. IB discussion.
Not even close.
Where is the 35W Core i5 again? Not released yet?
Let's face it, AMD fucked up with Trinity. No big surprise, since they continued the Dulldozer flop: roughly -20% CPU, +20% GPU.
Piledover is another pile of crap. Those waiting for the next FX processor to magically perform wait in vain... again.
Trinity is pretty damn good, actually. Better GPU performance and better CPU performance, as well as better perf-per-watt, on the same node. That's not just pretty damn good, that's awesome. All of those improvements were architectural. Consider that Intel achieved only a 5-15% CPU increase and a big GPU jump along with a node shrink, and it still consumes more power than the previous gen; AMD's achievements look impressive by comparison.
The perf-per-watt is really the most impressive thing here. Remember, Bulldozer's biggest issues were price and power consumption. If mobile Trinity is clocking up to 3.2GHz with an IPC bump at a 35W TDP, it means the 100W+ Vishera chips shouldn't have any of the perf-per-watt ailments Bulldozer had.
What none of the reviews have bothered with is a direct IPC comparison between Llano and Trinity at equal clock speeds with Turbo Core off. For those looking for rough estimates of Vishera's expected performance, it would provide a clue. It doesn't look like AMD has managed to regain the ~10% they lost with Bulldozer's CMT design, but whatever was negatively impacting clock speeds, be it process maturity or the design side, seems to have been fixed.
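A minimal sketch of how that clock-normalized comparison would work (Turbo Core off, so the nominal clock is the actual clock). The scores and clocks below are made up; only the method is the point:

```python
# Hypothetical sketch: compare Llano vs Trinity IPC by normalizing a
# single-threaded benchmark score by clock speed.

def ipc_proxy(score: float, clock_ghz: float) -> float:
    """Benchmark score per GHz -- a stand-in for instructions per clock."""
    return score / clock_ghz

llano_score, llano_clock = 100.0, 2.9      # assumed single-thread score @ 2.9GHz
trinity_score, trinity_clock = 105.0, 2.9  # same benchmark, same clock

ratio = ipc_proxy(trinity_score, trinity_clock) / ipc_proxy(llano_score, llano_clock)
print(f"Trinity IPC relative to Llano: {ratio:.2f}x")
# At equal clocks the score ratio is the IPC ratio; dividing by clock only
# matters when the two chips can't be pinned to the same frequency.
```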
Olikan forgot to read
Let's face it, AMD fucked up with Trinity. No big surprise, since they continued the Dulldozer flop: roughly -20% CPU, +20% GPU.
Piledover is another pile of crap. Those waiting for the next FX processor to magically perform wait in vain... again.
It's almost too crazy to consider, but AMD actually has better power consumption than Intel has on a smaller node. That just looks wrong, but Intel seems to have screwed up somewhere as far as power gating is concerned. The IB mobile chips draw far too much power at idle and low loads. This really shouldn't be too surprising, as the IB desktop chips draw roughly the same power at idle as the SB chips do, and it's only under load that IB stretches its legs. So AMD provides great savings in power consumption; they just haven't been able to match Intel's perf-per-watt, and that's the problem (though it has improved dramatically), whereas Intel seems to have gone backwards.
It is pathetic to see a 22nm 45W Intel CPU lose to a 32nm 35W AMD CPU in games. But yes, AMD "fucked up" Trinity. Are you trolling or what?
It's almost too crazy to consider, but AMD actually has better power consumption than Intel has on a smaller node.
"the only laptops that can consistently beat Trinity are found in Sandy Bridge ultrabooks"
Those Trinity laptops may have some shenanigans going on, similar to what they did with Bobcat...
But then, we have the resonant clock mesh; google it...
In the end, we need some retail laptop reviews...
If Intel is claiming to achieve roughly 550 Ti levels of performance, then it's likely to match AMD's 7750-level estimates for their GCN-based Kaveri. I don't think either of those is out of the question, but I do think AMD is more likely to reach their goals, despite often inflating their claims and not living up to expectations. The HD 4000 is very, very good, but it's not as impressive on the desktop as it is in mobile. Where AMD alters the number of shaders and the clock speeds significantly throughout their lineup, Intel (roughly) maintains the same structure, so whatever you get in mobile you're essentially getting on the desktop. And while that looks more impressive as you go down in TDP, as you move up it looks less and less so (a rough sketch of that scaling follows below). We might see Intel jump above AMD in GPU performance on the mobile side but fall behind on the desktop with Kaveri/Haswell.
I hope they're both right, personally.
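The sketch mentioned above: peak shader throughput scales with shader count times clock, so a fixed Intel config looks best wherever AMD's clocks are lowest. The configs here are illustrative placeholders, not real part specs:

```python
# Illustrative sketch of why the same iGPU looks stronger in mobile than
# on the desktop. All configs below are assumptions for the argument.

def peak_gflops(shaders: int, clock_mhz: float) -> float:
    # assumes one fused multiply-add (2 ops) per shader per cycle
    return shaders * 2 * clock_mhz / 1000.0

# Intel: (roughly) the same config in mobile and desktop, per the post above
intel_mobile  = peak_gflops(shaders=128, clock_mhz=1150)
intel_desktop = peak_gflops(shaders=128, clock_mhz=1150)

# AMD: shader counts and clocks scale up significantly for desktop parts
amd_mobile  = peak_gflops(shaders=384, clock_mhz=500)
amd_desktop = peak_gflops(shaders=384, clock_mhz=800)

print(f"mobile:  Intel {intel_mobile:.0f} vs AMD {amd_mobile:.0f} GFLOPS")
print(f"desktop: Intel {intel_desktop:.0f} vs AMD {amd_desktop:.0f} GFLOPS")
# Intel's relative position is strongest in the low-clocked mobile parts
# and weakest on the desktop, where AMD's clocks rise.
```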
I thought it was an APU, because if it were just a CPU we both know the AMD part would be getting its ass handed to it. The only thing AMD does well is something they couldn't engineer themselves and had to buy (ATi).
He's not trolling. He's referencing the CPU part of Trinity, because we wish they would make a better CPU with more IPC, not less, than their previous architecture. Instead we get a mildly improved IGP that the enthusiasts who post on this forum don't give two shits about.
It's not that they effed up, really; it's just that they aren't capable of making CPUs as fast as Intel's.
Oh no, I know that RCM tech helped, but only on the order of 10% power savings, or clock speed bumps at equal TDP. In the case of Trinity, the savings have been applied to clock speeds rather than power savings. The power savings come from what we saw in Bulldozer, vastly improved: very, very good power gating. Bulldozer's idle power draw was already low, but the Piledriver cores and the Trinity design seem to have taken that to another level. Probably some process maturity as well, on account of GloFo fixing some issues.
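A sketch of the trade-off described above: ~10% power saved by the resonant clock mesh can be banked as lower power, or spent as extra clock speed at the same TDP. All figures here are illustrative assumptions:

```python
# Hypothetical sketch of the RCM power/frequency trade-off.

tdp_w = 35.0
rcm_savings = 0.10          # ~10%, per the post above

# Option 1: same clocks, lower power
effective_power = tdp_w * (1 - rcm_savings)
print(f"Same clocks: ~{effective_power:.1f} W instead of {tdp_w:.0f} W")

# Option 2: same TDP, higher clock. At fixed voltage, dynamic power scales
# roughly linearly with frequency, so ~10% saved power buys ~10% more clock.
base_clock_ghz = 2.9
boosted_clock = base_clock_ghz * (1 + rcm_savings)
print(f"Same {tdp_w:.0f} W TDP: ~{boosted_clock:.2f} GHz instead of {base_clock_ghz} GHz")
```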
Well, I don't see Intel doing any better; they should buy NVIDIA.
PS: I have to remind you that Intel threw a billion dollars in the garbage trying to do a GPU.
You said this is an APU, not a high-end desktop CPU. Trinity's iGPU is faster than Intel's HD 4000. The CPU is faster than last year's Llano CPU on the same manufacturing process. They have lower power consumption on the same 32nm process as Llano, and yet he's saying that AMD fucked up Trinity? Give me a break, mate. By his standards, then, Intel fucked up real hard on IB.
What a joke.
Did Intel get there on its own? Didn't they settle some sort of lawsuit with Nvidia where they paid a lot of money to NV but got access to a bunch of NV graphics IP? I seem to remember reading about that many months ago.

You don't see Intel doing any better? At least they are getting there on their own. AMD had to buy that because they couldn't do it themselves, so yes, I would say Intel has done better, and by Haswell they are going to be equal to ATi's IGP, and they are not even a GPU company like AMD is.
PS: I have to remind you that Intel threw a billion dollars in the garbage trying to do a GPU.
Trinity's iGPU loses to the HD 4000 in half the games... sad.
Did Intel get there on its own? Didn't they settle some sort of lawsuit with Nvidia where they paid a lot of money to NV but got access to a bunch of NV Graphics IP? I seem to remember reading about that many months ago.
Maybe I'm wrong or maybe Intel got help as well.
We've only known about Intel and Nvidia's cross-licensing since a few months back. It would be impossible for them to implement it in a matter of months. This stuff takes years of planning before it even hits the market.