Trinity review


Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Your argument is bad.

5% faster and 25% more efficient (and keeping the die size roughly the same size) on the same process node is a very big deal

If you wanted to make a good argument you would admit all these factors are true (which they are, by the way) but then argue that the performance is still too low.

-----

5% faster and 25% more efficient due to a die shrink is not a technical marvel; it is par for the course. If you wanted to have a good argument you should argue that Intel didn't need to make Sandy Bridge much faster when they moved to Ivy Bridge; instead they were focused on learning the process tech, yields, and keeping the die size small. It is with Haswell that you will see the big performance increase.

What you don't seem to realize is there's no such thing as a dumb shrink anymore and hasn't been for several generations. A shrink requires a new chip design that follows the new process rules. That's where the power and performance increases come from, not really the shrink itself.
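
As an aside on the arithmetic behind the "5% faster and 25% more efficient" claim, here is a minimal sketch of what it implies for performance per watt, assuming "25% more efficient" means 25% lower power for the same work (the post does not define the term, so that reading is an assumption):

```python
# Rough check of the claim above, under the assumption that "25% more
# efficient" means 25% lower power at equal work. That reading is an
# assumption, not something the post specifies.

perf = 1.05   # 5% faster
power = 0.75  # 25% lower power

perf_per_watt_gain = perf / power - 1
print(f"Implied perf/W improvement: ~{perf_per_watt_gain * 100:.0f}%")  # ~40%
```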
 

wrbt

Member
Oct 9, 2009
48
0
0
The new AnandTech Diablo III benches definitely have me leaning even more towards Trinity, as it appears to deliver a serious whuppin' on the HD 4000, in some cases almost putting the two into the playable vs. unplayable category. If at a similar price point the two APUs are somewhat comparable in some games but Trinity is far superior in others, the decision is easy. Of course price points remain to be seen, so this is still somewhat speculative, and both are great products for many laptop buyers' usage profiles.



Those mid-tier games are where both APUs will really shine IMO, for the people who want better than Plants vs. Zombies but don't need Metro 2033 running on high detail. The two games I've spent the most hours in this year are probably Orcs Must Die and Defense Grid. Last year it was probably Borderlands, which I'm sure these APUs could handle, but I don't know much about the upcoming Borderlands 2 reqs/specs.

I look forward to more benchmarks when both lines are available in actually shipping products.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What you don't seem to realize is there's no such thing as a dumb shrink anymore and hasn't been for several generations. A shrink requires a new chip design that follows the new process rules. That's where the power and performance increases come from, not really the shrink itself.

90% of Ivy Bridge's lower power consumption (over Sandy Bridge) comes from the shrunk 22nm Tri-Gate process rather than from the design of the chip.



 
Last edited:

lau808

Senior member
Jun 25, 2011
217
0
71
If discrete graphics were always the superior option, handhelds simply wouldn't exist. Form factor, battery life and price are all things that iGPUs excel at.

I'm mostly playing roguelikes and need a laptop that's not too cumbersome to lug around. If Trinity laptops are cheaper, provide around the same battery life and perform much better when I decide to play Tribes, Guild Wars 2 when it comes out, or anything else that's graphically more intensive, you know which one I'm going to choose.

ok thank u.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
What are they using for the load measurement? If it's something where the i5 scores better, then that makes the number meaningless.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
90% of Ivy Bridge's lower power consumption (over Sandy Bridge) comes from the shrunk 22nm Tri-Gate process rather than from the design of the chip.




Where do your slides state that?

How much of it is from the shrink and how much is from the new transistor design?

Read my post again.
 

Abwx

Lifer
Apr 2, 2011
11,804
4,726
136
What are they using for the load measurement? If it's something where the i5 scores better, then that makes the number meaningless.

They run 3DMark11 to check consumption, but it's true that the configurations are not the same at all.
 

Abwx

Lifer
Apr 2, 2011
11,804
4,726
136
Where do your slides state that?

How much of it is from the shrink and how much is from the new transistor design?

Read my post again.

Look at the curves at 1V; we can see that at this voltage a simple shrink of the planar topology (grey curve) yields almost as much of a performance increase as a shrink plus the new transistor design.

At lower voltages the FinFETs provide 20% better performance than a 22nm planar (non-SOI) process.

Indeed, a simple shrink would have yielded about 20% better power consumption at normal speeds with no difficulty, but in this case not only is the node shrunk, the CPU itself is vastly enlarged, so the better performance/watt ratio that would have been available to reduce TDP is instead used to increase the CPU and GPU processing power.
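
To make that trade-off concrete, here is a minimal sketch of how a fixed process-level power saving can either be banked as a lower TDP for the same chip or spent on extra CPU/GPU logic at the same TDP. The ~20% figure is the post's; the 95W baseline TDP is purely an illustrative assumption, not an Ivy Bridge specification.

```python
# Illustration of the trade-off described above: a node that gives ~20% lower
# power at the same speed (the post's figure) can instead be used to power a
# bigger CPU/GPU at the same TDP. The 95W baseline is an assumed example value.

baseline_tdp_w = 95.0   # hypothetical previous-generation part
power_saving = 0.20     # ~20% lower power at the same speed, per the post

# Option A: keep the same amount of logic and bank the saving as lower TDP.
tdp_same_logic = baseline_tdp_w * (1 - power_saving)

# Option B: keep the same TDP and use the headroom for more CPU/GPU units.
extra_logic_budget = 1 / (1 - power_saving) - 1

print(f"Same chip: TDP drops to ~{tdp_same_logic:.0f} W")                      # ~76 W
print(f"Same TDP:  ~{extra_logic_budget * 100:.0f}% more logic power budget")  # ~25%
```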
 
Last edited:
Aug 11, 2008
10,451
642
126
The new AnandTech Diablo III benches definitely have me leaning even more towards Trinity, as it appears to deliver a serious whuppin' on the HD 4000, in some cases almost putting the two into the playable vs. unplayable category. If at a similar price point the two APUs are somewhat comparable in some games but Trinity is far superior in others, the decision is easy. Of course price points remain to be seen, so this is still somewhat speculative, and both are great products for many laptop buyers' usage profiles.



Those mid-tier games are where both APUs will really shine IMO, for the people who want better than Plants vs. Zombies but don't need Metro 2033 running on high detail. The two games I've spent the most hours in this year are probably Orcs Must Die and Defense Grid. Last year it was probably Borderlands, which I'm sure these APUs could handle, but I don't know much about the upcoming Borderlands 2 reqs/specs.

I look forward to more benchmarks when both lines are available in actually shipping products.

This game does make the HD 4000 look weak. The problem is that even Trinity is barely playable, and D3 is not a graphically intense game. A several-year-old GT 540M with a dual-core CPU slightly beats it, and if you read the review, the GT 540M system was in a 13-inch chassis and was probably not performing optimally due to thermal constraints. So I still am not convinced Trinity is a better choice than an Intel CPU with a mid-range discrete card. It might be OK if the price is right. We also need to see how much cheaper the Trinity A8 is and how it performs. At least for Llano the top-line product seemed overpriced relative to the A6.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
This part really nails it:
http://www.anandtech.com/show/5865/laptop-graphics-face-off-diablo-iii-performance/4
Wrapping up, while Diablo III isn’t the most demanding new release, it can still bring basic laptops to their knees. Unfortunately, unlike desktops it’s often not possible (or at least not practical) to upgrade a laptop’s graphics capabilities. I’ve had a couple friends ask for help with running Diablo III on their old Core 2 Duo laptops, and they’re basically out of luck unless they want to purchase a new system. That’s something we’ve tried to explain in our laptop reviews, and Diablo III drives the point home: buying at the bottom of the barrel in terms of GPU capabilities may not matter for you right now, but kids and/or future applications may eventually make your IGP-only laptop insufficient.
 
Aug 11, 2008
10,451
642
126
Yeap, so don't buy any laptop with only Intel HD 4000. Instead, get a Trinity, or if you have more money to spend, get a Trinity or Intel with a discrete mobile graphics card :biggrin:

Amazing, this is one of the times that I have to agree with you.
 

Schmide

Diamond Member
Mar 7, 2002
5,690
926
126
This game does make the HD 4000 look weak. The problem is that even Trinity is barely playable, and D3 is not a graphically intense game. A several-year-old GT 540M with a dual-core CPU slightly beats it, and if you read the review, the GT 540M system was in a 13-inch chassis and was probably not performing optimally due to thermal constraints. So I still am not convinced Trinity is a better choice than an Intel CPU with a mid-range discrete card. It might be OK if the price is right. We also need to see how much cheaper the Trinity A8 is and how it performs. At least for Llano the top-line product seemed overpriced relative to the A6.

Dude? The GT 540M was released in Jan '11? You can't call a year and 4 months "several years". If you look at the graphs, that's at 1600x900, which is eye candy for laptops. Then you blame thermal constraints in a lab setting. I call shenanigans.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Where do your slides state that?

How much of it is from the shrink and how much is from the new transistor design?

Read my post again.

I will give you an example of what a node process can do.

NVIDIA GTX 280: 65nm, 236W TDP
NVIDIA GTX 285: 55nm, 204W TDP

The GTX 285 was a direct shrink to TSMC's 55nm half-node process. That is, they simply shrank the same chip to 55nm without redesigning it. You got higher performance and at the same time lower power consumption due to the shrink.

At 1V operating voltage, the Tri-Gate design only gives 2-5% more performance over planar CMOS. It is at lower operating voltages that Intel's 22nm Tri-Gate process has the bigger impact on performance per watt.

So the majority of IB's lower power consumption comes from Intel's 22nm node process and not from the design of the microarch.
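
As a quick sanity check on the GTX 280 to GTX 285 numbers above, here is a minimal sketch of the TDP arithmetic. The wattages are the post's figures; note the GTX 285 also shipped with higher clocks, so this mixes shrink and clock effects rather than isolating the node itself.

```python
# Back-of-the-envelope check of the TDP delta quoted in the post above.
# TDP figures are the post's; the GTX 285 also had a clock bump, so this is
# not a pure measure of what the 55nm half-node shrink alone bought.

gtx280_tdp_w = 236.0  # 65nm
gtx285_tdp_w = 204.0  # 55nm

reduction = (gtx280_tdp_w - gtx285_tdp_w) / gtx280_tdp_w
print(f"TDP reduction: ~{reduction * 100:.1f}%")  # prints ~13.6%
```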
 
Aug 11, 2008
10,451
642
126
Dude? The GT 540M was released in Jan '11? You can't call a year and 4 months "several years". If you look at the graphs, that's at 1600x900, which is eye candy for laptops. Then you blame thermal constraints in a lab setting. I call shenanigans.

I don't know what you are implying by calling "shenanigans". I thought the GT 540M had been out longer; if I am incorrect, I apologize. However, in the current context it is a lower mid-range card that still beats either APU. If you read the article completely, including the posts at the end, the reviewer agreed with my comments that the Acer was probably throttling. Actually, the GT 630M, which is a re-badged GT 540M with LOWER clocks, performed considerably better. So I will stand by my position that the A10 is not as fast as the GT 540M.

I don't know what kind of agenda you are implying that I have, but I was only expressing my opinion. If I were to buy a laptop, I would either buy an ultraportable (where Trinity or HD 4000 is sufficient) and not expect to game on it, or buy something with a discrete card. If Trinity is sufficient for the games you play, and you think it will be adequate for the games coming out in the next couple of years after the new consoles arrive, good for you.
 

Schmide

Diamond Member
Mar 7, 2002
5,690
926
126
I don't know what you are implying by calling "shenanigans". I thought the GT 540M had been out longer; if I am incorrect, I apologize. However, in the current context it is a lower mid-range card that still beats either APU. If you read the article completely, including the posts at the end, the reviewer agreed with my comments that the Acer was probably throttling. Actually, the GT 630M, which is a re-badged GT 540M with LOWER clocks, performed considerably better. So I will stand by my position that the A10 is not as fast as the GT 540M.

Did he agree?

RE: Question about GT630M by JarredWalton on Sunday, May 27, 2012
And funny enough, after additional investigation, the issue isn't throttling on the Acer but rather a higher clock on the GT 630M compared to the GT 540M. NVIDIA's updated specs page for the 630M lists 800MHz as the clock, but oddly their control panel is only reporting 475MHz on the ASUS. According to GPU-Z's Sensors tab, however, it really is running an ~800MHz core clock (1600MHz shaders), which accounts for the higher performance compared to the 672MHz GT 540M. I've updated the text in the article to explain this.

I don't know what kind of agenda you are implying that I have, but I was only expressing my opinion. If I were to buy a laptop, I would either buy an ultraportable (where Trinity or HD 4000 is sufficient) and not expect to game on it, or buy something with a discrete card. If Trinity is sufficient for the games you play, and you think it will be adequate for the games coming out in the next couple of years after the new consoles arrive, good for you.

I'm implying that you are misrepresenting things, whether intentionally or not. I can let one thing go, but you had multiple things out of order in that paragraph. Thus the shenanigans.
 
Aug 11, 2008
10,451
642
126
Did he agree?





I'm implying that you are misrepresenting things, whether intentionally or not. I can let one thing go, but you had multiple things out of order in that paragraph. Thus the shenanigans.

OK. I stand corrected about the clocks of the two cards. Initially Jarred basically agreed with me that heat issues were probably causing the poor performance of the 540M. I had not revisited the article since his updated post today, which said the clocks on the 630M had initially been reported incorrectly.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I'm puzzled how the conclusion to the Diablo 3 tests could be so mild considering the actual results. The Trinity A10 is up there with the entry-level gaming discrete solutions, while the HD 4000 really has a hard time with the fastest-selling game of all time.

Just like with almost anything AMD puts out, this will hinge on pricing. A10 solutions will obviously have to be ~$100 less than IB + GT 630M (or the AMD equivalent with a dedicated GPU) for them to be a good buy. Or perhaps the gap could be offset by better features like a nicer screen and a slimmer, lighter chassis.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I will give you an example of what a node process can do.

NVIDIA GTX 280: 65nm, 236W TDP
NVIDIA GTX 285: 55nm, 204W TDP

The GTX 285 was a direct shrink to TSMC's 55nm half-node process. That is, they simply shrank the same chip to 55nm without redesigning it. You got higher performance and at the same time lower power consumption due to the shrink.

At 1V operating voltage, the Tri-Gate design only gives 2-5% more performance over planar CMOS. It is at lower operating voltages that Intel's 22nm Tri-Gate process has the bigger impact on performance per watt.

So the majority of IB's lower power consumption comes from Intel's 22nm node process and not from the design of the microarch.

That's false.

You cannot use the same layout just "scaled" down... every die shrink comes with a redesign; it might not be a major redesign. It even says so in the review of the GTX 285, i.e. that some redesign was involved:

http://www.anandtech.com/show/2711/9

Power isn't going to be straightforward here, as this is both a die shrink and an overclock. If all other things were equal, the die shrink would have enabled some power savings, but increasing the clock speeds (and likely voltages) means that we have factors at work that will push against each other. As for which will win, let's take a look at the data and find out.
Since we didn't take a look at power in our GeForce GTX 295 article, we'll keep an eye on that card as well. Also, keep in mind that there have been 55nm GTX 260s being slowly phased in but that our GTX 260 parts are 65nm. The 55nm GTX 260s will show a power advantage over similarly clocked 65nm GTX 260s.
Idle power shows that NVIDIA is able to get some power savings when nothing is going on with the GPU. Power draw at idle decreased by about 10W with the move to 55nm which shows that in addition to their power saving features the die shrink does help. This advantage carries over to SLI as well with the GTX 285 SLI landing between the two single card dual-GPU systems.
The GeForce GTX 295 slides in just above the single GPU 4870 1GB while AMD's 4870 X2 consumes about 10W more than NVIDIA's higher performing dual-GPU card.
We see a different story when we look at load power. In spite of the die shrink, the added overclock pushes the GeForce GTX 285 higher under load than any other single GPU part. When SLI is enabled this becomes the most power hungry dual card setup we tested.
As for the GeForce GTX 295, we once again see good performance with lower power draw than the Radeon HD 4870 X2 and, in fact, less power draw than all the other dual-GPU setups we tested.
While a half node die shrink isn't the holy grail of power savings, the major advantage for NVIDIA comes from the die size decrease. We don't have measurements on the GPU after the shrink (we don't want to tear apart our hardware until we've tested things like 3-way SLI), but with the massive size of GT200 and the heavy price cuts NVIDIA was forced to make shortly after launch, the cost savings is a very important factor in this move.
NVIDIA needs to keep its price competitive and that means it needs to keep its costs down. Building an overclocked GTX 280 helps raise the price while building the parts at 55nm helps lower the cost. NVIDIA wants this card to be successful.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I'm puzzled how the conclusion to the Diablo 3 tests could be so mild considering the actual results. The Trinity A10 is up there with the entry-level gaming discrete solutions, while the HD 4000 really has a hard time with the fastest-selling game of all time.

Just like with almost anything AMD puts out, this will hinge on pricing. A10 solutions will obviously have to be ~$100 less than IB + GT 630M (or the AMD equivalent with a dedicated GPU) for them to be a good buy. Or perhaps the gap could be offset by better features like a nicer screen and a slimmer, lighter chassis.

D3 is not a very normal game; it has very low requirements... so that game isn't pushing the hardware as much as other games...
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Your statement is making me scratch my head: the HD 4000 has a lot more trouble than the A10, yet we should ignore it because the game has low minimum requirements?

D3 is not a very normal game; it has very low requirements... so that game isn't pushing the hardware as much as other games...
 

georgec84

Senior member
May 9, 2011
234
0
71
Your statement is making me scratch my head: the HD 4000 has a lot more trouble than the A10, yet we should ignore it because the game has low minimum requirements?

People will say anything to put AMD down and prop Intel up.

CPU: Intel > AMD
iGPU: AMD > Intel

End thread.
 
Aug 11, 2008
10,451
642
126
I'm puzzled how the conclusion to the Diablo 3 tests could be so mild considering the actual results. The Trinity A10 is up there with the entry-level gaming discrete solutions, while the HD 4000 really has a hard time with the fastest-selling game of all time.

Just like with almost anything AMD puts out, this will hinge on pricing. A10 solutions will obviously have to be ~$100 less than IB + GT 630M (or the AMD equivalent with a dedicated GPU) for them to be a good buy. Or perhaps the gap could be offset by better features like a nicer screen and a slimmer, lighter chassis.

Microcenter already has an Asus with an Ivy Bridge quad core and a GT 630M for $799. To me that is a really good price. It is a matter of personal opinion, but to me the A10 would have to be in the mid-$600 range to be competitive, especially since a hundred dollars or two spread over the 3 or 4 years I would want to keep the computer is not really that much. So we will see. If the initial Trinity laptops are in the $700 range like Jarred said in the article, they will have to offer some nice extra features or come down from that price.
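
For what it's worth, here is a minimal sketch of the amortization being weighed above. The $100-$200 gap and the 3-4 year ownership window come from the post; the monthly breakdown is just illustration.

```python
# Amortizing the price gap discussed above over the ownership period.
# The $100-$200 gap and 3-4 year window come from the post.

for price_gap in (100, 200):
    for years in (3, 4):
        per_month = price_gap / (years * 12)
        print(f"${price_gap} over {years} years ~= ${per_month:.2f}/month")
```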
 