Trinity review


Kevmanw430

Senior member
Mar 11, 2011
LOL at people praising AMD's driver team when you have threads here with people complaining about issues with the HD 7000 series.

You do realize in the Anandtech review they did not mention having any issues with the drivers, right?

And you do realize the GMA 950 is a 6-year-old IGP, right? If we made arguments here based on issues with 6-year-old products, everyone would be complaining about the horrible driver support for the Radeon HD 2000 and 3000 series and concluding that AMD has horrible driver support.

Really? I have three AMD GPUs and I've never had a driver problem with any of them. Sure, the 7 series has some problems. It's new. It will be sorted out.

You do realize that while they don't mention any issues with the drivers, they also don't mention which ones they used?

Do you realize his point was just that he hasn't had any direct experience with these drivers in a while, and that he's basing his opinion not on the support back then but on what has been said about recent drivers?
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
I even said that no one will be putting a 35W CPU into ultrabooks, but you went and strawmanned anyway. I was pointing out that the comparison to the i5-2410M wasn't apples to apples. That said, you did link to the normalized chart in a later post, I guess.

If you weren't meaning to use it as an argument you wouldn't have posted it, unless you were being sarcastic or had some other reason.

The i5-2410M is a 35W product, as is the A10-4600M. They both deliver the same relative (normalized) battery life, which also means they consume a very similar amount of power.

If you look at the Zenbook review the relative battery life going from the 17W Sandy Bridge to the 17W Ivy Bridge improved by... nothing. They both have the same run time and battery capacity, but Ivy Bridge is faster so at least efficiency did improve somewhat.
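The "relative (normalized) battery life" metric being argued here is just runtime divided by battery capacity, so laptops with different battery sizes become comparable. A minimal sketch with made-up numbers (not the review's measured data):

```python
# Hypothetical figures for illustration only, not measured review data.
def normalized_battery_life(runtime_min, capacity_wh):
    """Normalized battery life: minutes of runtime per Wh of battery capacity."""
    return runtime_min / capacity_wh

# Two laptops with different battery sizes become directly comparable:
a = normalized_battery_life(300, 48.0)  # 300 min on a 48 Wh battery
b = normalized_battery_life(375, 60.0)  # 375 min on a 60 Wh battery
print(a, b)  # both come out to 6.25 min/Wh: same platform efficiency
```

That's why two chips with the same normalized result are drawing roughly similar power, regardless of how big a battery the OEM fitted.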
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Really? I have three AMD GPUs and I've never had a driver problem with any of them. Sure, the 7 series has some problems. It's new. It will be sorted out.

You do realize that while they don't mention any issues with the drivers, they also don't mention which ones they used?

Do you realize his point was just that he hasn't had any direct experience with these drivers in a while, and that he's basing his opinion not on the support back then but on what has been said about recent drivers?

I haven't had any problems with the drivers on my laptop, which has an HD 3000, but then again, I'm part of the 90%+ that doesn't game on their laptops.

The Anandtech review didn't mention having any issues with the drivers from what I could see, and it's funny how people say the HD 7000 series is new and therefore the small issues with its drivers will be fixed, yet I've seen no one mention any issues with the HD 4000's drivers even though it's even newer.
 

pelov

Diamond Member
Dec 6, 2011
AMD's drivers are definitely better than Intel's, LOLWTFBBQAXEL. I don't think anyone in their right mind would claim otherwise. Where AMD lacks today is CrossFire, and they still haven't picked up the pace: nVidia still has way better SLI support and provides it in a far more timely manner, even though AMD has better CrossFire scaling. Who cares about CrossFire scaling if it's late? As far as single-GPU issues go, nVidia and AMD tend to be pretty even. Although AMD took months to officially support the 79xx series, you'll note that nVidia also took months before they officially supported the 560 Ti, which double-sucked because of the infamous black-flicker bug that GPU suffered from. AMD had a bug where the 58xx cards were underclocked in multi-monitor setups, causing flickering that required a BIOS update (I own a 5870 and had to manually tweak an .ini to fix it); nVidia had the TDR bug, where GPU-accelerated apps like web browsers would crash your drivers and require a reboot.

Intel has a far more spotty history, though. Their image quality still lags and their tessellation performance also sucks. Compare Intel's latest driver notes that I posted up there to nVidia's and AMD's below.

http://support.amd.com/us/kbarticles/Pages/AMDCatalystSoftwareSuiteVersion124ReleaseNotes.aspx - AA improvements, image quality, texture fill rates, as well as fixes for game bugs.

http://us.download.nvidia.com/Windows/301.42/301.42-win7-winvista-desktop-release-notes.pdf - Much of the same, along with SLI performance fixes, etc.

Both companies also release drivers far more frequently, and you can download CrossFire/SLI profiles before they make it into the official drivers. Intel doesn't offer dual graphics, but even if they did it would suck. The biggest issue AMD has with mobile drivers is that they're controlled by the OEMs; or, more specifically, they were, and we're not sure about Trinity. This meant that people with Llanos had to rely on Lenovo/HP/Asus, etc. for their drivers and hope those companies weren't lazy about uploading AMD's official drivers to their sites. If things change with Trinity, you can expect better and more frequent drivers, and thus better performance than we had with Llano. If drivers are still a locked-in deal, then AMD and their partners screwed the pooch yet again. Even with locked-in drivers, though, AMD still has way better drivers than Intel provides.

edit - actually, you could still get the updated Llano graphics drivers "unofficially", but it required some digging.
 

Kevmanw430

Senior member
Mar 11, 2011
I haven't had any problems with the drivers on my laptop, which has an HD 3000, but then again, I'm part of the 90%+ that doesn't game on their laptops.

The Anandtech review didn't mention having any issues with the drivers from what I could see, and it's funny how people say the HD 7000 series is new and therefore the small issues with its drivers will be fixed, yet I've seen no one mention any issues with the HD 4000's drivers even though it's even newer.

I also haven't seen many people who have a laptop with an HD 4000 yet. And even then, people who game on their laptops don't buy laptops with just Intel IGPs, so you're probably not going to see many people complain. But those that do will have legitimate things to complain about, like crappy image quality, something absent on AMD solutions.

No, you are correct, they did not mention any driver problems. But they didn't mention driver problems with the AMD system either, so your argument is invalid.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Obviously, a 17W chip would get better relative battery life, because it has a whole 18W less TDP to use.

Interesting that while the A10 had more than 100% more TDP available than the ULV IVB chip, its relative battery life was nowhere near 100% worse.

I know that is not how TDP is quantified, but you get my point.

TDP doesn't mean much when it comes to actual power consumption. Both the 2500K/2600K are 95W TDP yet they consume significantly less power than the 95W FX-6100. Rather, it's a number that is used throughout a whole line of products to indicate the minimum amount of power the OEM cooling solution needs to be able to dissipate.

The A6-4455M won't consume half the power of the A10-4600M, either.
 

Kevmanw430

Senior member
Mar 11, 2011
Both companies also release drivers far more frequently, and you can download CrossFire/SLI profiles before they make it into the official drivers. Intel doesn't offer dual graphics, but even if they did it would suck. The biggest issue AMD has with mobile drivers is that they're controlled by the OEMs; or, more specifically, they were, and we're not sure about Trinity. This meant that people with Llanos had to rely on Lenovo/HP/Asus, etc. for their drivers and hope those companies weren't lazy about uploading AMD's official drivers to their sites. If things change with Trinity, you can expect better and more frequent drivers, and thus better performance than we had with Llano. If drivers are still a locked-in deal, then AMD and their partners screwed the pooch yet again. Even with locked-in drivers, though, AMD still has way better drivers than Intel provides.

Just wanted to point out that laptops with only the AMD IGP could install the normal drivers. Also, most Dual Graphics laptops can install the normal drivers once the latest drivers from the manufacturer are installed.

I say this from experience with a DV6z I had w/A8-3810MX and a 6750M. Thought I'd add that so LOL couldn't use my post as ammunition.
 

Kevmanw430

Senior member
Mar 11, 2011
TDP doesn't mean much when it comes to actual power consumption. Both the 2500K/2600K are 95W TDP yet they consume significantly less power than the 95W FX-6100. Rather, it's a number that is used throughout a whole line of products to indicate the minimum amount of power the OEM cooling solution needs to be able to dissipate.

The A6-4455M won't consume half the power of the A10-4600M, either.

I realize that, and you are correct in that regard. My point still has some value, in that there should still be more than a 22% difference in relative battery life going from 17W to 35W. Of course, we won't know for sure until we have numbers on a 17W Trinity.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
I also haven't seen many people who have a laptop with an HD 4000 yet. And even then, people who game on their laptops don't buy laptops with just Intel IGPs, so you're probably not going to see many people complain. But those that do will have legitimate things to complain about, like crappy image quality, something absent on AMD solutions.

No, you are correct, they did not mention any driver problems. But they didn't mention driver problems with the AMD system either, so your argument is invalid.

Well, the HD 4000 is available in a variety of quad-core Intel CPUs (desktop and laptop Core i7; desktop Core i5). And no one has mentioned any issues, so for now none exist, and the drivers argument is unfounded.

And Intel improved the image quality a lot compared to their previous efforts. From the review:

Image quality is actually quite good, although there are a few areas where Intel falls behind the competition. I don't believe Ivy Bridge's GPU performance is high enough yet where we can start nitpicking image quality but Intel isn't too far away from being there.

Anisotropic filtering quality is much improved compared to Sandy Bridge. There's a low precision issue in DirectX 9 currently which results in the imperfect image above, that has already been fixed in a later driver revision awaiting validation. The issue also doesn't exist under DX10/DX11.

Game compatibility is also quite good, not perfect but still on the right path for Intel. It's also worth noting that Intel has been extremely responsive in finding and eliminating bugs whenever we pointed at them in their drivers.

I'm not finding this driver and image quality "issue" many people are talking about.
 

pelov

Diamond Member
Dec 6, 2011
I'm not finding this driver and image quality "issue" many people are talking about.

Reread that again. They were comparing HD 3000 vs. HD 4000, so hardware, not driver updates. They also reached the conclusion that though it's improved tremendously, it still lags behind AMD's IQ and tessellation. Intel is working on their drivers, but they still have a lot of room for improvement if they wish to be considered on the same level as AMD/nV.
 

Kevmanw430

Senior member
Mar 11, 2011
Take your E420 and fire up any game, really. The few I've played on my HD 3000 all have crappy image quality, especially Skyrim, which barely runs anyway. The image quality may be improved on the HD 4000, but I've not seen anyone complain about image quality with AMD, other than in RAGE.

And people buying K-series CPUs almost NEVER use the IGP. Sure, it's available in quad-core laptop CPUs. Find me one person on the forum with an HD 4000 laptop.

Honestly, the only thing Intel has over AMD in the IGP department is QuickSync. However, it is a big factor in some areas.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
I realize that, and you are correct in that regard. My point still has some value, in that there should still be more than a 22% difference in relative battery life going from 17W to 35W. Of course, we won't know for sure until we have numbers on a 17W Trinity.

With the 17W Trinity you'll have roughly the same normalized battery life increase (20-25%) in comparison to the 35W version. Again, TDP is just a metric for OEMs to follow when it comes to heat dissipation and their heatsink/fans and nothing more. The 45W TDP i5-3570T doesn't consume anywhere near 42% less power than the 77W i5-3570K or even the 3770K, even if that's what the TDP would indicate.
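The ~42% figure above follows from the rated TDPs alone; the point being made is that measured power draw doesn't scale with it. A quick check of the arithmetic:

```python
# Rated TDPs, not measured power draw; the gap between the two is the argument.
tdp_3570t = 45  # watts
tdp_3570k = 77  # watts
reduction = 1 - tdp_3570t / tdp_3570k
print(f"{reduction:.0%}")  # about 42% lower TDP rating
```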
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Reread that again. They were comparing HD 3000 vs. HD 4000, so hardware, not driver updates. They also reached the conclusion that though it's improved tremendously, it still lags behind AMD's IQ and tessellation. Intel is working on their drivers, but they still have a lot of room for improvement if they wish to be considered on the same level as AMD/nV.

Yeah, no... they were talking about the drivers.

And they said in the review the IQ is more than enough for casual gamers looking to game at Mainstream resolutions.

Or is Trinity now suddenly an Enthusiast solution and that's why you need the best IQ instead of good IQ, which according to the review is what the HD 4000 delivers? Do clarify, because perhaps I foolishly thought that it would be people looking at discrete graphics solutions like GT 650M and up that would care about having the best image quality rather than "good enough". I thought AMD's target with Llano and Trinity was "good enough" budget gaming, so perhaps you should explain why a person buying a $500-600 laptop to do some gaming would care about the very best image quality when he's playing at Medium settings and is playing casually.
 

pelov

Diamond Member
Dec 6, 2011
Yeah, no... they were talking about the drivers.

Is that so?

Game Image Quality

For most gamers, image quality is an important consideration, and Intel has a bad reputation for delivering less-than-stellar visuals—particularly poor anisotropic filtering. Although SemiAccurate reports that the situation is much improved on Ivy Bridge, Sandy Bridge still suffers from terrible filtering quality.

http://www.tomshardware.com/reviews/a10-4600m-trinity-piledriver,3202-16.html

In a nutshell, we want the color patterns to map consistently to the geometry (so, in this case, we want them to be perfectly circular), and we want the transitions between each color to be smooth. Trinity's Radeon HD 7760G integrated graphics has no trouble with either task. Ivy Bridge's HD 4000 IGP also manages mostly circular patterns with smooth transitions, but if you look closely, you'll see jagged lines where the red fades into the background checkerboard pattern. As for Sandy Bridge, well, the image speaks for itself.

In a real-world example, the differences are plainly visible. Trinity and Ivy Bridge both give us nice, sharp textures at off-axis angles of inclination, while Sandy Bridge fails in a very noticeable way.

Drivers, huh? Is that why the older hardware falls behind? Purely drivers although they're both updated?

Or is Trinity now suddenly an Enthusiast solution and that's why you need the best IQ instead of good IQ, which according to the review is what the HD 4000 delivers? Do clarify, because perhaps I foolishly thought that it would be people looking at discrete graphics solutions like GT 650M and up that would care about having the best image quality rather than "good enough". I thought AMD's target with Llano and Trinity was "good enough" budget gaming, so perhaps you should explain why a person buying a $500-600 laptop to do some gaming would care about the very best image quality when he's playing at Medium settings and is playing casually.

So you buy the worse-performing hardware because it's good enough? They went with "good enough" on the CPU side and are still improving on the GPU side. Hell, here's Rory Read admitting the same thing. Intel hasn't said as much, but if you look at the CPU "improvements" in Ivy Bridge and the upcoming Haswell, you'll see they're doing the same. "Good enough" refers to CPU improvement, not GPU improvement, particularly where budget gaming laptops are concerned (or laptops in general: over three-quarters of new PCs sold in the US are now laptops, not desktops, so graphical performance has taken the front seat along with perf-per-watt, while CPU performance sits in the back). Improving CPU performance will only help by a few FPS at most when you're GPU-limited by the on-die graphics. Improving the on-die graphics by a significant amount while providing good-enough CPU performance means far more substantial gains.

Err, here's a better link.
http://www.brightsideofnews.com/news/2012/4/27/amds-close-future-analyzed-what-tomorrow-brings.aspx

On the other hand, Rory Read claims that North America performed very well, when in fact any reputable market research outfit reports similarly bleak numbers. The truth of the matter is that for a considerable number of people, computing became a commodity in the developed Western countries, which prolonged upgrade cycles. The old machines are simply good enough as long as they don't break. As usual, anyone playing demanding games or doing serious business on their PC is an exception to this rule.

Those who use their PC as a workstation generally have to pay more for a better CPU, and Intel has split its desktop platforms to supply that. So the business and workstation crowd is covered by more expensive chips on a more expensive, feature-rich platform. But what about the gamers? Well, that too is getting weird. Year-to-year discrete GPU sales have gone down by 3%. Although nV/AMD will attribute this to the hard drive shortage, the problem is that this same trend wasn't seen in laptop/desktop sales. Instead of discrete GPU sales increasing, they've actually gone down. What the hell? Well, the reason is the prevalence of cheap laptops and HD 3000 + APUs, which have stolen the thunder from low-end discrete GPU sales. The HD 4000 now means that Intel doesn't need nVidia/AMD for low-tier discrete GPUs, while Trinity can perform as well as a 6630M-6650M discrete GPU. This doesn't affect only laptops, despite laptops now making up the overwhelming share of sales, but also the desktop. Business PCs for offices don't need a discrete GPU anymore, and the low TDP of SB has allowed OEMs to make small and cheap PCs, partly because they also skip the discrete GPU. Llano on the desktop has likewise allowed lower-res gaming at higher settings for the gaming crowd.

It's sort of strange, really, but expected if you think about it. Considering so many people are bypassing desktops and just going with laptops and tablets, Intel and AMD have focused on providing the extras that benefit that average usage. As a result, instead of having the 3820, essentially a server chip that's better suited to the job, as the main desktop chip, we get the 2500K and 2600K with on-die graphics that most desktop users who buy unlocked chips just don't need. AMD hasn't done that yet, but they're also embracing the full APU strategy that Intel has already adopted [though if you count Faildozer then they technically have, as it's a server-first architecture]. Combine the on-die graphics on desktop enthusiast chips with the fact that most generation-to-generation improvements are going to be graphical in nature, and we're definitely seeing both companies tell us we've been spoiled by expecting CPU performance gains; they're spending their R&D elsewhere.

Outside of the 2011 workstation platform and AMD's horrible attempt at selling us a crappy server processor, the last year+ has been almost all on-die graphics focused. Haswell and Kaveri are looking to be much of the same as well. This shift of R&D and focus is one of the reasons I want to dump my desktop entirely. I don't need a workstation and I've outgrown supermagic enthusiast gaming. As a hardware enthusiast it seems I've fallen in with the mobile crowd.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Take your E420 and fire up any game, really. The few I've played on my HD 3000 all have crappy image quality, especially Skyrim, which barely runs anyway. The image quality may be improved on the HD 4000, but I've not seen anyone complain about image quality with AMD, other than in RAGE.

And people buying K-series CPUs almost NEVER use the IGP. Sure, it's available in quad-core laptop CPUs. Find me one person on the forum with an HD 4000 laptop.

Honestly, the only thing Intel has over AMD in the IGP department is QuickSync. However, it is a big factor in some areas.

Well, you said any game, so I downloaded Team Fortress 2 (I admit it isn't graphically demanding), which is still a pretty popular game played by casual gamers, and it ran fine.

But from what I've seen in reviews it will only run newer, more graphically demanding games at Low settings. The HD 4000 can play those fine at Medium settings, and it has good IQ.
 

Rezist

Senior member
Jun 20, 2009
If you want a gaming laptop, get Ivy Bridge with a dedicated card; if you want a casual gaming laptop, maybe check out Trinity after we see pricing; if you want a non-gaming laptop (as in, never actually run any game made in the last 5-10 years), get Ivy Bridge.
 

Kevmanw430

Senior member
Mar 11, 2011
Way to pick TF2, awesome game.

However, like you said, it's not necessarily known for its graphical fidelity.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Is that so?
http://www.tomshardware.com/reviews/a10-4600m-trinity-piledriver,3202-16.html


So you buy the worse-performing hardware because it's good enough? They went with "good enough" on the CPU side and are still improving on the GPU side. Hell, here's Rory Read admitting the same thing. Intel hasn't said as much, but if you look at the CPU "improvements" in Ivy Bridge and the upcoming Haswell, you'll see they're doing the same. "Good enough" refers to CPU improvement, not GPU improvement, particularly where budget gaming laptops are concerned (or laptops in general: over three-quarters of new PCs sold in the US are now laptops, not desktops, so graphical performance has taken the front seat along with perf-per-watt, while CPU performance sits in the back). Improving CPU performance will only help by a few FPS at most when you're GPU-limited by the on-die graphics. Improving the on-die graphics by a significant amount while providing good-enough CPU performance means far more substantial gains.

Err, here's a better link.
http://www.brightsideofnews.com/news/2012/4/27/amds-close-future-analyzed-what-tomorrow-brings.aspx



Those who use their PC as a workstation generally have to pay more for a better CPU, and Intel has split its desktop platforms to supply that. So the business and workstation crowd is covered by more expensive chips on a more expensive, feature-rich platform. But what about the gamers? Well, that too is getting weird. Year-to-year discrete GPU sales have gone down by 3%. Although nV/AMD will attribute this to the hard drive shortage, the problem is that this same trend wasn't seen in laptop/desktop sales. Instead of discrete GPU sales increasing, they've actually gone down. What the hell? Well, the reason is the prevalence of cheap laptops and HD 3000 + APUs, which have stolen the thunder from low-end discrete GPU sales. The HD 4000 now means that Intel doesn't need nVidia/AMD for low-tier discrete GPUs, while Trinity can perform as well as a 6630M-6650M discrete GPU. This doesn't affect only laptops, despite laptops now making up the overwhelming share of sales, but also the desktop. Business PCs for offices don't need a discrete GPU anymore, and the low TDP of SB has allowed OEMs to make small and cheap PCs, partly because they also skip the discrete GPU. Llano on the desktop has likewise allowed lower-res gaming at higher settings for the gaming crowd.

It's sort of strange, really, but expected if you think about it. Considering so many people are bypassing desktops and just going with laptops and tablets, Intel and AMD have focused on providing the extras that benefit that average usage. As a result, instead of having the 3820, essentially a server chip that's better suited to the job, as the main desktop chip, we get the 2500K and 2600K with on-die graphics that most desktop users who buy unlocked chips just don't need. AMD hasn't done that yet, but they're also embracing the full APU strategy that Intel has already adopted [though if you count Faildozer then they technically have, as it's a server-first architecture]. Combine the on-die graphics on desktop enthusiast chips with the fact that most generation-to-generation improvements are going to be graphical in nature, and we're definitely seeing both companies tell us we've been spoiled by expecting CPU performance gains; they're spending their R&D elsewhere.

Outside of the 2011 workstation platform and AMD's horrible attempt at selling us a crappy server processor, the last year+ has been almost all on-die graphics focused. Haswell and Kaveri are looking to be much of the same as well. This shift of R&D and focus is one of the reasons I want to dump my desktop entirely. I don't need a workstation and I've outgrown supermagic enthusiast gaming. As a hardware enthusiast it seems I've fallen in with the mobile crowd.

Yes, that is so. The link you provided said the image quality is much improved and good for the market being targeted. It's down to both software and hardware improvements.

I thought the whole mantra of the pro-AMD crowd here was that AMD was "good enough" when it came to the CPU and IGP. Intel is now "good enough" when it comes to the IGP, and now suddenly having the best IQ matters to a casual gamer playing at Mainstream settings? C'mon.

The problem for AMD is exactly that: their "good enough" mantra. Being only "as good" as Intel isn't gonna get them huge points in market share. Now with most of their IGP advantage gone, there's even less reason to go AMD. After all, Intel and AMD are comparable on everything that isn't CPU performance, and because of that people will flock to Intel by default. AMD needs to be better than Intel when it comes to battery life, design wins, power consumption, among other things. If they're only "as good", then they're not really being "predators", as Read is saying AMD wants to be.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Way to pick TF2, awesome game.

However, like you said, it's not necessarily known for its graphical fidelity.

Yeah. Like I said, I don't play games on my laptop, so that's the only thing I could test.

The HD 4000 is able to play much more demanding games like Skyrim at Medium settings without issue, though. IQ isn't the best, but it's good.
 

pelov

Diamond Member
Dec 6, 2011
So you quote my TL;DR block o' text and assume "good enough" applies to GPU performance and not CPU performance?

In case you missed it, here's the Ivy Bridge review.
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review

Here's the Haswell architecture
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

And AMD is also focusing mostly on GPU improvements as well.
Kaveri
http://www.nordichardware.com/news/...s-performance-on-par-with-radeon-hd-7750.html
Trinity
http://techreport.com/articles.x/22932

Yes, that is so. The link you provided said the image quality is much improved and good for the market being targeted. It's down to both software and hardware improvements.

By definition, if driver updates don't improve it, that means the hardware did, which was exactly my point. The HD 3000 doesn't run into an IQ problem because of drivers but because of hardware.
 

Olikan

Platinum Member
Sep 23, 2011
Honestly, the only thing that Intel has over AMD in the IGP department is QuickSync. However, it is a big factor in some areas.


Both VCE and QuickSync appear to halve transcoding times... except the latter looks to be considerably faster. We didn't see much of a difference in output image quality between the two, but the output files had drastically different sizes. QuickSync spat out a 69MB video, while VCE got the trailer down to 38MB. (Our source file was 189MB.) Using QuickSync in high-quality mode extended the Core i7-3760QM's encoding time to about 10 seconds, but the resulting file was even larger—around 100MB. The output of the software encoder, for reference, weighed in at 171MB.
http://techreport.com/articles.x/22932/8
QuickSync is faster, but VCE gives a better file size... IMO, a tie.


Actually, it seems that VCE gives AMD a technology similar to Intel's WiDi:
The brand-new VCE block throws hardware-accelerated H.264 encoding into the mix, too—something that's important not just for performance and power efficiency reasons, but also for enabling new features like wireless displays
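The file sizes quoted from the TechReport test work out to very different compression ratios, which is why "faster" alone doesn't settle it. A quick look at the quoted numbers:

```python
# File sizes quoted from the TechReport article above (MB); the source trailer
# was 189 MB, and each encoder produced a very different output size.
source_mb = 189
outputs = {"QuickSync": 69, "QuickSync HQ": 100, "VCE": 38, "software": 171}
for name, size in outputs.items():
    print(f"{name:12s} {size:4d} MB ({size / source_mb:.0%} of source)")
```

VCE gets the file down to about a fifth of the source size, versus over a third for QuickSync's default mode.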
 

pelov

Diamond Member
Dec 6, 2011
That also depends on the program and implementation.

Moving water is a very difficult thing to encode properly, but CME’s Quick Sync version looks like something hacked together with MS Paint and QuickTime 1.0. There are additional differences between the various outputs that we can’t show you in a still image. Stargate:AoT was shot and encoded on the DVD at 23.97 frames per second (often called 24p). The GTX 580 and the Quick Sync file maintain this frame rate. The Radeon 7950 and the software-encoded version increase the frame rate to 30 fps. In the old days, when a 24p film had to be modified for broadcast television, this was done by repeating frames using a method known as 2:3 pulldown.

There’s absolutely no reason to convert a 24p film for 30p playback on a modern digital device. Doing so creates a perceptible judder in motion sequences. It also increases file size by adding 25% more frames.

http://www.extremetech.com/computing/128681-the-wretched-state-of-gpu-transcoding
That's a very good article on the topic of image quality. Granted, this was before OpenCL Trinity acceleration, and it depends very heavily on the program. Some don't utilize Quick Sync while others do, and the same goes for CUDA and OpenCL. Unless I have things backwards, Handbrake performs very well on AMD's architecture and is generally the go-to program for such tasks.

It's about a lot more than which one is faster. There's file size, as you already mentioned, which program you favor, as well as image quality. AMD relies on software and GPU architecture to make up for the lack of dedicated hardware for that purpose (Quick Sync).
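The 2:3 pulldown mentioned in the ExtremeTech quote can be sketched at the frame level. Real pulldown alternates two and three interlaced fields per film frame, but a simplified frame-repeat version shows the same 25% inflation the quote complains about:

```python
# Simplified, frame-level sketch of 24p -> 30p conversion: every group of four
# film frames yields five video frames by repeating the last one. Real 2:3
# pulldown works on interlaced fields, but the frame-count math is the same.
def pulldown_24_to_30(frames):
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 4 == 3:  # repeat every fourth frame
            out.append(frame)
    return out

film = list(range(24))           # one second of 24p film
video = pulldown_24_to_30(film)
print(len(video))                # 30 frames: 25% more, hence the judder and larger files
```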
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
So you quote my TL;DR block o' text and assume "good enough" applies to GPU performance and not CPU performance?

In case you missed it, here's the Ivy Bridge review.
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review

Here's the Haswell architecture
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

And AMD is also focusing mostly on GPU improvements as well.
Kaveri
http://www.nordichardware.com/news/...s-performance-on-par-with-radeon-hd-7750.html
Trinity
http://techreport.com/articles.x/22932



By definition, if driver updates don't improve it, that means the hardware did, which was exactly my point. The HD 3000 doesn't run into an IQ problem because of drivers but because of hardware.

Intel is focusing mostly on IGP improvements because they've largely perfected what they can do on the CPU side. Therefore, you run into points of diminishing returns. The IGP is where they were far behind, so even if it doesn't matter to the majority of users (earlier platforms had no problems with sales because of it), that's where they're focusing on improving. Improving their IGP performance with each new uArch and process node still allows them to reach their ever-elusive higher performance/watt.

And it doesn't matter that the HD 3000 had lower IQ because it wasn't an IGP meant for pretty much any type of gaming. It was made to be efficient, play HD video, and do quick transcoding and it excels in those areas. Since HD 4000 can play games in Medium, Intel focused on improving IQ further. They don't need the best IQ. Again, people gaming at Medium settings don't care as long as it's good and it is.

The problem for AMD is, who and what are they aiming at?

Budget consumers will go with Intel because of their better brand recognition and the fact it does everything AMD can, just as well and at the same price.

Budget laptop gamers can go either way--the HD 4000 is good enough for gaming at Medium settings. Budget desktop gamers can go for a dual-core IB and discrete graphics for the same price as Trinity and get higher performance.

Mainstream consumers will go with Intel because of the aforementioned brand recognition and the fact they care more about CPU than IGP performance and, as I said earlier, Intel is just as good when it comes to chassis weight, battery life, and size.

Business/enterprise will go with Intel because of higher CPU performance at AMD's price points and the fact AMD doesn't have high-performance CPUs, not to mention the power consumption difference between the two is a wash in mobile and in Intel's favor in desktop/server. In both cases, performance/watt is higher with Intel.

Power users will go for a Core i5 or i7 and a dGPU for the best performance.

Travelers may go for an AMD ultra-thin because of much lower pricing than Intel's Ultrabooks and "good enough" performance coupled with competitive battery life, weight, and size.

So really the only people I see AMD targeting are budget gamers and travelers, along with AMD fans. Doesn't surprise me too much that Intel has more than 80% of the laptop market.
 