What name do you post under at AMDZone?
The most elementary thought would have been to assume
that if I were to post in another forum it would be under
the same name....
I can conclude that you're inherently paranoid....
What name do you post under at AMDZone?
Those are your standards; other people find them acceptable.
I haven't seen you complain about the pro-Intel statements in this thread before; I guess you are Intel biased :whistle:
Have you ever used an A8 on a laptop? I bought one for my daughter, and I can't disagree more. The only issue I have with it is actually the less-than-great keyboard on the laptop. Fantastic chip for all games at the 13x9 resolution of the laptop. I could play everything at max settings as well. I was quite impressed.
it is not up to you to say that you are objective... because honestly you are not. (But that is not a problem, that's human nature... just don't act as if you are above the rest...)

1. I am not a fan of Intel. I am a fan of the best product for the money. I really get frustrated when people only see one side of a question and blindly support one company or another. Believe it or not, I used to be a big fan of AMD, until they started putting out an inferior CPU and hyping it to high heaven. I am particularly irked by the "good enough" arguments in these forums (and now apparently espoused by RR himself) used to excuse the inferior CPUs being put out by AMD. I could accept "good enough" CPU performance in the days of the Phenom II because they had a distinct price advantage. Now that has disappeared, so I totally don't understand why one would settle for "good enough" when better performance is available at the same price or less. I also think it very unfair that people on these forums continually use "good enough" to excuse AMD where they trail in performance, but don't seem to give Intel the same break in regards to graphics.
2. APUs are not acceptable to me (either AMD or Intel) because neither can even match the performance of my almost six-year-old, bone stock except for a 9800GT added, Core 2 Duo desktop. They also probably cannot even match the performance of a SB i5 with a GT540M add-in card for gaming. I actually was hoping Trinity would be a killer on the graphics front, but it turned out to be just a moderate improvement.
I also think it very unfair that people on these forums continually use "good enough" to excuse AMD where they trail in performance, but don't seem to give Intel the same break in regards to graphics.
Doesn't everything depend on graphics nowadays? Even Windows itself is a Graphical User Interface (GUI). I think snappy graphics are always important (and the first thing normal people would notice if they are choppy/lacking), whereas other "processing/loading" tasks have always been expected to take time on computers. I think it's just the way human brains are wired.
Kind of. Aero is built into the operating system for the most part, whereas in Linux or OS X the window manager is usually a program running on top of the OS. The problem you have with that is the fact that most of the instructions related to graphical activity will still be pushed through the CPU. That's more or less the way that modern systems are written. Graphics COULD be more important, but until proper code is written to really embrace heterogeneous computing, jittery windows systems will remain largely a CPU problem.
You can replace the GUI in Windows. It's documented.
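For anyone curious, the documented mechanism is the "Shell" value under the Winlogon registry key; Explorer is just the default shell and Windows will launch whatever that value points at instead. Here's a minimal sketch in C, assuming your Windows version honors the per-user value (the system-wide one lives under HKLM) — the replacement shell path is purely a placeholder:

[code]
#include <windows.h>
#include <string.h>
#include <stdio.h>

#pragma comment(lib, "advapi32")

int main(void)
{
    HKEY key;
    /* Per-user shell override under the Winlogon key; the path below
       is a hypothetical replacement shell, not a real program. */
    const char *subkey = "Software\\Microsoft\\Windows NT\\CurrentVersion\\Winlogon";
    const char *shell  = "C:\\Shells\\myshell.exe";

    if (RegOpenKeyExA(HKEY_CURRENT_USER, subkey, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
        fprintf(stderr, "could not open Winlogon key\n");
        return 1;
    }
    if (RegSetValueExA(key, "Shell", 0, REG_SZ,
                       (const BYTE *)shell, (DWORD)(strlen(shell) + 1)) != ERROR_SUCCESS) {
        fprintf(stderr, "could not set Shell value\n");
        RegCloseKey(key);
        return 1;
    }
    RegCloseKey(key);
    printf("Shell value set for the current user; takes effect at next logon.\n");
    return 0;
}
[/code]

It only changes what gets launched at logon, though; the compositing path the post above describes stays the same either way.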
What were you playing that you could use "max settings"? And what screen were you using that had 13x9 resolution? Most laptops are 1366x768. I bet you could not even play Skyrim at high, much less ultra ("max") settings on that laptop, unless you are satisfied with something like 20FPS. And believe me, I would love it if a 500-600 dollar laptop could play "all games" on highest settings. I would have bought one in a heartbeat and given it to my grandson, whose old Asus gaming laptop is giving him problems.
Just to chime in, not all games have to be played on high. When I had to go back to an 8800GTS a few months ago, I had to play a lot of games at lower settings. It wasn't so bad, to be honest. If a $500 laptop can let me play Dirt 3 or SC2 or D3 at lower settings with playable FPS, that's fine with me. I mean, D3 is the fastest selling PC game and Llano can play it at 1920x1080. I'm not cheering for Intel or AMD here, but you guys are unfairly beating up on these APUs. Not even $300 desktop video cards can max out every game, and you guys are condemning 35W APUs for not being able to max out every game.
If I traveled a lot, I would get myself a Trinity just because it can play light games and it has good battery life. Until Intel fixes their IQ issues, I'm not touching the HD 4000.
Image quality is actually quite good, although there are a few areas where Intel falls behind the competition. I don't believe Ivy Bridge's GPU performance is high enough yet where we can start nitpicking image quality but Intel isn't too far away from being there.
Anisotropic filtering quality is much improved compared to Sandy Bridge. There's a low precision issue in DirectX 9 currently which results in the imperfect image above, that has already been fixed in a later driver revision awaiting validation. The issue also doesn't exist under DX10/DX11.
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/8
Game compatibility is also quite good, not perfect but still on the right path for Intel. It's also worth noting that Intel has been extremely responsive in finding and eliminating bugs whenever we pointed at them in their drivers.
The HD 4000 can play games at Mainstream settings and Ivy Bridge can deliver good battery life. And the HD 4000 doesn't have any notable IQ issues, unlike some AMD fans may tell you.
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/8
IQ issues not found. They say IQ is "quite good" and there's nothing wrong given the intended use (games at 1366x768 at Medium settings).
The HL2 and Skyrim screenshots earlier in this thread show IQ issues. There was also another table earlier in this thread showing some issues (not all IQ related) in a few games.
I'm really going to dig through the thread to find them. Also, you might want to look at more than one website for benchmarks. D3 and Source engine games run extremely poorly on the HD 4000.
The resolution was likely 13x7 rather than 9, but to be honest I don't know, nor care, what the actual resolution is. I have not installed Skyrim on her computer, nor will I, but I have no doubt it would run well at relatively high settings since it runs so well on my computer at a much higher resolution. Let's face it, there are only a handful of games that need a more powerful GPU at that resolution, since most games are made with old engines and are developed alongside multiple consoles that have even less graphical power than Llano.
Please list by name any current, reasonably graphically intensive games that you have played on the A8 at maximum settings, at native resolution and at a frame-rate over 30FPS.
Otherwise, your post is just speculation and generalization that means nothing.
Not sure if serious... unless you found Netburst to be "sound". If Bulldozer was sound, AMD wouldn't be downplaying the negative of it having underwhelming performance. Piledriver is rumored to be 10% faster than Bulldozer and have 10-25% power savings.
That's really barely better than what Intel did with Ivy Bridge, which was an increase in performance of 5% and power savings by the same percentage as Piledriver.
Not sure if serious
Did you just say that a die shrink that is 5% faster and 25% more power efficient is as big a technical marvel as a tweaked design that achieves the same results (5% faster and 25% more power efficient) on the same process node?
Not when that design is unoptimized yet unrevivable, which is what the Bulldozer architecture is. AMD can continue to make tweaks, but the baseline they're working with here is complete crap. Bulldozer isn't gonna get AMD into a competitive position in CPU performance. Not now, not ever.
I guess they didn't learn from Netburst.
Your argument is bad.
5% faster and 25% more efficient (while keeping the die size roughly the same) on the same process node is a very big deal.
If you wanted to make a good argument you would admit all these factors are true (which they are, by the way) but then argue that the performance is still too low.
-----
5% faster and 25% more efficient due to a die shrink is not a technical marvel; it is par for the course. If you wanted to make a good argument you should argue that Intel didn't need to make Sandy Bridge much faster when they went to Ivy Bridge; instead they focused on learning the process tech, improving yields, and keeping the die size small. It is with Haswell that you will see the big performance increase.
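Just to put numbers on the figures both sides keep throwing around: if "5% faster" means 1.05x performance and "25% more efficient" means 0.75x power draw (that reading is my assumption), the combined perf-per-watt gain works out to roughly 40%, which is why one side calls it a big deal and the other calls it routine for a shrink. Quick sketch:

[code]
#include <stdio.h>

int main(void)
{
    /* Assumed reading of the quoted figures:
       "5% faster" = 1.05x performance, "25% more efficient" = 0.75x power draw. */
    double perf_gain   = 1.05;
    double power_ratio = 0.75;

    /* Performance per watt scales as performance / power. */
    double perf_per_watt = perf_gain / power_ratio;   /* ~1.40x */

    printf("Perf/W improvement: about %.0f%%\n", (perf_per_watt - 1.0) * 100.0);
    return 0;
}
[/code]

Whether ~40% perf/W in one generation is impressive or merely expected is exactly the argument above.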
They haven't addressed the decoder width issue yet.
I think it'll be interesting to see what they do with Steamroller. A shorter pipeline with faster caches? From the looks of Piledriver, the IMC hasn't been tweaked at all besides the bump to support 1600MHz memory and whatever it goes to now on the desktop. If AMD wants their APUs to perform well, the IMC had better be one of the strongest parts of the Steamroller architecture, otherwise they're screwed.
No, it is not. Because, again, all they're doing is optimizing things that didn't make it into Bulldozer.
It's basically the same thing Intel did with Ivy Bridge, but the difference is that Sandy Bridge is an excellent architecture, and Intel used the new transistors and process node to still hugely improve IGP performance. It's not a big deal even if AMD got a 10% improvement in CPU performance and a 20% reduction in power consumption, simply because they're so far behind already and it means they made zero ground toward closing the gap to Intel. The huge CPU gap is still there and it's not being closed.
Again, AMD will keep optimizing Bulldozer, but at no stage will it be better than an Intel architecture unless Intel manages to destroy two-three generations of progress and I doubt that will happen.
Could you elaborate on this? I'm not sure what you mean. And this is not a negative post, I'm just a curious guy that loves reading everyone's opinions.