Anandtech Reviews R600!

Laminator

Senior member
Jan 31, 2007
852
2
91
http://www.anandtech.com/video/showdoc.aspx?i=2988&p=31

For now, R600 is a good starting place for AMD's DX10 initiative, and with a bit of evolution to their unified shader hardware it could eventually rise to the top. We aren't as excited about this hardware as we were about G80, and there are some drawbacks to AMD's implementation, but we certainly won't count them out of the fight. Power efficiency on 65nm remains to be seen, and there is currently a huge performance gap NVIDIA has left between the 8600 GTS and the 8800 GTS 320MB. If AMD is able to capitalize here with the HD 2600 series, they will certainly still have a leg to stand on. We will have to wait to see those performance results though.

In the meantime, we are just happy that R600 is finally here after such a long wait. Let's hope for AMD's sake that the next revision of their hardware doesn't take quite so long to surface and manages to compete better with six month old competing products. We certainly hope we won't see a repeat of the R600 launch when Barcelona and Agena take on Core 2 Duo/Quad in a few months....

Something of a balance between Tom's Hardware and HardOCP. By the way, why is Tom's Hardware still using Doom 3? And what's up with their conclusion?
 

MrWizzard

Platinum Member
Mar 24, 2002
2,493
0
71
I guess it's just "OK".

I think I would still get an 8800 GTS over it though. But that's just me.
 

PingSpike

Lifer
Feb 25, 2004
21,755
599
126
Seems pretty good. Edges out the 8800gts (except in Stalker), enough for it to be competitive at that price point. But with some downward price pressure on the 8800gts 640mb...and the drawbacks on the R600 of a louder cooler and high power consumption, it's kind of a wash to me.

ATI had a hefty performance advantage and vastly superior IQ last generation to make up for its crummy coolers and weak selection of board partners. Not so this time around. With how much these cards cost, I feel like the 8800gts is still a better choice.
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
I couldn't find it in the article (probably missed it), but what drivers were they using?

KT
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I'd have to say that that was one of their better reviews.

The 2900XT loses to even the X1950XTX in some cases, which should *never* happen with a new launch meant to overtake the high end. Granted, it's not the XTX version, but still.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
The power consumption doesn't look to be quite as bad as others have made it out to be. It's only a little higher than the GTX.

What was more interesting was that they benched 8800GTS SLI. I have had tons of trouble finding reviews that included that. If Nvidia doesn't do a G80 refresh anytime soon, then maybe SLI will be a somewhat more viable upgrade than usual this year.
 

Dashel

Senior member
Nov 5, 2003
226
0
71
Yup, I really liked this review, and it pretty much confirms my opinion, judging by what other reviewers have said. It's a pretty good card, with the potential to be a very good card, but not a great card. It was late, eats power, and is noisy.



 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
It looks like the hack to get AA working hurts performance badly.

Numbers for Unreal 3 engine (Rainbow 6, no AA) and Oblivion without AA were both good. But given the power draw (19 watts above a GTX) and poor AA speed I think I'd wait for the 2950 refresh.

Luckily I'm waiting until the end of the year for my next card anyway, so I'll hopefully have the 2950 and 8900 to pick from. nvidia Vista drivers might even work by then.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
8800 320 is still the best price/perf bang for your buck at less than $300 in most places unless you game above 1680x1050.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
I wrote a comment but it apparently didn't post. The gist of it was thanking Derek for such an amazing amount of in-depth technical explanation of the R600's technology and then comparing it to the G80's hardware. Very cool and I imagine it took quite a lot of time to research and write.
Maybe it didn't post because the subject line I used was "holy crap Derek!". Does it filter comments with a subject that uses the word "crap"? hehe
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
A good GPU article from AT after a significant amount of time... Well put and organized.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Xarick
8800 320 is still the best price/perf bang for your buck at less than $300 in most places unless you game above 1680x1050.

At least two manufacturers have the 640 meg 8800 for $330 rebated at the egg. Even at below 1680x1050 the extra 20% cost is more than worth it to handle the ginormous textures in current and undoubtedly future games.

As I've said in another thread -- the heat, noise, and possible requirement of a new power supply make this card less attractive than the FPS/$ figures suggest. It'd have to be discounted in relation to the 640MB 8800GTS to be attractive to the typical end user, IMO. And considering how much cheaper the 8800GTS is to make, coupled with a 7 month lead to recoup R&D costs, that may never happen.

This is the X1800XT all over again, 17 months later.

 

yacoub

Golden Member
May 24, 2005
1,991
14
81
It's really $360. MIRs don't count because they're a hassle to fill out and you have to wait, so you're really paying $360. Come tell me when there's an eVGA 640MB GTS for $325 that doesn't include rebates.
 

zetto

Member
May 2, 2004
89
0
0
Wow, what a disappointment... Kyle Bennett of Hard was hinting for a long time that R600 is largely a flop... While he does seem biased against this thing, I gotta agree, the R600 release has a lot in common with the Geforce FX fiasco... Granted, R600 may yet redeem itself, but the launch is atrocious, imho.

While it's off-topic, I gotta say I've never had a problem with rebates (except once) and I've done a ton of them... And if you go to the hassle of upgrading your video card, you can afford to spend a couple of minutes of your time to fill out a rebate and get some cash for that little effort.
 

Chadder007

Diamond Member
Oct 10, 1999
7,560
0
0
I'm thinking AMD will redeem itself in the low to midrange when comparing the 2600 with the 8600s. Though I know one site had a different driver set, and the performance difference between the drivers was 40% in Doom3. Maybe they still have some driver issues to work out.
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
In the midrange, they are still using a 128-bit bus just like nvidia... let's see how everything works out there.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
I think it's odd they don't mention what drivers they were using. Every other review has noted the problem of immature drivers, yet AT says nothing about this (at least I didn't see it) and doesn't say what drivers they used.
 

spittledip

Diamond Member
Apr 23, 2005
4,480
1
81
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: spittledip
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.

Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS were $70 less (which it is now but will not be for long) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

The two things that I think are causing the HD 2900XT to not perform as well as it should are a) only 16 TMUs, and b) immature drivers (especially with AA performance). The majority of games today are still very texture dependent, and the 8800GTX with 32 TMUs has a huge advantage over the HD 2900XT's 16. This is one of the reasons, along with drivers, that the 2900XT doesn't see a huge advantage over the X1950XTX in some games - it only has slightly more texture power from the higher clockspeed (750MHz vs 650MHz). "Next-gen" games like Company of Heroes and those based on the Unreal 3 engine are much more shader dependent, and here the HD 2900XT shines.
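A quick back-of-envelope check of that texture gap (treating peak texel rate as simply TMUs x core clock; the TMU counts are the ones quoted above, and the 575MHz core clock for the 8800 GTX is my assumption):

```python
# Rough peak texel fillrate = TMUs x core clock (MHz).
# TMU counts are the figures from the post above; the 575 MHz
# core clock assumed for the 8800 GTX is my own number.
cards = {
    "HD 2900XT": (16, 750),  # (TMUs, core MHz)
    "X1950XTX":  (16, 650),
    "8800 GTX":  (32, 575),
}
for name, (tmus, mhz) in cards.items():
    print(f"{name}: {tmus * mhz / 1000:.1f} Gtexels/s")
```

So by this crude measure the 2900XT only picks up about 15% texture throughput over the X1950XTX, while the GTX sits well above both, which fits the argument above.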
 

Tridus

Junior Member
May 14, 2007
1
0
0
Originally posted by: Extelleron
Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there's no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

Right now it's $70 less for similar performance. That alone is reason enough.

Power is also an issue. The 2900XT draws insane amounts of power for the performance, far more than the 8800GTS. Power costs money, both in the power itself and in a PSU that can feed it (not a problem if you bought a 600W PSU, but lots of us don't have those). Plus, all that extra power gets converted into extra heat.

There isn't much to recommend the 2900XT against a cheaper 8800GTS.
 

spittledip

Diamond Member
Apr 23, 2005
4,480
1
81
Um... you can save over $120 buying the 320. The performance difference between the 320 and the 2900xt is minimal as it is, except at the highest resolution. The 640 beats the 2900xt several times also, and the times it does not beat it, the difference is only a few FPS. Why would I spend more money for a part that is pretty much the equal of another, less expensive part?
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Tridus
Originally posted by: Extelleron
Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there's no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

Right now its $70 less for similar performance. That alone is reason enough.

Power is also an issue. the 2900XT draws insane amounts of power for the performance, far more then the 8800GTS. Power costs money, both in the power itself and in a PSU that can feed it (not a problem if you bought a 600w PSU, but lots of us don't have those). Plus, all that extra power gets converted into extra heat.

There isn't much to recommend the 2900XT against a cheaper 8800GTS.

The most advanced games featured in the test suites (CoH as one of them, Rainbow Six another, which is based on the UE3 engine) run very well on the HD 2900XT. In CoH the 2900XT is significantly faster than the GTS, and at 2048x1536 with 4x/8x settings in Vista it beats the GTX. In Rainbow Six the 2900XT is faster than an overclocked GTX. (In one review at least; Anandtech shows the GTX and XT with equal performance. Perhaps legitreviews used a newer driver.)

 