Poll: GT300 VS. HD5870

Page 9

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: ochadd
Does a six-month wait really merit any kind of wait and see? If the GT300 comes out with 20% or better frame rates over a 5870, then sell it and pick up the better card. You'd probably take a $50-100 hit from the purchase price, and you'd have gotten six months of gaming @ max settings.

That's the thing: to me, Nvidia releasing only specs to counter the 5870 makes me think it may be a while until their card is released. Had they released benches, that'd be different.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: SlowSpyder


That's the thing: to me, Nvidia releasing only specs to counter the 5870 makes me think it may be a while until their card is released. Had they released benches, that'd be different.


http://www.anandtech.com/video/showdoc.aspx?i=3573

What exactly are you showing me? That AMD demoed Evergreen 40nm parts on June 3rd and launched on September 23rd? That we should expect the same from Nvidia, around 3 months until Nvidia has cards?

Did Nvidia demo parts even, or just announce specs?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nismotigerwvu
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.

This may explain why they are losing money. NVIDIA builds an architecture good enough to last more than a few months so that they can recoup the cost. AMD has to kill off its architectures after a rather short time. I bet that can get expensive.
 

WelshBloke

Lifer
Jan 12, 2005
32,582
10,757
136
Originally posted by: SlowSpyder
Originally posted by: Wreckage
Originally posted by: SlowSpyder


That's the thing: to me, Nvidia releasing only specs to counter the 5870 makes me think it may be a while until their card is released. Had they released benches, that'd be different.


http://www.anandtech.com/video/showdoc.aspx?i=3573

What exactly are you showing me? That AMD demoed Evergreen 40nm parts on June 3rd and launched on September 23rd? That we should expect the same from Nvidia, around 3 months until Nvidia has cards?

Did Nvidia demo parts even, or just announce specs?

Funnily enough it does look like it'll be three or so months.

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: nismotigerwvu
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.

This may explain why they are losing money. NVIDIA builds an architecture good enough to last more than a few months so that they can recoup the cost. AMD has to kill off its architectures after a rather short time. I bet that can get expensive.

What are you even talking about now? AMD and Nvidia are both losing money. AMD's architecture is based on the R600, which later became the RV770 and now the Evergreen chips. Nvidia's G80 was used to make the G92, and then the GTX 2x0 cards were based on that.

If anything, AMD did their homework for DX11 by adopting DX10.1 and getting to 40nm first, which probably aided them in getting the 5870 out first. Not to mention AMD has used tessellation for a number of generations. It appears they were more forward-thinking in many ways. Seems to me like their architectures last more than "a few months."

Distorting things again...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Wreckage
Originally posted by: nismotigerwvu
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.

This may explain why they are losing money. NVIDIA builds an architecture good enough to last more than a few months so that they can recoup the cost. AMD has to kill off its architectures after a rather short time. I bet that can get expensive.

Over 70% of AMD's revenue is CPU related. Has it ever dawned on you that it is AMD's CPU division that is struggling to make the company profitable? Why do you constantly jump to the conclusion that it is AMD graphics that is bringing the firm down?

You are failing to realize that AMD graphics was profitable before the takeover. Yes, they completely flopped the 2900 series, but after that things went OK for them. Making blanket statements that the AMD Graphics Division has not been making a profit (implying from the takeover until now, based on your statements) is just incorrect. The reason AMD is struggling is largely the enormous interest expense on its buyout debt, a lack of competitiveness in the mobile CPU space, and reduced CPU margins due to uncompetitiveness with Intel. Your logic implies that AMD Graphics has negative gross margins due to increased cost of sales (i.e., with manufacturing costs included).

GAAP gross margin (whole company):
AMD: Q1 43% - Q2 37%
nVidia: Q1 29% - Q2 20%
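
For clarity, gross margin is just (revenue minus cost of sales) divided by revenue. Here is a minimal sketch of that calculation in Python, using hypothetical figures rather than either company's actual numbers:

def gross_margin(revenue, cost_of_sales):
    # GAAP gross margin, expressed as a fraction of revenue
    return (revenue - cost_of_sales) / revenue

# Hypothetical quarterly figures in millions of dollars, illustrative only
revenue = 1000
cost_of_sales = 630
print("Gross margin: {:.0%}".format(gross_margin(revenue, cost_of_sales)))  # -> 37%

Note that the margins quoted above are company-wide; you would need the segment reporting to see whether the graphics division on its own is above or below water.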
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
Originally posted by: Wreckage
Originally posted by: nismotigerwvu
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.

This may explain why they are losing money. NVIDIA builds an architecture good enough to last more than a few months so that they can recoup the cost. AMD has to kill off its architectures after a rather short time. I bet that can get expensive.

I'm pretty sure higher-binned chips (a la the 4890) cost AMD nothing, and die shrinks lead to better margins. In reality, AMD is still using a tweaked version of R600. Nothing has been killed here.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: swing848
ATI could have taken more time with the GPU and redesigned it heavily, including a 512-bit bus. The biggest problems with that are more time [NVIDIA would probably have their card out first], the cost of R&D, and the cost of an untried new fabrication process.

ATI played it smart by using the 40nm HD4770 as a test bed, then basically doubling the internal hardware of the HD4890 on 40nm. This saved a great deal of time and money.

If ATI can kick their marketing department in the rear, they could go into overdrive with a lot of clout with software manufacturers before Nvidia's next generation hits the shelves. And this is happening, at least to some extent, because some companies are already writing code for ATI's new card.

The longer it takes Nvidia to bring a new card to market, the more leverage ATI will have.

Again, which company will have the fastest video card is yet to be seen. Personally I like competition; it drives prices down and brings new products to market faster.

Dan

A 512-bit bus is exactly what ATI has been trying to avoid since the 2900XT days and was one of the factors that led to ATI's "small die" strategy.

Releasing a $379 part with a 334 mm² die while the economy is still recovering is a very smart strategy. GT300 will be a monster, no doubt, and will definitely be faster than the 5870, but Nvidia will have a delicate tightrope to walk between clock speed and TDP.

-----------

A second point I want to make: I'm very curious why several posters here think that a $579 GT300 will cause ATI to crap their pants and immediately drop the 5870 to ~$250. At $579, the GT300 and 5870 won't even compete directly against one another; it will be the GT300 vs. the 5870 X2!
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
The 5870 is fast enough for every current game at 2560 with AA and AF, and it doesn't look like graphics are getting any more demanding next year. We won't even need a card faster than the 5870 until 2011, which makes Fermi absolutely useless and overpriced.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: RussianSensation
Originally posted by: Wreckage
Originally posted by: nismotigerwvu
To me it seems like the comparison is going to be more 5890 or 6870 versus GT300. Evergreen will more than likely have a refresh or binned part by the time we see GT300 in volume. No links, no evidence, just a gut feeling based on availability.

This may explain why they are losing money. NVIDIA builds an architecture good enough to last more than a few months so that they can recoup the cost. AMD has to kill off its architectures after a rather short time. I bet that can get expensive.

Over 70% of AMD's revenue is CPU related. Has it ever dawned on you that it is AMD's CPU division that is struggling to make the company profitable? Why do you constantly jump to the conclusion that it is AMD graphics that is bringing the firm down?

You are failing to realize that AMD graphics was profitable before the takeover. Yes, they completely flopped the 2900 series, but after that things went OK for them. Making blanket statements that the AMD Graphics Division has not been making a profit (implying from the takeover until now, based on your statements) is just incorrect. The reason AMD is struggling is largely the enormous interest expense on its buyout debt, a lack of competitiveness in the mobile CPU space, and reduced CPU margins due to uncompetitiveness with Intel. Your logic implies that AMD Graphics has negative gross margins due to increased cost of sales (i.e., with manufacturing costs included).

GAAP gross margin (whole company):
AMD: Q1 43% - Q2 37%
nVidia: Q1 29% - Q2 20%

Actually AMD reports their graphics division revenue as well; it's in their reports, and they've lost money. There was a thread about it here, complete with the specifics, when the last earnings report cycle happened.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: Idontcare
Actually AMD reports their graphics division revenue as well; it's in their reports, and they've lost money. There was a thread about it here, complete with the specifics, when the last earnings report cycle happened.

The problem I have is that Wreckage is constantly harping about how AMD's GPU division (ATI) is losing money. WTF does that have to do with the quality of the 5870? WTF does that have to do with the 5870 vs GT300? Absolutely nothing. Just another in a series of BS from him meant to destroy any meaningful discussion and trash ATI.

Granted it is troubling for ATI to be losing money because we want to have two healthy competitors. We've all seen how the lack of competitors can lead to stagnant products (Creative sound cards).
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: akugami
Originally posted by: Idontcare
Actually AMD reports their graphics division revenue as well; it's in their reports, and they've lost money. There was a thread about it here, complete with the specifics, when the last earnings report cycle happened.

The problem I have is that Wreckage is constantly harping about how AMD's GPU division (ATI) is losing money. WTF does that have to do with the quality of the 5870? WTF does that have to do with the 5870 vs GT300? Absolutely nothing. Just another in a series of BS from him meant to destroy any meaningful discussion and trash ATI.

Granted it is troubling for ATI to be losing money because we want to have two healthy competitors. We've all seen how the lack of competitors can lead to stagnant products (Creative sound cards).

akugami, thanks for explaining the backstory there, I fully admit I wasn't really paying much attention as to what exactly I was stepping into by interjecting my comments into that ongoing thought stream...I understand now that you guys are arguing cause-and-effect (or lack thereof) between financial health and product strategy, etc.

I merely wanted to make sure people knew they do have access to the graphics division's financial health info (no guessing needed), in case the specifics of that info made the rebuttals any more potent or succinct.

I certainly did not mean to detract from the merits of either side of the debate there...if that turns out to be the net result of my post then I do apologize for effectively thread-crapping there, regardless of my motive/intent for posting.
 

biostud

Lifer
Feb 27, 2003
19,672
6,760
136
It really depends on the price point of the GT300. If the GT300 is priced around $500, it has to compete with 5850 CF or a 5850 X2 card. I doubt it will be able to beat a 5850 X2, and if it can't, then it has to cost around $450 to compete; if it can beat a 5850 X2, then $500 is a good price point.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: akugami
Originally posted by: Idontcare
Actually AMD reports their graphics division revenue as well; it's in their reports, and they've lost money. There was a thread about it here, complete with the specifics, when the last earnings report cycle happened.

The problem I have is that Wreckage is constantly harping about how AMD's GPU division (ATI) is losing money. WTF does that have to do with the quality of the 5870? WTF does that have to do with the 5870 vs GT300? Absolutely nothing. Just another in a series of BS from him meant to destroy any meaningful discussion and trash ATI.

Granted it is troubling for ATI to be losing money because we want to have two healthy competitors. We've all seen how the lack of competitors can lead to stagnant products (Creative sound cards).

Intelligent readers on the forum already know that ATI released a quality product in the HD5870/HD5850; it does not matter whether they are nV or ATI fans. But there will always be a small minority, probably employees or focus group members, who will always try their best to trash the opposing company's rep and/or stellar products. In this case, ATI's HD5xxx series.

It seems very likely that ATI will make a profit for Q4 2009 because nV won't have any competing products for some months to come, and most new systems will be sold with Win7 + HD5xxx series cards.

If we look at price/performance right now, would someone pay $500 for a GTX295, or $380 for the HD5870, which performs similarly but has DX11, lower power consumption/heat, and no micro-stutter? The same applies to the HD5850, which offers an even better price/performance ratio. Would anyone buy a GTX285 for $330, or a faster HD5850 for $270? I think the answer is pretty obvious.
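
A rough sketch of that price/performance comparison in Python. The prices are the ones quoted above; the relative performance indices are assumptions for illustration (GTX295 and HD5870 treated as roughly equal, HD5850 somewhat behind the HD5870 but ahead of the GTX285), not benchmark results:

# Price in dollars and a hypothetical relative performance index per card
cards = {
    "GTX 295": (500, 100),
    "HD 5870": (380, 100),
    "GTX 285": (330, 80),
    "HD 5850": (270, 90),
}
for name, (price, perf) in cards.items():
    print("{}: ${:.2f} per performance point".format(name, price / perf))

On those assumptions, the ATI cards come out clearly ahead in dollars per unit of performance.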
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
From my understanding, Fermi was actually demoed. It was only used for the astrophysics demo, while the rest was run with GT200 hardware.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Kuzi
Intelligent readers on the forum already know that ATI released a quality product in the HD5870/HD5850; it does not matter whether they are nV or ATI fans. But there will always be a small minority, probably employees or focus group members, who will always try their best to trash the opposing company's rep and/or stellar products. In this case, ATI's HD5xxx series.

It seems very likely that ATI will make a profit for Q4 2009 because nV won't have any competing products for some months to come, and most new systems will be sold with Win7 + HD5xxx series cards.

If we look at price/performance right now, would someone pay $500 for a GTX295, or $380 for the HD5870, which performs similarly but has DX11, lower power consumption/heat, and no micro-stutter? The same applies to the HD5850, which offers an even better price/performance ratio. Would anyone buy a GTX285 for $330, or a faster HD5850 for $270? I think the answer is pretty obvious.

The HD 5870 currently matches the HD 4870 X2 about 80% of the time, which is an accomplishment; sometimes it's faster, sometimes slower. While it isn't the biggest jump in performance compared to the HD 3870 to HD 4870 transition, it's significant enough, and it adds great features like UVD2, DX11, GPGPU/DirectCompute performance, and support for Blu-ray audio standards. Overall it's the fastest single-GPU card in the world with the most advanced feature set, wrapped in a thermal envelope that is only slightly higher than the HD 4890's at full load and can match the HD 3870's at idle. A sane person will not buy a GTX 285 or a GTX 295 at this point.

Even though I'm tempted to buy an HD 4870 to do Crossfire and match or slightly outperform the HD 5870, the horrible power consumption, heat dissipation, and typical scaling issues that plague any multi-GPU solution make me think twice about it. Is it better to spend $150 on an HD 4870 and get great performance along with huge power bills, heat, and some scaling issues, or to get an HD 5870 for almost $400, with less power consumption, better features, and no multi-GPU issues? Interesting...
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
If we look at price/performance right now, would someone pay $500 for a GTX295, or $380 for the HD5870, which performs similarly but has DX11, lower power consumption/heat, and no micro-stutter? The same applies to the HD5850, which offers an even better price/performance ratio. Would anyone buy a GTX285 for $330, or a faster HD5850 for $270? I think the answer is pretty obvious.

I'm sure some posters will come out with the usual reasons why someone should buy nvidia's current generation cards instead.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: evolucion8
Even though I'm tempted to buy an HD 4870 to do Crossfire and match or slightly outperform the HD 5870, the horrible power consumption, heat dissipation, and typical scaling issues that plague any multi-GPU solution make me think twice about it. Is it better to spend $150 on an HD 4870 and get great performance along with huge power bills, heat, and some scaling issues, or to get an HD 5870 for almost $400, with less power consumption, better features, and no multi-GPU issues? Interesting...

In your case I would say your best bet is to sell the HD4870 and buy an HD5850. An HD5850 is only a bit slower than the HD5870, and you are only using a 19" monitor, so the HD5850 will be more than enough for your needs.

Or you could wait till early next year when Fermi gets released and decide then; most likely the prices of these cards will be lower, and nV might have competing cards around this price range.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Originally posted by: evolucion8
Even though I'm tempted to buy an HD 4870 to do Crossfire and match or slightly outperform the HD 5870, the horrible power consumption, heat dissipation, and typical scaling issues that plague any multi-GPU solution make me think twice about it. Is it better to spend $150 on an HD 4870 and get great performance along with huge power bills, heat, and some scaling issues, or to get an HD 5870 for almost $400, with less power consumption, better features, and no multi-GPU issues? Interesting...

Gotta ask - what do you want to play that doesn't work fine on the 4870?

You are gaming at 1280 x 1024 if the monitor in your sig is correct; even Crysis is gonna run near maxed at that res on a 4870. Sure, if you were to buy a 25*16 display, then a 58xx or Crossfire would become important, but at 12*10 even a 4870 is overkill.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: Idontcare
akugami, thanks for explaining the backstory there, I fully admit I wasn't really paying much attention as to what exactly I was stepping into by interjecting my comments into that ongoing thought stream...I understand now that you guys are arguing cause-and-effect (or lack thereof) between financial health and product strategy, etc.

I merely wanted to make sure people knew they do have access to the graphics division's financial health info (no guessing needed), in case the specifics of that info made the rebuttals any more potent or succinct.

I certainly did not mean to detract from the merits of either side of the debate there...if that turns out to be the net result of my post then I do apologize for effectively thread-crapping there, regardless of my motive/intent for posting.

Hey, I have no problem with your post. You were just making sure that the correct information was flowing. You were trying to be helpful. Even if I disagreed with what you were saying, and I don't, I would have no problem with it, since you are genuinely trying to add to the conversation when you post here, regardless of the thread.

But from my point of view, and this was happening with the Radeon 4xx0 series launch as well, Wreckage keeps mentioning the financial health of AMD/ATI in a discussion about the pros and cons of one architecture vs another architecture.

Don't get me wrong, finances do impact products. If a company is not doing well and scales back R&D money, it can affect products, perhaps by delaying time to market or by producing a less aggressive or forward-thinking design. We've seen it with AMD in designing their new CPUs, which are either late or not as competitive as Intel's CPUs, which receive massive R&D funding.

AMD's GPU division didn't fare too badly with the 4xx0 series. Not as well as they would perhaps like, but the financials were certainly better than in the quarters leading up to the 4xx0 series. But again, what does finance have to do with the Radeon 5870 versus the theoretical and rumored specs of the GT300? The Radeon 5870 is set in stone. It is what it is. I believe the same of the GT300, even though it isn't out yet.

Bottom line is that the 5870 performs roughly in line with a doubling of the 4870. That it doesn't perform as well as a 4870 X2 or two 4870s suggests that perhaps the drivers are not mature and there is a little extra performance still to be squeezed out of it.
 