nVidia GT200 Series Thread


Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I call it hard work and a bit of luck. I am so pleased my daughter is continuing what I started. It made things easier for us. Be careful, ArchAngel. I live for that kind of talk.

Man, are you guys in for a surprise.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I still like how people say that if a 2-GPU card (4870X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.

Having a single GPU will far outweigh anything the 4870X2 does.

And I agree with everyone else: we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Ocguy31
I still like how people say that if a 2-GPU card (4870X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.

Having a single GPU will far outweigh anything the 4870X2 does.

And I agree with everyone else: we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.

:disgust:

This is what I am not happy with regarding enthusiasts today: they refuse to consider anything different from what they are used to, and they seem to hate ATI.

If the HD 4870 X2 beats the GTX 280, then it beats the GTX 280. People act as if it is some hacked-together card. R700 is not a hack thrown together because ATI cannot compete with nVidia; get that out of your heads. ATI is going dual-GPU because it can provide the best performance at a good cost to end users, and it is simply the only way to go in the future. And GTX 280 SLI is not at all a valid comparison to the HD 4870 X2. As with any card of previous generations, you compare one HD 4870 X2 to one GTX 280.

If ATI wanted to, they could have put 1600 SPs, 64 TMUs, and 32 ROPs on a huge chip with a 512-bit bus and competed with a single GPU. But they decided that this was not the way to go. Multiple small GPUs with great yields are the way to go. nVidia can go with a huge GPU that costs far too much as a result of its horrible yields, but ATI is smarter than that.

If ATI made the card I described above, with a single huge GPU, they would likely have had to use a 65nm process and run low clocks because of high TDP, just like nVidia is facing with GT200. The manufacturing cost would have been tremendous, just like GT200, and so would the cost to end users. The high end would have been $650 just like the GTX 280, with likely a slower version for $450-500.

I guarantee you the above card would not beat the hypothetical HD 4870 X2 for $499, and certainly not in price-performance.

nVidia will be going the same route as well, I can guarantee you that. They will not keep building bigger and bigger GPUs; that might have been OK when the top-of-the-line GPU was ~200mm^2, but not anymore. nVidia cannot make a GPU any bigger than GT200. In 2009, I am pretty sure you will see a multi-GPU nVidia card as well.

I'm not saying there is anything wrong with the GTX 280 or GTX 260. I'm sure they will be great cards and perform very well. But people should judge each card on performance, drivers, and pricing. If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.
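The price-performance argument above reduces to a simple frames-per-dollar ratio. A minimal sketch, using the prices discussed in the thread and entirely hypothetical frame rates (not benchmark results):

```python
# Frames-per-dollar comparison. The prices are the ones discussed above;
# the frame rates are hypothetical placeholders, not measured numbers.
def fps_per_dollar(fps: float, price: float) -> float:
    """Performance delivered per dollar of card price."""
    return fps / price

hypothetical_x2 = fps_per_dollar(fps=60.0, price=500.0)   # 0.120 fps/$
hypothetical_280 = fps_per_dollar(fps=55.0, price=650.0)  # ~0.085 fps/$

print(f"X2:  {hypothetical_x2:.3f} fps/$")
print(f"280: {hypothetical_280:.3f} fps/$")
```

With these made-up numbers, a cheaper card that is even slightly faster wins the ratio by a wide margin, which is the whole point being argued.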
 

dv8silencer

Member
May 7, 2008
142
0
0
Originally posted by: Extelleron
Originally posted by: Ocguy31
I still like how people say that if a 2-GPU card (4870X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.

Having a single GPU will far outweigh anything the 4870X2 does.

And I agree with everyone else: we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.

:disgust:

This is what I am not happy with regarding enthusiasts today: they refuse to consider anything different from what they are used to, and they seem to hate ATI.

If the HD 4870 X2 beats the GTX 280, then it beats the GTX 280. People act as if it is some hacked-together card. R700 is not a hack thrown together because ATI cannot compete with nVidia; get that out of your heads. ATI is going dual-GPU because it can provide the best performance at a good cost to end users, and it is simply the only way to go in the future. And GTX 280 SLI is not at all a valid comparison to the HD 4870 X2. As with any card of previous generations, you compare one HD 4870 X2 to one GTX 280.

If ATI wanted to, they could have put 1600 SPs, 64 TMUs, and 32 ROPs on a huge chip with a 512-bit bus and competed with a single GPU. But they decided that this was not the way to go. Multiple small GPUs with great yields are the way to go. nVidia can go with a huge GPU that costs far too much as a result of its horrible yields, but ATI is smarter than that.

If ATI made the card I described above, with a single huge GPU, they would likely have had to use a 65nm process and run low clocks because of high TDP, just like nVidia is facing with GT200. The manufacturing cost would have been tremendous, just like GT200, and so would the cost to end users. The high end would have been $650 just like the GTX 280, with likely a slower version for $450-500.

I guarantee you the above card would not beat the hypothetical HD 4870 X2 for $499, and certainly not in price-performance.

nVidia will be going the same route as well, I can guarantee you that. They will not keep building bigger and bigger GPUs; that might have been OK when the top-of-the-line GPU was ~200mm^2, but not anymore. nVidia cannot make a GPU any bigger than GT200. In 2009, I am pretty sure you will see a multi-GPU nVidia card as well.

I'm not saying there is anything wrong with the GTX 280 or GTX 260. I'm sure they will be great cards and perform very well. But people should judge each card on performance, drivers, and pricing. If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.

I agree. What performance do you get for what price? That includes the quality of drivers. Other considerations matter only if you're talking about the future (sustainability) or the technology itself (to learn?).
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: dv8silencer
Originally posted by: Extelleron
Originally posted by: Ocguy31
I still like how people say that if a 2-GPU card (4870X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.

Having a single GPU will far outweigh anything the 4870X2 does.

And I agree with everyone else: we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.

:disgust:

This is what I am not happy with regarding enthusiasts today: they refuse to consider anything different from what they are used to, and they seem to hate ATI.

If the HD 4870 X2 beats the GTX 280, then it beats the GTX 280. People act as if it is some hacked-together card. R700 is not a hack thrown together because ATI cannot compete with nVidia; get that out of your heads. ATI is going dual-GPU because it can provide the best performance at a good cost to end users, and it is simply the only way to go in the future. And GTX 280 SLI is not at all a valid comparison to the HD 4870 X2. As with any card of previous generations, you compare one HD 4870 X2 to one GTX 280.

If ATI wanted to, they could have put 1600 SPs, 64 TMUs, and 32 ROPs on a huge chip with a 512-bit bus and competed with a single GPU. But they decided that this was not the way to go. Multiple small GPUs with great yields are the way to go. nVidia can go with a huge GPU that costs far too much as a result of its horrible yields, but ATI is smarter than that.

If ATI made the card I described above, with a single huge GPU, they would likely have had to use a 65nm process and run low clocks because of high TDP, just like nVidia is facing with GT200. The manufacturing cost would have been tremendous, just like GT200, and so would the cost to end users. The high end would have been $650 just like the GTX 280, with likely a slower version for $450-500.

I guarantee you the above card would not beat the hypothetical HD 4870 X2 for $499, and certainly not in price-performance.

nVidia will be going the same route as well, I can guarantee you that. They will not keep building bigger and bigger GPUs; that might have been OK when the top-of-the-line GPU was ~200mm^2, but not anymore. nVidia cannot make a GPU any bigger than GT200. In 2009, I am pretty sure you will see a multi-GPU nVidia card as well.

I'm not saying there is anything wrong with the GTX 280 or GTX 260. I'm sure they will be great cards and perform very well. But people should judge each card on performance, drivers, and pricing. If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.

I agree. What performance do you get for what price? That includes the quality of drivers. Other considerations matter only if you're talking about the future (sustainability) or the technology itself (to learn?).

The one legitimate concern I can see with the HD 4870 X2 is if it suffers from micro-stutter like the HD 3870 X2. That is a legitimate problem and in that case it may be worth taking slightly less performance from a GTX 280 as the gameplay experience may be better. But I am really hoping 4870 X2 is a smarter implementation of multi-GPU and that AMD fixes some of the problems people had with the 3870 X2.
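The micro-stutter complaint is easy to make concrete: under alternate-frame rendering, two frame sequences with identical average fps can have very different frame pacing. A sketch with invented timestamps (all numbers hypothetical, in milliseconds):

```python
# Two hypothetical 120 ms frame sequences with the same average fps.
smooth  = [0, 20, 40, 60, 80, 100, 120]   # steady 20 ms cadence
stutter = [0, 5, 40, 45, 80, 85, 120]     # AFR delivering frames in pairs

def frame_gaps(timestamps):
    """Time between consecutive frames."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def average_fps(timestamps):
    frames = len(timestamps) - 1
    return 1000.0 * frames / (timestamps[-1] - timestamps[0])

print(average_fps(smooth), frame_gaps(smooth))    # 50 fps, uniform gaps
print(average_fps(stutter), frame_gaps(stutter))  # 50 fps, 5/35 ms gaps
```

Both sequences report 50 fps, but the second alternates 5 ms and 35 ms gaps, which is exactly what players perceive as micro-stutter.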
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.


Why should you care what is under the hood? Because other than programs that scale well, like artificial benchmarks (3DMark06), Xfire and SLI have inherent problems. It is still up to both ATI and NV to prove otherwise. They have not yet, unless I missed something.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Ocguy31
If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.


Why should you care what is under the hood? Because other than programs that scale well, like artificial benchmarks (3DMark06), Xfire and SLI have inherent problems. It is still up to both ATI and NV to prove otherwise. They have not yet, unless I missed something.

You are already factoring that into the decision by considering the performance of the card. If the 4870 X2 outperforms the GTX 280 in Crysis, but CF scaling is only 5%, why would I care? It still outperforms the 280.

Plenty of games scale well with CF/SLI. The only issue I consider valid at this point is micro-stuttering, which seems to occur in a few situations with the HD 3870 X2. If this still occurs with the 4870 X2, then it is a legitimate criticism.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
@Extelleron

You do realise that ATi/AMD CAN'T go with a single monolithic GPU design? Basically it's all related to the financial crisis that AMD has been suffering for the past several years. They can't afford to spend as much on R&D as they could have maybe 2~3 years ago. The current solution (for the graphics market) that they've adopted is a smart move, considering that it has reduced the overall cost of designing/manufacturing their GPUs. Instead of pouring money into 3 separate GPU designs (high/mid/low), they've chosen to design one mid/high-end GPU, where they can take advantage of the multi-GPU concept to compete in the high end if the need arises, while producing 1~2 lesser derivatives of the GPU family for the low/mid end. You can also tell that AMD/ATi has abandoned the idea of refreshes and is instead branding these refreshes as new generations. People seem to forget that the RV770 is simply a refresh of a refresh of R600 (where, if you follow the traditional design cycle, RV770 would have been a refresh of the slightly crippled R700 that was aimed at the mid/high-end market segment).

Plus they know that they can't take the risk of competing with nVIDIA atm. In the high-end market, you either win or lose. They lost last time (R600), and that in turn had quite an effect on the high-end market. AMD/ATi is in no position to take these gambles/risks, because one loss can really dent their hold on the GPU market (not to mention the waste of resources/money etc.).

So it's outlandish to claim that AMD/ATi is taking this multi-GPU approach because it's the "future". Well, it's not; it's because they can't afford to stick with the traditional design cycle and design route unless they want to bleed more. The same goes for CPUs and MCM. There's a reason why Nehalem is going to be a "native" quad-core design. Multi-GPUs and MCM-packaged CPUs have their place in this industry, but they are more like placeholders and short-term technologies that will be replaced by something more efficient. As an end note, I do hope AMD/ATi has been working on the "real" R700, because I don't see how much longer they can keep refreshing the R600 architecture.

Anyway, enough being OT, but IMO I don't think we will ever see a GX2-type card for GT200 either. Its power/heat envelope is probably far too large even with a die shrink to a 55nm process (unless serious changes are made to the architecture to reduce die size, transistor count, and heat/power output). However, a GT200 @ 55nm that's clocked similarly to the 9800GTX could bring some serious performance improvements (seeing as the GT200 is clocked pretty low, especially in its shader clock domain). If nVIDIA could pair it with GDDR5, that could also bring some improvements, especially at ultra-high res with AA/AF.
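The die-shrink arithmetic behind that hope is simple: in an ideal optical shrink, die area scales with the square of the linear feature size. A sketch using GT200's widely reported ~576 mm^2 die (real shrinks recover less area than the square law suggests, so treat this as an optimistic bound):

```python
# Ideal-case die area after an optical shrink; real-world shrinks
# scale less than the square-law estimate suggests.
def shrunk_area(area_mm2: float, old_nm: float, new_nm: float) -> float:
    """Area scales with the square of the linear feature size."""
    return area_mm2 * (new_nm / old_nm) ** 2

gt200_65nm = 576.0  # widely reported GT200 die size in mm^2
print(f"{shrunk_area(gt200_65nm, 65, 55):.0f} mm^2 at 55nm (ideal)")
```

Even the best case (~412 mm^2) leaves a die far larger than RV770-class chips, which is why a GX2-style GT200 looks doubtful.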
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: nitromullet
Originally posted by: mharr7
http://www.hexus.net/content/item.php?item=13670

Not sure if that has been posted yet. Picture of Gainwards box. On the bottom in fine print says it requires a 550+ watt PSU to power the GTX 280.

I like the claim, "Your games in living color". It would kinda suck if the GTX 280 only put out a monochrome signal.

they're just saying that...it's really a black and white only card
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: HOOfan 1
Originally posted by: Nemesis 1
Why do you find the bolded part interesting, Rollo? I have been receiving free ATI cards since the 800xtpe. The difference in why I receive my parts is way different from why you receive yours. You get yours to lead people around like sheep and to confuse issues like DX10.1.

I, on the other hand, get mine to test the hardware. Big difference, Rollo. Interesting, now even more, isn't it? I was only receiving 1 GPU per generation, no other hardware. Now I get almost all of it, for beta testing, not marketing.

[sarcasm]no no, you don't try to badmouth nVidia and celebrate ATI at all on these forums

and you never ever butcher the English language either.[/sarcasm]

I don't know why the moderators haven't flat told you to stay out of this thread already.

You obviously don't want to discuss the GTX 280/260 series at all. You want to discuss how much you love ATI and dislike nVidia.

:thumbsup: While they are at it, they could keep the Nvidia employees out of the ATI thread. Life would be good... :laugh:
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Cookie Monster
@Extelleron

You do realise that ATi/AMD CAN'T go with a single monolithic GPU design? Basically it's all related to the financial crisis that AMD has been suffering for the past several years. They can't afford to spend as much on R&D as they could have maybe 2~3 years ago. The current solution (for the graphics market) that they've adopted is a smart move, considering that it has reduced the overall cost of designing/manufacturing their GPUs. Instead of pouring money into 3 separate GPU designs (high/mid/low), they've chosen to design one mid/high-end GPU, where they can take advantage of the multi-GPU concept to compete in the high end if the need arises, while producing 1~2 lesser derivatives of the GPU family for the low/mid end. You can also tell that AMD/ATi has abandoned the idea of refreshes and is instead branding these refreshes as new generations. People seem to forget that the RV770 is simply a refresh of a refresh of R600 (where, if you follow the traditional design cycle, RV770 would have been a refresh of the slightly crippled R700 that was aimed at the mid/high-end market segment).

Plus they know that they can't take the risk of competing with nVIDIA atm. In the high-end market, you either win or lose. They lost last time (R600), and that in turn had quite an effect on the high-end market. AMD/ATi is in no position to take these gambles/risks, because one loss can really dent their hold on the GPU market (not to mention the waste of resources/money etc.).

So it's outlandish to claim that AMD/ATi is taking this multi-GPU approach because it's the "future". Well, it's not; it's because they can't afford to stick with the traditional design cycle and design route unless they want to bleed more. The same goes for CPUs and MCM. There's a reason why Nehalem is going to be a "native" quad-core design. Multi-GPUs and MCM-packaged CPUs have their place in this industry, but they are more like placeholders and short-term technologies that will be replaced by something more efficient. As an end note, I do hope AMD/ATi has been working on the "real" R700, because I don't see how much longer they can keep refreshing the R600 architecture.

Anyway, enough being OT, but IMO I don't think we will ever see a GX2-type card for GT200 either. Its power/heat envelope is probably far too large even with a die shrink to a 55nm process (unless serious changes are made to the architecture to reduce die size, transistor count, and heat/power output). However, a GT200 @ 55nm that's clocked similarly to the 9800GTX could bring some serious performance improvements (seeing as the GT200 is clocked pretty low, especially in its shader clock domain). If nVIDIA could pair it with GDDR5, that could also bring some improvements, especially at ultra-high res with AA/AF.

So if nVidia sucked an oversized lemon and multi-GPU works, ATI will have been right? The GTX 200 looks very good on paper, but the rumours of production problems are concerning. The 4870 is much harder to rate, as not even the correct specs are known. Maybe it doesn't exist.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: SickBeast
Originally posted by: Extelleron
nVidia needs to license SLI to Intel
NV's insane CEO has already pissed Intel off to the point that they are revoking their chipset licence for Nehalem (a future CPU). I don't think they will be giving them SLI any time soon. They seem to be aligning themselves with VIA, which could create a 3-horse race of sorts. Of the 3, AMD looks like the most balanced company. VIA has terrible CPUs and Intel has horrific graphics.

NV may be in a bit of a situation if they indeed cannot make a chipset for Nehalem. In a way I suppose it's Intel's way of forcing them to licence SLI. IMO it's fair game if NV wouldn't give it to them to begin with.

Huang isn't insane; he's a genius. He took a company from nowhere to a $20 billion+ market cap in 15 years. Now he's set his sights on the ultimate target: Intel. I'm not a blind fanboy of Intel, AMD, or nVidia, but as an impartial 3rd party you have to admire the cojones of that guy. Intel should be scared.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: ArchAngel777
Originally posted by: Nemesis 1
But I will also receive a 4850/4870/4870X2. The 4850 I will give to my daughter. The 4870 goes in the wife's gamer; I will likely have to buy 1 more 4870 to Xfire. My gamer will get the 4870X2. So I will in fact have all 4 cards, all using great hardware, so I will know what's what just like the review sites. I actually did it today; I am getting a K10. LOL, ya, I had to have it. It was a forced-down-my-throat deal. But the NV 280 is what I wanted, just to see.

I call shens...

Dude, come on, this makes perfect sense! ATI marketing always got their asses handed to them by nVidia, and AMD continually gets clobbered by Intel. This has created the perfect marketing storm, resulting in Nemesis getting free ATI hardware to "test". ATI is too stupid to realize that they need people like Rollo on their side. Whom would you rather have spouting propaganda for you, Rollo or Nemesis? Dan Quayle or Nemesis? Stalin or...never mind.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
NV does not look into the future. AMD/ATI does, and they set directions. It happened with the 580 and it's happening with multi-GPU. NV will just follow.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi chip solutions will always be inferior to single chip solutions.

Inferior in what?
If it's going to end up faster and cheaper?
200 is already inferior to 770 in number crunching.
 

Heatlesssun

Junior Member
Jan 19, 2006
11
0
66
Originally posted by: Janooo
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi chip solutions will always be inferior to single chip solutions.

Inferior in what?
If it's going to end up faster and cheaper?
200 is already inferior to 770 in number crunching.

It's simply a matter of the technology maturing. Indeed, the crux of the issue is parallel processing, which is the Holy Grail of increasing computational power these days. Not only must hardware improve, but so must software design in order to take advantage of multiple processing units. It's not an easy task, but it will improve.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Heatlesssun
Originally posted by: Janooo
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi chip solutions will always be inferior to single chip solutions.

Inferior in what?
If it's going to end up faster and cheaper?
200 is already inferior to 770 in number crunching.

It's simply a matter of the technology maturing. Indeed, the crux of the issue is parallel processing, which is the Holy Grail of increasing computational power these days. Not only must hardware improve, but so must software design in order to take advantage of multiple processing units. It's not an easy task, but it will improve.

That's the direction. It happened to CPUs; it's happening to GPUs. NV will follow. Maybe the GT200 is the last dinosaur.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi chip solutions will always be inferior to single chip solutions.

You don't need 100% scaling. Being cost- and feature-effective will do.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: BFG10K
Inferior in what?
Compatibility and general robustness.

That's not a scaling issue. Just because last gen was bad does not mean a solution will never be presented to meet your concerns. Techno-luddites are an oxymoron.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Can you define robustness?
The likelihood a game will work and perform properly in a given situation.

Compatibility? Are you under NDA? Do you know how 700 is compatible?
Is this some kind of joke? Have you not been paying attention to the multi-GPU issues of the past four years or so?

Making a global memory pool and using a different bridge isn't going to do squat for the inherent problems of AFR.

Given any single GPU, it's always preferable to simply double everything on it rather than combining two of them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
That's not a scaling issue.
Scaling is a big part of it. If the game's not scaling properly then the multi-GPU solution is broken for that game.

A single card wouldn't have that problem, as increasing its specs automatically increases the game's performance (assuming you're not CPU limited, of course).
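The scaling argument above can be reduced to a toy model: a single card with doubled specs gives roughly 2x in every game (CPU limits aside), while an AFR pair gives 1 + s, where s is that game's scaling efficiency. A sketch (a toy model, not benchmark data):

```python
# Toy model of dual-GPU speedup as a function of per-game scaling
# efficiency. A single card with doubled specs would be ~2.0x everywhere.
def dual_gpu_speedup(scaling: float) -> float:
    """scaling = 1.0 means perfect (100%) scaling; 0.0 means no benefit."""
    return 1.0 + scaling

for s in (0.0, 0.5, 0.9, 1.0):
    print(f"{s:.0%} scaling -> {dual_gpu_speedup(s):.1f}x a single GPU")
```

The model makes the asymmetry plain: the dual-GPU card only matches the doubled single GPU at perfect scaling, and any game that fails to scale falls back to 1.0x.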
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: BFG10K
Can you define robustness?
The likelihood a game will work and perform properly in a given situation.

Compatibility? Are you under NDA? Do you know how 700 is compatible?
Is this some kind of joke? Have you not been paying attention to the multi-GPU issues of the past four years or so?

Making a global memory pool and using a different bridge isn't going to do squat for the inherent problems of AFR.

Given any single GPU, it's always preferable to simply double everything on it rather than combining two of them.

And how do you know it's going to be AFR? And even if it is AFR, how do you know it will suffer from all the AFR issues? What is your biggest AFR concern?
 