Originally posted by: Ocguy31
I still like how people say that if a 2-GPU card (4870 X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.
Having a single GPU will far outweigh anything the 4870 X2 does.
And I agree with everyone else, we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.
Originally posted by: Extelleron
Originally posted by: Ocguy31
I still like how people say that if a 2-GPU card (4870 X2) slightly beats out a single-GPU GTX 280, then somehow ATI has scored a win. It won't get anywhere near beating two GTX 280s, so who cares? People buying top of the line are less concerned about price anyway.
Having a single GPU will far outweigh anything the 4870 X2 does.
And I agree with everyone else, we don't need someone who is on the ATI teat of free hardware in here throwing out random speculation. That's what the 4xxx thread is for.
:disgust:
This is what I am not happy with regarding enthusiasts today; they refuse to consider anything different from what they are used to, and seem to hate ATI.
If the HD 4870 X2 beats the GTX 280, then it beats the GTX 280. People make it out as if it is some hacked-together card. R700 is not a hack thrown together because ATI cannot compete with nVidia. Get this out of your heads. ATI is going dual-GPU because it can provide the best performance at a good cost to end users, and it is simply the only way to go in the future. And GTX 280 SLI is not at all a valid comparison to the HD 4870 X2. Like with any card of previous generations, you compare one HD 4870 X2 to one GTX 280.
If ATI wanted to, they could have put 1600 SPs, 64 TMUs, and 32 ROPs on a huge chip with a 512-bit bus and competed with a single GPU. But they decided that this was not the way to go. Multiple small GPUs with great yields are the way to go. nVidia can go with a huge GPU that costs far too much as a result of its horrible yields. But ATI is smarter than that.
If ATI made the card I talked about above, with a single huge GPU, they would likely have had to use a 65nm process and run low clocks because of the high TDP, just like nVidia is facing with GT200. The manufacturing cost would have been tremendous, just like GT200's, and so would the cost to end users. The high end would have been $650 just like the GTX 280, with likely a slower version at $450-500.
I guarantee you the above card would not beat the hypothetical HD 4870 X2 at $499, and certainly not in price/performance.
nVidia will be going the same route as well, I can guarantee you that. They will not keep building bigger and bigger GPUs; that might have been OK when the top-of-the-line GPU was ~200mm^2, but not anymore. nVidia cannot make a GPU any bigger than GT200. In 2009, I am pretty sure you will see a multi-GPU nVidia card as well.
I'm not saying there is anything wrong with the GTX 280 or GTX 260. I'm sure they will be great cards and perform very well. But people should consider each card based on performance, drivers, and pricing. If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.
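Extelleron's yield argument can be sanity-checked with the textbook Poisson yield model, where the fraction of good dies is exp(-A*D) for die area A and defect density D. A minimal sketch, assuming rough ballpark die areas for an RV770-class and a GT200-class chip and an invented defect density; none of these figures come from this thread:

[code]
import math

def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of defect-free dies under the Poisson model: exp(-A * D)."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

D = 0.3        # assumed defect density, defects/cm^2 (illustrative only)
small = 256.0  # rough RV770-class die area, mm^2
big = 576.0    # rough GT200-class die area, mm^2

y_small, y_big = poisson_yield(small, D), poisson_yield(big, D)
print(f"~{small:.0f} mm^2 die: {y_small:.0%} yield")  # ~46%
print(f"~{big:.0f} mm^2 die: {y_big:.0%} yield")      # ~18%

# Silicon cost per *good* die scales with area/yield, so the big die costs
# far more than its 2.25x area alone would suggest.
print(f"cost per good die, big vs small: ~{(big / y_big) / (small / y_small):.1f}x")
[/code]

Under these assumptions, two small dies come out of the fab far cheaper than one big one, which is the premise behind the R700 approach.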
Originally posted by: dv8silencer
Originally posted by: Extelleron
<snip>
But people should consider each card based on performance, drivers, and pricing. If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.
I agree. What performance do you get for what price? This includes the quality of the drivers. Other considerations should only be made if you're talking about the future (sustainability) or the technology itself (to learn from it).
Originally posted by: Ocguy31
Originally posted by: Extelleron
If the 4870 X2 costs $500 and outperforms a $650 GTX 280, why would I care what is under the hood? You should always consider every card from any manufacturer.
Why should you care what is under the hood? Because outside of programs that scale well, like artificial benchmarks (3DMark06), CrossFire and SLI have inherent problems. It is still up to both ATI and NV to prove otherwise, and they have not yet, unless I missed something.
Originally posted by: nitromullet
Originally posted by: mharr7
http://www.hexus.net/content/item.php?item=13670
Not sure if that has been posted yet. It's a picture of Gainward's box; the fine print on the bottom says a 550W+ PSU is required to power the GTX 280.
I like the claim, "Your games in living color". It would kinda suck if the GTX 280 only put out a monochrome signal.
Originally posted by: HOOfan 1
Originally posted by: Nemesis 1
Why do you find the bolded part interesting, Rollo? I have been receiving free ATI cards since the X800 XT PE. The difference between why I receive my parts is way different than why you receive yours. You get yours to lead people around like sheep and to confuse issues like DX10.1.
I on the other hand get mine to test the hardware. Big difference, Rollo. Interesting now even more, isn't it? I was only receiving 1 GPU per generation, no other hardware. Now I get almost all of it, for beta testing, not marketing.
[sarcasm]No no, you don't try to badmouth nVidia and celebrate ATI at all on these forums, and you never ever butcher the English language either.[/sarcasm]
I don't know why the moderators haven't flat-out told you to stay out of this thread already.
You obviously don't want to discuss the GTX 280/260 series at all. You want to discuss how much you love ATI and dislike nVidia.
Originally posted by: Cookie Monster
@Extelleron
You do realise that ATi/AMD CAN'T go with a single monolithic GPU design? Basically it's all related to the financial trouble AMD has been suffering for the past several years. They can't afford to spend on R&D the way they could maybe 2~3 years ago. The current solution they've adopted for the graphics market is a smart move, considering that it reduces the overall cost of designing/manufacturing their GPUs. Instead of pouring money into 3 separate GPU designs (high/mid/low), they've chosen to design one mid/high-end GPU, where they can take advantage of the multi-GPU concept to compete in the high end if the need arises, while producing 1~2 lesser derivatives of the GPU family for the low/mid end. You can also tell that AMD/ATi has abandoned the idea of refreshes and is instead branding those refreshes as new generations. People seem to forget that the RV770 is simply a refresh of a refresh of R600 (where, if you follow the traditional design cycle, RV770 would have been a refresh of the slightly crippled R700 that was aimed at the mid/high-end market segment).
Plus they know that they can't take the risk of competing with nVIDIA atm. In the high-end market, you either win or lose. They lost last time (R600), and that in turn had quite an effect on the high-end market. AMD/ATi is in no position to take these gambles/risks, because one loss can really dent their hold on the GPU market (not to mention the waste of resources/money etc).
So it's outlandish to claim that AMD/ATi is taking this multi-GPU approach because it's the "future". It's not; it's because they can't afford to stick with the traditional design cycle and design route unless they want to bleed more. The same goes for CPUs and MCM; there's a reason why Nehalem is going to be a "native" quad-core design. Multi-GPU cards and MCM-packaged CPUs have their place in this industry, but they are more like placeholders, short-term technologies that will be replaced by something more efficient. As an end note, I do hope AMD/ATi has been working on the "real" R700, because I don't see how much longer they can keep refreshing the R600 architecture.
Anyway, enough being OT, but IMO I don't think we will ever see a GX2-type card for GT200 either. Its power/heat envelope is probably far too large even with a die shrink to the 55nm process (unless serious changes are made to the architecture to reduce die size, transistor count and heat/power output). However, a GT200 at 55nm that's clocked similarly to the 9800GTX could bring some serious performance improvements (seeing as the GT200 is clocked pretty low, especially in its shader clock domain). And if nVIDIA could pair it with GDDR5, that could bring further improvements, especially at ultra-high resolutions with AA/AF.
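Cookie Monster's 55nm remark is easy to quantify. On an optical shrink, linear dimensions scale by 55/65 and area by the square of that; the ~576 mm^2 starting point for GT200 at 65nm is an approximation:

[code]
# Rough optical-shrink arithmetic (starting die area is an estimate).
gt200_65nm = 576.0              # approx. GT200 die area at 65nm, mm^2
scale = (55.0 / 65.0) ** 2      # area scaling factor, ~0.716
print(f"estimated 55nm die: ~{gt200_65nm * scale:.0f} mm^2")  # ~412 mm^2
[/code]

Even post-shrink, ~412 mm^2 is a very large die, which is why a GX2-style GT200 looks doubtful.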
Originally posted by: SickBeast
Originally posted by: Extelleron
nVidia needs to license SLI to Intel
NV's insane CEO has already pissed Intel off to the point that they are revoking NV's chipset license for Nehalem (the upcoming CPU). I don't think NV will be giving them SLI any time soon. NV seems to be aligning itself with VIA, which could create a 3-horse race of sorts. Of the 3, AMD looks like the most balanced company: VIA has terrible CPUs and Intel has horrific graphics.
NV may be in a bit of a situation if they indeed cannot make a chipset for Nehalem. In a way I suppose it's Intel's way of forcing them to license SLI. IMO it's fair game if NV wouldn't give it to them to begin with.
Originally posted by: ArchAngel777
Originally posted by: Nemesis 1
But I will also receive a 4850/4870/4870 X2. The 4850 I will give to my daughter. The 4870 goes in the wife's gamer. I will likely have to buy 1 more 4870 to xfire. My gamer will get the 4870 X2. So I will in fact have all 4 cards, all using great hardware, so I will know what's what just like the review sites. I actually did it today, I am getting a K10. LOL. Yeah, I had to have it. It was a forced-down-my-throat deal. But it's the NV 280 that I wanted, just to see.
I call shens...
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi-chip solutions will always be inferior to single-chip solutions.
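Cookie Monster's claim reduces to simple arithmetic. Suppose a single small GPU delivers some fraction of the big GPU's performance (0.7 here is a made-up number) and the dual card scales with efficiency s; the dual card then delivers small * (1 + s) relative to the big GPU. A quick sketch of where the break-even sits:

[code]
# Hypothetical: one small GPU = 0.7x the big GPU's performance.
SMALL_VS_BIG = 0.7

for s in (1.0, 0.8, 0.6, 0.4, 0.0):
    rel = SMALL_VS_BIG * (1.0 + s)  # dual-card performance vs. big GPU
    verdict = "faster" if rel > 1.0 else "slower"
    print(f"scaling {s:4.0%}: dual card = {rel:.2f}x big GPU ({verdict})")

# Break-even: 0.7 * (1 + s) = 1  =>  s = 1/0.7 - 1, about 43%.
print(f"break-even scaling: {1.0 / SMALL_VS_BIG - 1.0:.0%}")
[/code]

So under these numbers the dual card loses outright below ~43% scaling; above it, the argument shifts to price and robustness rather than raw speed.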
Originally posted by: Janooo
Originally posted by: Cookie Monster
Unless you have 100% scaling, multi-chip solutions will always be inferior to single-chip solutions.
Inferior in what?
If it's going to end up faster and cheaper?
The GT200 is already inferior to the RV770 in number crunching.
Originally posted by: BFG10K
"Inferior in what?"
Compatibility and general robustness.
Originally posted by: Heatlesssun
Originally posted by: Janooo
Inferior in what?
If it's going to end up faster and cheaper?
The GT200 is already inferior to the RV770 in number crunching.
It's simply a matter of the technology maturing. Indeed, the crux of the issue is parallel processing, which is the Holy Grail of increasing computational power these days. Not only must the hardware improve, but so must software design, to take advantage of multiple processing units. It's not an easy task, but it will improve.
Originally posted by: Janooo
Originally posted by: BFG10K
Compatibility and general robustness.
Can you define robustness? Compatibility? Are you under NDA? Do you know how R700 is compatible? That's not a scaling issue.
Originally posted by: BFG10K
"Can you define robustness?"
The likelihood a game will work and perform properly in a given situation.
"Compatibility? Are you under NDA? Do you know how R700 is compatible?"
Is this some kind of joke? Have you not been paying attention to the multi-GPU issues of the past four years or so?
"That's not a scaling issue."
Scaling is a big part of it. If the game's not scaling properly, then the multi-GPU solution is broken for that game.
Making a global memory pool and using a different bridge isn't going to do squat for the inherent problems of AFR.
Given any single GPU, it's always preferable to simply double everything on it rather than combining two of them.
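One of the "inherent problems of AFR" alluded to here is frame pacing, later widely known as micro-stutter. A toy simulation with invented numbers: each GPU takes 40 ms per frame and the CPU hands alternate frames to the two GPUs every 10 ms, so frames come out in uneven short/long gaps even though the average frame rate looks healthy:

[code]
# Toy AFR frame-pacing model (all numbers invented for illustration).
RENDER_MS = 40.0  # time each GPU needs to render one frame
ISSUE_MS = 10.0   # CPU issues a new frame to the next GPU every 10 ms

gpu_free = [0.0, 0.0]  # time at which each of the two GPUs becomes idle
present = []           # presentation time of each completed frame

for frame in range(12):
    gpu = frame % 2                               # alternate-frame rendering
    start = max(frame * ISSUE_MS, gpu_free[gpu])  # wait for CPU and GPU
    gpu_free[gpu] = start + RENDER_MS
    present.append(gpu_free[gpu])

gaps = [b - a for a, b in zip(present, present[1:])]
print("frame-to-frame gaps (ms):", [round(g) for g in gaps])
# -> alternating ~10 ms and ~30 ms gaps: the average works out to ~50 fps,
#    but the long gaps make it feel closer to 33 fps.
print(f"average fps: {1000.0 * len(gaps) / (present[-1] - present[0]):.0f}")
[/code]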