SunnyD
Belgian Waffler
Originally posted by: Idontcare
Originally posted by: thilan29
Originally posted by: Keysplayr
Originally posted by: SunnyD
Originally posted by: OCguy
Wow...that could be an amazing chip. :Q
Amazingly HUGE and HOT and POWER HUNGRY... yeah. Oh yeah, also amazingly EXPENSIVE too.
You don't know the size, you don't know the heat dissipation, you don't know the power it will draw, you don't know the price. Thanks for crapping by.
They're pretty good guesses though. If what he said was completely unfathomable (like saying GT300 would perform worse than GT200) I could see your issue with it. But can you honestly say GT300 (and the ATI 5000 series for that matter) WON'T be larger and more power hungry than this generation's cards?
55nm -> 40nm transition involved in there too, which makes most assertions regarding power consumption and die-size a pointless debate until we have data.
All that aside, we're talking something roughly twice the "power" of a GT200 - literally, as the article reads, 50% more functional units. Last I knew, these things take transistors to make, which means 50% more transistors. Okay, I'll grant you some changes in architecture, so being generous we'll say a grand total of 33% more transistors than a GT200 (very debatable given that the article says they're moving to MIMD - more logic needed), which weighed in at what - 1.4 billion transistors? So we're moving to roughly 1.86 billion transistors. That also doesn't account for the DX11 spec calling for two new types of shaders, plus tessellation hardware as well (among other things). Even factoring in the die shrink, you still have a humongous die pushing an awful lot of power through it. So pardon me while I take a well educated dump here - it's going to be hotter, bigger, and more expensive (for Nvidia). :roll:
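For what it's worth, the back-of-envelope above can be written out. All the numbers are the thread's own assumptions (1.4 billion GT200 transistors, a generous +33% transistor budget, and an *ideal* 55nm-to-40nm linear shrink, which real processes rarely hit):

```python
# Back-of-envelope sketch of the estimate above. Figures are the
# thread's assumptions, not confirmed GT300 specs.

GT200_TRANSISTORS = 1.4e9   # GT200 transistor count cited in the post
TRANSISTOR_GROWTH = 1.33    # the post's "generous" +33% estimate

gt300_transistors = GT200_TRANSISTORS * TRANSISTOR_GROWTH
print(f"Estimated GT300 transistors: {gt300_transistors / 1e9:.2f}B")

# Ideal area scaling from a 55nm -> 40nm shrink: (40/55)^2 of the old area.
shrink_factor = (40 / 55) ** 2
relative_die_area = TRANSISTOR_GROWTH * shrink_factor
print(f"Die area relative to 55nm GT200b: {relative_die_area:.0%}")
```

Even with perfect scaling, the estimated die lands around 70% of GT200b's area - smaller, but still a very large chip by GPU standards, which is the point being argued.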