Nvidia Reveals Specifications of GT300


Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: BFG10K
Originally posted by: Bateluer

The first GT300s will likely be 55nm, with a 40nm refresh coming later.
I find that highly unlikely, for the simple reason that the performance increase necessary to remain competitive would produce a die larger than the original G80's.

I think it depends on whether they bring out a 40nm part before the GT300. They got burned badly on the FX part six years ago when they went with a new process that didn't work out in the time frame they needed, on a GPU that needed clocks to compete. If they bring out a 40nm part this quarter and get the kinks out of the system, I think GT300 will be on 40nm. If not, it will show up on the older process as a bigger chip with higher yields, making its clock targets, with a plan to introduce a die-shrunk version months later.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Genx87
I wonder how much this will actually benefit gaming though? It sounds to me like they are going after the high computational market that needs more flexibility from the units?

No doubt the GT300 was developed within the backdrop of "must outperform whatever Larrabee delivers to the GPGPU world".

Even if Larrabee falls flat on its face, it would have been imprudent of NV's designers not to assume the worst-case (for NV) scenario of Larrabee's performance expectations while fleshing out the architecture and ISA involved in the GT300.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Creig
Originally posted by: Keysplayr
Bottom line and the only things that should concern you are..................

Performance, power consumption, heat, price. You should not give a rat's arse if it takes 90 billion transistors to make.
We all know that, as a rule of thumb, the more complex a GPU is, the higher its power consumption. That translates to more heat, and the more heat a GPU puts out, generally the lower the clock speed it will stably run at. A GPU with a higher transistor count is also more expensive to produce than one on the same process with fewer transistors, since you get fewer dies per wafer. And yields will typically be lower on the GPU with the higher transistor count, simply from a mathematics point of view. That ties directly into price.

So yes, we should give a rat's arse what its transistor count is, as it relates to everything you just mentioned (performance, power consumption, heat, price).
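For reference, a rough sketch of the dies-per-wafer and yield math behind Creig's argument. The wafer size, die areas, and defect density below are illustrative assumptions, not actual GT300 figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic approximation: gross dies minus an edge-loss correction term.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Fraction of dies with zero defects under a simple Poisson defect model.
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical big die vs. small die on the same process (0.2 defects/cm^2).
for area_mm2 in (576, 256):
    dies = dies_per_wafer(area_mm2)
    y = poisson_yield(area_mm2, defects_per_mm2=0.002)
    print(f"{area_mm2} mm^2: {dies} dies/wafer, {y:.0%} yield, ~{dies * y:.0f} good dies")
```

With these made-up numbers the big die nets roughly 30 good dies per wafer versus about 140 for the small one, which is exactly the cost effect being described.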

Oh really? Tell me, do transistor count and die size correlate directly to heat output and power consumption? If you think so, let me introduce you to the 55nm GT200 and RV770.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ilkhan
regarding the 40nm launch, wasn't it reported that nV had huge issues trying to shrink GT200 to 40nm and it just wasn't going to happen? That would require a 40nm launch with GT300; OR a 40nm G9x. Which would make me laugh.

Dual machines take care of the issues with power consumption quite nicely. Dedicate the power-hungry desktop to gaming and an ultraportable laptop with a docking station as the main machine. Screens are power hungry (my 24"+20" setup uses 100W), but a docking station and a KVM make it really easy to switch back and forth. It's a good setup.

It has huge issues because you can't do optical shrinks anymore; it's too small. When you make such a shrink you have to actually rework parts of the processor because they don't WORK anymore... 1nm is only about 10 hydrogen atoms in a row, and silicon and the metals used take quite a bit more space than a hydrogen atom. When you get to those sizes, shrinking means major changes in operation and requires redesigning portions. Which is exactly what they are doing in the GT300 core: they are redesigning portions of it directly for 40nm...
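To put that scale in perspective, a minimal sketch of the atom-counting arithmetic, using the silicon lattice constant (~0.543 nm) and a hydrogen atom diameter of roughly 0.1 nm:

```python
SI_LATTICE_NM = 0.543   # silicon lattice constant
H_DIAMETER_NM = 0.106   # ~2x the Bohr radius

for feature_nm in (55, 40):
    cells = feature_nm / SI_LATTICE_NM
    atoms = feature_nm / H_DIAMETER_NM
    print(f"{feature_nm} nm feature: ~{cells:.0f} silicon lattice cells wide,"
          f" ~{atoms:.0f} hydrogen atoms across")
```

A 40nm feature spans only about 74 silicon lattice cells, so there is very little room left for simple geometric scaling.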

They COULD redesign the GT200 for it, but it's as much effort to redesign a last-gen part as it is to do a next-gen part, so out of practicality they only do the next gen.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Fillrate complete? Is there even such a thing?

2560x1600 is about 4.1 million pixels. With the current fill on a GTX295 it can handle just over 22 thousand texels per pixel per second. This is why I say we are very close to fillrate complete.
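The arithmetic behind that figure, as a quick sketch. The combined texture fillrate assumed here is 2 GPUs x 80 TMUs x 576 MHz, the commonly cited GTX 295 configuration:

```python
# Back-of-the-envelope texel throughput per pixel at 2560x1600.
texel_rate = 2 * 80 * 576e6   # texels/s: 2 GPUs x 80 TMUs x 576 MHz core clock
pixels = 2560 * 1600          # ~4.1 million pixels on screen

print(f"{texel_rate / pixels:,.0f} texels per pixel per second")  # ~22,500
```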

The more fillrate the card has, the faster it performs.

A few different reasons why you may see a correlation between these points at the moment. One is that shader hardware is tied to ROPs: the more ROPs you have, the more fillrate you have, simply because that is how GPUs are currently designed. Another is the amount of multi-passing required by shaders (although that again comes back to shader hardware). As developers migrate up the DX chain, the biggest differences we have at this point are those reducing the number of passes it takes to execute a shader of a given complexity. Ideally, raw fill requirements could very well start decreasing moving forward simply by using the latest available API features.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Bateluer
Nvidia never debuts a new chip design on a new process. They've used the older, reliable process ever since their FX fiasco. The first GT300s will likely be 55nm, with a 40nm refresh coming later.
No, they very clearly stated early on that they were going to push 40nm for their next-generation high-end part. They were supposed to begin sampling GT212 (the high-end GT200 refresh) along with some mainstream parts (GT214, 216, 218) around now, but it looks like that was all pushed back due to TSMC's 40nm problems. New expectations are 40nm mainstream parts in a few months, with current rumors that GT300 has already taped out.

It makes sense though; ATI's recent short-lived successes with their parts in general and the X2 specifically have been due in large part to a process edge. Matching them on process from the outset would eliminate that edge. There are quite a few news blurbs about it as well, with increasing tension between Nvidia and TSMC over 40nm production allocation. I guess TSMC wasn't thrilled about the prospect of transitioning from "super expensive" GT200 production to really low-end parts like NAND flash, Nvidia chipsets, and Atom production.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Idontcare
Originally posted by: Genx87
I wonder how much this will actually benefit gaming though? It sounds to me like they are going after the high computational market that needs more flexibility from the units?

No doubt the GT300 was developed within the backdrop of "must outperform whatever Larrabee delivers to the GPGPU world".

Even if Larrabee falls flat on its face, it would have been imprudent of NV's designers not to assume the worst-case (for NV) scenario of Larrabee's performance expectations while fleshing out the architecture and ISA involved in the GT300.
LOL! There's credible skepticism indicating Larrabee won't even be competitive with current-generation parts from Nvidia and ATI, so I highly doubt GT300's design and performance targets were influenced by Larrabee much, if at all. GT300's launch time frame will be ~15-18 months after GT200's launch, which is in line with their previous launch schedules.

Specs look great though, and seem reasonable within process and core-size budgets. We'll have to see what core and shader clocks look like, but most of the info in there is good news, especially the scaling granularity.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
It all sounds great... now how about some games that make people need one.

Crysis at 19x12+ is expensive these days... anything else?

If the only thing these new chips let us do is turn on AA/AF at any resolution in Crysis and a select few other games, then somehow I doubt people are gonna upgrade from a GTX260 or lower (well, people here would of course, but normal ones won't).

Cryostasis... and... someone help me here...

 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: thilan29
Originally posted by: Keysplayr
Originally posted by: SunnyD
Originally posted by: OCguy
Wow...that could be an amazing chip. :Q

Amazingly HUGE and HOT and POWER HUNGRY... yeah. Oh yeah, also amazingly EXPENSIVE too.

You don't know the size, you don't know the heat dissipation, you don't know the power it will draw, you don't know the price. Thanks for crapping by.

They're pretty good guesses though. If what he said was completely unfathomable (like saying GT300 would perform worse than GT200) I could see your issue with it. But can you honestly say GT300 (and the ATI 5000 series for that matter) WON'T be larger and more power hungry than this generation's cards?

I personally don't have an issue with it though... low power draw has not been that high on my list... as long as I can have low IDLE power consumption I'm all good. It seems to be the price of progress.

Or just the price of impatience
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
and more power hungry than this generation's cards?

Look at the 4890 respin though - something like 20% faster using 20 watts less power.

If nV designs to balance power use and performance, they might keep power use no worse than it is now.
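A tiny sketch of the perf-per-watt arithmetic those figures imply; the 160 W baseline board power here is an assumption for illustration, not an official number:

```python
# "20% faster using 20 watts less power" in performance-per-watt terms.
base_power_w = 160.0                  # assumed baseline board power
new_power_w = base_power_w - 20.0
speedup = 1.20

gain = speedup / (new_power_w / base_power_w) - 1.0
print(f"~{gain:.0%} better performance per watt")   # ~37%
```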
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Insomniator
It all sounds great... now how about some games that make people need one.

Crysis at 19x12+ is expensive these days... anything else?

If the only thing these new chips let us do is turn on AA/AF at any resolution in Crysis and a select few other games, then somehow I doubt people are gonna upgrade from a GTX260 or lower (well, people here would of course, but normal ones won't).

Cryostasis... and... someone help me here...

That's part of the problem. There will be zero DirectX 11 games this year as far as I know, and probably none next year. Heck, DirectX 10 barely got used.
 

Syntax Error

Senior member
Oct 29, 2007
617
0
0
I've never bought into the DX9 vs. DX10 argument; DX10 support has always just been icing on the cake for me, not something that decided a purchase.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Great, another huge, complex and expensive GPU that performs 10% faster in games than a much smaller and cheaper GPU from a competitor.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: munky
Great, another huge, complex and expensive GPU that performs 10% faster in games than a much smaller and cheaper GPU from a competitor.

I'd be interested in where you are getting your data.


Links?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: OCguy
I'd be interested in where you are getting your data.
Links?

Hmm, let's see... MIMD, GPGPU, Dual-Precision performance, "GT200 architecture was a test for real things coming", ... yeah, all that just screams maximum gaming performance with least amount of wasted transistors. Oh wait, no it doesn't!

Originally posted by: Keysplayr
:::hands munky a roll of TP:::

I'll trade you a caffeine pill for it.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: munky
Originally posted by: OCguy
I'd be interested in where you are getting your data.
Links?

Hmm, let's see... MIMD, GPGPU, Dual-Precision performance, "GT200 architecture was a test for real things coming", ... yeah, all that just screams maximum gaming performance with least amount of wasted transistors. Oh wait, no it doesn't!

Ah, talking out your ass about relative performance when the next gen of both companies is barely taping out.

You are a special one.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: OCguy
Originally posted by: munky
Originally posted by: OCguy
I'd be interested in where you are getting your data.
Links?

Hmm, let's see... MIMD, GPGPU, Dual-Precision performance, "GT200 architecture was a test for real things coming", ... yeah, all that just screams maximum gaming performance with least amount of wasted transistors. Oh wait, no it doesn't!

Ah, talking out your ass about relative performance when the next gen of both companies is barely taping out.

You are a special one.

Time to pull your head out of the sand. Look at where the gt200 design departed from the g80, and then take it one step further.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: munky

Time to pull your head out of the sand. Look at where the gt200 design departed from the g80, and then take it one step further.

LOL

Hey, can you tell me what Sandy Bridge is going to be like while you are at it?


Nostradamus you are not.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: OCguy
Originally posted by: munky

Time to pull your head out of the sand. Look at where the gt200 design departed from the g80, and then take it one step further.

LOL

Hey, can you tell me what Sandy Bridge is going to be like while you are at it?


Nostradamus you are not.

my point ------> .

........"WHOOSH!"

your head -->:Q
 

thilanliyan

Lifer
Jun 21, 2005
12,031
2,243
126
Originally posted by: munky
my point ------> .

........"WHOOSH!"

your head -->:Q

:laugh:

OT, I really hope we get almost a doubling of performance... or at least 1.5x. And I hope ATI is first to market... they need to be out of the gate first at least once in a while.
 

Beanie46

Senior member
Feb 16, 2009
527
0
0
And no one is mentioning the fact that the author of the article, Theo Valich, has about as much credibility as CNBC does regarding the stock market?

But is it possible that this might be Nvidia's worst possible move, taking such a big risk on such a radical architecture? If it doesn't pan out, they'll be completely screwed... but then again, they could rename GT200 to cover it up.

It also seems they are focusing too much on the general purpose part of their GPUs, which is not always beneficial to game performance. And game performance will still be the most important metric by which a GPU is judged, not GPGPU performance.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: Beanie46


It also seems they are focusing too much on the general purpose part of their GPUs, which is not always beneficial to game performance.

If a general-purpose GPU could be used as a CPU, would there be significant savings in electrical overhead (as far as the whole system goes)?

I am also wondering what impact using GPUs as CPUs would have on software development. Would this mean new operating systems? Would new applications become possible? Or do other supporting technologies need to mature first?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky
Great, another huge, complex and expensive GPU that performs 10% faster in games than a much smaller and cheaper GPU from a competitor.

I'm sure the 5800 won't be that bad. Although maybe 20% slower.
 

Jacen

Member
Feb 21, 2009
177
0
0
Originally posted by: Beanie46
And no one is mentioning the fact that the author of the article, Theo Valich, has about as much credibility as CNBC does regarding the stock market?
This. These wild speculations and rumors hardly ever pan out, especially this early before launch. I'll buy into the GT300 hype when I can snatch a sneak review from an OEM.

Until then, these amazing figures are going to be just that: figures, on the internet of all places.
 