Nvidia reveals Specifications of GT300


cbn

Lifer
Mar 27, 2009
12,968
221
106
How much bandwidth would a full 512-processor GT300 use up? If they use 2GB of memory for the high-end chip, I think it is very safe to say it would completely smoke a 480-processor GTX 295 (which also isn't able to use the memory of both GPUs in an additive fashion).

How about an X2 version of GT300 (maybe called a GTX 395)? Would 1024 processors with 2GB per GPU exceed the bandwidth of PCI-E 2.0 (16 lanes)? Isn't PCI-E 3.0 planned for 2010?
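
To put some rough numbers on my own question, here's the sort of back-of-the-envelope math I mean (a quick Python sketch; the 512-bit bus and 4 Gbps GDDR5 data rate are just assumptions pulled from the rumors, not confirmed specs):

```python
# Rough bandwidth comparison; every figure here is rumored/assumed, not official.
gddr5_data_rate_gbps = 4.0    # assumed effective data rate per pin (Gbit/s)
bus_width_bits = 512          # assumed memory bus width for the top single-GPU GT300 card

local_bw_gb_s = gddr5_data_rate_gbps * bus_width_bits / 8   # on-card memory bandwidth
pcie2_x16_gb_s = 16 * 0.5                                   # PCIe 2.0: ~500 MB/s per lane, per direction

print(f"Rumored local memory bandwidth: {local_bw_gb_s:.0f} GB/s")   # ~256 GB/s
print(f"PCIe 2.0 x16 link:              {pcie2_x16_gb_s:.0f} GB/s per direction")
```

If those assumptions are anywhere near right, the card's local memory bandwidth is roughly 30x what a PCI-E 2.0 x16 link can move, but the slot only has to carry data between system memory and the card, not the frame-buffer traffic itself, so even an X2 card wouldn't "exceed" PCI-E 2.0 in that sense.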
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: Jacen
Originally posted by: Beanie46
And no one is mentioning the fact that the author of the article, Theo Valich, has about as much credibility as CNBC does regarding the stock market?
This. These wild speculations and rumors hardly ever pan out, especially this early before the launch period. I'll buy into the GT300 hype when I can snatch a sneak review from an OEM.

Until then, these amazing figures are going to be just that: figures, on the internet of all places.

Seriously, this idea of a 512-processor GT300 sounds pretty wild. I wonder how much of it will really turn out to be true.

(I don't know anything about how GPUs are laid out,) but just using some really basic elementary-school math, I am thinking a die shrink from 55nm to 40nm would allow them to almost double the density of the transistors. However, this GT300 part they are mentioning has more than double the number of processors. Maybe using GDDR5 would free up some additional real estate on the die?
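
To spell out that elementary-school math (a sketch that treats the shrink as ideal scaling, which real chips never quite achieve):

```python
# Ideal transistor-density gain from a 55nm -> 40nm shrink (real shrinks scale less cleanly).
old_node_nm = 55
new_node_nm = 40

density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Ideal density gain: {density_gain:.2f}x")     # ~1.89x, i.e. "almost double"

# The rumored shader-count jump is larger than that:
print(f"Rumored shader increase: {512 / 240:.2f}x")   # 240 -> 512 is ~2.13x
```

So the shrink alone gets close to, but not quite to, the rumored increase; the rest would have to come from somewhere else (a bigger die, or area freed up elsewhere on the chip).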
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Here's a die shot of the 65nm GT200 so you can see how much area is allocated to the shaders. You can see the ten coherent-looking clusters, each 24 shaders deep, taking up less than 33% of the whole area.
http://www.beyond3d.com/images...-arch/gt200die-big.jpg

Instead of ten clusters there will be 16, and instead of 24 shaders each they will have 32. You can do this without doubling the number of transistors on the entire chip, but there will still be close to 2 billion, if not more. I would be surprised if the ROP count didn't increase, even if not substantially.
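
Rough math on that, assuming the shader clusters really are about a third of GT200's area and everything else on the chip stays roughly the same size (both of which are just my assumptions from eyeballing the die shot):

```python
# How much would growing only the shader array grow the transistor budget?
# Assumes shaders are ~1/3 of GT200 and the rest (ROPs, memory controllers, etc.) is held constant.
gt200_shaders = 10 * 24          # ten clusters, 24 shaders each
gt300_shaders = 16 * 32          # rumored: 16 clusters, 32 shaders each

shader_area_fraction = 1 / 3     # assumption from the die shot above
growth = (1 - shader_area_fraction) + shader_area_fraction * (gt300_shaders / gt200_shaders)

print(f"{gt200_shaders} -> {gt300_shaders} shaders")
print(f"~{growth:.2f}x the transistor budget before any process shrink")       # ~1.38x, well short of 2x
print(f"Scaled from GT200's ~1.4B transistors: ~{1.4 * growth:.1f}B")           # lines up with "close to 2 billion"
```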
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Just learning
Originally posted by: Beanie46


It also seems they are focusing too much on the general purpose part of their GPUs, which is not always beneficial to game performance.

If a general-purpose GPU could be used as a CPU, would there be a significant saving in electrical overhead (as far as the whole system goes)?

I am also wondering what impact using GPUs as CPUs would have on software development. Would this mean new operating systems? New applications becoming possible? Or do other supporting technologies need to mature first?

At the moment there is no way to use a GPU as a CPU without an actual CPU running the system. Even if NV designed the GT300 with CPU-like capabilities, it wouldn't be x86 compatible by a long shot, so you wouldn't be running Windows on it anytime soon.

To me it looks like they're reaching further into their Tesla business, where you have a cluster of GPUs doing the heavy math, but ultimately the system is still controlled by a separate CPU.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Insomniator
It all sounds great... now how about some games that make people need one.

Crysis at 19x12+ is expensive these days... anything else?

If the only thing these new chips allow us to do is turn on AA/AF at any resolution for Crysis and a select few other games then somehow I doubt people are gonna upgrade from a GTX260 or even less (well, people here would of course, but normal ones won't)

Cryostasis... and... someone help me here...

Only Crytek is stupid enough to write code that only works on hardware that will not even exist for years.

When the hardware arrives, games will be made; until then, only games at 2560x1600 will make use of that much power.

Munky, where did the G200 depart from G80? It was a single chip with "double almost everything in one chip". That's not a departure, that's exactly what everyone asked for, and performance SHOWS that. This is why the G200 is the best SINGLE CHIP performer.
Is it the most EFFICIENT or CHEAP? Not really, but that is not exactly a "wild departure".
 

TC91

Golden Member
Jul 9, 2007
1,164
0
0
Originally posted by: taltamir
Originally posted by: Insomniator
It all sounds great... now how about some games that make people need one.

Crysis at 19x12+ is expensive these days... anything else?

If the only thing these new chips allow us to do is turn on AA/AF at any resolution for Crysis and a select few other games then somehow I doubt people are gonna upgrade from a GTX260 or even less (well, people here would of course, but normal ones won't)

Cryostasis... and... someone help me here...

Only crytek is stupid enough to write code that only works on hardware that will not even exist for years.

When the hardware arrives, games will be made, until then only 2560x1600 resolution games will make use of that much power.

Monkey, where did the G200 depart from G80? it was a single chip with "double almost everything in one chip". Thats not departure, thats exactly what everyone asked for. and performance SHOWS that. this is why the G200 is the best SINGLE CHIP performer.
Is it the most EFFICIENT or CHEAP? not really, but that is not exactly "Wild departure".

It's not stupid, it's innovation and motivation for better GPUs.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: TC91
Originally posted by: taltamir
Originally posted by: Insomniator
It all sounds great... now how about some games that make people need one.

Crysis at 19x12+ is expensive these days... anything else?

If the only thing these new chips allow us to do is turn on AA/AF at any resolution for Crysis and a select few other games then somehow I doubt people are gonna upgrade from a GTX260 or even less (well, people here would of course, but normal ones won't)

Cryostasis... and... someone help me here...

Only crytek is stupid enough to write code that only works on hardware that will not even exist for years.

When the hardware arrives, games will be made, until then only 2560x1600 resolution games will make use of that much power.

Monkey, where did the G200 depart from G80? it was a single chip with "double almost everything in one chip". Thats not departure, thats exactly what everyone asked for. and performance SHOWS that. this is why the G200 is the best SINGLE CHIP performer.
Is it the most EFFICIENT or CHEAP? not really, but that is not exactly "Wild departure".

It's not stupid, it's innovation and motivation for better gpus .

How about the proliferation of higher-resolution LCD TVs/monitors as the motivation for better GPUs?

We are already at the point where 1920x1200 is affordable. Back in 2006/2007 I couldn't find a 1920x1200 monitor for under $500.

In fact, I wonder how much longer it will take for the smaller dot pitches to show up on the market.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Interesting how they kept on saying "if only we had named it differently, people wouldn't mind"... I set my $2000 computer to the max playable settings on a game whose ONLY selling point was "will make love to your eyes", and I ended up with a somewhat choppy game with a bad plot and gameplay that LOOKED worse than my UE3 games on that system. Sure, it looked better than UE3 if I cranked up the settings, at which point it was also not playable at all.

Oh, and blaming pirates after selling 1 million copies for $14 million more than development costs, even though there weren't 1 million computers capable of properly running the beast...

And let's not forget promising to go over to making console-only games.

And there were a few other things too, I think... meh.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Are you saying that CryEngine 3 will not be worth a glance when it comes out, even if we have 2 or 5 TFLOPS to run it on? And are you really saying UE3 looks as good as CE2? I'm not going to knock the heavy hardware requirements, because the game editor is so great that a skilled modder can keep Crysis fun for many years, and I'm not going to knock their graphics engine for being ahead of its time. I used to love Giants: Citizen Kabuto, and I had to wait 4 years to max out that game.

Crytek made serious mistakes, but they have a talented design crew, just not a good sales department. I can't wait to see what they do with Shader Model 5.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I am saying that on a similar system BACK THEN, at the required quality settings, UE3 looked better than CE2... NOW I can crank up CE2 so that it looks better than UE3 while remaining playable, because my hardware can now render enough FPS in UE3 that it's bound by the monitor refresh rate, while running CE2 at 30 fps is both playable and higher quality.

But this is years AFTER the fact, and LOWER END HARDWARE also runs better on other engines than on CE2.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Are you then saying that a shader-intensive engine like CE2 could be made to run as well as UE3 (or better) on low-end hardware if only the talent existed to create it? Because if your answer is no, then I just don't understand what you're mad about. They made a great game, here's how it runs on mainstream technology. Take it or leave it.

Edit: engine. They made a great engine. The game itself was pretty weak, and I appreciate that they provided me with the tools I need to mod out the weak parts. Still excited about Shader Model 5.0.
 

sisq0kidd

Lifer
Apr 27, 2004
17,043
1
81
What is wrong with some of you? Nvidia is coming out with a new chip that looks to be a monster. Why is this bad? If you don't like it, don't buy it.

As for me, I'm excited for any new GPU, whether it be from AMD, Nvidia, or Intel (OK, not really Intel).

I won't be buying high-end chips as I cannot afford them, but I will certainly welcome any advancement of GPUs, as it will push down the prices of other cards I can afford.

Seriously, some of you need to take a deep breath and let go some of the hate.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: alyarb
are you then saying that a shader-intensive engine like CE2 could be made to run as well as UE3 (or better) on low-end hardware if only the talent existed to create it? because if your answer is no, then i just don't understand what you're mad about. they made a great game, here's how it runs with mainstream technology. take it or leave it.

edit: engine. they made a great engine. the game was super gay and i appreciate that they provided me with the tools i need to mod out all the gay. still excited about shader model 5.0.

I couldn't care less what kind of engine it is; you are just making excuses for them now. The bottom line is that with the hardware available to people at the time, the engine did not produce as good an image quality as other engines. With the hardware available TODAY it produces better image quality at playable settings.

And that's really all there is to it.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: taltamir
Originally posted by: alyarb
are you then saying that a shader-intensive engine like CE2 could be made to run as well as UE3 (or better) on low-end hardware if only the talent existed to create it? because if your answer is no, then i just don't understand what you're mad about. they made a great game, here's how it runs with mainstream technology. take it or leave it.

edit: engine. they made a great engine. the game was super gay and i appreciate that they provided me with the tools i need to mod out all the gay. still excited about shader model 5.0.

I couldn't give about what kind of engine it is, you are just making excuses for them now. The bottom line is that with the hardware available to people at the time, the engine did not produce as good a quality as other engines. with the hardware available TODAY it produces better imagine quality at playable settings.

And thats really all there is to it.

Same could be said of Vista.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: taltamir
Only crytek is stupid enough to write code that only works on hardware that will not even exist for years.

Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.

Originally posted by: munky
To me it looks like they're reaching further into their Tesla business, where you have a cluster of GPU's doing the heavy math, but ultimately the system is still controlled by a separate CPU.

And I think Munky's onto something here. G80 was the last generation that focused directly on gaming performance. G92 to a degree, and GT200 very much, have been aimed at the GPGPU market, with gaming performance improvements almost an afterthought.

I wonder if we will see a split at nVidia into two "divisions" that focus separately on gaming & GPGPU instead of trying to keep them integrated? They could probably push both sides harder if each side didn't have to be bothered with the other.

Originally posted by: Idontcare
Same could be said of Vista.

Ouch. Just ouch.
Not saying I disagree, mind you...
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.
You are right, to a point... Oblivion also didn't work too well with existing hardware, and there are other examples I could make... but Crytek is the worst offender in gaming history when it comes to coding for non-existent hardware; their hardware demands were just so far out compared to anything in the past.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: taltamir
Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.
you are right, to a point... oblivion also didn't work too well with existing hardware, there are other examples i can make... but crytek is the worse offender in gaming history when it comes to coding for non existant hardware, their hardware demands were just so far out compared to anything in the past.

Originally posted by: TC91
It's not stupid, it's innovation and motivation for better gpus .

I'd say TC91 pretty well addressed that already. And I agree - pushing the envelope on the software side forces development from the GPU designers & upgrades among the gamers. Which is a good thing. I hope game devs continue pushing like this in the future.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
http://news.ati-forum.de/index...fikationen-aufgetaucht

Here are some possible specs for HD5870 (posted originally by error8).

40nm process
1200 shaders @ 900 MHz
256-bit memory bus
GDDR5 @ 4400 MHz
2.1 TFLOPS

Sounds like the HD58xx will be built on a smaller die, like the HD48xx. While the jump in processing power isn't as dramatic as with the Nvidia GT300, at least the possibility exists that current aftermarket coolers could still be used on the HD58xx.
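
Those rumored numbers at least hang together. A quick sanity check (a Python sketch; it assumes the usual 2 FLOPs per shader per clock and that 4400 MHz is the effective GDDR5 data rate, neither of which is confirmed):

```python
# Sanity check of the rumored HD5870 figures (all inputs are the rumored specs above).
shaders = 1200
core_clock_ghz = 0.9
flops_per_shader_per_clock = 2          # assumes one multiply-add per ALU per clock

bus_width_bits = 256
gddr5_effective_mhz = 4400              # assumes this is the effective (data-rate) figure

tflops = shaders * flops_per_shader_per_clock * core_clock_ghz / 1000
bandwidth_gb_s = bus_width_bits / 8 * gddr5_effective_mhz / 1000

print(f"Peak shader throughput: ~{tflops:.2f} TFLOPS")        # ~2.16, matching the quoted 2.1
print(f"Memory bandwidth:       ~{bandwidth_gb_s:.0f} GB/s")  # ~141 GB/s
```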
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: taltamir
Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.
you are right, to a point... oblivion also didn't work too well with existing hardware, there are other examples i can make... but crytek is the worse offender in gaming history when it comes to coding for non existant hardware, their hardware demands were just so far out compared to anything in the past.

My recollection of the pre-Crysis days was that the vocal crowd pissed and moaned about the chicken-and-egg problem when it came to GPU power, with the preferred argument being that if game makers built uber-demanding games, then the GPU makers would be justified in spending the R&D to create uber-complex (and equally more expensive and power-hungry) GPUs.

Edit: deleted needless inflammatory comments, me needs my Friday afternoon beer.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: BenSkywalker
How would you say it backfired?

Given the outcome and the data we have today, it's clear that the process technology used for GT200 was a bad decision by nVIDIA. I say it backfired because this strategy of using older/mature process technology instead of newer process technology, which nVIDIA has followed for a while now, didn't quite work out for GT200.

GT200 as a GPU is quite the powerhouse, as we all know, but when you look at it in terms of performance/mm^2, it's not an optimal product for engaging in a price war or meeting large demand. How so? The GT200 chip on the 65nm process is 576mm^2, which works out to only ~92 candidate dies per wafer. Not all of those pass the verification tests and binning needed to meet the specs set out by nVIDIA, so you are left with only a handful of chips.

For us consumers this is largely irrelevant as long as it's priced according to its performance, but for nVIDIA it is highly relevant, seeing as their competition had a GPU with a size of 256mm^2, implying that more than twice as many chips as GT200 were being produced per wafer while performing at ~90% of a full-fledged GT200 (not to mention that its yields were very good, due to the ALU redundancy technique that worked successfully for AMD). Since we are speculating about GT300, I was replying to a comment (that nVIDIA would stick with 55nm for GT300), which IMO isn't a very smart idea unless all they want to do is see who can create the biggest chips.

Note: I'd like to mention that nVIDIA didn't hit everything they were shooting for. The most obvious example is the launch of GT200. Its launch time frame suggests this GPU suffered manufacturing problems, mainly because it was pushing the limits of the 65nm process technology, resulting in a ~7 month delay (G80 -> GT200 took roughly 1 year and 7 months). Looking at past launch dates, nVIDIA normally launches a new architecture ~1 year after the previous one: for example, NV40 to G70 was Q2 2004 -> Q3 2005 and G70 to G80 was Q3 2005 -> Q4 2006, compared to G80 to GT200 at Q4 2006 -> Q2 2008.
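
For anyone who wants to check the ~92 figure, the standard dies-per-wafer approximation (a rough sketch that ignores scribe lines and defect losses, so it gives a gross ballpark rather than usable yield) lands in the same neighborhood:

```python
import math

# Classic dies-per-wafer approximation: wafer area / die area, minus an edge-loss term.
# Ignores scribe lines and defects, so it's a gross ballpark, not a usable-chip count.
def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

print(f"GT200 (576 mm^2):  ~{gross_dies_per_wafer(576):.0f} dies per 300mm wafer")  # ~95, near the ~92 above
print(f"RV770 (~256 mm^2): ~{gross_dies_per_wafer(256):.0f} dies per 300mm wafer")  # ~234, i.e. more than double
```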
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Denithor
Originally posted by: taltamir
Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.
you are right, to a point... oblivion also didn't work too well with existing hardware, there are other examples i can make... but crytek is the worse offender in gaming history when it comes to coding for non existant hardware, their hardware demands were just so far out compared to anything in the past.

Originally posted by: TC91
It's not stupid, it's innovation and motivation for better gpus .

I'd say TC91 pretty well addressed that already. And I agree - pushing the envelope on the software side forces development from the GPU designers & upgrades among the gamers. Which is a good thing. I hope game devs continue pushing like this in the future.

Pushing the envelope = innovation.
Coding for future hardware WHILE NEGLECTING CURRENT HARDWARE is stupid. That was my point.

If they had made an awesome current-gen engine, then an awesome next-gen engine, then an awesome next-next-gen engine, things would be spiffy. Instead they started with next-next-gen, made it awesome, then worked to trickle it down to next-gen hardware, finally getting it to sorta kinda barely work, looking like crap, on current-gen hardware.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Idontcare
Originally posted by: taltamir
Seriously? What about Oblivion from Bethesda? In the early days - nothing - would properly play it at full resolution/detail. Even today, with the mega texture addon installed it brings many mid-tier video cards to their knees.
you are right, to a point... oblivion also didn't work too well with existing hardware, there are other examples i can make... but crytek is the worse offender in gaming history when it comes to coding for non existant hardware, their hardware demands were just so far out compared to anything in the past.

My recollection of the pre-Crysis days was that the vocal crowd pissed and moaned about the chicken-and-the-egg when it came to GPU power with the preferred argument being that if game makers built uber demanding games then the GPU makers would be justified to spend the R&D to create uber complex (and equally more expensive and power hungry) GPU's.

Edit: deleted needless inflammatory comments, me needs my Friday afternoon beer.

People don't know what they need or want; giving people what they ASK for is a recipe for disaster.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
GT200 as a GPU is quite the power house as all we know, but when you look at it in terms of performance/mm^2, its not a optimal product to engage in a price war or meet a large demand.

In spite of the GT200 never being supply limited at any point (which the 4870 was), in spite of nV reporting margins significantly higher than ATi's for their GPUs, and in spite of the fact that nV has had no issues moving their prices, do you think that is truly the case?

For us consumers, this is very irrelevant as long as its price accordingly to its performance but for nVIDIA it is highly relevant seeing as their competition had a GPU with a size of 256mm^2 implying that more than twice the number of chips compared to GT200 were being produced per wafer while performing at ~90% of a full fledged GT200 (not to mention that its yields were very good due to the ALU redundancy technique that worked successfully for AMD).

I wouldn't quite say irrelevant, as nV's design choice did offer superior performance/watt, which is somewhat relevant for consumers. Yields: all the evidence we have indicates they were rather strong for nV. Chips that failed to yield perfectly were sold as 260s; indicators point to the 192sp 260 being a bit too conservative and yields being better than expected, bringing us the 216. People seriously overestimate the costs of a larger die unless it impacts the ability to fill orders, which didn't seem to affect nV this generation at all. nV chose to price their higher-end parts in the ~$500 range, which has been the norm for a long time now. The X800 XT PE and 6800 Ultra were over $600 MSRP; it isn't like nV priced above the norm, ATi just undercut it in an obvious push to gain market share (which is a VERY valid approach, not knocking them at all).

Id like to mention that nVIDIA didn't hit everything they were shooting for. Most obviously one is the launch of GT200.

They beat ATi. The timeline you laid out shows about a one-quarter variation across four years from any of the other launches, not exactly a big difference.

We already know that nV is planning a 40nm GT2xx part prior to moving to GT3xx, which will also be 40nm.

I guess we could look at it a different way: if nV had built the GT200 on 55nm from the start as ATi did, then given the thermals we were seeing out of TSMC at the time, it would have had to be clocked around 400MHz and would have been destroyed by the 4870. It would have been NV30 all over again.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ATI, while saving on the cost of the GPU die, had to pay a lot more for GDDR5 RAM to get equal performance, while nVidia decided to use CHEAP GDDR3 RAM with a larger, more expensive GPU die. Also, the costs of the IPC differ between the two.
So a simple "die size = cost to the manufacturer" comparison is faulty.

The only reason the GT200 was overpriced at the beginning is that nVidia thought they had no competition... Imagine Intel's surprise if AMD released a $200 Phenom II chip that outperformed their $1000 i7. The result would be the same: everyone would say "at this price point it is not worth buying Intel", Intel would drop the price, and some would remember it as "it is not worth buying Intel at any price point" and jump to the conclusion that the Intel chip was more expensive initially because it was more expensive to produce, and thus that Intel must be losing money now that it's selling competitively against the $200 Phenom II, where in reality Intel is still making more money per chip AND selling more chips.
 

ibex333

Diamond Member
Mar 26, 2005
4,094
123
106
Good thing I didn't get a new video card. Thanks to the GT300, I should be able to buy a GTX 295 for less than $100 in under a year.
 