G92 taped out boys


Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: chewietobbacca
Originally posted by: Rusin

Again... there has been only one piece of news about G92 that came from someone who should actually know something about it. Nvidia's vice president said that G92 is a high-end chip, and Nvidia doesn't have any reason to break the cycle of bringing out high-end chips in Q4 and mid-range chips in spring.


Actually, if you read that article carefully, he never specifically states G92 = high end. The article states that he is speaking of a high-end card, which is rumoured to be the G92 project. That doesn't mean no high end exists; it just doesn't mean G92 is automatically high end. Anyways, the Gxx numbers are all rumours, but it is known that a mid-range and a low-end card are coming out soon, and a high end will be around sooner or later; it just might not be released earlier than the G92/G98 this time around.
Actually, if you read carefully, Michael Hara says that G92 is the successor to G80, and G80 was high end last time I looked.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Rusin
Originally posted by: chewietobbacca
Originally posted by: Rusin

Again... there has been only one piece of news about G92 that came from someone who should actually know something about it. Nvidia's vice president said that G92 is a high-end chip, and Nvidia doesn't have any reason to break the cycle of bringing out high-end chips in Q4 and mid-range chips in spring.


Actually, if you read that article carefully, he never specifically states G92 = high end. The article states that he is speaking of a high-end card, which is rumoured to be the G92 project. That doesn't mean no high end exists; it just doesn't mean G92 is automatically high end. Anyways, the Gxx numbers are all rumours, but it is known that a mid-range and a low-end card are coming out soon, and a high end will be around sooner or later; it just might not be released earlier than the G92/G98 this time around.
Actually, if you read carefully, Michael Hara says that G92 is the successor to G80, and G80 was high end last time I looked.

G80 includes the GTS line, which last time I looked was Nvidia's midrange, regardless of what people say about the 8600, which is a piece of junk.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
The thing is that these G92 guesses, which a few news sites publish as real facts, indicate that G92 would be the successor to G84; G92 would purely be a midrange chip. The GeForce 8800 GTS 320MB is on the border between high end and uppermost midrange. It's basically a cut-down version of a high-end card: it has the same G80 with fewer shader units than the GTX version, and its memory amount is reduced.
The same thing happened with the GeForce 7900 GS and 6800 LE: both had a high-end chip but were cut-down versions.

So these pure guesses say that G92 ain't the successor to G80 but to G84, and this goes against what Nvidia says about G92.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Sylvanas:
We have seen that guess already.

Being an anonymous rumour, this one goes into the same category as the "GeForce 8600 Ultra" news.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
How can the G92 be a successor to the 8800GTS OR 8800GTX? Going from a 3xx+ bit bus to a 256-bit bus? Does anyone else not see the lack of progress here? It's supposed to be a successor.
 

GZDynastar

Member
Jan 29, 2003
117
0
0
Originally posted by: bryanW1995
The past year has been rough on the video card consumer, imho. The lack of decent mid-range cards has probably gotten many to hold off until the GTS 320 goes below $200 or a new card comes out. Gary Key is confident that nobody is releasing a new video card until Q1 '08, and I'm not betting against him, so that means we still have another 4 months minimum to wait...

I agree completely. Look at the trend. Back in the 4 series you could get the low-end Ti4200 for a buck and a half, and you knew it was a great card because everyone and their mom had one. Its performance was incredible for the money. Nowadays the "low-end cards" are priced well, but the performance is lacking. The gap between price and performance is getting larger.

I'm still running my 6800GT because I can't afford a new card. I wanted an 8600, etc., but they suck IMO for the price you have to pay. I also own an ATi card and I'm not too pleased with its performance. So now... I have to WAIT for prices to drop. Hopefully this new product will have a good price/performance ratio... then I'll be a customer.

My Ti4200 is still chugging away, BTW.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
Originally posted by: Rusin
Ok, we get it. You want it to be true. Run with it. Enjoy.
Well, I believe Nvidia's word over anonymous rumours... when we don't have any reason to believe that this information comes from someone who would really know anything about G92.

How's this then?:

G92 is GeForce 8700GTS - VRZone Link

Published September 6th. In their time zone

G90 will be the flagship, whenever that comes around.

Why Nvidia chose to use G9x codes remains to be seen. This technically should have been something like "G82". This thing is going to have to have at least 96 shaders onboard if it's going to replace the 8800GTS. 256-bit bus, and 900 to 1000MHz GDDR3 1ns.

The only thing I can think of is that NV scrapped their actual 9-series project and is going straight to G100. Why might they do something like this? Ask ATI.

I never expected the g9x to be a major generational leap like g7x -> g8x; I'm not sure why people expected something more. Looking at past product cycles from NV, the g9x is supposed to be a refresh of the g8x.
 

biostud

Lifer
Feb 27, 2003
19,495
6,556
136
My guess would be a midrange card, and maybe also an 8950 GX2 board based on the chip.
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: Rusin
Ok, we get it. You want it to be true. Run with it. Enjoy.
Well, I believe Nvidia's word over anonymous rumours... when we don't have any reason to believe that this information comes from someone who would really know anything about G92.

How's this then?:

G92 is GeForce 8700GTS - VRZone Link

Published September 6th. In their time zone

G90 will be the flagship, whenever that comes around.

Why Nvidia chose to use G9x codes remains to be seen. This technically should have been something like "G82". This thing is going to have to have at least 96 shaders onboard if it's going to replace the 8800GTS. 256-bit bus, and 900 to 1000MHz GDDR3 1ns.

The only thing I can think of is that NV scrapped their actual 9-series project and is going straight to G100. Why might they do something like this? Ask ATI.

I never expected the g9x to be a major generational leap like g7x -> g8x; I'm not sure why people expected something more. Looking at past product cycles from NV, the g9x is supposed to be a refresh of the g8x.


QFT. Many will soon cry into their pillows because of too high expectations.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: speckedhoncho
How can the G92 be a successor to the 8800GTS OR 8800GTX? Going from a 3xx+ bit bus to a 256-bit bus? Does anyone else not see the lack of progress here? It's supposed to be a successor.
This is the reason why these made-up specifications are funny.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: munky
I never expected the g9x to be a major generation leap like the g7x->g8x, I'm not sure why people expected something more. Looking at past product cycles from Nv, the g9x is supposed to be a refresh of the g8x.
I believe it's like GF6 -> GF7: they were based on the same basic architecture, but were still very different in performance.

We have to wait for an update from a reliable source... so far there has lately been only pure speculation written up as real news. So far G92 has been a high-end chip according to Nvidia; there have been some anonymous rumours saying it would be a midrange chip, simply because people can't picture a G92 codename on a high-end chip. There was news of a GF 8600 Ultra before, and specifications for GF8600 cards. Basically these "GF 8700" cards are in the same position the GF8600 cards were in similar speculation about 10-11 months ago.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: speckedhoncho
How can the G92 be a successor to the 8800GTS OR 8800GTX? Going from a 3xx+ bit bus to a 256-bit bus? Does anyone else not see the lack of progress here? It's supposed to be a successor.

It's because performance hardly comes from bandwidth alone. Bandwidth-limited scenarios are rare, and there are other, more efficient ways of increasing performance.

For one thing, take this G92 with the rumoured specs of:

Core clock of ~740 MHz
Mem clock of 900 MHz (1800 MHz effective)
64 SPs
16 ROPs
I'm guessing 16 TMUs (exactly half of G80; besides, the TMU count on G80 is overkill)
256-bit bus
512MB GDDR3

I'm sure it can take on the 8800 GTS.
Comparing the fillrates:

G92 - 11840 Mpixel/s
G80 GTS - 10000 Mpixel/s

G92 - 11840 Mtexel/s
G80 GTS - 12000 Mtexel/s

So the fillrates are similar, which is a good thing.
The framebuffer, 512MB vs 640MB, is pretty similar too.

Basically, on the shader side of things, assuming the shader core is clocked at 1.6GHz compared to the G80 GTS's 1.2GHz:

G92 - 204 GFLOPS (assuming everything stays the same as G80, which is probably not the case, as G92 could potentially be in a dual-MADD configuration or MADD + ADD)
G80 GTS - 230 GFLOPS

So I'm presuming shader performance will be the same if not faster. There are other things too, such as DisplayPort, DX10.1, etc.

Also, with the potential of being a single-slot design thanks to the 65nm process, I think this card could replace both 8800 GTS 320/640MB models, where margins are pretty bad (G92 won't be an NVIO + main GPU combination like G80). Power consumption would accordingly be better than G80's.

With a price tag of $249~$299, I think this card will do pretty well, unless we are completely off with our predictions and NVIDIA releases another powerhouse.
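For anyone who wants to check those figures, here's a minimal back-of-envelope sketch in Python. It assumes the rumoured G92 specs above, the 8800 GTS's known configuration (20 ROPs / 24 TMUs / 96 SPs at 500 MHz core / 1200 MHz shader clock), and the usual MADD counting of 2 flops per SP per clock; none of the G92 numbers are confirmed.

```python
# Back-of-envelope throughput estimates: rumoured G92 vs GeForce 8800 GTS (G80).
# All G92 figures are speculation taken from the spec list in the post above.

def pixel_fillrate_mpix(rops, core_mhz):
    return rops * core_mhz                                # Mpixels/s

def texel_fillrate_mtex(tmus, core_mhz):
    return tmus * core_mhz                                # Mtexels/s

def shader_gflops(sps, shader_mhz, flops_per_clock=2):    # MADD = 2 flops/clock
    return sps * shader_mhz * flops_per_clock / 1000.0

# Rumoured G92: 16 ROPs, 16 TMUs, 64 SPs, 740 MHz core, ~1.6 GHz shader clock
print(pixel_fillrate_mpix(16, 740), "Mpixel/s")   # 11840
print(texel_fillrate_mtex(16, 740), "Mtexel/s")   # 11840
print(shader_gflops(64, 1600), "GFLOPS")          # ~204.8

# GeForce 8800 GTS (G80): 20 ROPs, 24 TMUs, 96 SPs, 500 MHz core, 1.2 GHz shader
print(pixel_fillrate_mpix(20, 500), "Mpixel/s")   # 10000
print(texel_fillrate_mtex(24, 500), "Mtexel/s")   # 12000
print(shader_gflops(96, 1200), "GFLOPS")          # ~230.4
```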
 

biostud

Lifer
Feb 27, 2003
19,495
6,556
136
Originally posted by: Cookie Monster
Originally posted by: speckedhoncho
How can the G92 be a successor to the 8800GTS OR 8800GTX? Going from a 3xx+ bit bus to a 256-bit bus? Does anyone else not see the lack of progress here? It's supposed to be a successor.

It's because performance hardly comes from bandwidth alone. Bandwidth-limited scenarios are rare, and there are other, more efficient ways of increasing performance.

For one thing, take this G92 with the rumoured specs of:

Core clock of ~740 MHz
Mem clock of 900 MHz (1800 MHz effective)
64 SPs
16 ROPs
I'm guessing 16 TMUs (exactly half of G80; besides, the TMU count on G80 is overkill)
256-bit bus
512MB GDDR3

I'm sure it can take on the 8800 GTS.
Comparing the fillrates:

G92 - 11840 Mpixel/s
G80 GTS - 10000 Mpixel/s

G92 - 11840 Mtexel/s
G80 GTS - 12000 Mtexel/s

So the fillrates are similar, which is a good thing.
The framebuffer, 512MB vs 640MB, is pretty similar too.

Basically, on the shader side of things, assuming the shader core is clocked at 1.6GHz compared to the G80 GTS's 1.2GHz:

G92 - 204 GFLOPS (assuming everything stays the same as G80, which is probably not the case, as G92 could potentially be in a dual-MADD configuration or MADD + ADD)
G80 GTS - 230 GFLOPS

So I'm presuming shader performance will be the same if not faster. There are other things too, such as DisplayPort, DX10.1, etc.

Also, with the potential of being a single-slot design thanks to the 65nm process, I think this card could replace both 8800 GTS 320/640MB models, where margins are pretty bad (G92 won't be an NVIO + main GPU combination like G80). Power consumption would accordingly be better than G80's.

With a price tag of $249~$299, I think this card will do pretty well, unless we are completely off with our predictions and NVIDIA releases another powerhouse.

Although the memory bandwidth will be a bit less: 256-bit at 54.4 GB/s vs 320-bit at 64 GB/s.
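For reference, peak memory bandwidth is just (bus width / 8) times the effective data rate. A quick sketch of the arithmetic: the rumoured 900 MHz (1800 effective) clock on a 256-bit bus works out to ~57.6 GB/s, so the 54.4 GB/s figure above implies a slightly lower ~1700 MT/s effective rate.

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective MT/s / 1000.
def bandwidth_gbs(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts / 1000.0

print(bandwidth_gbs(320, 1600))   # 8800 GTS: 64.0 GB/s
print(bandwidth_gbs(256, 1800))   # rumoured G92 at 900 MHz (1800 effective): 57.6 GB/s
print(bandwidth_gbs(256, 1700))   # ~54.4 GB/s, matching the figure quoted above
```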
 

biostud

Lifer
Feb 27, 2003
19,495
6,556
136
A cheap-to-produce midrange card is where the money is, and that's what they'll get with a 65nm midrange chip.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Fillrate and memory bandwidth mean less and less these days anyway, as seen in the R6x0 generation. It is all about the shaders.

Hmmm, 65nm GPUs... I wonder if we'll ever escape the monstrous heat output that seems to sit at around 80°C every generation, no matter the process.
 

biostud

Lifer
Feb 27, 2003
19,495
6,556
136
Originally posted by: Sylvanas
Fillrate and memory bandwidth mean less and less these days anyway, as seen in the R6x0 generation. It is all about the shaders.

Hmmm, 65nm GPUs... I wonder if we'll ever escape the monstrous heat output that seems to sit at around 80°C every generation, no matter the process.

Uhm, it's not meaningless. For the R600 it seems that the bandwidth is "more than enough", but if you halved the bandwidth I would guess the AA scores would start to drop.

and some more rumors:
http://theinquirer.net/?article=42199
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
How can a 33% narrower bus not limit the shading and geometry? Don't instructions need to be loaded into the GPU memory before execution? Or are they fed directly into the processor from main memory?
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Says CookieMonster:

For one thing, take this G92 with the rumoured specs of:

Core clock of ~740 MHz
Mem clock of 900 MHz (1800 MHz effective)
64 SPs
16 ROPs
I'm guessing 16 TMUs (exactly half of G80; besides, the TMU count on G80 is overkill)
256-bit bus
512MB GDDR3

I'm sure it can take on the 8800 GTS.
Comparing the fillrates:

G92 - 11840 Mpixel/s
G80 GTS - 10000 Mpixel/s

G92 - 11840 Mtexel/s
G80 GTS - 12000 Mtexel/s

It's true that the G92's clock is higher thanks to its process, but Intel and AMD learned several years ago that increasing clock speeds only works for a little while. Intel drastically deepened their pipeline, but both eventually went to scaling out parallel execution units. And, true, FSB changes of less than 200MHz (effective) won't affect system performance a whole lot, but in the long run we'll be glad the bus's full 64-bit width gets used.

I'd have to assume that even though Nvidia & ATI will scale the shader and SP clocks to above 1GHz in a couple of years (overclockers will beat them there, of course), once the clock-speed gravy train is done, the bandwidth to the processor will be all the more important.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Rusin
Originally posted by: munky
I never expected the g9x to be a major generational leap like g7x -> g8x; I'm not sure why people expected something more. Looking at past product cycles from NV, the g9x is supposed to be a refresh of the g8x.
I believe it's like GF6 -> GF7: they were based on the same basic architecture, but were still very different in performance.

We have to wait for an update from a reliable source... so far there has lately been only pure speculation written up as real news. So far G92 has been a high-end chip according to Nvidia; there have been some anonymous rumours saying it would be a midrange chip, simply because people can't picture a G92 codename on a high-end chip. There was news of a GF 8600 Ultra before, and specifications for GF8600 cards. Basically these "GF 8700" cards are in the same position the GF8600 cards were in similar speculation about 10-11 months ago.

Not really sure why you are "raging" against this particular machine. G90 or G100 will be the next high-end GPU from Nvidia; history dictates this to be so. G92/93/94, etc., would be the low/mid range of GPUs. And since when is VR-Zone an unreliable source? Because you misinterpreted that Nvidia employee? Humans are a strange lot, aren't we?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: speckedhoncho
How can a 33% narrower bus not limit the shading and geometry? Don't instructions need to be loaded into the GPU memory before execution? Or are they fed directly into the processor from main memory?

The narrower bus will lower the performance somewhat, but not at the instruction level. All that bandwidth is needed to transfer data like textures and geometry to and from the gpu to the video memory, and to apply post-rendering effects like MSAA. However, the general trend in modern games is toward longer and more complex shader programs, and for that reason in modern video cards running modern games, gpu number-crunching power has a greater impact on performance than the memory bandwidth.
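As a very rough illustration of the MSAA point, here's a crude sketch of raw colour + depth framebuffer traffic per second. It deliberately ignores framebuffer compression, caching, overdraw and texture fetches, so real-world numbers differ substantially; it's only meant to show how multisampling multiplies the data moved through the memory bus.

```python
# Crude estimate of raw framebuffer traffic (colour + depth, written then read back),
# ignoring compression, caching, overdraw and texture fetches.
def framebuffer_gb_per_s(width, height, samples, fps,
                         bytes_color=4, bytes_depth=4, rw_passes=2):
    bytes_per_frame = width * height * samples * (bytes_color + bytes_depth) * rw_passes
    return bytes_per_frame * fps / 1e9

print(framebuffer_gb_per_s(1600, 1200, 1, 60))   # ~1.8 GB/s with no AA
print(framebuffer_gb_per_s(1600, 1200, 4, 60))   # ~7.4 GB/s with 4x MSAA
```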
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: keysplayr2003

Not really sure why you are "raging" against this particular machine. G90 or G100 will be the next high-end GPU from Nvidia; history dictates this to be so. G92/93/94, etc., would be the low/mid range of GPUs. And since when is VR-Zone an unreliable source? Because you misinterpreted that Nvidia employee? Humans are a strange lot, aren't we?
Well, there are statements by Nvidia's vice president, who should know about G92, saying it will be a high-end chip. Then there are "news" pieces that don't even say how they found out what they are claiming, and that don't give any reason to believe that, for example, VR-Zone or the Inquirer should know what G92 really is.

It wouldn't be the first time these sites published rubbish; the GeForce 8600 Ultra, for example. They had 'learned' that it would be the midrange card and that the GF8600 cards would have a 256-bit memory bus, etc. They published this as news, not as rumour.

It's hard to misinterpret what Michael Hara said when he said that G92 is the successor to G80, that the mid-range parts of the next generation come in spring 2008, and that they have no reason to break the cycle they had with GF8.

BTW, don't people find it funny that the Inquirer earlier reported that G92 will be high end, and their story started with "Nvidia says its G92 high end graphics card.."

Also VR-Zone earlier reported this way:
"NVIDIA is also going to launch their next generation PCI Express 2.0 SLI chipsets for both Intel and AMD processors together with the next generation enthusiast GPUs known as G92 in November."
-----

OK, tell me: why shouldn't we believe what Nvidia's Michael Hara said about G92?

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Nvidia G92 Card is 9-inch Long

VR-Zone learned that the G92 reference card is 9 inches long, the same length as the GeForce 8800 GTS, but it is single-slot. The PCB version is P393. The 1ns DDR3 memories are arranged in a 16Mx32x8PCS array. The card has outputs like HDTV, HDCP, DVI, but no DisplayPort yet. Expect some interesting PCB and cooler designs from the card makers at launch.
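Assuming "16Mx32x8PCS" means eight chips, each organised as 16M words x 32 bits, the arithmetic lines up with the rumoured 512MB / 256-bit configuration; a quick sanity check:

```python
# Eight memory chips, each organised as 16M addresses x 32 bits wide.
chips = 8
words_per_chip = 16 * 1024 * 1024
bits_per_word = 32

capacity_mb = chips * words_per_chip * bits_per_word / 8 / (1024 * 1024)
bus_width_bits = chips * bits_per_word

print(capacity_mb)        # 512.0 MB total
print(bus_width_bits)     # 256-bit aggregate bus, consistent with the rumoured specs
```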
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Rusin
Originally posted by: keysplayr2003

Not really sure why you are "raging" against this particular machine. G90 or G100 will be the next high-end GPU from Nvidia; history dictates this to be so. G92/93/94, etc., would be the low/mid range of GPUs. And since when is VR-Zone an unreliable source? Because you misinterpreted that Nvidia employee? Humans are a strange lot, aren't we?
Well, there are statements by Nvidia's vice president, who should know about G92, saying it will be a high-end chip. Then there are "news" pieces that don't even say how they found out what they are claiming, and that don't give any reason to believe that, for example, VR-Zone or the Inquirer should know what G92 really is.

It wouldn't be the first time these sites published rubbish; the GeForce 8600 Ultra, for example. They had 'learned' that it would be the midrange card and that the GF8600 cards would have a 256-bit memory bus, etc. They published this as news, not as rumour.

It's hard to misinterpret what Michael Hara said when he said that G92 is the successor to G80, that the mid-range parts of the next generation come in spring 2008, and that they have no reason to break the cycle they had with GF8.

BTW, don't people find it funny that the Inquirer earlier reported that G92 will be high end, and their story started with "Nvidia says its G92 high end graphics card.."

Also VR-Zone earlier reported this way:
"NVIDIA is also going to launch their next generation PCI Express 2.0 SLI chipsets for both Intel and AMD processors together with the next generation enthusiast GPUs known as G92 in November."
-----

OK, tell me: why shouldn't we believe what Nvidia's Michael Hara said about G92?

Because it's called "G92". That's why. If it were called G90, even without any hint of specs, I'd totally say it would be a high-end part.

 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Ouch... single slot too. Sounds like no high end for one of the most prolific years in PC gaming history. I would definitely have bought a 9800 if it came out this year, but next April there won't be a good reason to do so. Their sales on the high end would have been much higher releasing it this year alongside UT3, Crysis, Gears, etc. There will still be people playing next year, of course, but most who buy the games will have played them through several times by then, so there will be less incentive to buy.

I guess I can sell my GTS and upgrade to a GTX if I want a 20% speed boost. If you could slap an aftermarket cooler on the G92 and volt-mod it, though, it could be equivalent to a GTX/Ultra for only $300, which isn't too bad.
 