G80 Stuff


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: josh6079
One thing is a sigh of relief: I don't expect we'll see any more 7***## names now that G80 is so close, and those have been beaten like a dead horse.

Finally, enough with those NV4x/G7x cores. No more shimmering. No more this or that.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: jiffylube1024
Originally posted by: Matt2
I don't think we'll see a 512-bit memory bus AND GDDR4.

The added complexity of a memory bus 512 bits wide plus expensive GDDR4... we're talking a huge manufacturing cost that will be ever so graciously passed down to us.

I don't think GDDR4 is that much more money, especially compared to high end GDDR3 (also expensive). Especially since ATI's already done it on the X1950XTX. Plus, 1GHz GDDR4 is the slowest speed grade there is for GDDR4, making it also the 'cheapest'.

I think Nvidia is sticking with GDDR3 for the time being because of the quantities of G80 they want to ship; just like going on 90nm, they're picking the safest parts to get in quantity.

The 512-bit ring bus just allows ATI to move information internally using the 512-bit bus. It has nothing to do with the external memory bus from core to memory.

It can handle 512 bits internally already, and address up to 512 bits externally (unless I'm gravely mistaken). It's meant to be scalable for future generations, which is why ATI invested so much R&D in the ring bus.

At any rate, "I don't think ATI will do it, it will be too complex and cost too much" is the exact attitude everyone took to the 256-bit memory bus ATI had on R300, and I wouldn't be too surprised if history repeats itself on R600.

But I will grant that 'only' 384 bits is still a possibility for R600.

Anything's possible at this point.

I still don't think a 512-bit bus is going to be a reality.

If ATI does pull off a 512-bit bus and 1GHz GDDR4, and assuming the core can feed the memory enough work to make full use of that bandwidth...

It won't even be close between R600 and G80.
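For scale, a quick back-of-the-envelope on that hypothetical 512-bit, 1GHz GDDR4 setup (a sketch only -- it assumes GDDR4 keeps the usual two transfers per clock, as on the X1950XTX):

```python
# Peak memory bandwidth = memory clock * 2 transfers/clock * bus width in bytes.
# Purely illustrative -- nothing about R600's bus width is confirmed.
def bandwidth_gb_s(mem_clock_mhz, bus_bits):
    return mem_clock_mhz * 2 * (bus_bits / 8) / 1000

print(bandwidth_gb_s(1000, 512))  # 128.0 GB/s -- the rumored R600 config
print(bandwidth_gb_s(1000, 256))  # 64.0 GB/s -- today's X1950XTX, for comparison
```

That would be double the bandwidth of the fastest card shipping today, which is exactly why it sounds too good to be true.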
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
Originally posted by: josh6079
One thing is a sigh of relief: I don't expect we'll see any more 7***## names now that G80 is so close, and those have been beaten like a dead horse.

Finally, enough with those NV4x/G7x cores. No more shimmering. No more this or that.

And more importantly, no more recurring 7k vs. X1k series shimmering threads that everyone already knows how to deal with and that just cause flames...

Instead, we'll be able to b!tch about the newest problems and burn those into the ground probably over the course of 2+ years....
 

NanoStuff

Banned
Mar 23, 2006
2,981
1
0
$650 for a video card is hard to find acceptable, and that's before you consider power consumption.

The GTS, however, just might be within the limits of sanity.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Originally posted by: Gstanfor
LOL! History puts nvidia in a bad light, does it? I'll happily compare ATi & nvidia's histories anytime you like, apoppin.
Maybe you should stop ranting about ATI's sordid marketing ploys until NVIDIA has stopped having AEG and people like Rollo batting for them. Until then, your indignation about ATI's dirty marketing sounds a tad hollow.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
I hope G80 kills everything else out on the market

I hope it allows me to play COH and Crysis at 1600x1200 without paying 400-600 dollars for just a 5% difference on top of ATI's best offering. Looks good so far on paper. But like apoppin said - hype has gotten the better of us lately. And that's not limited to just video cards.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: jiffylube1024
I don't think GDDR4 is that much more money, especially compared to high end GDDR3 (also expensive). Especially since ATI's already done it on the X1950XTX. Plus, 1GHz GDDR4 is the slowest speed grade there is for GDDR4, making it also the 'cheapest'.

GDDR4 is vastly more expensive than GDDR3. There's a reason nVidia is not using GDDR4 on the new 8800 series: cost and availability. (And something for a refresh.) You are correct that 1.0GHz is the "slow" GDDR4 speed grade (Samsung apparently has sampled 1.6GHz modules), but those are nowhere near mass production.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: josh6079
I hope G80 kills everything else out on the market at the time of its release and seriously contends with R600. Since anything beyond R600 is in a haze due to AMD, and with rumors of Intel buying out Nvidia, I don't think anyone will be able to predict the outcome of cards after the R600/G80 era. These two GPUs may be the last serious competitors in their class for a while, and I hope that they compete well enough to drive each other's prices down so that we can benefit.

Originally posted by: nitromullet
I hope that G80 kills everything in November and then R600 tops G80 in February... As long as AMD is willing and able to play, NVIDIA will be there and we'll keep getting great cards at better prices. The worst thing that could happen this coming year would be for R600 to be a flop and AMD deciding that they could utilize ATI's engineers more effectively elsewhere. I have pretty high hopes for AMD/ATI in the long run, and I wouldn't want to see them get discouraged early on.


Beyond G80/R600, maybe a generation or two, I have a strong feeling that:

AMD/ATI will introduce unified CPU/GPU designs.
Nvidia will introduce their own CPU/GPU designs.
And Intel, after acquiring 3% of Imagination (PowerVR), will introduce CPU/GPU designs too.

I honestly don't think Intel will buy nvidia for a few reasons:

It would cost Intel about 12 to 13 billion at a $38-per-share offer that would be acceptable to shareholders of nvidia stock.

Intel is currently shelling out 9+ billion dollars to complete 3 new 45nm fabs.

Thousands of Intel employees were laid off worldwide over the past few months. (Maybe part of Otellini's house-cleaning campaign.)

The word is, Intel and Nvidia are like oil and water when it comes to getting along with each other. They don't mix well.

=============

About G80:

Going strictly by these specs, even the 8800GTS looks to be a monster, doesn't it?
We shall see. 96 unified shaders and a 320-bit memory bus is far more than I expected their UBER high-end part to be. I just really hope they have focused more on efficiency this gen while not disregarding the merits of brute force. I'd like to see smaller performance hits when AA/AF is applied.

:::::breathes::::::
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Pabster
Originally posted by: jiffylube1024
I don't think GDDR4 is that much more money, especially compared to high end GDDR3 (also expensive). Especially since ATI's already done it on the X1950XTX. Plus, 1GHz GDDR4 is the slowest speed grade there is for GDDR4, making it also the 'cheapest'.

GDDR4 is vastly more expensive than GDDR3. There's a reason nVidia is not using GDDR4 on the new 8800 series: cost and availability. (And something for a refresh.) You are correct that 1.0GHz is the "slow" GDDR4 speed grade (Samsung apparently has sampled 1.6GHz modules), but those are nowhere near mass production.


Yeah, I think they wish to avoid another 7800GTX 512 memory conundrum. They probably want to do a hard launch like the 7800GTX/GT. And with the wider memory bus, they may not need faster GDDR4 memory to attain sufficient bandwidth. Good to know that G80 does support GDDR4, so they can switch over when it becomes as widely available and cost-competitive as GDDR3.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2

It won't even be close between R600 and G80.

so let's assume the WORST for ATi . . . they just saw the G80 specs and decided to 'redo' r600 [like with the 'inferior' x1800 series . . . we really don't know what 'happened' there]

let's say they are really late -- 6 months . . . mid-'07 with an 'inferior' r600

is ATi out of high-end gfx? will AMD 'disassemble' them to use elsewhere?

:Q

and more importantly - is this what you nvidia guys want?
:shocked:



 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Pantalaimon
Originally posted by: Gstanfor
LOL! History puts nvidia in a bad light, does it? I'll happily compare ATi & nvidia's histories anytime you like, apoppin.
Maybe you should stop ranting about ATI's sordid marketing ploys until NVIDIA has stopped having AEG and people like Rollo batting for them. Until then, your indignation about ATI's dirty marketing sounds a tad hollow.

Since when has AEG done anything "sordid" (outside of Rollo's posts - and I'm not even sure you can blame AEG for that - he would have posted that way regardless - he has nvidia contacts other than AEG as well)? Speaking of Rollo, when was the last time you heard boo out of him? You are just as bad as apoppin, desperately digging up the past to try to make it somehow seem relevant to the present.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
'bad as apoppin'?


he can take that as a serious compliment

there is no one here as bad as Gstanfor
:shocked:



you will never understand.

IF i had a PCIe rig AND if the g80 meets expectations, i'd rip my AGP ATI piece of crap outta my rig and replace it in a heartbeat with a nvidia solution . . . i doubt i'd even bother to wait for r600 speculative previews.

why wait when you can have the fastest performing GPU?

and finally IF the r600 turned out to be everything and a bowl of cherries, i'd upgrade again.

i think everyone else 'gets it'. . . .and the 'why' of what i do.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: apoppin
'bad as apoppin'?


he can take that as a serious compliment

there is no one here as bad as Gstanfor
:shocked:



you will never understand.

IF i had a PCIe rig AND if the g80 meets expectations, i'd rip my AGP ATI piece of crap outta my rig and replace it in a heartbeat with a nvidia solution . . . i doubt i'd even bother to wait for r600 speculative previews.

why wait when you can have the fastest performing GPU?

and finally IF the r600 turned out to be everything and a bowl of cherries, i'd upgrade again.

i think everyone else 'gets it'. . . .and the 'why' of what i do.

Judging by your posts in this thread alone, you are just as bad as Gstanfor, but you are too blind to notice that. If you two spent more time talking about the topic, there'd be far fewer flames going around Video.

Gstanfor = nvidiot (who cares)
apoppin = fanatic (who cares)

Just grow up a bit and talk about the topic instead of bitching back and forth like little kids.

OT stuff below this...

I'm wondering if the 8800GTS will maybe allow for SLI on a ~600W PSU. I certainly hope it does. If it can, then I'd go for that over the GTX to give me more money to buy a decent LCD to replace my 22" CRT.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: DeathReborn
Originally posted by: apoppin
'bad as apoppin'?


he can take that as a serious compliment

there is no one here as bad as Gstanfor
:shocked:



you will never understand.

IF i had a PCIe rig AND if the g80 meets expectations, i'd rip my AGP ATI piece of crap outta my rig and replace it in a heartbeat with a nvidia solution . . . i doubt i'd even bother to wait for r600 speculative previews.

why wait when you can have the fastest performing GPU?

and finally IF the r600 turned out to be everything and a bowl of cherries, i'd upgrade again.

i think everyone else 'gets it'. . . .and the 'why' of what i do.

Judging by your posts in this thread alone, you are just as bad as Gstanfor, but you are too blind to notice that. If you two spent more time talking about the topic, there'd be far fewer flames going around Video.

Gstanfor = nvidiot (who cares)
apoppin = fanatic (who cares)

Just grow up a bit and talk about the topic instead of bitching back and forth like little kids.

OT stuff below this...

I'm wondering if the 8800GTS will maybe allow for SLI on a ~600W PSU. I certainly hope it does. If it can, then I'd go for that over the GTX to give me more money to buy a decent LCD to replace my 22" CRT.

personally . . . looking at your posts . . . i don't think you should be 'judging' another member at all.

i thought the SLI'd GTXes were supposed to require 800W . . .

edit: yep SLI mode will likely carry a power supply "recommendation" of 800W.
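For what it's worth, a rough power budget (a sketch only -- every wattage below is an assumption, since board TDPs aren't public yet):

```python
# Hypothetical SLI power budget. ALL wattages are assumptions for
# illustration -- actual 8800-series power draw has not been published.
def psu_headroom_w(psu_w, watts_per_card, n_cards, rest_of_system_w=200):
    """Watts left over after the GPUs plus ~200W assumed for CPU,
    motherboard, drives, and fans."""
    return psu_w - (n_cards * watts_per_card + rest_of_system_w)

print(psu_headroom_w(600, 150, 2))  # 100 -- GTS SLI on 600W, if ~150W per card
print(psu_headroom_w(800, 180, 2))  # 240 -- GTX SLI on 800W, if ~180W per card
```

On those guesses, GTS SLI on a good 600W unit looks tight but plausible, while an 800W GTX-SLI recommendation would leave comfortable headroom.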
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Cookie Monster
'Performance and Enthusiast Graphics Cards Drive Market Revenues'

"ATI and Nvidia continued to dominate the desktop GPU market in the second quarter, but with a shift in relative positions. Nvidia has led ATI in the discrete desktop segment for four consecutive quarters. However, with the launch of its flagship Radeon 1900 series and improvements in segments, ATI was able to regain lost segment share in Q2'06. The delays ATI experienced in bringing its latest generation GPU technology to market also assisted Nvidia in growing share during the period," said Lisa Epstein, a senior analyst at Jon Peddie Research.

Note that most of the revenue comes from the high end/performance/enthusiast segment. It's 74% of the total revenue.

Not to mention high-end products have been creeping up in price generation to generation. A 9700 Pro was released around $399. The 7800GTX was around $599. nVIDIA and ATi (well, not any more) have always fought for the mid/high end because they cannot sustain a fight with Intel in the IGP/low-end sector (where millions of units need to be sold, unlike the high end, where only tens of thousands need to be sold to match that revenue). Similar to AMD resorting to quality over quantity.


Very interesting! Thanks for the link. :beer: This at least seems to support my theory that nvidia is leaking high end details, as the x1900 series is kicking their butt. Or maybe I read that wrong. Am very much looking forward to this launch. Hope we don't need to wait for dx10 games to see these new features in real action.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Gstanfor
Originally posted by: Pantalaimon
Originally posted by: Gstanfor
LOL! History puts nvidia in a bad light, does it? I'll happily compare ATi & nvidia's histories anytime you like, apoppin.
Maybe you should stop ranting about ATI's sordid marketing ploys until NVIDIA has stopped having AEG and people like Rollo batting for them. Until then, your indignation about ATI's dirty marketing sounds a tad hollow.

Since when has AEG done anything "sordid" (outside of Rollo's posts - and I'm not even sure you can blame AEG for that - he would have posted that way regardless - he has nvidia contacts other than AEG as well)? Speaking of Rollo, when was the last time you heard boo out of him? You are just as bad as apoppin, desperately digging up the past to try to make it somehow seem relevant to the present.


1.) I don't see how you can distinguish between AEG and Rollo. If an employee of a company is a complete ass to me, then I as a customer will view that company as an ass. Unless, of course, it is your favorite company, in which case you'll take anything they do and turn it into a march toward sainthood.

2.) Why not bring up the past? What?....has history never repeated itself? I'm not meaning to bag on the G80 - it looks amazing - but a person would be a bit foolish to totally forget history. That being said, I do think we need to wait until G80 is out before we can compare it to any situation in the past. One thing we do know from past video card launches is that they sometimes don't deliver on the hype, and this happens from both camps. I mean, think about it: when has the INQ ever underestimated a card? Or any site for that matter?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
the only launch that was underestimated - imo - was r300

no one knew what to expect and it caught everyone by surprise in that it generally surpassed expectations . . .

nvidia has a much more "polished" and mature - and aggressive - marketing department

finally, i expect and hope g80 to be a 'monster' . . . that'd be good . . .

but

and there is one of those . . .

i also suggest 'caution' in getting caught up in hype

we'll know soon enough
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: apoppin

i also suggest 'caution' in getting caught up in hype

Heretic! That's like saying you can enjoy life with a midrange card.
 

East17

Junior Member
Apr 24, 2006
23
0
0
Hello everybody,

It is quite simple: 1350MHz for the shaders, because the shading unit will run at a higher speed than the rest of the GPU.

It could be like the P4 NetBurst architecture, where the ALUs worked at twice the speed of the rest of the CPU.

BUT... since there are 2 buses for the memory... there seems to be one bus for the shading memory and another for the classic GPU.

What I'm trying to say is that the memory is separate for the GPU and for the shading unit.

Maybe the shading unit is not even a part of the GPU... it's separate on the PCB... it would be simpler. A 128-bit bus for the SHADING unit and a 256-bit bus for the GPU.

So the memory would be separated also: 512MB for the GPU and 256MB for the shader processor.

What do you think?
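For what it's worth, there's a simpler reading of the odd bus and memory figures that doesn't need a separate shader memory at all: a single unified bus built from ordinary 32-bit GDDR3 channels. A sketch, assuming standard 512Mbit (64MB) chips -- the chip density is an assumption, not confirmed spec:

```python
# A unified-bus reading of the rumored figures: one 32-bit channel per
# GDDR3 chip, 512Mbit (64MB) per chip. The chip density is an assumption.
CHIP_BITS = 32
CHIP_MB = 64

for name, bus_bits in [("8800GTX", 384), ("8800GTS", 320)]:
    chips = bus_bits // CHIP_BITS
    print(f"{name}: {chips} chips x {CHIP_MB}MB = {chips * CHIP_MB}MB")
# 8800GTX: 12 chips x 64MB = 768MB
# 8800GTS: 10 chips x 64MB = 640MB
```

That reproduces the rumored 768MB and 640MB exactly, with the "256+128" notation being just a way of writing an unusual total rather than two physically separate buses.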
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Originally posted by: apoppin

i also suggest 'caution' in getting caught up in hype

Heretic! That's like saying you can enjoy life with a midrange card.
or even make the preposterous 'claim' that you even enjoy gaming with it . . .

despite the awful visuals

that were somehow 'good enough' for the most 'elite' HW snob -- last year
:Q

yeah

that's me

 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
I agree with Pop...people need to take a chill pill, think of all the recent examples of The Big Green Marketing Hype Machine, and remember that G80 is NVDA's first-gen unified-shader product while ATI is polishing up its second-gen unified-shader product. ATI does have a 2-year head start, and I bet you NVDA took apart R500 before beginning work on G80.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Remember the VR-Zone specs. They mentioned TCPs. What are TCPs? Texture Command Protocol? I'm sure the T stands for Texture, but the C and P could mean control or even processor...

Now the specs are

8800GTX
80nm
~700 million transistors
575MHz core clock
900MHz memory clock
128 unified shaders clocked at 1350MHz
7 TCPs
texture fill-rate of 38.4 billion texels per second
86GB/second of memory bandwidth
384-bit (256+128)
GDDR3 768MB (512 + 256)
~10 inches long
2 PCI-e power adaptors


8800GTS
80nm
~700 million transistors
500MHz core clock
900MHz memory clock
96 unified shaders clocked at 1200MHz
6 TCPs
texture fill-rate of ??? billion texels per second
64GB/second of memory bandwidth
320-bit (256+64)
GDDR3 640MB (512 + 128)
9 inches long
1 PCI-e power adaptor


The slides said G80 has a total of 8 TCPs, and can scale up to 1.5GHz.
What I'm thinking is that when R600 hits, nVIDIA could hit them with the 8800 Ultra, which will be THE full G80.

A spec of:

80nm
~700 million transistors
~650MHz core clock
900MHz memory clock
128 unified shaders clocked at 1500MHz
8 TCPs
texture fill-rate of ??? billion texels per second
115GB/second of memory bandwidth
512-bit (256 + 256)
GDDR3 1024MB (512 + 512)
11 inches long
2 PCI-e power adaptors



Anyone have any idea how many ROPs and TMUs the G80 is going to have? And what are TCPs? Do they mean quads, but in a different sense?
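One thing we can sanity-check right now is the bandwidth arithmetic on these rumored specs, assuming GDDR3's two transfers per clock (illustrative arithmetic only, not confirmed spec):

```python
# Checking the rumored bandwidth figures: memory clock * 2 transfers/clock
# * bus width in bytes. Illustrative arithmetic, not confirmed spec.
def bandwidth_gb_s(mem_clock_mhz, bus_bits):
    return mem_clock_mhz * 2 * (bus_bits / 8) / 1000

print(bandwidth_gb_s(900, 384))  # 86.4  -- matches the GTX's "86GB/second"
print(bandwidth_gb_s(900, 512))  # 115.2 -- matches the Ultra's "115GB/second"
print(bandwidth_gb_s(900, 320))  # 72.0  -- NOT the GTS's rumored 64GB/s;
                                 #    64GB/s would imply a ~800MHz memory clock
```

So the rumored GTS numbers don't quite agree with each other -- either the 900MHz memory clock or the 64GB/s figure is off -- which is a good reminder not to treat any of these specs as final.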





 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Originally posted by: Gstanfor
Since when has AEG done anything "sordid" (outside of Rollo's posts - and I'm not even sure you can blame AEG for that - he would have posted that way regardless - he has nvidia contacts other than AEG as well)? Speaking of Rollo, when was the last time you heard boo out of him? You are just as bad as apoppin, desperately digging up the past to try to make it somehow seem relevant to the present.
See, unlike you, I'm not a fan of either ATI or NVIDIA. I really don't care one way or the other.

I just find it funny how you are always eager to point out ATI's sordid marketing but gloss over NVIDIA's quite quickly. Rollo was retained by AEG, and as part of that team he embarked on some of the most obnoxious forum behavior here. When people accused him of belonging to AEG he threatened to get people banned, and he kept lying about his involvement with AEG till the very end. He was finally banned, so that's probably why we haven't heard a peep from him lately. According to you, he is now dealing with NVIDIA directly, so nice going there, NVIDIA, rewarding him for his obnoxious behavior. NVIDIA must have really liked his style.


So, your ranting about ATI's behavior or about ATI sympathizers here is like China ranting about another country's human rights violations, real or perceived. Until NVIDIA disassociates itself from AEG, and Rollo, your holding NVIDIA up as some knight in shining armor is rather hypocritical.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Could we please stay on-topic? God, there's no use speculating until the cards actually come out. Then we'll talk about how similar it was to the NV30 or R300 launch.
 