Some more R600 information


TBSN

Senior member
Nov 12, 2006
925
0
76
Originally posted by: ArchAngel777
Doubt we will see anything under $400. They will probably come out with an XT and XTX version of it first, then follow up with something around April/May in the midrange. All speculation at this point, but I would highly doubt we see any midrange parts for quite a while after the initial flagship release.

I am pretty excited to see this new card of ATi's. I was originally planning an entire system overhaul; however, if they come out with a decent mid-range card sometime before June, I will just spend $300 or so and save the rest for a future upgrade.

So the ATI R600 XT and XTX would be roughly equivalent (in terms of performance and the price difference between them) to the Nvidia GTS and GTX? I think I may do the same as you and save my money until they come out, then see which one comes out on top in terms of value, G80 or R600...
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Firstly, what the heck are MADD4 shaders? People who don't know these things shouldn't act like they do. You realise the G80 uses scalar shaders that are MADD+MUL capable. The current prospect is that R600 is going to be vec4+scalar (like Xenos), or vec4, or even vec2+vec3. Although theoretically 64 vec4 shaders clocked at ~750MHz equal 128 scalar shaders clocked at 1350MHz, each layout has its advantages and disadvantages. If it's vec4, it can do more work per cycle (4 ops compared to 1), but the utilisation of the shaders is far lower than with scalar ones. This is why nVIDIA chose scalar shaders: you are guaranteeing 100% utilisation, which the vec4 shaders can only dream of matching (and pushing the core past 1GHz is going to be mission impossible). In real-world performance terms, 100% utilisation sounds more promising than lower utilisation with more work per cycle.
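As a rough back-of-the-envelope check on that equivalence claim (an editor's sketch, not a spec: the unit counts and clocks are the rumoured figures from this post, and it counts only MADD throughput, ignoring the G80's co-issued MUL):

```python
# Peak MADD throughput of the two rumoured shader layouts.
# A MADD (multiply-add) counts as 2 FLOPs per lane per clock.

def madd_gflops(units, lanes_per_unit, clock_ghz):
    return units * lanes_per_unit * 2 * clock_ghz

r600_vec4  = madd_gflops(units=64,  lanes_per_unit=4, clock_ghz=0.75)  # 64 vec4 ALUs @ ~750MHz (rumour)
g80_scalar = madd_gflops(units=128, lanes_per_unit=1, clock_ghz=1.35)  # 128 scalar ALUs @ 1.35GHz

print(f"Rumoured R600 (vec4): {r600_vec4:.0f} GFLOPS peak MADD")   # ~384
print(f"G80 (scalar):         {g80_scalar:.0f} GFLOPS peak MADD")  # ~346
```

The two figures land in the same ballpark, but the vec4 number is only reachable if every lane does useful work every clock, which is exactly the utilisation question the rest of the post is about.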

This is why ATi is rumoured to hate the G80 so much. Although the current drivers may be a mess, the architecture itself is very impressive and the bar nVIDIA has raised is VERY high. So right now, the R600 isn't going to demolish anything in that sense. Sites that talked about ATi having 32 ROPs are just spreading BS, because a) more ROPs don't automatically translate into more performance and b) it would be a waste of transistor budget. The only upside is that it makes good marketing material for ATi's PR. Not to mention 2GB of GDDR4? Unless you want AMD to lose money by selling these rumoured cards at affordable prices, I think they've made the whole thing up. Half of the listed specs were already floating around the internet via the INQ and word of mouth.

Honestly, don't get your hopes too high, because it will get to you once the real benchmarks hit. I'm thinking it's either 5~15% faster in some 3D apps while slower in others compared to the G80. AVIVO will be there, I bet, along with the "new" CrossFire. I think the transistor count is so high (720 million rumoured) because of the ring bus design (which also causes the chip to have many more pins, e.g. the R600 die shot had 2,240 pins), its unified shader architecture using vec4+scalar shaders, and the 512-bit memory interface.

GDDR3? I guess GDDR4 isn't as available an option as ATi had hoped. Still, hearing that their 2nd revision is working is great, and running at 800MHz is pretty amazing. But weren't they aiming for 750MHz initially? 800MHz sounds a bit much for such a complex chip at 80nm. The G80, on the other hand, has much more headroom, with the GTX managing an average of ~650MHz from cherry-picked cores (which means a 1.5GHz shader clock).
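For a sense of scale on the 512-bit interface mentioned above, a minimal sketch of the bandwidth arithmetic (the 8800 GTX figures are its commonly quoted stock specs; the R600 memory clock here is purely an assumption to make the comparison concrete):

```python
# Bandwidth (GB/s) = bus width in bytes * effective data rate.
# GDDR3/GDDR4 are double data rate, so effective rate = 2 * memory clock.

def bandwidth_gb_s(bus_bits, mem_clock_mhz):
    return (bus_bits / 8) * (2 * mem_clock_mhz) / 1000

gtx_bw  = bandwidth_gb_s(384, 900)  # 8800 GTX: 384-bit, 900MHz GDDR3
r600_bw = bandwidth_gb_s(512, 900)  # rumoured R600: 512-bit, GDDR3 at an assumed similar 900MHz

print(f"8800 GTX:        {gtx_bw:.1f} GB/s")   # 86.4
print(f"R600 (rumoured): {r600_bw:.1f} GB/s")  # 115.2
```

Even at the same memory clock, the wider bus would buy roughly a third more bandwidth, which is why the 512-bit rumour matters more than the GDDR3-vs-GDDR4 question.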


Originally Posted by Morgoth the Dark Enemy from beyond3d
This is going bad... it's worse than the whole 512-bit stuff. It wasn't like that with the G80; it seems that people expect ATi to make a card that would improve even the starship Enterprise's systems AND wipe your ass at the same time.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Cookie Monster
Firstly, what the heck are MADD4 shaders? People who don't know these things shouldn't act like they do. You realise the G80 uses scalar shaders that are MADD+MUL capable. The current prospect is that R600 is going to be vec4+scalar (like Xenos), or vec4, or even vec2+vec3. Although theoretically 64 vec4 shaders clocked at ~750MHz equal 128 scalar shaders clocked at 1350MHz, each layout has its advantages and disadvantages. If it's vec4, it can do more work per cycle (4 ops compared to 1), but the utilisation of the shaders is far lower than with scalar ones. This is why nVIDIA chose scalar shaders: you are guaranteeing 100% utilisation, which the vec4 shaders can only dream of matching (and pushing the core past 1GHz is going to be mission impossible). In real-world performance terms, 100% utilisation sounds more promising than lower utilisation with more work per cycle.

This is why ATi is rumoured to hate the G80 so much. Although the current drivers may be a mess, the architecture itself is very impressive and the bar nVIDIA has raised is VERY high. So right now, the R600 isn't going to demolish anything in that sense. Sites that talked about ATi having 32 ROPs are just spreading BS, because a) more ROPs don't automatically translate into more performance and b) it would be a waste of transistor budget. The only upside is that it makes good marketing material for ATi's PR. Not to mention 2GB of GDDR4? Unless you want AMD to lose money by selling these rumoured cards at affordable prices, I think they've made the whole thing up. Half of the listed specs were already floating around the internet via the INQ and word of mouth.

Honestly, don't get your hopes too high, because it will get to you once the real benchmarks hit. I'm thinking it's either 5~15% faster in some 3D apps while slower in others compared to the G80. AVIVO will be there, I bet, along with the "new" CrossFire. I think the transistor count is so high (720 million rumoured) because of the ring bus design (which also causes the chip to have many more pins, e.g. the R600 die shot had 2,240 pins), its unified shader architecture using vec4+scalar shaders, and the 512-bit memory interface.

GDDR3? I guess GDDR4 isn't as available an option as ATi had hoped. Still, hearing that their 2nd revision is working is great, and running at 800MHz is pretty amazing. But weren't they aiming for 750MHz initially? 800MHz sounds a bit much for such a complex chip at 80nm. The G80, on the other hand, has much more headroom, with the GTX managing an average of ~650MHz from cherry-picked cores (which means a 1.5GHz shader clock).


Originally Posted by Morgoth the Dark Enemy from beyond3d
This is going bad... it's worse than the whole 512-bit stuff. It wasn't like that with the G80; it seems that people expect ATi to make a card that would improve even the starship Enterprise's systems AND wipe your ass at the same time.

I was just gonna post about scalar shaders being 100% utilized.

I think it's also worth mentioning that we might see Nvidia drop the driver bomb on R600 on Jan. 20th.

I'm personally holding my breath for Nvidia to release a driver that unlocks the full potential of this card. Think of what Nvidia did with the original Detonator drivers.

My reasoning for thinking that? There is no reason why an overclocked 8800GTS should come close to matching 8800GTX performance with a 320-bit bus and fewer shaders.
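To put rough numbers behind that point, a sketch using the commonly quoted stock specifications of the two cards (editor's illustration; real-game performance obviously depends on much more than these two ratios):

```python
# Stock spec comparison of the 8800 GTS and 8800 GTX (commonly quoted figures).
specs = {
    "8800 GTS": {"sp": 96,  "shader_mhz": 1188, "bus_bits": 320, "mem_mhz": 800},
    "8800 GTX": {"sp": 128, "shader_mhz": 1350, "bus_bits": 384, "mem_mhz": 900},
}

def shader_rate(c):                 # proportional to SP count * shader clock
    return c["sp"] * c["shader_mhz"]

def bandwidth_gb_s(c):              # GDDR3 is DDR, so 2x the memory clock
    return c["bus_bits"] / 8 * 2 * c["mem_mhz"] / 1000

gts, gtx = specs["8800 GTS"], specs["8800 GTX"]
print(f"Shader throughput: GTS is {shader_rate(gts) / shader_rate(gtx):.0%} of a GTX")    # ~66%
print(f"Memory bandwidth:  {bandwidth_gb_s(gts):.1f} vs {bandwidth_gb_s(gtx):.1f} GB/s")  # 64.0 vs 86.4
```

On paper the GTS starts with about two-thirds of the GTX's shader throughput and three-quarters of its bandwidth, so a core overclock alone closing that gap would indeed be surprising.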
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2

I was just gonna post about scalar shaders being 100% utilized.

I think it's also worth mentioning that we might see Nvidia drop the driver bomb on R600 on Jan. 20th.

I'm personally holding my breath for Nvidia to release a driver that unlocks the full potential of this card. Think of what Nvidia did with the original Detonator drivers.

My reasoning for thinking that? There is no reason why an overclocked 8800GTS should come close to matching 8800GTX performance with a 320-bit bus and fewer shaders.

All you ever do is post about how awful ATI drivers are and how good Nvidia drivers are. It just gets very boring. I do agree with you, though: Nvidia has a history of bringing out new drivers when they get well beaten. This time I do not expect the G80 to need new drivers to compete, unless Vista gaming is really demanding.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
I think it's also worth mentioning that we might see Nvidia drop the driver bomb on R600 on Jan. 20th.
Driver bomb? They should start by releasing a driver that doesn't have issues with basic rasterization.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: BFG10K
I think it's also worth mentioning that we might see Nvidia drop the driver bomb on R600 on Jan. 20th.
Driver bomb? They should start by releasing a driver that doesn't have issues with basic rasterization.

I think they're keeping you on the edge of your seat, BFG.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
I love my 8800GTX, but these are some of the worst drivers I have ever seen.

I had been an ATi user all my life until I switched to the 7800GTX.

For the most part I loved my ATi cards, until the 9800 Pro had serious driver issues with any game ported from the Xbox.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I can't really see the company with by far the best driver track record simply not knowing how to write drivers for its flagship product.

It may be that their programmers haven't mastered the new architecture; it may be that they are holding back performance or utilization of part of the chip and that is causing the rendering errors; or it could be that the rendering errors are actually improving performance so the G80 looks better compared to R600. Boy, speculating is fun! Although I know my speculations are worthless.

Whatever the case, something is going to happen with the Nvidia drivers on or near the launch of R600. I'll bet on that.

Sorry ronnn, I know it was another post about Nvidia's drivers, but BFG and Pugnate egged me on!
 

Eomer of Aldburg

Senior member
Jan 15, 2006
352
0
0
Originally posted by: TBSN
Originally posted by: ArchAngel777
Doubt we will see anything under $400. They will probably come out with an XT and XTX version of it first, then follow up with something around April/May in the midrange. All speculation at this point, but I would highly doubt we see any midrange parts for quite a while after the initial flagship release.

I am pretty excited to see this new card of ATi's. I was originally planning an entire system overhaul; however, if they come out with a decent mid-range card sometime before June, I will just spend $300 or so and save the rest for a future upgrade.

So the ATI R600 XT and XTX would be roughly equivalent (in terms of performance and the price difference between them) to the Nvidia GTS and GTX? I think I may do the same as you and save my money until they come out, then see which one comes out on top in terms of value, G80 or R600...
Bah! I guess I will just pick up an X1950XT in January then; I really wanted a mainstream DX10 card, though :/
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2

Sorry ronnn, I know it was another post about Nvidia's drivers, but BFG and Pugnate egged me on!

You do your job well!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The only driver bomb I expect to magically boost performance for the G80 is an FX5800-style hack/cheat bomb, and I don't think Nvidia wants that fiasco to happen all over again. If you look at the FiringSquad G80 overclocking article, you can see that in most tests an 8800GTS at 647MHz still falls behind a stock 8800GTX, sometimes by a large margin. I don't buy into any theories that NV is holding back performance with the G80. If anything, the most likely response to the R600 I expect from NV is a higher-clocked version of the 8800GTX using cherry-picked cores.
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Man, it's threads like this that make me feel bad that the best I'll probably be able to do this January is get a second 7900GT and SLI them.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I think they're keeping you on the edge of your seat, BFG.
If that is really what they're doing, then Nvidia has the worst business practices of any GPU vendor. Making their early adopters pay a premium to do nothing but deal with troubleshooting and frustration, just so they can one-up the competition several months down the road, isn't something I think enthusiasts appreciate.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
The only driver bomb I expect to magically boost performance for the G80 is an FX5800-style hack/cheat bomb, and I don't think Nvidia wants that fiasco to happen all over again. If you look at the FiringSquad G80 overclocking article, you can see that in most tests an 8800GTS at 647MHz still falls behind a stock 8800GTX, sometimes by a large margin. I don't buy into any theories that NV is holding back performance with the G80. If anything, the most likely response to the R600 I expect from NV is a higher-clocked version of the 8800GTX using cherry-picked cores.

You seem to enjoy bringing up an "FX5800-style hack/cheat bomb". What makes you think so? These performance drivers are all still a possibility, because the G80 drivers haven't delivered one yet. If you look at most generations, it took a couple of driver versions to gain a further 10~30% in performance, e.g. the 7800GTX and the X1800XT. How about the GF3? However, what we need isn't a performance driver, but rather a special "fix" driver to get rid of most of the bugs.

nVIDIA might also be having trouble because it's a new architecture beyond anything they have worked with before.

Back to the R600. Looks like the R600 is ready to go (source: Beyond3D). January 20th doesn't look that bad as a launch date.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Cookie Monster
Firstly, what the heck are MADD4 shaders? People who don't know these things shouldn't act like they do. You realise the G80 uses scalar shaders that are MADD+MUL capable. The current prospect is that R600 is going to be vec4+scalar (like Xenos), or vec4, or even vec2+vec3. Although theoretically 64 vec4 shaders clocked at ~750MHz equal 128 scalar shaders clocked at 1350MHz, each layout has its advantages and disadvantages. If it's vec4, it can do more work per cycle (4 ops compared to 1), but the utilisation of the shaders is far lower than with scalar ones. This is why nVIDIA chose scalar shaders: you are guaranteeing 100% utilisation, which the vec4 shaders can only dream of matching (and pushing the core past 1GHz is going to be mission impossible). In real-world performance terms, 100% utilisation sounds more promising than lower utilisation with more work per cycle.

This is why ATi is rumoured to hate the G80 so much. Although the current drivers may be a mess, the architecture itself is very impressive and the bar nVIDIA has raised is VERY high. So right now, the R600 isn't going to demolish anything in that sense. Sites that talked about ATi having 32 ROPs are just spreading BS, because a) more ROPs don't automatically translate into more performance and b) it would be a waste of transistor budget. The only upside is that it makes good marketing material for ATi's PR. Not to mention 2GB of GDDR4? Unless you want AMD to lose money by selling these rumoured cards at affordable prices, I think they've made the whole thing up. Half of the listed specs were already floating around the internet via the INQ and word of mouth.

Honestly, don't get your hopes too high, because it will get to you once the real benchmarks hit. I'm thinking it's either 5~15% faster in some 3D apps while slower in others compared to the G80. AVIVO will be there, I bet, along with the "new" CrossFire. I think the transistor count is so high (720 million rumoured) because of the ring bus design (which also causes the chip to have many more pins, e.g. the R600 die shot had 2,240 pins), its unified shader architecture using vec4+scalar shaders, and the 512-bit memory interface.

GDDR3? I guess GDDR4 isn't as available an option as ATi had hoped. Still, hearing that their 2nd revision is working is great, and running at 800MHz is pretty amazing. But weren't they aiming for 750MHz initially? 800MHz sounds a bit much for such a complex chip at 80nm. The G80, on the other hand, has much more headroom, with the GTX managing an average of ~650MHz from cherry-picked cores (which means a 1.5GHz shader clock).


Originally Posted by Morgoth the Dark Enemy from beyond3d
This is going bad... it's worse than the whole 512-bit stuff. It wasn't like that with the G80; it seems that people expect ATi to make a card that would improve even the starship Enterprise's systems AND wipe your ass at the same time.

Yeah, it's vec4, my bad, but you should have left it at that; talking crap like "people who don't know these things shouldn't act like they do" is beneath whatever professionalism you have (if any). Nobody asked for your opinions about people; this thread is about the R600, got it? Scalar shaders have been used since the R300, and we all know that due to limitations in the architecture, the compiler and other factors, they are never fully utilized. Can you recall why ATi decided to increase the number of pixel shader units in the X1900? Because in the X1800, 16 scalar shaders aren't enough; it's hard to keep them all busy, so increasing their number and thus reducing pixel shader granularity should help increase performance and utilization. So what makes you think that all scalar shaders are 100% utilized all the time? ATI's R600 features 64 4-way SIMD shader units. Better to have 64 SIMD shaders that can each output 4 instructions with little or no optimization than to try to figure out how to fully utilize 128 simple scalar shaders.
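The utilisation point both posters are arguing over can be shown with a toy example (an editor's sketch, not a model of either chip's real scheduler): feed the same made-up mix of scalar, vec3 and vec4 operations to a single vec4 ALU and to a scalar ALU, and count busy lanes.

```python
# Toy lane-utilisation comparison. A vec4 ALU issues one instruction per clock
# whatever its width, so narrow ops leave lanes idle; a scalar ALU breaks every
# instruction into single-lane ops and never idles (it just needs more clocks).

instructions = ["vec4", "vec3", "scalar", "vec3", "vec4", "scalar"]  # made-up shader mix
width = {"scalar": 1, "vec3": 3, "vec4": 4}

useful_lanes = sum(width[op] for op in instructions)

vec4_clocks   = len(instructions)              # one instruction per clock, 4 lanes offered each clock
vec4_util     = useful_lanes / (vec4_clocks * 4)

scalar_clocks = useful_lanes                   # one lane per clock, every clock does useful work
scalar_util   = 1.0

print(f"vec4 ALU:   {vec4_clocks} clocks at {vec4_util:.0%} lane utilisation")      # 6 clocks, 67%
print(f"scalar ALU: {scalar_clocks} clocks at {scalar_util:.0%} lane utilisation")  # 16 clocks, 100%
```

Which side wins in practice depends on the instruction mix and on how well the compiler can pack narrow operations together, which is why neither "vec4 does 4x the work" nor "100% utilisation" settles the argument on its own.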
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
:Q

Firstly, the X1800/X1900 used vec3+scalar... So did the R300...

I'm not talking crap; things you say that are false can lead less informed readers to believe that's really the case.

You have some good points, though, even if some of them don't make sense.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Cookie Monster
:Q

Firstly, the X1800/X1900 used vec3+scalar... So did the R300...

I'm not talking crap; things you say that are false can lead less informed readers to believe that's really the case.

You have some good points, though, even if some of them don't make sense.


OK, maybe I'm misinformed, but that doesn't mean I'm pretending to know something I don't, and you have no right to say that. If I find somebody posting inaccurate information, I will correct that person without disrespecting them; after all, we are here to share information and knowledge, right?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Cookie Monster
Originally posted by: munky
The only driver bomb I expect to magically boost performance for the G80 is an FX5800-style hack/cheat bomb, and I don't think Nvidia wants that fiasco to happen all over again. If you look at the FiringSquad G80 overclocking article, you can see that in most tests an 8800GTS at 647MHz still falls behind a stock 8800GTX, sometimes by a large margin. I don't buy into any theories that NV is holding back performance with the G80. If anything, the most likely response to the R600 I expect from NV is a higher-clocked version of the 8800GTX using cherry-picked cores.

You seem to enjoy bringing up an "FX5800-style hack/cheat bomb". What makes you think so? These performance drivers are all still a possibility, because the G80 drivers haven't delivered one yet. If you look at most generations, it took a couple of driver versions to gain a further 10~30% in performance, e.g. the 7800GTX and the X1800XT. How about the GF3? However, what we need isn't a performance driver, but rather a special "fix" driver to get rid of most of the bugs.

nVIDIA might also be having trouble because it's a new architecture beyond anything they have worked with before.

Back to the R600. Looks like the R600 is ready to go (source: Beyond3D). January 20th doesn't look that bad as a launch date.

If you look at the driver history of the last few generations, there were steady, gradual improvements in performance, a trend I expect to continue with the G80. However, with the G80 some people seem to expect a major performance boost from unlocking some secret power that has lain dormant all this time, and the most recent time Nvidia tried to accomplish such a feat was by cheating in benchmarks via clip planes and shader replacement, and by lowering IQ in areas such as texture filtering/sampling. I find the whole notion of Nvidia intentionally holding back G80 performance utterly ridiculous. If the G80 is limited in performance, it's because of immature drivers. But that's the way things go with all newly designed GPUs, and it can hardly be considered a secret-weapon driver bomb waiting in the wings to be used at the right time.

I expect the R600 to go through the same process of improving performance as its drivers mature. I do not expect either card to receive a major performance boost from some driver bomb.
 

thilanliyan

Lifer
Jun 21, 2005
12,010
2,232
126
Originally posted by: josh6079
I think they're keeping you on the edge of your seat, BFG.
If that is really what they're doing, then Nvidia has the worst business practices of any GPU vendor. Making their early adopters pay a premium to do nothing but deal with troubleshooting and frustration, just so they can one-up the competition several months down the road, isn't something I think enthusiasts appreciate.


QFT. If that IS what they did, then I will have learned my lesson. I won't be a guinea pig next time... although it was pretty exciting to have my 8800GTS a day after it launched.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
89
91
Anyone know what the MSRP of the highest-end R600 card is going to be? Hopefully ATI won't follow in Nvidia's footsteps and raise their top-end card to $650 :|
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: schneiderguy
Anyone know what the MSRP of the highest-end R600 card is going to be? Hopefully ATI won't follow in Nvidia's footsteps and raise their top-end card to $650 :|

Don't hold your breath.

The X1900 XTX was $650 at launch.

R600 is rumored to be 720 million transistors with a 512-bit bus.

It's going to be expensive.

Who knows, maybe AMD getting involved with this DOJ investigation will persuade them to keep the price down.
 

thilanliyan

Lifer
Jun 21, 2005
12,010
2,232
126
Originally posted by: Cookie Monster
You got yours at 650/1000? wow :Q


Yep. I've been happy with the card, except that I couldn't play Splinter Cell: Double Agent. That now seems to have been partially fixed, but I haven't had much time lately to try it out.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
If you look at the driver history of the last few generations, there were steady, gradual improvements in performance, a trend I expect to continue with the G80. However, with the G80 some people seem to expect a major performance boost from unlocking some secret power that has lain dormant all this time, and the most recent time Nvidia tried to accomplish such a feat was by cheating in benchmarks via clip planes and shader replacement, and by lowering IQ in areas such as texture filtering/sampling. I find the whole notion of Nvidia intentionally holding back G80 performance utterly ridiculous. If the G80 is limited in performance, it's because of immature drivers. But that's the way things go with all newly designed GPUs, and it can hardly be considered a secret-weapon driver bomb waiting in the wings to be used at the right time.

I expect the R600 to go through the same process of improving performance as its drivers mature. I do not expect either card to receive a major performance boost from some driver bomb.
Exactly.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
I suggest people listen to Cookie Monster's post. He got there before I did and explained most things in a way that is easy to comprehend.
One point should be noted:
R600 will certainly be a beast, but do not expect miracles. Do not underestimate the efficiency of scalar shaders. Oh, and I think the granularity issue is a PITA for someone else right now, as it is rumoured the G80 is a beast itself. It surpassed even the most optimistic expectations of Nvidia's engineers. It's one of those cases where simplicity almost reaches perfection. ATI/AMD certainly didn't see that coming, IMHO, and they certainly don't have an easy job ahead of them.
 