ATi Radeon X1800

Page 4 of an AnandTech Forums discussion thread.
Originally posted by: MegaWorks
Originally posted by: Frackal
Originally posted by: MegaWorks
Originally posted by: Creig
Originally posted by: MegaWorks
:Q A 650MHz, 32-pipe card!!! nVidia complete ownage if it's true.


Total wallet ownage as well.

I meant in terms of performance - king!

MegaWorks, your quotation is false at worst and totally unconfirmed at best. Does accuracy not matter when it benefits your opinion?

Just had to point it out....

False in what sense?

Dunno what he means, but I don't think it's a 32-pipe 650MHz part - I think you may have misunderstood?

I assume the 650MHz is the 16-pipe part (since, if the INQ is anything to go by, the 600MHz versions are going to be thin on the ground, and that's with 16 extreme pipes).

Though with only ten 32-pipe parts, I don't suppose it matters what the specs are, because no-one's going to be getting one.
 

Keysplayr

Elite Member
Originally posted by: Frackal
Btw, back on topic, does anyone know why, if these guys have this part, that they haven't benchmarked it?

Well, in that photo, the bin numbers have been intentionally blurred out. Probably still some sort of NDA. I'm sure they have been benchmarked.

 

nRollo

Banned
Originally posted by: Paratus
I'm sure both Ronin and Rollo will each have two of those 10 cards in XFire mode within two weeks


(And then they say how it doesn't stack up to Nvidia )

Ronin maybe, but I'm not planning to buy this card till I can get a deal on one. With 7800GTX SLI, I have no reason to pay an early-adopter fee for the R520.

 

compgeek89

Golden Member
At this point, after purchasing a GTX, I'll use eVGA's trade-in if an Ultra ever comes out (which I doubt); otherwise my next card will be a G80.
 

Munky

Diamond Member
Originally posted by: n7
Thank god some users here have found out & mentioned what i have been saying forever: Most games these days are CPU-limited not GPU-limited.

Getting that 7800GTX for your 1280x1024 LCD with your A64 3000+ = teh jokes

What about an A64 3000+ @ 2.6GHz with a 1600x1200 monitor? That should give the GTX a good workout, unless you're running Quake 3 or something. Some games are more CPU-bound than GPU-bound, like HL2 for example, but try the FEAR demo - that game puts the smack down on any GPU combination available today, even dual 7800GTXs. Far Cry is also stressful on the GPU if you max out the settings at 16x12. IMO, if you get slower fps when you increase the resolution, then you can't blame it on the CPU.
 

DRavisher

Senior member
Originally posted by: munky
Originally posted by: n7
Thank god some users here have found out & mentioned what i have been saying forever: Most games these days are CPU-limited not GPU-limited.

Getting that 7800GTX for your 1280x1024 LCD with your A64 3000+ = teh jokes

What about an A64 3000+ @ 2.6GHz with a 1600x1200 monitor? That should give the GTX a good workout, unless you're running Quake 3 or something. Some games are more CPU-bound than GPU-bound, like HL2 for example, but try the FEAR demo - that game puts the smack down on any GPU combination available today, even dual 7800GTXs. Far Cry is also stressful on the GPU if you max out the settings at 16x12. IMO, if you get slower fps when you increase the resolution, then you can't blame it on the CPU.

I don't really think it matters if the CPU is the bottleneck, as long as your CPU can drive the game to sufficient FPS. If your CPU is limiting you to 60 FPS at 640x480, so what? If your rig can play the same game with all high settings at 1600x1200 at 60 FPS as well, then having a weak CPU with a powerful graphics card is better than being able to play at 200 FPS at 640x480 but not at all at 1600x1200. It seems to me there is no game out there today that needs any more than an A64 3000+. Or are there games that will require that much more CPU power to play at high res? I really don't think they should.

And perhaps something on topic here: I think theinquirer is right about 16 pipes, and I doubt the R520 is a 32-pipe chip with half of them disabled. What a waste of silicon that would be. And if that were the case, then I don't think performance/MHz/pipe has gone up much from the R400 series.
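DRavisher's point - a CPU bottleneck doesn't matter as long as the CPU-side frame rate is sufficient - can be sketched with a toy frame-time model. All the millisecond figures below are invented for illustration, not benchmarks:

```python
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """FPS is set by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 16.7  # a CPU that can prepare ~60 frames per second

# At 640x480 the GPU is far faster than the CPU, so the CPU caps FPS at ~60.
low_res = fps(cpu_ms, gpu_ms_per_frame=5.0)

# At 1600x1200 a weak GPU becomes the limiter and FPS drops.
high_res = fps(cpu_ms, gpu_ms_per_frame=20.0)

# A GPU strong enough to stay under 16.7 ms even at high res leaves FPS
# unchanged between resolutions - the "CPU-bound but sufficient" case.
strong_gpu_high_res = fps(cpu_ms, gpu_ms_per_frame=12.0)

print(round(low_res), round(high_res), round(strong_gpu_high_res))  # prints: 60 50 60
```

The max() is the whole argument: past the point where the GPU is no longer the slower stage, a faster graphics card buys nothing, and vice versa.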
 
Originally posted by: DRavisher
Originally posted by: munky
Originally posted by: n7
Thank god some users here have found out & mentioned what i have been saying forever: Most games these days are CPU-limited not GPU-limited.

Getting that 7800GTX for your 1280x1024 LCD with your A64 3000+ = teh jokes

What about an A64 3000+ @ 2.6GHz with a 1600x1200 monitor? That should give the GTX a good workout, unless you're running Quake 3 or something. Some games are more CPU-bound than GPU-bound, like HL2 for example, but try the FEAR demo - that game puts the smack down on any GPU combination available today, even dual 7800GTXs. Far Cry is also stressful on the GPU if you max out the settings at 16x12. IMO, if you get slower fps when you increase the resolution, then you can't blame it on the CPU.

I don't really think it matters if the CPU is the bottleneck, as long as your CPU can drive the game to sufficient FPS. If your CPU is limiting you to 60 FPS at 640x480, so what? If your rig can play the same game with all high settings at 1600x1200 at 60 FPS as well, then having a weak CPU with a powerful graphics card is better than being able to play at 200 FPS at 640x480 but not at all at 1600x1200. It seems to me there is no game out there today that needs any more than an A64 3000+. Or are there games that will require that much more CPU power to play at high res? I really don't think they should.

And perhaps something on topic here: I think theinquirer is right about 16 pipes, and I doubt the R520 is a 32-pipe chip with half of them disabled. What a waste of silicon that would be. And if that were the case, then I don't think performance/MHz/pipe has gone up much from the R400 series.


Pretty good point you make there.

I'd much rather be CPU-bound... it means I can get the same performance with even better IQ. I have an A64 3200+, so I'll be CPU-bound in some games already, but it's not like my CPU is achingly slow in these games.
 

monster64

Banned
Anybody wanna run FEAR with really low physics and then really high physics to see if the game is GPU- or CPU-bound? Think about it - definitely no game, not even FEAR, could be that hard on the GPU. Because people are getting really low framerates with really good GPUs but just OK CPUs, it makes complete sense that the game is CPU-bound. Even the recommended system requirements call for a 3GHz Pentium - that's unheard of. All these things suggest the game is CPU-bound. Again, like I said, if someone wants to run FEAR on high physics and low physics with some kind of fps log and show us the results, that would be great.
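The low-versus-high physics experiment proposed above could be scored like this. The FPS samples are made up for the sketch; the idea is just that physics runs on the CPU, so a large drop with graphics settings unchanged implicates the CPU:

```python
def average_fps(log):
    """Average a list of per-second FPS samples from a benchmark log."""
    return sum(log) / len(log)

# Hypothetical samples from two runs at identical graphics settings:
low_physics  = [48, 52, 50, 47, 53]   # physics on low
high_physics = [36, 40, 38, 35, 41]   # physics on high

drop = 1 - average_fps(high_physics) / average_fps(low_physics)

# GPU load is the same in both runs, so this gap is the CPU-side physics cost.
print(f"physics cost: {drop:.0%}")  # prints: physics cost: 24%
```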
 

DRavisher

Senior member
Originally posted by: monster64
Anybody wanna run FEAR with really low physics and then really high physics to see if the game is GPU- or CPU-bound? Think about it - definitely no game, not even FEAR, could be that hard on the GPU. Because people are getting really low framerates with really good GPUs but just OK CPUs, it makes complete sense that the game is CPU-bound. Even the recommended system requirements call for a 3GHz Pentium - that's unheard of. All these things suggest the game is CPU-bound. Again, like I said, if someone wants to run FEAR on high physics and low physics with some kind of fps log and show us the results, that would be great.

Ditto. Would be very interesting. But I do hope that the crappy performance I get in FEAR is because of crappy code that will be better in the final release, not my 3500+ lagging behind...
 

Paratus

Lifer
Originally posted by: Rollo
Originally posted by: Paratus
I'm sure both Ronin and Rollo will each have two of those 10 cards in XFire mode within two weeks


(And then they say how it doesn't stack up to Nvidia )

Ronin maybe, but I'm not planning to buy this card till I can get a deal on one. With 7800GTX SLI, I have no reason to pay an early-adopter fee for the R520.

I was more thinking that you'd want one of the alleged 650MHz 32-pipe cards, since we know you like "interesting" cards and a rare high-end card would fit the bill.

Granted as you said, with 7800GTX SLI it's not like you'd need the gpu power.

 

Munky

Diamond Member
Originally posted by: monster64
Anybody wanna run FEAR with really low physics and then really high physics to see if the game is GPU- or CPU-bound? Think about it - definitely no game, not even FEAR, could be that hard on the GPU. Because people are getting really low framerates with really good GPUs but just OK CPUs, it makes complete sense that the game is CPU-bound. Even the recommended system requirements call for a 3GHz Pentium - that's unheard of. All these things suggest the game is CPU-bound. Again, like I said, if someone wants to run FEAR on high physics and low physics with some kind of fps log and show us the results, that would be great.

I could try that, but there's an even easier way: run the game at 800x600 and then at 12x10 (16x12 would be better, but I don't think the demo supports it). If your fps decreases, then it is GPU-bound.

I can't believe all the people who fall for the hype about how powerful the GPU is, especially the 7800GTX. Regardless of what NV marketing says, any GPU can be brought to a crawl under sufficient load. Anyone remember what happened in the original GeForce 256 era, when HW T&L was all the rage? In case you don't, it turned out that if you used more than one light source in the scene, performance was actually LOWER than when doing SW T&L on the CPU. And from reading all the marketing BS on the NV website, you'd think they'd reinvented the wheel with the GF1.
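The resolution test described above boils down to a one-line check: CPU load barely changes with resolution, so a drop in FPS when only the resolution rises implicates the GPU. The tolerance and the sample numbers here are illustrative assumptions, not measurements:

```python
def likely_gpu_bound(fps_low_res, fps_high_res, tolerance=0.05):
    """True if FPS falls by more than `tolerance` when only resolution rises.

    The CPU does roughly the same work per frame at any resolution, so a
    drop beyond a small noise margin points at the GPU as the limiter.
    """
    return fps_high_res < fps_low_res * (1 - tolerance)

print(likely_gpu_bound(90, 55))   # big drop across resolutions: prints True
print(likely_gpu_bound(60, 59))   # flat across resolutions: prints False
```

A flat result doesn't prove a CPU limit by itself (vsync or an engine frame cap would look the same), but a large drop is a reliable GPU-bound signal.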
 

DRavisher

Senior member
munky: The FEAR demo did support 16x12 on my setup. Increasing the resolution is the classical way of doing this, but are we sure that CPU load doesn't increase somewhat at higher resolutions? I would like to see the low/high physics test too, as it would be interesting to see how much of a toll the physics engine takes on the CPU. It would be great if you could do this.
 

Ronin

Diamond Member
How, pray tell, can you put such stock in a card of which only 10 total exist (and which has proven time and again not to be a stable part to make en masse)? The 16-pipe card won't beat a 7800GTX, period. The 24-pipe card will most likely provide flip-flop performance (meaning it's on par with the 7800, where each does its own things better). The 32-pipe card is vaporware as far as the market is concerned.

And to the dude who said the 7800s were $1k when they were released: stop smoking crack. You could buy them on release day for $600. Like the rest of you folks, I'm very interested to see these cards (and no, I don't have either an R520 part or a Crossfire part right now) and to see how they perform. I'm hoping that ATi gets the kinks worked out, because nVidia is working them right now, and they really need to bounce back. (I'm a firm believer in healthy competition; it's what keeps the companies at least semi-honest when it comes to price, and it drives technology.)

All speculation has both Crossfire and R520 released next month. For ATi's sake, I hope that's true.
 

ZobarStyl

Senior member
Originally posted by: munky
I can't believe all the people who fall for the hype about how powerful the GPU is, especially the 7800GTX. Regardless of what NV marketing says, any GPU can be brought to a crawl under sufficient load. Anyone remember what happened in the original GeForce 256 era, when HW T&L was all the rage? In case you don't, it turned out that if you used more than one light source in the scene, performance was actually LOWER than when doing SW T&L on the CPU. And from reading all the marketing BS on the NV website, you'd think they'd reinvented the wheel with the GF1.
I'm sorry, but how exactly does the T&L capability of a GF1 have anything to do with how powerful the 7800GTX is? What is it you're trying to prove by dredging up the past? No one here conjectured that the 7800GTX is capable of running any game ever, at any resolution/settings, without ever slowing down. The point I believe people were trying to make is that saying a 7800GTX is only marginally faster than an x850XTPE means you weren't actually putting it to the test and were more CPU-bound. I'm not taking my facts from nV marketing, but from Anand's own tests.
 

Munky

Diamond Member
Originally posted by: ZobarStyl
Originally posted by: munky
I can't believe all the people who fall for the hype about how powerful the GPU is, especially the 7800GTX. Regardless of what NV marketing says, any GPU can be brought to a crawl under sufficient load. Anyone remember what happened in the original GeForce 256 era, when HW T&L was all the rage? In case you don't, it turned out that if you used more than one light source in the scene, performance was actually LOWER than when doing SW T&L on the CPU. And from reading all the marketing BS on the NV website, you'd think they'd reinvented the wheel with the GF1.
I'm sorry, but how exactly does the T&L capability of a GF1 have anything to do with how powerful the 7800GTX is? What is it you're trying to prove by dredging up the past? No one here conjectured that the 7800GTX is capable of running any game ever, at any resolution/settings, without ever slowing down. The point I believe people were trying to make is that saying a 7800GTX is only marginally faster than an x850XTPE means you weren't actually putting it to the test and were more CPU-bound. I'm not taking my facts from nV marketing, but from Anand's own tests.


What, did you expect the 7800GTX to be twice as fast as an X850XTPE? The fact that it's only marginally faster doesn't necessarily mean the game is CPU-bound. When the X850 gets ~6400 3DMarks and the 7800 gets ~7500, that should tell you something about the shader capabilities of each card, and in several games like Far Cry and Chaos Theory it's only marginally faster as well.

Different game engines stress the cards in different ways, and while the 7800 might be a lot faster at some things, it will be only marginally faster or equal at others. That does not mean it's being held back by the CPU.
 

DRavisher

Senior member
munky: To people with high-resolution screens it is often well over twice as fast as an X850PE. Only in special cases, or when the game runs extremely well on either card anyway, will you see the X850PE anywhere near the 7800GTX.
 

nRollo

Banned
Originally posted by: Paratus

I was more thinking that you'd want one of the alleged 650MHz 32-pipe cards, since we know you like "interesting" cards and a rare high-end card would fit the bill.

Granted as you said, with 7800GTX SLI it's not like you'd need the gpu power.


Good point. It would be pretty sweet to have one of the alleged ten, but probably beyond my reach.
 

Matt2

Diamond Member
Originally posted by: n7
Thank god some users here have found out & mentioned what i have been saying forever: Most games these days are CPU-limited not GPU-limited.

Getting that 7800GTX for your 1280x1024 LCD with your A64 3000+ = teh jokes

Maybe, but for a long term investment (which is how I buy my hardware) a 7800GTX will let you play at 1280x1024 with all the eyecandy and AA/AF for a LOOONG time.
 

Keysplayr

Elite Member
Originally posted by: munky
Originally posted by: ZobarStyl
Originally posted by: munky
I can't believe all the people who fall for the hype about how powerful the GPU is, especially the 7800GTX. Regardless of what NV marketing says, any GPU can be brought to a crawl under sufficient load. Anyone remember what happened in the original GeForce 256 era, when HW T&L was all the rage? In case you don't, it turned out that if you used more than one light source in the scene, performance was actually LOWER than when doing SW T&L on the CPU. And from reading all the marketing BS on the NV website, you'd think they'd reinvented the wheel with the GF1.
I'm sorry, but how exactly does the T&L capability of a GF1 have anything to do with how powerful the 7800GTX is? What is it you're trying to prove by dredging up the past? No one here conjectured that the 7800GTX is capable of running any game ever, at any resolution/settings, without ever slowing down. The point I believe people were trying to make is that saying a 7800GTX is only marginally faster than an x850XTPE means you weren't actually putting it to the test and were more CPU-bound. I'm not taking my facts from nV marketing, but from Anand's own tests.


What, did you expect the 7800GTX to be twice as fast as an X850XTPE? The fact that it's only marginally faster doesn't necessarily mean the game is CPU-bound. When the X850 gets ~6400 3DMarks and the 7800 gets ~7500, that should tell you something about the shader capabilities of each card, and in several games like Far Cry and Chaos Theory it's only marginally faster as well.

Different game engines stress the cards in different ways, and while the 7800 might be a lot faster at some things, it will be only marginally faster or equal at others. That does not mean it's being held back by the CPU.

Did you, or did you not, see Anand's article on release day - "7800GTX hits the ground running"? The X850XTPE gets heel marks in it from being stomped by a 7800GTX. And a lot of the tests Anandtech did never went higher than 1600x1200 - some did, but not nearly all. This of all things should tell you that 3DMark means exactly crap.

 

ZobarStyl

Senior member
Originally posted by: munky
What, did you expect the 7800GTX to be twice as fast as an X850XTPE? The fact that it's only marginally faster doesn't necessarily mean the game is CPU-bound. When the X850 gets ~6400 3DMarks and the 7800 gets ~7500, that should tell you something about the shader capabilities of each card, and in several games like Far Cry and Chaos Theory it's only marginally faster as well.
I'm not sure what your definition of marginal is, but in Anand's tests the closest the X850XTPE came to the 7800GTX was at 16x12 4xAA, and the 7800 was still 40% faster. The others were 66% and 74% faster than the X850XTPE. That's not exactly marginal. Either you have very strange definitions of 'marginal' and 'faster', or your fanboy is showing. I'm thinking the latter. Either way, stop diverting the thread; it was supposed to be about the new R520s, not about you talking about GF1s and 3DMark scores, as if either matters.
 

ElFenix

Elite Member
Super Moderator
The text from vr-zone:

Our friends at HKEPC have gotten themselves an R520 card, and it will probably be officially known as the Radeon X1800 when it launches in early October. Interestingly, there are 16-, 24- and 32-pipe versions, but only 10 pieces of the 32-pipe version currently exist. The core clock of the card is 650MHz+ with memory at 1400MHz. The R520 card uses 1.26ns GDDR3 memory. The 32-pipe R520's performance will exceed the GeForce 7800GTX.


Of course, no one has ever said exactly how many wafers were run off to make those 10 parts.
 

Elfear

Diamond Member
Originally posted by: keysplayr2003

Did you, or did you not, see Anand's article on release day - "7800GTX hits the ground running"? The X850XTPE gets heel marks in it from being stomped by a 7800GTX. And a lot of the tests Anandtech did never went higher than 1600x1200 - some did, but not nearly all. This of all things should tell you that 3DMark means exactly crap.

Actually, in this case, I would say 3DMark is pretty accurate. I compiled a quick chart from Anand's benchmarks in the article you quoted. I just took the results at 1600x1200 4xAA/8xAF, as that's generally the max for the overwhelming majority of gamers. Here are the results, with the % increase of the 7800GTX over the X850XT PE added in.

Game                  7800GTX   X850XT PE   % Increase
BF2                    53.4       39.6        25.8%
Doom 3                 54.2       40.5        25.3%
Eve                    52.2       50.9         2.5%
EverQuest 2            28.0       21.5        23.2%
Guild Wars             55.1       54.5         1.1%
Half-Life 2           119.0      100.0        16.0%
Splinter Cell: CT      56.5       39.4        30.3%
SW: KotOR 2            52.0       47.4         8.8%
Tiger Woods 2005       39.9       36.2         9.3%
UT2004                 73.9       60.7        17.9%
Wolfenstein            98.5       74.2        24.7%
Average                                       16.8%

Now if you take the average 3DMark score review sites were getting when the 7800GTX launched, ~7700, and the average for the X850XT PE, ~6400 (with everything else being equal), that's a difference of ~16.9%. Pretty close, if you ask me. Not every game ever made was included in the results, but a good sample was.

Things will change of course if the resolution is increased since the 7800GTX really shines then. I must admit though that the 7800GTX does better than I originally thought from my own benchmarks. I guess it really depends on the games you play.
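For what it's worth, Elfear's 16.8% average can be reproduced directly from the chart. Note that, as in the post, the percentages are the gap relative to the 7800GTX score rather than the increase over the X850XT PE:

```python
scores = {  # game: (7800GTX fps, X850XT PE fps) at 1600x1200 4xAA/8xAF
    "BF2": (53.4, 39.6), "Doom 3": (54.2, 40.5), "Eve": (52.2, 50.9),
    "EverQuest 2": (28.0, 21.5), "Guild Wars": (55.1, 54.5),
    "Half-Life 2": (119.0, 100.0), "Splinter Cell: CT": (56.5, 39.4),
    "SW: KotOR 2": (52.0, 47.4), "Tiger Woods 2005": (39.9, 36.2),
    "UT2004": (73.9, 60.7), "Wolfenstein": (98.5, 74.2),
}

# Gap relative to the 7800GTX score, matching the chart's "% Increase" column.
gaps = {g: (gtx - x850) / gtx * 100 for g, (gtx, x850) in scores.items()}
average = sum(gaps.values()) / len(gaps)
print(round(average, 1))  # prints: 16.8
```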
 

biostud

Lifer
Originally posted by: ElFenix
The text from vr-zone:

Our friends at HKEPC have gotten themselves an R520 card, and it will probably be officially known as the Radeon X1800 when it launches in early October. Interestingly, there are 16-, 24- and 32-pipe versions, but only 10 pieces of the 32-pipe version currently exist. The core clock of the card is 650MHz+ with memory at 1400MHz. The R520 card uses 1.26ns GDDR3 memory. The 32-pipe R520's performance will exceed the GeForce 7800GTX.


Of course, no one has ever said exactly how many wafers were run off to make those 10 parts.

One chip per wafer - it's VERY large, even though it's 90nm.
 

Munky

Diamond Member
Originally posted by: ZobarStyl
Originally posted by: munky
What, did you expect the 7800GTX to be twice as fast as an X850XTPE? The fact that it's only marginally faster doesn't necessarily mean the game is CPU-bound. When the X850 gets ~6400 3DMarks and the 7800 gets ~7500, that should tell you something about the shader capabilities of each card, and in several games like Far Cry and Chaos Theory it's only marginally faster as well.
I'm not sure what your definition of marginal is, but in Anand's tests the closest the X850XTPE came to the 7800GTX was at 16x12 4xAA, and the 7800 was still 40% faster. The others were 66% and 74% faster than the X850XTPE. That's not exactly marginal. Either you have very strange definitions of 'marginal' and 'faster', or your fanboy is showing. I'm thinking the latter. Either way, stop diverting the thread; it was supposed to be about the new R520s, not about you talking about GF1s and 3DMark scores, as if either matters.

I'm referring to your post:
The point I believe people were trying to make is that saying a 7800GTX is only marginally faster than an x850XTPE means you weren't actually putting it to the test and were more CPU bound.

And a 17% average speed advantage (see the post above) is actually less than I expected from a next-gen high-end card.
 