Crossfire limited to 16X12 resolution!

Jun 14, 2003
10,442
0
0
Originally posted by: DidlySquat
x-fire is only for bragging rights, it's not going to be a major volume seller (on the contrary) as it makes very little sense to invest in such a setup considering the required mobo, "master" cards, etc, all of which will be quite expensive because of the fact that they'll be only small numbers produced.

R520 as a single card is a different story, but I doubt it's going to be able to outrun my 7800GTX OC.


yeah, it's the same with SLI to an extent, but SLI seems to be a bit more friendly: you don't need master cards or dongles, and you don't even need the bridge if you don't want it. and according to gamingphreek the next batch of drivers from NV are supposed to let you mix and match card vendors, so you can have a 6800 from BFG and a 6800 from eVGA work together.

plus I think a lot of ATi's rendering modes are pretty superfluous. I mean, scissor mode isn't dynamic like NV's, and tiling only works on 16-pipe cards. I honestly think SLI will still be better than xfire when it arrives.

but either way... yes, both are for bragging rights really lol... but everyone likes to be able to brag every now and then
 

Busithoth

Golden Member
Sep 28, 2003
1,561
0
76
This constraint seems un-ATI-like. Surely someone during development would have pointed out that people would scream at limiting resolution to 16x12?

<pats AGP card and starts up another game, beautifully rendered>
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: DidlySquat
Originally posted by: eastvillager
eh, makes crossfire pointless, imho, if it is true.

The only people who need an sli/crossfire type of solution are people gaming at 1920x1200(and above). Top-end single card solutions will handle any game at 1600x1200 already.


I game at 1920x1200 (see sig) with a single 7800 GTX which is perfectly capable of maintaining high FPS (above 50 avg) in most of the games that I play, even with all details maxxed out including 4xAA/16xAF.


Yeah, I can attest to that for sure, but on the other hand... SLI does make it better and even more smooth. 50 FPS is great and IMO pretty gosh darn fast... But if my wife would let me spend the money, you can bet I would have another 7800 GTX on my machine. If you want to do high-resolution competitive gaming with the best eye candy, look no further than 7800 GTX in SLI... Not to mention you have double the fill rate for things like AA and AF.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: otispunkmeyer
Originally posted by: jasonja
Originally posted by: otispunkmeyer
Originally posted by: Turtle 1
Originally posted by: malG
Originally posted by: jasonja

A 20ms response time equivalent to 50hz. So to answer your question, go look up all the LCD's with 20ms response times (hint, there's an assload of them). Anything with 16ms response times runs up to 60hz. So running above 50hz refresh rate on a 20ms LCD is pointless... you're wasting bandwidth.

You're wrong...if true then why is my Dell 2405FPW (16ms) natively supported refresh rate is 70Hz?

In the article that started this post it clearly states LCD wouldn't be affected. primative CRT'S only


also jasonja is completely wrong... response time is in no way, shape or form connected to refresh rate

response time is the time it takes a pixel to change, and there's no set standard for measuring it, so companies will just post the fastest response they can get.

refresh is simply the number of times the screen is refreshed; nothing to do with pixel response

60Hz for all LCDs is fine because LCDs refresh the whole picture at once, and don't do any painting of horizontal lines

How the hell am I wrong? If the pixel can only change every 20ms and you're refreshing the output to the LCD every 16ms, it's pretty damn obvious that the pixel isn't going to change every 16ms. Internally it's just dropping some of those refreshes on the floor... thereby wasting bandwidth. So you can refresh the screen as fast as you want... but the pixels aren't going to update any faster than the response time.



ok

look here and go down to Refresh rate, response time, flicker and motion-blur

Refresh rate is the rate at which the electronics in the monitor addresses (updates) the brightness of the pixels on the screen (typically 60 to 75Hz). For each pixel, an LCD monitor maintains a constant light output from one addressing cycle to the next (sometimes referred to as 'sample-and-hold'), so the display has no refresh-dependent flicker.
There should be no need to set a high refresh rate to avoid flicker on an LCD.


Response time relates to the time taken for the light throughput of a pixel to fully react to a change in its electrically-programmed brightness. The viscosity of the liquid-crystal material means it takes a finite time to reorientate in response to a changed electric field. A second effect (which has a rather more complicated explanation) is that the capacitance of the LC material is affected by the molecule alignment, and so if a step change in brightness is programmed, the cell voltage changes as the LC realigns and the brightness to which it settles is not quite what was programmed. Unless 'overdrive' (which tries to pre-compensate for this effect) is employed, it may take several refreshes before the light output stabilises to the correct value. Response rate for dark-to-light is normally different from light-to-dark, and is often slower still between mid-greys. VESA and others define standard ways of measuring response time, but a single figure rarely tells the whole story.
Manufacturers' 'response times' rarely tell the whole story.
Unless combined with a strobing backlight, response times much below 16ms are likely to be of only marginal benefit, owing to more-dominant 'sample and hold' effects (see below).


there you go

also a another good link explaining the differences

here


unless of course I have the wrong end of your stick here. did you mean to say that 20ms = roughly 50Hz, so a pixel can change 50x a second (for one measurement, since black-to-white takes a different amount of time than white-to-black, and grey-to-grey takes a different time again), so there's no point in redrawing the screen above 50Hz?

refresh doesn't really matter for LCD anyway; it's merely there because of the analogue outputs that still exist, which work at a frequency. like that link says, I think refresh is simply refreshing the brightness, not really the colour


Didn't I say all this already in my first post, and then you said I was completely wrong?!

posted by ME!
Technically LCD's don't have refresh rates because they don't refresh! That's CRT lingo where it mattered. LCD's don't flicker so refresh rates mater less. LCD's are measured in response time. A 20ms response time equivalent to 50hz. So to answer your question, go look up all the LCD's with 20ms response times (hint, there's an assload of them). Anything with 16ms response times runs up to 60hz. So running above 50hz refresh rate on a 20ms LCD is pointless... you're wasting bandwidth.

We are both saying the same thing. It's pointless to run at refresh rates faster than the response time of the LCD can keep up with. If your LCD's response time is 16ms, 60Hz is all you need. If your LCD is slower than that (and MANY are), then 50Hz or less may be completely adequate.
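The rule of thumb being argued over here can be put in numbers (a rough sketch: it treats the quoted response time as a single full-transition figure and ignores the grey-to-grey variation mentioned above):

```python
def equivalent_refresh_hz(response_time_ms):
    """Fastest refresh rate at which a pixel with this response time
    can still complete a full transition every refresh."""
    return 1000.0 / response_time_ms

def effective_updates_per_second(refresh_hz, response_ms, seconds=10):
    """Toy model of the dropped-refresh argument: frames arrive every
    1/refresh_hz seconds, but the pixel only settles to a new value
    once per response time; extra refreshes get dropped on the floor."""
    frame_interval = 1.0 / refresh_hz
    next_ready = 0.0
    settled = 0
    for k in range(int(seconds * refresh_hz)):
        t = k * frame_interval          # arrival time of frame k
        if t >= next_ready:             # pixel is free to transition again
            settled += 1
            next_ready = t + response_ms / 1000.0
    return settled / seconds

print(equivalent_refresh_hz(20))             # 50.0 -> the "20ms = 50Hz" figure
print(equivalent_refresh_hz(16))             # 62.5 -> why 16ms panels are fine at 60Hz
print(effective_updates_per_second(60, 20))  # 30.0 -> a 20ms pixel at 60Hz misses every other frame
```

Interestingly, the toy model suggests a 20ms panel driven at 60Hz quantizes down to roughly 30 effective updates per second, i.e. the mismatch is arguably even worse than the simple 1/response figure implies.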
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
It certainly took you long enough, Rollo, to post this. You were several hours behind other forums. I figured since it was anti-ATi gossip, you would have been first.

If true, it really, really sucks.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Ackmed
It certainly took you long enough Rollo to post this. You were several hours behind other forums. I figured since it was an anti-ATi gossip, you would have been first.

If true, it really, really sucks.

And of course you have to attack Rollo every time... Well, here it goes back at ya... Why do you have your wife linked in your sig? It is a 50 x 50 pixel picture where it could be anyone. Could be a man in a swimsuit, hard to tell. Not to mention it would be disgraceful to your wife to display her in a swimsuit on a public forum. Either way, I'd remove the pic.
 

imported_Rampage

Senior member
Jun 6, 2005
935
0
0
The X1800 series appears not to top its Nvidia counterparts... and questions about pricing and availability are still completely up in the air.
This "elegant" Crossfire is a disaster. Not going beyond 1600x1200 practically kills the technology.
ATI will have a hard time selling any of this junk, yes... junk... to anyone.






The message is clear:

ATI has failed.
 

imported_Rampage

Senior member
Jun 6, 2005
935
0
0
I've been saying it since the GF7 launch: with such great technology, a flawless SLI implementation (about as perfect an implementation as one could imagine), and incredibly fast and cool GF7 cards, there's no reason for Nvidia to hold back.




Nvidia just needs to go in for the finishing move and end ATI and their "superb" product and driver support forever...

fatality!



There is no truer statement today than that ATI was a one trick pony, and they bought that trick (R300) from an American company.

LOL
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: southpawuni
ive been saying it since the GF7 launch, theres no reason with such great technology such as a flawless SLI implementation (about as perfect as implementation as one could imagine), and incredibly fast and cool GF7 cards.




Nvidia just needs to go in for the finishing move and end ATI and their "superb" product and driver support forever...

fatality!


There is no truer statement today than that ATI was a one trick pony, and they bought that trick (R300) from an American company.

LOL


So how does the Xbox 360 GPU fit into your one trick pony theory? And before you say anything else stupid, the guys from ArtX had nothing to do with it and it's not in any way based on the R300. ATI has been around for 20 years... 15 years before and 5 years after "that one trick pony purchase"
 

sisq0kidd

Lifer
Apr 27, 2004
17,043
1
81
Originally posted by: jasonja
Originally posted by: southpawuni
ive been saying it since the GF7 launch, theres no reason with such great technology such as a flawless SLI implementation (about as perfect as implementation as one could imagine), and incredibly fast and cool GF7 cards.




Nvidia just needs to go in for the finishing move and end ATI and their "superb" product and driver support forever...

fatality!


There is no truer statement today than that ATI was a one trick pony, and they bought that trick (R300) from an American company.

LOL


So how does the Xbox 360 GPU fit into your one trick pony theory? And before you say anything else stupid, the guys from ArtX had nothing to do with it and it's not in any way based on the R300. ATI has been around for 20 years... 15 years before and 5 years after "that one trick pony purchase"

So how does the Xbox 360 GPU fit into your one trick pony theory?

Well, neither you nor I know that yet, now do we?
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: sisq0kidd
Originally posted by: jasonja
Originally posted by: southpawuni
ive been saying it since the GF7 launch, theres no reason with such great technology such as a flawless SLI implementation (about as perfect as implementation as one could imagine), and incredibly fast and cool GF7 cards.




Nvidia just needs to go in for the finishing move and end ATI and their "superb" product and driver support forever...

fatality!


There is no truer statement today than that ATI was a one trick pony, and they bought that trick (R300) from an American company.

LOL


So how does the Xbox 360 GPU fit into your one trick pony theory? And before you say anything else stupid, the guys from ArtX had nothing to do with it and it's not in any way based on the R300. ATI has been around for 20 years... 15 years before and 5 years after "that one trick pony purchase"

So how does the Xbox 360 GPU fit into your one trick pony theory?

Well, you nor I know that yet now do we?


You may not, but don't speak for me.
 

sisq0kidd

Lifer
Apr 27, 2004
17,043
1
81
Originally posted by: jasonja
Originally posted by: sisq0kidd
Originally posted by: jasonja
Originally posted by: southpawuni
ive been saying it since the GF7 launch, theres no reason with such great technology such as a flawless SLI implementation (about as perfect as implementation as one could imagine), and incredibly fast and cool GF7 cards.




Nvidia just needs to go in for the finishing move and end ATI and their "superb" product and driver support forever...

fatality!


There is no truer statement today than that ATI was a one trick pony, and they bought that trick (R300) from an American company.

LOL


So how does the Xbox 360 GPU fit into your one trick pony theory? And before you say anything else stupid, the guys from ArtX had nothing to do with it and it's not in any way based on the R300. ATI has been around for 20 years... 15 years before and 5 years after "that one trick pony purchase"

So how does the Xbox 360 GPU fit into your one trick pony theory?

Well, you nor I know that yet now do we?


You may not, but don't speak for me.

:Q I demand a review from you right now!
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: ArchAngel777
Originally posted by: Ackmed
It certainly took you long enough Rollo to post this. You were several hours behind other forums. I figured since it was an anti-ATi gossip, you would have been first.

If true, it really, really sucks.

And of course you have to attack Rollo everytime... Well, here it goes back at ya... Why do you have your wife linked in your sig? It is a 50 X 50 pixel picture where it could be anyone. Could be a man in a swimsuit, hard to tell. Not to mention it would be disgraceful to your wife to display her in a swimsuit on a public forum. Either way, I'd remove the pic.

It's not an "attack", it's the truth. When have you ever seen him post an anti-NV thread, or a pro-ATi thread?

I don't care if you would remove the pic or not. I put it there because a sig, to me, is someone trying to "show off". So I mock this by going one step further and showing off my wife. I kept getting PMs about my video card and system (mainly the card), so I put the sig in there.
 

JonnyBlaze

Diamond Member
May 24, 2001
3,114
1
0
Originally posted by: Ackmed

I dont care if you would remove the pic or not. I put it there, because a sig to me, is someone trying to "show off". So I mock this, with going on step further and showing off my wife. I kept on getting PM's about what video card and system (mainly card) so I put the sig in there.

yeah, uh, except you're not really showing anything off. it's smaller than a postage stamp.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: JonnyBlaze
Originally posted by: Ackmed

I dont care if you would remove the pic or not. I put it there, because a sig to me, is someone trying to "show off". So I mock this, with going on step further and showing off my wife. I kept on getting PM's about what video card and system (mainly card) so I put the sig in there.

yeah, uh, except your not really showing anything off. its smaller than a postage stamp.


It's not hard to figure out how to see the larger version.
 

JonnyBlaze

Diamond Member
May 24, 2001
3,114
1
0
Originally posted by: Ackmed
Originally posted by: JonnyBlaze
Originally posted by: Ackmed

I dont care if you would remove the pic or not. I put it there, because a sig to me, is someone trying to "show off". So I mock this, with going on step further and showing off my wife. I kept on getting PM's about what video card and system (mainly card) so I put the sig in there.

yeah, uh, except your not really showing anything off. its smaller than a postage stamp.


Its not hard to figure out how to see the larger version.

feed that woman a sammich!
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Sorry, does no good. She doesn't gain weight. That pic is over 10 years old; it was taken on our honeymoon. She's still the exact same size, even a few pounds less I think. After two kids... and 10+ years of marriage.

But back on topic... I really doubt ATi would limit Crossfire to 1600x1200/60Hz. I don't buy into these rumors; there are already counter-rumors against it. Who knows what the truth is. A few more weeks and we'll know what's real and what's not. If it is in fact true, it's a serious blow to Crossfire for me. So much so that it would be worthless to me, as I'm far above 1600x1200.
 

Pr0d1gy

Diamond Member
Jan 30, 2005
7,774
0
76
Originally posted by: BFG10K
Yeah, but as long as it does 1280x728 @ 60hz there will still be a market for it honestly
Not for crossfire there won't be. I mean any single mid-range card like the 9800 Pro can manage 1280x728 quite easily.

Not with Unreal Tournament 2007 it won't... I like how people pick one line out of a whole statement & try to discredit it. In 3 years, a Crossfire X850 XT PE system will still run every new game on my HDTV without any problems. I mean your name is BFG for christ's sake, quit being a fanboy please.

Originally posted by: SynthDude2001
Originally posted by: Pr0d1gy
Originally posted by: BFG10K
Most single link DVI connections can drive 1920x1200@60Hz...
That's still totally unacceptable. You should be able to do at least 2048x1536@85 Hz to match what single GPUs can do now.

Yeah, but as long as it does 1280x728 @ 60hz there will still be a market for it honestly. You know there are plenty of people out there who will get this setup for their HDTV/media center PC's to play games & watch movies & such with. Honestly, this setup will probably run all the games that come out for the next 4-6 years on an HDTV screen, so it's going to be a somewhat moot point to many people....meanwhile there are obviously a ton of you who feel differently...lol

But why would you buy a $1000 (presumably) Crossfire setup to play at ~720p?

So I don't have to upgrade my PC for the next 4-5 or more years. Oh, and why would it be $1000? You can find X850 XT PEs on eBay for $350, and even less sometimes. Sure, the new one will be a little more and you'll need a mobo, but since you're probably upgrading anyway you can sell your current mobo & GPU. Total cost will more likely be around $700 for the full setup, minus old hardware sold.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Sorry, does no good. She doesnt gain weight. That pic is over 10 years old, was taken on our honeymoon. Shes still the exact same size, even a few pounds less I think. After two kids.. and 10+ years of marriage.

But back on topic... I really doubt ATi would limit Crossfire to 1600x1200/60Hz. I dont buy into to these rumors. There are already counter-argument rumors against it. Who knows what the truth is. A few more weeks and we'll know whats real, and whats not. If it is in fact true, its a serious blow to Crossfire to me. So much so, that it would be worthless to me, as Im far above 1600x1200.

LOL

I remember it like it was yesterday:
Ackmed trashing SLI because it didn't support WS. (now it does)
Ackmed trashing SLI because it didn't support Win2000. (who cares?)
Ackmed trashing SLI because it didn't support a freaking Apple monitor of all things. (who cares?)

Now the tide has turned: Crossfire not doing 19x12 at anything above 52Hz, and a lowly 60Hz at 16x12. On ANY monitor; some "high end" solution! So Ackie: who is this going to be a "good" solution for?
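For what it's worth, the odd 52Hz figure is exactly what you'd expect if the rumored cap comes from a single-link TMDS/DVI connection on the Crossfire dongle (165 MHz pixel clock). A back-of-the-envelope sketch; the 25% blanking overhead is an assumed CRT-style figure, and real GTF/CVT timings (or reduced blanking) shift the numbers somewhat:

```python
SINGLE_LINK_TMDS_HZ = 165e6  # single-link DVI pixel-clock ceiling

def required_pixel_clock_hz(width, height, refresh_hz, blanking_overhead=0.25):
    # Total pixels per frame include blanking intervals; ~25% extra is an
    # assumed CRT-style overhead (reduced-blanking LCD modes need less).
    return width * height * refresh_hz * (1 + blanking_overhead)

for w, h, hz in [(1600, 1200, 60), (1920, 1200, 60), (1920, 1200, 52)]:
    clock = required_pixel_clock_hz(w, h, hz)
    verdict = "fits" if clock <= SINGLE_LINK_TMDS_HZ else "exceeds"
    print(f"{w}x{h}@{hz}Hz needs ~{clock / 1e6:.0f} MHz -> {verdict} single link")
```

Under these assumptions, 16x12@60 (~144 MHz) and 19x12@52 (~150 MHz) squeeze under the 165 MHz ceiling while 19x12@60 (~173 MHz) doesn't, which lines up with the rumored caps.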
 

Pr0d1gy

Diamond Member
Jan 30, 2005
7,774
0
76
It will be a good solution for me and plenty of other people. Is it as good as SLI? Maybe not, but it will supposedly be much more user-friendly. I like playing with my computer & settings sometimes, but not NEARLY as much as I just like to play my games on my LCoS HDTV baby!!!
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Stop making threads.

Thanks.

EDIT: Nice thread.

I would want 1900x/1920x-wide resolutions if I were going with a high-end solution like Crossfire/SLI.
 