Yes, they are. They're also closer to deciding that their userbase will be too stupid to use their system correctly. I don't think you realize this:

1. Vsync is designed to limit the fps to your monitor's refresh rate, no matter what (OK, I think you got THAT much). That means that with vsync enabled, assuming your hardware can hit that limit, ANY card will give you the same thing, be it a 6600GT that can get up to 40 or a 6800 that can hit 90. (I think you MIGHT have gotten that much too.)

Alright, next point. I have personally never had my fps in ANY game go past 32 with vsync enabled; it usually sits at 30.

NEXT point. Tearing is caused by your monitor being physically unable to keep up with the frame rate: the card swaps in a new frame while the screen is still part-way through drawing the old one. Think of it this way: if I can flash 50 flashcards in a minute and someone hands me a stack of 90, I just can't flash those 90 in that minute, and I'll have to catch up, probably by skipping some. The tear you see is that catching-up. Live with it, or get a monitor that's good enough for your graphics cards.

Which brings me to my next point: maybe you can't get a monitor that will keep up. The speed of electronics does roughly double every 18 months, but that's not to say every manufacturer has the upgrade on that timescale either. Pixel shader 3.0 was out a good while ago, but cards of the time were physically unable to handle the demand it introduced; the programmers didn't get to see the bounty of their creation until the new wave of hardware was out. This means SLI can physically hit frigging astronomical fps, but your monitor probably can't display them.
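If you want the frame-pacing point made concrete, here's a toy sketch in Python. It's only a back-of-the-envelope model, and every number in it (the 60 Hz refresh, the per-frame render times, the card labels) is an assumption for illustration, not a measurement; it just shows why a vsync'd rate caps at the refresh rate and tends to snap down to 30 when a card can't quite finish a frame inside one refresh:

```python
import math

# Assumed 60 Hz monitor: one refresh every ~16.7 ms.
REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ

def displayed_fps_with_vsync(render_ms: float) -> float:
    # With plain double-buffered vsync, a finished frame waits for the next
    # refresh before it is shown, so the rate snaps to 60, 30, 20, ... fps.
    intervals = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / intervals

def raw_fps_without_vsync(render_ms: float) -> float:
    # Without vsync the card swaps buffers the moment a frame is done,
    # even mid-scan -- that mid-scan swap is the tear you see.
    return 1000.0 / render_ms

# Hypothetical per-frame render times, roughly matching the cards above.
for name, render_ms in [("~40 fps card", 25.0),
                        ("~90 fps card", 11.0),
                        ("just misses 60", 17.0)]:
    print(f"{name:>15}: raw {raw_fps_without_vsync(render_ms):5.1f} fps, "
          f"vsync'd {displayed_fps_with_vsync(render_ms):5.1f} fps")
```

Crude as it is, it shows the point: past the refresh rate, the extra raw fps that SLI cranks out never reaches your eyes.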
All in all, it's not really SLI's screw-up, it's yours.
Also, this isn't really solarflare, it's his friend, so don't bother banning him.