When talking about higher than your refresh rate vs. lower, I was thinking in terms of 80 vs 40 FPS. At 300 FPS you'll have 5-6 tears on screen all the time, with each tear slightly offset from the previous one. I personally see it, but I run into a lot of people, even professional reviewers, who act as if there is no visible tearing below 60 FPS; and when they talk about tearing at high FPS I think they mean something like 60-100 FPS, not 200+ FPS.
Example:
http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review/#.Ugmw5m2JuCU
I do agree that tearing always happens, but I run into a lot of people who seem to think they aren't getting tearing when their FPS is lower than their refresh rate.
I had a discussion about this with Ubercake on Toms. He had tons of links, even to professional sites, talking as if tearing only happens at high FPS, and he personally couldn't see it when it was below his refresh rate. I could not find a single source other than the one above that even mentioned tearing at FPS below your refresh rate, and even that one led with "technically". Still, there is no way you can run at 45 FPS without either tearing or uneven frame times, which ultimately left him rethinking his stance.
Tearing happens at all frame rates; this is just a FACT.
For a CRT you can actually calculate the average number of tear lines you're likely to see. The screen spends approximately 95% of the time scanning and 5% of the time moving the electron gun back to the top of the screen, but for easy maths let's say 100% of the time is spent refreshing; it just makes the concept of tear-scaling easier to grasp.
If you have a frame rate of 60fps and a refresh of 60hz, but it's not in perfect sync (vsync off), then the border between frames will always fall somewhere inside your current refresh, so at 60fps@60hz you get 1 tear line per refresh, or 60 tears per second.
If you have a frame rate higher than your refresh rate, say 120fps@60hz, then you've got 120/60 = 2 frames for every refresh, which means you're likely to see 2 tear lines per refresh.
If you have a frame rate lower than your refresh rate, say 30fps@60hz, then you've got 30/60 = 0.5 tears per refresh. What does that mean? You don't get half a tear in every refresh; you get one full tear every 2 refreshes.
Expressed in its basic form, the number of tears per refresh you can expect to see is the frame rate you're running at divided by your refresh rate (and tears per second is simply the frame rate, since each rendered frame produces one tear). Frame rate is the numerator, which means as it goes up the tearing goes up; refresh rate is the denominator, which means as the refresh rate goes up the tearing per refresh goes down.
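For what it's worth, here's a quick Python sketch of that arithmetic (the function name and the simplified "the whole refresh is spent scanning" assumption are mine, purely for illustration):

def expected_tearing(fps, refresh_hz):
    # Simplified model from above: vsync off, and the whole refresh
    # period is assumed to be spent scanning (the CRT approximation).
    tears_per_refresh = fps / refresh_hz  # e.g. 120/60 = 2, 30/60 = 0.5
    tears_per_second = fps                # roughly one tear per rendered frame
    return tears_per_refresh, tears_per_second

for fps in (30, 60, 120, 300):
    per_refresh, per_second = expected_tearing(fps, 60)
    print(f"{fps}fps@60hz: {per_refresh:g} tears per refresh, ~{per_second} per second")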
Now, I said this is effectively for CRTs because they have a known scan time of about 95% of the total refresh period. However, I'm not sure how fast LCDs are... do they take most of the refresh period to scan down the pixels, or is that time much smaller? I'm not actually sure; it's hard to find numbers on, and it may differ from model to model.
The smaller that scan window is, the less impact tearing has. At 60hz, for example, that's 16.6ms per refresh, but it could be that the LCD only takes 8ms to scan from top to bottom, in which case you'd expect on average to see about half the tearing compared to a CRT.
But no matter what that window is (because it's constant), it still means that tearing scales linearly with frame rate: the higher the frame rate, the more frequently you tear.
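To fold in that scan-window idea, here's a small extension of the earlier sketch (the 8ms LCD scan time is just a guessed number for illustration, not a measured one):

def visible_tears_per_second(fps, refresh_hz, scan_time_s):
    # Only buffer swaps that land inside the top-to-bottom scan show up
    # as a visible tear line, so scale by the fraction of each refresh
    # actually spent scanning.
    refresh_period = 1.0 / refresh_hz
    scan_fraction = min(scan_time_s / refresh_period, 1.0)
    return fps * scan_fraction

# CRT-style scan (~95% of the period) vs. a hypothetical 8ms LCD scan, both at 60fps@60hz
print(visible_tears_per_second(60, 60, 0.95 / 60))  # ~57 visible tears per second
print(visible_tears_per_second(60, 60, 0.008))      # ~28.8 visible tears per second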