Originally posted by: kmmatney
I've never seen a 19" CRT with a "usable" desktop at 1600 x 1200. I guess for games you can get away with the fuzziness, but every CRT I've ever used (mostly Trinitrons) looks like crap at the really high resolutions it supposedly supports.
If you look at CRT specs, the specs themselves do not make sense. For instance see this monitor:
http://www.azatek.com/details.asp?iid=688
The Dot pitch is 0.244 mm, with a 20" viewable image size.
If you do the math, you get the following required dot pitches to hit certain resolutions:
2048x1536 = 0.179 dot pitch required
1600x1200 = 0.229 dot pitch required
Both of these resolutions require a finer dot pitch than what the monitor supposedly has.
They probably use some form of interpolation.
If my understanding of the technology is correct, a CRT's dot pitch is measured (for shadow masks) as the diagonal distance between like-colored dots (or sub-pixels) -- the individual RGB components. Each dot is a light-emitting unit that is fixed in both color and size, in exactly the same way that an LCD has sub-pixels. In this way, CRTs also have a native resolution: the pixel resolution where every pixel has a single red, a single green, and a single blue dot associated with it.
To display other resolutions, CRTs (just like LCDs) use interpolation. If you're running at a lower resolution, you spread out, say, 3 pixels' worth of data (i.e. brightness values) across 4 pixels. The interpolation algorithm used for this determines how well the monitor displays at other resolutions. If you want to go above the monitor's native resolution, you do the same thing -- except this time you're squeezing 5 pixels' worth of data (for example) across 4 pixels. Yes, this also means there's some blurriness involved, because text that was originally one pixel wide now has to be a fraction of a pixel wide. Oh well. They're hoping you're not looking that closely anyway.

By the way, LCDs are just as capable of squeezing higher resolutions onto the panel -- my 19" LCD at home can do 1600 x 1200 even though it's really a 1280 x 1024 monitor. It helps to remember that the image's internal state -- how it's organized in the video card and in the monitor drivers -- is different from how the image is actually displayed on the screen. In my case, the monitor's internal circuitry keeps track of the 1600 x 1200 image, but when it gets sent to the screen for display, it is interpolated down to 1280 x 1024 to fit the screen's fixed size. Fundamentally, any monitor displays at its native resolution, regardless of the image's resolution; any monitor (CRT or LCD) needs to do some interpolation work whenever the image resolution differs from the screen's native resolution.
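The squeezing described above can be sketched in one dimension. This is just plain linear interpolation -- real scalers use fancier filters -- but it shows why a one-pixel-wide line gets smeared when 5 pixels' worth of data lands on 4:

```python
# Toy 1-D resampler: squeeze a row of brightness values into fewer
# output pixels via linear interpolation. Illustrative only; actual
# monitor scalers use more sophisticated filter kernels.

def resample_linear(src, out_len):
    """Linearly resample a 1-D list of brightness values to out_len samples."""
    out = []
    scale = (len(src) - 1) / (out_len - 1)
    for i in range(out_len):
        pos = i * scale              # source coordinate for output pixel i
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

# A single bright pixel (a 1-px-wide line) in a 5-px row...
row = [0, 0, 255, 0, 0]
# ...falls between two output pixels when squeezed to 4, so its
# brightness gets split across neighbors -- i.e. blur.
print(resample_linear(row, 4))
```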
In other words, a spec saying the monitor has a maximum resolution of 2048 x 1536 just means that its internal circuitry can store that much data. It does not mean that that much data actually makes it onto the screen -- if it's not the monitor's native resolution, some data is lost (via interpolation) along the way.
So why are interpolation problems virtually nonexistent on CRTs, while noticeable on LCDs? After all, CRTs (like LCDs) fundamentally use dots of fixed size and color. The reason is that on a CRT the dots are equal in both dimensions (width and height), whereas on an LCD each sub-pixel is really narrow but really tall. This means LCDs are fine at horizontal interpolation, where each dot is less than 0.1 mm across (Microsoft's ClearType is proof of this), but really suck at vertical interpolation, where each dot is almost 0.3 mm across.

By contrast, via some geometry, a CRT with a dot pitch of 0.240 mm (the specs say 0.24 mm, not 0.244 mm) has dots that are 0.139 mm apart in any direction -- on a CRT, every dot has 6 adjacent dots. Simply divide the given dot pitch (assuming it's given as the distance from a dot to the nearest same-colored dot) by the square root of 3 to get the dot-to-dot distance on a shadow-mask CRT. Hence CRTs interpolate equally well in any direction, and interpolation effects are, for the most part, unnoticeable. Unless you're looking really closely, no one is gonna notice a 0.14 mm difference. For LCDs, however, the vertical interpolation distance is a matter of 0.3 mm, which is very much within the resolution of the eye at normal viewing distances, and hence noticeable.

By the way, I'm guessing this is also why fuzziness shows up on CRTs once you go above their native resolution -- you're looking for data (i.e. the vertical and horizontal lines that make up text) that simply isn't there, because it's been smeared out by interpolation.
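The divide-by-root-3 step above is easy to check numerically -- it's just the geometry of a hexagonal dot layout, where same-color dots sit one pitch apart and the six nearest (differently colored) neighbors sit closer:

```python
import math

def nearest_dot_distance(dot_pitch_mm):
    """Distance from a dot to its nearest neighbor in a hexagonal
    shadow-mask layout: the stated same-color pitch divided by sqrt(3)."""
    return dot_pitch_mm / math.sqrt(3)

print(round(nearest_dot_distance(0.24), 3))  # 0.139
```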
Something else to keep in mind. CRT manufacturers like to stretch their specs in the same way that LCD manufacturers do. Actually, this is inherent in every industry. For CRTs, though, they'll give the spec in terms of the shadow mask (or aperture grille). However, that is a certain distance away from the screen, so in terms of what you're actually seeing (i.e. the screen image), the dot pitch is actually somewhat bigger.
Given a 20" viewable image size (I'm assuming that means 16" wide and 12" high) and a CRT dot pitch of 0.24 mm, you can figure out the pixel area via geometry to be about 0.05 mm^2, so there should be around 2.48 million pixels on that thing, coming out to a native resolution of about 1820 x 1365. And no, that's not a standard resolution -- I'm expecting some rounding errors (i.e. the dot pitch probably isn't exactly 0.24 but 0.2403 or whatever, not to mention how big each dot is by the time it actually reaches the screen). So it's fairly similar to 20" LCDs, which usually have a resolution of 1600 x 1200 (and a dot pitch of 0.255 mm, with a pixel area of about 0.065 mm^2) across the same viewing area.
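The back-of-envelope above can be reproduced as follows -- assuming (my assumption, consistent with the 0.05 mm^2 figure) that the triads are hexagonally packed, so each pixel occupies (sqrt(3)/2) x pitch^2:

```python
import math

MM_PER_INCH = 25.4

def crt_native_resolution(width_in, height_in, dot_pitch_mm):
    """Estimate a 4:3 CRT's native resolution from its viewable size
    and dot pitch, assuming hexagonally packed triads."""
    screen_area = (width_in * MM_PER_INCH) * (height_in * MM_PER_INCH)
    triad_area = (math.sqrt(3) / 2) * dot_pitch_mm ** 2  # ~0.05 mm^2
    n_pixels = screen_area / triad_area                  # ~2.48 million
    # Split the pixel count into a 4:3 grid: w * h = n and w = (4/3) * h.
    h = math.sqrt(n_pixels * 3 / 4)
    return round(h * 4 / 3), round(h)

print(crt_native_resolution(16, 12, 0.24))  # roughly (1820, 1365)
```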