Negative progress in monitor technology?


Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
A Dell U3011 (2560x1600) costs less than what I paid for a 24" 1920x1200 back in the day. I call that pretty good. Though I would not complain if they dropped the price to $400...
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
The only reason the ~23 inch monitors cost 150 bucks is because there are like 3 panels used a million times by different manufacturers. If I had to put a number on it, I'd say literally 1% of people need/want a higher resolution monitor... even gamers aren't clamoring for it because of GPU requirements. So no gamers and no mainstream users... that leaves professionals, who can probably afford 800 dollars if it helps their work anyway.

When the next HD push is made, then we'll see more screens with whatever new standard they try to push... but it's gonna be a harder sell this time. The difference between the average person's 4:3 Magnavox TV and their new 42 inch 1080p is A LOT bigger than whatever the next jump will be. GL getting people to buy new HDTVs all over again.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Where are the higher resolution monitors? Nearly 3 years ago I bought a 2048x1152 Dell monitor on some deal for less than $300. Yet today I still don't see anything larger than 1920x1200 at Newegg for less than $800. What is with the lack of progress, or worse, negative progress? Most screens are actually getting smaller, 1920x1080 being the norm. WTF?


The resolution displayed and the physical size of the monitor are 2 different things.

The progress? LCDs give a calmer picture (friendlier to your eyes), take up less space, use less power, etc. etc... the only issue has usually been their refresh rates compared to CRTs.

But now, with 120Hz LCDs for 3D gaming, even that's kinda gone.

Also, what's wrong with 1920x1080? Even Blu-ray doesn't come in a higher resolution
(and not that many people game at a higher resolution than that, either).
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why? I sit about 18in away from my monitor when I game. My 23in fills up my entire field of vision.

Ok but on a 23 inch monitor, objects are still tiny. So you are much more aware that you are playing a video game.

I find a larger screen provides much more immersion. Would you rather watch movies/television on a 50-60 inch screen or a 30 inch screen? I'd much rather watch movies 10 feet away on my 50 inch plasma than on my 37 inch LCD. Similarly, I'd rather game on a 37-50 inch than on a 19-24 inch monitor. I can simply sit farther away from the screen to the point where my eyes can't discern the pixels anyway. And yet on a larger screen, objects appear larger, which makes them more lifelike to me. For instance on a 50 inch Plasma a gun looks about as big as it would be in real life. That's a lot more realistic to me than gaming on a small monitor with insane resolution.

If you have a much larger monitor, you aren't going to sit just 18 inches away from it. I think some people are just too caught up in the idea that higher resolution is better. Sit 5 feet away and I bet you cannot tell the difference between 720P and 1080P gaming on a 42 inch plasma. And yet the plasma will have superior response time/refresh rate, black levels, color, etc. compared to any PC monitor.
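
For rough numbers (assuming the textbook ~1 arcminute of visual acuity; a back-of-the-envelope sketch, not a vision model):

Code:
import math

def pixel_blend_distance_ft(diag_in, horiz_px, aspect=(16, 9)):
    """Distance (feet) beyond which a ~1 arcminute eye can no longer
    resolve individual pixels on a screen of the given diagonal."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)  # screen width
    pixel_pitch_in = width_in / horiz_px                  # one pixel's width
    # A pixel subtends 1 arcminute at distance d when pitch = d * tan(1/60 deg).
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for px in (1280, 1920):  # 720p vs 1080p widths on a 42 inch 16:9 screen
    print(f"{px} wide: pixels blend past ~{pixel_blend_distance_ft(42, px):.1f} ft")

That works out to roughly 8 ft for 720p and 5.5 ft for 1080p on a 42" set, which lines up with the claim above.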

I mean, the iPhone has a higher pixel density than an IMAX movie theatre screen. But which provides the more impressive viewing experience?

LCDs/LEDs need to be replaced by something with far higher image quality. Resolution is but a tiny factor of what makes a good quality screen. A 30 inch 1080P plasma panel at 5 feet away will provide FAR superior image quality to a 30 inch 1080P LCD/LED panel simply because other, more important aspects such as viewing angles and black levels trump resolution. Even the best 30 inch PC monitors don't hold a candle to the best plasma screens. Unfortunately, plasma isn't very good for 2D work because static images risk burn-in.

You only have to look at the MacBook Air vs. the Asus UX31. The UX31 has a higher resolution, but the overall quality of the screen (contrast ratio, viewing angles, etc.) is better on the Air. Most people like to compare resolutions because it's a number that's a lot easier to compare than the often misleading contrast ratio and response time specs.

The problem is current LCD/LED technology does not provide the best image quality in the first place. You can increase resolution as many times as you want, it's not going to overcome the inherent flaws of inferior technology. Perhaps we'll see OLED or AMOLED screens 30 inches in size in the next 5-10 years.
 
Last edited:

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
Why? I sit about 18in away from my monitor when I game. My 23in fills up my entire field of vision.

I don't want to sit 18in away from my monitor to play games sitting at my desk like a student anymore. I want to sit 8 feet away on my living room couch and play my PC games, watch TV, and browse the internet on the same screen, in comfort.

1080p on a 40" from 8 feet away is fine for gaming and watching shows, but I can't use it also for my regular computer needs as the low ppi strains my eyes too much. So right now I have to have 2 screens, and I have to get up and go to the desk and vice versa if I want to switch tasks.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't want to sit 18in away from my monitor to play games sitting at my desk like a student anymore. I want to sit 8 feet away on my living room couch and play my PC games, watch TV, and browse the internet on the same screen, in comfort.

1080p on a 40" from 8 feet away is fine for gaming and watching shows, but I can't use it also for my regular computer needs as the low ppi strains my eyes too much. So right now I have to have 2 screens, and I have to get up and go to the desk and vice versa if I want to switch tasks.

Pretty sure you can do that now; pony up for a home PC theater. Sounds like you need an Xbox 360 and a 50" screen :sneaky:
 
Last edited:

Athadeus

Senior member
Feb 29, 2004
587
0
76
$350 for 22" Samsung 226BW 1680x1050
3.5 years pass
$140 for 23" Asus VH236H 1920x1080
Thank you, come again.
 

Zorander

Golden Member
Nov 3, 2010
1,143
1
81
I think it depends on which price class of monitor we are looking at. The downward pressure in quality is IMO found at the lower end.

I don't personally see any IQ degradation across my subsequent jumps from the Sony G420 -> Dell 2709W -> Dell U3011. At the lower end, the 1280x720 LED screen on my $350 netbook is simply atrocious compared to the 1024x768 screen on my 2003 Compaq notebook ($2000 back then). The 'just OK' 22-23in TN monitors at work are about similar in IQ to the more current TN monitors my friends use (which probably cost even less).

Also, until we have truly affordable (sub-$200 IMO) GPUs that can comfortably handle 2560 gaming, I doubt we will get higher resolution screens. Having the right horsepower for that many pixels is just too expensive for most people. That, and it's just more economical to mass-produce something with standardised specs (HD spec).
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The last time mass-market, cheap monitors became this dominant, the previous generation of monitors all but disappeared. It's very hard to find a 4:3 monitor now; the 16:10 LCDs wiped them out. I wonder if the same thing will happen with 16:9 monitors. Right now they represent only 25% of the market to 16:10's 43%, but based on relative availability I would say they are likely outselling 16:10.

I am not a fan of the ever decreasing vertical space. For my IDE I need that additional vertical space just as much as the horizontal.
 

liddabit

Member
Jun 17, 2011
45
0
0
I have an old Acer 24-inch P243W and it is only 1920x1200. Taking a look at some new screens makes me kind of scared of losing this one. I love it! The new ones in my price range seem very uncomfortable: darkness at weird angles, not as sharp, and on some of them you can see the pixels 0.o It's weird, though; looking at the new ones I only see the pixels if I focus on them, but then they go away, and come back... it's almost like it's playing with my emotions :O! I have to deal with those screens at work.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Yeah, that, I think, is a big problem (the lack of compression). We could save a lot of bandwidth if we required monitors to support some sort of decompression algorithm. Even a lossless algorithm would save a significant amount of bandwidth.
The problem with a lossless algorithm is that it doesn't have a fixed compression ratio, which means the worst case scenario is that you have an image that you can't compress at all. And since you need to be able to handle that scenario, compression gets you nothing.
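
That worst case is easy to demonstrate with any general-purpose lossless codec (zlib here, purely as a stand-in): a flat desktop frame shrinks enormously, while a noise-like frame doesn't shrink at all, and the link still has to be sized for the latter.

Code:
import os, zlib

w, h = 1920, 1080
flat  = bytes(3 * w * h)       # solid-color frame: best case for compression
noise = os.urandom(3 * w * h)  # random noise: incompressible worst case

for name, frame in (("flat", flat), ("noise", noise)):
    ratio = len(frame) / len(zlib.compress(frame))
    print(f"{name:5s} frame compresses {ratio:,.2f}x")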
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
Things have gotten really bad in laptop screen world. There are literally no 13"-14" Windows laptops on the market with a good screen short of the $2000 Vaio. It's like the panel manufacturers don't even make them anymore. Every screen has a terrible 100-200:1 contrast ratio and vertical viewing angles worse than anything I've ever seen. Not to mention they're all shiny like a mirror and don't have the brightness to back it up.
 
Last edited:

keyser fluffy

Junior Member
Sep 30, 2011
8
0
0
I didn't see the 16:9-over-16:10 thing coming, but most people (the other 99.99% of the world who don't read AnandTech) wouldn't even notice; they see both as widescreen and that's all they care about. 16:9 is annoying because it's a step backwards: less space, and further from the Golden Ratio (which is my favourite ratio).

My next monitor will be an upgrade to 120Hz, 2560×1440 and maybe slightly bigger than 24". If you wear glasses btw you lose about an inch in monitor size, but your penis remains the same.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Agreed on that part, I hate 16:9. 16:10 is much, much better, and it's surprising it's not included with more monitors. Especially on 1080p 27" TN panels: why would they not offer 1920x1200? It doesn't make sense.

Anyway, 27" IPS are starting go down in price, hopefully they'll be the "mainstream" thing soon. Then the OP will be happy
 

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
Pretty sure you can do that now; pony up for a home PC theater. Sounds like you need an Xbox 360 and a 50" screen :sneaky:

Yes this is a solution, but my preferred solution would be to have 1 machine (my PC) hooked up to 1 screen that solves all my needs and doesn't require me to get up whenever I switch tasks (which like many people today, I switch tasks quite frequently).
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
The problem with a lossless algorithm is that it doesn't have a fixed compression ratio, which means the worst case scenario is that you have an image that you can't compress at all. And since you need to be able to handle that scenario, compression gets you nothing.

Buffering and frame drops, perhaps? If 3 frames in a row are completely incompressible, then drop the next frame. That should still reduce the bandwidth while providing a pretty acceptable display (especially in work environments where the screen is more or less static).

Lossy would be the way to go, for sure, but there are licensing issues that really hinder things. You could easily cut the bandwidth requirements in half and end up with images that are almost always an exact representation of the screen. High-motion scenes are where the pixels would start to have a higher probability of not being exact replicas (though in those cases most people don't care).
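
A minimal sketch of that buffering/frame-drop policy (hypothetical names; zlib stands in for whatever codec the link would actually use):

Code:
import zlib

def send_stream(frames, budget_bytes, drop_after=3):
    """If `drop_after` consecutive frames exceed the per-interval budget
    even after compression, skip the next frame so the link catches up."""
    oversized_run = 0
    for frame in frames:
        if oversized_run >= drop_after:
            oversized_run = 0
            continue                  # drop this frame entirely
        payload = zlib.compress(frame)
        if len(payload) > budget_bytes:
            oversized_run += 1
        else:
            oversized_run = 0
        yield payload                 # hand off to the wire

On a mostly static work screen nearly every frame compresses well, so drops would be rare.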
 

zlejedi

Senior member
Mar 23, 2009
303
0
0
It's a sad, sad market filled with fail.

I cannot believe the lack of advance with resolutions available to us consumers.

We've gone from 1920x1200 down to 1920x1080, and 2560x1600 to 2560x1440.

And we've had those resolutions forever... it appears nothing higher is ever going to be an option for us at a consumer level.

Sickening.

I don't care about resolution that much since it keeps GPU costs in check but the lack of quality progress is sad.

7 or 8(?) years ago I bought a 16ms 17" TN screen for 500 euro.
3 or 4 years ago I bought a 24" A-MVA screen for 500 euro.
Today I can buy a 24" IPS panel with uniformity problems for 400 euro (assuming we are talking about ones with full pivot and adjustment options), which would be a sidegrade at most.

Sadly, nothing is going to change until they introduce OLED screens into mass-market displays.
 
Last edited:

fuzzymath10

Senior member
Feb 17, 2010
520
2
81
One possibility is that we've only seen things at the bottom get worse, which affects the overall perception of what is available.

You can get good displays on laptops, but rarely on laptops under $1000 (or even $1500). However, you could hardly get a laptop for that little several years ago (especially if you factor in inflation), and if you did, its screen was just as terrible.

Before I got my Latitude D630 in 2008 (with its horrible 14" 1280x800 TN LCD, for $800), I had an Inspiron 8500 from 2003 with a fantastic 15.4" 1680x1050 LCD (and upgrading to 1920x1200 was another $100-150), but that laptop was around $2500 back then. I remember looking at a 15" 4:3 Toshiba that had an amazing 1600x1200 display, but it was also $3000+. Cheap laptops have always had bad screens, but cheap laptops were much less common before. Also, paying more than $2000 (in 2011 dollars no less) is considered rare these days, while it was fairly common even 5 years ago, and it is mostly in the expensive laptops that good LCDs are easier to find.

Considering how cheap the laptops we are looking at today are, it does seem plausible that it takes Mac-class pricing to find good displays.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Things have gotten really bad in laptop screen world. There are literally no 13"-14" Windows laptops on the market with a good screen short of the $2000 Vaio. It's like the panel manufacturers don't even make them anymore. Every screen has a terrible 100-200:1 contrast ratio and vertical viewing angles worse than anything I've ever seen. Not to mention they're all shiny like a mirror and don't have the brightness to back it up.

Well the MacBook Air has a nice display, and that costs ~ $1300.
 

iCyborg

Golden Member
Aug 8, 2008
1,342
59
91
Buffering and frame drops, perhaps? If 3 frames in a row are completely incompressible, then drop the next frame. That should still reduce the bandwidth while providing a pretty acceptable display (especially in work environments where the screen is more or less static).

Lossy would be the way to go, for sure, but there are licensing issues that really hinder things. You could easily cut the bandwidth requirements in half and end up with images that are almost always an exact representation of the screen. High-motion scenes are where the pixels would start to have a higher probability of not being exact replicas (though in those cases most people don't care).
Another problem is that frames might be coming from multiple sources (apps/windows) that are composited by the OS (e.g. DWM in Windows and Compiz in Linux), so you'd need to encode this in real time. And you need to be sending out at 60Hz for regular LCD monitors - try switching your monitor to 24Hz; it's noticeable right away. At high resolutions, that'll need some serious dedicated hardware. You'd need to decode as well at the monitor, again in real time. All of that will also introduce additional lag.
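
The time budget makes the point concrete: encode, transmit, and decode all have to fit inside a single refresh interval, or the pipeline adds a frame of lag.

Code:
# Per-frame window that encode + transmit + decode must share.
for hz in (24, 60, 120):
    print(f"{hz:3d} Hz -> {1000 / hz:.1f} ms per frame")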

I mentioned once that there's something called Panel Self Refresh where drivers can detect if the image is static and instruct the monitor to repeat the image from its cache (monitor has to support PSR too). It's not really used for bandwidth reasons, but for laptops to save battery.

http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yes this is a solution, but my preferred solution would be to have 1 machine (my PC) hooked up to 1 screen that solves all my needs and doesn't require me to get up whenever I switch tasks (which like many people today, I switch tasks quite frequently).

This is definitely not how most people use their computers. Most people prefer sitting close to their monitor, so a super-huge display meant to be viewed from 10 feet away isn't really aimed at the PC market.

I game 10% of the time and do other stuff 90% of the time, and for the other 90% I definitely want to be sitting up close, not squinting from 10 feet away.

The PC isn't considered a home entertainment center; some people play games on it, but people who prefer what you're describing generally stick to consoles.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Another problem is that frames might be coming from multiple sources (apps/windows) that are composited by the OS (e.g. DWM in Windows and Compiz in Linux), so you'd need to encode this in real time. And you need to be sending out at 60Hz for regular LCD monitors - try switching your monitor to 24Hz; it's noticeable right away. At high resolutions, that'll need some serious dedicated hardware. You'd need to decode as well at the monitor, again in real time. All of that will also introduce additional lag.
I think this is less of an issue than you are making it. The last stage of a graphics card is already encoding all the output and translating it into bits on the wire. This would simply be adding a compression stage before placing the bits on the wire.

I mentioned once that there's something called Panel Self Refresh where drivers can detect if the image is static and instruct the monitor to repeat the image from its cache (monitor has to support PSR too). It's not really used for bandwidth reasons, but for laptops to save battery.

http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384
I think this pretty much proves that an encoding stage can be done in a reasonable time frame. It already takes some pretty beefy circuitry to make sure that each pixel is the same.
 

sandorski

No Lifer
Oct 10, 1999
70,663
6,231
126
I think the state of the economy has a lot to do with these issues. This is just not a great time for introducing expensive new technology, and it's also very tempting to cut corners on existing products/tech in order to maintain the bottom line.
 

iCyborg

Golden Member
Aug 8, 2008
1,342
59
91
I think this is less of an issue than you are making it. The last stage of a graphics card is already encoding all the output and translating it into bits on the wire. This would simply be adding a compression stage before placing the bits on the wire.
There already are encoding stages for stuff like color transformations, hardware overlay, and formatting (for DVI/HDMI, DP, CRT, etc.) from the frame buffer to the actual bits going out. But this is a much heavier encoding stage - think about what it takes to encode 1920x1080 @ 120Hz in real time, and that is something supported over HDMI/DP today. You'd basically need to add something like QuickSync/VCE, and these are not small features. I'm not sure how fast QS/VCE are either, so maybe I'm wrong, but I'd think 4Kx2K at 60Hz would be a challenge... It is probably cheaper and lower-lag to introduce a beefier cable; this is basically what they have been doing with successive versions of the HDMI and DP specs.
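
The raw numbers back this up. Ignoring blanking intervals and line-coding overhead, the uncompressed pixel rate alone (at 24 bits per pixel) is:

Code:
def raw_gbps(w, h, hz, bpp=24):
    """Uncompressed pixel data rate in Gbit/s (no blanking/overhead)."""
    return w * h * hz * bpp / 1e9

print(f"1920x1080 @ 120Hz: {raw_gbps(1920, 1080, 120):.1f} Gbps")  # ~6.0
print(f"3840x2160 @  60Hz: {raw_gbps(3840, 2160, 60):.1f} Gbps")   # ~11.9

So each resolution/refresh jump has simply been met with a fatter link.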


I think this pretty much proves that an encoding stage can be done in a reasonable time frame. It already takes some pretty beefy circuitry to make sure that each pixel is the same.
Not really. Windows knows when it draws something, or when the mouse moves, etc. Actually, the driver itself already knows this, because all graphics routines eventually land there. So you don't compare each pixel between two buffers; all you need are some sort of interrupts/flags to the PSR part of the driver to notify it that the frame buffer might have changed and it'll have to send it.
And remember that these are for power savings; adding beefy circuitry to detect static screens would be counterproductive.
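
A sketch of that flag-based approach (all names hypothetical; real PSR lives in the display driver and the panel's frame cache):

Code:
class PanelStub:
    """Stand-in for a PSR-capable panel."""
    def receive_frame(self): print("panel: new frame scanned out")
    def self_refresh(self):  print("panel: repeating cached frame")

class PsrDriverSketch:
    """Drawing paths set a dirty flag; nothing ever diffs pixels."""

    def __init__(self, panel):
        self.panel = panel
        self.dirty = True  # first frame must always be sent

    def on_draw(self):
        # Every graphics routine eventually lands in the driver,
        # so this is the single place the flag gets set.
        self.dirty = True

    def on_vblank(self):
        if self.dirty:
            self.panel.receive_frame()  # send the updated frame
            self.dirty = False
        else:
            self.panel.self_refresh()   # static screen: panel refreshes itself

drv = PsrDriverSketch(PanelStub())
drv.on_vblank()  # sends the frame
drv.on_vblank()  # nothing drawn since, so the panel self-refreshes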
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Buffering and frame drops perhaps? If 3 frames in a row are completely uncompressable then drop the next frame. That should still reduce the bandwidth while providing pretty acceptable display (especially in work environments where the screen is more or less static)

Lossy would be the way to go, for sure, but there are licensing issues that really hinder things. You could cut the bandwidth requirements in half, easily, and end up with images that are almost always an exact representation of the screen. High motion scenes would be where the pixels would start to have a higher probability of not being exact replicas. (though, in those cases most people don't care).
That could work. But why would you use it? These are last-foot links; there's no reason they need to be lossy. HDMI has plenty of bandwidth if manufacturers actually used it (it's enough to match dual-link DVI), and it wouldn't be particularly hard to further improve it.
 