Has display technology stagnated?

I came across an old Jakob Nielsen column this afternoon, dating from November 1995. He predicted, or hoped, that when computer monitors and network connections caught up with the human factors research of the day, we would have 27" x 40" displays at a whopping 1200 dpi (that's 1.5 billion pixels), driven at 120 Hz by graphics cards packing close to 5 GB of memory. To make full use of such incredible display capacity, Nielsen calculated that we'd need interconnects capable of passing a terabit or so per second. He estimated that it would take about 17 years, or until 2012, for such technology to become commonplace.
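(Nielsen's arithmetic is easy to check. Here's a quick back-of-the-envelope sketch in Python; the figures are his, but the script and the 24-bit-colour assumption are mine.)

```python
# Back-of-the-envelope check of Nielsen's 1995 figures.
width_in, height_in = 40, 27   # display dimensions, inches
dpi = 1200                     # pixels per inch
refresh_hz = 120
bits_per_pixel = 24            # true colour, ignoring blanking and overhead

pixels = (width_in * dpi) * (height_in * dpi)
print(f"{pixels / 1e9:.2f} gigapixels")        # ~1.56, Nielsen's "1.5 billion"

# Raw bandwidth needed to repaint every pixel 120 times a second.
bits_per_second = pixels * bits_per_pixel * refresh_hz
print(f"{bits_per_second / 1e12:.1f} Tbit/s")  # ~4.5 uncompressed
```

Uncompressed, that actually comes to a few terabits per second; presumably modest compression is what brings it down to Nielsen's "terabit or so".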

Well, we're closing in on the end of that timeline now, so how close are we to Nielsen's 1995 ideal?

We have the crazy-powerful graphics cards, although they do cost a mint. ATI's FirePro V9800 packs 4 GB of video memory and the ability to render 3D stereoscopic imagery while driving six displays... if, that is, you can cough up the $3400 or so needed to get your hands on a V9800 this week. Not that most of us would have any idea what to do with it, as a $100 commodity video card is more than enough to do just about anything the average user can think of. We have the DisplayPort interconnect for these things, which can pass 17.28 Gbit/s, and we have Gigabit Ethernet (yep, 1 Gbit/s) to get the content to the local machine. So we're still roughly two orders of magnitude short of a terabit video link, and remember that most of us are still very lucky to see an internet connection top 10 Mbit/s. But we've found plenty of ways to work around that particular problem. And we have 120 Hz displays, though mainly as a marketing trick; refresh rate mattered a great deal with cathode-ray tubes, but on a flat panel the difference between 60 Hz and 120 Hz only becomes evident when you try to do stereoscopic 3D.
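(To put that gap in numbers - the link rates are the ones quoted above, the comparison itself is my own sketch:)

```python
# How far today's links fall short of a terabit video channel.
terabit = 1e12   # Nielsen's target, bits per second
links = {
    "DisplayPort 1.2":   17.28e9,
    "Gigabit Ethernet":  1e9,
    "typical broadband": 10e6,
}
for name, rate in links.items():
    print(f"{name}: {terabit / rate:,.0f}x short of a terabit")
# DisplayPort 1.2: 58x short (call it two orders of magnitude)
# Gigabit Ethernet: 1,000x short
# typical broadband: 100,000x short
```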

On the display end, though, it almost feels like we've stagnated. That 48", 1.5-gigapixel display Nielsen imagined? It's still very imaginary. We got to 1920 x 1080 pixels, i.e. 1080p HDTV, and... well, we just stopped there. 2.07 million pixels, that's all. Finding a 1920 x 1200 monitor, a far more convenient size, is difficult at best. 30" IPS panels at 2560 x 1600 are easy to find, but hard to afford (good luck finding one under $1500 or so).

A nice analysis of Apple's recent "retina" displays by retinal neuroscientist Bryan Jones suggests that a pixel density of about 60 to 77 pixels per degree (0.78 to 1 arcminute per pixel) is fine enough that our eyes shouldn't be able to discern individual pixels. At cellphone viewing distances (a foot or so from the eyes), the iPhone 4's 0.078 mm pixel pitch beats this criterion, and anyone who's used one can testify that the 960 x 640 display is pretty easy on the eyes - not as good as printed material, but much nicer than most mobile screens. If we extend out to desktop monitor distances (3 ft / 1 m), these figures suggest we should be able to get away with 0.26 mm pixels. So we don't really need Nielsen's 1200 dpi (0.021 mm pixels) display; the 0.28 mm pixels of my Asus VW266 LCD, or the 0.26 mm pixels on my Acer AL2002, should suffice. Right?
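(Jones's criterion boils down to simple geometry: the coarsest pitch you can get away with grows linearly with viewing distance. A small sketch, using the distances and the 1-arcminute threshold from above:)

```python
import math

def max_pitch_mm(distance_mm, arcmin_per_pixel=1.0):
    """Largest pixel pitch subtending no more than the given angle."""
    return distance_mm * math.tan(math.radians(arcmin_per_pixel / 60))

print(f"{max_pitch_mm(305):.3f} mm at 1 ft")  # ~0.089; iPhone 4's 0.078 mm passes
print(f"{max_pitch_mm(914):.3f} mm at 3 ft")  # ~0.266; hence the 0.26 mm figure
```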

Well, not quite. Individual pixels are just a hair too small to easily distinguish, but fonts and most interface elements are anti-aliased to some degree. I can't see the individual pixels in these letters from a metre away, but I can certainly see the blurring around the letters from the anti-aliasing. Turn AA off, and the font rendering looks blocky and uneven. In order to effectively hide the grey blur of the anti-aliasing, we'd have to fit two pixels - not one - into the allotted 0.26 mm. Doubling the linear resolution quadruples the total pixel count; my 1920 x 1200 pixel Asus would have to be 3840 x 2400 pixels (0.13 mm, or 195 dpi) to do a reasonable job of hiding the fact that it's a digital display.
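(The same arithmetic with the doubling applied - my monitor's numbers, nothing more:)

```python
# Halving the pixel pitch to hide anti-aliasing blur.
w, h, pitch_mm = 1920, 1200, 0.26   # target pitch from the retina criterion
aa = 2                              # two device pixels per perceptual pixel
new_pitch = pitch_mm / aa
print(f"{w * aa} x {h * aa} pixels ({new_pitch} mm, {25.4 / new_pitch:.0f} dpi), "
      f"{aa * aa}x the pixel count")
# 3840 x 2400 pixels (0.13 mm, 195 dpi), 4x the pixel count
```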

A closely related issue is that, apart from the iPad's in-plane switching (IPS) display, most consumer-level (read: not obscenely overpriced) displays are based on twisted nematic (TN) LCD panels. TN panels have numerous flaws - their colour gamut is limited, their colours shift as the viewing angle changes - but they're relatively cheap. Therefore, they are everywhere.

In other words: the technology exists to create true-colour, sunlight-readable displays at a resolution high enough that they don't tire our eyes and that we can't tell they're made of discrete pixels. The technology exists to drive such a display with any images we care to throw at it, for a fairly reasonable price. The technology to make optimal use of such displays is already built into all common operating systems.

What's holding things back? All I can think of is that the market for displays has stagnated; there's been no substantial progress in display quality in half a decade. The excitement's in mobile these days, and that's where the skilled engineers must have gone. Samsung, inexplicably, has 33 distinct, but nearly identical, 1920 x 1080 monitors from 21.5" to 24". I don't see a need for more than six of those - three sizes each of CCFL- and LED-backlit TN. Or look at ASUS's equally long, repetitive list. Or that of any other display maker.

It would be quite interesting if one of these display makers were to take all the engineering talent and manufacturing resources that are currently spread out over a huge, overlapping model range and focus on making something that's actually better than the competition. Find a way to mass-produce IPS panels cheaply; I can think of no good reason why it couldn't be done. Use that technology to make huge quantities of perhaps four sizes of display, with pixel pitches fine enough (0.1 mm would be great, 0.15 mm would suffice for starters) to blow the competition out of the water. Pick a price point that puts you among the better consumer-grade TN panels in that size. Watch everyone abandon your competitors in favour of your slightly pricier, vastly superior product. Wallow in the river of money that comes your way. Samsung: I dare you to try it.


Comments

HD+ monitors

Nice poke in the eye, Matt.

There's a bit of a bypass on the salient argument that manufacturers aren't in the business of producing the best monitor possible. They are in the game of producing enough good monitors that a massive buying public can obtain reasonably cheaply (as they perceive it), and slamming them out fast enough to saturate the marketplace. Then, and only then, is it prudent to introduce a better machine and repeat the cycle, for the eternal sales nirvana that comes from a continuously humming production line.

As a cameraman, director and producer of electronic media productions, I have access to the best monitors you can imagine. You are right, they are out there and they are quite expensive, relatively speaking. It's very tough, though, to push a wildly great-looking image into the living rooms and design shops of North America at a $4K+ price tag, when a suitable picture is out there in Blu-ray-style 1080HD.

We'll see it come to be when the public decides if they really need to have 3D and some fresh standard emerges. The prices will tumble as production lines tool up. Personally, I do have a 55" 1080HD flat screen for the living room and I do not see the need for 3D. I suspect that there is still a heavy load of conventional 4:3 video screens out there, still working just fine, and a public that will not bother to suit up for a better flat panel until the existing things go kablooey.

I'm holding out for 1080HD quality holographic TV, but don't expect to see it before I croak.

Laters,

Chris

Good point on 3D


I tend to agree with you on 3D, Chris. Perhaps when the producers start doing it correctly, it'll take off. But as long as certain studios keep trying to digitally re-process 2D footage into headache-inducing artificial 3D, instead of investing in a proper twin camera setup, I suspect it'll remain a gimmick.

Frankly, I just want resolution high enough that I don't see the individual pixels or the anti-aliasing fuzz around letters. That means finer pixel pitch and higher pixel counts. And I'd like it without the excessive colour distortion that TN panels are notorious for. But why develop better technology when you can still turn a profit by competing on volume with eight-year-old tech?
