Is 4K Necessary in HDTV?


At CES this year I saw a number of 4K, or 4K x 2K (3840×2160), resolution TVs and projectors on display. The only time I’ve seen so many 4K displays under one roof was at InfoComm, a trade show that focuses on the professional, commercial and large-venue market (think Jumbotron), not the home market. A recent article by DisplaySearch analyst Paul Gray suggests there’s not much need for 4K displays (except in limited circumstances) and that several important issues could impede their adoption.

But 4K displays may be coming to the home, and you may have good reason to want one. Let’s take a look at some of the issues:


First, several companies have been actively showing 4K prototypes, and one, none other than Toshiba, even plans to market a product within the next 12 months. That product will be a glasses-free 3D TV, the big (roughly 55-inch) model it showed at CES this year. Sony, Panasonic, LG and Samsung all showed 4K resolution flat-panel displays at CES, though Toshiba so far is the only company with confirmed plans to get one to U.S. retailers.

Is 4K necessary?

The answer depends on a lot of things. First, there’s no 4K content. No broadcaster, Blu-ray disc or VOD service offers 4K content or has the ability to deliver it at this time. Let’s face it: 4K means a heck of a lot of data and bandwidth, and many delivery mechanisms are already taxed just trying to meet the requirements of 1080p, 3D and multi-channel audio.
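To put that bandwidth claim in rough numbers, here’s a back-of-envelope calculation of uncompressed pixel rates. This is an illustration only: real broadcast, disc and streaming bitrates are far lower thanks to compression, and the 60 fps and 24-bit color assumptions are mine, not from any delivery spec.

```python
def raw_gbps(width, height, fps=60, bits_per_pixel=24):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

hd = raw_gbps(1920, 1080)    # 1080p
uhd = raw_gbps(3840, 2160)   # 4K x 2K

print(f"1080p: {hd:.2f} Gbps uncompressed")   # ≈ 2.99 Gbps
print(f"4K:    {uhd:.2f} Gbps uncompressed")  # ≈ 11.94 Gbps
```

Four times the pixels means four times the raw data at the same frame rate and bit depth, which is why every link in the delivery chain feels the squeeze.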

Certainly packaged media could do it, but is there a reason to? As more content delivery moves to online distribution, consumers are relying less on physical discs. Of all the distribution methods, Blu-ray offers the best picture and audio, but as we’ve seen with digital music, that’s not always people’s top priority.

Can you see the difference? According to Gray, if you sit 10 feet from your TV you’d need a display of at least 55 inches to notice the increased resolution. The industry sells a fair number of TVs 55 inches and up, so that’s reasonable enough motivation. For a company like Mitsubishi, which focuses exclusively on big TVs, the added resolution may be just the thing to set it apart from competitors. What about projectors? Could a projector aimed at a 100-inch screen benefit from 4K? Possibly, but again there are issues.
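Gray’s size-and-distance claim boils down to simple trigonometry: whether a single pixel subtends an angle the eye can still resolve. A minimal sketch follows; the 16:9 screen geometry is standard, but the commonly cited acuity threshold of roughly 1 arcminute is an assumption, and individual eyesight varies, which is why different analysts land on different cutoff sizes.

```python
import math

def pixel_arcmin(diagonal_in, horizontal_px, distance_in):
    """Angular size of one pixel, in arcminutes, on a 16:9 screen."""
    width = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    pitch = width / horizontal_px                 # pixel pitch in inches
    return math.degrees(math.atan2(pitch, distance_in)) * 60

# 10-foot (120-inch) viewing distance, assumed for illustration
for diagonal in (42, 55, 70):
    hd = pixel_arcmin(diagonal, 1920, 120)
    uhd = pixel_arcmin(diagonal, 3840, 120)
    print(f'{diagonal}": 1080p pixel = {hd:.2f} arcmin, 4K pixel = {uhd:.2f} arcmin')
```

On any given screen, a 4K pixel subtends half the angle of a 1080p pixel; whether that half matters depends on where the viewer’s acuity threshold falls, which is exactly why the answer shifts with screen size and seating distance.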

Because there’s no 4K content, a 4K display would have to do some heavy processing on 1080p content. Video scaling of that sort can do a wonderful job, or it can introduce its own errors, or possibly highlight issues in the source that wouldn’t be apparent on a 1080p display. So far, the 4K displays I’ve seen looked good, but not mind-blowing.

What about cost? Any TV based on new or improved technology is going to cost more. We saw that when HD was introduced, when LED came along and then 3D. That’s something we should stop being shocked about. Of course 4K TVs would cost more. I’m OK with that as long as the value proposition of the increased cost is justified. With increased production and consumer adoption, costs eventually come down, and then manufacturers come up with something else innovative to command a premium price.

What about confusion? I remember attending a DisplaySearch conference several years ago, before any 1080p TVs were on the market or even publicly shown. At that time, the industry was still fighting over the benefits of 720p vs 1080i when an analyst predicted the introduction of 1080p TVs. Some attendees scoffed at the idea, said we couldn’t see the difference and that another resolution spec would just add confusion to the market and impede sales. We know how that worked out. History is a great tool for predicting things like this.

But back to the “is it necessary” question. I think the answer is yes. Among the premium models from the major TV makers, there really isn’t wide variation in picture quality. If you pick a TV from the top two lines of the top three or four brands, you’ll get a very good TV. One may have slightly better blacks while another has slightly more natural greens. You may like the smart TV menu of one brand over another, or the industrial design of one brand over another, but within reason, they’re all pretty good. The manufacturers know this, and it’s why they’re looking for something else. 4K may be that something else.

And here’s another reason 4K may be necessary: FPR. This year LG is introducing 3D TVs based on film patterned retarder (FPR) technology. These are the TVs that use passive, polarized glasses rather than battery-operated active-shutter glasses (LG also makes 3D TVs that use active-shutter technology). One issue some people have raised with this technology is that it reduces the full 1080-line resolution to 540 lines for each eye. Moving to a 4K panel could solve that issue (if it’s actually an issue, meaning, if the viewer can even notice). At a meeting I had with LG Display representatives at CES, the company said that a 4K version of their polarized 3D TV would likely be coming, possibly even this year.
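The halving complaint is easy to see with a toy model of line-interleaved passive 3D, where alternating panel rows are polarized for alternating eyes. This is a hypothetical sketch of the row split, not LG’s actual panel-driving scheme:

```python
def rows_per_eye(panel_rows):
    """Split a line-interleaved panel's rows between left and right eyes."""
    left = range(0, panel_rows, 2)   # even rows, polarized for one eye
    right = range(1, panel_rows, 2)  # odd rows, polarized for the other
    return len(left), len(right)

print(rows_per_eye(1080))  # 1080p panel: (540, 540) rows per eye
print(rows_per_eye(2160))  # 4K panel: (1080, 1080), full HD to each eye
```

A 2160-row panel delivering 1080 rows to each eye is precisely the argument for pairing 4K with passive 3D.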

As we know, 1080p came, and the world didn’t end. We settled on a terminology (full HD), video scaling improved, and content eventually became easily available. Now 3D is here with its early adopter issues and growing pains. I expect 4K will come to the consumer market and eventually will overcome its obstacles the same way every other innovation has. I just hope I don’t have to buy another Blu-ray player.

