Understanding the difference between resolutions on screen and TV

The scenario: You’re on a shopping spree and suddenly decide it’s time to upgrade that old teletube. So you walk into the nearest store, go up to the salesperson and ask for a spanking new television. To your utmost horror, they reply with terms such as “resolution”, “pixels”, “pixel density”, “Full HD”, “HD Ready” and so on and so forth.

The result: An absolutely blank look on your face, akin to someone having thumped you on the head with a bat. Mouth slightly ajar and brow furrowed, you attempt to make sense of all the terms that have just been added to your vocabulary.

The solution: Look no further. In this article lies the knowledge you seek. Everything you need to understand about what all these terms mean and what they do is given in this sacred text. Read it and arm yourself with the power of knowledge needed to thwart the salesperson’s relentless onslaught of tech terms.

 

What is Resolution?


This is the bread and butter of HDTVs and everything related to them. If you’re a gamer, you’ve probably come across this term while glancing through your video game’s graphics settings. The primary reason images and videos look so much sharper and clearer on an HDTV (High-Definition Television) than on a regular TV is its higher resolution.

In today’s world of digital TVs, resolution is measured in pixels. Short for “picture element”, a pixel is the smallest bit of data in a video image. As pixels get smaller, more of them fit in the same screen area, which increases picture resolution; simply put, the more pixels a screen has, the higher the resolution it can offer. Old CRT televisions had the equivalent of roughly 300,000 pixels. Fast forward to the present, where a standard HDTV has between one and two million pixels (up to roughly six times more). All this adds up to a huge leap in picture quality.

 

Picture Resolution – There’s more?

When we talk about picture resolution, we’re actually talking about two things: the resolution of your TV’s screen and the resolution of the video source, which could be your DVD player, cable box, Blu-ray player, or even a movie file on an external storage device such as a pen drive or USB OTG (On the Go) drive. Both are important, and each affects the other in determining the quality of the picture you see. Let’s take a closer look at each so you know how they relate, and how to get a good high-resolution picture.

TV screen resolution

Almost 90% of today’s HDTVs are “fixed-pixel displays,” meaning their screens use a fixed number of pixels to produce a picture. That includes flat-panel LCD and plasma TVs, as well as front- and rear-projection types that use DLP (Digital Light Processing), LCD (liquid-crystal display), or LCoS (liquid crystal on silicon) technology. All of these fixed-pixel displays have a native resolution that tells you the maximum level of image detail the TV can produce.

Two of the most common native resolutions are 768p and 1080p, though you may also see 720p. You may see these same resolutions listed as “1366 x 768 pixels” or “1920 x 1080 pixels.” That tells you precisely how many pixels the screen actually has: the first number is the horizontal resolution and the second is the vertical resolution. Multiplying the two gives you the screen’s total pixel count. For example, 1920 x 1080 = 2,073,600 pixels, which is usually simplified to “2 million.” By comparison, 1366 x 768 = 1,049,088 pixels, slightly over one million.

As resolution increases, the pixels get smaller, allowing much finer picture detail to be accurately displayed.
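If you’d like to see that arithmetic spelled out, here is a minimal Python sketch of the pixel-count calculation described above (the resolution labels are just for illustration):

```python
# Total pixel count = horizontal resolution x vertical resolution.
resolutions = {
    "HD Ready (1366 x 768)": (1366, 768),
    "Full HD (1920 x 1080)": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")

# Output:
# HD Ready (1366 x 768): 1,049,088 pixels
# Full HD (1920 x 1080): 2,073,600 pixels
```

Notice how Full HD packs roughly twice the pixels of a 768p screen, which is exactly why it can resolve noticeably finer detail.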

Now that we’ve got that covered, let’s move on to video source resolution. Then we’ll explain what “i” and “p” mean when you see one of those letters next to a resolution number.

Video source resolution

The two most common high-definition video source resolutions are 720p (1280 x 720) and 1080i (1920 x 1080). All HDTV broadcasts, including local over-the-air broadcasts, satellite and cable signals, use one of these formats. 1080i is the more common of the two, but both formats have their benefits and limitations:

1080i has more lines and pixels to show more detail, so it’s great for slow-moving programs with lots of close-ups, such as Law & Order or nature documentaries on the Discovery Channel. But the “i” tells you it’s an interlaced format, which draws each frame in two passes of alternating lines. That means fewer complete video frames per second, so it doesn’t handle fast-moving video as well as 720p.

The “p” in 720p tells you it’s a progressive-scan format, which means it presents fast-moving action much more cleanly. It’s ideal for things like sports and action-packed video games.
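To make this trade-off concrete, here is a rough back-of-the-envelope comparison in Python. The 60 Hz timing is an assumption based on typical broadcast practice, not something stated above: 1080i transmits 60 half-height “fields” per second, which works out to 30 complete frames, while 720p transmits 60 complete frames.

```python
# Pixels per complete frame versus complete frames per second,
# assuming typical 60 Hz broadcast timing (an assumption for illustration).
formats = {
    "1080i": (1920, 1080, 30),  # 60 fields/s = 30 full frames/s
    "720p": (1280, 720, 60),    # 60 full frames/s
}

for name, (width, height, frames) in formats.items():
    print(f"{name}: {width * height:,} pixels/frame at {frames} full frames/s")

# 1080i: 2,073,600 pixels/frame at 30 full frames/s
# 720p: 921,600 pixels/frame at 60 full frames/s
```

In other words, 1080i spends its bandwidth on detail per frame, while 720p spends it on updating the whole frame more often, which is exactly why fast motion looks cleaner on 720p.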

2012 and the release of UHD (4K)

2012 saw the release of the UHD or Ultra High Definition standard (also called 4K), which boasts a jaw-dropping 3840 x 2160 pixels. In other words, four times the pixel count of the market-dominating Full HD standard. Sony, Philips and Samsung, some of the market leaders, embraced the new standard and released their own line-ups of 4K-compatible televisions, and other manufacturers soon followed suit. The detail level on these sets is phenomenal, with viewers able to pick out the most minuscule of details in true-to-life, vivid, sharp picture quality.
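If you want to sanity-check that “four times” figure, the arithmetic is a one-liner:

```python
# UHD versus Full HD pixel counts.
uhd = 3840 * 2160       # 8,294,400 pixels
full_hd = 1920 * 1080   # 2,073,600 pixels
print(uhd / full_hd)    # 4.0, exactly four times the pixels
```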

The downside is that these TVs are not quite wallet-friendly, with the cheapest (according to tech site Gizmodo) costing around USD 1,000. The Samsung UN85S9VF, an 85” model, will set you back around USD 3,999.99 (with free shipping, though).

 

The bottom line

Today’s digital TV displays are nearly all effectively progressive-scan, so interlaced and progressive are mostly relevant when describing video source signals sent to the TV. The main thing to remember is that a progressive signal has twice as much picture information as an equivalent interlaced signal, and generally looks a little more solid and stable, with on-screen motion that’s more fluid.


Also, the higher your resolution, the sharper your image – provided, of course, you have content to match. 1080p is sharper than 720p, though you might have some trouble telling from a distance.

What happens if your TV and video source have different resolutions?

This scenario actually happens all the time, and fortunately with today’s HDTVs, you don’t really need to worry about it. Whether the resolution of your video source material is low (VHS), medium (DVD), or high (HDTV), a fixed-pixel TV will always automatically convert or scale the video signal to fit the screen’s native resolution. Scaling lower-quality signals to fit a TV’s higher-resolution screen is often called upconversion. Upconversion works great with a good source like DVD, but it can’t make snowy analog antenna reception or a noisy cable picture look flawlessly crisp and clear.

Similarly, if the incoming source has more pixels than the screen’s native resolution, the video signal has to be “downconverted.” That’s one of the reasons 1080p TVs are so popular: they can display every pixel of every available high-def resolution, so they never have to throw any detail away. But if you don’t get a 1080p TV, don’t worry; downconverted video can still look great. The best example is 1080i HD broadcasts downscaled for viewing on 768p TVs.
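As a purely conceptual illustration of what a scaler does, here is a minimal nearest-neighbour resampling sketch in Python. Real TV scalers use far more sophisticated filtering and motion processing, so treat this as a toy model only:

```python
# Map each pixel of the target (native) grid back to the nearest
# pixel of the source image: the simplest possible upconversion.
def scale(source, native_width, native_height):
    src_height, src_width = len(source), len(source[0])
    return [
        [source[y * src_height // native_height][x * src_width // native_width]
         for x in range(native_width)]
        for y in range(native_height)
    ]

# Upconvert a tiny 2 x 2 "image" to a 4 x 4 native grid.
tiny = [[1, 2],
        [3, 4]]
for row in scale(tiny, 4, 4):
    print(row)

# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

The same function handles downconversion too: ask it for a grid smaller than the source and it simply skips pixels, which is the crudest way detail gets thrown away.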

 

What’s your resolution?

Well, there you have it. All you need to know about resolutions and screen sizes and pixels and cookies (one should always be well versed in cookies). Now for the big question: do you NEED UHD? Well, it’s a simple case of different strokes for different folks. In my honest opinion, I see no practical sense in splurging on a 4K TV when our local television content is still stuck at 480p (at worst, 360p). Apart from bragging rights, it feels like a huge money hog that could be put to some other worthwhile purpose. A Full HD TV is quite sufficient and starts at a moderate LKR 60,000, which is not too bad considering you get the full 1080p experience.

Till we meet again.
