HD – short for High Definition – is everywhere these days, but is it really a big deal and what’s the difference between 720p HD and 1080p Full HD when it comes to television displays and video content?
Games consoles play HD games, all of the public service broadcasters (that’s the BBC, ITV, Channel 4 and Channel 5) have dedicated HD channels and streaming services like Netflix and Amazon Prime Instant Video play thousands of movies in HD on demand. Even pretty much every cheap-as-chips entry-level phone that hits the shelves these days has an HD screen, so you can enjoy high-def video on the move.
But what actually is HD, how many different types of HD are there, and what’s the difference between 720p HD, 1080p Full HD, Quad HD and 4K Ultra HD? Here’s our explainer guide.
What is HD?
So what actually is HD? HD, or High Definition, refers to the detail level of a screen; or to put it another way, the number of pixels that are packed into a display. A pixel is the smallest visible element on a display, the ‘dots’ that combine to make up the overall picture.
HD follows from standard definition (the level of detail in analogue colour TV that most of us grew up with), cramming in even more pixels in order to produce sharper, cleaner images when playing video.
Confusingly there are three different types of ‘HD’ resolution out there, so it’s worth knowing a bit more when shopping around for high definition TV sets and related gear. Just because a TV set or monitor has ‘HD’ slapped on the side, it might not be exactly what you want or need. So let’s take a closer look at 720p, 1080i and 1080p here, and what you need to know.
720p, 1080i and 1080p HD – What’s the difference?
720p, 1080i and 1080p are all versions of HD, but they’re all different. It’s important to note that you can’t actually buy a TV set with a 1080i display, for reasons which we’ll go into a bit later. So to begin with, we’ll just look at 720p and 1080p and the differences between those.
720p vs 1080p
In the analogue TV days, all UK TVs used the PAL (Phase Alternating Line) broadcast system, whose 576i signal corresponds to a digital resolution of 720 pixels wide by 576 pixels tall – also known as ‘standard definition’ or SD.
A 720p screen is 1280 pixels (wide) x 720 pixels (tall). That’s more than twice the detail of standard definition, which makes for reasonably sharp video playback on a standard TV. However, 1080p goes even further, racking up the pixel dimensions to 1920 x 1080 – five times the detail of SD.
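If you want to check the maths yourself, the pixel counts work out like this (a minimal Python sketch of the figures quoted above):

```python
# Total pixel counts for the resolutions discussed above.
SD = 720 * 576          # PAL standard definition
HD_720P = 1280 * 720    # 'HD Ready'
HD_1080P = 1920 * 1080  # 'Full HD'

print(SD, HD_720P, HD_1080P)            # 414720 921600 2073600
print(round(HD_720P / SD, 2))           # 2.22 - more than twice SD's detail
print(round(HD_1080P / SD, 2))          # 5.0 - five times SD's detail
```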
Over on our YouTube channel, we’ve taken a humorous look at the history of broadcast definitions which might help explain the differences between the two a little better – as well as prepare you for the next generation of high definition – but that’s another story.
Related: What are Super Hi-Vision, Ultra HD, 4K and 8K TV?

An HD TV with a resolution of 720p will only be able to display video at this resolution and no higher. So if you’re planning on playing HD games on your PlayStation 4 (which supports 1080p Full HD) or streaming the highest-quality movies from Netflix, you might want to avoid getting a 720p TV set.
That’s not to say that a PlayStation 4 or Netflix won’t work on a 720p set – you just won’t be able to get the absolute best performance.
Most 720p TV sets you’ll see in shops these days tend to be toward the cheaper end of the price spectrum and will be marketed as being ‘HD Ready’. This is because 720p is the absolute minimum required to meet this standard.
Most 1080p sets you’ll see will be marketed as being ‘Full HD’ or ‘True HD’, as the higher resolution gives you a richer, better-defined viewing experience.
Can a 720p TV play back 1080p Full HD video?
Related: Quad HD vs qHD vs 4K Ultra HD: What does it all mean?

Each TV set has what’s called its native resolution – the resolution its panel physically displays. This means a 720p set is best suited to displaying 720p HD broadcasts.
Every broadcast or format your TV receives will be displayed at its native resolution. So if your 720p set receives a 1080p signal from a broadcast, Blu-ray player or games machine, it will downscale it to fit on the screen. Similarly, any standard definition 576i broadcasts will be upscaled.
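The scaling decision described above boils down to a simple comparison against the set’s native resolution. Here’s a toy illustration of the idea (the function name and line-count inputs are made up for the example – this isn’t how a real TV’s scaler chip works):

```python
def fit_to_native(source_lines, native_lines):
    """Decide whether a signal is downscaled, upscaled or shown as-is.

    A simplified sketch: resolutions are compared by their line count
    (e.g. 1080 for Full HD, 720 for HD Ready, 576 for PAL SD).
    """
    if source_lines > native_lines:
        return "downscale"
    if source_lines < native_lines:
        return "upscale"
    return "native"

# A 720p set downscales 1080-line signals and upscales 576-line SD.
print(fit_to_native(1080, 720))   # downscale
print(fit_to_native(576, 720))    # upscale
print(fit_to_native(1080, 1080))  # native
```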
All HD channels from the BBC, Channel 4, Sky and Virgin Media for example are broadcast in 1080i. A 720p HD TV would then downscale this resolution to fit, while a 1080p TV set would be able to handle it natively. We’ll get on to the difference between 1080i and 1080p in a bit. Promise!
The real plus a 1080p TV set has over a 720p one is when it comes to watching Blu-ray movies. Blu-ray is a native 1080p format, so discs look their absolute best on 1080p TVs. If you’ve got a growing Blu-ray collection then the choice is clear – go for a 1080p set.
In terms of picture quality, on a 24-inch or 26-inch screen the difference is negligible unless you’re sat up close. Only on bigger screens (32 inches and above) can you start to appreciate the benefits of 1080p.
What is 1080i and how is that different to 1080p?
1080i and 1080p broadcasts both display images at the same pixel count as each other – 1920 x 1080. The big difference is in how images are made up on your TV.
The lowercase ‘i’ in 1080i stands for interlaced scan. The lowercase ‘p’ in 1080p stands for progressive scan.
Interlaced scan builds each frame from two ‘fields’ of alternating horizontal lines. Odd-numbered lines get painted on the screen first, then even-numbered lines, with each field drawn in 1/50th of a second on PAL systems. While this is incredibly fast and hard for the human eye to detect, it can create a flickering or ‘combing’ effect on fast-moving images, particularly live sporting events.
Progressive scan draws every line of each frame in sequence, from top to bottom, so the whole picture is refreshed in a single pass. This makes for a much smoother image overall that doesn’t suffer from these interlacing artefacts. What’s more, flat-panel displays like LCD, LED and plasma TVs (the most common types of HD TV) will automatically convert any incoming 1080i signal to 1080p. Good-quality TVs (generally the expensive ones from well-known brands) use clever processing to reconstruct the missing lines, but cheaper TVs won’t look as good.
Interlaced scan was introduced for analogue TV both as a form of data compression (only sending half the picture at any one time) and because old-school CRT (cathode ray tube) TVs couldn’t physically scan the screen fast enough to draw a full progressive picture.
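The splitting and weaving described above can be sketched in a few lines of Python. Here a tiny six-line ‘picture’ is split into odd and even fields and then woven back together, as a deinterlacing TV would do (a simplified illustration, not a real deinterlacing algorithm):

```python
# A tiny six-line 'picture', with lines numbered 1 to 6.
frame = [f"line {i}" for i in range(1, 7)]

odd_field = frame[0::2]   # lines 1, 3, 5 - transmitted first
even_field = frame[1::2]  # lines 2, 4, 6 - transmitted next

# Weave the two fields back into one full progressive frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = odd_field
rebuilt[1::2] = even_field

assert rebuilt == frame  # the woven frame matches the original
```

Each field carries only half the lines, which is why interlacing halved the bandwidth an analogue broadcast needed.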
How do 720p and 1080p HD TVs compare to Quad HD and 4K UHD TVs?
The big, new expensive TV tech is of course 4K Ultra HD screens, which cost thousands this time last year but can now be bagged for under £1000. Once again, the difference is mostly a case of screen resolution. Check out our HD vs Quad HD vs 4K UHD comparison to see what these terms all really mean, and what the difference is.
So, should you buy a 4K TV? Well, there are now more ways than ever to enjoy 4K video on your home telly, from the likes of Netflix and Sky TV. That said, the differences between 4K UHD and Full HD are only really noticeable when you put two screens side-by-side, and there’s no guarantee that 4K will survive too long. We’d say only invest if you have cash to burn.
Can an HD TV play back 4K movies and shows?
Yes. Just like a 720p TV playing back 1080p Blu-rays and streaming Full HD video, an HD television can also play 4K video without a problem. The video is simply downscaled to fit your TV and its pixel count. In fact, because 4K UHD content is typically encoded at a higher bit rate than HD video, chances are it’ll look great and play with a silky smoothness on your high-def TV.