8K resolution: Hype or benefit?

by Florian Friedrich, November 11th, 2019

TV makers and camera manufacturers want us to believe 8K is the next big thing. While more and more devices are launching, controversial discussions about the visibility of the resolution go hand in hand with discussions about what real 8K is. This article covers the most important aspects of 8K and delivers some insights about 8K metrics, 8K productions and what the human eye can see. I’ve also put together a separate article with frequently asked questions about 8K, based on questions I received from attendees after my presentations.
If you’re interested in getting some 8K material for free download, please see my post about NASA 8K footage and sign up to my free 8K newsletter, which will also give you free access to more 8K clips, such as a nice dinner clip (see YouTube preview) that is a good example of how smooth and clean 8K can look, as well as some Alexa 65 footage with an 8K test pattern at the end of the clip (see YouTube preview). And if you’re doing HDR and/or 8K post-production or quality analysis, please take a look at our HDRmaster 8K software or HDRmaster Toolset.

4K vs. 8K side-by-side comparisons during demos in 2019.

8K in the living room

Soon after home theater enthusiasts got used to bigger screen diagonals of UHD-TVs, we’re now quadrupling the resolution again with 8K. Does it mean we need even bigger TVs? And are our human eyes even capable of seeing the resolution differences?

Okay, slowly. First of all, say goodbye to the prejudice that you need to sit closer to the screen, or that you need some kind of extraordinary eyesight, in order to see the benefits of 8K. I’ll explain the technical reasons later in this article.
In my presentations this year, I have shown various content demos where people could see the differences between 4K and 8K from much farther away than the normal viewing distance. Many journalists and technology reviewers have noted similar things in their articles about 8K TVs: 8K is all about smooth, calm images and losing the pixelated look. We’ve been used to smartphones with “Retina” resolution for quite a while now, and there’s no reason why we shouldn’t enjoy the same kind of picture quality on our TVs.

33 Megapixels of 8K versus 8 Megapixels of 4K: a significant difference in pixel count.

Viewing distance and human visual acuity

One of the most obvious questions connected to 8K resolution on a TV is:

How far away should I sit from the TV?

When the question is about a good movie experience, it can be answered quite quickly: your TV should occupy the same field of view (FOV) that a cinema screen occupies from the center of the theater. According to theater standards and recommendations such as SMPTE EG-18-1994 and THX, this means approximately 45 degrees of FOV.
A FOV of about 60 degrees is generally considered the closest distance that still gives a good experience. Translating that into a home environment with a 16:9 TV, the sweet spot is between 2x and 3x the picture height of your TV. So you may roughly sit 2-3 meters (6.5 to 10 ft) away from a 75” TV.

The farthest distance to the screen should result in a FOV of no less than 30 degrees. You may use our online calculator for viewing distance and field of view to find your best seating position at home or to pick the best screen size.
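To make the geometry concrete, here is a minimal sketch of such a calculation. The 30 and 60 degree FOV bounds are the ones discussed above; treat this as an illustration, not a substitute for the online calculator:

```python
import math

def viewing_distance_range(diagonal_in: float,
                           min_fov_deg: float = 30.0,
                           max_fov_deg: float = 60.0) -> tuple:
    """Return (closest, farthest) viewing distance in meters for a 16:9 TV."""
    # Picture width from the diagonal: a 16:9 diagonal splits via Pythagoras.
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    # Distance at which the screen width spans the given horizontal FOV.
    def dist(fov_deg):
        return width_m / (2 * math.tan(math.radians(fov_deg) / 2))
    return dist(max_fov_deg), dist(min_fov_deg)

closest, farthest = viewing_distance_range(75)
print(f"75-inch TV: sit between {closest:.1f} m and {farthest:.1f} m")
```

For a 75” TV this yields roughly 1.4 m (60 degrees) to 3.1 m (30 degrees), which brackets the 2-3 m sweet spot mentioned above.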

The next question is essential:

How much resolution can the human eye resolve?

Scientists and eye doctors are pretty much on the same page regarding this question: a healthy human eye can resolve one arc minute, which equals 1/60 of a degree. This means we’re able to see two objects as separate when they are only 3 millimeters apart and we’re 10 meters away. This, by the way, does not mean we can’t see smaller objects such as stars. And some people see even better and can differentiate objects as close together as half an arc minute.

Unfortunately, many people have drawn incorrect conclusions from this value of one arc minute. They claim that a pixel grid never needs to be finer than one pixel per arc minute, which would lead us to 2.6K if we look at the TV from 2.5 times the picture height away.

1 Arc minute is 1/60 of a degree.
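These numbers are easy to check with plain trigonometry. A quick sketch (my own arithmetic, not the article’s calculator):

```python
import math

one_arcmin = math.radians(1 / 60)  # one arc minute in radians

# Two objects 10 m away, separated by one arc minute:
separation_mm = 10_000 * math.tan(one_arcmin)
print(f"1 arc minute at 10 m is about {separation_mm:.1f} mm")

# One pixel per arc minute on a 16:9 screen viewed from 2.5x the picture height,
# using the small-angle approximation (width / distance in radians):
fov_arcmin = math.degrees((16 / 9) / 2.5) * 60
print(f"Horizontal pixels at 1 px per arc minute: {fov_arcmin:.0f}")
```

Depending on rounding and whether the exact arctangent is used, the second figure lands in the 2.3K-2.6K range, which is where the often-quoted "2.6K is enough" conclusion comes from.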

Expecting one pixel per arc minute to be enough is a common miscalculation, but being common does not make it right, and I have multiple examples for you.

Example 1 of why 1 pixel per Arc Minute is not enough: The moon

If we want to find out how many pixels we need to recreate a realistic image without any pixelation effect, we need to pick an object that all of us see at the same size. An ideal example of such an object is the moon. The average size of the moon in our FOV is about half a degree, or a little more than 30 arc minutes. Even a supermoon is only 14% bigger, so it’s really a good example. Spending one pixel per arc minute to reproduce the moon would mean we’d only have 30×30 pixels for this well-known object. In my examples, I’m adding a few more pixels and making it 32×32 pixels (just to avoid some unnecessary discussions about the size). Anyone who takes the time to think about this will find it obvious that this kind of pixel resolution is not enough. The lack of detail and the amount of jagged edges in the 32×32 pixel image of the moon is simply not acceptable.
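The moon’s angular size is easy to verify from its physical diameter and distance (the figures below are standard astronomical values):

```python
import math

# Angular diameter of the moon: diameter ~3474 km, mean distance ~384400 km.
moon_arcmin = math.degrees(2 * math.atan(3474 / (2 * 384_400))) * 60
print(f"The moon spans about {moon_arcmin:.1f} arc minutes")

# At one pixel per arc minute, that leaves roughly this many pixels across:
pixels_across = round(moon_arcmin)
print(f"1 px per arc minute -> about {pixels_across}x{pixels_across} pixels")
```

This confirms the "a little more than 30 arc minutes" figure, and the resulting ~31×31 pixel budget that the 32×32 example rounds up from.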

So how many Pixels per Arc Minute do we need?

My answer is: at least 4 pixels per arc minute are needed on a display to avoid any pixelation effect.

Looking at the moon at 4 pixels per arc minute, or 128×128 pixels for our 32-arc-minute moon, this claim made sense to (almost) all the people attending my presentations and demonstrations in 2019. That said, this example is just another confirmation of my “4 pixels per arc minute!” claim, which I arrived at after doing some calculations. More on that will follow after diving into quantization.
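For reference, here is what the claim implies for a whole display, using the 30 and 45 degree FOV figures from the viewing-distance section (my own arithmetic, not the author’s formula):

```python
# Horizontal pixel count implied by 4 pixels per arc minute, for the two FOV
# figures given earlier (30 degrees for the farthest seat, 45 for a cinema-like seat):
px_per_arcmin = 4
counts = {}
for fov_deg in (30, 45):
    counts[fov_deg] = fov_deg * 60 * px_per_arcmin  # degrees -> arc minutes -> pixels
    print(f"{fov_deg} deg FOV -> {counts[fov_deg]} horizontal pixels")
```

8K UHD’s 7680 horizontal pixels roughly cover the farthest recommended seat; a cinema-like 45 degree seat would strictly demand even more.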

Example 2 of why 1 pixel per Arc Minute is not enough: Landolt-C

My next example is something eye doctors, and anyone who has taken a vision test, will appreciate: the Landolt C. It is a standard symbol for vision tests. Unlike the Snellen E, this symbol opens its gap not only horizontally and vertically, but also diagonally. It is well defined, so we know that at normal human visual acuity (visus 1.0, or 20/20 vision) the gap of the symbol equals one arc minute. Now, if it’s not already obvious to you that 1 px per arc minute is not enough for what you can see, let’s look at this symbol and adjust the pixel resolution.

If we have only one pixel per arc minute, what does this symbol look like?

The answer is: distorted, very distorted. When you have square pixels and try to show a diagonal structure of the same size as the pixels, it simply does not work. A vision test with this kind of pixelation would not be accepted, and we should not accept a display resolution of one pixel per arc minute if we want smooth images without pixel artifacts.

With 4 pixels per arc minute, or 8K resolution, the result looks much better and can be suitable for vision tests:

Sampling and quantization

The main reasons why viewers see pixelated images, jagged edges and flickering details can be explained easily by looking at some mathematical and technical basics.
Spatial artifacts are caused by the rectangular, evenly distributed structure of the pixels as well as by the sampling (spatial quantization). You don’t need an electrical engineering degree to know the underlying problem; all of us know the importance of proper signal sampling from digital audio. The sampling frequencies for digital audio are at least twice as high as our hearing ability (healthy ears can hear roughly 20 Hz to 20 kHz). No one today would accept a 20 kHz sampling frequency just because we’re not capable of hearing higher frequencies. Audio CDs are sampled at 44.1 kHz, soundtracks of movies usually at 48 or even 96 kHz, and high-end audio often at 192 kHz.

Transferring these numbers to video, we’d be looking at 4K to 16K resolution. What’s not yet accounted for in that comparison is the geometry of the rectangular pixels, where diagonal structures or curved edges can leave huge gaps between the pixels.
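As a rough illustration of that transfer, here are the oversampling factors hidden in the audio rates quoted above:

```python
# How far common digital audio rates exceed the ~20 kHz limit of human hearing:
hearing_limit_khz = 20
rates_khz = {"Audio CD": 44.1, "Movie soundtrack": 48, "Studio": 96, "High-end": 192}
factors = {}
for name, fs in rates_khz.items():
    factors[name] = fs / hearing_limit_khz
    print(f"{name}: {fs} kHz, {factors[name]:.1f}x the hearing limit")
```

Applying comparable factors of roughly 2x to 8x to the ~2K-2.6K "one pixel per arc minute" eye limit discussed earlier is what spans the 4K-to-16K range mentioned above.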

As a side note, projectors have an advantage in this respect: their rectangular pixels get slightly blurred, so these gaps close much better than on a TV or monitor, where the red, green and blue subpixels add up to a square picture element. At some cost to the finest details, this results in smoother images with fewer visible pixelation artifacts. As soon as an edge moves (especially slight movements close to the horizontal or vertical axis) and the image is not properly sampled, the resulting jaggies and flickering details are highly visible, even from much larger distances than the normal viewing distance. I spot this issue quite often in rolling text or advertisements during sports events, and I have seen various examples even in movies on UHD Blu-ray.

The problem is technically well known and easy to avoid: according to the Nyquist–Shannon sampling theorem, the sampling frequency for a signal should be at least twice as high as the highest frequency you want to cover. If you don’t sample properly, even slight phase changes (or, in this case, position changes of objects) will lead to massive aliasing effects. In addition to respecting Nyquist–Shannon, we also need to respect that the image is two-dimensional, with the diagonal factor on top of it. I wanted to bring all factors together in one formula, which I call FGRD, an acronym for Florian’s Geeky Resolution Demand. It can accommodate better human visual acuity as well as higher Nyquist factors, but even moderately calculated, I come to the conclusion that at least 7512 pixels are needed in the horizontal direction to make pixel structures disappear. Chroma subsampling is not even factored into this formula, so I think it’s fair to say we need at least 4 pixels per arc minute, or 8K, to get rid of visible pixel structures.
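The aliasing half of the argument is easy to demonstrate numerically. A sketch assuming nothing beyond the sampling theorem itself: a 9 Hz sine sampled at only 10 Hz (well below the required 18 Hz) produces sample values indistinguishable from those of a 1 Hz sine, so the reconstruction shows the wrong, low-frequency signal:

```python
import numpy as np

# Sample a 9 Hz sine at 10 Hz, i.e. below the 18 Hz Nyquist requirement:
fs = 10
t = np.arange(0, 2, 1 / fs)
undersampled = np.sin(2 * np.pi * 9 * t)

# The samples coincide exactly with a (mirrored) 1 Hz sine, the alias:
alias = np.sin(2 * np.pi * 1 * t)
print(np.allclose(undersampled, -alias))  # True
```

The same mechanism turns fine, moving picture detail into the jaggies and flickering described above when the pixel grid is too coarse.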

8K in Motion Pictures

Having worked with 4K+ and 8K content since 2015, I’m a big fan of acquiring and processing digital 8K motion pictures. Of course, you have to live with much bigger demands in terms of storage and computational power, but the quality nerd in me appreciates the visual benefits too much to mind.
Those benefits are similar to what 8K achieves on the display side: significantly reduced spatial artifacts, leading to smoother images with far fewer flickering details and jagged edges.
Most new motion picture cameras provide higher than 4K resolution, and we can safely say that many movies have now been shot in 8K. While RED currently leads the way in 8K cameras, Sony has various cameras with more than 4K resolution, such as the 6K Sony Venice, and even the resolution-conservative motion picture giant ARRI offers the large-format ALEXA 65 with 6.5K resolution.

Current post-production and delivery are mostly restricted to 4K, but it’s only a matter of time before we see more and more 8K productions. Among the many benefits 8K provides, you get some freedom to digitally stabilize or zoom into pictures without running into spatial artifacts. You’ll find a lot of additional details and thoughts in our LP/mm vs. camera sensor pixels calculator, which will also tell you how much resolution analog film had compared to digital sensors.

If you’re a filmmaker who can’t record in 8K resolution but wants to reduce aliasing effects, jagged edges or flickering details, you might want to use our FF Pictures Anti-Aliasing Plugin for Resolve, Premiere and After Effects.
It’s the best anti-aliasing solution I’m aware of, and our customers are very happy with it, having used it in several feature films already.

8K: The bottom line

The benefits of 8K can be very visible. In an ideal world, we’d see 8K content delivered in 8K to an 8K TV. But the funny part is that even if you only have an 8K TV and deliver lower-resolution content to it, you’ll still see benefits at normal viewing distances thanks to upscaling. 8K is visible!

