3D Ready, DisplayHDR 400, Flicker-Free, 10000:1. Reading the specification sheet of a monitor can quickly get confusing because of cryptic abbreviations and numbers just like these. Here, we set out to clarify some of them and describe in brief how they relate to an average user's everyday life.
The specifications are listed in no particular order, meaning that entries found higher up are not necessarily more important than those below them. Let's get into it.
In a nutshell, HDR means a complex backlighting system. An HDR-enabled screen has multiple backlighting zones, whereas simple TN, VA and IPS panels have to make do with a single zone. For the sake of simplicity, we can say that one large “bulb” powers a non-HDR-enabled panel, while each HDR-enabled panel is powered by multiple smaller “bulbs” that can be turned off individually when needed.
That makes an HDR screen able to display not just a variety of colors but also a variety of brightness levels. Imagine the night sky: Billions of stars glow brightly with darkness all around. An IPS panel would not be able to display that darkness in all of its glory; while the subpixels would do their best to block the light emitted by the backlight, they are not almighty. Now, a mini-LED screen would do a better job, since the efforts of the subpixels would be helped by the panel's controller dimming the LEDs (“bulbs”) in places where things are supposed to be pitch-black.
OLED screens are HDR-enabled by definition since their red, green and blue subpixels emit light themselves rather than simply blocking or allowing through the light emitted by the backlight. Turning off an OLED subpixel leads to a significant change in both color intensity and brightness.
A quality HDR panel makes watching movies and playing games that bit more enjoyable. Just please stay away from displays that are marked as DisplayHDR 400-compliant. Those do not have multiple backlighting zones and are therefore not really HDR-compatible, as is evident by looking at the table below.
We measure this in candelas per square meter (which is the same as nits) with the help of X-Rite's i1 Pro 2 monitor calibration device. Peak brightness is how bright the brightest spot gets at the 100% brightness setting when displaying pure white; the lowest brightness is how bright that same spot is at the 0% setting. The former is important for use in daylight and should thus be as high as possible, with 300 nits or more being optimal, while the latter matters when using the device in a dark room and should be as low as possible.
While HDR-enabled displays are capable of delivering rather high brightness for short periods of time, they won't be nearly as bright when displaying day-to-day apps such as a spreadsheet editor.
While brighter screens are easier to work with in most conditions, a decent contrast ratio makes colors pop and is thus just as desirable as high brightness. To calculate the contrast ratio, we set the display's brightness to 100%, then make it show a completely black image and measure how bright it gets in nits. This value is called the black level. The contrast ratio equals the screen's full-white brightness divided by its black level. Values higher than 1000:1 are considered optimal.
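Put into code, the calculation looks like this; the 350-nit and 0.30-nit figures below are made-up examples rather than measurements of any particular screen:

```python
# Contrast ratio = full-white brightness divided by the black level,
# both measured at the 100% brightness setting and expressed in cd/m² (nits).
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# Hypothetical panel: 350 nits on a white image, 0.30 nits on a black one
print(f"{contrast_ratio(350, 0.30):.0f}:1")  # -> 1167:1, comfortably above the 1000:1 mark
```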
In 2010, Devin Coldewey of TechCrunch was brave enough to proclaim the following:
“pretty much every screen in the house is going to be 3D-capable in a year or so”
This prediction of his never came true, mostly because the technology never caught up with the expectations of consumers. To recap, the idea behind most 3D-enabled screens and projectors is to serve slightly different images to the viewer's left and right eye, creating the impression of looking at three-dimensional objects rather than at a flat surface. This requires the use of unwieldy active glasses that, in sync with the current frame rate, allow only one of the two eyes to see the image at any given time. These glasses make people feel sick quickly, are rarely compatible with “normal” glasses, and have to be either charged often or tethered to a power outlet. Not exactly the definition of user-friendliness, right?
The other approach is to make use of head-tracking and eye-tracking cameras to shift the image on the display slightly whenever the user moves. This, too, has its downsides.
The most important point to consider is that to fully enjoy an expensive 3D-enabled TV or projector, proper 3D content is required, such as movies stored on 3D Blu-ray discs or well-optimized games.
At present, throwing cash at a 3D-enabled TV, monitor or projector is just not a good investment.
A display's sRGB coverage and its AdobeRGB/NTSC/P3 coverage are intimately intertwined with its color depth. 6-bit TN and IPS panels are cheap and rather widespread; these are limited to just 262,000 colors, corresponding to two-thirds of the sRGB gamut and 45% of the NTSC gamut. The latter is, for the purposes of this little article, nearly identical to AdobeRGB and DCI-P3.
8-bit IPS panels deliver 16 million colors, covering 100% of the sRGB gamut and three-quarters of AdobeRGB, NTSC and P3. 10-bit AMOLED panels deliver one billion colors to cover the entirety of AdobeRGB, NTSC and P3. Amusingly enough, super-expensive pro-grade monitors go all the way up to 14 bits.
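Those color counts follow directly from the bit depth: each of the three subpixels can take 2^bits intensity levels, so the total is (2^bits)³. A quick check:

```python
# Number of displayable colors for a given per-channel bit depth:
# each subpixel (R, G, B) has 2**bits levels, so the total is (2**bits)**3.
for bits in (6, 8, 10):
    colors = (2 ** bits) ** 3
    print(f"{bits}-bit panel: {colors:,} colors")

# 6-bit panel: 262,144 colors
# 8-bit panel: 16,777,216 colors
# 10-bit panel: 1,073,741,824 colors
```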
Is this important for the average consumer? It is, as colors look rather dull on 6-bit panels with their poor sRGB coverage. Nobody should waste their money on a display incapable of covering the full sRGB gamut.
A PPI (pixels per inch) value describes a screen's pixel density and thus how easy it is to make out what's on the screen from a normal viewing distance. Values between 80 and 100 are ideal for laptop displays and PC monitors; go higher than that without OS scaling, and things quickly start becoming too small to see. A 27-inch monitor resolving at 1920 by 1080 has a density of 82 PPI. Use sven.de to calculate the PPI value for your display in just a few seconds.
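If sven.de is not at hand, the underlying math is simple enough to reproduce yourself; a minimal sketch:

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by the diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 27)))  # -> 82, the 27-inch Full HD monitor above
print(round(ppi(3840, 2400, 17)))  # -> 266, the 17-inch laptop example below
```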
Apple started pushing for higher PPI values with the release of its iPhone 4 in 2010, and most other companies followed suit. Such screens require OS scaling for their contents to become easy to read; a 17-inch laptop with a 3840 by 2400 screen does not actually display most things at its native resolution, as the pixel density of 266 PPI is way too high. Instead, most apps and UI elements get rendered at something like 1536 x 960, and the OS then scales this by 250%. Supposedly, this approach makes fonts sharper and images more natural.
The same applies to phones. If the device has a screen resolution of 1440 by 720, it most likely renders almost everything at 720 by 360 or so and then stretches that to full screen.
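The relationship between the native panel resolution, the logical render resolution and the scale factor is easy to check; a minimal sketch using the two examples above (the 200% factor for the phone is an assumption made for illustration):

```python
# Logical (render) resolution = native resolution divided by the OS scale factor.
def render_resolution(native_w: int, native_h: int, scale_percent: int) -> tuple[int, int]:
    factor = scale_percent / 100
    return round(native_w / factor), round(native_h / factor)

print(render_resolution(3840, 2400, 250))  # -> (1536, 960), the 17-inch laptop above
print(render_resolution(1440, 720, 200))   # -> (720, 360), the phone example above
```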
The takeaway is that it is rarely worth paying extra for super-high resolutions and pixel densities. This is especially true for gamers, as every extra batch of pixels has to be paid for in fps.
This is how much time a screen takes to switch from one color to another and then back. (Some consider measuring just the first transition sufficient, but we usually take both into account.)
The ThorLabs PDA100A-EC is our tool of choice. For every screen that comes into our immediate vicinity, we measure GtG and BtW response times, that is, 50% grey to 80% grey to 50% grey and black to white to black. Most IPS screens achieve around 30 milliseconds, which is OK but nothing to write home about. For gaming and watching 60 fps videos, 15 ms or lower is required. OLED panels are ahead of the curve, delivering sub-1 ms response times.
The shorter the response time, the smoother everything happening on the screen looks to the human eye. This is more important than most users think it is.
Quite a few panels claiming to deliver high refresh rates, like 120 Hz, have response times so slow that they cannot even impeccably display 30 fps footage. Adaptive sync technologies such as Nvidia's G-Sync may bring some relief, but they can only do so much.
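A rough way to see why those numbers matter: a panel can only keep up with a given frame rate if a full color transition finishes within one frame, i.e. within 1000 / fps milliseconds. A minimal sketch, with the response times below being illustrative rather than measurements:

```python
# A transition must complete within one frame, i.e. within 1000 / fps milliseconds.
def keeps_up(response_ms: float, fps: int) -> bool:
    frame_ms = 1000 / fps
    return response_ms <= frame_ms

print(round(1000 / 60, 1))  # 16.7 ms per frame at 60 fps, hence the 15 ms recommendation
print(keeps_up(30, 30))     # True: a typical 30 ms IPS panel barely manages 30 fps...
print(keeps_up(30, 60))     # False: ...but not 60 fps
print(keeps_up(0.5, 120))   # True: a sub-1 ms OLED has plenty of headroom even at 120 Hz
```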
This is the ratio between the width and height of the display in pixels, expressed in the lowest whole numbers possible. Suppose we have a 3840 x 2400 screen before us. If we start dividing both numbers by 2, we get
- 1920 x 1200 → 960 x 600 → 480 x 300 → 240 x 150 → 120 x 75
There is no way to get a whole number by dividing 75 by two. So let's divide by 3 now:
- 40 x 25
Not again! All right, how about five?
- 8 x 5
Now we're talking. The screen's aspect ratio is 8:5. However, most people would go up a notch and call it 16:10, as this makes for an easier comparison with the most popular aspect ratio, 16:9.
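In code, this repeated division is simply the greatest common divisor, so the whole exercise collapses into a few lines:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    # Divide both dimensions by their greatest common divisor
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(3840, 2400))  # -> 8:5, usually quoted as 16:10
print(aspect_ratio(1920, 1080))  # -> 16:9
```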
It is said that 16:10, 3:2, 4:3 and 5:4 screens are best suited for work while 16:9, 18:9, and 21:9 screens are best for content consumption and gaming.
HDMI and DisplayPort video signals mostly consist of luminance data and color data for every individual pixel that is to be displayed on the screen. Of course, it's a lot more complicated than that, but let us just keep things simple for now.
Both interfaces have their limitations, the most important one being their bandwidth, or how many gigabits they can transfer per second. With many high-resolution, high-refresh-rate displays, it is easy to run out of that bandwidth.
That is when the question of what to save bandwidth on arises. Resolution, refresh rate and color fidelity are the three options to choose from. Since most users will be rather unenthusiastic about lowering the refresh rate, let alone the resolution, color fidelity is usually what gets sacrificed instead.
That means the display will be switched from the best chroma subsampling option, 4:4:4, to 4:2:2 or, worse still, 4:2:0. A lot of color information will get lost, and image quality will suffer significantly. Here is a short TechPowerUp post describing a few cases of this exact nature.
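To get a feel for the numbers involved, here is a rough, back-of-the-envelope estimate of the raw pixel data rate under the three subsampling modes. It ignores blanking intervals and link encoding overhead, and the resolution, refresh rate and bit depth used below are illustrative assumptions, not a statement about any particular interface:

```python
# Rough raw data rate: pixels per second × bits per pixel, ignoring blanking
# intervals and link encoding overhead. Chroma subsampling thins out the two
# color (chroma) samples while leaving luminance untouched:
#   4:4:4 keeps all chroma, 4:2:2 keeps 1/2, 4:2:0 keeps 1/4.
CHROMA_FRACTION = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}

def data_rate_gbps(width, height, refresh_hz, bits_per_channel, subsampling):
    luma = bits_per_channel                                   # one luminance sample per pixel
    chroma = 2 * bits_per_channel * CHROMA_FRACTION[subsampling]
    bits_per_pixel = luma + chroma
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Hypothetical 4K, 144 Hz, 10-bit signal
for mode in ("4:4:4", "4:2:2", "4:2:0"):
    print(mode, round(data_rate_gbps(3840, 2160, 144, 10, mode), 1), "Gbit/s")
# 4:4:4 ≈ 35.8, 4:2:2 ≈ 23.9, 4:2:0 ≈ 17.9 — dropping chroma frees up a lot of bandwidth.
# Compare these figures against the bandwidth of the HDMI or DisplayPort revision in use.
```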
The graphics driver of your PC or laptop is where you can check which subsampling mode is currently in use and which modes are supported by the monitor.