Good monitors for video editing (and here I am including all of cutting, 2D and 3D CGI, VFX, colour grading, compositing and reference work) share a lot of the qualities desirable for still image editing, of course, but there are some key specifications and features particular to the moving image domain.
This information is targeted at those wanting to set up a professional level edit suite for general creation, editing, grading and post production work. (You don't actually have to be a professional, just have the desire to work to a professional level!). In this article we focus more on video cutting and grading than on creation, but most of the points apply to both creating new work and editing it.
Note that until recently, really high end video editing work was typically done on so-called 'broadcast' level monitors - typically with price tags in the order of $30k+ each! But times are rapidly changing, and outfits even at the top of the game are now often using the sorts of monitors we'll be discussing in this article instead - for television, computer animation and even motion picture work. For example, we're told that Weta Digital is now primarily using EIZO screens.
If you're creating, it's generally best to create and preview at the resolution of the final output - that is if you're creating for 4K, then you probably want to work at UHD/4K resolution.
However for editing/cutting, it's essential to understand that you can, and typically do, edit a reduced quality proxy version of your video (AKA 'offline editing') - whilst the final master is produced from the full quality stream.
For example, you can edit in 1080p on a 60Hz monitor whilst still ultimately mastering and outputting (i.e. rendering) a 4K/24p stream. There are many workflow speed advantages to this approach, as your computer has to process and display considerably less data. It also means you can fit more on your screen - e.g. on a typical 2560 by 1440, 27 inch desktop monitor you can show the video stream at 1080p and still have room for palettes, timelines etc.
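The screen real estate arithmetic above is easy to check. A quick sketch, using the 2560 by 1440 monitor and 1080p proxy figures from the text:

```python
# Screen real estate when previewing a 1080p proxy on a 2560 x 1440 monitor,
# with the video shown at 1:1 pixel mapping.
monitor_w, monitor_h = 2560, 1440
video_w, video_h = 1920, 1080

side_column = monitor_w - video_w    # width left over beside the video
bottom_strip = monitor_h - video_h   # height left over below the video

print(f"Side column for palettes: {side_column} px wide")    # 640 px
print(f"Bottom strip for timeline: {bottom_strip} px tall")  # 360 px
```

A 640 pixel column and a 360 pixel strip is ample room for palettes and a timeline - something a 1920 by 1080 monitor previewing the same proxy cannot offer.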
Therefore your edit suite doesn't necessarily have to fully reflect the final, mastered output in true, full resolution form, when editing.
This leads directly into the next point....
On the flipside, high end grading work is usually done on the full quality stream (AKA 'online editing') - i.e. 4K is graded on 4K screens, 1080p on screens of at least 1080p. This is not always the case, but at the high end that's the common practice - and of course that means for 4K grading work you need a very powerful machine with excellent data throughput to cope.
As with still image work, if you're faced with a choice between more pixels (i.e. 4K) support and more colour accuracy - you should choose better colour accuracy every time. The same goes for more pixels versus HDR - study after study has shown that human perception of image quality is much more about colour & contrast than it is about resolution.
Again, what you're previewing and working on often isn't the full quality stream that will ultimately be mastered, but it IS critical that what you see represents the video colour as accurately as possible - ideally in a way that best reflects how the final output will be seen - so that you can make the best quality editing decisions as you work.
Small, and common, problems in your screen that don't affect normal computer use - e.g. an inaccurate white point (screen too warm or cool), or an inappropriate black point (the contrast of your editing screen not accurately reflecting what your viewers will ultimately see) - can have a drastic effect on the decisions you make while video editing, and can lead to very costly mistakes.
Thus colour accuracy is, just as with still image work, the key thing to look for in a monitor for this sort of work.
Almost all decent quality monitors these days can display the full Rec.709 colour space, but the world is rapidly moving beyond this traditional standard gamut. The emerging standard of most interest is probably DCI-P3 - the digital cinema colour standard - and modern monitors/projectors can often now display nearly all of this colour space. There is also the Rec.2020 standard, the much wider gamut for Ultra HD Premium HDR TV - although no screens yet can display this enormous colour space in full.
Whilst most TVC work, for example, is still largely focused on Rec.709, it is increasingly desirable to originate and work in a wider space suitable for output in multiple contexts, including cinema. Work can always be compressed down to Rec.709 if need be, but master streams are ideally recorded, edited, and stored in the wider gamut.
Colour gamut is a fundamental, fixed physical property of the monitor, so make sure you buy a monitor that supports the gamut you plan to work with. Wide gamut monitors can emulate narrower gamuts very accurately these days, so in general we recommend you look to wide gamut panels. The best monitors can now display almost 100% of the DCI-P3 space.
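To give a feel for how much bigger these gamuts are, here is a small sketch comparing the three standards as triangle areas in CIE 1931 xy chromaticity space, using each standard's published primary chromaticities. Note this is an illustrative geometric comparison only - marketing coverage figures are often quoted in other spaces (e.g. CIE 1976 u'v'), which yields different percentages.

```python
# Compare colour gamut sizes as triangle areas in CIE 1931 xy space.
# Primary (x, y) chromaticities are the published values for each standard.
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, ap3, a2020 = (triangle_area(*g) for g in (REC709, DCI_P3, REC2020))
print(f"DCI-P3 is ~{ap3 / a709:.2f}x the xy area of Rec.709")
print(f"Rec.2020 is ~{a2020 / a709:.2f}x the xy area of Rec.709")
```

By this measure DCI-P3 is roughly a third larger than Rec.709, and Rec.2020 is nearly twice its size - which is why no current panel can cover Rec.2020 in full.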
Once again competing interests have taken a relatively simple concept - High Dynamic Range support - and turned it into a fractured, difficult to understand quagmire of acronyms.
HDR is, collectively, an expansion of the available range of both contrast and gamut on screens, to improve visual quality and get monitors to display more of what the eye can see, in a natural way.
It is important to differentiate between HDR viewing systems, and HDR editing systems. It's much easier for a screen to hit the viewing standard for HDR than the editing standards. Mostly when monitors talk about HDR support, they're talking about HDR viewing support.
There are competing standards appearing - Dolby Vision, HDR-10 (and HDR-10+), and Hybrid Log-Gamma (HLG) are the frontrunners and it remains to be seen which standard(s) will triumph in the market. Fortunately, it's often the case that support for other standards can be added by firmware updates to machines, rather than having to replace your hardware (although of course this is never guaranteed to happen with any particular screen).
For an editing monitor to be HDR, in a nutshell, it must go well beyond the viewing standards - offering sustained high peak brightness, very high contrast, wide gamut coverage (most of DCI-P3 at minimum), and true 10 bit display.
You will also need an up to date, HDR supporting video card (mid to high end cards from about 2016 on tend to be HDR ready). Nvidia cards from the 900 series onwards are HDR ready, and all Pascal-based cards are certified as HDR ready as well. On the AMD side, the 390X and the Polaris lineup were the first HDR-capable cards.
Good video editing monitors really need to support direct hardware calibration - this means the calibration work is done in the monitor's hardware and is applied to ALL images the monitor displays. Software calibration, where the video card's LUTs are used to modify the signal going to the monitor, works reasonably well for still image work, but those LUTs are often bypassed in the display of video, meaning there is often very little benefit to software calibration for video work.
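To make the distinction concrete, software calibration amounts to pushing a per-channel 1D correction table into the video card. A minimal sketch of what such a table does, using a hypothetical correction for a panel whose native gamma is darker than the target:

```python
# A minimal sketch of per-channel 1D LUT correction - the kind of adjustment
# a video card applies under software calibration. Real GPU LUTs are higher
# precision, and hardware (in-monitor) calibration applies equivalent
# corrections inside the screen itself, to every signal path including video.
LUT_SIZE = 256

def build_gamma_lut(exponent=2.2 / 2.4):
    # Hypothetical correction: remap values so a panel with native gamma 2.4
    # behaves like the gamma 2.2 the colour profile expects.
    return [round(255 * (i / 255) ** exponent) for i in range(LUT_SIZE)]

def apply_lut(pixel, lut):
    r, g, b = pixel
    return (lut[r], lut[g], lut[b])

lut = build_gamma_lut()
print(apply_lut((128, 128, 128), lut))  # mid grey is lifted slightly
```

The problem described above is that hardware-accelerated video playback often writes to the screen without passing through these card LUTs at all - which is exactly why in-monitor calibration matters for video work.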
All our monitors have Direct Hardware Calibration support listed in their specs - make sure you choose a screen where this specification is ticked! And remember, you'll need a calibrator as well as the screen. We generally recommend the i1Display Pro, which tends to be the most compatible with third party video calibration packages - it's really the industry standard calibrator.
You may be aware of the 3D LUT approach to video calibration offered by some CMS and video software. It's beyond the scope of this article to cover this in depth, but in many cases you can upload such a 3D LUT to the monitor - check your software AND monitor model carefully though, as a monitor may use a 3D LUT for its own calibration system without necessarily letting you upload your own video calibration 3D LUT.
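For the curious, here is a sketch of what applying a 3D LUT involves: the input RGB value indexes into a small lattice of measured output colours, and the result is interpolated from the eight surrounding lattice points (trilinear interpolation is shown here; some systems use tetrahedral). The lattice size and identity contents below are illustrative only.

```python
# Sketch of 3D LUT application with trilinear interpolation.
# Real calibration LUTs typically use 17^3 to 65^3 lattice points.
N = 17  # lattice points per axis (hypothetical size)

def identity_lut(n=N):
    # An identity 3D LUT: each lattice point maps to itself (0..1 floats).
    step = 1.0 / (n - 1)
    return [[[(r * step, g * step, b * step)
              for b in range(n)] for g in range(n)] for r in range(n)]

def apply_3d_lut(lut, rgb, n=N):
    # Scale 0..1 input to lattice coordinates and find the lower corner.
    pos = [c * (n - 1) for c in rgb]
    i0 = [min(int(p), n - 2) for p in pos]   # lower lattice index per axis
    f = [p - i for p, i in zip(pos, i0)]     # fractional part per axis
    out = [0.0, 0.0, 0.0]
    # Blend the eight surrounding lattice points, weighted by distance.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                corner = lut[i0[0] + dr][i0[1] + dg][i0[2] + db]
                for ch in range(3):
                    out[ch] += w * corner[ch]
    return tuple(out)

print(apply_3d_lut(identity_lut(), (0.25, 0.5, 0.75)))
```

Because every output channel can depend on all three input channels, a 3D LUT can correct hue and saturation crosstalk that the simple per-channel 1D LUTs used in software calibration cannot.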
Most monitors, when calibrated, lose some of their maximum available contrast. And nearly all colour accurate monitors use IPS panels, which inherently have lower contrast than PVA panels - it's the price you pay for the better colour accuracy.
A few monitors are able to keep this contrast over 1000:1 post calibration, as specified for DCI edit suites. This is a relatively new feature/ability in affordable desktop monitors.
A lot of other monitors come close to this figure when calibrated, so don't necessarily rule them out. But if you want a post calibration contrast of over 1000:1 - which is more indicative of how people will see your work in the cinema or on their home TV screens - then look for screens offering DCI True Blacks support. Again, we list this as a spec on all our monitor listing pages.
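Contrast ratio is simply measured white luminance over measured black luminance, so the 1000:1 figure is easy to sanity-check against your own calibration reports. The luminance numbers below are illustrative, not measurements of any particular screen:

```python
# Post-calibration contrast is measured white luminance over black luminance.
# Illustrative numbers: a screen calibrated to 100 cd/m2 white whose black
# point measures 0.09 cd/m2 still clears the 1000:1 DCI figure; one whose
# blacks lift to 0.12 cd/m2 after calibration falls short.
def contrast_ratio(white_cdm2, black_cdm2):
    return white_cdm2 / black_cdm2

print(f"{contrast_ratio(100, 0.09):.0f}:1")  # clears 1000:1
print(f"{contrast_ratio(100, 0.12):.0f}:1")  # falls short
```

Note how sensitive the ratio is to tiny changes in the black point - which is why calibration, which often lifts blacks slightly, costs monitors some of their quoted native contrast.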
In the TV world, 4K is somewhat of a misnomer - most 4K TVs are in fact UHD or 3840 by 2160 (i.e. double HD (1920 by 1080) resolution in both dimensions).
But the standard for cinema work is true DCI 4K, with a slightly different aspect ratio - i.e. 4096 by 2160. Again, only a few monitors support this. The extra space is certainly handy, but remember the stream you're working on is not usually the full stream, so you can still use a UHD (or even 1080p) monitor and ultimately render out to full DCI 4K.
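A quick comparison of the formats above makes the difference obvious - UHD keeps the familiar 16:9 shape of HD, while DCI 4K is slightly wider:

```python
# UHD versus true DCI 4K: same height, different width and aspect ratio.
formats = {
    "HD (1080p)": (1920, 1080),
    "UHD '4K'": (3840, 2160),
    "DCI 4K": (4096, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name:12s} {w} x {h}  aspect {w / h:.3f}:1")
```

UHD and HD both come out at 1.778:1 (i.e. 16:9), while DCI 4K is approximately 1.896:1 - hence the 'slightly different aspect ratio' and why a UHD panel cannot show a DCI 4K frame at 1:1.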
Desktop monitors almost all operate at 60Hz by default (some early 4K setups were limited to 30Hz). Most film work is recorded at 23.976/24p. 24p - really a historical quirk in terms of how it came to be - is still widely held to be a major factor in making modern cinema (and increasingly TV) 'cinematic' - that is, it is this timing that brings a large part of the unique 'film like' look of cinema.
In TV work, and depending on country, other timings such as 50, 60 or 48 are used.
This means there is almost always a translation going on (e.g. 3:2 pulldown) between the native timing of the recorded video stream and its display on the monitor. Whilst theoretically not ideal, this rarely presents issues during editing - it's much more relevant at the playback end of things, where the wrong timing can have a significant and unpleasant visual effect. The notorious 'soap opera effect' on TVs, for example - seen when they are not displaying 24p work at its native timing and instead use 'frame interpolation' to smooth out the motion - makes everything look like it was recorded on a cheap handycam.
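The classic 3:2 pulldown translation mentioned above can be sketched in a few lines: each pair of 24p film frames is spread across five video fields (three, then two), so 24 frames per second become 60 fields (30 interlaced frames) per second:

```python
# 3:2 pulldown sketch: 24p film frames mapped onto 60i video fields.
# Each pair of film frames is stretched across five fields (3 + 2),
# so 24 frames/sec become exactly 60 fields/sec.
def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        copies = 3 if i % 2 == 0 else 2  # alternate 3 fields, then 2
        fields.extend([frame] * copies)
    return fields

film = list(range(4))             # four 24p frames: 0, 1, 2, 3
print(three_two_pulldown(film))   # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

Because some frames persist for three fields and others for only two, the motion cadence is slightly uneven - one reason purists prefer displays that can show 24p at its native timing.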
Some monitors do support 24p input and display. This can be useful if you want to hook up a video device like a Blu-ray player, or see video at its native timing on your desktop, but it's generally inappropriate for everyday desktop use (animated elements of the UI will appear distinctly slow & jerky, for example).
Inputs are pretty easy to deal with. In modern times, you want an HDMI or DisplayPort input that supports the resolution and timing of the monitor and ideally 10 bit input (AKA True or Deep Colour).
Note that with many current monitors the HDMI input is limited to 30Hz at 4K, so generally speaking it's better to have a DisplayPort input that supports all the way up to 4K/60Hz.
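Some back-of-envelope arithmetic shows why 4K/60 strains older HDMI inputs. This is raw pixel data only - real links also carry blanking intervals and line-encoding overhead, so actual bandwidth requirements are somewhat higher than these figures:

```python
# Rough uncompressed data-rate arithmetic behind the 4K/60Hz HDMI limit.
# Raw pixel payload only - blanking and encoding overhead are excluded.
def raw_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K/30 10-bit: {raw_gbps(3840, 2160, 30, 10):.1f} Gbit/s")
print(f"4K/60 10-bit: {raw_gbps(3840, 2160, 60, 10):.1f} Gbit/s")
```

4K/60 at 10 bit needs roughly 15 Gbit/s of pixel data alone - beyond what HDMI 1.4-era inputs can carry, but within reach of HDMI 2.0 (18 Gbit/s) and DisplayPort 1.2+ class links, which is why the DisplayPort input matters.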
Most computers (including recent Macs, and PCs with appropriate video cards) can output a 10 bit signal now, so you want a monitor that can accept this higher integrity signal. We have more detail on 10 bit support elsewhere on our site.
Uniformity of the display is generally much less important for video work than for still image work, as small variations across the field tend to get lost in moving images. That said, we'd still much prefer a uniformity corrected model to one without - but it's relatively low on the list of concerns for video work, so look to the other aspects first.
We're very happy to offer personalised advice if you get in touch. This is often the quickest path to making the right decision for your new monitor.
You might also want to read our material (and we have a lot of it!) on still image editing monitors, as most of that information applies here as well. Here are a few of the main articles to get you started: