The revolution in 4K resolution
Video quality is an important part of any visual experience, and resolution is one of the first specs people look for when choosing a new visualization system. In this blog series, we'll share all there is to know about the history of resolutions and the 4K revolution.
From SD to Full HD: steadily sharper and wider visuals
Over the years, rapid technological advances in content production, chip design and signal processing have steadily pushed resolution standards, and with them video quality, higher.
It started in the early days of cable television and continued through the late 20th century, when SD was the prevailing standard. SD, or Standard Definition, covers an area of 720 x 540 pixels, typically displayed in a 4:3 aspect ratio.
In the late 1990s and early 2000s, with the development of digital video, our typical screens became wider (16:9) and picture resolution became steadily sharper. That is when HD and Full High-Definition displays hit the shelves. Full HD systems feature 1920 pixels horizontally and 1080 pixels vertically, in a 16:9 widescreen aspect ratio.
Because it requires less bandwidth, Standard Definition is still used as the base-level resolution for broadcast and streaming services. But with audience expectations rising and High-Definition systems becoming ever more affordable, the relevance of SD is slowly fading. Full HD can be considered the current standard for commercial use cases.
4K: a dramatic increase in pixels
By the mid-2010s, electronic engineers had devised ways to optimize chip capabilities and dramatically increase the number of pixels in the same 16:9 space, bringing even more brilliance and clarity to visualization. Thus were born the "Ultra High Definition" (UHD) systems, with a resolution of 3840 x 2160 pixels: four times the pixel count, and detail, of Full HD.
When referring to the UHD resolution, the industry rounded the horizontal pixel count up to 4000, and it became popularly known as 4K. An important side note, however: this is different from native 4K, which denotes an even larger horizontal resolution of 4096 pixels!
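To make the "four times" claim concrete: doubling both dimensions quadruples the pixel count, so 3840 x 2160 gives 8,294,400 pixels against Full HD's 1920 x 1080 = 2,073,600. Here is a minimal Python sketch of that arithmetic, using the resolution figures from this article (the script itself is purely illustrative):

```python
# Pixel counts for the resolution standards discussed in this article,
# each compared to Full HD as the baseline.
RESOLUTIONS = {
    "SD (4:3)":  (720, 540),
    "Full HD":   (1920, 1080),
    "4K UHD":    (3840, 2160),
    "Native 4K": (4096, 2160),
    "8K UHD":    (7680, 4320),
}

FULL_HD_PIXELS = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:<10} {w} x {h} = {pixels:>10,} px "
          f"({pixels / FULL_HD_PIXELS:.2f}x Full HD)")
```

Running it also shows why the 8K resolution covered below is such a leap: 7680 x 4320 works out to sixteen times the pixel count of Full HD.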
Check out: What is the difference between native 4K and 4K UHD?
In the consumer world, marketing communications with shiny gold labels increasingly drive people to buy a 4K (UHD) TV. But demand for 4K has grown especially strong in a broad range of professional applications, from simulation to live events, meeting rooms and even medical environments, all seeking the same elevated level of native 4K quality.
The big 8K buzz
Enabling 4K projection was a major technological accomplishment by the chip manufacturers. Electronics lovers, though, have always yearned for more: more brightness, more clarity, more immersion – all made possible by more pixels. It is the inexorable march of innovation.
With the first 8K displays and projectors already rolling out, 8K technology, which equates to 7680 x 4320 pixels, has turned from a jaw-dropping fantasy into an intriguing and attainable aspiration. However, the main issue lies in the creation, distribution and application of actual 8K content. In fact, the content industry is still transitioning from (F)HD/WQXGA to 4K.