Instead of sitting in crowded airplanes, waiting in long lines at McCarran Airport in Las Vegas to catch cabs or Lyft rides, and racing around the Las Vegas and Sands Convention Centers for the better part of four days (walking several tens of miles along the way - no kidding!), we simply sat down at our laptops each morning, logged into a portal, clicked through numerous press conferences, and toured virtual booths.
To be honest, the experience was less than satisfying. It’s one thing to wear out the soles of your shoes trying to find the hidden gems at the show, pushing aside the PR and marketing blitz and doing some real detective work. But all that effort usually pays off.
Not this time. We could look no further than whatever press releases, still images, and B-roll video were provided to us by exhibitors. And it was near-impossible to restage ad hoc conversations with booth personnel to find out things that weren’t mentioned in the releases - to dig deeper and find the real news.
Back in the day, televisions were a big part of the CES experience. Some manufacturers used as much as 50% of their booth space to show off their latest innovations in display technology. That was understandable at a time when a premium large-screen television commanded four and five figures. Today? These same manufacturers make much more profit from refrigerators and washer/dryer combos than televisions, which are commodities now.
A quick check this morning on a national CE retailer’s Web site showed a 65-inch 4K LCD TV with high dynamic range support offered for less than $500, while a 75-inch 4K LCD model could be had for $1,000 and an 82-inch 4K LCD TV was ticketed at $1,300. Smaller 4K screen sizes like 43” and 55” clearly fall into the “consumables” category - buy one, use it for a year, and toss it without blinking an eye.
There are a number of reasons why 4K TV prices have collapsed from almost $300 per diagonal inch in 2012 (when they were first introduced) to around $7 - $8/inch today, and we don’t have the time or space to go into them. Suffice it to say that the major TV brands are shifting some of their focus to 8K models in an attempt to recapture some of that lost profit margin…but it doesn’t appear consumers are buying in all that much.
One explanation could be the price differential. While that first-tier 65-inch 4K set has a $500 price tag, an 8K 65-inch model is listed at $2,700 (a $500 price reduction, BTW!) - more than five times as expensive. Granted, that 8K model uses quantum dot technology to render HDR images, while the 4K model offers basic HDR support. But that distinction is lost on many buyers who are prioritizing screen size and price in their purchase decisions.
Another factor could be that most viewers sit no closer than 8-10 feet from their TV screen, and the added pixel resolution can’t be perceived at that distance. To see the physical pixel structure on an 8K TV, you need to be positioned within 18 inches (for 20/20 vision) of the screen. That distance doubles for a 4K set, but again, who sits 3 feet away from a 65-inch television?
Several new models of 8K TVs were indeed unveiled at the virtual CES, but the difference between 2021 and 2020 models is incremental. New ways to get more light through LCD panels to your eyes have been devised, including higher-density “mini” LED backlights and more zones for local dimming to improve HDR performance. On the OLED side, most 2021 enhancements have to do with design aesthetics, as there’s a practical limit to luminance levels with this technology. Needless to say, none of these new 8K models will come cheaply.
The real advantage of 8K resolution right now is in acquisition. Movies and TV programs can be filmed in 8K and down-converted to 4K and Full HD for distribution, and in fact will appear to have more detail than programs captured in native 4K. But the 4x multiplier in both image pixels and file sizes is a challenge to manage for 8K production: Consider that a single 8K video frame has 33 million pixels compared to the 8.3 million pixels contained in a 4K video frame, and you’ll see the problems in moving those pixels at a 60 Hz frame rate.
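That 4x multiplier is easy to verify with some back-of-the-envelope arithmetic. Here’s a quick sketch in Python that compares raw, uncompressed active-pixel data rates (blanking intervals and link coding overhead are ignored):

```python
def raw_rate_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed active-pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

uhd_pixels = 3840 * 2160      # ~8.3 million pixels per 4K frame
full8k_pixels = 7680 * 4320   # ~33.2 million pixels per 8K frame

print(full8k_pixels // uhd_pixels)            # exactly 4x the pixels
# 8-bit RGB = 24 bits per pixel:
print(raw_rate_gbps(3840, 2160, 60, 24))      # ~11.9 Gb/s for 4K/60
print(raw_rate_gbps(7680, 4320, 60, 24))      # ~47.8 Gb/s for 8K/60
```

Nearly 48 Gb/s of raw pixels every second, before a single frame of blanking or overhead is added - that’s the logistics problem 8K production has to solve.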
The differences between televisions and display monitors are insignificant these days, mostly found in the number and types of interfaces and the design of power supplies, mounts, and bezels. The AV industry started moving to 4K displays a few years ago as production of FHD models was declining. Signal switching and distribution for these products isn’t that much of a challenge, given the preponderance of HDMI 2.0 and DisplayPort 1.2 interfaces and signal management products that support them.
It’s not quite that simple with 8K. Refreshing an 8K display signal at 60 Hz - one with 8-bit RGB color - requires a signal highway that can support sustained rates of at least 72 gigabits per second (Gb/s). That’s four HDMI 2.0 connections running simultaneously! And, although we’re seeing increased support for the faster HDMI 2.1 standard in the CE world, it’s going to have a much slower path to adoption in commercial AV applications…and it’s still not fast enough for the 8K/60 4:4:4 example just cited.
Given that proponents of 8K video have permanently linked high dynamic range to the format, it’s almost pointless to try to calculate an 8K signal variant that we can stuff through an HDMI 2.0 port. (For those playing at home, 8K/24 and 8K/30 with 8-bit 4:2:0 color will make it under the bar at 17.82 Gb/s, and that’s it. No support for HDR, though.) To achieve 8K resolution, it’s a far simpler task to tile and interface four 4K-resolution displays.
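For those who want to check that math, here’s a sketch of the calculation in Python. It compares raw active-pixel payload rates against HDMI 2.0’s usable capacity of roughly 14.4 Gb/s (18 Gb/s less 8b/10b coding overhead); blanking intervals are ignored, so these are best-case figures:

```python
HDMI20_PAYLOAD_GBPS = 18 * 8 / 10   # 18 Gb/s link less 8b/10b coding = 14.4

def payload_gbps(fps, bits_per_pixel, width=7680, height=4320):
    """Raw active-pixel payload rate in Gb/s for an 8K frame."""
    return width * height * fps * bits_per_pixel / 1e9

# 4:2:0 subsampling averages 1.5 samples per pixel; at 8 bits per
# sample, that's 12 bits per pixel.
for fps in (24, 30, 60):
    rate = payload_gbps(fps, 12)
    verdict = "fits" if rate <= HDMI20_PAYLOAD_GBPS else "too fast"
    print(f"8K/{fps} 8-bit 4:2:0: {rate:.1f} Gb/s -> {verdict}")
```

Only the 24 Hz and 30 Hz variants squeak under the bar, which matches the conclusion above.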
As far as moving 8K video through an IT network switch goes, it can be done using a mezzanine-level codec like JPEG XS, and it has been successfully demonstrated by Japanese TV network NHK. Using 5:1 JPEG XS compression, an 8K/60 10-bit 4:2:2 video signal cruises through a 10-gigabit switch at 9.5 Gb/s. However, we still need to convert it to a display format at the final connection, once again using multiple HDMI or DisplayPort interfaces.
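A quick sanity check of that claim (active pixels only; the ~9.5 Gb/s figure quoted above also accounts for blanking intervals, which this sketch ignores):

```python
def compressed_rate_gbps(width, height, fps, bits_per_pixel, ratio):
    """Data rate after fixed-ratio mezzanine compression, in Gb/s."""
    return width * height * fps * bits_per_pixel / ratio / 1e9

# 10-bit 4:2:2 averages 2 samples per pixel at 10 bits each = 20 bits/pixel.
rate = compressed_rate_gbps(7680, 4320, 60, 20, ratio=5)
print(f"{rate:.1f} Gb/s")   # ~8.0 Gb/s - comfortably under a 10 Gb/s port
```

Either way you count it, 5:1 compression is just enough headroom to carry 8K/60 on standard 10-gigabit IT gear.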
Summing up, it’s going to be a while yet before our industry considers 8K video and display an ‘everyday’ product, one that warrants much of our energy and money to support. 8K will remain a more exotic format for some time, due to a combination of financial and technical limitations. For now, 4K is established, affordable, and far less complex to switch and distribute in an AV system.
But check back next year…you never know…
Digging through the archives, we recently came across a program from the 2001 InfoComm Projection Shoot-Out, which was staged in the Sands Convention Center in Las Vegas. And what a trip down memory lane it provided! Twenty years ago, high-definition television broadcasts were just getting off the ground. Flat-screen televisions and monitors were expensive novelties, offering wide-XGA resolution (1280x768 or 1365x768) with 8-bit color for prices ranging from $9,000 for 42-inch screens to $18,000 for 50-inch screens, all using plasma technology. Cathode-ray tube (CRT) displays were still common and a lot cheaper than plasma.
Video connections were predominantly analog (HDMI wouldn’t come along until a year later), with composite and S-video leading the way, and component YPbPr and VGA interfaces almost as popular. Video compression was in its early stages (MPEG-2), as was the Internet. One of the more popular signal interfaces at that time (a Kramer product) converted the YPbPr analog HD video format to RGB for use with early models of high-definition monitors that could only accept the latter format.
Signal management hardware required a maximum bandwidth of 70 MHz to handle the most common display formats - 480i, 480p, 720p, 1080i, VGA, SVGA, and XGA. Silicon Graphics’ exotic SXGA format (1280x1024 pixels) pushed that number a bit higher, but getting to 80 MHz bandwidth wasn’t much harder than 70 MHz. Even 100 MHz was easily achievable with analog signal processing. Viewed in retrospect, it seems like a much simpler time for technology.
Now, imagine you could step into a time-travel machine and emerge two decades later. You wouldn’t recognize the landscape at all: Plasma displays and CRTs are ancient history; LCD displays dominate and LED displays have moved past football stadiums and arenas. Composite, S-video, and analog component video have all gone the way of VCRs, replaced by the HDMI connector. Similarly, the VGA computer interface has yielded to DisplayPort.
Front projectors, expensive items limited mostly to XGA resolution at the turn of the century, have largely become inexpensive commodities sold in office supply stores. They’ve yielded much of their terrain to similarly-commoditized large LCD monitors and televisions. “Projector killer” high dynamic range Ultra HDTVs with 85-inch and 86-inch screen sizes can be purchased today for about the cost of a 3,000-lumens Full HD home theater projector.
A widescreen (16:9) front projector was an exotic item in 2001, offering WXGA resolution and all of 1200 lumens for $9,000. Today? With nine grand in your pocket, you could buy three laser LCD business projector models with Full HD resolution and 5000 lumens output and still walk away with $1500 in change - enough to pick up a 2000-lumens Ultra HD home theater projector, too.
And bandwidth requirements have soared far beyond that quaint 100 MHz speed limit. Now, we have clock rates pushing 600 MHz for 4K/Ultra HD video when it comes to us over the latest HDMI or DisplayPort connections, and 1.2 GHz for high frame rate video and basic 8K. We’ve also gone through several iterations of video compression formats in a frantic race to keep up with higher resolution and faster clock rates as these double semi-trailers loaded with pixels travel through the Internet to our homes, schools, and businesses.
Video signal processing gear has seen a similar fall in retail pricing. At the 2001 Projection Shoot-Out in Las Vegas, video scalers (upconverters) were classified by their output clock rates - 31.5 and 63 kHz, a/k/a line doublers and quadruplers that converted 525i/625i video to progressive scan (nominally 640x480) and 2x resolution (1280x960). These outboard boxes had all kinds of adjustments and tweaks, VGA and BNC output jacks, and ranged in price from $800 to $5,000.
Today, Ultra HD monitors easily up-convert any video format to 4K resolution by using a single chip that is small enough to fit in an HDMI receptacle, while small boxes nimbly jump between low- and high-resolution image and signal formats for just a few hundred dollars.
What’s next? There are a few clear trends. The HDMI interface is here to stay, regardless of how you feel about it. DisplayPort, while a favorite of computer graphics manufacturers, never gained any ground against HDMI in the consumer television and media player world, but it’s not going anywhere anytime soon. If anything, the ability of USB-C (USB 3.x/USB4) to work simultaneously as a display connection in Alternate Mode has sustained DP, and it’s inevitable that HDMI will also support this format, leading to fewer discrete connectors on laptops.
In the world of displays, we’ve plateaued at 4K (3840x2160) resolution for now; first, because televisions and monitors are so inexpensive, and second, because LCD screens are available in sizes large enough to outright replace projection screens. 4K resolution is more than adequate for meeting rooms, classrooms, and even home theaters. You’d need to sit just 18-20” away from a 4K image to see any pixel structure, and the peak luminance of 300-400 cd/m2 from these screens holds up well in fully-lit rooms.
The next wave in displays is a slow but steady move away from transmissive (LCD) imaging to emissive (OLED and inorganic LED) technologies. OLEDs are becoming the preferred display for mobile devices and high-end consumer televisions, and they’re also starting to find unique niches in digital signage, such as curved or warped displays. iLEDs, of course, dominate outdoor signage thanks to their sheer brightness, but we’re seeing the first models of iLED consumer televisions coming to market. They’re big, starting at 100 diagonal inches, and expensive (well north of $100k).
What’s interesting about iLED TVs is that they can be constructed from modules, which means they can be built into just about any size desired (also any aspect ratio, but that’s not as attractive a feature for home use). iLEDs don’t suffer from issues with differential color aging like OLEDs, so they can really crank up the brightness - another blow against front projection. Ultimately, OLED TVs will run up against their brighter cousins as prices come down, and become another market casualty. Even small mobile devices will be equipped with “mini” LED displays in the not-too-distant future.
The technological leaps we’ve taken in two decades are nothing short of amazing. But the biggest jump of all awaits - wireless connectivity for everything, from WiFi 6 to 5G. What’s that all about?
Timing is everything, they say. We’ve just finished a CES press briefing with David Glen, president of the HDMI Forum, as part of the “virtual” 2021 Consumer Electronics Show. That kicked off in earnest Monday, January 11 with zero footprint in Las Vegas. (Well, at least we saved a few $$ on airfare, lodging, and food this time.)
As expected, the focus of the briefing was on HDMI 2.1, which has been wending its way slowly to market since it was announced almost four years ago. To recap, HDMI versions up through 2.0 used the same signaling technique (transition-minimized differential signaling, or TMDS) to deliver three channels of video (RGB or YCbCr) and a separate channel with clock/sync pulses. Top speed was 18 gigabits per second (Gb/s).
HDMI 2.1 is a major departure from that architecture, as it uses four lanes to deliver digital packets of data (like DisplayPort), embedding the clock/sync signal. And v2.1 promised a substantial speed upgrade from 18 Gb/s to 48 Gb/s (12 Gb/s per lane), achieving greater efficiency by switching from the usual 8b/10b signal coding to 16b/18b, dropping the signal overhead from 20% to about 12%.
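The efficiency gain from the new line coding is simple arithmetic, sketched here:

```python
# Usable payload = link rate x (payload bits / coded bits).
tmds_payload = 18 * (8 / 10)     # HDMI 2.0 with 8b/10b: 14.4 Gb/s usable
frl_payload = 48 * (16 / 18)     # HDMI 2.1 with 16b/18b: ~42.7 Gb/s usable

print(round(tmds_payload, 1))    # 14.4
print(round(frl_payload, 1))     # 42.7
```

So the headline jump from 18 to 48 Gb/s actually translates into nearly a tripling of usable payload, thanks to the leaner coding scheme.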
The big news this year (apparently) is the growing number of consumer televisions that support HDMI 2.1. One year ago, only Japanese chip manufacturer Socionext could supply chipsets in this format. Now, with much certification work accomplished, other chip fabbers are getting onboard, and TV brands including Samsung, LG, TCL, Hisense, Sony, Panasonic, and Sharp are shipping 4K and 8K TV models with more than one v2.1 input.
The 8K TV market, while still very slow to get going due to high retail prices and lack of consumer enthusiasm, will have to rely entirely on v2.1 interfaces. An 8K video signal with 10-bit 4:2:0 high dynamic range color, refreshed at 60 Hz, has an uncompressed data rate of about 40 Gb/s! And high frame rates are all the rage with gamers these days, so kicking things up to 120 Hz doubles the data rate. For RGB connections, the rate doubles again over 4:2:0 video.
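Those doubling relationships are easy to confirm with a little Python (active-pixel payload rates only; blanking intervals push the 60 Hz figure toward the ~40 Gb/s cited above):

```python
def rate_gbps(width, height, fps, bits_per_pixel):
    """Raw active-pixel data rate in Gb/s (blanking ignored)."""
    return width * height * fps * bits_per_pixel / 1e9

# 10-bit 4:2:0 averages 15 bits/pixel; 10-bit RGB is 30 bits/pixel.
base = rate_gbps(7680, 4320, 60, 15)      # 8K/60 10-bit 4:2:0
hfr = rate_gbps(7680, 4320, 120, 15)      # 120 Hz doubles it
rgb = rate_gbps(7680, 4320, 120, 30)      # RGB doubles it again

print(f"{base:.1f} {hfr:.1f} {rgb:.1f}")  # ~29.9, 59.7, and 119.4 Gb/s
```

That last figure blows well past even HDMI 2.1’s 48 Gb/s, which is where Display Stream Compression comes in.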
HDMI 2.1, unlike its predecessors, supports Display Stream Compression (DSC). But no one’s really using DSC for consumer applications, aside from small mobile electronics. DSC supports 2:1 video compression with very low latency - a handy tool to have, given the data rates you just saw - and is the foundation of the SDVoE Blue River NT system being hawked for AV-over-IT networks.
Other attributes of HDMI 2.1 include support for all static and dynamic high dynamic range formats (just another way of saying it can handle higher data rates and pass along the required CTA-861.x HDR metadata); an Enhanced Audio Return Channel (eARC) that can handle up to 37 Mb/s, sufficient for lossless Dolby Atmos spatial sound; and some additional alphabet-soup acronyms, including Variable Refresh Rate (VRR), Auto Low Latency Mode (ALLM), Quick Media Switching (QMS), and Quick Frame Transport (QFT).
A quick glance at those data rates for 8K (and 120 Hz 4K with RGB 10-bit and 12-bit color) has many of us thinking that optical cables would make a lot more sense for HDMI 2.1 interconnections. Indeed, the HDMI Forum recently certified the first Active Optical Cable (AOC) for long-length installations and also started promoting HDMI Cable Power, so AOCs can be powered directly from a source device’s HDMI connector. Presumably that means kicking up the DC power-handling capability of HDMI along the way, similar to what DisplayPort did 15 years ago.
According to the HDMI Forum, we can expect greater numbers of TVs, AV receivers, game consoles (like the hard-to-find PlayStation 5), graphics cards for gaming, PCs and laptops, and set-top boxes to be introduced this year with one or more v2.1 interfaces.
Accordingly, a new certification program for Ultra High Speed HDMI cables has been launched. Cables that carry this certification must be tested only at HDMI Forum Authorized Test Centers (with every available cable length tested), registered with the HDMI Licensing Authority (HDMI LA) verification and authentication program, and packaged with an Ultra High Speed HDMI Certification label that contains a hologram, QR code, and label ID number.
We mention all this because as our industry well knows, the consumer electronics marketplace “wags the dog” nowadays. To date, there is very little support for HDMI 2.1 in the Pro AV business, but that will change pretty quickly. Consider that the vast majority of large displays (televisions and monitors) sold into our market have Ultra HD native resolution (3840x2160 pixels) and there will be clients buying them who will also need HDR imaging, deep color, and ultimately high frame rates, particularly in tiled displays.
It remains to be seen what impact all of this will have on DisplayPort, whose developer VESA announced at last year’s CES that it was working on standards for version 2.0. HDMI has become so ubiquitous that you’ll find it on DSLR cameras and pro video gear, leaving DP to the world of computer graphics cards and laptops.
What I found interesting is that there’s no support yet for USB Type-C Alternate Mode, which DisplayPort embraced early on. Glen couldn’t give us any specific reason why not - such a mode would obviate the need for micro and mini HDMI connections, particularly on mobile electronics. Perhaps we’ll see some movement in that direction this year. Stay tuned…
Well, here we are in the home stretch of 2020. It’s December and one of the most turbulent years in world history is drawing to a close. A worldwide viral pandemic is well into its third wave, hammering national economies, filling hospitals and morgues to the breaking point, and testing everyone’s resolve: Should we travel for the holidays or stay home? How safe is it to get on an airplane or sit in a movie theater? Should students remain in classrooms or revert to virtual learning? How many of us will permanently work from home when this is all over?
“When this is all over” is very much an open-ended statement. While recent announcements of high-efficacy vaccines are encouraging, the timetables for widespread availability of said vaccines have yet to be finalized. Most medical experts and health officials are now saying it will be late spring to early summer before most people can be vaccinated and life can get somewhat back to normal.
We’ll be living in these strange times for a bit longer, unfortunately. Several major expositions and trade shows have already made appropriate adjustments: The annual Consumer Electronics Show, held in early January, will be entirely virtual this year. Integrated Systems Europe bumped its show back to June 1 in Barcelona, and while InfoComm is still scheduled for mid-June in Orlando, the annual NAB show will now take place in October of next year.
The fact is that scheduling any live events in Q1 of 2021 is still a gamble - even major sports leagues like the NHL and NBA are discussing playing in “bubbles” again, as are college basketball teams. Some schools and leagues have thrown in the towel and cancelled winter sports altogether. As this is being written, several states are once again implementing restrictions on indoor dining and the number of persons at indoor gatherings, with some facilities such as gyms being shut down altogether.
In the year 2019 (before Covid-19), our industry was preoccupied with such things as the AV-over-IT transition, 4K and 8K, cloud-based media storage and retrieval, IP-based control systems, HDMI vs. DisplayPort, the emergence of mini-LED displays, arguments of projection vs. direct-view displays, consolidation of both manufacturers and integrators, and generally – how to make a buck selling any hardware these days.
What a difference a year makes. Now, we’re stuck with finding ways to better utilize WiFi and home broadband to communicate – we’re using AV-over-IT, like it or not. We’ve suddenly rediscovered the value of good lighting and audio. We have no choice but to rely on the cloud to store and exchange files (good thing home broadband speeds have steadily increased in the past few years!). Our laptops and mobile devices have taken on greater roles as primary communication portals.
The projection/direct-view argument is moot. People are using whatever displays are on hand to communicate and learn, which in a majority of cases is nothing more than a laptop. While industry consolidation continues during the pandemic, we’re doing a lot more online shopping these days. Some institutions have bitten the bullet, closed down their facilities, and commenced long-overdue facility construction and upgrades, turning lemons into lemonade after a fashion.
The pandemic has forced many of us to throw caution to the wind and try just about anything to effectuate communication. The Society of Motion Picture and Television Engineers (SMPTE), a 104-year-old institution, held its annual technology conference online two weeks ago using the Hubb platform. Attendance was strong (a pleasant surprise) and presentations were recorded with live Q&A afterward. Other trade shows and conferences used this model, and aside from a few hiccups here and there, it worked quite well. While a lot of pre-production was involved to record all the talks, this approach made it possible to stream those talks on-demand after the conference for virtual attendees in different time zones.
We can never lose sight of the fact that, first and foremost, we are experts in audiovisual hardware and software technology tasked to solve a client’s communications needs. The “stuff” we design and install isn’t the end game; getting the client up and running is. And this pandemic really pulled the rug out from under our collective feet: Who could have guessed that Zoom would become a primary communication platform? That musicians desperately needed a low-latency codec so they could rehearse together virtually? That sales of HDMI-to-USB capture cards would skyrocket? That video streaming would prove a lifeline to both houses of worship and live concert venues?
Doesn’t matter which side of the AV-IT debate you were on originally. It’s here now, and certainly not in the form you expected. Pre-Covid-19, we taught and took classes on low-latency codecs and setting up networks and switches, listened to arguments for and against AV-IT platforms, learned how to set up firewalls and DMZs, and debated the advantages of fiber vs. category cable. Our conversations were all very much focused on wired and wireless local area IT networks (LANs).
And then the coronavirus came along and all of that got tossed out the window. Now, we’re working across wide area networks (WANs) with high-latency codecs, using adaptive bitrate streaming for everything from GoToMeeting and WebEx meetings to watching “The Crown” and “Top Gear” on our Internet-connected smart TVs. Our connection speeds are only as good as our Internet service provider or mobile network. The majority of our connections are over 2.4 and 5 GHz wireless, and upload speeds are often a fraction of download speeds.
In short, we’re not starting with a blank canvas as we usually do to build an AV “solution.” Instead, we’re more like the TV character MacGyver; grabbing whatever tools we can get our hands on to work within predetermined limits, like ‘em or not. And by all accounts, we’re pulling it off, notwithstanding missteps here and there. Necessity is the mother of invention, and there have been plenty of ingenious AV solutions implemented under fire this year.
Yes, 2020 has been a real downer, but things will eventually get back to normal. In the meantime, recall the British government’s 1939 morale poster, “Keep calm and carry on,” which was printed at the outset of WWII but never actually distributed to the public.
Good advice for these crazy times…
A press release recently crossed our desk that detailed how the HDBaseT Alliance is working with a company called SyncPro “…to enable cloud connectivity capabilities for HDBaseT products.” SyncPro developed the Cloud OS software, which according to the company, “…enables IT teams and Managed Service Providers (MSP) to remotely configure, monitor and maintain audio/video, unified collaboration and digital signage products.”
This is an interesting development for the HDBaseT signal distribution format, which you may remember started out as a simple and reliable way to extend HDMI 1.3 video and audio, and IR and RS-232 control signals up to 300 feet, originally over Category 5 cable. HDBaseT can also deliver a maximum of 65 volts DC power @ 1.5 Amps and carry duplex Ethernet signals.
Since then, Valens Semiconductor has made several updates and speed enhancements, such as support for USB 2.0. The latest version, HDBaseT 3.0, employs Cat 6 cable and can transmit a maximum of 16 Gb/s of data with a 2 Gb/s return link. By eliminating the standard 8b/10b ANSI symbol coding overhead used by HDMI (and DisplayPort), HDBaseT can transport Ultra HD video with 8-bit 4:4:4 (RGB) color at 60 Hz.
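Here’s a quick sketch of why dropping the 8b/10b coding matters, assuming the standard CTA-861 pixel clock of 594 MHz for 3840x2160 at 60 Hz (4400x2250 total timing):

```python
pixel_clock_hz = 594e6     # CTA-861 timing for 3840x2160 @ 60 Hz
bits_per_pixel = 24        # 8-bit RGB (4:4:4)

raw = pixel_clock_hz * bits_per_pixel / 1e9    # no line coding: ~14.3 Gb/s
coded = raw * 10 / 8                           # with 8b/10b: ~17.8 Gb/s

print(round(raw, 1), round(coded, 1))
# 14.3 Gb/s fits HDBaseT 3.0's 16 Gb/s pipe; the coded 17.8 Gb/s would not.
```

In other words, stripping the coding overhead is exactly what lets a full-bandwidth 4K/60 RGB signal squeeze into HDBaseT 3.0’s link budget.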
The HDBaseT format is quite popular in the AV industry, but hasn’t made as much of an impact in other verticals such as broadcast, cable, cinema, and streaming media, where a more conventional IT-based audio and video distribution system is preferred; one that uses TCP/IP protocols and MPEG/JPEG video and audio compression formats. (HDBaseT does not compress video signals.)
What’s interesting about this alliance with SyncPro is how HDBaseT is trying to look more like an AV-over-IT solution without actually being one. Since everything that travels over an IT network is converted to packets with headers, it’s a simple task to add control system functionality. Control packets come in short bursts and don’t need much bandwidth, and they can also hop over WANs for remote monitoring of AV systems.
In contrast, HDBaseT exists in its own closed ecosystem, as it uses a proprietary pulse-amplitude modulation (PAM) technique to convert HDMI 1.4 and 2.0 TMDS signals for transmission. The fact that version 3.0 supports 1 Gb/s Ethernet is incidental, as HDBaseT has always allocated bandwidth for Ethernet packets.
However, HDBaseT cannot coexist in a conventional TCP/IP system with network switches. And there’s really no issue with compressed video over a network structure these days. JPEG XS, a very low-latency “mezzanine” codec, has been demonstrated by the Japanese national television network NHK, compressing 8K/60 video (33 megapixels per frame!) at 6:1 through a 10 Gb/s network switch with minimal signal degradation, as observed up close on 85-inch 8K LCD monitors.
To be clear, we’re not disparaging HDBaseT here. For many AV installations, it makes perfect sense as a reliable way to transport audio, video, and control over long distances. Category cable is certainly cheap enough and easy to install with crimping tools. Instead of network switches, more conventional full-bandwidth matrix switches are used, just as they would be with HDMI signals. And it is handy to multiplex control and USB signals into the same wiring.
But HDBaseT is still a hybrid approach to network-style signal distribution, trying to straddle two increasingly diverging worlds. Recall the plethora of analog signal extenders so popular in the early part of the 21st century: Remember those InfoComm demos of VGA connections over rolled-up spools of category wire and fiber optic cable?
Valens Semiconductor’s original focus was on the consumer market, and the first demos I saw of HDBaseT were at CES and CEDIA, featuring the “5 Play” slogan (Video, audio, IR, RS-232, and 100 Mb/s Ethernet over one cable). And HDBaseT had a faster, Chinese-backed competitor back then - DiiVA, or Digital Interface for Video & Audio - that largely seems to have faded into history, but used a similar approach to integrate all of these signals into one piece of structured wire with a maximum bandwidth of 18 Gb/s.
But time doesn’t stand still. No one could foresee 4K video coming to market so quickly. A decade ago, the only widespread use of JPEG-based codecs was for digital cinema, as the DCI spec uses JPEG2000 compression. And at that point in time, a “fast” IT network used 1 Gb/s switches, not 10 Gb/s.
Today, the marketplace for networked distribution of AV signals is pure bedlam. HDBaseT competes with Software-Defined Video over Ethernet (SDVoE) products, which use Display Stream Compression to packetize video for transport through IP networks, and with JPEG-based video compression, also carried over IP networks. The latter approach is now being promoted aggressively to the AV industry by the Alliance for IP Media Solutions (AIMS) as a counter to SDVoE.
And momentum continues to build for TCP/IP-compatible AV distribution architectures. The Society of Motion Picture and Television Engineers (SMPTE) has released its ST 2110 suite of standards, Professional Media Over Managed IP Networks, which has several separate parts. And now there’s NMOS, the Networked Media Open Specifications developed by the Advanced Media Workflow Association (AMWA) for transport of audio and video over AV/IT networks. (Are we driving you crazy with all these acronyms?)
Looking down the road, it seems inevitable that the ultimate winner of the AV-over-IT tussle will be a pure TCP/IP solution, given the growing number of vendors and content producers who support the format, the wide range of open standards and specifications, on-going improvements to JPEG-based codecs using artificial intelligence (AI), and ever-lower prices for IT gear. A quick Internet check showed some models of 10 Gb/s switches selling for less than $1,000.
Most importantly, TCP/IP-based signal distribution systems are fully compatible with WiFi formats. That’s not an insignificant advantage, as our industry and others see more and more AV signal distribution moving to 2.4 and 5 GHz (and 60 GHz short range) wireless connectivity via wireless presentation sharing platforms; accessible on all of our mobile devices, desktop and laptop computers, and “smart” televisions and displays.
Believe it or not, there are still systems integrators who prefer to use the time-tested serial digital interface (SDI) for moving uncompressed video and audio assets around a facility, particularly now that 12G SDI connections are available over copper for short distances and optical fiber for long cable runs.
Different strokes for different folks, indeed!
There are many things in life that one can never have enough of, like money, vacation days, good wine, music from your favorite band, and dessert. (Especially dessert!) Add bandwidth to that list. You depend on having “enough” bandwidth every time you send someone a photo from your smartphone, stream Hamilton to your TV, or upload a video to Facebook from your laptop.
Fact is, there’s a never-ending quest to use bandwidth more efficiently, one which the vast majority of us are blissfully unaware of. Groups of very smart people are constantly developing and refining video compression and decompression (codec) algorithms to ensure those selfies, Lin-Manuel Miranda’s music and lyrics, and cute kitten videos get to where they’re supposed to go with minimal signal degradation.
And codec developers can barely keep up. 27 years ago, we saw the first high-efficiency codec emerge to handle low-resolution video on optical disks – MPEG-1, produced for the short-lived compact disc interactive (CD-I) format. Video compressed with MPEG-1 looked okay on small CRT screens, but on larger screens, the quality was pretty bad. Although MPEG-1 and CD-I are fortunately distant memories now, the audio compression format they spawned – MPEG Audio Layer 3 or MP3 for short – lives on to this day.
MPEG-1 was followed by MPEG-2 in 1996 to encode digital video discs (DVDs) and to compress standard-definition and high-definition video for broadcast, cable, and satellite. It’s still in use today. MPEG-2 encoders have come a very long way in 20+ years and can pack two 720p HD programs plus a handful of SD programs into a single 6 MHz TV channel, all with acceptable visual quality.
Like rust, compression experts never sleep. Seventeen years ago, a newer and more efficient codec made its debut. MPEG-4 H.264 (aka the Advanced Video Codec) promised a 50% improvement in compression efficiency over MPEG-2…and delivered it! By that time, HD video was becoming widely adopted across a multitude of delivery platforms, but available bandwidth was slow to keep up. With the growth of video streaming a few years later, MPEG-4 H.264 became the codec of choice and is supported on everything from tablets and smartphones to laptops, smart TVs, camcorders, and DSLRs.
The only problem was that 4K video had become the latest flavor by the turn of this past decade. And to make the job even more difficult, high dynamic range video with its associated wider color gamut was part of the 4K package. But bandwidth hadn’t kept up! Hence, High Efficiency Video Coding (HEVC) was rolled out in 2013, again promising 50% more compression efficiency than H.264. HEVC requires quite a bit of computing power to pull off that trick, but it works (and so does its close relative, Google’s VP9 codec). And it’s not cheap to license.
HEVC is used for the UHD Blu-ray optical disc format and for streaming 4K content from just about everywhere except YouTube, which, being owned by Google, employs the VP9 “open” codec. But the licensing costs spurred more tech types to come up with yet another codec, known as the Alliance for Open Media (AOM) AV1 codec. This is a royalty-free codec that competes with HEVC but is intended primarily for streaming video over Internet connections.
Since the people developing high-resolution video formats always seem to be a few steps ahead of the codec people, more codecs have been proposed in an attempt to catch up. Essential Video Coding (EVC, or MPEG-5 Part 1) was developed as an alternative video codec for streaming and OTT, with compression performance at least equivalent to HEVC. MPEG-5 Part 2, Low Complexity Enhancement Video Coding (LCEVC), is yet another MPEG standard and is intended to provide enhanced compression efficiency for existing MPEG-based video codecs.
To top it off, the successor to HEVC (H.265) is now preparing to take the stage. The Versatile Video Codec, or VVC, is designed for maximum compression efficiency across all compatible devices and platforms, with a specific focus on applications like high dynamic range and high frame rate video, 360-degree video for virtual and augmented reality, and 8K UHD-2 video.
Using spatial-only image metrics for reference, VVC is about 40% more efficient than HEVC for UHD and HD compression. However, a VVC reference encoder has about ten times the complexity of an HEVC reference encoder, while a VVC reference decoder is about twice as complex as an HEVC reference decoder.
We should mention that HEVC, AV1, EVC, LCEVC, and VVC are software-intensive codecs, unlike the older MPEG-2 and H.264 AVC codecs that are still in widespread use. All five employ larger coding block sizes and need super-fast CPUs and plenty of memory to analyze and compress video streams. By the way, H.264 AVC is no slouch - it’s had 26 updates since it was first rolled out in 2003.
Okay, so you’re getting a headache from all these abbreviations. (We are, too.) The takeaway here is that methods for transporting compressed, high bit rate video over everything from broadcast airwaves to Wi-Fi, 5G, and broadband are continuously being refined. You’ll go through life blissfully unaware of which particular codec is being used to stream The Marvelous Mrs. Maisel or let you watch Clemson’s and Alabama’s football teams slug it out on your iPhone. (You will, however, notice any impairments to video quality caused by excessive compression and complain vigorously to your service provider about them!)
Will there ever be a unified codec? That’s the goal, but no one knows how, when, or even if it will happen. In the ongoing quest for efficiency, codec designers are now implementing artificial intelligence (AI) to perform the lightning-quick analysis of incoming video and decisions on which picture elements to compress, by how much, and for how long.
We already use a simple version of AI to dynamically adjust the bit rates of multiple programs in a stream, based on constantly changing available network bandwidth (dynamic stream shaping and adaptive bit rate encoding are two examples). It stands to reason that a unified codec – one based on advanced AI that can optimize delivery of high-resolution video across any network or platform – should emerge at some point and rid us of the “alphabet soup” of codec formats.
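The adaptive bit rate idea can be sketched in a few lines of Python. The ladder values, the `pick_rung` name, and the 80% safety margin here are all hypothetical, chosen only to illustrate the kind of selection a streaming player performs between segment downloads:

```python
# A minimal sketch of adaptive bit rate (ABR) rung selection.
# The ladder values and safety margin are illustrative only.

# Encoding ladder: (vertical resolution, video bit rate in kb/s)
LADDER = [(240, 400), (360, 800), (480, 1400), (720, 3000), (1080, 5800)]

def pick_rung(measured_kbps, safety=0.8):
    """Pick the highest rung whose bit rate fits within a safety
    margin of the measured network throughput."""
    budget = measured_kbps * safety
    best = LADDER[0]  # always fall back to the lowest rung
    for height, kbps in LADDER:
        if kbps <= budget:
            best = (height, kbps)
    return best

# The player re-measures throughput after each segment download and
# can switch rungs at the next segment boundary.
print(pick_rung(8000))   # healthy connection → (1080, 5800)
print(pick_rung(1000))   # congested connection → (360, 800)
```

Real players add refinements (buffer occupancy, switch damping), but the core decision loop is just this: measure, budget, pick.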
It’s hard to believe, but we’ve lived through six months of the COVID-19 pandemic already. (And that’s six months we’ll never get back.) As disruptive as the pandemic has been, people have still come up with clever workarounds. Perhaps the biggest challenge was how to stage conferences and trade shows, which require people to (a) get on a plane and travel to the conference site, and (b) walk the trade show floor and sit with others in seminars, workshops, and keynotes.
So, 2020 brought us something new: The virtual trade show. We should clarify that the concept isn’t entirely new, as some AV manufacturers have tried it in the past. But this time around, exhibitors and attendees had no choice – the only way we had to see new products and hear about new technologies and processes was to turn on our home computer, register for the event, and log into the event Web site.
While attending a trade show or conference this way has certain advantages (no security screenings, boarding lines, taxi fares, overpriced hotel rooms, or show floor food), there really isn’t any substitute for seeing products in person, catching up with colleagues, and having those serendipitous ad hoc conversations to trade notes on what everyone else saw.
We’re well through trade show season now, and the jury is still out on how effective the virtual versions of NAB and InfoComm turned out to be. IFA will be limited to 1,000 members of the press attending in person (if they’re lucky enough to get that many!), while IBC, NAB New York, and even CES 2021 have all opted to go virtual. It will be a long time before any of us prints out an exhibitor, press, or attendee badge and hangs it on a lanyard again!
One show in particular suffered perhaps more than others – the Society for Information Display (SID) annual exposition. Originally scheduled for late May, it went virtual in early August, with many sessions still accessible as of early September. But exhibitors didn’t hold back – this is the event where cutting-edge display tech usually takes its first bows and we get a look at the future of displays even though many of the products shown are still in prototype stage.
Based on the many press releases and photos we’ve received, it’s clear that just about all of the innovation is happening in large, self-contained displays. While prominent manufacturers like Samsung, Panasonic, and LG Display are turning their backs on liquid-crystal display (LCD) panel manufacturing – decisions attributable to evaporating profits – they’re sinking more cash into emissive display products including white OLED with color filters (WOLED), quantum dots driven by blue OLEDs (QD-OLED), quantum dots driven by blue nanorods, and micro and mini LED displays.
The fact is, unless you are manufacturing LCD panels with large mother glass sizes in China, you might as well just take your cash, pile it into big hills, and set it on fire. You’ll get the same result, but much faster. Even the Chinese display behemoths like TCL, BOE, TPV, and CSOT are finding good profit margins in LCD panel manufacturing much harder to come by. (And forget projectors – their market share is getting hammered by ever-larger and cheaper LCD displays at one end, and tiled mini LED displays at the other.)
What’s fascinating about this new generation of displays is that they’re all emissive in design. That is, the light they throw off comes directly to your eye, and not through layers of light shutters, color filters, and polarizers found in LCD displays. That translates into better contrast, deeper black levels, and wider viewing angles, all of which used to be characteristic of cathode-ray tube (CRT) televisions and monitors 25 years ago. (Ergo, what was once old is basically new again!)
Even better is that emerging technologies like quantum dots and mini/micro LED aren’t brightness-limited like CRTs were. Even WOLED displays like premium 4K TVs, production monitors, and digital signs can achieve luminance levels of 700 candelas per square meter (cd/m2) with small-area full white signals. Displays equipped with quantum dots push well past 1,000 cd/m2, while mini LED (classified as LED displays with a pixel pitch less than 2.5mm) can easily reach 3,000 cd/m2. By comparison, a full white screen on a CRT grading monitor would measure only 100 cd/m2.
This move to bright, saturated images couldn’t come at a better time, as high dynamic range (HDR) imaging with its associated wide color gamut (WCG) is becoming popular. Combined with high-resolution 4K and 8K displays with ever-larger screens, the popular descriptions of “being there” and “like looking through a window” couldn’t be more apt.
Next-generation displays aren’t just about brighter images with highly-saturated colors. The next frontiers in display tech are flexibility and transparency. Think of folding smartphones and see-through televisions, products that have been in research and development for over a decade and are only just coming to market. How about TV screens that can wrap around poles and stanchions, or be shaped into the petals of a flower? There are also enhancements to touch screens and even touch-less screens that use ultrasonic sensors to operate cursors, open apps and windows, and play, fast forward, pause, and rewind video.
All in all, it’s pretty amazing stuff. At some point, when the pandemic runs its course, you’ll be able to see these cool display products in person, just like the good old days of 2019.
Until then…wish you were here…
Funny, isn’t it? A year ago, colleagues in our industry were alternately embracing, rejecting, or arguing about the upward trend to 4K video. We debated what “4K” actually meant. We got out our calculators to see what flavor of “4K” signal we could pass through our existing HDMI infrastructure. We watched as display manufacturers began replacing Full HD displays with Ultra HD versions (the correct term, as these have a pixel resolution of 3840x2160 and aren’t really true 4K).
And we slapped our heads and groaned as display analysts warned us to get ready for 8K video, along with high dynamic range, wider color gamut, and higher frame rates. We read up on the latest versions of HDMI and DisplayPort and reviewed the press releases from standards organizations that touted the latest versions of high-efficiency codecs, such as the new Versatile Video Codec (VVC).
What a difference 12 months and a pandemic make. Now, most of us are working from and learning at home, watching video on Zoom, GoToMeeting, WebEx, Teams, and other conferencing platforms. Some of the video looks decent; a lot of it is pretty awful. Much of that can be blamed on “smart” video codecs that use adaptive, variable streaming techniques and dynamic stream shaping to maximize video resolution based on available network speeds during any given time interval. (Audio is easy to deliver – even spatial sound requires just over 1 megabit per second to stream.)
Aside from consumers buying large Ultra HD televisions like there’s no tomorrow – perhaps to stream “Hamilton” in HDR – we’re not hearing much about 4K and 8K TV right now. The showcase event for 8K, the 2020 Tokyo Olympics, was pushed back a year. Major sports leagues have cobbled together short schedules to compete in mostly empty stadiums, and some have moved their teams to one or two “bubble” locations for round-robin playoffs to stay isolated from COVID-19 outbreaks. (It’s not working, in case you’re wondering.) They’re not presently concerned about producing anything using cutting-edge video.
Guess what? Pixel resolution isn’t our primary concern now. Bandwidth is, and it tends to be fixed, like the water pressure in a large hotel. If one or two people are taking a shower or flushing a toilet at any given time, there’s ample flow. But if every guest did either one at the same time - well, that “ample” water pressure would quickly reduce to a trickle.
Internet bandwidth works the same way, and right now, we have hundreds of millions of home-bound workers, students, online gamers, churches, and Netflix bingers all grabbing as much bandwidth as they can, often at the same time. To compensate, video streaming services and Internet service providers (ISPs) can “throttle” bandwidth if necessary – this was done early on in areas with stay-at-home orders as online users surged, pleasing no one.
Alternatively, we can make decisions on our end to make the most of available bandwidth. For conferencing, distance learning, online worship, government meetings, and other events that will attract large numbers of remote viewers, the focus should be on effective communication above all else. Consequently, we should select a video format that’s most bandwidth-friendly. And instead of going up in resolution, we might want to go down. (Heads up: We’re about to become a bit contrarian.)
We need to fill a 16:9 (or 16:10) screen, for certain. And we want to show fine detail. Do we need a high frame rate? Only if moving objects are being shown, which is rarely the case with an online class or Web conference. The amount of motion in a worship service is also minimal, compared to an auto race or a basketball game.
Turns out, we do have a video format that’s very well-suited to the online world - 1280x720p HD. Yes, it is the lowest version of HD, and compared to Ultra HD with HDR, it looks more like our old standard-definition video systems. Even so, many TV networks use 720p as a baseline for broadcasting everything from scripted entertainment to live sports - and it doesn’t look half bad on large screens.
The minimum viewing distance for 720p is around eight feet for a 42-inch diagonal screen. But you won’t find any of those today, and you’ll be hard-pressed to score a 42-inch 1080p TV, what with manufacturers switching to Ultra HD native resolution. Not to worry; the major TV brands have incorporated some pretty sophisticated picture scaling engines into their sets, which make your 720p video look better than you might expect on that new 65-inch Ultra HD TV.
If you want to be really thrifty, consider that just about every display we watch today supports multiple frame rates, so we can easily stream 720p HD at 25/30 frames per second for greater efficiency, or 50/60 frames per second if motion is being shown.
With a 60 Hz refresh rate, our pixel clock is 74.25 MHz, and with 4:2:0 8-bit color, the total data rate is just under 0.9 gigabits per second, uncompressed. A high-efficiency codec like H.264 can easily mash that down to the range of 3 to 5 megabits per second with some latency. Even slow residential broadband connections (such as one we tested recently at 8 Mb/s in rural Vermont) can accommodate that data rate.
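For the curious, here’s that arithmetic as a short Python sketch. Depending on whether you assume 4:2:0 or 4:2:2 chroma subsampling, the uncompressed figure lands between roughly 0.9 and 1.2 Gb/s; the raster timing is the standard 720p60 total raster (1650 x 750), which gives the 74.25 MHz pixel clock:

```python
# Back-of-the-envelope 720p60 data rates, counting the full raster
# (1650 x 750 at 60 Hz = a 74.25 MHz pixel clock, blanking included).
PIXEL_CLOCK_HZ = 1650 * 750 * 60          # 74,250,000 Hz

def uncompressed_gbps(bits_per_pixel):
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

# 4:2:0 8-bit averages 12 bits per pixel; 4:2:2 8-bit averages 16.
print(round(uncompressed_gbps(12), 2))    # → 0.89 Gb/s
print(round(uncompressed_gbps(16), 2))    # → 1.19 Gb/s

# H.264 at 4 Mb/s versus ~0.89 Gb/s is roughly a 223:1 reduction.
print(round(uncompressed_gbps(12) * 1000 / 4))
```

Either way, the compression ratio a modern codec achieves to hit 3 to 5 Mb/s is on the order of 200:1 or better.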
We can hear the cries and squalling of purists now. “720p is an old HD format!” and “Once you’ve seen 4K video on a big screen, you’ll never turn back!” Well, keep in mind that many of the remote workers, students, worshipers, and civic-minded citizens out there are watching on laptops, tablets, and even smartphones. And despite record sales of large-screen TVs this year, there are still quite a few smaller TV sets in use today, with “smaller” defined as 55 inches diagonally or less.
Recall what we said earlier about our priorities: Effective communication tops all else, even pretty pictures. Given the crushing demand on Internet connectivity right now, you should set your output resolution to 1280x720 on streaming cameras wherever possible. 60p is fine, but 30p will use just half the bandwidth of 60p. (Ditto 25p versus 50p.)
When we finally reach the end of the pandemic tunnel – and we WILL reach it; it will just take a while – the debates about 4K and 8K can happily resume. You can set your streaming cameras back to Full HD output and not feel guilty. We can get back to worrying about data rates and juggling combinations of frame rates and color resolution to pass through whatever display interface we’re stuck with.
Perhaps by then, average broadband data rates will support full “4K” Web conferencing. Either that, or we’ll have codecs so efficient that they can pack down 8K video streams to 10 Mb/s or less with just a few frames of latency. (Hey – we can dream, can’t we?)
We’re going to change things up a bit this month and shift our focus away from technology. As of this writing, the coronavirus pandemic is resurgent in several countries, particularly in the United States and Latin America. “The times they are a-changin’,” goes the chorus of Bob Dylan’s classic song from early 1964, and if there’s a better expression to describe what’s going on around the world right now, we can’t think of it.
It’s been four months since much of the world began shutting down in an attempt to slow the spread of COVID-19. The economic hit has been considerable, with the International Monetary Fund now predicting a 4.9% contraction in the global economy for 2020. As businesses try to maintain operations in a variety of quarantine or near-quarantine conditions, they’ve been forced to make changes “on the fly.” It now appears many of those changes will become permanent, even after a vaccine is available and the pandemic eventually winds down. And all of these changes will have an impact on the commercial AV industry.
To start with, let’s look at the increase in the number of remote workers. It’s estimated that 40% of the workforce in the United States will never return to an office, post-pandemic. Instead, they will continue to work from home offices or other remote locations. This, in turn, will reduce the demand for office space in cities and suburbs. And that will have a ripple effect on affiliated retail, hospitality, and service businesses – not to mention tax revenues.
A New York Times story on May 12 quoted executives from JP Morgan Chase, Morgan Stanley, and Barclays Bank – three of the largest commercial tenants in New York City – as saying they will not re-occupy all of the office space they originally leased prior to the COVID-19 pandemic. Instead, many employees will remain working from home permanently, and current leases will not be renewed once they expire.
Companies that were once opposed to the concept of remote workers are now embracing it as this unintentional, worldwide laboratory experiment winds on. Instead of employees gathering to meet in a room, they’re all logging in to virtual meetings via Zoom, GoToMeeting, WebEx, Teams, and other conferencing programs. (Have you tried to buy an external Webcam recently? Good luck, many models are out of stock and on long backorder. So are USB microphones for better audio.)
On the education front, primary and secondary schools in many states are planning to re-open in the fall of 2020, but only in regions where COVID-19 infections have dropped below a specified metric. Not surprisingly, the use of distance learning and group audio and videoconferencing has grown exponentially. And some colleges and universities, anticipating a second wave of infections, have already announced they will revert to the virtual classroom model for the winter semester as a precaution.
Perhaps it’s no surprise then that sales of PCs and laptops spiked upward in the first quarter of 2020: Intel reported a 23% increase in revenue, while AMD saw a 73% bump. Western Digital noted a surge in demand for storage components “due to the shift to working from home and e-learning.” Chromebook sales to students also took off, according to Google. And mobile device sales suffered, with market analyst firm IDC reporting that smartphone shipments had fallen 11.7 percent in Q1, while tablets dropped by 18.2 percent.
The viral pandemic also shuttered a good deal of PC and laptop manufacturing capacity in Asia. According to a June 8 story on The Verge, “…Retail analytics firm Stackline found that in recent weeks, traffic to laptop product Web pages has grown 100 to 130 percent (year over year). Conversion rates (that is, the proportion of visitors to laptop product pages who actually purchase), conversely, have plummeted; they’re normally around 3 percent, but in mid-May they hit an all-time low of 1.5 percent. In other words: people are looking for laptops more, but they’re having trouble finding products in stock to actually purchase.”
Another market that will take some time to recover is that of expositions, conferences, and trade shows. Several polls taken this spring have shown conclusively that respondents are not at all interested in attending these events until a safe, effective, and proven COVID-19 vaccine is widely available. And major trade shows have toppled like dominoes this spring and summer, with another wave of cancellations now being announced for the fall.
Touring musical acts, festivals, theme and amusement parks, and sporting events are also struggling to figure out how to re-open and re-schedule without pushing COVID-19 infection rates upward. There is a consequent, direct impact on the transportation, hospitality, and advertising sectors, and in our industry, rental and staging companies.
However, electronic gaming and e-Sports are thriving, as their participants all compete online. Not surprisingly, video streaming of all kinds is also surging, from movies and TV shows to worship services, online courses, and virtual travel. Several movies that were intended for theatrical release went directly to streaming and digital downloads as theater chains closed down. And believe it or not, sales of Blu-ray discs have also ticked up as people dust off their old players and look for ways to entertain their kids.
No one can state precisely what impact these trends will have on the AV industry. We know the increase in remote workers will definitely continue – many companies, such as health insurance providers, have been operating this way for years. With ever-faster Internet available to more and more homes and apartments, video conferencing isn’t such a big deal anymore. The unanswered question: How much do face-to-face communications and meetings matter now?
For higher education, the current technological limits of distance learning and virtual classrooms will be sorely tested in coming months. What courses and classes lend themselves to online learning, and which still require a physical presence? Can colleges and universities survive a decline in enrollment caused by the pandemic? Can they justify maintaining large campuses dotted with classrooms, lecture halls, and other facilities when so few are using them?
We know a few things to be true. First, fast Internet connections (and in particular, WiFi 6) are and will continue to be of paramount importance to everyone. Video streaming is here to stay, and for remote workers and distance learning, it’s as essential to life as oxygen.
Second, it appears rumors of the PC’s demise are greatly exaggerated (with apologies to Mark Twain). Mobile devices have their place, but just aren’t practical for day-to-day office work. And third, many of us need better cameras and microphones for conferencing – some cameras create truly awful images and don’t focus very well, while many microphones make you sound like you’re at the bottom of a well.
As AV professionals, we’re tasked with coming up with solutions for our customers. And boy oh boy, do they have some real challenges to solve nowadays! We may find that today’s “hot” product categories don’t apply in the future, and that we’ll have to go back to the drawing board time and again to keep up. (But that’s what we do best, isn’t it?)
Until next time, stay healthy….
A news digest publication we receive each week, appropriately called The Week, features a small section of stories each month under the heading, “Boring, but Important.” That description suits this post quite nicely, as it details the latest specifications for the Universal Serial Bus (USB) – version 4.0.
We tend to take USB for granted, as it is kind of boring. We don’t think much about those receptacles on the sides of our laptops and mobile devices. Most of us use them for charging up smartphones and tablets and connecting thumb drives and other external storage to laptops and desktops, along with wireless keyboard and mouse receivers. And we usually complain that there aren’t enough USB ports and buy USB hubs to connect scanners, printers, and other peripherals.
The original USB specification was released in January of 1996, and the first devices equipped with USB 1.0 connectors made their appearance 24 years ago this month. (Happy birthday!) That’s a lifetime in the world of computer peripherals: USB version 1.0 supported data rates of 1.5 megabits per second (Mb/s) in low-speed mode and 12 Mb/s in full-speed mode. In contrast, the current version (3.2) clips along at 20 GIGABITS per second (Gb/s).
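To put those generational jumps in perspective, here’s a quick back-of-the-envelope sketch in Python. These are headline signaling rates only; real-world throughput is lower once protocol overhead is counted, so treat the numbers as order-of-magnitude comparisons:

```python
# Time to move a 4 GB file at each USB generation's headline rate.
# (Signaling rates only - protocol overhead is ignored here.)
RATES_MBPS = {
    "USB 1.0 low speed":  1.5,
    "USB 1.0 full speed": 12,
    "USB 2.0 high speed": 480,
    "USB 3.2 (2x2)":      20_000,
}

FILE_BITS = 4 * 8e9   # 4 gigabytes, expressed in bits

for name, mbps in RATES_MBPS.items():
    seconds = FILE_BITS / (mbps * 1e6)
    print(f"{name:>20}: {seconds:,.0f} s")
```

At 1.5 Mb/s, that 4 GB file would take the better part of six hours; at 20 Gb/s, under two seconds.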
Version 3.1 also did away with seven different connector styles (not to mention an entire cottage industry of between-version adapter plugs and cables) in favor of the Type-C symmetrical 24-pin plug and jack, and added some intelligence to the connection so it could do more than just provide 5 volts of charging voltage and move data back and forth. Now, a “triple play” was possible, connecting power, data, and display.
By all accounts, the Type-C connector is durable and popular. (Ask anyone who has fumbled in the dark with a Micro USB plug to charge up a phone, or broken a Micro plug while trying to push it into a Mini jack.) It takes up a lot less room on laptops, making for thinner and lighter designs as optical disc drives and bulky hard drives are also jettisoned. And it’s fast, no doubt about that – while writing this missive, we just backed up large music and photo files to an external Type-C solid-state drive at an average rate of 150 Mb/s.
So, why should we care all that much about enhancements to USB? The answer is simple – more and more peripherals are using USB as their only connection to the outside world. For example, during the current COVID-19 pandemic, sales of webcams and other streaming devices are through the roof. While more expensive cameras do provide a variety of connector options, lower-price models rely exclusively on USB ports for video and audio. And the same is true for most computer headsets.
The USB 4 specifications, announced while you weren’t paying attention last fall, build on Intel’s Thunderbolt technology to raise data transfer rates to 40 Gb/s over version 3.2’s 20 Gb/s. While very few current models of laptops have USB 3.2 ports on them, you can be sure future models will step on the gas – the USB Implementers Forum (USB-IF) estimates that devices with USB 4 ports will start appearing toward the end of this year.
The extra speed isn’t just for data. USB 3.1 introduced the concept of Alternate Mode, meaning one or more serial data lanes can be repurposed to also serve as display connections, all the while continuing to transmit data back and forth. DisplayPort version 1.3 was the first implementation and HDMI 2.1 will also travel nicely over this connection. If you want to connect an Ultra HD computer monitor via Alternate Mode using 10-bit RGB color at 60 Hz, you’ll need to sustain a data rate of around 21 Gb/s. And if you plan on running data back and forth at the same time – well, you get the idea.
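Here’s a quick sanity check on that ~21 Gb/s figure. The blanking interval assumed below (a 4400 x 2250 total raster, as in common CTA-861 Ultra HD timing) and the 8b/10b line-coding overhead are assumptions made for this sketch, not numbers quoted from the DisplayPort specification, but the result lands in the same ballpark:

```python
# Rough check of the Alternate Mode figure: Ultra HD, 10-bit RGB, 60 Hz.
TOTAL_RASTER = 4400 * 2250        # active 3840 x 2160 plus blanking
BITS_PER_PIXEL = 3 * 10           # RGB, 10 bits per channel
REFRESH_HZ = 60

payload_gbps = TOTAL_RASTER * REFRESH_HZ * BITS_PER_PIXEL / 1e9
line_rate_gbps = payload_gbps * 10 / 8   # 8b/10b coding overhead

print(round(payload_gbps, 1))     # → 17.8 Gb/s of pixel data
print(round(line_rate_gbps, 1))   # → 22.3 Gb/s on the wire
```

Either way you slice it, a 10-bit Ultra HD monitor at 60 Hz eats most of a 20 Gb/s USB 3.2 link, which is exactly why the jump to 40 Gb/s matters.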
What’s different about version 4 is that it uses tunneling over two lanes to move everything. From the USB-IF press release: “Key characteristics of the USB4 solution include two-lane operation using existing USB Type-C cables and up to 40 Gbps operation over 40 Gbps-certified cables, multiple data and display protocols to efficiently share the total available bandwidth over the bus, and backward compatibility with USB 3.2, USB 2.0 and Thunderbolt 3.”
The term “smart interface” certainly applies here. Version 4 is expected to be much more efficient in allocating bandwidth for simultaneous transmission of data and video. If a monitor is using 20 percent of the available bandwidth for video, the remaining 80 percent could be used for data. And USB Power Delivery, an intelligent charging protocol, allows negotiation of faster (or slower, but less battery-draining) charge rates for mobile devices.
Historically, the AV industry has relied on separate connectors for separate functions…HDMI for display, category cable for IT networks, RS-232 for control, unbalanced and balanced connectors for audio, and of course USB for data and peripherals. For long signal runs, we go back and forth between using category wire for proprietary HDMI/audio/control extenders, or for AV-over-IT networked applications.
Is it time to think instead about moving to a “tunneled” approach for long signal runs, aggregating video, audio, data, and control packets using USB 4 protocols and also supporting connected peripherals such as keyboards, mice, printers, scanners, Webcams, audio interfaces, and PTZ cameras? Logic says yes, but the immediate obstacle to that implementation is the data rates involved: HDBaseT is still stuck at USB 2.0 (max. 480 Mb/s), and AV-over-IT applications are centering on 10 Gb/s network switches.
However, there’s always optical fiber, which serial data interfaces travel over very nicely. Consider that 12G SDI interfaces and cables are already being implemented for 4K and 8K video production and transport. Two such interfaces (which actually run at 11.6 Gb/s) could easily handle USB 3 data rates, while four fibers could carry USB 4 data without breaking a sweat. Multimode fiber would be more than adequate for extenders with its range of nearly three miles.
Fun fact: You could port 8K/60 10-bit 4:2:2 video over a USB 4 connection, with light compression.
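The arithmetic behind that fun fact, as a sketch: 8K/60 at 10-bit 4:2:2 averages 20 bits per pixel, and the active pixel payload alone nearly fills a 40 Gb/s pipe, which is why “light” compression is needed:

```python
# Why 8K/60 10-bit 4:2:2 needs light compression on a 40 Gb/s link.
ACTIVE = 7680 * 4320        # active pixels per 8K frame
BPP_422_10BIT = 20          # 4:2:2 averages 2 samples/pixel at 10 bits
REFRESH_HZ = 60

payload_gbps = ACTIVE * REFRESH_HZ * BPP_422_10BIT / 1e9
print(round(payload_gbps, 1))       # → 39.8 Gb/s of active video

# Add blanking and protocol overhead and you overshoot 40 Gb/s, so
# even a mild 2:1 mezzanine compression brings it well inside budget.
print(round(payload_gbps / 2, 1))   # → 19.9 Gb/s after 2:1
```

In other words, the raw payload is a whisker under the link rate, and any real-world overhead pushes it over - hence the need for a low-latency, lightly compressed transport.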
Like much of the world, we’re working from a home office now, due to the COVID-19 outbreak. Aside from it being much quieter (we only have two cats for company here, and they’re not particularly noisy), we’re making do with a vintage-2017 laptop, reasonably fast broadband, and the vagaries of online tele/videoconferencing programs like Zoom, Skype, GoToMeeting, and WebEx.
For those readers fortunate to have a dedicated space for a home office, the minimum setup is generally a laptop computer with built-in camera and microphone. (And yes, we know there are folks who still use desktop computers with separate cameras and mics.) It’s a pretty pedestrian arrangement and one that could use some enhancements here and there. While convenient, the audio and video quality from laptops varies widely, and is also impacted by the quality of the broadband connection and the codecs used by the streaming service.
Laptops, by design, don’t have very large speakers built into them. That in turn affects audio quality, which might motivate one to invest in a pair of amplified speakers. Based on observation, these are particularly helpful when participating in online Pilates and yoga classes where the viewer is sitting (by necessity) some distance away from the screen, and by extension, the speakers. We’ve also found these to be very helpful when a family member is engaged in an online art class.
On-board microphones can also use some help. In general, a directional microphone connected to a USB interface is going to sound a lot cleaner during a Zoom session simply because the mic is directional and not prone to picking up background noise from the washing machine, screaming children, or barking dogs. Also, internal microphones often employ some sort of automatic gain control to compensate for their lack of directivity, another enhancement you won’t need with an outboard directional mic.
How about that small screen? Our office happens to have a 46-inch LCD TV in it, across from the desk. For a group meeting, the overall experience can be enhanced by connecting it to the laptop and using it as the primary monitor, along with a separate microphone. You can see those tiny faces in tiny windows more clearly, and your audio will sound a lot better to them. Add in an external USB camera, and you can close the lid of your laptop altogether. (That large display screen is also quite beneficial for online yoga, Pilates, and art instruction we mentioned earlier!)
For those folks who rely on tablets and (horrors!) smartphones to participate in Zoom or Skype meetings, it gets old in a hurry. Best to find a larger screen of some sort, like that new Ultra HDTV you picked up for the Super Bowl, and “cast” your tablet/smartphone screen to it. Put on a headset or a pair of earbuds with microphone for better audio (higher signal-to-noise ratio) and to cut down on ambient room noise in your ears (like that washing machine or the barking dog).
Web conferencing services offer an option to record the meeting. But there’s no reason why you couldn’t just do it yourself locally: All you need is a connection to the HDMI or DisplayPort output of your computer, and an SD-card recording system. If you don’t need the video (and that’s usually the case), use an HDMI de-embedder and simply record to one of many SD-card-based portable audio recorders. They work very well and won’t break the bank. We keep one here for recording everything from local bands to worship services.
One complaint we have about modern laptops is that they’re stingy with USB connections. By the time we’ve connected an external camera, external microphone, printer, and some sort of external recorder, we’re long out of ports. So, a USB extender or multi-port USB distribution system is a handy part of the home toolbox. Make sure you have the right USB plug type, as newer laptops are all moving to Type-C connectors. (And some of those also double as external display connectors, using Alternate Mode.)
AND WHILE WE’RE ON THE SUBJECT…
There’s an old saying – “The problem isn’t the car; it’s the nut behind the wheel.” We’ve participated in several teleconferences during the pandemic using a variety of software platforms, and we’ve seen both the best and the worst of conferencing practices. Admittedly, this is a new and unfamiliar way for many people to communicate – some are surprised that their laptop cameras and microphones even work correctly!
Talking about cameras…the angle of your laptop screen is kinda important. Tilt it too far back, and other conference participants will see a nice view of your ceiling and maybe a tiny bit of your head. Make sure the screen is tilted forward enough so that you wind up with a nice head-and-shoulders composition. The resulting screen angle might not be the one you normally use, but everyone else will be able to see more of you. (You can also elevate your laptop to compensate.)
Of course, an external USB camera eliminates this problem. We’ve found several models online with affordable prices. Some have tilting bases and sit on a desktop, while others can sit atop computer monitors and are also tilting types. All models have built-in microphones.
Where you position your mobile device, laptop, or external camera also matters. As a rule of thumb, don’t sit with windows or other bright light sources behind you – the camera will “iris down” to compensate for the bright backlight, and you’ll come across as rather shadowy! Try to position your camera so your back is to a neutral, uncluttered background, and let as much light as possible fall in your direction. (Bookshelves seem to be popular backdrops these days, especially with authors.)
Some conferencing programs let you create a virtual backdrop, but don’t go crazy with it – no one wants to see pictures of your pets, your boat/sportscar, or your collection of beer cans. Take your phone outside and shoot a picture of a nice, bucolic forest, field, or garden and try that. Or pictures of the ocean, or even just the sky. Nothing busy or distracting! You may also have the option of blurring the background, creating an effect known as bokeh.
You can also set up a table or floor lamp (or even a work light) with soft white LED bulbs to boost light levels. The more light your Webcam has to work with, the sharper and cleaner your video will be, even after it is compressed to death. If you have white or off-white ceilings in your room, try bouncing light instead of direct light – you’ll wind up with softer, more diffuse shadows.
If possible, try a USB headset instead of the built-in mic on your device. Headsets block out distracting audio from outside the room and eliminate any possibility of feedback loop echoes from a local speaker and laptop/tablet/phone microphone, unfortunately a regular occurrence in conferences where participants are using their smartphone as a speakerphone. (Those echoes are REALLY annoying to everyone else!) Plus, other people in your house won’t have to listen to the conference.
And when you’re not going to be talking for a while, mute your microphone so that no one has to hear the garbage truck outside, or your dog barking at a mail carrier. If something comes up that will distract you from the meeting, turn off your camera as well – no need for others to see you scurrying to pick up a wayward child and move them to another room.
We didn’t think we needed to remind anyone to dress appropriately, but apparently some conference attendees have taken “business casual” to a new level. And yes, we’ve heard plenty of stories about folks who are dressed for business from the waist up, but are wearing pajamas, underwear, or going commando from the waist down. (If the camera can’t see it, it’s safe, right?)
Look - It’s a meeting, after all, so show a modicum of respect for other participants. You don’t have to put on formal business wear, but at least avoid the logo T-shirts and tank tops, and pick out a nice collared shirt, blouse, or polo shirt. Surprise everyone, and throw on a blazer. Stick to colors with neutral gray values – no blinding white or deep black colors, which will again throw off the camera. And a little grooming goes a long way. You may not be in the office, but you are in an office of sorts. (What you wear on your feet, however, is up to you…)
When it comes to signal management and distribution, the AV industry primarily relies on HDMI cables, USB cables, category cables, and shielded/unshielded audio cables. In other words, a steady diet of copper gets us from Point A to Point B and beyond. If we’re able to keep the distances between Point A and Point B to a reasonable number, and our bandwidth requirements aren’t excessive, then all is well.
But what happens when our video and audio signals need to traverse a distance longer than a few hundred feet? After all, copper wire has resistance, and by extension, attenuation. What’s more, as the frequency of the transported signal (clock rate) increases, the current crowds toward the outer surface of the conductor, a phenomenon known as the “skin effect,” which raises the cable’s effective resistance even further.
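To put numbers on the skin effect: the depth at which current density falls to 1/e of its surface value shrinks with the square root of frequency. A minimal sketch, using textbook constants for copper (the frequencies chosen are illustrative, not tied to any particular cable spec):

```python
import math

# Skin depth: delta = sqrt(rho / (pi * f * mu)) for a good conductor.
RHO_COPPER = 1.68e-8          # resistivity of copper, ohm-meters
MU_0 = 4 * math.pi * 1e-7     # permeability of free space, H/m

def skin_depth_m(freq_hz: float) -> float:
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_COPPER / (math.pi * freq_hz * MU_0))

for f in (60, 1e6, 148.5e6, 594e6):   # mains, 1 MHz, HD and UHD pixel clocks
    print(f"{f/1e6:>9.3f} MHz -> {skin_depth_m(f)*1e6:9.2f} micrometers")
```

At mains frequency the “skin” is several millimeters deep, so the whole conductor carries current; at a 594 MHz pixel clock it’s only a few micrometers, and most of the copper is doing nothing at all.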
Eventually, the frequency of the signals in use becomes so high that it’s more practical to let the energy leave the cable altogether and travel as electromagnetic waves, which is why television stations broadcasting on UHF frequencies use tuned waveguides to couple energy from the transmitter to the antenna. The installation resembles an elaborate plumbing job more than anything else!
We could extend AV signals over long distances by converting electrons into photons, i.e. light energy. But we’ll need a suitable transmission medium to carry those pulses of light from Point A to Point B. And that’s where optical fiber cable comes in: It’s able to move those photons from a transmitter to a receiver over very long distances with minimal signal attenuation and degradation.
Let’s say you need to run a l-o-n-g HDMI extension to a remote display, mounted about one thousand feet from the source. HDMI, by itself, won’t get you much more than 25 – 50 feet. HDBaseT extensions are only good to about 300 feet. And a network interface isn’t available for this extension. What to do?
Simple. You’ll need an HDMI-to-optical fiber transmitter/receiver set. The transition-minimized differential signaling (TMDS) data from the HDMI source becomes pulses of light, ready to fly through space. If you connect a fiber optic cable using multimode transmission – meaning that the pulses of light reflect repeatedly off the walls of the fiber’s core as they travel – then you can extend the original signal up to 1.8 miles.
If you elect to use single-mode optical fiber (the pulses of light travel in a relatively straight line through the core of the fiber), then you can extend your source signal all the way out to 20 miles for a reliable connection. The choice is up to you! Keep in mind that multimode optical fiber cable is cheaper than single-mode cable (not to mention Cat6 network cable) and is more than adequate for the above example.
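The multimode-versus-single-mode tradeoff comes down to attenuation. Here’s a rough sanity check: the attenuation figures below are typical published values for each fiber type, while the 10 dB power budget and 0.75 dB connector loss are assumptions for illustration, not specs from any particular extender:

```python
# Rough optical link-budget check. Attenuation figures are typical
# published values; the power budget and connector loss are assumptions.
LOSS_DB_PER_KM = {"multimode (850 nm)": 3.0, "single-mode (1310 nm)": 0.35}
POWER_BUDGET_DB = 10.0        # assumed TX power minus RX sensitivity
CONNECTOR_LOSS_DB = 0.75      # assumed loss per mated LC pair, two per link
MILES_PER_KM = 0.6214

for fiber, loss in LOSS_DB_PER_KM.items():
    usable_db = POWER_BUDGET_DB - 2 * CONNECTOR_LOSS_DB
    reach_km = usable_db / loss
    print(f"{fiber}: ~{reach_km:.1f} km ({reach_km * MILES_PER_KM:.1f} miles)")
```

Under those assumptions, multimode tops out just under two miles while single-mode reaches roughly fifteen, which is in the same ballpark as the extender figures above.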
There are a ton of advantages to using fiber optic cable. For one thing, it’s immune to electromagnetic interference, both man-made and natural. It’s also unaffected by ground loops (differential voltages) and magnetic fields. Given its low attenuation per foot, you can just buy a pre-assembled cable with connectors, run the cable, and loop up the excess – no need to trim and re-attach connectors. Or, you can make up your own cables – field-terminated connectors for optical fiber are quite easy to use these days.
Some integrators have already jumped on the fiber bandwagon. 10-gigabit network switches support both copper wire and optical fiber through small form-factor pluggable (SFP) connections, and it’s likely that faster switches will rely mostly on fiber connections – the signal attenuation over copper wire at higher frequencies is substantial, once the cable run exceeds ten feet.
Fiber optic cable doesn’t take up much room, either. Bundled cables with multiple fibers can be run and laid in overhead cable trays easily enough. (Just don’t put a tight bend in them!) By building out a facility with fiber interconnects to all rooms and spaces, you’ve ensured your facility is future-proofed. If another audio or video signaling format comes into vogue, or your bandwidth demands increase, you simply change out the optical interface – no need to pull new cables.
While the current version of HDMI our industry relies on (v2.0) uses the TMDS format, the next version (v2.1) and all versions of DisplayPort employ a packet-based digital transmission system. That’s an even better match for optical fiber transmission! What’s cool about fiber is that we can multiplex audio, video, control, and metadata all through the same cable, at the same time. We do this with a variety of tricks, including time-division multiplexing (spacing out different packets), code-division multiplexing (coding packets), and wavelength-division multiplexing. With the latter process, different wavelengths of light carry different signals.
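To make the multiplexing idea concrete, here’s a toy sketch of the time-division flavor: packets from several logical streams take turns on one shared link and are sorted back out by a stream tag. (The stream names and tags are invented for illustration; real transports use framed protocols, not string prefixes.)

```python
from itertools import zip_longest

# Toy time-division multiplexing: interleave packets from several
# logical streams round-robin onto one "wire", then demultiplex
# them by their stream tag. All names here are made up.
streams = {
    "video": ["V0", "V1", "V2", "V3"],
    "audio": ["A0", "A1"],
    "control": ["C0"],
}

# Multiplex: take one packet from each stream per time slot.
wire = [p for slot in zip_longest(*streams.values()) for p in slot if p]

# Demultiplex: route each packet back to its stream by tag.
received = {name: [] for name in streams}
for packet in wire:
    tag = {"V": "video", "A": "audio", "C": "control"}[packet[0]]
    received[tag].append(packet)

print(wire)   # interleaved order on the shared link
```

Every stream arrives intact even though all of them shared one physical path; wavelength-division multiplexing does the same thing in parallel, with each stream riding its own color of light.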
It should come as no surprise that Kramer supports optical fiber signal distribution. The 675T 4K60 4:4:4 HDMI Transmitter over Ultra-Reach MM/SM Fiber Optic Transmitter and companion 675R Receiver are designed to extend HDMI v2.0 signals over very long cable runs. These useful gadgets couldn’t be easier to set up and operate – all you need to do is provide the fiber optic connection, using a type LC connector at both ends. For shorter runs, multimode cable does the trick, while single-mode fiber will handle the long-haul stuff.
The 675T and 675R use near-zero latency chroma sub-sampling conversion technology to auto-adapt HDMI signals with data rates above 10 Gb/s to a 10G optical link data rate. Both units are HDCP 2.2 compliant and support data rates up to 18G (6G per channel), along with LPCM 7.1, Dolby TrueHD, and DTS-HD audio formats, as specified in HDMI 2.0. Additionally, Kramer’s I-EDIDPro™ Intelligent EDID Processing™ ensures plug-and-play operation for HDMI source and display systems.
If you’re moving to or have already adopted SDVoE network-based AV signal distribution, there’s a Kramer optical interface product for that, too. KDS-8F is a high-performance, zero latency, 4K@60Hz (4:4:4) transceiver for streaming video and audio via Ethernet over single-mode and multimode optical fiber. And it is ambidextrous: KDS-8F can encode and stream its HDMI or DisplayPort input multiplexed with IR and RS-232 control signals, plus analog audio and USB; all over an IP network. Or, it can receive an SDVoE-encoded signal and decode it for HDMI output, along with control, audio, and USB.
For ruggedized operations, Kramer also offers the CRS-PlugNView-H cable. It’s a high-speed, active, armored HDMI optical cable (AOC) designed for the heavy-duty use and abuse expected in rental and road applications. These cables support resolutions up to 4K@60 (4:4:4) at 18 Gbps over long distances without an external power supply or additional extenders. You can get ‘em in a variety of lengths from 33 to 328 feet.
Remember – fiber is good for you!
We focus on video- and display-related products a great deal of the time in these monthly ramblings, and for good reason. Video and display signal management represents the “heavy lifting” of the AV industry, involving a myriad of signal formats with lots of pixels, high bit rates, and different color resolutions.
Interfacing and switching analog video and display signals was quite the headache years ago, and one could argue that the transition to digital came just in time. (Imagine interfacing a 4K/60 connection with discrete analog wiring! On second thought, no, don’t imagine it, you’ll just get a massive headache.)
While video usually grabs our attention, audio often seems to just come along for the ride, like your annoying younger brother when you went to the park to play with your friends. You knew he was there, but you largely ignored him and hoped he wouldn’t wander off and get you in trouble with Mom.
Fact is, we have almost as wide a variety of audio signals these days as we do video signals. And audio can come from a varied number of sources with a wide range of quality levels, ranging from professional quality to “what the heck is all that background noise?” Much of it originates from consumer gadgets that were designed for user convenience and not to win any Hollywood awards for ‘best sound mixing.’
Every product that can capture video also has some form of audio recording built-in. That includes smartphones, tablets, laptop computers, camcorders, digital SLR cameras, point-and-shoot cameras, and even those ‘smart’ speakers that are all the rage nowadays. Because the manufacturers of these gadgets don’t know how or where you plan to record audio, most of these products have some sort of automatic gain control (AGC) turned on to make sure it does get recorded.
That works fine in a quiet space, but not so great outside (wind and ambient noise) or in a crowd (background vocal sounds and ambient noise). And the audio output levels vary from one gadget to another, as do the frequency response and microphone characteristics. If you were to string together a bunch of YouTube video clips shot with a wide variety of cameras and phones, you’d clearly hear these differences.
That’s why having some sort of audio digital signal processing (DSP) is really handy these days. DSP can fix a multitude of problems, including audio levels and equalization, and it can be operated using nothing more than a graphical user interface (GUI) via a network connection. DSP also comes in handy when connecting and mixing good old-fashioned analog microphones, particularly in a meeting or conference space. Think of DSP as replacing an analog sound engineer and mixing board, which would take up too much room in a meeting anyway and be distracting.
We can easily implement DSP in a meeting room and also accommodate a wide range of analog and digital input signals. Better yet, we can also offer DSP to huddle spaces, where audio connections and playback are about as ad hoc as it gets. We might be able to hear each other in the space, but anyone participating remotely won’t hear a thing unless microphones are used and volume levels are set to workable ranges. And it would be nice to level all audio sources so that we don’t transition from “gentle breeze” to “airplane taking off” between clips.
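As a concrete example of what that leveling step involves, here’s a minimal sketch of RMS-based gain matching in plain Python. (Real conference-room DSP works with perceptual loudness measures, limiters, and time-varying gain; the sample values below are invented.)

```python
import math

# Minimal loudness leveling: scale each clip so its RMS level matches
# a common target. Real DSP would use perceptual loudness (LUFS),
# limiting, and smoothing; this shows only the core idea.
def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_to_target(samples, target_rms=0.1):
    gain = target_rms / rms(samples)
    return [s * gain for s in samples]

quiet_clip = [0.01, -0.02, 0.015, -0.01]   # "gentle breeze"
loud_clip = [0.8, -0.9, 0.7, -0.85]        # "airplane taking off"

for name, clip in (("quiet", quiet_clip), ("loud", loud_clip)):
    leveled = level_to_target(clip)
    print(f"{name}: RMS {rms(clip):.3f} -> {rms(leveled):.3f}")
```

Both clips come out at the same RMS level, so the listener never gets the jarring jump between sources.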
Remember – audio can come from just about any connection these days. In addition to analog inputs, we can play digital audio through USB ports and we can also embed it in an HDMI connection. Of course, audio will also be served up over Internet connections, if we’re streaming from a Web site. And all of it will probably need some sort of signal processing so it sounds clear and crisp. We’ll have to support all of these connections to cover the bases.
One advantage of using a digital audio system is that we can also consolidate several other discrete pieces of audio hardware into a single chassis. In addition to signal switching and processing, we can also throw in an amplifier and even some room control functions, again running everything from a network interface to make operation ‘plug and play’ as much as possible. What once required a full rack of mixers, equalizers, amplifiers, and switching gear is now consolidated into a do-everything, single rack unit product.
Kramer knows a little bit about audio, as we’ve been supporting the category for just about four decades. And two of the latest additions to the Kramer audio line reflect this latest thinking in audio hardware and software. AFM-20DSP-AEC is a multi-function audio matrix switcher that comes with 20 bi-directional analog audio ports. Instead of having to work with a pre-determined matrix configuration, you get to decide how many inputs and outputs you need. (And you can easily change your mind later on, because you will. And you know it.)
For digital audio, it includes an HDMI input and output with embedding and de-embedding, plus coaxial S/PDIF input and output jacks. There’s also a 4x4 Dante connection for networked, low latency audio. But Kramer didn’t stop there. AFM-20DSP-AEC also includes a stereo amplifier (2x60-watt @ 8 ohms or 1x120-watt @ 70V/100V). On top of all that, there’s a 32-bit digital-analog converter (DAC) with selectable sampling rates up to 96 kHz, and simultaneous digital signal processing of all inputs and outputs.
The other new product may be a first for the category. Kramer’s DSP-62-AEC was engineered for one of the trickier audio environments – huddle spaces, which are notable for their general lack of hard-wired AV gear. This compact wonder manages to support bi-directional audio through two HDMI inputs and one output, a USB port, a stereo analog audio jack, and up to four analog microphone connections. DSP-62-AEC can route and mix any audio source and send it wherever you want.
If you’re ready to retire racks of discrete audio hardware in favor of a simpler, all-in-one solution, or are scratching your head wondering how you can provide a better audio experience for huddle spaces (which some folks might describe as trying to herd cats), Kramer has you covered. Both AFM-20DSP-AEC and DSP-62-AEC should improve audio quality considerably during meetings. (Sorry, we can’t do anything to fix the quality of presentation content…)
It’s been a couple of years since the first wave of 4K AV products washed ashore, and yet, there is still some confusion about what the term “4K” actually means. It doesn’t help that there has also been (and continues to be) a lot of misinformation offered about this imaging format, ever since the first commercial and consumer displays with 3840x2160 pixel resolution were unveiled in 2012. So let’s clear things up.
To start with, “4K” is kind of a vague catch-all term. A display with true 4K resolution will have 4096 horizontal and 2160 vertical pixels, which is a cinema format. The version of “4K” we’re more familiar with is defined by the Consumer Technology Association as “Ultra HDTV.” Displays and display signals (and many UHD camera sensors) classified as Ultra HD have 3840 horizontal and 2160 vertical pixels. Not quite true 4K, but close enough for our purposes.
Now, here’s where things get tricky. An Ultra HD display signal has four times as many pixels in a single frame as a Full HD video signal; counting the blanking intervals, that’s 9.9 million versus 2.48 million. That’s quite a boost in payload, and it creates a speed limit challenge when interfacing, particularly as we increase the frame rate and color bit depth.
There is truth in numbers! A Full HD video frame has a total of 2200x1125 pixels, including blanking. Multiply that by a frame rate of 60Hz, using 10-bit RGB color (or 4:4:4, using a broadcast notation), add in the customary 20% ANSI bit overhead, and you have a payload of 5.346 gigabits per second. (Let’s call it 5.4 Gb/s, to simplify matters).
How did we arrive at that number? Well, 2200 pixels × 1125 pixels × 60 = 148.5 MHz, which is a common pixel clock frequency for Full HD. Next, we multiply 148.5 by 3, because we’re using RGB color. And we then multiply that product by 12 (10-bit color + 2 bits as overhead) to arrive at our final number: 5.4 Gb/s of data. That can easily fit through an HDMI 1.4 connection, which has a maximum rate of 10.2 Gb/s.
Okay, time to exit our Honda Civic and get into a high-performance BMW M3. Our Ultra HD signal has a total of 4400 horizontal and 2250 vertical pixels with blanking. Refreshing that signal 60 times per second gives us a pixel clock of 594 MHz, and using 10-bit RGB color, we now have a sustained data rate of 21.384 Gb/s. Wow! (Not surprisingly, that’s four times as fast as our Full HD signal calculation.)
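Those two results are easy to reproduce. The helper below follows the article’s own accounting: total pixels including blanking, times the frame rate, times the number of color components, times the bit depth plus 2 bits of overhead:

```python
# Reproduce the article's payload math: total pixels (including blanking)
# x frame rate = pixel clock; then x color components x (bit depth + 2
# bits of overhead, per the accounting used in the text).
def payload_gbps(total_w, total_h, fps, bits_per_color, components=3.0):
    pixel_clock = total_w * total_h * fps
    return pixel_clock * components * (bits_per_color + 2) / 1e9

print(f"Full HD 10-bit RGB:    {payload_gbps(2200, 1125, 60, 10):.3f} Gb/s")
print(f"Ultra HD 10-bit RGB:   {payload_gbps(4400, 2250, 60, 10):.3f} Gb/s")
print(f"Ultra HD 8-bit RGB:    {payload_gbps(4400, 2250, 60, 8):.2f} Gb/s")
print(f"Ultra HD 10-bit 4:2:2: {payload_gbps(4400, 2250, 60, 10, 2.0):.2f} Gb/s")
print(f"Ultra HD 10-bit 4:2:0: {payload_gbps(4400, 2250, 60, 10, 1.5):.2f} Gb/s")
```

The same function also confirms the tradeoffs that follow: dropping to 8-bit RGB, or to 4:2:2 and 4:2:0 color (two and one-and-a-half effective components per pixel, respectively), pulls the rate back under the HDMI 2.0 limit.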
That’s way too fast for HDMI 1.4. In fact, it’s even too fast for HDMI version 2.0, which can’t transport data any faster than 18 Gb/s. (That’s why a newer and faster version of HDMI – v2.1 – is just now coming to market.) Hmmm…do we really need 10-bit RGB color for everyday applications? Probably not, so let’s dial the bit depth back to 8 bits per color, which should suffice for high-resolution graphics and images.
Jiggering the math that way drops our bit rate down to 17.82 Gb/s, which gets us within the speed limit of HDMI 2.0. We can also crawl under the limbo bar by reducing color resolution to 4:2:2 or 4:2:0. A 10-bit Ultra HD signal with 4:2:2 color has a data rate of 14.26 Gb/s, while a 10-bit 4:2:0 version drops that number to 10.7 Gb/s.
And we can trim the data rate even further by cutting the frame rate in half to 30 Hz. Initially, that’s what many signal management companies did to accommodate Ultra HD signals while retaining the HDMI 1.4 interface. But as our display screens get larger (and they are getting a LOT larger), lower frame rates with wider fields of view can produce noticeable flicker. This phenomenon was first observed by Japanese broadcaster NHK as they began rolling out 8K TV broadcasts…but that’s a story for another time.
So, we want to stick with at least a 60 Hz frame rate for our 85-inch Ultra HD LCD monitor or our 120-inch Ultra HD LED wall. We’ll definitely need signal management products equipped with HDMI 2.0, at the minimum. By doing so, we can accommodate more powerful graphics workstations and laptops, where we can set the bit depth to fit our system bandwidth. We can also stream Ultra HD video from physical media and streaming platforms, where the most common color resolution is 4:2:0. Again, an easy fit for our system.
“Hold on there,” you’re probably thinking. “Where is all this demand for Ultra HD coming from?” Time to wake up and smell the coffee, folks: Monitoring and surveillance systems, such as those used by traffic agencies and process-control operators, always need more pixels on the screen. Gamers are always looking for more pixels on the screen at faster refresh rates with low latency. So do companies engaged in energy exploration, visualization and virtual reality, 3D modeling, and medical imaging. You know the old saying: You can NEVER have enough pixels.
And as we just read, the AV industry is rapidly switching over to Ultra HD displays as Asian display panel “fabs” phase out zero-profit Full HD panels and ramp up Ultra HD panel production. That means single-monitor and TV sizes as large as 98 inches using LCD and OLED technology, and tiled LCD/OLED display walls – plus LED walls – that have 4K, 8K, and even higher pixel counts. If you are looking to build a new signal distribution system, the wise bet is to future-proof it by supporting the higher bandwidths required for Ultra HD, all the way through every connection.
Kramer’s VP-551X 8x2 4K presentation switcher/scaler is well-suited to this purpose, equipped with eight discrete HDMI 2.0 inputs with embedded and discrete audio and native support for both 4:4:4 and 4:2:0 color resolutions. In addition to a single HDMI 2.0 output, it also provides an HDBaseT output for 4K/30 RGB or 4K/60 4:2:0.
More importantly, all HDMI ports are compatible with HDR10, a standard for static metadata required to display high dynamic range video. HDR is becoming an intrinsic part of Ultra HD production and display, allowing the reproduction of a much wider range of luminance values from black to full and specular white. VP-551X also passes the Dolby TrueHD and DTS-HD Master Audio formats, common with physical media and streaming playback.
There are other handy gadgets for your Ultra HD signal management system. Kramer’s new 675T and 675R are plug-and-play fiber optic signal extenders that accept type LC optical plugs and will work with either multimode or single-mode fiber, providing signal extensions up to 20 miles with single-mode operation. The beauty of optical fiber is that it has virtually no speed limit issues, as opposed to copper wire-based signal extenders that run several hundred feet at most.
You also have the option of routing your Ultra HD signals over a 10-gigabit Ethernet connection by using the Kramer KDS-8F SDVoE video streaming transceiver. As an encoder, it encodes and streams HDMI or DisplayPort signals along with infrared control, RS-232 control, analog audio, and bi-directional USB 2.0 over an IP network, using small form-factor pluggable (SFP) optical fiber modules. KDS-8F can also work as a receiver to decode all of the signal formats just mentioned to an HDMI port with discrete audio, IR, and RS-232 connections.
And that is “what’s up” with 4K these days. The demand for more pixels, refreshed at faster rates with greater color bit depth, isn’t slowing down one bit. (Did you know that 8K cameras are now being used to inspect sewer pipes? It’s true! But that’s a story for another time…)
You might have noticed one AV product category that’s gotten a ton of attention in recent years (if not most of the attention): Presentation sharing/collaboration. Just about everyone and their uncle offers some sort of hardware and/or software that allows anyone to share what’s on their screen with others, whether they’re running Android™, iOS™, or Windows™.
Some of these products are complex and loaded with all kinds of add-on tools. Others are bare-bones designs that employ little more than screen-scraping techniques. This makes choosing a system unnecessarily difficult for customers, who more often than not aren’t even sure of what they expect out of a presentation sharing/collaboration product.
You could make an argument that this category invented itself. In previous times, people dragged laptops into a meeting room, uncoiled and plugged in AC power and display connections, and used tabletop, under table, and remotely-controlled presentation switchers to cycle among the various presentations. If you wanted a copy of whatever was being shown, it was delivered as a photocopied handout or a PDF file after the fact. What a pain in the neck!
But enough engineers were able to see into the not-so-distant future of mobile, personal electronics, i.e., smartphones and tablets. These devices have become so powerful that they have replaced laptops for many functions. And they make extensive use of two things – fast wireless connections and cloud-based content/file storage and delivery. So, why not use them as a new form of presentation platform, particularly with their ability to easily capture high-quality video (although often in the wrong orientation)?
Soon enough, presentation sharing hardware started rolling off the assembly line. Prospective buyers were overwhelmed with these gadgets, some of which were loaded to the gills with advanced functions that would never be used. Other models seemed to be so simplistic in function that they were little more than cheap consumer products. Company and campus IT departments weighed in with concerns about connectivity to their networks and any security threats these new-fangled gadgets might present.
Eventually, customers had to decide just how much functionality they wanted in such products. Younger users, who consider mobile phones as essential to their lives as their internal organs, were primarily sharing photos and videos. In contrast, older users (those still depending on laptops) were more accustomed to loading up things like spreadsheets for meetings. Some attendees wanted paper copies of the presentation, while others simply took photos of the screen of relevant material, using their phones, of course (which is ironic, in a way).
Calls from the IT department got louder over time. As presentation sharing/collaboration products started popping up on networks, strong passwords and two-factor logins became necessary. Multiple installations meant multiple trips through buildings and across campuses to update firmware. Presenters complained about herky-jerky video and issues sharing iOS screens. It was definitely “gaffer tape and paper clips” time!
Today, the presentation sharing/collaboration marketplace has matured enough that manufacturers have a pretty good idea of how people actually use these gadgets. Over time, anecdotal evidence revealed that a majority of customers just wanted a simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solution, particularly in the education vertical. And as it turned out, there was no need to re-invent the wheel: The best approach was to connect and present without needing to install anything on a computer or mobile device – just leverage the native OS and Web browser protocols that are built into each user’s device.
That’s not to say there wasn’t a need for additional features, such as viewing the main screen on your own device (great for lecture halls with long viewing distances), editing documents together in real time, sharing any size file with anyone else in the meeting or class, instant polling, and turning the main display into a digital whiteboard with recordable annotation. Certain groups needed and continue to need all of those bells and whistles.
But for others, the ability to connect their mobile device quickly and easily to a shared screen using a standard wireless connection was the big draw, using iOS mirroring for MacBook™, iPad™, and iPhone™ as well as native mirroring for Chromebook™, Android (Lollipop OS 5.0 or newer), and Windows phones. So was figuring out a way to stream video at native frame rates without the end result turning into visually-annoying, low frame rate flip movies.
The hardware-intensive approach to early wireless presentation systems has now morphed into one that focuses more on software, and rightly so. Indeed, it’s now possible to build your own wireless presentation system simply by installing a software package and using AirPlay for MacOS & iOS, Miracast™ for Windows & Android, and connecting directly through Chrome or Firefox Web browsers.
A bridge from the past to the present was also created for legacy meeting spaces by adding wired HDMI™ inputs to wireless presentation platforms. The hardware and software mixes both wired and wireless connections together in a seamless way, extending the useful life of existing presentation switchers by making them another gateway to the wireless system. (It’s always good to have options!)
Those overworked folks responsible for maintaining IT networks were placated with a versatile software package that allows remote monitoring and configuration of multiple presentation sharing devices on the network – no need for physical visits to each room or space. And by incorporating 1024-bit encryption over each wireless link (and, if necessary, building “DMZs” with firewalls), security was a non-issue.
What we’ve just described is a general outline of Kramer’s VIA wireless presentation product line. For basic plug-and-play connectivity, VIA GO provides 60 Hz video streaming, 1024-bit encryption, and built-in WiFi. VIA Connect PRO can show up to four screens simultaneously and any in-room meeting participants can view the main display, edit documents together in real time, share any size file, and turn the main display into a digital whiteboard. (VIA Connect PLUS adds a wired HDMI input.)
For more advanced users, Kramer’s VIA Campus2 adds e-polling to instantly measure student feedback and can also be used as a secure wireless access point for guests. Six user screens can be shown on one main display and up to 12 screens by using two displays. Remote students can easily join the class and collaborate in real time with embedded 3rd-party video conferencing and office apps including Microsoft Office®, Skype®, GotoMeeting®, Lync®, and WebEx®. (VIA Campus2 PLUS adds a wired HDMI input.)
In its simplest form, VIA can be loaded and run as a software program (VIAware). It delivers the same security offered by all VIA devices and can be installed on any Windows 10 computer. It can show up to six user screens on one main display or up to 12 screens on two displays, and remote students can easily join and collaborate in real time with embedded 3rd-party video conferencing and office apps.
Finally, VIA Site Management (VSM) is a software application that enables IT administrators to manage, monitor and control all connected VIA devices. VSM generates alerts on system health and includes reporting and analytics tools for understanding VIA device usage.
It’s taken a few years to get there, but we can finally answer the question posed at the start of this missive: What do presenters really want? Simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solutions, as it turns out. Who knew?
The AV industry has a few “benchmark” years, starting with the introduction of light valve video projection in the 1980s and continuing through the first solid-state video/data projectors in 1993, the first flatscreen displays in the mid-1990s, optical disc media in the late 1990s, high definition TV in the early 2000s, a migration from plasma to LCD later that decade, and widespread adoption of high-speed wireless for connectivity over the past five years.
Right now, we’re laser-focused on moving away from full-bandwidth video signal distribution to compressed Full HD and 4K video switched and routed over IT networks. That in itself is a sea change for integrators, and will more closely align our industry with the world of information technology, likely causing the loss of more than a few jobs along the way, as has happened recently in the broadcast industry.
We’ve also bought into the idea of ditching short-arc projection lamps in favor of a more durable, eco-friendly solution that harnesses laser diodes to color phosphor wheels. And we seem to like the concept of wireless signal connectivity for presentations and collaboration, slowly moving away from wired connection hubs on walls and tabletops.
But the biggest change of all is just starting to emerge from behind the curtain, and that is the increasing dominance of the light-emitting diode (LED) in display technology. And when we say “dominance,” we really mean it – LEDs have the potential to become the first unified display platform since the cathode-ray tube was developed a century ago.
It’s not like we didn’t see it coming. LED videowalls with coarse pixel pitch have been around for more than two decades, but they were limited to installations in large stadiums and arenas, and as outdoor signs in places like Times Square and the Las Vegas Strip. That all changed around the start of the present decade, when individual LEDs became practical to manufacture in ever-smaller sizes.
Those videowalls and scoreboards from 1999 had, on average, a pixel pitch of about 10 millimeters. A contemporary version for indoor installations is likely to have a pixel pitch of 2 millimeters, presenting images with much finer detail when viewed at close range. Given that most of the LED device and tile manufacturing takes place in China, it wasn’t long before tile and wall prices began falling…and customers as diverse as staging companies and retail chains took notice, and bought in.
You probably did, too, at ISE and InfoComm about five years ago. Where did all of these LED wall companies come from, all of a sudden? How come I never heard of any of them? Wow, those things are bright! Bet they’re expensive…
To be clear, the position of “display king of the hill” rotates every few years. CRT displays sat on the throne for generations. Then plasma displays took over, only to be deposed by LCD screens. The latter have clung to power for over a decade, but they can see the writing on the wall. The only question is how quickly LCD technology will cede its top-of-the-market position to LEDs.
Based on anecdotal evidence, what we’ve seen at recent trade shows, and forecasts from display analysts, the coronation is going to happen pretty soon. Just as large, economical Full HD LCD monitors and TVs escorted “hang and bang” projectors out of classrooms and meeting spaces, large LED walls are putting a serious dent into the sales of high-brightness projectors, particularly for staging live events. And they’re setting their sights on large LCD monitors and TVs next.
It’s easy to see why. LED walls are built up out of smaller tiles and cubes, just like Lego toys. They literally snap together into lightweight frames, using molded multi-wire plugs to daisy-chain tiles and cubes together, and are easy to fly or stack. LEDs don’t make any noise, aside from small cooling fans when operating in enclosures, and provide a bright, colorful, and high-contrast one-piece display system that can show 4K and even 8K-resolution video.
Still, many end-users and integrators have long regarded LED walls as expensive niche displays, not anything practical enough to install in classrooms, lecture halls, and meeting spaces. Well, that thinking got blown out of the water last June at InfoComm, where we saw the first build-it-yourself LED displays for meeting rooms.
These products have sizes ranging from 120 to 150 diagonal inches (that’s 10 and 12.5 feet, respectively) and offer dot pitches from 1.8 to 2.5 millimeters. They come in modular kits that take two people about three hours to assemble and wire together, and can be hung on a wall or even attached to a roll-around stand – all you need to do is plug in a power cord and connect your video source through an HDMI port, and away you go.
In terms of brightness, LED displays actually have to throttle back on their luminance. Using a pulse-switched operating mode, they could easily hit 2,000 nits, but that would be glaringly uncomfortable in a small room. The actual luminance level is closer to 400 – 600 nits, adjustable to compensate for high ambient light levels.
From a technology perspective, these products are LCD killers. But from a financial perspective, they’re not quite there yet: A 130-inch model has a retail price of about $75,000, which would more than cover the cost of four 65-inch LCD monitors and signal processing gear. Then again, the first 50-inch plasma monitors for commercial use retailed for about $25,000 apiece twenty years ago, so we can expect prices to come down pretty quickly on these products as demand grows.
With large LED walls established as the go-to display for digital signage and image magnification, and fine-pitch walls establishing a beachhead in meeting rooms, the next step is consumer televisions and mobile displays. Late-model Ultra HDTVs with high dynamic range support already use large matrices of mini LEDs as backlights.
It will fall to a new class of LEDs – “micro” devices – to fill in the missing slots in consumer displays…but that’s a story for another time…
The AV industry has arrived at a singularly intriguing point in time. Like the rest of the communications world, we’re getting ready to jump aboard the IT bandwagon and use TCP/IP networks and switches to route and distribute audio and video…leaving the world of uncompressed display signal management behind.
This paradigm shift coincides with another move: the transition from Full HD video to Ultra HD video. But that’s not all: We’ll also have to reckon with high dynamic range content and its associated wider color space. And off in the distance, you can see yet another freight train approaching, with its box cars reading “High Frame Rate Video.”
On the one hand, we’re learning all about codecs, latency, forward error correction, groups of pictures, jumbo frames, and a host of acronyms like DHCP, IGMP, and HLS. On the other, we’re frantically calculating our bandwidth requirements and wondering if we’ll have a fast-enough network to handle all of these high-speed, souped-up pixels.
And that brings up a really good question. Just how fast is “fast enough?” If we’re building an AV-over-IT network, what top speed should we aim for? And how, exactly, can we build some degree of futureproofing into our design so we don’t have to come back a couple of years from now and install new switches and perhaps even new cables?
Let’s start with the basics. Is the network going to be used to switch near-zero or very low latency video? (Audio’s easy, no need to worry there.) If your answer is yes, then you’re talking about data rates of gigabits per second – more likely tens of gigabits per second. On the other hand, if the AV-IT network is going to be used primarily to stream content in non-real time and some latency isn’t a problem, then we’re talking about tens and hundreds of megabits per second.
Still, you need to design for the highest possible speed requirements, and if at any point you want to manage low-latency video using JPEG-based codecs (or other light compression codecs), you’re back in the tens of gigabits per second neighborhood. Currently in our industry, we have manufacturers advocating for 1 Gb/s switch fabrics and others saying, “No, you need at least a 10 Gb switch.”
From our perspective, designing a 1 Gb/s AV-IT network is essentially tagging it with a “planned obsolescence” sign. A basic 4K (Ultra HD) video signal that’s refreshed 60 times per second and has 8-bit RGB color will generate an uncompressed data rate of 17.82 Gb/s, WAY too fast for a 1 Gb/s switch without significant compression. Yet, that’s a rudimentary form of 4K video.
With HDR enhancements, we’ll need to move to 10-bit sampling. We can cut the color resolution from RGB (4:4:4) to the broadcast standard 4:2:2, at which point the new bit rate is 14.26 Gb/s – still too fast for a 10 Gb/s switch. We could drop the frame rate in half to 30 Hz, slowing the bit rate down to 7.13 Gb/s and clearing the switch without additional help. But maybe the application really needs that higher refresh rate?
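Those figures are easy to sanity-check with a few lines of arithmetic. In this sketch we assume the quoted rates cover the full CTA-861 4K/60 raster (4400x2250 pixels, blanking included) plus TMDS 8b/10b line-coding overhead; drop either assumption and the totals shrink accordingly.

```python
def uncompressed_rate_gbps(h_total, v_total, fps, bits_per_pixel, line_coding=1.25):
    """Raw video rate in Gb/s: full raster x frame rate x bits/pixel x 8b/10b overhead."""
    return h_total * v_total * fps * bits_per_pixel * line_coding / 1e9

# 4K/60 with 8-bit RGB (24 bits/pixel) over the full CTA-861 raster
rate = uncompressed_rate_gbps(4400, 2250, 60, 24)
print(f"{rate:.2f} Gb/s")  # 17.82 - hopeless on a 1 Gb/s switch without compression
```

Swap in other rasters, refresh rates, and bit depths to see how quickly the totals climb.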
This problem is ameliorated to some extent with the SDVoE Blue River NT codec. It applies light (2:1) compression with super-low latency to let us limbo with ease under the 10 Gb/s bar. All fine and good, but what if we want to run a 10-bit RGB 4K signal with HDR and a 60 Hz frame rate from point A to points B, C, D, E, and F? The raw data rate is now 21.4 Gb/s, and even 2:1 compression won’t get us through the switch.
One possible solution is to compress the broadcast (4:2:2) 4K video format by using a visually lossless codec like JPEG XS (TiCo) to pack the signal down by as much as 6:1, then come out of the decoder with full-bandwidth display connections. Japanese TV broadcaster NHK showed a demonstration last year at NAB of an 8K signal (7680x4320 pixels) with 10-bit 4:2:2 color and 60 Hz refresh, successfully transiting a 10 Gb/s network switch thanks to 5:1 JPEG XS compression.
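Working backwards from the switch speed shows why a modest compression ratio is enough. This sketch counts active pixels only (no blanking), which is why the minimum ratio lands just under the 5:1 NHK actually used.

```python
def min_compression_ratio(width, height, fps, bits_per_pixel, link_gbps=10.0):
    """Raw rate in Gb/s, and the smallest ratio that squeezes it through a switch."""
    raw = width * height * fps * bits_per_pixel / 1e9
    return raw, raw / link_gbps

# NHK's demo signal: 8K/60 with 10-bit 4:2:2 color (20 bits/pixel)
raw, needed = min_compression_ratio(7680, 4320, 60, 20)
print(f"raw {raw:.1f} Gb/s needs at least {needed:.1f}:1 compression")
```

Any real-world blanking or transport overhead pushes the required ratio up, which is why a little headroom (5:1 rather than 4:1) is a sensible choice.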
So many numbers to think about! But that’s our new world as we continue to push up pixel counts and bit depths. And we haven’t even talked about gamers and virtual reality enthusiasts, who would prefer to see frame rates well above 60 Hz to minimize motion blur. Assume 96, 100, and even 120 Hz for these specialized applications and run the calculations again. (You’ll probably want to jump out the window when you’re finished.)
It’s all too easy to increase bit depth and frame rates for 4K and 8K signals and wind up with data rates that would warrant speeding tickets. But our job as integrators is to provide some sort of future-proofing in any installation…which is why our industry might want to start looking at 40 Gb network switches.
Yes, they exist, although mostly as selected ports on hybrid 10 Gb / 40 Gb switches. And it goes without saying that those 40 Gb ports use quad small form-factor pluggable (QSFP+) optical fiber connections. These switches aren’t insanely expensive: We found several models through an Internet search that are priced between $3,000 and $8,500, and although not “pure” 40 Gb switches, they will do the job.
So – is a 10 Gb network switch “fast enough?” Or maybe we need a 40 Gb switch? Believe it or not, 100 Gb switches are in development. Is THAT fast enough for you? Better check your math…
Unless you were wearing a blindfold while walking around CES 2019, you could not miss the numerous displays of 8K televisions. 8K banners hung everywhere, along with signs for the other “hot” technology at the show – artificial intelligence (AI).
We counted well over a dozen examples of 8K TVs in our perambulations, most of them showing a series of still images with high resolution and high dynamic range. A few demos actually showcased 8K footage, presenting incredible detail in closeups of a honeycomb with attendant bees, or ants crawling over a vegetable garden. (That last clip was so realistic that it freaked out more than a few entomophobes!)
Other more practical demonstrations featured lower-resolution content scaled up to fit the 8K screen. Note that 8K TVs, when they finally arrive in any quantity, will likely start with a screen size of 65 inches and move up from there, hitting a maximum (so far) of 98 diagonal inches. Like it or not, you’ll be sitting pretty close to such a screen, so having all of those pixels plus enhancements like high dynamic range will make for a more pleasing viewing experience.
And you can largely attribute this move up in resolution to the Chinese, more specifically companies like TCL that are building Generation 11 LCD panel fabrication lines. These lines will crank out larger panels that can then be cut into smaller (although still large) sizes and at lower per-panel costs. Given how the competition between Korean and Chinese panel makers has largely decimated any profitability in the Ultra HDTV space (a/k/a 4K), the move up to 8K is almost a necessity.
It shouldn’t be a surprise that the average person is now asking, “Wait a minute! How is it we’re already talking about 8K video and displays? Didn’t we just start moving to 4K resolution? What’s the rush?”
Turns out, the move to 8K has actually been in the works for a long time. More specifically, it started almost 25 years ago in Japan, when broadcaster NHK began researching the next step up from HDTV (which was just getting off the ground outside of Japan!). Their goal was to design and build cameras capable of capturing 8K video at high frame rates, plus the attendant infrastructure to edit, store, and play it back, along with getting it to the home.
NHK’s research and development led to the demonstration of a 4K (4096x2160) camera in 2004 at the NAB show. They introduced their first 8K camera sensor at NAB 2006, followed by an improved version in 2012. Sharp also showed an 85-inch 8K LCD monitor at CES that year, but people didn’t pay as much attention to that demo as they did the arrival of the first 4K / Ultra HDTV monitors in September of that year at the IFA show.
Back then, depending on the brand name, that 84-inch Ultra HD video monitor – which required four HDMI 1.4 inputs to work – could have set you back as much as $25,000 USD. Coincidentally, this was about seven years after we saw the start of a move away from 720p/768p displays and TVs to Full HD (1920x1080) screen resolution.
A year after those ground-breaking 4K TVs showed up, NHK unveiled a 4-pound 8K Steadicam rig, plus a multi-format video recorder prototype. A 13-inch 8K video monitor for cameras, using OLED technology, also took a bow. And by 2014, NHK was broadcasting selected Olympic events in 8K via satellite to locations around the globe.
In our industry, we were still pushing Full HD and 2K displays and signal management products, looking over our shoulder at a 4K dot in the distance and figuring we had plenty of time. That all changed at ISE in 2018, where Ultra HD and 4K displays were everywhere, not to mention an $80,000 8K broadcast camera from Sharp. Full HD digital signage still made plenty of sense, but the economics of LCD panel manufacturing meant that the fabs in Asia would be pulling back on Full HD and ramping up Ultra HD production.
So here we are in 2019, just embracing the move to Ultra HD. Yet, pundits are already saying, “It’s time to give 8K a look.” At least one Tier 1 display brand has already showcased 8K digital signage at ISE and NAB, and will do so again at InfoComm, likely prompting competitors to show they’re at least players in this new game in Orlando. NAB featured half a dozen 8K video cameras along with recording and storage solutions, and what’s likely the first-ever 8K digital SLR camera to hit the consumer market.
Is this irrational exuberance? Hardly. Clever readers will note that this summer will mark seven years since the first 4K TVs took a bow, seven years after the transition started to Full HD (which itself took place about seven years after the industry began moving away from standard definition displays to 720p/768p HD displays).
Industry forecasts are for about 430,000 8K TVs to ship by the end of December, with over 2 million shipments called for in 2020. Those numbers closely track the roll-out of 4K / Ultra HDTV models from 2012 through 2014. Given that our industry really didn’t embrace 4K until 2017, we figure that you have just a couple of years to get with the 8K program.
And keep in mind that we’re fast approaching a point in time when the pixel density in a display just won’t matter anymore. Because of economics, all large TVs and monitors over 65 inches will have 8K resolution, whether you need it or not. Fortunately, video scalers have gotten quite powerful and can “pull up” your lower-resolution content to fit the screen. And other metrics like HDR, color accuracy, and high frame rate support will be the important ones, not the number of pixels.
Are you ready for 8K?
Ever since the HDMI 2.1 standard was announced at CES in 2017, we’ve all been waiting with bated breath for the chipsets to arrive. v2.1 offered such a speed increase over v2.0 (2013) that it sounded almost like science fiction, leaping from 18 gigabits per second (Gb/s) to an amazing 48 Gb/s, just like that!
And the signaling method would change, too, falling into line with the rest of the world by adopting a digital packet structure, much the same as DisplayPort (which, incidentally, is what much of v2.1 was modeled after). Instead of three lanes for red, green, and blue, plus a separate lane for clock information, v2.1 now employs four separate data lanes, each capable of speeds as fast as 12 Gb/s. With a packetized structure, intermixing and embedding clock packets is a piece of cake.
While that all sounded very impressive over two years ago, the reality still has yet to catch up with the promise. At CES 2019, 8K was a “big” thing, and the HDMI Forum booth had several demonstrations of 8K signaling, including an 8K home theater centered around a Samsung 900-series 85-inch 8K TV. (Ironically, the earlier versions of this TV shipped with the older and slower HDMI 2.0 interface.)
If you dug a bit deeper and asked a few more questions, you would have learned that the testing and certification process for the v2.1 interface is still very much in progress and is not likely to wind up until the fall of this year. What’s more, only one chip manufacturer (Socionext) was cranking out v2.1 TX and RX chips in any quantities as of the end of 2018, with other fabs just getting up to speed.
The hype over HDMI 2.1 reached a bit of absurdity when a prominent television manufacturer declared at their CES press conference that all of their 2019 Ultra HD televisions would have v2.1 inputs. (No mention as to how many.) Further questioning revealed that, although video signals could enter one of these Ultra HD televisions through a v2.1 physical interface, the signals would be processed as v2.0 after that inside the set.
Why the push for v2.1? Simple. The latest enhancements to TV – high dynamic range and its associated wider color gamut – create a lot more bits per second. And v2.0 looks more and more like a giant speed bump in that context. Presently, you can push a 4K/60 signal through HDMI 2.0 IF you reduce the bit depth to 8 bits per pixel, using the RGB (4:4:4) format. Want to send a 10-bit signal at the same frame rate? Now you have to cut the color resolution to 4:2:2, not easy to do with a computer video card.
While 48 Gb/s may be unattainable in the near future, a data rate around 36 Gb/s could be within reach. That would allow the passage of 4K/60 content with 12-bit RGB color, a truly spectacular set of images streaming at just shy of 30 Gb/s. Or, you could generate a high frame rate (120 Hz) 4K signal for gaming purposes, using 12-bit 4:2:2 color and still get under the wire at 33.3 Gb/s.
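Whether a given format needs v2.1 at all is simple arithmetic. This is a rough sketch that counts the full 4K raster (4400x2250 with blanking) but ignores line-coding overhead, so its totals land below the on-wire figures quoted above; the 18 and 48 Gb/s caps are the nominal maximums for v2.0 and v2.1.

```python
HDMI_2_0, HDMI_2_1 = 18.0, 48.0  # nominal maximum link rates, Gb/s

def raster_rate_gbps(h_total, v_total, fps, bits_per_pixel):
    # payload over the full raster (blanking included), no line-coding overhead
    return h_total * v_total * fps * bits_per_pixel / 1e9

formats = {
    "4K/60 12-bit RGB (36 bpp)":    raster_rate_gbps(4400, 2250, 60, 36),
    "4K/120 12-bit 4:2:2 (24 bpp)": raster_rate_gbps(4400, 2250, 120, 24),
}
for name, rate in formats.items():
    verdict = "v2.1 only" if rate > HDMI_2_0 else "v2.0 or v2.1"
    print(f"{name}: {rate:.1f} Gb/s -> {verdict}")
```

Both formats blow past the v2.0 ceiling but sit comfortably under 48 Gb/s, which is exactly the gap v2.1 is meant to fill.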
The challenge for HDMI has always been higher bit rates over copper. Unlike DisplayPort, there is no provision in the HDMI 2.1 specification for transport over optical fiber, although that shouldn’t be difficult to accomplish given the interface’s packet structure. Above 40 Gb/s, we may have to use optical fiber simply because signal losses over copper wires would be too high to maintain a workable signal-to-noise ratio.
Over in the DisplayPort camp, there hasn’t been a lot of counter-punching going on. Few manufacturers support the DisplayPort v1.3/1.4 standard (v1.4 adds support for HDR metadata, plus color resolutions other than 4:4:4 / RGB) and it’s only the more exotic video cards that would require that kind of speed. Gaming is a good example of a speed-intensive application and that crowd would love to have 12-bit color refreshing at 120 Hz. Or maybe even faster.
Where does that leave our industry? You’ll be hard-pressed to find many signal management products at InfoComm in June that support v2.1 – it took our industry almost four years to really get onboard with v2.0, and we still get press releases from companies boasting how they finally added HDMI 2.0 to their media players and other products. (Well, it’s been five-and-a-half years, you know!)
From our perspective, we don’t expect to see much adoption of v2.1 until a year from now, and even then, things will move slowly. The ProAV industry is more obsessed with the transition from uncompressed high-bandwidth signal distribution to compressed IT-based distribution, centering on 10 Gb/s network switches. (Sorry, you 1 Gb/s fans, that’s just too slow for future-proofing.)
A good “tell” will be how many 4K and 8K TVs (yep, 8K TVs, repeat as often as necessary) will start arriving in the fourth quarter of 2019 with one or more v2.1 inputs. The more TV manufacturers get with the program, the more likely you’ll see them on commercial monitors and digital signage displays next year in Las Vegas. Another “tell” will be how quickly our industry embraces 8K commercial displays (more to come in our next blog post). Without v2.1, 8K will be nigh impossible.
In the meantime, we’re reminded of that classic song by The Kinks that goes, “I’m so tired, tired of waiting, tired of waiting for you….”
We recently attended the annual Society of Motion Picture & Television Engineers technology conference in Los Angeles. SMPTE has been holding this event for decades and it attracts the best and the brightest to talk about advancements in everything from television and motion picture production to human visual science, advances in audio, video compression and transmission, and lately, promoting the field to women and college graduates.
One invited papers session in particular stood out this year, and its focus was on 8K television. Regular readers of this blog may remember that much of the research and development in this area is being undertaken by NHK, the national Japanese broadcasting network. NHK commenced their work way, way back in 1995, first achieving 4K resolution in camera sensors in 2004 and then introducing their first 8K camera sensor in 2006.
Since then, they’ve designed and built a 4-pound 8K camera system, pushed sensor speeds to as high as 480 Hz, and created a simultaneous downconversion product that ingests 8K video and spits it out at native resolution, 4K resolution, and Full HD resolution. As you can imagine, that requires a ton of processing power!
At this year’s session, one speaker from NHK detailed their work in next-generation camera sensors for 8K that incorporate a unique organic photoconductive film (OPF) layer to boost sensitivity while keeping noise to a minimum. That’s a real challenge when working with small camera sensors – Super 35 sensors for 4K (Ultra HD) production are already jammed full of tiny photosites, a design headache for camera engineers. Now, imagine you’re asked to double the number of pixels in the same size sensor, where the average pixel measures about 3 micrometers!
The second paper described a new 1.25” 8K camera sensor that can record video at frame rates as high as 480 Hz, or eight times as fast as conventional sensors. Using this sensor, fast motion can be captured with minimal blurring and very fine detail. The captured video is down-converted in-camera to 120 Hz for eventual recording and playback. As you might guess, the data flowing from the camera sensor is a gusher: Uncompressed, with 10-bit 4:2:2 color sampling, it approaches 100 gigabits per second (Gb/s), or more than twice as fast as the latest version of HDMI (2.1) can handle.
The final NHK paper talked about setting up the world’s first full-time 4K/8K satellite broadcasting system, which launched in December of 2018. Aside from the technical challenges of bandwidth (both left-hand and right-hand circular polarization of the radio waves were necessary to carry all of the signal data), there was an additional obstacle: Many residents live in older apartment buildings, making the cable infrastructure upgrade process difficult. It was eventually solved by installing parallel lines of plastic optical fiber (POF, or Toslink) alongside existing coaxial cable, telephone, and power lines.
Where is the relevance to our industry? Consider that, ten years ago, Ultra HD and 4K video was largely a lab experiment to many of us. In 2009, we were just getting used to managing Full HD signals over signal distribution and interfacing systems, wrestling with color bit depth and varying frame rates, not to mention the limitations and foibles of HDMI.
Yet, three years later, the first commercial Ultra HD monitors washed up on our shores. A decade later, Ultra HD has become the default resolution for consumer televisions and most commercial AV monitors and displays. Just as we did in 2009, we’re wrestling with the same signal management issues, color bit depths, refresh rates, and a whole new version of HDMI…which isn’t even ready to handle the higher bit rates that 4K video requires for higher frame rates and higher color bit depths.
So, while we fuss, argue, complain, and try to adjust to this latest jump in resolution to 4K, there is a country that is already working at TWICE that video resolution for acquisition, editing, storage, and distribution to the home. There’s every reason to think that we’ll catch up to them eventually – the first 8K televisions already launched to the North American market earlier this year, and we’re seeing early interest in 8K displays for specialized installations like command and control, surveillance, visualization and augmented reality, and (of all things) 3D visualization using autostereo displays.
Skeptics can scoff all they want, but this never-ending push upward and onward in spatial resolution isn’t going to stop. If anything, additional momentum will be provided by enhancements like high dynamic range, wider color gamuts, and high frame rate video. (Did you know that, as display screens get larger and fields of view become wider, any flicker in images created by judder and slower frame rates becomes increasingly noticeable? NHK studied this phenomenon and concluded that a minimum frame rate of 80 Hz was required for 4K and 8K on large displays.)
And as usual, we’ll be expected to interface and transport these signals. The SMPTE SDI standard for UHD (12G SDI) is already inadequate for single-wire serial digital connections, having a maximum data rate of 11.88 Gb/s. This has resulted in 8K camera manufacturers employing four separate 12G SDI ports and some light compression in-camera to record 8K/60 video with 10-bit 4:2:2 color (uncompressed data rate of 47.7 Gb/s). And it’s also revived interest in the latest SMPTE standard for SDI, 24G (23.76 Gb/s, likely over optical fiber).
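The four-link arrangement falls straight out of the arithmetic. In this sketch, the 8800x4500 total raster is an assumption chosen to land near the quoted uncompressed figure; real cameras may use slightly different blanking intervals.

```python
import math

SDI_12G_BPS = 11_880_000_000  # 12G-SDI nominal rate (SMPTE ST 2082), bits/s

def sdi_links_needed(h_total, v_total, fps, bits_per_pixel):
    """Exact integer bit rate, and how many 12G-SDI links it takes to carry it."""
    rate_bps = h_total * v_total * fps * bits_per_pixel
    return rate_bps, math.ceil(rate_bps / SDI_12G_BPS)

# 8K/60 with 10-bit 4:2:2 color (20 bits/pixel), assumed 8800x4500 total raster
rate, links = sdi_links_needed(8800, 4500, 60, 20)
print(f"{rate / 1e9:.1f} Gb/s -> {links} x 12G-SDI links")  # 47.5 Gb/s -> 4 links
```

Run the same function against 24G-SDI's 23.76 Gb/s and the link count drops to two, which explains the renewed interest in that standard.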
This should be interesting to watch, particularly since our industry is still working around the six-year-old TMDS-based HDMI 2.0 interface (18 Gb/s), is still largely allergic to optical fiber, and is promoting an AV/IT video codec that’s barely fast enough to squeeze 4K/60 10-bit 4:4:4 video through a 10-gigabit network switch.
Do you feel the need for speed yet?