There are many things in life that one can never have enough of, like money, vacation days, good wine, music from your favorite band, and dessert. (Especially dessert!) Add bandwidth to that list. You depend on having “enough” bandwidth every time you send someone a photo from your smartphone, stream Hamilton to your TV, or upload a video to Facebook from your laptop.
Fact is, there’s a never-ending quest to use bandwidth more efficiently, one which the vast majority of us are blissfully unaware of. Groups of very smart people are constantly developing and refining video compression and decompression (codec) algorithms to ensure those selfies, Lin-Manuel Miranda’s music and lyrics, and cute kitten videos get to where they’re supposed to go with minimal signal degradation.
And codec developers can barely keep up. 27 years ago, we saw the first high-efficiency codec emerge to handle low-resolution video on optical discs – MPEG-1, produced for the short-lived compact disc interactive (CD-I) format. Video compressed with MPEG-1 looked okay on small CRT screens, but on larger screens, the quality was pretty bad. Although MPEG-1 and CD-I are fortunately distant memories now, the audio compression format they spawned – MPEG-1 Audio Layer III, or MP3 for short – lives on to this day.
MPEG-1 was followed by MPEG-2 in 1996 to encode digital video discs (DVDs) and to compress standard-definition and high-definition video for broadcast, cable, and satellite. It’s still in use today. MPEG-2 encoders have come a very long way in 20+ years and can pack two 720p HD programs plus a handful of SD programs into a single 6 MHz TV channel, all with acceptable visual quality.
Like rust, compression experts never sleep. 17 years ago, a newer and more efficient codec made its debut. H.264 (aka MPEG-4 Part 10, or Advanced Video Coding) promised 50% greater compression efficiency over MPEG-2…and delivered it! By that time, HD video was becoming widely adopted across a multitude of delivery platforms, but available bandwidth was slow to keep up. With the growth of video streaming a few years later, H.264 became the codec of choice and is supported on everything from tablets and smartphones to laptops, smart TVs, camcorders, and DSLRs.
The only problem: by the turn of this past decade, 4K video had become the latest flavor. And to make the job even more difficult, high dynamic range video, with its associated wider color gamut, was part of the 4K package. But bandwidth hadn’t kept up! Hence, High Efficiency Video Coding (HEVC) was rolled out in 2013, again promising 50% more compression efficiency over H.264. HEVC requires quite a bit of computing power to pull off that trick, but it works (and so does its close relative, Google’s VP9 codec). And it’s not cheap to license.
HEVC is used for the UHD Blu-ray optical disc format and for streaming 4K content from just about everywhere except YouTube, which, being owned by Google, employs the “open” VP9 codec. But those licensing costs spurred more tech types to come up with yet another codec: the Alliance for Open Media (AOM) AV1 codec. This royalty-free codec competes with HEVC but is intended solely for streaming video over Internet connections.
Since the people developing high-resolution video formats always seem to be a few steps ahead of the codec people, more codecs have been proposed in an attempt to catch up. Essential Video Coding (EVC, or MPEG-5 Part 1) was developed as an alternative video codec for streaming and OTT, with streaming performance at least equivalent to HEVC. MPEG-5 Part 2, Low Complexity Enhancement Video Coding (LCEVC), is yet another MPEG standard, intended to provide enhanced compression efficiency for existing MPEG-based video codecs.
To top it off, the successor to HEVC (H.265) is now preparing to take the stage. Versatile Video Coding, or VVC (also known as H.266), is designed for maximum compression efficiency across all compatible devices and platforms, with a specific focus on applications like high dynamic range, high frame rate video, 360-degree video for virtual and augmented reality, and 8K UHD-2 video.
Using spatial-only image metrics for reference, VVC is about 40% more efficient than HEVC for UHD and HD compression. However, a VVC reference encoder is about ten times as complex as an HEVC reference encoder, while a VVC reference decoder is about twice as complex as an HEVC reference decoder.
We should mention that HEVC, AV1, EVC, LCEVC, and VVC are software-intensive codecs, unlike the older MPEG-2 and H.264 AVC codecs that are still in widespread use. All five employ larger coding block sizes and need super-fast CPUs and plenty of memory to analyze and compress video streams. By the way, H.264 AVC is no slouch - it’s had 26 updates since it was first rolled out in 2003.
Okay, so you’re getting a headache from all these abbreviations. (We are, too.) The takeaway here is that methods for transporting compressed, high bit rate video over everything from broadcast airwaves and Wi-Fi to 5G and broadband are continuously being refined. You’ll go through life blissfully unaware of which particular codec is being used to stream The Marvelous Mrs. Maisel or let you watch Clemson’s and Alabama’s football teams slug it out on your iPhone. (You will, however, notice any impairments to video quality caused by excessive compression and complain vigorously to your service provider about them!)
Will there ever be a unified codec? That’s the goal, but no one knows how, when, or even if it will happen. In the ongoing quest for efficiency, codec designers are now implementing artificial intelligence (AI) to perform the lightning-quick analysis of incoming video and decisions on which picture elements to compress, by how much, and for how long.
We already use a simple version of AI to dynamically adjust bit rates of multiple programs in a stream, based on constantly-changing available network bandwidth (dynamic stream shaping and adaptive bit rate encoding are two examples). It stands to reason that a unified codec – one based on advanced AI that can optimize delivery of high-resolution video across any network or platform – should emerge at some point and rid us of the “alphabet soup” of codec formats.
It’s hard to believe, but we’ve lived through six months of the COVID-19 pandemic already. (And that’s six months we’ll never get back.) As disruptive as the pandemic has been, people have still come up with clever workarounds. Perhaps the biggest challenge was how to stage conferences and trade shows, which require people to (a) get on a plane and travel to the conference site, and (b) walk the trade show floor and sit with others in seminars, workshops, and keynotes.
So, 2020 brought us something new: The virtual trade show. We should clarify that the concept isn’t entirely new, as some AV manufacturers have tried it in the past. But this time around, exhibitors and attendees had no choice – the only way we had to see new products and hear about new technologies and processes was to turn on our home computer, register for the event, and log into the event Web site.
While attending a trade show or conference this way has certain advantages (no security screenings, boarding lines, taxi fares, and no overpriced hotel rooms and show floor food), there really isn’t any substitute for seeing products in person, catching up with colleagues, and having one of those many serendipitous ad hoc conversations to trade notes on what everyone else saw.
We’re well through trade show season now, and the jury is still out on how effective the virtual versions of NAB and InfoComm turned out to be. IFA will be limited to 1000 members of the press attending in person (if they’re lucky to get that many!), while IBC, NAB New York, and even CES 2021 have all opted to go virtual. It will be a long time before any of us prints out an exhibitor, press, or attendee badge and hangs it on a lanyard again!
One show in particular suffered perhaps more than others – the Society for Information Display (SID) annual exposition. Originally scheduled for late May, it went virtual in early August, with many sessions still accessible as of early September. But exhibitors didn’t hold back – this is the event where cutting-edge display tech usually takes its first bows and we get a look at the future of displays even though many of the products shown are still in prototype stage.
Based on the many press releases and photos we’ve received, it’s clear that just about all of the innovation is happening in large, self-contained displays. While prominent manufacturers like Samsung, Panasonic, and LG Display are turning their backs on liquid-crystal display (LCD) panel manufacturing – decisions attributable to evaporating profits – they’re sinking more cash into emissive display products including white OLED with color filters (WOLED), quantum dots driven by blue OLEDs (QD-OLED), quantum dots driven by blue nanorods, and micro and mini LED displays.
The fact is, unless you are manufacturing LCD panels with large mother glass sizes in China, you might as well just take your cash, pile it into big hills, and set it on fire. You’ll get the same result, but much faster. Even the Chinese display behemoths like TCL, BOE, TPV, and CSOT are finding good profit margins in LCD panel manufacturing much harder to come by. (And forget projectors – their market share is getting hammered by ever-larger and cheaper LCD displays at one end, and tiled mini LED displays at the other.)
What’s fascinating about this new generation of displays is that they’re all emissive in design. That is, the light they throw off comes directly to your eye, not through the layers of light shutters, color filters, and polarizers found in LCD displays. That translates into better contrast, deeper black levels, and wider viewing angles, all of which were hallmarks of cathode-ray tube (CRT) televisions and monitors 25 years ago. (Ergo, what was once old is basically new again!)
Even better, emerging technologies like quantum dots and mini/micro LED aren’t brightness-limited the way CRTs were. WOLED displays – found in premium 4K TVs, production monitors, and digital signs – can achieve luminance levels of 700 candelas per square meter (cd/m2) with small-area full white signals. Displays equipped with quantum dots push well past 1,000 cd/m2, while mini LED displays (those with a pixel pitch of less than 2.5mm) can easily reach 3,000 cd/m2. By comparison, a full white screen on a CRT grading monitor would measure only 100 cd/m2.
This move to bright, saturated images couldn’t come at a better time, as high dynamic range (HDR) imaging with its associated wide color gamut (WCG) is becoming popular. Combined with high-resolution 4K and 8K displays with ever-larger screens, the popular descriptions of “being there” and “like looking through a window” couldn’t be more apt.
Next-generation displays aren’t just about brighter images with highly-saturated colors. The next frontiers in display tech are flexibility and transparency. Think of folding smartphones and see-through televisions, products that have been in research and development for over a decade and are only just coming to market. How about TV screens that can wrap around poles and stanchions, or be shaped into the petals of a flower? There are also enhancements to touch screens and even touch-less screens that use ultrasonic sensors to operate cursors, open apps and windows, and play, fast forward, pause, and rewind video.
All in all, it’s pretty amazing stuff. At some point, when the pandemic runs its course, you’ll be able to see these cool display products in person, just like the good old days of 2019.
Until then…wish you were here…
Funny, isn’t it? A year ago, colleagues in our industry were alternately embracing, rejecting, or arguing about the upward trend to 4K video. We debated what “4K” actually meant. We got out our calculators to see what flavor of “4K” signal we could pass through our existing HDMI infrastructure. We watched as display manufacturers began replacing Full HD displays with Ultra HD versions (the correct term, as these have a pixel resolution of 3840x2160 and aren’t really true 4K).
And we slapped our heads and groaned as display analysts warned us to get ready for 8K video, along with high dynamic range, wider color gamut, and higher frame rates. We read up on the latest versions of HDMI and DisplayPort and reviewed the press releases from standards organizations touting the latest high-efficiency codecs, such as the new Versatile Video Coding (VVC) standard.
What a difference 12 months and a pandemic make. Now, most of us are working from and learning at home, watching video on Zoom, GoToMeeting, WebEx, Teams, and other conferencing platforms. Some of the video looks decent; a lot of it is pretty awful. Much of that can be blamed on “smart” video codecs that use adaptive, variable streaming techniques and dynamic stream shaping to maximize video resolution based on available network speeds during any given time interval. (Audio is easy to deliver – even spatial sound requires just over 1 megabit per second to stream.)
Aside from consumers buying large Ultra HD televisions like there’s no tomorrow – perhaps to stream “Hamilton” in HDR – we’re not hearing much about 4K and 8K TV right now. 2020’s showcase event for 8K, the 2020 Tokyo Olympics, was pushed back a year. Major sports leagues have cobbled together short schedules to compete in mostly empty stadiums, and some have moved their teams to one or two “bubble” locations for round-robin playoffs to stay isolated from COVID-19 outbreaks. (It’s not working, in case you’re wondering.) They’re not concerned presently about producing anything using cutting-edge video.
Guess what? Pixel resolution isn’t our primary concern now. Bandwidth is, and it tends to be fixed, like the water pressure in a large hotel. If one or two people are taking a shower or flushing a toilet at any given time, there’s ample flow. But if every guest did either one at the same time - well, that “ample” water pressure would quickly reduce to a trickle.
Internet bandwidth works the same way, and right now, we have hundreds of millions of home-bound workers, students, online gamers, churches, and Netflix bingers all grabbing as much bandwidth as they can, often at the same time. To compensate, video streaming services and Internet service providers (ISPs) can “throttle” bandwidth if necessary – this was done early on in areas with stay-at-home orders as online users surged, pleasing no one.
Alternatively, we can make decisions on our end to help conserve bandwidth. For conferencing, distance learning, online worship, government meetings, and other events that will attract large numbers of remote viewers, the focus should be on effective communication above all else. Consequently, we should select the most bandwidth-friendly video format. And instead of going up in resolution, we might want to go down. (Heads up: We’re about to become a bit contrarian.)
We certainly need to fill a 16:9 (or 16:10) screen. And we want to show fine detail. Do we need a high frame rate? Only if fast-moving objects are being shown, which is rarely the case with an online class or Web conference. The amount of motion in a worship service is also minimal compared to an auto race or a basketball game.
Turns out, we do have a video format that’s very well-suited to the online world - 1280x720p HD. Yes, it is the lowest version of HD, and compared to Ultra HD with HDR, it looks more like our old standard-definition video systems. Even so, many TV networks use 720p as a baseline for broadcasting everything from scripted entertainment to live sports - and it doesn’t look half bad on large screens.
The minimum viewing distance for a 42-inch diagonal 720p screen is around eight feet. But you won’t find any of those today, and you’ll be hard-pressed to score a 42-inch 1080p TV, what with manufacturers switching to Ultra HD native resolution. Not to worry; the major TV brands have incorporated some pretty sophisticated picture-scaling engines into their sets, which make your 720p video look better than you might expect on that new 65-inch Ultra HD TV.
If you want to be really thrifty, consider that just about every display we watch today supports multiple frame rates, so we can easily stream 720p HD at 25/30 frames per second for greater efficiency, or 50/60 frames per second if motion is being shown.
With a 60 Hz refresh rate, our pixel clock is 74.25 MHz, and with 4:2:0 8-bit color, the total data rate works out to roughly 0.9 gigabit per second, uncompressed. A high-efficiency codec like H.264 can easily mash that down to the range of 3-5 megabits per second, with some latency. Even slow residential broadband connections (such as one we tested recently at 8 Mb/s in rural Vermont) can accommodate that data rate.
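For the curious, here’s the back-of-the-envelope math in runnable form. This is just a sketch using our own assumptions – the SMPTE 296M pixel clock of 74.25 MHz (which includes blanking intervals) and two common chroma-sampling choices – so depending on which sampling you pick, the uncompressed figure lands between roughly 0.9 and 1.2 Gb/s:

```python
# Back-of-the-envelope data rates for 720p at 60 Hz.
# Assumption (ours): SMPTE 296M timing, where the 74.25 MHz pixel
# clock covers 1650 x 750 total pixels (active video plus blanking).

PIXEL_CLOCK_HZ = 74.25e6   # total pixels per second, blanking included
BPP_420 = 12               # bits per pixel, 4:2:0 chroma sampling, 8-bit
BPP_422 = 16               # bits per pixel, 4:2:2 chroma sampling, 8-bit

raw_420 = PIXEL_CLOCK_HZ * BPP_420
raw_422 = PIXEL_CLOCK_HZ * BPP_422
print(f"4:2:0 uncompressed: {raw_420 / 1e9:.2f} Gb/s")  # 0.89 Gb/s
print(f"4:2:2 uncompressed: {raw_422 / 1e9:.2f} Gb/s")  # 1.19 Gb/s

# Squeezing the 4:2:0 stream down to a 4 Mb/s H.264 feed implies a
# compression ratio in the neighborhood of 223:1.
print(f"Compression ratio at 4 Mb/s: {raw_420 / 4e6:.0f}:1")
```

At 30 frames per second, halve the pixel clock and every figure above drops by half – which is the whole argument for streaming at 30p when the content allows.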
We can hear the cries and squalling of purists now. “720p is an old HD format!” and “Once you’ve seen 4K video on a big screen, you’ll never turn back!” Well, keep in mind that many remote workers, students, worshipers, and civic-minded citizens are watching on laptops, tablets, and even smartphones. And despite record sales of large-screen TVs this year, there are still quite a few smaller TV sets in use today, with “smaller” defined as 55 inches diagonally or less.
Recall what we said earlier about our priorities: Effective communication tops all else, especially pretty pictures. Given the crushing demand on Internet connectivity right now, you should set your output resolution to 1280x720 on streaming cameras wherever possible. 60p is fine, but 30p will use just half the bandwidth of 60p. (Ditto for 25p versus 50p.)
When we finally reach the end of the pandemic tunnel – and we WILL reach it; it will just take a while – the debates about 4K and 8K can happily resume. You can set your streaming cameras back to Full HD output and not feel guilty. We can get back to worrying about data rates and juggling combinations of frame rates and color resolution to pass through whatever display interface we’re stuck with.
Perhaps by then, average broadband data rates will support full “4K” Web conferencing. Either that, or we’ll have codecs so efficient that they can pack down 8K video streams to 10 Mb/s or less with just a few frames of latency. (Hey – we can dream, can’t we?)
We’re going to change things up a bit this month and shift our focus away from technology. As of this writing, the coronavirus pandemic is resurgent in several countries, particularly in the United States and Latin America. “The times they are a-changin’,” goes the chorus of Bob Dylan’s classic song from early 1964, and if there’s a better expression for what’s going on around the world right now, we can’t think of it.
It’s been four months since much of the world began shutting down in an attempt to slow the spread of COVID-19. The economic hit has been considerable, with the International Monetary Fund now predicting a 4.9% contraction in the global economy for 2020. As businesses try to maintain operations in a variety of quarantine or near-quarantine conditions, they’ve been forced to make changes “on the fly.” It now appears many of those changes will become permanent, even after a vaccine is available and the pandemic eventually winds down. And all of these changes will have an impact on the commercial AV industry.
To start with, let’s look at the increase in the number of remote workers. It’s estimated that 40% of the workforce in the United States will never return to an office, post-pandemic. Instead, they will continue to work from home offices or other remote locations. This, in turn, will reduce the demand for office space in cities and suburbs. And that will have a ripple effect on affiliated retail, hospitality, and service businesses – not to mention tax revenues.
A New York Times story on May 12 quoted executives from JP Morgan Chase, Morgan Stanley, and Barclays Bank – three of the largest commercial tenants in New York City – as saying they will not re-occupy all of the office space they originally leased prior to the COVID-19 pandemic. Instead, many employees will remain working from home permanently, and current leases will not be renewed once they expire.
Companies that were once opposed to the concept of remote workers are now embracing it as this unintentional, worldwide laboratory experiment winds on. Instead of employees gathering to meet in a room, they’re all logging in to virtual meetings via Zoom, GoToMeeting, WebEx, Teams, and other conferencing programs. (Have you tried to buy an external Webcam recently? Good luck, many models are out of stock and on long backorder. So are USB microphones for better audio.)
On the education front, primary and secondary schools in many states are planning to re-open in the fall of 2020, but only in regions where COVID-19 infections have dropped below a specified metric. Not surprisingly, the use of distance learning and group audio and videoconferencing has grown exponentially. And some colleges and universities, anticipating a second wave of infections, have already announced they will revert to the virtual classroom model for the winter semester as a precaution.
Perhaps it’s no surprise then that sales of PCs and laptops spiked upward in the first quarter of 2020: Intel reported a 23% increase in revenue, while AMD saw a 73% bump. Western Digital noted a surge in demand for storage components “due to the shift to working from home and e-learning.” Chromebook sales to students also took off, according to Google. And mobile device sales suffered, with market analyst firm IDC reporting that smartphone shipments had fallen 11.7 percent in Q1, while tablets dropped by 18.2 percent.
The viral pandemic also shuttered a good deal of PC and laptop manufacturing capacity in Asia. According to a June 8 story on The Verge, “…Retail analytics firm Stackline found that in recent weeks, traffic to laptop product Web pages has grown 100 to 130 percent (year over year). Conversion rates (that is, the proportion of visitors to laptop product pages who actually purchase), conversely, have plummeted; they’re normally around 3 percent, but in mid-May they hit an all-time low of 1.5 percent. In other words: people are looking for laptops more, but they’re having trouble finding products in stock to actually purchase.”
Another market that will take some time to recover is that of expositions, conferences, and trade shows. Several polls taken this spring have shown conclusively that respondents are not at all interested in attending these events until a safe, effective, and proven COVID-19 vaccine is widely available. And major trade shows have toppled like dominoes this spring and summer, with another wave of cancellations now being announced for the fall.
Touring musical acts, festivals, theme and amusement parks, and sporting events are also struggling to figure out how to re-open and re-schedule without pushing COVID-19 infection rates upward. There is a consequent, direct impact on the transportation, hospitality, and advertising sectors, and in our industry, rental and staging companies.
However, electronic gaming and e-Sports are thriving, as their participants all compete online. Not surprisingly, video streaming of all kinds is also surging; from movies and TV shows to worship services, online courses, and virtual travel. Several movies that were intended for theatrical release went directly to streaming and digital downloads as theater chains closed down. And believe it or not, sales of Blu-ray discs have also ticked up as people dust off their old players and look for ways to entertain their kids.
No one can state precisely what impact these trends will have on the AV industry. We know the increase in remote workers will definitely continue – many companies have been operating this way for years, such as health insurance providers. With ever-faster Internet available to more and more homes and apartments, video conferencing isn’t such a big deal anymore. The unanswered question: How much do face-to-face communications and meetings matter now?
For higher education, the current technological limits of distance learning and virtual classrooms will be sorely tested in coming months. What courses and classes lend themselves to online learning, and which still require a physical presence? Can colleges and universities survive a decline in enrollment caused by the pandemic? Can they justify maintaining large campuses dotted with classrooms, lecture halls, and other facilities when so few are using them?
We know a few things to be true. First, fast Internet connections (and in particular, Wi-Fi 6) are and will continue to be of paramount importance to everyone. Video streaming is here to stay, and for remote workers and distance learning, it’s as essential to life as oxygen.
Second, it appears rumors of the PC’s demise are greatly exaggerated (with apologies to Mark Twain). Mobile devices have their place, but just aren’t practical for day-to-day office work. And third, many of us need better cameras and microphones for conferencing – some cameras create truly awful images and don’t focus very well, while many microphones make you sound like you’re at the bottom of a well.
As AV professionals, we’re tasked with coming up with solutions for our customers. And boy oh boy, do they have some real challenges to solve nowadays! We may find that today’s “hot” product categories don’t apply in the future, and that we’ll have to go back to the drawing board time and again to keep up. (But that’s what we do best, isn’t it?)
Until next time, stay healthy….
A news digest publication we receive each week, appropriately called The Week, features a small section of stories each month under the heading, “Boring, but Important.” That description suits this post quite nicely, as it details the latest specifications for the Universal Serial Bus (USB) – version 4.0.
We tend to take USB for granted, as it is kind of boring. We don’t think much about those receptacles on the sides of our laptops and mobile devices. Most of us use them for charging up smartphones and tablets and connecting thumb drives and other external storage to laptops and desktops, along with wireless keyboard and mouse receivers. And we usually complain that there aren’t enough USB ports and buy USB hubs to connect scanners, printers, and other peripherals.
The original USB specification was released in January of 1996, and the first devices equipped with USB 1.0 connectors made their appearance 24 years ago this month. (Happy birthday!) That’s a lifetime in the world of computer peripherals: USB version 1.0 supported data rates of 1.5 megabits per second (Mb/s) in low-speed mode and 12 Mb/s in full-speed mode. In contrast, the current version (3.2) clips along at 20 GIGABITS per second (Gb/s).
Version 3.1 also did away with seven different connector styles (not to mention an entire cottage industry of between-version adapter plugs and cables) in favor of the symmetrical 24-pin Type-C plug and jack, and added some intelligence to the connection so it could do more than just supply 5 volts for charging and move data back and forth. Now, a “triple play” was possible: power, data, and display over a single connection.
By all accounts, the Type-C connector is durable and popular. (Ask anyone who has fumbled in the dark with a Micro USB plug to charge up a phone, or broken a Micro plug while trying to push it into a Mini jack.) It takes up a lot less room on laptops, making for thinner and lighter designs as optical disc drives and bulky hard drives are also jettisoned. And it’s fast, no doubt about that – while writing this missive, we just backed up large music and photo files to an external Type-C solid-state drive at an average rate of 150 megabytes per second.
So, why should we care all that much about enhancements to USB? The answer is simple – more and more peripherals are using USB as their only connection to the outside world. For example, during the current COVID-19 pandemic, sales of webcams and other streaming devices are through the roof. While more expensive cameras do provide a variety of connector options, lower-price models rely exclusively on USB ports for video and audio. And the same is true for most computer headsets.
The USB 4 specifications, announced while you weren’t paying attention last fall, build on Intel’s Thunderbolt technology to raise data transfer rates to 40 Gb/s over version 3.2’s 20 Gb/s. While very few current laptop models have USB 3.2 ports on them, you can be sure future models will step on the gas – the USB Implementers Forum (USB-IF) estimates that devices with USB 4 ports will start appearing toward the end of this year.
The extra speed isn’t just for data. The Type-C specification (introduced alongside USB 3.1) brought us Alternate Mode, meaning one or more serial data lanes can be repurposed to serve as display connections while continuing to move data back and forth. DisplayPort version 1.3 was the first implementation, and HDMI 2.1 will also travel nicely over this connection. If you want to connect an Ultra HD computer monitor via Alternate Mode using 10-bit RGB color at 60 Hz, you’ll need to sustain a data rate of around 21 Gb/s. And if you plan on running data back and forth at the same time – well, you get the idea.
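That ~21 Gb/s figure can be sanity-checked with a little arithmetic. The sketch below uses our own assumptions – CVT-R2 reduced-blanking timing (roughly a 533 MHz pixel clock for 3840x2160 at 60 Hz) and DisplayPort’s 8b/10b line coding, which adds 25% overhead – so treat the result as approximate:

```python
# Rough link-rate math for a UHD monitor over USB-C Alternate Mode.
# Assumptions (ours): ~533 MHz CVT-R2 pixel clock for 3840x2160 @ 60 Hz,
# 10-bit RGB (30 bits per pixel), and 8b/10b line coding (10/8 overhead).

PIXEL_CLOCK_HZ = 533e6     # approximate reduced-blanking pixel clock
BITS_PER_PIXEL = 30        # 10-bit RGB
LINE_CODING = 10 / 8       # 8b/10b encoding used by DisplayPort

payload = PIXEL_CLOCK_HZ * BITS_PER_PIXEL
on_the_wire = payload * LINE_CODING
print(f"Video payload: {payload / 1e9:.1f} Gb/s")     # 16.0 Gb/s
print(f"On the wire:   {on_the_wire / 1e9:.1f} Gb/s") # 20.0 Gb/s
```

Call it 20-21 Gb/s once link overhead is counted – more than an entire 20 Gb/s USB 3.2 connection can carry, which is exactly why USB 4’s 40 Gb/s headroom matters.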
What’s different about version 4 is that it uses tunneling over two lanes to move everything. From the USB-IF press release: “Key characteristics of the USB4 solution include two-lane operation using existing USB Type-C cables and up to 40 Gbps operation over 40 Gbps-certified cables, multiple data and display protocols to efficiently share the total available bandwidth over the bus, and backward compatibility with USB 3.2, USB 2.0 and Thunderbolt 3.”
The term “smart interface” certainly applies here. Version 4 is expected to be much more efficient at allocating bandwidth for simultaneous transmission of data and video. If a monitor is using 20 percent of the available bandwidth for video, the remaining 80 percent can be used for data. And USB Power Delivery – an intelligent charging protocol that allows negotiation of faster (or slower, but less battery-draining) charge rates for mobile devices – is now part of the package.
Historically, the AV industry has relied on separate connectors for separate functions…HDMI for display, category cable for IT networks, RS-232 for control, unbalanced and balanced connectors for audio, and of course USB for data and peripherals. For long signal runs, we flip-flop between using category wire for proprietary HDMI/audio/control extenders and for AV-over-IT networked applications.
Is it time to think instead about moving to a “tunneled” approach for long signal runs – aggregating video, audio, data, and control packets using USB 4 protocols while also supporting connected peripherals such as keyboards, mice, printers, scanners, Webcams, audio interfaces, and PTZ cameras? Logic says yes, but the immediate obstacle is the data rates involved: HDBaseT is still stuck at USB 2.0 (max. 480 Mb/s), and AV-over-IT applications are coalescing around 10 Gb/s network switches.
However, there’s always optical fiber, which serial data interfaces travel over very nicely. Consider that 12G-SDI interfaces and cables are already being implemented for 4K and 8K video production and transport. Two such interfaces (which actually run at 11.88 Gb/s) could easily handle USB 3 data rates, while four fibers could carry USB 4 data without breaking a sweat. Multimode fiber, with a range of nearly three miles, would be more than adequate for extenders.
Fun fact: You could port 8K/60 10-bit 4:2:2 video over a USB 4 connection, with light compression.
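Here’s the arithmetic behind that fun fact, as a sketch (active pixels only; blanking and USB protocol overhead are ignored, which is precisely why light compression is needed on a 40 Gb/s link):

```python
# Raw bit rate of 8K/60 video with 10-bit 4:2:2 chroma sampling.
width, height, fps = 7680, 4320, 60
bits_per_pixel = 20   # 4:2:2 at 10 bits per component averages 20 bpp

raw_bps = width * height * fps * bits_per_pixel
print(f"Raw 8K/60 10-bit 4:2:2: {raw_bps / 1e9:.1f} Gb/s")  # 39.8 Gb/s
```

That’s 39.8 Gb/s of a 40 Gb/s pipe before a single byte of overhead – so even a modest 2:1 mezzanine-style squeeze leaves comfortable room for everything else on the bus.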
Like much of the world, we’re working from a home office now, due to the COVID-19 outbreak. Aside from it being much quieter (we only have two cats for company here, and they’re not particularly noisy), we’re making do with a vintage-2017 laptop, reasonably fast broadband, and the vagaries of online tele/videoconferencing programs like Zoom, Skype, GoToMeeting, and WebEx.
For those readers fortunate to have a dedicated space for a home office, the minimum setup is generally a laptop computer with built-in camera and microphone. (And yes, we know there are folks who still use desktop computers with separate cameras and mics.) It’s a pretty pedestrian arrangement and one that could use some enhancements here and there. While convenient, the audio and video quality from laptops varies widely, and is also impacted by the quality of the broadband connection and the codecs used by the streaming service.
Laptops, by design, don’t have very large speakers built into them. That in turn affects audio quality, which might motivate one to invest in a pair of amplified speakers. Based on observation, these are particularly helpful when participating in online Pilates and yoga classes where the viewer is sitting (by necessity) some distance away from the screen, and by extension, the speakers. We’ve also found these to be very helpful when a family member is engaged in an online art class.
On-board microphones can also use some help. In general, a directional microphone connected to a USB interface is going to sound a lot cleaner during a Zoom session simply because the mic is directional and not prone to picking up background noise from the washing machine, screaming children, or barking dogs. Also, internal microphones often employ some sort of automatic gain control to compensate for their lack of directivity, another enhancement you won’t need with an outboard directional mic.
How about that small screen? Our office happens to have a 46-inch LCD TV in it, across from the desk. For a group meeting, the overall experience can be enhanced by connecting it to the laptop and using it as the primary monitor, along with a separate microphone. You can see those tiny faces in tiny windows more clearly, and your audio will sound a lot better to them. Add in an external USB camera, and you can close the lid of your laptop altogether. (That large display screen is also quite beneficial for online yoga, Pilates, and art instruction we mentioned earlier!)
For those folks who rely on tablets and (horrors!) smartphones to participate in Zoom or Skype meetings, it gets old in a hurry. Best to find a larger screen of some sort, like that new Ultra HDTV you picked up for the Super Bowl, and “cast” your tablet/smartphone screen to it. Put on a headset or a pair of earbuds with microphone for better audio (higher signal-to-noise ratio) and to cut down on ambient room noise in your ears (like that washing machine or the barking dog).
Web conferencing services offer an option to record the meeting. But there’s no reason why you couldn’t just do it yourself locally: All you need is a connection to the HDMI or DisplayPort output of your computer and an SD-card recording system. If you don’t need the video (and that’s usually the case), use an HDMI de-embedder and simply record to one of many SD-card-based portable audio recorders. They work very well and won’t break the bank. We keep one here for recording everything from local bands to worship services.
One complaint we have about modern laptops is that they’re stingy with USB connections. By the time we’ve connected an external camera, external microphone, printer, and some sort of external recorder, we’re long out of ports. So, a USB extender or multi-port USB distribution system is a handy part of the home toolbox. Make sure you have the right USB plug type, as newer laptops are all moving to Type-C connectors. (And some of those also double as external display connectors, using Alternate Mode.)
AND WHILE WE’RE ON THE SUBJECT…
There’s an old saying – “The problem isn’t the car; it’s the nut behind the wheel.” We’ve participated in several teleconferences during the pandemic using a variety of software platforms, and we’ve seen both the best and the worst of conferencing practices. Admittedly, this is a new and unfamiliar way for many people to communicate – some are surprised that their laptop cameras and microphones even work correctly!
Talking about cameras…the angle of your laptop screen is kinda important. Tilt it too far back, and other conference participants will see a nice view of your ceiling and maybe a tiny bit of your head. Make sure the screen is tilted forward enough so that you wind up with a nice head-and-shoulders composition. The resulting screen angle might not be the one you normally use, but everyone else will be able to see more of you. (You can also elevate your laptop to compensate.)
Of course, an external USB camera eliminates this problem. We’ve found several affordable models online. Some have tilting bases and sit on a desktop, while others perch atop a computer monitor and tilt as well. All models have built-in microphones.
Where you position your mobile device, laptop, or external camera also matters. As a rule of thumb, don’t sit with windows or other bright light sources behind you – the camera will “iris down” to compensate for the exposure, and you’ll come across as rather shadowy! Try to position your camera so your back is to a neutral, uncluttered background, and let as much light as possible fall in your direction. (Bookshelves seem to be popular backdrops these days, especially with authors.)
Some conferencing programs let you create a virtual backdrop, but don’t go crazy with it – no one wants to see pictures of your pets, your boat/sportscar, or your collection of beer cans. Take your phone outside and shoot a picture of a nice, bucolic forest, field, or garden and try that. Or pictures of the ocean, or even just the sky. Nothing busy or distracting! You may also have the option of blurring the background, creating an effect known as bokeh.
You can also set up a table or floor lamp (or even a work light) with soft white LED bulbs to boost light levels. The more light your Webcam has to work with, the sharper and cleaner your video will be, even after it is compressed to death. If you have white or off-white ceilings in your room, try bouncing light instead of direct light – you’ll wind up with softer, more diffuse shadows.
If possible, try a USB headset instead of the built-in mic on your device. Headsets block out distracting audio from outside the room and eliminate any possibility of feedback loop echoes from a local speaker and laptop/tablet/phone microphone, unfortunately a regular occurrence in conferences where participants are using their smartphone as a speakerphone. (Those echoes are REALLY annoying to everyone else!) Plus, other people in your house won’t have to listen to the conference.
And when you’re not going to be talking for a while, mute your microphone so that no one has to hear the garbage truck outside, or your dog barking at a mail carrier. If something comes up that will distract you from the meeting, turn off your camera as well – no need for others to see you scurrying to pick up a wayward child and move them to another room.
We didn’t think we needed to remind anyone to dress appropriately, but apparently some conference attendees have taken “business casual” to a new level. And yes, we’ve heard plenty of stories about folks who are dressed properly from the waist up, but are wearing pajamas, underwear, or nothing at all from the waist down. (If the camera can’t see it, it’s safe, right?)
Look – it’s a meeting, after all, so show a modicum of respect for the other participants. You don’t have to put on formal business wear, but at least avoid the logo T-shirts and tank tops, and pick out a nice collared shirt, blouse, or polo shirt. Surprise everyone, and throw on a blazer. Stick to colors with neutral gray values – no blinding white or deep black, which will again throw off the camera. And a little grooming goes a long way. You may not be in the office, but you are in an office of sorts. (What you wear on your feet, however, is up to you…)
When it comes to signal management and distribution, the AV industry primarily relies on HDMI cables, USB cables, category cables, and shielded/unshielded audio cables. In other words, a steady diet of copper gets us from Point A to Point B and beyond. If we’re able to keep the distances between Point A and Point B to a reasonable number, and our bandwidth requirements aren’t excessive, then all is well.
But what happens when our video and audio signals need to traverse a distance longer than a few hundred feet? After all, copper wire does have some degree of resistance, and by extension, attenuation. What’s more, as the frequency of the transported signal (clock rate) increases, the current crowds toward the outer surface of the conductor, raising the effective resistance – a phenomenon known as the “skin effect.”
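The skin effect can actually be quantified: skin depth (how deep into the conductor the current effectively flows) shrinks with the square root of frequency. Here’s a minimal Python sketch of the standard formula, assuming copper’s room-temperature resistivity:

```python
import math

def skin_depth_m(freq_hz, resistivity=1.68e-8, mu_r=1.0):
    """Skin depth = sqrt(resistivity / (pi * f * mu)) for a good conductor.
    Defaults assume copper at room temperature, non-magnetic (mu_r = 1)."""
    mu = mu_r * 4e-7 * math.pi  # permeability of free space times mu_r
    return math.sqrt(resistivity / (math.pi * freq_hz * mu))

# At 1 GHz, current in copper flows only in roughly the outer 2 micrometers
print(f"{skin_depth_m(1e9) * 1e6:.2f} micrometers")
```

Quadruple the frequency and the usable “skin” of the wire is cut in half – one reason copper runs get shorter as clock rates climb.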
Eventually, the frequency of the signals in use becomes so high that they leave the cable altogether and travel as photons through the air, which is why television stations broadcasting on UHF frequencies use tuned waveguides to couple energy from a transmitter to the antenna. The transmission resembles more of an elaborate plumbing job than anything else!
We could extend AV signals over long distances by converting electrons into photons, i.e. light energy. But we’ll need a suitable transmission medium to carry those pulses of light from Point A to Point B. And that’s where optical fiber cable comes in: It’s able to move those photons from a transmitter to a receiver over very long distances with minimal signal attenuation and degradation.
Let’s say you need to run a l-o-n-g HDMI extension to a remote display, mounted about one thousand feet from the source. HDMI, by itself, won’t get you much more than 25 – 50 feet. HDBaseT extensions are only good to about 300 feet. And a network interface isn’t available for this extension. What to do?
Simple. You’ll need an HDMI-to-optical fiber transmitter/receiver set. The transition-minimized differential signaling (TMDS) data from the HDMI source becomes pulses of light, ready to fly through space. If you connect a fiber optic cable using multimode transmission – meaning that the pulses of light reflect multiple times off the walls of the fiber’s core as they travel – then you can extend the original signal up to 1.8 miles.
If you elect to use single-mode optical fiber (the pulses of light travel in a relatively straight line through the core of the fiber), then you can extend your source signal all the way out to 20 miles for a reliable connection. The choice is up to you! Keep in mind that multimode optical fiber cable is cheaper than single-mode cable (not to mention Cat6 network cable) and is more than adequate for the above example.
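Those distance figures fall out of the optical link budget: transmit power, minus receiver sensitivity, minus a safety margin, divided by the fiber’s loss per kilometer. A hedged Python sketch – the power, sensitivity, and loss numbers below are typical ballpark values, not specs for any particular extender:

```python
def max_reach_km(tx_power_dbm, rx_sens_dbm, loss_db_per_km, margin_db=3.0):
    """Reach at which received power falls to the receiver's sensitivity,
    after reserving a safety margin for connectors and aging."""
    budget_db = tx_power_dbm - rx_sens_dbm - margin_db
    return budget_db / loss_db_per_km

# Single-mode at 1310 nm: very low loss buys tens of kilometers
print(max_reach_km(-5.0, -28.0, 0.4))  # 50.0 km, on the order of 30 miles
# Multimode at 850 nm: higher loss limits reach to a few kilometers
print(max_reach_km(-5.0, -28.0, 3.0))  # ~6.7 km
```

In practice, multimode runs are often limited by modal dispersion (those bouncing light paths arriving at slightly different times) before attenuation becomes the problem, which is why real-world multimode reach is shorter still.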
There are a ton of advantages to using fiber optic cable. For one thing, it’s completely isolated from interference, both man-made and natural. It’s also unaffected by ground loops (differential voltages) and magnetic fields. Given its low attenuation per foot, you can just buy a pre-assembled cable with connectors, run the cable, and loop up the excess – no need to trim and re-attach connectors. Or, you can make up your own cables – crimp-on connectors for optical fiber are quite easy to use these days.
Some integrators have already jumped on the fiber wagon. 10-gigabit network switches support both copper wire and optical fiber through small form-factor pluggable (SFP) connections, and it’s likely that faster switches will rely mostly on fiber connections – signal attenuation over copper wire at higher frequencies is substantial once the cable run exceeds ten feet.
Fiber optic cable doesn’t take up much room, either. Bundled cables with multiple fibers can be run and laid in overhead cable trays easily enough. (Just don’t put a tight bend in them!) By building out a facility with fiber interconnects to all rooms and spaces, you’ve ensured your facility is future-proofed. If another audio or video signaling format comes into vogue, or your bandwidth demands increase, you simply change out the optical interface – no need to pull new cables.
While the current version of HDMI our industry relies on (v2.0) uses the TMDS format, the next version (v2.1) and all versions of DisplayPort employ a packet-based digital transmission system. That’s an even better match for optical fiber transmission! What’s cool about fiber is that we can multiplex audio, video, control, and metadata all through the same cable, at the same time. We do this with a variety of tricks, including time-division multiplexing (spacing out different packets), code-division multiplexing (coding packets), and wavelength-division multiplexing. With the latter, different wavelengths of light carry different signals.
It should come as no surprise that Kramer supports optical fiber signal distribution. The 675T 4K60 4:4:4 HDMI over Ultra-Reach MM/SM Fiber Optic Transmitter and companion 675R Receiver are designed to extend HDMI v2.0 signals over very long cable runs. These useful gadgets couldn’t be easier to set up and operate – all you need to do is provide the fiber optic connection, using a type LC connector at both ends. For shorter runs, multimode cable does the trick, while single-mode fiber will handle the long-haul stuff.
The 675T and 675R use near-zero-latency video chroma subsampling conversion technology to auto-adapt HDMI signals with data rates above 10 Gb/s to a 10G optical link data rate. Both units are HDCP 2.2 compliant and support data rates up to 18G (6G per channel), along with LPCM 7.1, Dolby TrueHD, and DTS-HD audio formats, as specified in HDMI 2.0. Additionally, Kramer’s I-EDIDPro™ Intelligent EDID Processing™ ensures plug-and-play operation for HDMI source and display systems.
If you’re moving to or have already adopted SDVoE network-based AV signal distribution, there’s a Kramer optical interface product for that, too. KDS-8F is a high-performance, zero latency, 4K@60Hz (4:4:4) transceiver for streaming video and audio via Ethernet over single-mode and multimode optical fiber. And it is ambidextrous: KDS-8F can encode and stream its HDMI or DisplayPort input multiplexed with IR and RS-232 control signals, plus analog audio and USB; all over an IP network. Or, it can receive an SDVoE-encoded signal and decode it for HDMI output, along with control, audio, and USB.
For ruggedized operations, Kramer also offers the CRS-PlugNView-H cable. It’s a high-speed, armored HDMI active optical cable (AOC) designed for the heavy-duty use and abuse expected in rental and road applications. These cables support resolutions up to 4K@60 (4:4:4) at 18 Gbps over long distances, without an external power supply or additional extenders. You can get ‘em in a variety of lengths from 33 to 328 feet.
Remember – fiber is good for you!
We focus on video- and display-related products a great deal of the time in these monthly ramblings, and for good reason. Video and display signal management represents the “heavy lifting” of the AV industry, involving a myriad of signal formats with lots of pixels, high bit rates, and different color resolutions.
Interfacing and switching analog video and display signals was quite the headache years ago, and one could argue that the transition to digital came just in time. (Imagine interfacing a 4K/60 connection with discrete analog wiring! On second thought, no, don’t imagine it, you’ll just get a massive headache.)
While video usually grabs our attention, audio often seems to just come along for the ride, like your annoying younger brother when you went to the park to play with your friends. You knew he was there, but you largely ignored him and hoped he wouldn’t wander off and get you in trouble with Mom.
Fact is, we have almost as wide a variety of audio signals these days as we do video signals. And audio can come from any number of sources with a wide range of quality levels, from professional grade to “what the heck is all that background noise?” Much of it originates from consumer gadgets designed for user convenience, not to win any Hollywood awards for ‘best sound mixing.’
Every product that can capture video also has some form of audio recording built-in. That includes smartphones, tablets, laptop computers, camcorders, digital SLR cameras, point-and-shoot cameras, and even those ‘smart’ speakers that are all the rage nowadays. Because the manufacturers of these gadgets don’t know how or where you plan to record audio, most of these products have some sort of automatic gain control (AGC) turned on to make sure it does get recorded.
That works fine in a quiet space, but not so great outside (wind and ambient noise) or in a crowd (background vocal sounds and ambient noise). And the audio output levels vary from one gadget to another, as do the frequency response and microphone characteristics. If you were to string together a bunch of YouTube video clips shot with a wide variety of cameras and phones, you’d clearly hear these differences.
That’s why having some sort of audio digital signal processing (DSP) is really handy these days. DSP can fix a multitude of problems, including audio levels and equalization, and it can be operated using nothing more than a graphical user interface (GUI) via a network connection. DSP also comes in handy when connecting and mixing good old-fashioned analog microphones, particularly in a meeting or conference space. Think of DSP as replacing an analog sound engineer and mixing board, which would take up too much room in a meeting anyway and be distracting.
We can easily implement DSP in a meeting room and also accommodate a wide range of analog and digital input signals. Better yet, we can also offer DSP to huddle spaces, where audio connections and playback are about as ad hoc as it gets. We might be able to hear each other in the space, but anyone participating remotely won’t hear a thing unless microphones are used and volume levels are set to workable ranges. And it would be nice to level all audio sources so that we don’t transition from “gentle breeze” to “airplane taking off” between clips.
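That “leveling” job is one of the simplest things a DSP does, and it’s easy to illustrate. Here’s a toy Python sketch of RMS normalization – purely illustrative, and not how any particular product implements it:

```python
import math

def rms_dbfs(samples):
    """RMS level of float samples (-1.0..1.0) in dB relative to full scale."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def level_to(samples, target_dbfs=-20.0):
    """Apply one static gain so the clip's RMS lands on the target level."""
    gain = 10 ** ((target_dbfs - rms_dbfs(samples)) / 20)
    return [s * gain for s in samples]

quiet_clip = [0.02, -0.02] * 512        # roughly -34 dBFS
leveled = level_to(quiet_clip, -20.0)   # now roughly -20 dBFS
```

A real meeting-room DSP does this continuously and adaptively (along with equalization, echo cancellation, and mixing), but the principle is the same: measure the level, compute a gain, apply it.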
Remember – audio can come from just about any connection these days. In addition to analog inputs, we can play digital audio through USB ports and we can also embed it in an HDMI connection. Of course, audio will also be served up over Internet connections, if we’re streaming from a Web site. And all of it will probably need some sort of signal processing so it sounds clear and crisp. We’ll have to support all of these connections to cover the bases.
One advantage of using a digital audio system is that we can also consolidate several other discrete pieces of audio hardware into a single chassis. In addition to signal switching and processing, we can also throw in an amplifier and even some room control functions, again running everything from a network interface to make operation ‘plug and play’ as much as possible. What once required a full rack of mixers, equalizers, amplifiers, and switching gear is now consolidated into a do-everything, single rack unit product.
Kramer knows a little bit about audio, as we’ve been supporting the category for just about four decades. And two of the latest additions to the Kramer audio line reflect this latest thinking in audio hardware and software. AFM-20DSP-AEC is a multi-function audio matrix switcher that comes with 20 bi-directional analog audio ports. Instead of having to work with a pre-determined matrix configuration, you get to decide how many inputs and outputs you need. (And you can easily change your mind later on, because you will. And you know it.)
For digital audio, it includes an HDMI input and output with embedding and de-embedding, plus coaxial S/PDIF input and output jacks. There’s also a 4x4 Dante connection for networked, low latency audio. But Kramer didn’t stop there. AFM-20DSP-AEC also includes a stereo amplifier (2x60-watt @ 8 ohms or 1x120-watt @ 70V/100V). On top of all that, there’s a 32-bit digital-analog converter (DAC) with selectable sampling rates up to 96 kHz, and simultaneous digital signal processing of all inputs and outputs.
The other new product may be a first for the category. Kramer’s DSP-62-AEC was engineered for one of the trickier audio environments – huddle spaces, which are notable for their general lack of hard-wired AV gear. This compact wonder manages to support bi-directional audio through two HDMI inputs and one output, a USB port, a stereo analog audio jack, and up to four analog microphone connections. DSP-62-AEC can route and mix any audio source and send it wherever you want.
If you’re ready to retire racks of discrete audio hardware in favor of a simpler, all-in-one solution, or are scratching your head wondering how you can provide a better audio experience for huddle spaces (which some folks might describe as trying to herd cats), Kramer has you covered. Both AFM-20DSP-AEC and DSP-62-AEC should improve audio quality considerably during meetings. (Sorry, we can’t do anything to fix the quality of presentation content…)
It’s been a couple of years since the first wave of 4K AV products washed ashore, and yet, there is still some confusion about what the term “4K” actually means. It doesn’t help that there has also been (and continues to be) a lot of misinformation offered about this imaging format, ever since the first commercial and consumer displays with 3840x2160 pixel resolution were unveiled in 2012. So let’s clear things up.
To start with, “4K” is kind of a vague catch-all term. A display with true 4K resolution will have 4096 horizontal and 2160 vertical pixels, which is a cinema format. The version of “4K” we’re more familiar with is defined by the Consumer Technology Association as “Ultra HDTV.” Displays and display signals (and many UHD camera sensors) classified as Ultra HD have 3840 horizontal and 2160 vertical pixels. Not quite true 4K, but close enough for our purposes.
Now, here’s where things get tricky. An Ultra HD display signal has four times as many pixels in a single frame as a Full HD video signal: 9.9 million versus 2.48 million, counting blanking intervals. That’s quite a boost in payload, and it creates a speed limit challenge when interfacing, particularly as we increase the frame rate and color bit depth.
There is truth in numbers! A Full HD video frame has a total of 2200x1125 pixels, including blanking. Multiply that by a frame rate of 60Hz, using 10-bit RGB color (or 4:4:4, using a broadcast notation), add in the customary 20% ANSI bit overhead, and you have a payload of 5.346 gigabits per second. (Let’s call it 5.4 Gb/s, to simplify matters).
How did we arrive at that number? Well, 2200 pixels × 1125 pixels × 60 = 148.5 MHz, which is a common pixel clock frequency for Full HD. Next, we multiply 148.5 by 3, because we’re using RGB color. And we then multiply that product by 12 (10-bit color + 2 bits as overhead) to arrive at our final number: 5.4 Gb/s of data. That can easily fit through an HDMI 1.4 connection, which has a maximum rate of 10.2 Gb/s.
Okay, time to exit our Honda Civic and get into a high-performance BMW M3. Our Ultra HD signal has a total of 4400 horizontal and 2250 vertical pixels with blanking. Refreshing that signal 60 times per second gives us a pixel clock of 594 MHz, and using 10-bit RGB color, we now have a sustained data rate of 21.384 Gb/s. Wow! (Not surprisingly, that’s four times as fast as our Full HD signal calculation.)
That’s way too fast for HDMI 1.4. In fact, it’s even too fast for HDMI version 2.0, which can’t transport data any faster than 18 Gb/s. (That’s why a newer and faster version of HDMI – v2.1 – is just now coming to market.) Hmmm…do we really need 10-bit RGB color for everyday applications? Probably not, so let’s dial the bit depth back to 8 bits per color, which should suffice for high-resolution graphics and images.
Jiggering the math that way drops our bit rate down to 17.82 Gb/s, which gets us within the speed limit of HDMI 2.0. We can also crawl under the limbo bar by reducing color resolution to 4:2:2 or 4:2:0. A 10-bit Ultra HD signal with 4:2:2 color has a data rate of 14.26 Gb/s, while a 10-bit 4:2:0 version drops that number to 10.7 Gb/s.
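All of the figures in the last few paragraphs fall out of one small formula – pixel clock × samples per pixel × (bit depth + 2 bits of overhead). A quick Python sketch that reproduces the numbers above:

```python
def wire_rate_gbps(h_total, v_total, fps, bits_per_sample, samples_per_pixel=3):
    """On-the-wire data rate; h_total and v_total include blanking.
    Each sample carries 2 extra bits of line-coding overhead."""
    pixel_clock_hz = h_total * v_total * fps
    return pixel_clock_hz * samples_per_pixel * (bits_per_sample + 2) / 1e9

print(wire_rate_gbps(2200, 1125, 60, 10))       # Full HD, 10-bit RGB: 5.346
print(wire_rate_gbps(4400, 2250, 60, 10))       # UHD, 10-bit RGB: 21.384
print(wire_rate_gbps(4400, 2250, 60, 8))        # UHD, 8-bit RGB: 17.82
print(wire_rate_gbps(4400, 2250, 60, 10, 2))    # UHD, 10-bit 4:2:2: 14.256
print(wire_rate_gbps(4400, 2250, 60, 10, 1.5))  # UHD, 10-bit 4:2:0: 10.692
```

Note how 4:2:2 and 4:2:0 simply reduce the average samples per pixel (to 2 and 1.5, respectively) – that’s all chroma subsampling does to the arithmetic.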
And we can trim the data rate even further by cutting the frame rate in half to 30 Hz. Initially, that’s what many signal management companies did to accommodate Ultra HD signals while retaining the HDMI 1.4 interface. But as our display screens get larger (and they are getting a LOT larger), lower frame rates with wider fields of view can produce noticeable flicker. This phenomenon was first observed by Japanese broadcaster NHK as they began rolling out 8K TV broadcasts…but that’s a story for another time.
So, we want to stick with at least a 60 Hz frame rate for our 85-inch Ultra HD LCD monitor or our 120-inch Ultra HD LED wall. We’ll definitely need signal management products equipped with HDMI 2.0, at the minimum. By doing so, we can accommodate more powerful graphics workstations and laptops, where we can set the bit depth to fit our system bandwidth. We can also stream Ultra HD video from physical media and streaming platforms, where the most common color resolution is 4:2:0. Again, an easy fit for our system.
“Hold on there,” you’re probably thinking. “Where is all this demand for Ultra HD coming from?” Time to wake up and smell the coffee, folks: Monitoring and surveillance systems, such as those used by traffic agencies and process control operations, always need more pixels on the screen. Gamers are always looking for more pixels at faster refresh rates with low latency. So are companies engaged in energy exploration, visualization and virtual reality, 3D modeling, and medical imaging. You know the old saying: You can NEVER have enough pixels.
And as we just read, the AV industry is rapidly switching over to Ultra HD displays as Asian display panel “fabs” phase out zero-profit Full HD panels and ramp up Ultra HD panel production. That means single-monitor and TV sizes as large as 98 inches using LCD and OLED technology, and tiled LCD/OLED display walls – plus LED walls – that have 4K, 8K, and even higher pixel counts. If you are looking to build a new signal distribution system, it is a wise bet on the future to support the higher bandwidths required for Ultra HD, all the way through every connection.
Kramer’s VP-551X 8x2 4K presentation switcher/scaler is well-suited to this purpose, equipped with eight discrete HDMI 2.0 inputs with embedded and discrete audio and native support for both 4:4:4 and 4:2:0 color resolutions. In addition to a single HDMI 2.0 output, it also provides an HDBaseT output for 4K/30 RGB or 4K/60 4:2:0.
More importantly, all HDMI ports are compatible with HDR10, a standard for static metadata required to display high dynamic range video. HDR is becoming an intrinsic part of Ultra HD production and display, allowing the reproduction of a much wider range of luminance values from black to full and specular white. VP-551X also passes the Dolby TrueHD and DTS-HD Master Audio formats, common with physical media and streaming playback.
There are other handy gadgets for your Ultra HD signal management system. Kramer’s new 675T and 675R are plug-and-play fiber optic signal extenders that accept type LC optical plugs and will work with either multimode or single-mode fiber, providing signal extensions up to 20 miles with single-mode operation. The beauty of optical fiber is that it has virtually no speed limit issues, as opposed to copper wire-based signal extenders that run several hundred feet at most.
You also have the option of routing your Ultra HD signals over a 10-gigabit Ethernet connection by using the Kramer KDS-8F SDVoE video streaming transceiver. As an encoder, it encodes and streams HDMI or DisplayPort signals along with infrared control, RS-232 control, analog audio, and bi-directional USB 2.0 over an IP network, using small form-factor pluggable (SFP) optical fiber. KDS-8F can also work as a receiver to decode all of the signal formats just mentioned to an HDMI port with discrete audio, IR, and RS-232 connections.
And that is “what’s up” with 4K these days. The demand for more pixels, refreshed at faster rates with greater color bit depth, isn’t slowing down one bit. (Did you know that 8K cameras are now being used to inspect sewer pipes? It’s true! But that’s a story for another time…)
You might have noticed one AV product category that’s gotten a ton of attention in recent years (if not most of the attention): Presentation sharing/collaboration. Just about everyone and their uncle offers some sort of hardware and/or software that allows anyone to share what’s on their screen with others, whether they’re running Android™, iOS™, or Windows™.
Some of these products are complex and loaded with all kinds of add-on tools. Others are bare-bones designs that employ little more than screen-scraping techniques. This makes choosing a system unnecessarily difficult for customers, who more often than not aren’t even sure of what they expect out of a presentation sharing/collaboration product.
You could make an argument that this category invented itself. In previous times, people dragged laptops into a meeting room, uncoiled and plugged in AC power and display connections, and used tabletop, under table, and remotely-controlled presentation switchers to cycle among the various presentations. If you wanted a copy of whatever was being shown, it was delivered as a photocopied handout or a PDF file after the fact. What a pain in the neck!
But enough engineers were able to see into the not-so-distant future of mobile, personal electronics, i.e., smartphones and tablets. These devices have become so powerful that they have replaced laptops for many functions. And they make extensive use of two things – fast wireless connections and cloud-based content/file storage and delivery. So, why not use them as a new form of presentation platform, particularly with their ability to easily capture high-quality video (although often in the wrong orientation)?
Soon enough, presentation sharing hardware started rolling off the assembly line. Prospective buyers were overwhelmed with these gadgets, some of which were loaded to the gills with advanced functions that would never be used. Other models seemed to be so simplistic in function that they were little more than cheap consumer products. Company and campus IT departments weighed in with concerns about connectivity to their networks and any security threats these new-fangled gadgets might present.
Eventually, customers had to decide just how much functionality they wanted in such products. Younger users, who consider mobile phones as essential to their lives as their internal organs, were primarily sharing photos and videos. In contrast, older users (those still depending on laptops) were more accustomed to loading up things like spreadsheets for meetings. Some attendees wanted paper copies of the presentation, while others simply took photos of relevant material on the screen – using their phones, of course (which is ironic, in a way).
Calls from the IT department got louder over time. As presentation sharing/collaboration products started popping up on networks, strong passwords and two-factor logins became necessary. Multiple installations meant multiple trips through buildings and across campuses to update firmware. Presenters complained about herky-jerky video and issues sharing iOS screens. It was definitely “gaffer tape and paper clips” time!
Today, the presentation sharing/collaboration marketplace has matured enough that manufacturers have a pretty good idea of how people actually use these gadgets. Over time, anecdotal evidence revealed that a majority of customers just wanted a simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solution, particularly in the education vertical. And as it turned out, there was no need to re-invent the wheel: The best approach was to connect and present without needing to install anything on a computer or mobile device – just leverage the native OS and Web browser protocols that are built into each user’s device.
That’s not to say there wasn’t a need for additional features, such as viewing the main screen on your own device (great for lecture halls with long viewing distances), editing documents together in real time, sharing any size file with anyone else in the meeting or class, instant polling, and turning the main display into a digital whiteboard with recordable annotation. Certain groups needed and continue to need all of those bells and whistles.
But for others, the ability to connect their mobile device quickly and easily to a shared screen using a standard wireless connection was the big draw, using AirPlay mirroring for MacBook™, iPad™, and iPhone™ as well as native mirroring for Chromebook™, Android (Lollipop OS 5.0 or newer), and Windows phones. So was figuring out a way to stream video at native frame rates without the end result turning into visually annoying, low-frame-rate flip movies.
The hardware-intensive approach to early wireless presentation systems has now morphed into one that focuses more on software, and rightly so. Indeed, it’s now possible to build your own wireless presentation system simply by installing a software package and using AirPlay for MacOS & iOS, Miracast™ for Windows & Android, and connecting directly through Chrome or Firefox Web browsers.
A bridge from the past to the present was also created for legacy meeting spaces by adding wired HDMI™ inputs to wireless presentation platforms. The hardware and software mixes both wired and wireless connections together in a seamless way, extending the useful life of existing presentation switchers by making them another gateway to the wireless system. (It’s always good to have options!)
Those overworked folks responsible for maintaining IT networks were placated with a versatile software package that allows remote monitoring and configuration of multiple presentation sharing devices on the network – no need for physical visits to each room or space. And by incorporating 1024-bit encryption over each wireless link (and, if necessary, building “DMZs” with firewalls), security was a non-issue.
What we’ve just described is a general outline of Kramer’s VIA wireless presentation product line. For basic plug-and-play connectivity, VIA GO provides 60 Hz video streaming, 1024-bit encryption, and built-in WiFi. VIA Connect PRO can show up to four screens simultaneously and any in-room meeting participants can view the main display, edit documents together in real time, share any size file, and turn the main display into a digital whiteboard. (VIA Connect PLUS adds a wired HDMI input.)
For more advanced users, Kramer’s VIA Campus2 adds e-polling to instantly measure student feedback and can also be used as a secure wireless access point for guests. Six user screens can be shown on one main display and up to 12 screens by using two displays. Remote students can easily join the class and collaborate in real time with embedded 3rd-party video conferencing and office apps including Microsoft Office®, Skype®, GotoMeeting®, Lync®, and WebEx®. (VIA Campus2 PLUS adds a wired HDMI input.)
In its simplest form, VIA can be loaded and run as a software program (VIAware). It delivers the same security offered by all VIA devices and can be installed on any Windows 10 computer. It can show up to six user screens on one main display or up to 12 screens on two displays, and remote students can easily join and collaborate in real time with embedded 3rd-party video conferencing and office apps.
Finally, VIA Site Management (VSM) is a software application that enables IT administrators to manage, monitor and control all connected VIA devices. VSM generates alerts on system health and includes reporting and analytics tools for understanding VIA device usage.
It’s taken a few years to get there, but we can finally answer the question posed at the start of this missive: What do presenters really want? Simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solutions, as it turns out. Who knew?
The AV industry has a few “benchmark” years, starting with the introduction of light valve video projection in the 1980s and continuing through the first solid-state video/data projectors in 1993, the first flatscreen displays in the mid-1990s, optical disc media in the late 1990s, high definition TV in the early 2000s, a migration from plasma to LCD later that decade, and widespread adoption of high-speed wireless for connectivity over the past five years.
Right now, we’re laser-focused on moving away from full-bandwidth video signal distribution to compressed Full HD and 4K video switched and routed over IT networks. That in itself is a sea change for integrators, and will more closely align our industry with the world of information technology, likely causing the loss of more than a few jobs along the way, as has happened recently in the broadcast industry.
We’ve also bought into the idea of ditching short-arc projection lamps in favor of a more durable, eco-friendly solution that pairs laser diodes with color phosphor wheels. And we seem to like the concept of wireless signal connectivity for presentations and collaboration, slowly moving away from wired connection hubs on walls and tabletops.
But the biggest change of all is just starting to emerge from behind the curtain, and that is the increasing dominance of the light-emitting diode (LED) in display technology. And when we say “dominance,” we really mean it – LEDs have the potential to become the first unified display platform since the cathode-ray tube was developed a century ago.
It’s not like we didn’t see it coming. LED videowalls with coarse pixel pitch have been around for more than two decades, but they were limited to installations in large stadiums and arenas, and as outdoor signs in places like Times Square and the Las Vegas Strip. That all changed around the start of the present decade, when individual LEDs became practical to manufacture in ever-smaller sizes.
Those videowalls and scoreboards from 1999 had, on average, a pixel pitch of about 10 millimeters. A contemporary version for indoor installations is likely to have a pixel pitch of 2 millimeters, presenting images with much finer detail when viewed at close range. Given that most of the LED device and tile manufacturing takes place in China, it wasn’t long before tile and wall prices began falling…and customers as diverse as staging companies and retail chains took notice, and bought in.
You probably did, too, at ISE and InfoComm about five years ago. Where did all of these LED wall companies come from, all of a sudden? How come I never heard of any of them? Wow, those things are bright! Bet they’re expensive…
To be clear, the position of “display king of the hill” rotates every few years. CRT displays sat on the throne for generations. Then plasma displays took over, only to be deposed by LCD screens. The latter have clung to power for over a decade, but they can see the writing on the wall. The only question is how quickly LCD technology will cede its top-of-the-market position to LEDs.
Based on anecdotal evidence, what we’ve seen at recent trade shows, and forecasts from display analysts, the coronation is going to happen pretty soon. Just as large, economical Full HD LCD monitors and TVs escorted “hang and bang” projectors out of classrooms and meeting spaces, large LED walls are putting a serious dent into the sales of high-brightness projectors, particularly for staging live events. And they’re setting their sights on large LCD monitors and TVs next.
It’s easy to see why. LED walls are built up out of smaller tiles and cubes, just like Lego toys. They literally snap together into lightweight frames, using molded multi-wire plugs to daisy-chain tiles and cubes together, and are easy to fly or stack. LEDs don’t make any noise, aside from small cooling fans when operating in enclosures, and provide a bright, colorful, and high-contrast one-piece display system that can show 4K and even 8K-resolution video.
Still, many end-users and integrators have long regarded LED walls as expensive niche displays, not anything practical enough to install in classrooms, lecture halls, and meeting spaces. Well, that thinking got blown out of the water last June at InfoComm, where we saw the first build-it-yourself LED displays for meeting rooms.
These products have sizes ranging from 120 to 150 diagonal inches (that’s 10 and 12.5 feet, respectively) and offer dot pitches from 1.8 to 2.5 millimeters. They come in modular kits that take two people about three hours to assemble and wire together, and can be hung on a wall or even attached to a roll-around stand – all you need to do is plug in a power cord and connect your video source through an HDMI port, and away you go.
In terms of brightness, LED displays actually have to throttle back on their luminance. Using a pulse-switched operating mode, they could easily hit 2,000 nits, but that would be glaringly uncomfortable in a small room. The actual luminance level is closer to 400 – 600 nits, adjustable to compensate for high ambient light levels.
From a technology perspective, these products are LCD killers. But from a financial perspective, they’re not quite there yet: A 130-inch model has a retail price of about $75,000, which would more than cover the cost of four 65-inch LCD monitors and signal processing gear. Then again, the first 50-inch plasma monitors for commercial use retailed for about $25,000 apiece twenty years ago, so we can expect prices to come down pretty quickly on these products as demand grows.
With large LED walls established as the go-to display for digital signage and image magnification, and fine-pitch walls establishing a beachhead in meeting rooms, the next step is consumer televisions and mobile displays. Late-model Ultra HDTVs with high dynamic range support already use large matrices of mini LEDs as backlights.
It will fall to a new class of LEDs – “micro” devices – to fill in the missing slots for consumer displays…but that’s a story for another time…
The AV industry has arrived at a singularly intriguing point in time. Like the rest of the communications world, we’re getting ready to jump aboard the IT bandwagon and use TCP/IP networks and switches to route and distribute audio and video…leaving the world of uncompressed display signal management behind.
Coincidentally, this paradigm shift is coinciding with another move; this time, from Full HD video to Ultra HD video. But that’s not all: We’ll also have to reckon with high dynamic range content and its associated wider color space. And off in the distance, you can see yet another freight train approaching, with its box cars reading “High Frame Rate Video.”
On the one hand, we’re learning all about codecs, latency, forward error correction, groups of pictures, jumbo frames, and a host of acronyms like DHCP, IGMP, and HLS. On the other, we’re frantically calculating our bandwidth requirements and wondering if we’ll have a fast-enough network to handle all of these high-speed, souped-up pixels.
And that brings up a really good question. Just how fast is “fast enough?” If we’re building an AV-over-IT network, what top speed should we aim for? And how, exactly, can we build some degree of futureproofing into our design so we don’t have to come back a couple of years from now and install new switches and perhaps even new cables?
Let’s start with the basics. Is the network going to be used to switch near-zero or very low latency video? (Audio’s easy; no need to worry there.) If your answer is yes, then you’re talking about data rates in the gigabits per second – more likely tens of gigabits per second. On the other hand, if the AV-IT network is going to be used primarily to stream content in non-real time and some latency isn’t a problem, then we’re talking about tens and hundreds of megabits per second.
Still, you need to design for the highest possible speed requirements, and if at any point you want to manage low-latency video using JPEG-based codecs (or other light compression codecs), you’re back in the tens of gigabits per second neighborhood. Currently in our industry, we have manufacturers advocating for 1 Gb/s switch fabrics and others saying, “No, you need at least a 10 Gb switch.”
From our perspective, designing a 1 Gb/s AV-IT network is essentially tagging it with a “planned obsolescence” sign. A basic 4K (Ultra HD) video signal that’s refreshed 60 times per second and has 8-bit RGB color will generate an uncompressed data rate of 17.82 Gb/s, WAY too fast for a 1 Gb/s switch without significant compression. Yet, that’s a rudimentary form of 4K video.
With HDR enhancements, we’ll need to move to 10-bit sampling. We can cut the color resolution from RGB (4:4:4) to the broadcast standard 4:2:2, at which point the new bit rate is 14.26 Gb/s – still too fast for a 10 Gb/s switch. We could cut the frame rate in half to 30 Hz, slowing the bit rate to 7.13 Gb/s and clearing the switch without additional help. But maybe the application really needs that higher refresh rate?
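Those back-of-the-envelope numbers are easy to reproduce. Here’s a minimal sketch (ours, not an official HDMI formula): the figures above imply a total 4K raster of 4400 x 2250 pixels including blanking, with each 8-bit sample carried as 10 wire bits and each 10-bit sample as roughly 12 – those per-sample widths are assumptions inferred from the arithmetic, not spec values.

```python
# Back-of-envelope video link-rate calculator (a sketch, not an HDMI spec model).
# Assumed: 4K total raster of 4400 x 2250 (active + blanking); encoded sample
# widths of 10 wire bits per 8-bit sample and 12 wire bits per 10-bit sample.

def link_rate_gbps(h_total, v_total, fps, samples_per_pixel, wire_bits_per_sample):
    """Raw link data rate in Gb/s for a given video format."""
    return h_total * v_total * fps * samples_per_pixel * wire_bits_per_sample / 1e9

# 4K/60, 8-bit RGB: 3 samples per pixel, 10 wire bits each
print(round(link_rate_gbps(4400, 2250, 60, 3, 10), 2))  # 17.82 -- swamps a 1 Gb/s switch
# 4K/60, 10-bit 4:2:2: 2 samples per pixel, 12 wire bits each
print(round(link_rate_gbps(4400, 2250, 60, 2, 12), 2))  # 14.26 -- still over 10 Gb/s
# Same signal at 30 Hz clears a 10 Gb/s switch unaided
print(round(link_rate_gbps(4400, 2250, 30, 2, 12), 2))  # 7.13
```

Swap in other rasters, frame rates, and bit depths to see how quickly the totals outrun common switch fabrics.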
This problem is ameliorated to some extent with the SDVoE Blue River NT codec. It applies light (2:1) compression with super-low latency to let us limbo with ease under the 10 Gb/s bar. All fine and good, but what if we want to run a 10-bit RGB 4K signal with HDR and a 60 Hz frame rate from point A to points B, C, D, E, and F? The raw data rate is now 21.4 Gb/s, and even 2:1 compression won’t get us through the switch.
One possible solution is to compress the broadcast (4:2:2) 4K video format using a visually lossless codec like JPEG XS (TiCo) to pack the signal down by as much as 6:1, then come out of the decoder with full-bandwidth display connections. Japanese TV broadcaster NHK showed a demonstration last year at NAB of an 8K signal (7680x4320 pixels) with 10-bit 4:2:2 color and 60 Hz refresh, successfully transiting a 10 Gb/s network switch thanks to 5:1 JPEG XS compression.
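As a sanity check on that NHK demonstration, the back-of-envelope arithmetic works out. This sketch counts active pixels only and ignores blanking, so the real link load runs somewhat higher:

```python
# Does 5:1 JPEG XS bring an 8K/60 broadcast signal under a 10 Gb/s switch?
# Sketch counts active pixels only; blanking overhead is ignored here.
bits_per_pixel = 10 * 2                 # 10-bit 4:2:2 -> 2 samples per pixel
raw_gbps = 7680 * 4320 * 60 * bits_per_pixel / 1e9
compressed_gbps = raw_gbps / 5          # 5:1 JPEG XS compression
print(round(raw_gbps, 1))               # 39.8 Gb/s uncompressed
print(round(compressed_gbps, 2))        # 7.96 Gb/s -- limbos under the 10 Gb/s bar
```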
So many numbers to think about! But that’s our new world as we continue to push up pixel counts and bit depths. And we haven’t even talked about gamers and virtual reality enthusiasts, who would prefer to see frame rates well above 60 Hz to minimize motion blur. Assume 96, 100, and even 120 Hz for these specialized applications and run the calculations again. (You’ll probably want to jump out the window when you’re finished.)
It’s all too easy to increase bit depth and frame rates for 4K and 8K signals and wind up with data rates that would warrant speeding tickets. But our job as integrators is to provide some sort of future-proofing in any installation…which is why our industry might want to start looking at 40 Gb network switches.
Yes, they exist, although mostly as selected ports on hybrid 10 Gb / 40 Gb switches. And it goes without saying that those 40 Gb ports use quad small form-factor pluggable (QSFP) optical fiber connections. These switches aren’t insanely expensive: We found several models through an Internet search priced between $3,000 and $8,500, and although not “pure” 40 Gb switches, they will do the job.
So – is a 10 Gb network switch “fast enough?” Or maybe we need a 40 Gb switch? Believe it or not, 100 Gb switches are in development. Is THAT fast enough for you? Better check your math…
Unless you were wearing a blindfold while walking around CES 2019, you could not miss the numerous displays of 8K televisions. 8K banners hung everywhere, along with signs for the other “hot” technology at the show – artificial intelligence (AI).
We counted well over a dozen examples of 8K TVs in our perambulations, most of them showing a series of still images with high resolution and high dynamic range. A few demos actually showcased 8K footage, presenting incredible detail in closeups of a honeycomb with attendant bees, or ants crawling over a vegetable garden. (That last clip was so realistic that it freaked out more than a few entomophobes!)
Other more practical demonstrations featured lower-resolution content scaled up to fit the 8K screen. Note that 8K TVs, when they finally arrive in any quantity, will likely start with a screen size of 65 inches and move up from there, hitting a maximum (so far) of 98 diagonal inches. Like it or not, you’ll be sitting pretty close to such a screen, so having all of those pixels plus enhancements like high dynamic range will make for a more pleasing viewing experience.
And you can largely attribute this move up in resolution to the Chinese, more specifically companies like TCL that are building Generation 11 LCD panel fabrication lines. These lines will crank out larger panels that can then be cut into smaller (although still large) sizes and at lower per-panel costs. Given how the competition between Korean and Chinese panel makers has largely decimated any profitability in the Ultra HDTV space (a/k/a 4K), the move up to 8K is almost a necessity.
It shouldn’t be a surprise that the average person is now asking, “Wait a minute! How is it we’re already talking about 8K video and displays? Didn’t we just start moving to 4K resolution? What’s the rush?”
Turns out, the move to 8K has actually been in the works for a long time. More specifically, it started almost 25 years ago in Japan, when broadcaster NHK began researching the next step up from HDTV (which was just getting off the ground outside of Japan!). Their goal was to design and build cameras capable of capturing 8K video at high frame rates, plus the attendant infrastructure to edit, store, and play it back, along with getting it to the home.
NHK’s research and development led to the demonstration of a 4K (4096x2160) camera at the 2004 NAB show. They introduced their first 8K camera sensor at NAB 2006, followed by an improved version in 2012. Sharp also showed an 85-inch 8K LCD monitor at CES that year, but people didn’t pay as much attention to that demo as they did the arrival of the first 4K / Ultra HDTV monitors in September of that year at the IFA show.
Back then, depending on the brand name, that 84-inch Ultra HD video monitor – which required four HDMI 1.4 inputs to work – could have set you back as much as $25,000 USD. Coincidentally, this was about seven years after we saw the start of a move away from 720p/768p displays and TVs to Full HD (1920x1080) screen resolution.
A year after those ground-breaking 4K TVs showed up, NHK unveiled a 4-pound 8K Steadicam rig, plus a multi-format video recorder prototype. A 13-inch 8K video monitor for cameras, using OLED technology, also took a bow. And by 2014, NHK was broadcasting selected Olympic events in 8K via satellite to locations around the globe.
In our industry, we were still pushing Full HD and 2K displays and signal management products, looking over our shoulders at a 4K dot in the distance and figuring we had plenty of time. That all changed at ISE in 2018, where Ultra HD and 4K displays were everywhere, not to mention an $80,000 8K broadcast camera from Sharp. Full HD digital signage still made plenty of sense, but the economics of LCD panel manufacturing meant that the fabs in Asia would be pulling back on Full HD and ramping up Ultra HD production.
So here we are in 2019, just embracing the move to Ultra HD. Yet, pundits are already saying, “It’s time to give 8K a look.” At least one Tier 1 display brand has already showcased 8K digital signage at ISE and NAB, and will do so again at InfoComm, likely prompting competitors to show they’re at least players in this new game in Orlando. NAB featured half a dozen 8K video cameras along with recording and storage solutions, and what’s likely the first-ever 8K digital SLR camera to hit the consumer market.
Is this irrational exuberance? Hardly. Clever readers will note that this summer will mark seven years since the first 4K TVs took a bow – and those arrived about seven years after the transition to Full HD started (which itself took place about seven years after the industry began moving away from standard-definition displays to 720p/768p HD displays).
Industry forecasts are for about 430,000 8K TVs to ship by the end of December, with over 2 million shipments called for in 2020. Those numbers closely track the roll-out of 4K / Ultra HDTV models from 2012 through 2014. Given that our industry really didn’t embrace 4K until 2017, we figure that you have just a couple of years to get with the 8K program.
And keep in mind that we’re fast approaching a point in time when the pixel density in a display just won’t matter anymore. Because of economics, all large TVs and monitors over 65 inches will have 8K resolution, whether you need it or not. Fortunately, video scalers have gotten quite powerful and can “pull up” your lower-resolution content to fit the screen. And other metrics like HDR, color accuracy, and high frame rate support will be the important ones, not the number of pixels.
Are you ready for 8K?
Ever since the HDMI 2.1 standard was announced at CES in 2017, we’ve all been waiting with bated breath for the chipsets to arrive. v2.1 offered such a speed increase over v2.0 (2013) that it sounded almost like science fiction, leaping from 18 gigabits per second (Gb/s) to an amazing 48 Gb/s, just like that!
And the signaling method would change, too, falling into line with the rest of the world by adopting a digital packet structure, much the same as DisplayPort (which, incidentally, is what much of v2.1 was modeled after). Instead of three lanes for red, green, and blue, plus a separate lane for clock information, v2.1 now employs four separate data lanes, each capable of speeds as fast as 12 Gb/s. With a packetized structure, intermixing and embedding clock packets is a piece of cake.
While that all sounded very impressive over two years ago, the reality still has yet to catch up with the promise. At CES 2019, 8K was a “big” thing, and the HDMI Forum booth had several demonstrations of 8K signaling, including an 8K home theater centered around a Samsung 900-series 85-inch 8K TV. (Ironically, the earlier versions of this TV shipped with the older and slower HDMI 2.0 interface.)
If you dug a bit deeper and asked a few more questions, you would have learned that the testing and certification process for the v2.1 interface is still very much in progress and is not likely to wind up until the fall of this year. What’s more, only one chip manufacturer (Socionext) was cranking out v2.1 TX and RX chips in any quantities as of the end of 2018, with other fabs just getting up to speed.
The hype over HDMI 2.1 reached a bit of absurdity when a prominent television manufacturer declared at their CES press conference that all of their 2019 Ultra HD televisions would have v2.1 inputs. (No mention as to how many.) Further questioning revealed that, although video signals could enter one of these Ultra HD televisions through a v2.1 physical interface, the signals would be processed as v2.0 inside the set.
Why the push for v2.1? Simple. The latest enhancements to TV – high dynamic range and its associated wider color gamut – create a lot more bits per second. And v2.0 looks more and more like a giant speed bump in that context. Presently, you can push a 4K/60 signal through HDMI 2.0 IF you reduce the bit depth to 8 bits per pixel, using the RGB (4:4:4) format. Want to send a 10-bit signal at the same frame rate? Now you have to cut the color resolution to 4:2:2, not easy to do with a computer video card.
While 48 Gb/s may be unattainable in the near future, a data rate around 36 Gb/s could be within reach. That would allow the passage of 4K/60 content with 12-bit RGB color, a truly spectacular set of images streaming at just shy of 30 Gb/s. Or, you could generate a high frame rate (120 Hz) 4K signal for gaming purposes, using 12-bit 4:2:2 color and still get under the wire at 33.3 Gb/s.
The challenge for HDMI has always been higher bit rates over copper. Unlike DisplayPort, there is no provision in the HDMI 2.1 specification for transport over optical fiber, although that shouldn’t be difficult to accomplish given the interface’s packet structure. Above 40 Gb/s, we may have to use optical fiber simply because signal losses over copper wires would be too high to maintain a workable signal-to-noise ratio.
Over in the DisplayPort camp, there hasn’t been a lot of counter-punching going on. Few manufacturers support the DisplayPort v1.3/1.4 standard (v1.4 adds support for HDR metadata, plus color resolutions other than 4:4:4 / RGB) and it’s only the more exotic video cards that would require that kind of speed. Gaming is a good example of a speed-intensive application and that crowd would love to have 12-bit color refreshing at 120 Hz. Or maybe even faster.
Where does that leave our industry? You’ll be hard-pressed to find many signal management products at InfoComm in June that support v2.1 – it took our industry almost four years to really get onboard with v2.0, and we still get press releases from companies boasting how they finally added HDMI 2.0 to their media players and other products. (Well, it’s been five-and-a-half years, you know!)
From our perspective, we don’t expect to see much adoption of v2.1 until a year from now, and even then, things will move slowly. The ProAV industry is more obsessed with the transition from uncompressed high-bandwidth signal distribution to compressed IT-based distribution, centering on 10 Gb/s network switches. (Sorry, you 1 Gb/s fans, that’s just too slow for future-proofing.)
A good “tell” will be how many 4K and 8K TVs (yep, 8K TVs – repeat as often as necessary) start arriving in the fourth quarter of 2019 with one or more v2.1 inputs. The more TV manufacturers get with the program, the more likely you’ll see them on commercial monitors and digital signage displays next year in Las Vegas. Another “tell” will be how quickly our industry embraces 8K commercial displays (more to come in our next blog post). Without v2.1, 8K will be nigh impossible.
In the meantime, we’re reminded of that classic song by The Kinks that goes, “I’m so tired, tired of waiting, tired of waiting for you….”
We recently attended the annual Society of Motion Picture & Television Engineers technology conference in Los Angeles. SMPTE has been holding this event for decades and it attracts the best and the brightest to talk about advancements in everything from television and motion picture production to human visual science, advances in audio, video compression and transmission, and lately, promoting the field to women and college graduates.
One invited papers session in particular stood out this year, and its focus was on 8K television. Regular readers of this blog may remember that much of the research and development in this area is being undertaken by NHK, the national Japanese broadcasting network. NHK commenced their work way, way back in 1995, first achieving 4K resolution in camera sensors in 2004 and then introducing their first 8K camera sensor in 2006.
Since then, they’ve designed and built a 4-pound 8K camera system, pushed sensor speeds to as high as 480 Hz, and created a simultaneous downconversion product that ingests 8K video and spits it out at native resolution, 4K resolution, and Full HD resolution. As you can imagine, that requires a ton of processing power!
At this year’s session, one speaker from NHK detailed their work in next-generation camera sensors for 8K that incorporate a unique organic photoconductive film (OPF) layer to boost sensitivity while keeping noise to a minimum. That’s a real challenge when working with small camera sensors – Super 35 sensors for 4K (Ultra HD) production are already jammed full of tiny photosites, a design headache for camera engineers. Now, imagine you’re asked to double the number of pixels in the same size sensor, where the average pixel measures about 3 micrometers!
The second paper described a new 1.25” 8K camera sensor that can record video at frame rates as high as 480 Hz, or eight times as fast as conventional sensors. Using this sensor, fast motion can be captured with minimal blurring and very fine detail. The captured video is down-converted in-camera to 120 Hz for eventual recording and playback. As you might guess, the data flowing from the camera sensor is a gusher: Uncompressed, with 10-bit 4:2:2 color sampling, it approaches 100 gigabits per second (Gb/s), or more than twice as fast as the latest version of HDMI (2.1) can handle.
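That “gusher” is easy to ballpark. Here’s a quick sketch, assuming the 8K total raster simply doubles 4K’s 4400 x 2250 in each dimension – 8800 x 4500 is our assumption, not NHK’s published figure:

```python
# Ballpark the downconverted 8K/120 camera output described above.
# Assumed total raster: 8800 x 4500 (4K's 4400 x 2250, doubled per axis).
samples_per_pixel = 2                    # 4:2:2 color sampling
bits_per_sample = 10
rate_gbps = 8800 * 4500 * 120 * samples_per_pixel * bits_per_sample / 1e9
print(round(rate_gbps, 1))               # 95.0 -- in the neighborhood of 100 Gb/s
```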
The final NHK paper talked about setting up the world’s first full-time 4K/8K satellite broadcasting system, which launched in December of 2018. Aside from the technical challenges of bandwidth (both left-hand and right-hand circular polarization of the radio waves were necessary to carry all of the signal data), there was an additional obstacle: Many Japanese residents live in older apartment buildings, making the cable infrastructure upgrade process difficult. It was eventually solved by installing parallel lines of plastic optical fiber (POF, or Toslink) alongside existing coaxial cable, telephone, and power lines.
Where is the relevance to our industry? Consider that, ten years ago, Ultra HD and 4K video was largely a lab experiment to many of us. In 2009, we were just getting used to managing Full HD signals over signal distribution and interfacing systems, wrestling with color bit depth and varying frame rates, not to mention the limitations and foibles of HDMI.
Yet, three years later, the first commercial Ultra HD monitors washed up on our shores. A decade later, Ultra HD has become the default resolution for consumer televisions and most commercial AV monitors and displays. Just as we did in 2009, we’re wrestling with the same signal management issues, color bit depths, refresh rates, and a whole new version of HDMI…which isn’t even ready to handle the higher bit rates that 4K video requires for higher frame rates and higher color bit depths.
So, while we fuss, argue, complain, and try to adjust to this latest jump in resolution to 4K, there is a country that is already working at TWICE that video resolution for acquisition, editing, storage, and distribution to the home. There’s no reason to think we won’t catch up to them eventually – the first 8K televisions already launched to the North American market earlier this year, and we’re seeing early interest in 8K displays for specialized installations like command and control, surveillance, visualization and augmented reality, and (of all things) 3D visualization using autostereo displays.
Skeptics can scoff all they want, but this never-ending push upward and onward in spatial resolution isn’t going to stop. If anything, additional momentum will be provided by enhancements like high dynamic range, wider color gamuts, and high frame rate video. (Did you know that, as display screens get larger and fields of view become wider, any flicker in images created by judder and slower frame rates becomes increasingly noticeable? NHK studied this phenomenon and concluded that a minimum frame rate of 80 Hz was required for 4K and 8K on large displays.)
And as usual, we’ll be expected to interface and transport these signals. The SMPTE SDI standard for UHD (12G SDI) is already inadequate for single-wire serial digital connections, having a maximum data rate of 11.88 Gb/s. This has resulted in 8K camera manufacturers employing four separate 12G SDI ports and some light compression in-camera to record 8K/60 video with 10-bit 4:2:2 color (uncompressed data rate of 47.7 Gb/s). And it’s also revived interest in the latest SMPTE standard for SDI, 24G (23.76 Gb/s, likely over optical fiber).
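A quick sketch makes that four-link math concrete, taking the post’s 47.7 Gb/s figure as given:

```python
# Why four 12G SDI links still need light in-camera compression for 8K/60.
UNCOMPRESSED_GBPS = 47.7      # the figure above: 8K/60, 10-bit 4:2:2
LINK_CAPACITY_GBPS = 11.88    # one 12G SDI connection
per_link = UNCOMPRESSED_GBPS / 4               # ~11.93 Gb/s per link
print(per_link > LINK_CAPACITY_GBPS)           # True: raw video won't quite fit
print(per_link / 2 <= LINK_CAPACITY_GBPS)      # True: gentle 2:1 compression fits easily
```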
This should be interesting to watch, particularly since our industry is still working around the six-year-old TMDS-based HDMI 2.0 interface (18 Gb/s), is still largely allergic to optical fiber, and is promoting an AV/IT video codec that’s barely fast enough to squeeze 4K/60 10-bit 4:4:4 video through a 10-gigabit network switch.
Do you feel the need for speed yet?
Back in September, we discussed the escalation of “Ks,” as in how many thousands of pixels the display industry is trying to stuff into next-generation LCD, OLED, and inorganic LED panels. We mentioned that the first 8K displays are now coming to market, even as our industry is still trying to come to grips with the care, feeding, and handling of 4K / Ultra HD video signals.
Things are moving more quickly than anticipated. The HDMI Forum recently held a press conference in New York City to talk about HDMI 2.1 and where it’s headed. This newer, faster version of HDMI was first introduced at CES in 2017 and is quite the departure from previous versions.
Instead of using transition-minimized differential signaling (TMDS), which was the foundation of digital display interfaces going back to DVI in 1999, version 2.1 has adopted a packet format very similar to that of DisplayPort. By doing so, HDMI 2.1 can now expand signal carriage to four lanes of data with an embedded clock, compared to the older three lanes with a separate clock used in all HDMI versions through 2.0.
There are other advantages. Because the signal is now packetized, it can be compressed using Display Stream Compression (DSC), which will come in really handy with the massive signals needed to handle high frame rate video and 8K. Another advantage is that the clock rates and data are free to zoom far beyond the 18 Gb/s limit of version 2.0.
Indeed, HDMI 2.1 now has a maximum data rate of 48 Gb/s (or 12 Gb/s per lane). That number is mind-boggling: We’re only starting to see network switches with that much speed come to market. But if you run the numbers, you WILL need that kind of speed for advanced high-resolution imaging.
Consider a 4K signal with high dynamic range and a 120 Hz frame rate. The base clock rate for such a signal, using standard CTA blanking, would be 4400 pixels × 2250 pixels × 120, or 1188 MHz (1.188 GHz). Add in 10-bit color (the minimum for HDR) with 4:4:4 (RGB) color resolution, and the grand total (after shopper coupons) is 1188 × 12 × 3 = 42.77 Gb/s. Going to lower color resolution lowers the tab a little: With 4:2:2 color, the data rate is 28.51 Gb/s, and with 4:2:0 color, it drops to 21.39 Gb/s.
That’s still pretty fast – too fast for HDMI 2.0. And if we start talking about 8K imaging, things get even crazier. An 8K video stream (again, using standard CTA blanking) with just 10-bit RGB color at 60 Hz refresh will leave you in a cloud of dust:
8800 × 4500 × 60 × 12 × 3 = 85.536 Gb/s.
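The arithmetic above generalizes neatly into a small calculator. It mirrors the article’s own math, which carries samples in 12-bit transport words even for 10-bit video:

```python
# Data-rate calculator reproducing the totals worked out above. Rasters
# include standard CTA blanking; bits-per-sample follows the article's
# own arithmetic (12-bit transport words).

def video_gbps(h_total, v_total, fps, bits, subsampling):
    samples_per_pixel = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}[subsampling]
    return h_total * v_total * fps * bits * samples_per_pixel / 1e9

# 4K/120 HDR (4400 x 2250 total raster)
print(video_gbps(4400, 2250, 120, 12, "4:4:4"))  # 42.768 Gb/s
print(video_gbps(4400, 2250, 120, 12, "4:2:2"))  # 28.512 Gb/s
print(video_gbps(4400, 2250, 120, 12, "4:2:0"))  # 21.384 Gb/s

# 8K/60 (8800 x 4500 total raster)
print(video_gbps(8800, 4500, 60, 12, "4:4:4"))   # 85.536 Gb/s
```

Swap in any raster, frame rate, or chroma format and you can see immediately whether a given interface has the headroom.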
Zoom-zoom! We’d have to drop to 4:2:0 color resolution just to get that signal through an HDMI 2.1 connection. Even 4:2:2 color would be too fast at about 57 Gb/s. The current version of DisplayPort would also vanish in the rear-view mirror, as it is capped at 32.4 Gb/s. (We expect to hear about a new version of DP at CES next month, presumably one that’s a LOT faster.)
This is presumably where DSC would enter the picture. It is capable of 2:1 compression with extremely low latency, and that would get our example 8K/60 signal down to earth and to a point where it could travel over HDMI 2.1 (but not DP). The only catch is, DSC requires quite a bit of computation to work correctly and is considered “CPU-hungry,” which of course adds cost to its implementation.
What’s curious about HDMI 2.1 to us is the continued lack of a native optical transport specification. Any signal running in the 40 Gb/s range should probably travel over optical fiber. Certainly, if it’s going to travel through a 40 Gb/s network switch, that transport will be as pulses of light and not electrons dancing on the outer edge of copper conductors.
We inquired at the NYC press event if any HDMI Forum members were actually making v2.1 transmitter and receiver chipsets yet. So far, only one company in Japan (Socionext) is doing that, but you would be hard-pressed to find any commercial or consumer products that support v2.1 at present. (We’ll certainly have our eyes open at CES for one!)
As mentioned in September, it’s expected that over 5 million 8K TVs will be shipped worldwide by the end of 2020 – just two years from now. Hand-in-hand will be a small but growing number of 8K monitors for commercial use (yes, there are customers waiting for such products, believe it or not) and the vast majority of those will come from super-sized LCD panel “fabs” in China that are currently under construction or just firing up.
We’ve frequently used this expression in the past: “What good is a Ferrari if you live on a dirt road?” Well, that’s pretty much the situation we’re looking at with the next generation of displays. Higher resolution, high dynamic range, wider color gamuts, and high frame rates will all add up to super-sized packages of display data that dwarf what we switch and distribute today.
New codecs like JPEG XS / TiCo will help to squeeze things through network switches, but we’ll still have a choke point at the physical display interface. And we don’t have any real solutions to the problem just yet: Do we use compression? Double up on interface connections? Skip the traditional HDMI / DP interface altogether, and use a decoder inside the display to decompress the signal?
We’ve just returned from the annual Society of Motion Picture & Television Engineers (SMPTE) technology conference in Los Angeles. This is one of the pre-eminent motion imaging and media delivery conferences in the world, attracting papers from the best and the brightest working across a diversity of disciplines. Image capture, signal distribution, storage, displays, video compression, virtual and augmented reality, streaming – you name it, there was a session about it.
One of the more intriguing sessions covered artificial intelligence (AI) and machine learning (ML), particularly as those apply to post-production and media workflows. AI and ML are both hot-button topics right now, and more pervasive than you might think. EDID is a very rudimentary form of AI that must be programmed, but it allows displays and video sources to automatically make the best connection in terms of image resolution, frame rates, and color modes.
Internet of Things (IoT) products for the home incorporate both AI and ML to make predictions. Every time you use an IoT device in conjunction with other devices, or perform the same set of operations when you use that device, it can “learn” the patterns and save them as a “macro.” With enough on-board intelligence, the device can ask you if you’d like to repeat previous instructions and then execute those instructions automatically.
A good example would be leaving the house, turning down the thermostat, and switching on selected lights along with an alarm. All of these actions can be saved and repeated automatically, and the group macro given a name (“Out for The Evening”). You just need to tell your voice recognition system to execute that command.
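The “Out for The Evening” idea can be sketched as a toy macro recorder: record a sequence of device commands once, name it, and replay it on request. Every device name and command string below is purely illustrative, not any real IoT API:

```python
# Minimal sketch of the "learned macro" concept described above.
# Devices and commands are hypothetical placeholders.

class MacroController:
    def __init__(self):
        self.macros = {}        # macro name -> list of (device, command)
        self.recording = None   # name of the macro being recorded, if any

    def start_recording(self, name):
        self.recording = name
        self.macros[name] = []

    def do(self, device, command):
        """Execute a command now; capture it if a macro is recording."""
        print(f"{device}: {command}")
        if self.recording:
            self.macros[self.recording].append((device, command))

    def stop_recording(self):
        self.recording = None

    def run(self, name):
        """Replay a saved macro, e.g. on a voice command."""
        for device, command in self.macros[name]:
            print(f"{device}: {command}")

ctl = MacroController()
ctl.start_recording("Out for The Evening")
ctl.do("thermostat", "set 65F")
ctl.do("lights", "hallway on")
ctl.do("alarm", "arm away")
ctl.stop_recording()
ctl.run("Out for The Evening")   # replays all three commands
```

A real system would add the “learning” half (noticing that the same three commands always occur together and offering to save them), but the record-and-replay core looks much like this.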
In our world, the individual commands that turn on lights in a room and activate selected pieces of AV gear are already programmed into macros, accessed from a touch screen. With facial and voice recognition, you wouldn’t even need the touchscreen – the system would recognize you automatically, determine if you are authorized to use anything in the room, and ask your preferences. (You’ll know you’re in trouble if your IoT system says, “I’m sorry Dave, I can’t allow you to do that.”)
In the SMPTE world, AI and ML can be used for more sophisticated functions. Let’s say you have a great deal of footage from a film shoot that’s been digitized. AI can search that footage automatically and sort it, based on parameters you choose. With facial recognition, it can group all takes featuring a given actor, a certain cityscape background, or daytime vs. nighttime shots. It’s conceivable that AI & ML could even look for continuity errors by rapidly scanning takes. (Did you know NASCAR has digitized over 500,000 hours of video and film from 1933 to the present in their library, searched and accessed by AI?)
There are parallels to other industries. In the legal world, document searches that were once performed by legions of low-paid clerks are now executed by AI robots, programmed to look for specific key words. Demonstrations have been made of advertising and marketing copy written entirely by AI, based on keywords and macros previously programmed. There have even been attempts to have robots write fiction!
Another popular session topic – one which took up an entire day – was high dynamic range (HDR). According to a session chair, HDR “is a hot mess right now” as there are multiple competing standards, no consistency in coding metadata for HDR program content, and a lot of unanswered questions about delivering HDR content to viewers and measuring the quality of their experience.
For many attendees, there were plenty of basic questions about HDR – how does anyone define it, exactly? How often is it used in current movies and television programs? Are there metrics that can be used to define the quality of the HDR experience? What are the “killer apps” for HDR? How does HDR affect emotional and perceptual responses in viewers?
For the AV industry, both AI and HDR will be hot-button topics in 2019. With each passing year, more of the signal distribution, coding, and storage infrastructure we build and use will become automated. The day is coming when we’ll stop obsessing over display resolution and media formats and will instead search for content by name in the cloud to play back on whatever display we have on hand.
AI will create and store multiple resolutions of the desired content and stream files to us at the highest possible resolution and frame rate that our network connection can reliably support. (That’s already happening with advanced video encoders and decoders that “talk” to the network, determine the safe maximum allowable bit rate, and change it on the fly as network conditions change.)
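That kind of network-aware rate adaptation can be illustrated with a toy rendition picker: choose the highest encoded version that fits safely under the measured network rate. The rendition ladder and the 0.8 safety margin here are assumptions for illustration, not any particular encoder’s behavior:

```python
# Toy adaptive-bitrate selection, in the spirit of the encoders
# described above. Bit rates and margin are illustrative assumptions.

RENDITIONS_KBPS = {"1080p": 6000, "720p": 3000, "480p": 1500, "360p": 700}

def pick_rendition(measured_kbps, margin=0.8):
    """Pick the best rendition that fits under a safety-margined budget."""
    budget = measured_kbps * margin
    fitting = {name: rate for name, rate in RENDITIONS_KBPS.items()
               if rate <= budget}
    if not fitting:
        return "360p"          # floor: always deliver something
    return max(fitting, key=fitting.get)

print(pick_rendition(10000))   # ample bandwidth -> 1080p
print(pick_rendition(4500))    # mid-range link  -> 720p
print(pick_rendition(500))     # congested link  -> 360p floor
```

A production player re-runs this decision continuously as throughput measurements change, which is the “change it on the fly” behavior the paragraph describes.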
Storage was yet another popular topic, as was blockchain. We’re not yet quite familiar with the ins and outs of blockchain (nor are many of you, no doubt!), but suffice it to say that the world is moving away from scheduled media distribution to individual, on-demand content consumption from cloud servers through a myriad of distribution channels. And many of those will rely heavily on wireless connectivity, increasingly through 5G wireless networks.
The SMPTE conference wouldn’t be complete without a look into the future. Our industry is still trying to get up to speed on 4K, yet 8K video is already on our doorstep. Movie theaters are looking into LED screens to replace the decades-old projector/screen model. We can now wrap a viewer in dozens of channels of “reach out and touch it” three-dimensional sound. (Did you know the National Hot Rod Association (NHRA) is working with Dolby to add multi-channel spatial sound to its telecasts?) And while virtual reality (VR) is still struggling to get off the ground, its counterpart augmented reality (AR) is moving ahead by leaps and bounds.
How much of this will affect the AV industry? All of it, sooner or later…
As a company primarily focused on signal management (switching, mixing, interfacing, and format conversion), Kramer doesn’t pick sides when it comes to signal sources and “sinks” (a/k/a displays). We’re more concerned with getting the signals there intact over a variety of connections, which today could mean anything from full-bandwidth HDMI cables to AV-over-IT and fast WiFi.
But we can’t help but observe overarching trends in the AV industry. And one that clearly stands out is a shift away from front projection to direct-view displays, a category that includes everything from flat panel LCD and OLED technology to emissive LED walls, particularly those that use fine-pitch LED arrays.
If you like to attend concerts by popular singers and groups (and who doesn’t?), you may have noticed the extensive use of image magnification (IMAG) in the form of towers of LED cubes, or arrayed as wide walls behind the band (or even both!). It’s hard to miss these stacks, particularly if the concert is outdoors and the sun hasn’t set yet.
Sharp-eyed viewers might also notice that just about every touring act – whether it be U2, Paul McCartney, Blake Shelton, Keith Urban, or a Broadway musical – now uses LED walls as set pieces and IMAG displays. And why not? They’re super bright, scalable, and from a staging standpoint, comparatively easy to assemble and disassemble. At least, more so than flying projectors and screens, the “old school” way it used to be done.
Fact is, LED walls (which primarily use components made in China and are being priced very competitively) have substantially eaten into the market share of high-brightness projectors. And it’s easy to see why, as there are no lamps or filters to change and no lenses to fit. You need a bigger image? You simply build a bigger LED wall. No worries about high ambient light levels, not when you’ve got upward of 3,000 nits of brightness to start with.
To achieve that level of brightness with a projector, you’d have to start with well over 30,000 lumens. And that doesn’t even consider the size of the projected image. For those playing at home, let’s assume we want to light up a 10’ x 18’ screen area with the equivalent of 3,000 nits, or 877 foot-lamberts.
10 × 18 = 180 square feet; 180 × 877 = 157,860 lumens
Yikes! That’s a LOT of lumens. Even five stacked 30,000-lumen projectors would come up short.
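The conversion behind those numbers is worth spelling out: 1 nit (cd/m²) is about 0.292 foot-lamberts, and for a unity-gain screen, required lumens equal foot-lamberts times the screen area in square feet. A quick check:

```python
# Screen-brightness arithmetic from the example above. Assumes a
# unity-gain screen; the article rounds 3,000 nits up to 877 fL.

NIT_TO_FOOTLAMBERT = 0.2919

target_nits = 3000
area_sqft = 10 * 18                               # 10' x 18' screen

foot_lamberts = target_nits * NIT_TO_FOOTLAMBERT  # ~876 fL
lumens = foot_lamberts * area_sqft                # ~157,600 lumens

print(f"{foot_lamberts:.0f} fL -> {lumens:,.0f} lumens")
```

Either way you round, the answer lands near 158,000 lumens – far beyond any practical projector stack.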
Perhaps a more dramatic example can be found closer to home. As noted in previous commentaries, the free-fall in LCD panel pricing has resulted in bargain-basement deals on televisions, specifically those with Ultra HD (4K) resolution. But you may not have noticed just how low those prices have fallen recently.
It is now possible to purchase 75-inch Ultra HDTVs for less than $1,500, with some Chinese brands falling perilously close to $1,000. This, in turn, has led panel manufacturers to “go big” and bring out even larger panels in the 80-inch range. Consequently, anyone can buy 82-inch and 85-inch Ultra HD televisions for less than $4,000 – and these sets also support high dynamic range and its associated, wider color space.
It wasn’t that many years ago that a Full HD home theater projector with about 2,000 lumens light output was priced at around $7,000. Add in a screen, brackets, and associated wiring, and you’d be well on your way to $10,000! Given that many home theater installations used screens in the 80-inch to 90-inch range, it’s almost a no-brainer to opt for the self-contained LCD television and do away with the screen, bracket, and a lot of extra wiring.
As you can see, fine-pitch LEDs and super-sized LCD televisions and monitors are nibbling away at projector market share from both ends. And this trend is only going to continue as panel prices and LED pitches continue to drop. So, what market does that leave for projectors?
The answer is any image that requires three-dimensional mapping, like curved walls, spheres, or unusual shapes (trapezoidal, multiple planes). LCD panels can be formed in many ways, but it’s not easy to make them into curved shapes. Their organic cousins (OLEDs) can be printed onto flexible substrates and warped into all kinds of unusual shapes – even cylinders – but have nowhere near the horsepower of inorganic LED walls. Projectors make more sense here, logistically and financially.
Where projectors fell a bit behind but are catching up is in resolution. The move to 4K started first and foremost with projectors almost 15 years ago, but attention shifted to large LCD displays around 2012. Since then, LCDs and now OLEDs have dominated the discussion, and compared to an 85-inch Ultra HDTV with HDR, a ‘true’ 4K home theater projector is quite an expensive beast. Projectors that use lower-resolution chips and image shifting have now come to the AV marketplace to try and keep pace with the move to 4K.
From our standpoint, all of these trends point toward two things: faster clock rates and a ton of pixel data moving from point A to point B. It doesn’t matter whether you opt for an LED videowall, a super-large LCD display, or a 4K projector! The increased refresh rates, expanded color bit depth for HDR, and new tricks like high frame rates will put greater demands on your signal switching and distribution systems.
We’ve got your back. Our engineers already have their calculators out…
In baseball, the letter ‘K’ is shorthand for strikeout – getting a batter to swing at or take a third strike. It’s not unusual to see fans of a particular pitcher holding up signs with large letter ‘Ks’ on them to signify how many strikeouts that pitcher compiles during a game. By the way, the record for a nine-inning game is 20 strikeouts, an almost-impossible feat accomplished by just two major league pitchers. Who were they? (Answer at the end of this article.)
In the world of electronics, “K” stands for the more conventional value of one thousand, being derived from the “K” in “Kilo,” which according to the dictionary is “…a Greek combining form meaning “thousand,” introduced from French in the nomenclature of the metric system” and “…French, representing Greek chī́lioi or a thousand.”
The display industry has become fixated on “Ks” lately. Until the late 1990s, we didn’t have any displays capable of “kilo” pixels of resolution: Just 20 years ago, the first plasma display monitors came to market with 1,280 horizontal imaging pixels, making them the first “kilo” displays (at least, in one axis). After the turn of the 21st century, we started to see display panels with almost 2,000 horizontal pixels (1920, to be exact) and for the first time, more than 1,000 vertical pixels (1080 and 1200, respectively).
Wow, that was a lot of pixels – 2,073,600 to be precise. And most of us figured that would be good for some time to come – who would need more resolution than that?
Turns out, everyone. Aside from some unusual high-resolution displays from Apple that had 2560 horizontal pixels, we were stuck at 1920x1080 and 1920x1200 for a few years. That is, until 2012, when a new crop of so-called “4K” displays made an appearance at the IFA consumer electronics show. (To be fair, Sony had been selling high-brightness SXRD projectors for cinema applications since the mid-2000s and these models had 4096 horizontal pixels.)
Six years later, televisions and monitors with 4K resolution (mostly 3840 horizontal and 2160 vertical pixels) have become commodities. What happened? For starters, Chinese display manufacturers made big bets on 4K LCD panels for TVs, figuring that Full HD (1920x1080) would provide diminishing returns over time. They constructed new fabrication lines to crank out panels by the tens and hundreds of thousands each month.
Korean manufacturers didn’t sit on their hands, ramping up production of 4K LCD and organic light-emitting diode (OLED) panels for televisions and commercial applications. “4K” became a buzzword for the latest and greatest in flatscreen displays. Screen sizes increased as prices continued to drop – from $238 per diagonal inch for those original, limited-function 84-inch 4K LCD monitors to an amazing $9 per diagonal inch for a 55-inch 4K television today.
While that’s a bargain price for consumers, there’s little or no profit for panel and display manufacturers when a 60-inch 4K TV can be had for less than $1,000. So, the Chinese took the lead again, deciding maybe it was time to jump to the “next” K – 8K, or more specifically, 7680 horizontal and 4320 vertical pixels. (For those keeping score at home, that represents about 33 million total pixels, or 16 times the resolution of an old-fashioned Full HDTV.)
What – has the world gone crazy? Aside from some Ultra HD Blu-ray discs and Netflix/Amazon streaming, there’s very little 4K content to watch today. And you want me to buy an 8K TV next time around?
Remember – it’s not about content, it’s about profitability. And it’s also about televisions and monitors getting larger and larger. 8K resolution on a 42-inch display that sits ten feet from the nearest viewer makes no sense at all. But 8K resolution on an 80-inch display that sits just a few feet away does make sense. Think of how coarse outdoor LED signs appeared in the early 2000s. Now, wander through InfoComm and notice the 20-foot and 30-foot fine-pitch LED displays that are popping up everywhere: With a fine dot pitch (say, 1.2mm and down), they’re approaching 8K resolution.
Market research done by a few companies predicts that over 5 million 8K TVs will be produced and sold worldwide by 2020 – less than a year and a half from now. Granted, many of those sales will take place in China, but you will see 8K televisions on store shelves by Christmas of this year and certainly no later than the 2019 Super Bowl.
Think we’re crazy? At the recent IFA Show, both LG and Samsung announced they were bringing 80-inch-class 8K televisions to retail this fall. LG’s entry, which has no model number or pricing information yet, is an 88-inch OLED TV. (For those keeping score at home, that’s a little more than seven diagonal feet.) Samsung’s answer is the Q900FN QLED 8K TV, an 85-inch LCD display that uses quantum dot backlighting to produce high dynamic range images and is supposed to arrive on these shores in October. There’s no way to predict retail prices for either product, but it’s a safe bet they’ll be more than $9 per diagonal inch.
If you still can’t get your head around the fact that 8K TV is right around the corner, this will blow you away: Innolux, a Taiwanese display manufacturer, showed a 16K 100-inch monitor at a trade show in China in late August. Yep, you read that right – 16K, or more specifically, 15,360 horizontal × 8,640 vertical pixels. If you can’t see the pixel structure on an 8K display unless you are just 1.5 feet away, you’ll never spot it on this display without a jeweler’s loupe.
Crazy, right? So, what does this mean for signal management and interfacing products? With Full HD, we have to move about 2.5 million pixels every frame, 60 times a second (counting the blanking interval). For 4K (Ultra HD), the payload jumps to 9.9 million pixels per frame, and for 8K, we’re looking at 39.6 million pixels per frame. Our 594 MHz pixel clock for 4K now accelerates to about 2.4 GHz, and a data rate of about 22 gigabits per second (Gb/s) now rockets to about 90 Gb/s for a 10-bit RGB 8K signal. Got bandwidth? We sure hope so….
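Those figures scale mechanically from the total raster sizes. A quick recomputation, where the 1.25× factor approximates TMDS-style 8b/10b line-coding overhead (which is how a 17.8 Gb/s 4K payload becomes roughly 22 Gb/s on the wire, and 8K becomes roughly 90 Gb/s):

```python
# Pixel counts, clock rates, and link rates for the formats above.
# Rasters include blanking; 10-bit RGB = 30 bits/pixel; the 1.25x
# factor approximates TMDS 8b/10b line-coding overhead.

def link_rates(h_total, v_total, fps=60, bits_per_pixel=30):
    pixels_per_frame = h_total * v_total
    clock_mhz = pixels_per_frame * fps / 1e6
    payload_gbps = clock_mhz * bits_per_pixel / 1000
    line_gbps = payload_gbps * 1.25
    return pixels_per_frame, clock_mhz, payload_gbps, line_gbps

for name, raster in [("Full HD", (2200, 1125)),
                     ("4K",      (4400, 2250)),
                     ("8K",      (8800, 4500))]:
    pixels, clock, payload, line = link_rates(*raster)
    print(f"{name}: {pixels/1e6:.2f} Mpx/frame, {clock:.0f} MHz clock, "
          f"~{line:.0f} Gb/s on the wire")
```

Each step up the ladder quadruples every number, which is why interfaces keep running out of road.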
It’s generally accepted that most of the product innovation in the AV industry originated in the world of consumer electronics. Blu-ray discs, tablets, big and inexpensive Ultra HD displays, faster WiFi, and streaming video all got off the starting line ten years ago. Even more recent trends like Internet of Things control systems and the migration from projected images to fine-pitch LED screens are largely driven by consumer behavior.
Look at all of the wireless collaboration systems that must continually update their operating systems to handle content from Apple hardware as each version of iOS comes to market. (It’s a wonder Apple hasn’t come out with its own wireless collaboration platform.) And incremental improvements to WiFi are also being driven by a growing demand to stream video to the home. Speaking of video, more institutions of higher learning are posting instructional videos to YouTube channels, and of course those channels must be accessible to students.
The migration of AV signals to IT infrastructures has followed a similar trend in the CE world. Cable TV companies are already implementing delivery of video, audio, and data over fast wireless networks in the home, eliminating the need for a traditional coaxial cable connection. That’s largely because many new homes are not being built with category wire in their walls. Instead, builders assume homeowners will just rely on fast wireless to make all their connections.
It’s not a stretch to imagine a meeting room in the not-too-distant future that relies on wireless connectivity for every device – even videoconferencing. We’re already at that point with home offices, where teleconferences use GoToMeeting, WebEx, Skype, Zoom, and other software-based codecs through 802.11ac routers. It works! So why not add in wireless voice control systems?
Well, things don’t always work out as planned. A recent survey conducted by the website The Information revealed that, while about 50 million Amazon Alexa voice recognition systems have been purchased, only 100,000 or so of their users have actually ordered anything from Amazon using Alexa. It appears that Alexa users are primarily streaming music, not ordering groceries, toilet paper, or pet food. (Or headphones, bicycles, smart televisions, clothing, batteries, etc.)
It’s clear that the Alexa platform is quite popular, but still a novelty for many. Anecdotal evidence reveals that there is still a learning curve to be tackled for people to fully embrace voice control systems, whether they come from Amazon, Google (Assistant), Samsung (Bixby), or other companies. But with an increasing number of gadgets coming to market equipped with network interfaces, there will be solid growth in IoT control systems applications. Right now, things like doorbell cameras linked to televisions and lights that dim and change color seem to be the popular applications.
One thing that may give pause is the perception that some large corporation is collecting data on you and your viewing/buying/consumption habits as you issue voice commands. Well, they probably are. But that concern could present a market opportunity for a somewhat dumber voice control system that can handle all of the control “stuff” and leave the shopping to others. Granted, such a device would come at a cost premium: Amazon eats a lot of the cost of Alexa boxes because they want you to buy things with it and figure they’ll make their profit at the back end, especially from Prime members.
What’s intriguing is the fact that most appliances for the home that were shown at CES back in January incorporate the Google Assistant platform. (“OK, Google!”) Google is not in the retail business, but they are very much in the “big data” business. Their voice control product is obviously aimed at supporting IoT control in the home, but it can and will also gather data every time you issue a command, even if it’s just to stream music from the Google Play store.
Samsung’s Bixby VC platform exists for yet a different reason, and that is to convince you to buy as many Samsung products for your home as possible. Televisions, washers and dryers, refrigerators, tablets, laptops, smartphones – any and all of these can easily integrate into a Bixby−controlled universe of appliances. Samsung has gone so far as to state that EVERY product they make will be “connected” by 2020.
So, what does all of this mean to our market? First off, the more end-users become comfortable with voice recognition and control, the more they’ll request it be part of an AV installation. If you can walk into your house and tell Bixby to turn on the lights and television (and probably the oven to warm up last night’s pizza), you will logically assume that you can walk into any room and turn things on with your dulcet tones. That would include classrooms, meeting rooms, and even huddle spaces.
Second, the smart companies will be busy developing drivers for these VC systems so they can indeed be used to control the AV gear in a given room. (Even if it’s just adjusting lights and drapes at first.) Once a potential customer learns that a high-profile installation has adopted VC, they’ll want it for their facility. There’s nothing like keeping up with the Joneses to motivate customers, especially in high-profile installations.
Third, the adoption of VC systems will drive IT administrators crazy. The security issues alone could represent a Pandora’s Box (or Pandora’s voice control) to admins: How can you be sure the voice recognition system isn’t responding to a recording? Do you require two-factor authentication, such as facial recognition, a thumbprint, or even a spoken password before executing a command or series of commands?
Fourth, someone will develop a generic voice control system not tied to selling groceries, collecting data, or checking to see if the spin cycle is over. This VC system will be targeted specifically at the commercial AV market (and possibly residential customers) and come with an appropriate interface to translate commands into addressable IP packets to operate just about anything with an Internet hook-up. There may even be multiple, incompatible VC products offered for sale.
But make no mistake, voice control is coming. You have our word on that. (And “no, Alexa, I did not just order five cases of Double Stuff Oreos!”)
If you’ve been paying even the slightest attention to trends in the AV industry, you know that fast wireless connectivity has become a very important part of any installation. In particular, WiFi is integral to all of the wireless collaboration and presentation−sharing devices that are overrunning classrooms and meeting rooms.
What you may not be aware of is how hard WiFi protocol developers are working to try and stay ahead of the tidal wave of consumer and commercial products that are completely and utterly reliant on wireless connections. Those readers who’ve been around long enough to remember the crude attempts in the late 1990s to stream static images and presentations to projectors should be suitably amazed that anyone can now watch 1080p/60 video on a mobile device – without dropped packets and buffering.
But is that benchmark good enough? Nope, particularly with 4K video now getting a foothold. The current “fast” protocol is IEEE 802.11ac, otherwise known as channel-bonding WiFi. This protocol combines two or more 20 MHz wireless channels to boost bandwidth and connection speeds, enabling the “holy grail” of reliable 1080p/60 streaming.
Well, that used to be the holy grail. Now, customers want to stream multiple 1080p/60 videos without buffering while simultaneously passing the usual TCP/IP traffic. So, the hard-working wireless techies have come up with an even faster standard – 802.11ax.
What’s in an “x”? Supposedly improved performance, extended coverage, and longer battery life. 802.11ax can deliver a single video stream at 3.5 Gb/s, and with new multiplexing technology, can deliver four simultaneous streams to a single endpoint with a theoretical bandwidth of 14 Gb/s.
802.11ax does this by using a higher level of modulation – quadrature amplitude modulation (QAM), to be precise. Your cable TV company sends you digital TV programs using 256-QAM (256 symbol levels, or 8 bits per symbol). 802.11ax goes even further by employing 1024-QAM (10 bits per symbol) and combines it with more antennas (multiple in, multiple out, or MIMO).
A newer version of MIMO, Multiple User MIMO (MU-MIMO), can provide up to eight simultaneous streams of video from one wireless access point. Unlike 802.11ac, version “x” operates in both the 2.4 GHz and 5 GHz wireless bands to combine and format channels. And a technology called Orthogonal Frequency Division Multiple Access (OFDMA) allows each MU-MIMO stream to be split into four additional streams, boosting the effective bandwidth per user by a factor of four.
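The modulation gain is easy to quantify: the QAM order sets bits per symbol (the base-2 log of the constellation size), and spatial streams multiply on top of that. A quick sanity check of the figures quoted above:

```python
# Simplified link math for the 802.11ax speed claims above --
# not a full PHY rate calculation.

import math

def bits_per_symbol(qam_order):
    """Bits carried by one symbol of a QAM constellation."""
    return int(math.log2(qam_order))

print(bits_per_symbol(256))    # 8 bits  -- 802.11ac's top modulation
print(bits_per_symbol(1024))   # 10 bits -- 802.11ax: 25% more per symbol

# Spatial streams compound the gain: four ~3.5 Gb/s streams to one
# endpoint yield the ~14 Gb/s theoretical aggregate quoted above.
print(4 * 3.5)                 # 14.0
```

Real-world throughput lands well below these theoretical peaks, of course, once protocol overhead, range, and contention take their cut.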
The difference between earlier versions of the 802.11 protocol and version “x” is like the difference between a Toyota Corolla and a Ferrari. Whereas older versions of wireless lacked the ability to dynamically shape streams and antenna beaming paths, “x” basically modifies every single parameter of a wireless connection to optimize streams of packets. (Did we mention it’s also smart enough to detect ongoing activity and wait until the channel is clear to begin transmissions?)
If you’ve figured out that you’ll need all-new WiFi routers and access points to use any of these features, you win the giant stuffed teddy bear! Your in-room wireless networks will have to be upgraded to 802.11ax at some point, but the good news is that new WiFi gear is pretty inexpensive. What’s more, 802.11ax is backward-compatible with systems like 802.11ac and 802.11n.
Hey – wait a minute. Haven't we written about 802.11ad before and called it the best thing since waffle irons? True, we did. And for sheer speed, you can't beat "d," as it supports streaming rates of over 3 Gb/s per channel, with six 2 GHz channels to work with. No congestion here – it's the equivalent of having an 8-lane highway accessible during rush hour.
But there's a catch. (There's ALWAYS a catch.) 802.11ad operates in the 60 GHz radio band; way, WAY above the frequencies used for 802.11ac and 802.11ax. Radio waves at this frequency are quite small – a full wave measures just 5 millimeters, about a fifth of an inch – meaning that antennas for this band can be formed right on a semiconductor layer.
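That on-chip antenna trick follows straight from the free-space wavelength formula, λ = c/f. A quick sanity check (plain physics, nothing specific to the 802.11ad specification):

```python
def wavelength_mm(freq_ghz: float) -> float:
    """Free-space wavelength in millimeters: lambda = c / f."""
    c = 299_792_458.0                    # speed of light, m/s
    return c / (freq_ghz * 1e9) * 1000.0

print(round(wavelength_mm(60), 1))   # ~5.0 mm at 60 GHz -- small enough for on-chip antennas
print(round(wavelength_mm(2.4)))     # ~125 mm at 2.4 GHz
print(round(wavelength_mm(5.0)))     # ~60 mm at 5 GHz
```

The shrinking wavelength is also why the 60 GHz band behaves so differently from 2.4 and 5 GHz when it comes to range and penetration, as described next.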
And the catch is that 60 GHz radio waves have a very limited range at maximum power levels (FCC rules allow about 1 watt), which means if you move more than about 30 feet (10 meters) from a 60 GHz WiFi access point, you'll probably lose the signal. 60 GHz signals also won't pass through solid objects of any kind, which does provide an accidental form of security when you think about it.
In contrast, 2.4 GHz wireless signals can travel through many solid surfaces that aren't conductive. And 5 GHz radio waves can penetrate wood floors, glass, sheet rock, curtains, and plastics for quite a distance. With boosters, you can increase the range of 2.4 and 5 GHz signals by several tens of feet. (A similar trick will work at 60 GHz with steerable antennas.)
Practically speaking, both wireless modes can co-exist nicely. "d" is perfectly suited for short-range, high-bandwidth links in rooms, and "x" can pick up the slack over longer distances in larger spaces. Indeed, we've seen tri-band modems come to market in recent years that incorporate the older "c" version with "d," although they have been slow to gain adoption.
As our industry finds more ways to get rid of wires, especially when connecting devices configured for Internet of Things (IoT) control, demands for more efficient and faster WiFi will only increase. Versions “x” and “d” should satisfy those demands for a few years…we hope…
If you managed to make it out to this year’s running of InfoComm, you might have summarized your trip to colleagues with these talking points:
(a) LED displays, and (b) AV-over-IT.
Indeed, it was impossible to escape these two trends. LED walls and cubes were everywhere in the Las Vegas Convention Center, in many cases promoted by a phalanx of Chinese brands you've likely never heard of. But make no mistake about it – LEDs are the future of displays, whether they are used for massive outdoor signage or compact indoor arrays.
With the development of micro LED technology, we’re going to see an expansion of LEDs into televisions, monitors, and even that smart watch on your wrist. (Yes, Apple is working on micro LEDs for personal electronics.)
Projector manufacturers are understandably nervous about the inroads LEDs are making into large venues. Indeed, this author recently saw Paul Simon's "farewell tour" performance at the Wells Fargo Center in Philadelphia, and the backdrop was an enormous widescreen LED wall that provided crystal-clear image magnification (very handy when concertgoers around you are up and dancing, blocking your view of the stage).
As for the other talking point – well, it was impossible to avoid in conversations at InfoComm. Between manufacturers hawking their "ideal" solutions for compressing and streaming audio and video and all of the seminars in classrooms and booths, you'd think that AV-over-IT is a done deal.
The truth is a little different. Not all installations are looking to route signals through a 10 Gb/s Cisco switch. In fact, a brand-spanking-new studio built for ESPN in lower Manhattan, overlooking the East River and the Brooklyn Bridge, relies on almost 500 circuits of 3G SDI video through an enormous router. Any network-centric signal distribution within this space is mostly for IT traffic.
That’s not to say that installers are poo−pooing AV−over−IT and the new SMPTE 2110 standards for network distribution of deterministic video. It’s still early in the game and sometimes tried−and−tested signal distribution methods like SDI are perfectly acceptable, especially in the case of this particular facility with its 1080p/60 backbone.
Even so, the writing on the all couldn’t be more distinct with respect to LEDs and network distribution of AV. But there were other concerns at the show that didn’t receive nearly as much media attention.
At the IMCCA Emerging Trends session on Tuesday, several presentations focused on interfacing humans and technology. With “OK Google” and Alexa all the rage, discussions focused on how fast these consumer interfaces would migrate to AV control systems. An important point was made about the need for two−factor authentication – simple voice control might not be adequately secure for say, a boardroom in a large financial institution.
What would the second factor be− Facial recognition− (This was a popular suggestion.) Fingerprints− Retinal scans− A numeric code that could be spoken or entered on a keypad− The name of your favorite pet− Given that hackers in England recently gained access to a casino’s customer database via an Internet−connected thermometer in a fish tank, two−factor authentication for AV control systems doesn’t seem like a bad idea.
Another topic of discussion was 8K video. With a majority of display manufacturers showing 4K LCD (and in some cases OLED) monitors in Vegas, the logical question was: Could resolutions be pushed higher? Of course, the answer is a resounding "yes!"
Display analysts predict there will be over 5 million 8K televisions shipped by 2022, and we're bound to see commercial monitors adapted from those products. But 8K doesn't have to be achieved in a single, stand-alone display: With the advent of smaller 4K monitors (some as small as 43 inches), it is a simple matter to tile a 2x2 array to achieve 7680x4320 pixels. And there doesn't appear to be a shortage of customers for such a display, especially in the command and control and process control verticals.
The other conversations of interest revolved around the need for faster wireless. We now have 802.11ac channel bonding, with 802.11ax on the horizon. For in-room super-speed WiFi, 802.11ad provides six channels at 60 GHz, each 2 GHz wide – or 100x the bandwidth of individual channels at 2.4 and 5 GHz.
But wise voices counsel to pay attention to 5G mobile networks, which promise download speeds of 1 Gb/s. While not appropriate for in-room AV connectivity, 5G delivery of streaming video assets to classrooms and meetings is inevitable. Some purveyors of wireless connectivity services like AT&T and Verizon insist that 5G could eventually make WiFi obsolete. (That's a bit of a stretch, but this author understands the motivation for making such a claim.)
The point of this missive? Simply that our industry is headed for some mind-boggling changes in the next decade. Networked AV, LEDs, 8K video and displays, multi-factor authentication for control systems, and super-fast wireless connections are all in the wings.
And if you were observant at InfoComm, you know it’s coming…and quickly.
Readers may remember Butch Cassidy and The Sundance Kid, a classic western from 1969 that featured cinema icons Paul Newman and Robert Redford in the title roles as a pair of happy-go-lucky (and often violent) train and bank robbers. They were the leaders of the infamous "Wild Bunch," also known as the Hole-In-The-Wall Gang, so named for the remote hideaway in Wyoming that they used to evade authorities after pulling off a heist.
The story goes that Cassidy and The Kid eventually decamped to South America to get away from relentless manhunts, but ultimately met their maker during a bloody shoot-out in Bolivia. In the movie, Butch and The Kid are constantly riding across the countryside from Argentina to Chile and Bolivia, pursued by a small but determined band of soldiers. "Who ARE those guys?" was The Kid's constant refrain, as he looked over his shoulder with fear.
In the AV industry, there are plenty of “those guys” manufacturing hardware and writing software code. For the display industry, “those guys” are Chinese LCD panel fabricators, who are slowly subsuming the flat panel display business once dominated by Japan and later by Korea. In consumer electronics, “those guys” are companies like Hisense and TCL in televisions, Huawei and ZTE in smartphones, and Lenovo in laptops.
You can find corollaries in control systems, video encoders, network switches, and cable. But there’s one sector of the industry where “those guys” haven’t been able to catch up with the leaders – and that’s the display interface.
For the past 16 years, the High Definition Multimedia Interface (HDMI) has ruled the roost for display connections, pushing aside VGA at first and then DVI on everything from televisions and Blu-ray players to laptop computers and camcorders. It's evolved numerous times from a basic plug-and-play interface for televisions and AV receivers to a high-speed transport system for 4K and ultimately 8K video. Ironically, HDMI is often the input and output connection for video encoders and decoders that, in theory, could displace it from the market altogether.
So, who are "those guys" in this sector? Why, the folks at the Video Electronics Standards Association (VESA), who developed and periodically update DisplayPort. First launched in 2006, DisplayPort was intended to replace the old analog VGA connector with a newer, 100%-digital version that could handle many times the bandwidth of an XGA (1024x768) or UXGA (1600x1200) video signal.
Other forward-looking features included direct display drivers (no need for a video card), support for optical fiber, multiplexing with USB and other data bus formats, and even a wireless specification (it never really caught on). Like HDMI, DP had its "mini" and "micro" versions (Mini DP and Mobility DP).
In recent years, VESA stayed current by upping the speed limit from 21.6 to 32.4 gigabits per second (Gb/s), supporting the USB 3.0 Alternate Mode, adding some cool bells and whistles like simultaneous multi-display output, adopting the first compression system for display signals (Display Stream Compression), recognizing high dynamic range metadata formats, and even accepting color formats other than RGB.
Best of all, there continue to be no royalties associated with DP use, unlike HDMI. The specification is available to anyone who's interested, unlike HDMI. And DP was ready to support deep color and high frame rate 4K video as early as 2013, unlike HDMI.
However…unlike HDMI, DisplayPort has had limited success penetrating the consumer electronics display interfacing market. While some laptop manufacturers have adopted the interface, along with commercial AV monitors and video cards for high−performance PCs, HDMI is still the undisputed king of the hill when it comes to plugging any sort of media device into a display.
Even long-time supporters of DP have switched allegiances. Apple, known for using Mini DisplayPort on its MacBook laptops, is now adding HDMI connections. Lenovo, another DP stalwart, is doing the same thing on its newer ThinkPad laptops. Clearly, the HDMI Forum isn't worried at all about "those guys."
That’s not to say “those guys” are giving up the chase. Earlier this year at CES, VESA had several stands in their booth demonstrating a new set of standards for high dynamic range and wide color gamuts on computer monitors – specifically, those using LCD technology. DisplayHDR calls out specific numbers that must be achieved to qualify for DisplayHDR 400, DisplayHDR 600, and DisplayHDR 1000 certification.
Those numbers fall into the categories of 10% full white, full screen white "flash" and full screen white "sustained" operation, minimum black level, minimum color gamut, minimum color bit depth, and black-to-white transition time. With interest in HDR video growing, the DisplayHDR specifications are an attempt to get around vague descriptions of things like color range ("70% of NTSC!") and contrast ratios that don't specify how the measurements were taken.
And this is actually a good thing. In the CE world, the UHD Alliance has a vague set of minimum requirements for a TV to qualify as high dynamic range. Compared to the more stringent DisplayHDR requirements, the UHD Alliance specs are equivalent to asking if you can walk and chew gum at the same time. Whereas HDMI version 2.0 (currently the fastest available) can transport an Ultra HD signal with 8-bit RGB color safely at 60 Hz, that's setting the bar kinda low in our opinion.
In contrast, DisplayPort 1.3 and 1.4 (which adds HDR metadata and support for 4:2:0 and 4:2:2 color) aren't even breathing hard with a 12-bit RGB Ultra HD video stream refreshed at 60 Hz. And that means a computer display certified to meet one of the DisplayHDR standards can actually accept a robust HDR signal. (Note that VESA isn't choosing sides here – DisplayHDR-certified screens can also use HDMI connections, but signal options are limited by HDMI 2.0's top speed of 18 Gb/s.)
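The bandwidth math behind that comparison is easy to reproduce. The sketch below assumes the standard CTA-861 4K/60 timing (a 594 MHz pixel clock, blanking included) and the roughly 20% 8b/10b-style line-coding overhead both interfaces carry at these speeds; treat the payload ceilings as approximations, not spec citations.

```python
def payload_gbps(pixel_clock_mhz: float, bits_per_component: int) -> float:
    """Video payload rate in Gb/s for an RGB signal (3 components per pixel;
    blanking intervals are already folded into the pixel clock)."""
    return pixel_clock_mhz * 1e6 * bits_per_component * 3 / 1e9

HDMI_20_PAYLOAD = 18.0 * 0.8   # 18 Gb/s TMDS link minus 8b/10b coding overhead
DP_13_PAYLOAD = 32.4 * 0.8     # 32.4 Gb/s HBR3 link minus 8b/10b coding overhead

print(round(payload_gbps(594, 8), 1))    # ~14.3 Gb/s: 4K/60 8-bit RGB just fits HDMI 2.0
print(round(payload_gbps(594, 12), 1))   # ~21.4 Gb/s: 12-bit RGB needs DP 1.3/1.4
```

Which is exactly why HDMI 2.0 tops out at 8-bit RGB for Ultra HD at 60 Hz while DisplayPort 1.3/1.4 handles 12-bit with headroom to spare.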
With HDMI 2.1 looming on the horizon – a new version of the interface that liberally borrows from DisplayPort architecture – it doesn’t appear that “those guys” are going to catch up any time soon. But you never know: Even Butch and Sundance had their day of reckoning…
You can learn more about DisplayHDR here. Check it out!
We just returned from our annual visit to the NAB Show in Las Vegas with a lot to think about – not the least of which was, where do we go from here?
By "here," we mean conventional AV signal management and distribution, using industry standard formats like HD SDI, DVI, and HDMI. "There" isn't as clearly defined, but we're pretty sure it will ultimately involve TCP and IP, fast switches, and optical and twisted-pair cable.
Even so, there was no shortage of vendors trying to convince booth visitors that AV-over-IT is the way to go, and right now! Some NAB exhibitors have staked their entire business model on it, with flashy exhibits featuring powerful codecs, cloud media storage and retrieval, high dynamic range (HDR) imaging, and production workflows (editing, color correction, and visual effects) all interconnected via an IT infrastructure.
And, of course, there is now a SMPTE standard for transporting professional media over managed AV networks (note the word "managed"), and that's ST 2110. The pertinent documents that define the standards are (to date) SMPTE ST 2110-10/-20/-30, addressing system concerns and uncompressed video and audio streams, and SMPTE ST 2110-21, specifying traffic shaping and delivery timing of uncompressed video.
Others at NAB weren't so sure about this rush to IT and extolled the virtues of next-generation SDI (6G, 12G, and even 24G). Their argument is that deterministic video doesn't always travel well with the non-real-time traffic you find on networks. And the "pro" SDI crowd may have an argument, based on all of the 12G connectivity demos we saw. 3G video, to be more specific, runs at about 2.97 Gb/s, so a 12G connection would be good for 11.88 Gb/s – fast enough to transport an uncompressed 4K/60 video signal with 8-bit 4:2:2 color or 10-bit 4:2:0 color.
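That last claim is worth checking with a quick calculation. The sketch below counts active pixels only (no blanking), using average samples per pixel of 3 for 4:4:4, 2 for 4:2:2, and 1.5 for 4:2:0:

```python
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def video_gbps(width: int, height: int, fps: int, bit_depth: int,
               chroma: str = "4:2:2") -> float:
    """Approximate uncompressed video payload in Gb/s, active pixels only."""
    return width * height * fps * bit_depth * SAMPLES_PER_PIXEL[chroma] / 1e9

# Both 4K/60 flavors named above fit under 12G-SDI's 11.88 Gb/s:
print(round(video_gbps(3840, 2160, 60, 8, "4:2:2"), 2))    # ~7.96 Gb/s
print(round(video_gbps(3840, 2160, 60, 10, "4:2:0"), 2))   # ~7.46 Gb/s
```

Both land comfortably under the 11.88 Gb/s a 12G link provides, leaving room for ancillary data and blanking.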
The challenge to date has been to manufacture suitable cables for the transport of 12G SDI signals that will meet signal-to-noise (S/N) specifications. Moving to optical fiber is one way around the problem, but it appears most of the demos we saw relied on coaxial connections. To drive a 4K display, some manufacturers rely on quad 3G inputs. Ditto on getting 4K footage out of a camera, although to be fair, there were some single-wire 4K camera connections exhibited.
In an earlier blog post, we talked about the quantum leap to 8K video and displays. Well, we were quite surprised – perhaps pleasantly – to see Sharp exhibiting at NAB, showing an entire acquisition, editing, production, storage, and display system for 8K video. (Yes, that Sharp, the same guys that make those huge LCD displays. And copiers. And kitchen appliances. Now owned by Hon Hai Precision Industry.)
Sharp's 8K broadcast camera, the model 8C-B60A, uses a single Super 35mm sensor with an effective resolution of 7680x4320 pixels arrayed in a Bayer format. That's 16 times the resolution of a Full HD camera, which means data rates that are 16x that of 3G SDI. In case you are math challenged, we're talking in the range of 48 Gb/s of data for a 4320p/60 video signal with 8-bit 4:2:2 color, which requires four 12G connections.
Like they say at the dragstrip, "Now THAT'S fast!" It's so fast, in fact, that you can't even use an expensive 40 Gb/s network switch to port this signal over an IT network. Indeed, light compression must come into play to get that number down to a manageable level. TICO (short for Tiny Codec) is a good candidate – 4:1 TICO compression would pack our 8K signal back down to 12 Gb/s. JPEG 2000 at 4:1 would do the same thing, and both are low-latency codecs (and are similar to each other in how they work). For that matter, 4:1 compression would drop a 4K/60 signal down to 3G levels, making it a heckuva lot easier to switch.
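The arithmetic here is simple to verify: 8K/60 carries 16 times the pixels of 1080p/60, so the raw requirement is 16 3G-SDI links' worth of data, and the compression ratios follow directly.

```python
import math

SDI_3G = 2.97    # Gb/s per 3G-SDI link
SDI_12G = 11.88  # Gb/s per 12G-SDI link

raw_8k = 16 * SDI_3G                 # 8K/60 = 16x the pixels of 1080p/60
print(round(raw_8k, 2))              # 47.52 -> "in the range of 48 Gb/s"
print(math.ceil(raw_8k / SDI_12G))   # 4 -> four 12G connections needed
print(round(raw_8k / 4, 2))          # 11.88 -> 4:1 TICO fits one 12G link
print(round(4 * SDI_3G / 4, 2))      # 2.97 -> 4:1 also drops 4K/60 to 3G levels
```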
The Blue River NT technology sweeping through the AV industry uses a codec that's adapted from VESA's Display Stream Compression (DSC) and is even gentler, packing things down a maximum of 2:1 to get a 4K/60 10-bit 4:4:4 video stream through a 10 Gb/s switch. We haven't seen it used in conjunction with an 8K video source yet, but be advised that DSC can actually work up to 3:1 compression levels and still remain visually lossless with very low latency.
In the NHK booth, you could watch a demonstration of 8K/60 video traveling through a 10 Gb/s switch using so-called mezzanine compression based on the TICO system. In this case, NHK was using 5:1 TICO compression to squeeze a 40 Gb/s 8K/60 video stream down to 8 Gb/s. Even our 48 Gb/s example from earlier would make it under the bar at 9.6 Gb/s.
So, what does this all mean? First off, SDI isn't quite dead yet, to paraphrase Monty Python. It may not be suitable for long-distance transmission of 4K video, but it's still workable for short runs from cameras to switchers and other studio gear, and longer runs using optical connections. (The SNR hurdle still has to be cleared for long coaxial cable runs.)
Second, it's becoming clear that some degree of light compression is going to be a way of life with 4K and 8K production, especially when you factor in the additional bits for HDR and high-frame-rate video (of which there was plenty on display in Vegas). You think 48 Gb/s is fast? Try moving 8K/120 video around: Both NHK and NTT were showing exactly that, with corresponding data rates of about 96 Gb/s. Definitely "funny car" territory.
Third, we're still a long way from resolving the SDI vs. IP argument. Indeed, some recent studio projects we're aware of have been built using both SDI and IP architectures to move data and video on separate paths. While offerings like cloud storage will require the network hookup, point-to-point 1080p video can still travel happily over SDI connections. (It should also be pointed out that, for all the ballyhoo about 4K, very little in the way of 4K video production is being undertaken at present.)
NAB 2018 reflected all of this thinking, and then some. It's almost as if everyone at the show was waiting for the other guy to take the first step. SDI, or IP? Or both? Come on, doggone it, make up your mind…
Yes, you read that right: 8K displays are coming. For that matter, 8K broadcasting has already been underway in Japan since 2012, and several companies are developing 8K video cameras to be shown at next month’s NAB show in Las Vegas.
"Hold on a minute!" you're probably thinking. "I don't even own a 4K TV yet. And now they're already on the endangered species list?"
Well, not exactly. But two recent press releases show just how crazy the world of display technology has become.
The first release came from Insight Media in February and stated that, “The 2020 Tokyo Olympics will be a major driver in the development of 8K infrastructure with Japanese broadcaster NHK leading efforts to produce and broadcast Olympic programming to homes…cameras from Hitachi, Astrodesign, Ikegami, Sharp and Sony address the many challenges in capturing 8K video…the display industry plans for massive expansion of Gen 10.5 capacity, which will enable efficient production of 65" and 75" display panels for both LCD and OLED TV…. sales of 8K Flat Panel TVs are expected to increase from 0.1 million in 2018 to 5.8 million in 2022, with China leading the way representing more than 60% of the total market during this period.”
You read that right. Almost 6 million 8K LCD and OLED TVs are expected to be sold four years from now, and over 3 million of those sales will be in China.
But there's more. Analyst firm IHS Markit issued their own forecasts for 8K TV earlier this month, predicting that, "While ultra-high definition (UHD) panels are estimated to account for more than 98 percent of the 60-inch and larger display market in 2017, most TV panel suppliers are planning to mass produce 8K displays in 2018. The 7680 x 4320-pixel resolution display is expected to make up about 1 percent of the 60-inch and larger display market this year and 9 percent in 2020."
According to IHS Markit, companies with skin in the 8K game include Innolux, which will supply 65-inch LCD panels to Sharp for use in consumer televisions and in commercial AV displays. Meanwhile, Sharp – which had previously shown an 85-inch 8K TV prototype – will ramp up production of a new 70-inch 8K LCD display (LV-70X500E) in their Sakai Gen 10 LCD plant. This display was shown in Sharp's booth at ISE, along with their new 8K video camera.
Sony and Samsung are also expected to launch 8K LCD TVs this year. Both companies showed prototypes at CES, with Samsung's offering measuring about 85 inches. Sony's prototype also measured 85 inches but included micro light-emitting diodes (LEDs) in the backlight to achieve what Sony described as "full high dynamic range," achieving peak (specular) brightness of 10,000 nits. (That'll give you a pretty good sunburn!)
Other players in 8K include LG Display, who already announced an 88-inch 8K OLED TV prior to CES, and Chinese fabricators BOE, AUO, and China Electronics Corporation (CEC). What's even more interesting is that some of these 8K LCD and OLED panels will be equipped with indium gallium zinc oxide (IGZO) switching transistors.
No, IGZO isn’t a cure for aging. But what it does is provide much higher pixel density in a given screen size with lower power consumption. More importantly, it will allow these 8K TVs to refresh their pictures as fast as 120 Hz – double the normal refresh rate we use today. And that will be important as High Frame Rate (HFR) video production ramps up.
Predictably, prices for TVs and monitors using panels with 4K resolution are collapsing. In the AV channel, 4K (Ultra HD) displays are only beginning to show up in product lines, but manufacturers are well aware of pricing trends with Ultra HD vs. Full HD (1920x1080p). With some consumer models now selling for as little as $8 per diagonal inch, the move from Full HD to 4K / Ultra HD will pick up lots of steam.
And with 8K displays now becoming a 'premium' product, 4K / Ultra HD will be the 'everyday' or mainstream display offering in screen sizes as small as 40 inches and as large as – well, you name it. We've already seen 84-inch, 88-inch, and 98-inch commercial displays, and prototypes as large as 120 inches – yes, 10' of diagonal screen, wrap your head around that – have been exhibited at CES and other shows.
We saw quite a few demonstrations of 4K commercial displays at ISE and expect to see a whole lot more at InfoComm in June, along with the inevitable price wars. And there will be the usual “my encoder handles 4K better than yours with less latency” battles, shoot−outs, and arguments. But that could ultimately turn out to be the appetizer in this full−course meal.
For companies manufacturing signal distribution and switching equipment, 4K / Ultra HD already presents us with a full plate. 8K would be too much to bite off at present! Consider that an 8K/60 video signal using 12-bit RGB color requires a data rate approaching 100 gigabits per second (Gb/s), as compared to a 12-bit, 60 Hz Full HD signal's rate of about 6 Gb/s, and you can see we will have some pretty steep hills to climb to manage 8K.
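Those two figures can be reproduced from standard display timings. This sketch assumes the CTA pixel clock for 1080p/60 (148.5 MHz, blanking included) and 16x that for 8K/60; the numbers below are raw video rates, before the interface's line-coding overhead pushes them higher.

```python
def payload_gbps(pixel_clock_mhz: float, bits_per_pixel: int) -> float:
    """Raw video rate in Gb/s: pixel clock (blanking included) x bits per pixel."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# 12-bit RGB = 36 bits per pixel; 8K/60 carries 16x the pixels of 1080p/60.
print(round(payload_gbps(148.5, 36), 2))        # ~5.35 Gb/s: the "about 6 Gb/s" figure
print(round(payload_gbps(148.5 * 16, 36), 1))   # ~85.5 Gb/s before link overhead,
                                                # which pushes it toward 100 Gb/s
```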
Distributing 8K over a network will be equally challenging and will require switching speeds somewhere north of 40 Gb/s even for a basic form of 8K video, which (we assume) will also incorporate high dynamic range and wide color gamuts. 40 Gb/s switches do exist but are pricey and would require 8K signals to be compressed by at least 25% to be manageable. And they’d certainly use optical fiber for all their connections.
To summarize, 4K / Ultra HD isn't on the endangered species list just yet. (You can still buy Full HD monitors and TVs, if that's any comfort.) And the pace of change in display technology is so rapid nowadays that you can't be blamed if you feel like Rip Van Winkle sometimes!
But whether it makes sense or not – or whether we’re ready or not – it’s “full speed ahead” for 8K displays as we head into the third decade of the 21st century…
You could be forgiven if you wondered where all of the televisions disappeared to at this year’s CES. Ten years ago, the walls of booths occupied by the likes of Sharp, Sony, Panasonic, Samsung, and LG were stuffed full of LCD and plasma televisions. This was the flagship product for all of these companies and a big part of their sales.
But this year? A very different look. With the continued emphasis on "connected everything," TVs moved to the background as "connected solutions" for home and office grabbed center stage. And there's a good reason why: Display panels are inexpensive to manufacture now, and the TVs they wind up in have dropped dramatically in price.
A quick check of online pre-Super Bowl TV sales showed that you can pick up a first-tier 55-inch 4K (Ultra HD) TV with "smart" functionality for about $500, spending about $100 less for a second-tier brand. Want high dynamic range? Add around $300 to $400 to the price. And we're talking about Ultra HD TVs here, not Full HD sets that can be had in the same screen size for as little as $399.
You can attribute this collapse in TV prices to large-scale manufacturing in China, where both raw material and labor costs are much lower than in older industrialized countries. Robotics (another big thing at CES) also play a part: The most up-to-date display panel fabrication lines in Asia may sit in multistory buildings, but they only require about 15 to 20 people to monitor and control everything.
Lower production costs and increasing use of robotics have made it possible to jump to 8K (7680x4320) display resolution. Indeed, many pundits are predicting that 8K displays will replace 4K in a very short time period. But that's a fanciful prediction at best, given that there is no commercially produced 8K video and movie content, and the storage required for such content would amount to 16 times that needed for plain old Full HD (1920x1080).
Still, large flat screen displays continue to push projectors out of the market. More AV installations are now using large LCD screens, some with 4K resolution. At the high end, light-emitting diode (LED) displays are now preferred for large indoor and outdoor electronic signs. They're intensely bright, pushing out 3,000, 4,000, and 5,000 nits over wide viewing angles and creating images that hold up well even under full daylight.
But now there's a wild card, and that's the micro LED display. Most commercial LED displays have a dot (pixel) pitch of 4-6 mm for outdoor use. In recent years, fine-pitch LED displays have dropped down below 2 mm, with some videowalls touting 1.8, 1.6, 1.2, and even 0.9 mm pitches. (For some perspective, a 50-inch plasma monitor from 1999 had about a 1.2 mm dot pitch and 1366x768 resolution.)
The micro LED takes that a step further with dot pitches much smaller than 1 mm. Take that same 50-inch TV and stuff it full of 4K pixels (3840x2160), and you'll see that a dot pitch of about 0.3 mm is required for each pixel. (8K resolution would drop that in half again to 0.15 mm.) It's easy nowadays to form LCD and OLED pixels that small, but micro LEDs are a bit trickier.
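Those pitch figures follow from simple geometry – the panel's physical diagonal divided by the number of pixels along that diagonal. A quick sketch:

```python
import math

def dot_pitch_mm(diagonal_inches: float, h_pixels: int, v_pixels: int) -> float:
    """Pixel pitch in mm: physical diagonal divided by the diagonal pixel count."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)
    return diagonal_inches * 25.4 / diagonal_pixels

print(round(dot_pitch_mm(50, 3840, 2160), 2))   # ~0.29 mm: 4K in a 50-inch panel
print(round(dot_pitch_mm(50, 7680, 4320), 2))   # ~0.14 mm: 8K halves it again
```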
Nevertheless, Samsung showed a 146-inch diagonal micro LED display with 8K resolution. Because the display uses LEDs exclusively, it's very bright (over 2,000 nits for specular highlights) and has a wide viewing angle. This concept display was also able to show high dynamic range (HDR) video and a much wider color gamut than we usually see. Since LEDs can pulse on and off at very high speeds, this type of display is perfect for the next big revolution in imaging – high frame rate video.
We've seen micro LED technology before. Sony exhibited a hand-wired TV full of micro LEDs about 6-7 years ago at CES, and conservative estimates were that it probably cost in excess of $100,000 to make. Samsung's model surely came in a lot lower than that, and for one big reason: It's modular. The final product is actually made up of several smaller LED tiles, which is quite a revolutionary approach to building a TV.
Here's what we find interesting: It may actually catch on. Tiling is a familiar concept to those in the AV and staging markets who routinely put together large displays for temporary or permanent installations. The thinking at CES is, "why not do this with televisions?" In essence, you could decide just how big a display you'd want in your home or office and then order up the correct number of tiles. Stack them together, connect all of the driver cables, and away you go.
Okay, maybe it won't be that simple. But building TVs out of super-thin tiles could represent a significant manufacturing revolution, just as flat screen displays kicked tube TVs out of the market 15 years ago. What we don't know is the final pixel resolution of those tiled TVs and how we'll interface signals to them. Newer and faster versions of HDMI and DisplayPort may be the answer. Or perhaps it will require an entirely different method of signal transport.
A few weeks ago, the annual Consumer Electronics Show took place in Las Vegas. For the first time, the majority of exhibits emphasized applications over hardware – or, to put it another way, "it's not what you have, it's what you do with it."
And that shouldn't be a surprise at all. Prices of commercial and consumer gear have been steadily declining over the last decade, to the point where much of that gear is now considered consumable and disposable. Buy it, use it, and replace it over ever-shorter product life cycles.
Attendees wandering through the LVCC couldn’t help but pick up on the “connected” vibe: Connecting and controlling everything with voice commands is all the buzz nowadays. So are faster WiFi and 5G cellular, along with smart, connected appliances and smart, connected cars. To make things even more interesting, Amazon and Google voice recognition systems were found on everything from televisions to cars.
Speech recognition and control has come a long way since we first saw it implemented at the turn of this decade by companies like Conexant, and it works. And it’s cheap. And you can use it to control just about everything in your home that’s tied to a network, so it’s not unreasonable to assume voice recognition could also be used to operate everything in an AV installation.
This isn’t fantasy. Every TV manufacturer had at least one model at CES that supported Amazon or Google Assistant. (Some models support both systems.) You can link your TV to your refrigerator, washer, dryer, and other appliances in your home and control just about anything or get status updates. Or you can just ask your assistant general questions, and depending on the question, the system can anticipate what you’re about to do and activate or deactivate devices.
LG has this feature in their 2018 TVs (ThinQ with Cloi), while Samsung claims that every product they make will be interconnected by 2020 and voice controlled using their Bixby system. While the Chinese brands are not quite up to that level, they did show sample rooms with interconnected devices that all respond to voice prompts.
In addition, Samsung’s purchase of Harman in 2016 gives them entry to the multi-billion-dollar car audio market. And by extension, they can support voice recognition and control in cars, linking them back to homes and offices. On the TV side, both TiVo and Comcast have had voice control and search features for some time, using adaptive intelligence to hunt down and locate programs.
Examples were shown of voice commands through an LG OLED TV to (a) adjust room lighting, (b) adjust room temperature and humidity, (c) check where the washer and/or dryer cycles stand, (d) check to see what’s in the refrigerator and suggest a recipe for the food that was found, and (e) ultimately order takeout food from a restaurant.
Another key part of this voice-centered control system is machine learning. As implemented in televisions, these systems can anticipate which programs you’re likely to watch. As part of a wider control system, they can remember what room temperatures you prefer, when you cycle room lights, and what combinations of lighting/temperature/humidity you like when you retire for the night. Needless to say, the system can also activate alarms and outside lighting.
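At its simplest, the “anticipation” behavior described above is just learning from frequency of use. The toy model below predicts the most-watched program for a given hour from viewing counts – a deliberately naive sketch of the idea, not what any TV maker actually ships.

```python
# Toy sketch of preference learning: predict the most likely program
# for a given hour of day from simple viewing counts. Real systems use
# far richer models; this only illustrates the basic idea.
from collections import Counter, defaultdict

class ViewingModel:
    def __init__(self):
        # hour of day (0-23) -> Counter of program -> times watched
        self.by_hour = defaultdict(Counter)

    def record(self, hour: int, program: str) -> None:
        self.by_hour[hour][program] += 1

    def predict(self, hour: int):
        """Return the most-watched program at this hour, or None."""
        counts = self.by_hour.get(hour)
        if not counts:
            return None
        return counts.most_common(1)[0][0]

model = ViewingModel()
for _ in range(3):
    model.record(20, "news")   # watched the news three nights at 8 PM
model.record(20, "movie")      # watched a movie once at 8 PM
print(model.predict(20))       # prints "news"
```

The same counting approach extends naturally to the thermostat and lighting habits mentioned above: replace programs with setpoints or scene names.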
Samsung also showed an advanced in-door wide-angle camera to see who’s ringing the doorbell. Not exactly a new concept, but it can be linked to your TV or to a display built into your kitchen appliances. (Yes, that’s becoming a thing now, with a large LCD screen that serves as a hub for everything from your daily schedule to recalling recipes from cloud storage.)
Another example of machine learning was discussed at the Panasonic press conference. Their big thing is “smart cities” (also a themed area in the Westgate Convention Center) wherein everything is connected – your home, your car, the highways, you name it. Panasonic talked about getting into your car and driving to Starbucks (either with you driving or an autonomous system) and the car will automatically call ahead and order your favorite beverage.
If it’s this easy to implement in the consumer space, why aren’t we doing more of it in the professional AV world? Kramer Control is already using the icon-oriented drag-and-drop method of building a control system, using cloud-based drivers. All of the control systems for home use that were shown at CES work the same way. The big question is, which voice recognition system will be paired up with this next generation of control systems?
A big concern that comes to mind is security. It sounds like a great idea to command the function of every piece of hardware in a building, but if all of that gear is interconnected through Ethernet or WiFi, then it’s open to hacking from the outside world. Google’s Nest thermostat was hacked a couple of years ago, so it’s reasonable to assume anything from a TV to a projector, lighting control system, or HVAC system could be at risk.
Samsung announced at CES that every product they make will be connected by 2020, largely using 5G cellular networks. No doubt companies like LG, Sony, Panasonic, and the Chinese brands will follow suit. After all, it’s what consumers want – right? (At least, that’s what we heard all week long in Las Vegas…)