Blog

Kramer Blog

Welcome to our new Blog! The place to share our thoughts on the next big ideas

  • What’s Up With 4K?

    It’s been a couple of years since the first wave of 4K AV products washed ashore, and yet, there is still some confusion about what the term “4K” actually means. It doesn’t help that there has also been (and continues to be) a lot of misinformation offered about this imaging format, ever since the first commercial and consumer displays with 3840x2160 pixel resolution were unveiled in 2012. So let’s clear things up.

     

    To start with, “4K” is kind of a vague catch-all term. A display with true 4K resolution will have 4096 horizontal and 2160 vertical pixels, which is a cinema format. The version of “4K” we’re more familiar with is defined by the Consumer Technology Association as “Ultra HDTV.” Displays and display signals (and many UHD camera sensors) classified as Ultra HD have 3840 horizontal and 2160 vertical pixels. Not quite true 4K, but close enough for our purposes.

     

    Now, here’s where things get tricky. An Ultra HD display signal has four times as many pixels in a single frame as a Full HD video signal: 9.9 million versus 2.48 million, counting blanking intervals. That’s quite a boost in payload, and it creates a speed-limit challenge when interfacing, particularly as we increase the frame rate and color bit depth.

     

    There is truth in numbers! A Full HD video frame has a total of 2200x1125 pixels, including blanking. Multiply that by a frame rate of 60Hz, using 10-bit RGB color (or 4:4:4, in broadcast notation), add in the customary 20% encoding overhead, and you have a payload of 5.346 gigabits per second. (Let’s call it 5.4 Gb/s, to simplify matters.)

     

    How did we arrive at that number? Well, 2200 pixels × 1125 pixels × 60 = 148.5 MHz, which is a common pixel clock frequency for Full HD. Next, we multiply 148.5 by 3, because we’re using RGB color. And we then multiply that product by 12 (10-bit color + 2 bits as overhead) to arrive at our final number: 5.4 Gb/s of data. That can easily fit through an HDMI 1.4 connection, which has a maximum rate of 10.2 Gb/s.
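
    Here’s that same math as a quick Python sketch (our own back-of-the-envelope illustration, not a Kramer tool), reproducing the 5.4 Gb/s figure:

        # Full HD with blanking: 2200 x 1125 pixels, refreshed 60 times per second
        pixel_clock_hz = 2200 * 1125 * 60        # 148.5 MHz pixel clock
        bits_per_channel = 10 + 2                # 10-bit color plus ~20% overhead
        channels = 3                             # R, G, B (4:4:4)
        rate_gbps = pixel_clock_hz * channels * bits_per_channel / 1e9
        print(round(rate_gbps, 3))               # 5.346 Gb/s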

     

    Okay, time to exit our Honda Civic and get into a high-performance BMW M3. Our Ultra HD signal has a total of 4400 horizontal and 2250 vertical pixels with blanking. Refreshing that signal 60 times per second gives us a pixel clock of 594 MHz, and using 10-bit RGB color, we now have a sustained data rate of 21.384 Gb/s. Wow! (Not surprisingly, that’s four times as fast as our Full HD signal calculation.)

     

    That’s way too fast for HDMI 1.4. In fact, it’s even too fast for HDMI version 2.0, which can’t transport data any faster than 18 Gb/s. (That’s why a newer and faster version of HDMI – v2.1 – is just now coming to market.) Hmmm…do we really need 10-bit RGB color for everyday applications? Probably not, so let’s dial the bit depth back to 8 bits per color, which should suffice for high-resolution graphics and images.

     

    Jiggering the math that way drops our bit rate down to 17.82 Gb/s, which gets us within the speed limit of HDMI 2.0. We can also crawl under the limbo bar by reducing color resolution to 4:2:2 or 4:2:0. A 10-bit Ultra HD signal with 4:2:2 color has a data rate of 14.26 Gb/s, while a 10-bit 4:2:0 version drops that number to 10.7 Gb/s.
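
    If you want to turn those knobs yourself, here’s a small Python helper (again our own sketch, using the same “two extra bits of overhead” convention as above) that reproduces the Ultra HD figures in this post:

        def uhd_rate_gbps(fps, bits, samples_per_pixel):
            # samples_per_pixel: 3.0 for RGB/4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
            return 4400 * 2250 * fps * (bits + 2) * samples_per_pixel / 1e9

        print(uhd_rate_gbps(60, 10, 3.0))   # 21.38 Gb/s - too fast even for HDMI 2.0
        print(uhd_rate_gbps(60, 8, 3.0))    # 17.82 Gb/s - squeaks under 18 Gb/s
        print(uhd_rate_gbps(60, 10, 2.0))   # 14.26 Gb/s - 4:2:2
        print(uhd_rate_gbps(60, 10, 1.5))   # 10.69 Gb/s - 4:2:0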

     

    And we can trim the data rate even further by cutting the frame rate in half to 30 Hz. Initially, that’s what many signal management companies did to accommodate Ultra HD signals while retaining the HDMI 1.4 interface. But as our display screens get larger (and they are getting a LOT larger), lower frame rates with wider fields of view can produce noticeable flicker. This phenomenon was first observed by Japanese broadcaster NHK as they began rolling out 8K TV broadcasts…but that’s a story for another time.

     

    So, we want to stick with at least a 60 Hz frame rate for our 85-inch Ultra HD LCD monitor or our 120-inch Ultra HD LED wall. We’ll definitely need signal management products equipped with HDMI 2.0, at the minimum. By doing so, we can accommodate more powerful graphics workstations and laptops, where we can set the bit depth to fit our system bandwidth. We can also stream Ultra HD video from physical media and streaming platforms, where the most common color resolution is 4:2:0. Again, an easy fit for our system.

     

    “Hold on there,” you’re probably thinking. “Where is all this demand for Ultra HD coming from?” Time to wake up and smell the coffee, folks: Monitoring and surveillance systems, such as those used by traffic agencies and process control rooms, always need more pixels on the screen. Gamers want more pixels too, at faster refresh rates and with lower latency. So do companies engaged in energy exploration, visualization and virtual reality, 3D modeling, and medical imaging. You know the old saying: You can NEVER have enough pixels.

     

    And as we just read, the AV industry is rapidly switching over to Ultra HD displays as Asian display panel “fabs” phase out zero-profit Full HD panels and ramp up Ultra HD panel production. That means single-monitor and TV sizes as large as 98 inches using LCD and OLED technology, and tiled LCD/OLED display walls – plus LED walls – that have 4K, 8K, and even higher pixel counts. If you are looking to build a new signal distribution system, it is a wise hedge against the future to support the higher bandwidths required for Ultra HD, all the way through every connection.

     

    Kramer’s VP-551X 8x2 4K presentation switcher/scaler is well-suited to this purpose, equipped with eight discrete HDMI 2.0 inputs with embedded and discrete audio and native support for both 4:4:4 and 4:2:0 color resolutions. In addition to a single HDMI 2.0 output, it also provides an HDBaseT output for 4K/30 RGB or 4K/60 4:2:0.

     

    More importantly, all HDMI ports are compatible with HDR10, a standard for static metadata required to display high dynamic range video. HDR is becoming an intrinsic part of Ultra HD production and display, allowing the reproduction of a much wider range of luminance values from black to full and specular white. VP-551X also passes the Dolby TrueHD and DTS-HD Master Audio formats, common with physical media and streaming playback.

     

    There are other handy gadgets for your Ultra HD signal management system. Kramer’s new 675T and 675R are plug-and-play fiber optic signal extenders that accept type LC optical plugs and will work with either multimode or single-mode fiber, providing signal extensions up to 20 miles with single-mode operation. The beauty of optical fiber is that it has virtually no speed limit issues, as opposed to copper wire-based signal extenders that run several hundred feet at most.

     

    You also have the option of routing your Ultra HD signals over a 10-gigabit Ethernet connection by using the Kramer KDS-8F SDVoE video streaming transceiver. As an encoder, it streams HDMI or DisplayPort signals along with infrared control, RS-232 control, analog audio, and bi-directional USB 2.0 over an IP network, using small form-factor pluggable (SFP) optical fiber. KDS-8F can also work as a receiver to decode all of the signal formats just mentioned to an HDMI port with discrete audio, IR, and RS-232 connections.

     

    And that is “what’s up” with 4K these days. The demand for more pixels, refreshed at faster rates with greater color bit depth, isn’t slowing down one bit. (Did you know that 8K cameras are now being used to inspect sewer pipes? It’s true! But that’s a story for another time…)

     

  • About That ‘Sharing’ Thing: What Do Presenters Really Want?

    You might have noticed one AV product category that’s gotten a ton of attention in recent years (if not most of the attention): Presentation sharing/collaboration. Just about everyone and their uncle offers some sort of hardware and/or software that allows anyone to share what’s on their screen with others, whether they’re running Android™, iOS™, or Windows™.

     

    Some of these products are complex and loaded with all kinds of add-on tools. Others are bare-bones designs that employ little more than screen-scraping techniques. This makes choosing a system unnecessarily difficult for customers, who more often than not aren’t even sure of what they expect out of a presentation sharing/collaboration product.

     

    You could make an argument that this category invented itself. In previous times, people dragged laptops into a meeting room, uncoiled and plugged in AC power and display connections, and used tabletop, under table, and remotely-controlled presentation switchers to cycle among the various presentations. If you wanted a copy of whatever was being shown, it was delivered as a photocopied handout or a PDF file after the fact. What a pain in the neck!

     

    But enough engineers were able to see into the not-so-distant future of mobile, personal electronics, i.e., smartphones and tablets. These devices have become so powerful that they have replaced laptops for many functions. And they make extensive use of two things – fast wireless connections and cloud-based content/file storage and delivery. So, why not use them as a new form of presentation platform, particularly with their ability to easily capture high-quality video (although often in the wrong orientation)?

     

    Soon enough, presentation sharing hardware started rolling off the assembly line. Prospective buyers were overwhelmed with these gadgets, some of which were loaded to the gills with advanced functions that would never be used. Other models seemed to be so simplistic in function that they were little more than cheap consumer products. Company and campus IT departments weighed in with concerns about connectivity to their networks and any security threats these new-fangled gadgets might present.

     

    Eventually, customers had to decide just how much functionality they wanted in such products. Younger users, who consider mobile phones as essential to their lives as their internal organs, were primarily sharing photos and videos. In contrast, older users (those still depending on laptops) were more accustomed to loading up things like spreadsheets for meetings. Some attendees wanted paper copies of the presentation, while others simply took photos of the screen of relevant material, using their phones, of course (which is ironic, in a way).

     

    Calls from the IT department got louder over time. As presentation sharing/collaboration products started popping up on networks, strong passwords and two-factor logins became necessary. Multiple installations meant multiple trips through buildings and across campuses to update firmware. Presenters complained about herky-jerky video and issues sharing iOS screens. It was definitely “gaffer tape and paper clips” time!

     

    Today, the presentation sharing/collaboration marketplace has matured enough that manufacturers have a pretty good idea of how people actually use these gadgets. Over time, anecdotal evidence revealed that a majority of customers just wanted a simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solution, particularly in the education vertical. And as it turned out, there was no need to re-invent the wheel: The best approach was to connect and present without needing to install anything on a computer or mobile device – just leverage the native OS and Web browser protocols that are built into each user’s device.

     

    That’s not to say there wasn’t a need for additional features, such as viewing the main screen on your own device (great for lecture halls with long viewing distances), editing documents together in real time, sharing any size file with anyone else in the meeting or class, instant polling, and turning the main display into a digital whiteboard with recordable annotation. Certain groups needed and continue to need all of those bells and whistles.

     

    But for others, the ability to connect their mobile device quickly and easily to a shared screen using a standard wireless connection was the big draw, using AirPlay mirroring for MacBook™, iPad™, and iPhone™ as well as native mirroring for Chromebook™, Android (Lollipop OS 5.0 or newer), and Windows phones. So was figuring out a way to stream video at native frame rates without the end result turning into visually annoying, low frame rate flip-book movies.

     

    The hardware-intensive approach to early wireless presentation systems has now morphed into one that focuses more on software, and rightly so. Indeed, it’s now possible to build your own wireless presentation system simply by installing a software package and using AirPlay for MacOS & iOS, Miracast™ for Windows & Android, and connecting directly through Chrome or Firefox Web browsers.

     

    A bridge from the past to the present was also created for legacy meeting spaces by adding wired HDMI™ inputs to wireless presentation platforms. The hardware and software mixes both wired and wireless connections together in a seamless way, extending the useful life of existing presentation switchers by making them another gateway to the wireless system. (It’s always good to have options!)

     

    Those overworked folks responsible for maintaining IT networks were placated with a versatile software package that allows remote monitoring and configuration of multiple presentation sharing devices on the network – no need for physical visits to each room or space. And by incorporating 1024-bit encryption over each wireless link (and, if necessary, building “DMZs” with firewalls), security was a non-issue.

     

    What we’ve just described is a general outline of Kramer’s VIA wireless presentation product line. For basic plug-and-play connectivity, VIA GO provides 60 Hz video streaming, 1024-bit encryption, and built-in WiFi. VIA Connect PRO can show up to four screens simultaneously and any in-room meeting participants can view the main display, edit documents together in real time, share any size file, and turn the main display into a digital whiteboard. (VIA Connect PLUS adds a wired HDMI input.)

     

    For more advanced users, Kramer’s VIA Campus2 adds e-polling to instantly measure student feedback and can also be used as a secure wireless access point for guests. Six user screens can be shown on one main display and up to 12 screens by using two displays. Remote students can easily join the class and collaborate in real time with embedded 3rd-party video conferencing and office apps including Microsoft Office®, Skype®, GotoMeeting®, Lync®, and WebEx®. (VIA Campus2 PLUS adds a wired HDMI input.)

     

    In its simplest form, VIA can be loaded and run as a software program (VIAware). It delivers the same security offered by all VIA devices and can be installed on any Windows 10 computer. It can show up to six user screens on one main display or up to 12 screens on two displays, and remote students can easily join and collaborate in real time with embedded 3rd-party video conferencing and office apps.

     

    Finally, VIA Site Management (VSM) is a software application that enables IT administrators to manage, monitor and control all connected VIA devices. VSM generates alerts on system health and includes reporting and analytics tools for understanding VIA device usage.

     

    It’s taken a few years to get there, but we can finally answer the question posed at the start of this missive: What do presenters really want? Simple, reliable, easy-to-deploy-and-manage wireless screen-sharing solutions, as it turns out. Who knew?

  • As Relentless as a Tidal Wave

    The AV industry has a few “benchmark” years, starting with the introduction of light valve video projection in the 1980s and continuing through the first solid-state video/data projectors in 1993, the first flatscreen displays in the mid-1990s, optical disc media in the late 1990s, high definition TV in the early 2000s, a migration from plasma to LCD later that decade, and widespread adoption of high-speed wireless for connectivity over the past five years.

     

    Right now, we’re laser-focused on moving away from full-bandwidth video signal distribution to compressed Full HD and 4K video switched and routed over IT networks. That in itself is a sea change for integrators, and will more closely align our industry with the world of information technology, likely causing the loss of more than a few jobs along the way, as has happened recently in the broadcast industry.

     

    We’ve also bought into the idea of ditching short-arc projection lamps in favor of a more durable, eco-friendly solution that pairs laser diodes with color phosphor wheels. And we seem to like the concept of wireless signal connectivity for presentations and collaboration, slowly moving away from wired connection hubs on walls and tabletops.

     

    But the biggest change of all is just starting to emerge from behind the curtain, and that is the increasing dominance of the light-emitting diode (LED) in display technology. And when we say “dominance,” we really mean it – LEDs have the potential to become the first unified display platform since the cathode-ray tube was developed a century ago.

     

    It’s not like we didn’t see it coming. LED videowalls with coarse pixel pitch have been around for more than two decades, but they were limited to installations in large stadiums and arenas, and as outdoor signs in places like Times Square and the Las Vegas Strip. That all changed around the start of the present decade, when individual LEDs became practical to manufacture in ever-smaller sizes.

     

    Those videowalls and scoreboards from 1999 had, on average, a pixel pitch of about 10 millimeters. A contemporary version for indoor installations is likely to have a pixel pitch of 2 millimeters, presenting images with much finer detail when viewed at close range. Given that most of the LED device and tile manufacturing takes place in China, it wasn’t long before tile and wall prices began falling…and customers as diverse as staging companies and retail chains took notice, and bought in.

     

    You probably did, too, at ISE and InfoComm about five years ago. Where did all of these LED wall companies come from, all of a sudden? How come I never heard of any of them? Wow, those things are bright! Bet they’re expensive…

     

    To be clear, the position of “display king of the hill” rotates every few years. CRT displays sat on the throne for generations. Then plasma displays took over, only to be deposed by LCD screens. The latter have clung to power for over a decade, but they can see the writing on the wall. The only question is how quickly LCD technology will cede its top-of-the-market position to LEDs.

     

    Based on anecdotal evidence, what we’ve seen at recent trade shows, and forecasts from display analysts, the coronation is going to happen pretty soon. Just as large, economical Full HD LCD monitors and TVs escorted “hang and bang” projectors out of classrooms and meeting spaces, large LED walls are putting a serious dent into the sales of high-brightness projectors, particularly for staging live events. And they’re setting their sights on large LCD monitors and TVs next.

     

    It’s easy to see why. LED walls are built up out of smaller tiles and cubes, just like Lego toys. They literally snap together into lightweight frames, using molded multi-wire plugs to daisy-chain tiles and cubes together, and are easy to fly or stack. LEDs don’t make any noise, aside from small cooling fans when operating in enclosures, and provide a bright, colorful, and high-contrast one-piece display system that can show 4K and even 8K-resolution video.

     

    Still, many end-users and integrators have long regarded LED walls as expensive niche displays, not anything practical enough to install in classrooms, lecture halls, and meeting spaces. Well, that thinking got blown out of the water last June at InfoComm, where we saw the first build-it-yourself LED displays for meeting rooms.

     

    These products have sizes ranging from 120 to 150 diagonal inches (that’s 10 and 12.5 feet, respectively) and offer dot pitches from 1.8 to 2.5 millimeters. They come in modular kits that take two people about three hours to assemble and wire together, and can be hung on a wall or even attached to a roll-around stand – all you need to do is plug in a power cord and connect your video source through an HDMI port, and away you go.

     

    In terms of brightness, LED displays actually have to throttle back on their luminance. Using a pulse-switched operating mode, they could easily hit 2,000 nits, but that would be glaringly uncomfortable in a small room. The actual luminance level is closer to 400 – 600 nits, adjustable to compensate for high ambient light levels.

     

    From a technology perspective, these products are LCD killers. But from a financial perspective, they’re not quite there yet: A 130-inch model has a retail price of about $75,000, which would more than cover the cost of four 65-inch LCD monitors and signal processing gear. Then again, the first 50-inch plasma monitors for commercial use retailed for about $25,000 apiece twenty years ago, so we can expect prices to come down pretty quickly on these products as demand grows.

     

    With large LED walls established as the go-to display for digital signage and image magnification, and fine-pitch walls establishing a beachhead in meeting rooms, the next step is consumer televisions and mobile displays. Late-model Ultra HDTVs with high dynamic range support already use large matrices of mini LEDs as backlights.

     

    It will fall to a new class of LEDs – “micro” devices – to fill in those missing slots for consumer displays…but that’s a story for another time…

     

  • How Fast is ‘Fast Enough’?

    The AV industry has arrived at a singularly intriguing point in time. Like the rest of the communications world, we’re getting ready to jump aboard the IT bandwagon and use TCP/IP networks and switches to route and distribute audio and video…leaving the world of uncompressed display signal management behind.

     

    Coincidentally, this paradigm shift is coinciding with another move; this time, from Full HD video to Ultra HD video. But that’s not all: We’ll also have to reckon with high dynamic range content and its associated wider color space. And off in the distance, you can see yet another freight train approaching, with its box cars reading “High Frame Rate Video.”

     

    On the one hand, we’re learning all about codecs, latency, forward error correction, groups of pictures, jumbo frames, and a host of acronyms like DHCP, IGMP, and HLS. On the other, we’re frantically calculating our bandwidth requirements and wondering if we’ll have a fast-enough network to handle all of these high-speed, souped-up pixels.

     

    And that brings up a really good question. Just how fast is “fast enough?” If we’re building an AV-over-IT network, what top speed should we aim for? And how, exactly, can we build some degree of futureproofing into our design so we don’t have to come back a couple of years from now and install new switches and perhaps even new cables?

     

    Let’s start with the basics. Is the network going to be used to switch near-zero or very low latency video? (Audio’s easy, no need to worry there.) If your answer is yes, then you are talking about data rates in the gigabits per second, and more likely tens of gigabits per second. On the other hand, if the AV-IT network is going to be used primarily to stream content in non-real time and some latency isn’t a problem, then we’re talking about tens and hundreds of megabits per second.

     

    Still, you need to design for the highest possible speed requirements, and if at any point you want to manage low-latency video using JPEG-based codecs (or other light compression codecs), you’re back in the tens of gigabits per second neighborhood. Currently in our industry, we have manufacturers advocating for 1 Gb/s switch fabrics and others saying, “No, you need at least a 10 Gb switch.”

     

    From our perspective, designing a 1 Gb/s AV-IT network is essentially tagging it with a “planned obsolescence” sign. A basic 4K (Ultra HD) video signal that’s refreshed 60 times per second and has 8-bit RGB color will generate an uncompressed data rate of 17.82 Gb/s, WAY too fast for a 1 Gb/s switch without significant compression. Yet, that’s a rudimentary form of 4K video.

     

    With HDR enhancements, we’ll need to move to 10-bit sampling. We can cut the color resolution from RGB (4:4:4) to the broadcast standard 4:2:2, which brings the bit rate down to 14.26 Gb/s – still too fast for a 10 Gb/s switch. We could drop the frame rate in half to 30 Hz, slowing the bit rate down to 7.13 Gb/s and clearing the switch without additional help. But maybe the application really needs that higher refresh rate?
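
    Here’s a minimal Python sketch (our own, reusing the blanking and overhead assumptions from the 4K post above) that makes the switch-sizing question concrete:

        SWITCH_GBPS = 10    # a 10 Gb/s network fabric

        def uhd_rate_gbps(fps, bits, samples_per_pixel):
            # Ultra HD with blanking (4400 x 2250), plus 2 bits of encoding overhead
            return 4400 * 2250 * fps * (bits + 2) * samples_per_pixel / 1e9

        for label, rate in [("4K/60 8-bit RGB", uhd_rate_gbps(60, 8, 3.0)),
                            ("4K/60 10-bit 4:2:2", uhd_rate_gbps(60, 10, 2.0)),
                            ("4K/30 10-bit 4:2:2", uhd_rate_gbps(30, 10, 2.0))]:
            print(label, round(rate, 2), "fits" if rate <= SWITCH_GBPS else "too fast")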

     

    This problem is ameliorated to some extent with the SDVoE Blue River NT codec. It applies light (2:1) compression with super-low latency to let us limbo with ease under the 10 Gb/s bar. All fine and good, but what if we want to run a 10-bit RGB 4K signal with HDR and a 60 Hz frame rate from point A to points B, C, D, E, and F? The raw data rate is now 21.4 Gb/s, and even 2:1 compression won’t get us through the switch.

     

    One possible solution is to compress the broadcast (4:2:2) 4K video format by using a visually lossless codec like JPEG XS (TiCo) to pack the signal down by as much as 6:1, then come out of the decoder with full-bandwidth display connections. Japanese TV broadcaster NHK showed a demonstration last year at NAB of an 8K signal (7680x4320 pixels) with 10-bit 4:2:2 color and 60 Hz refresh, successfully transiting a 10 Gb/s network switch thanks to 5:1 JPEG XS compression.
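
    A rough Python check (our own sketch; the compression is applied to the raw video payload, before any link-encoding overhead) shows why that NHK demonstration fits:

        # 8K with blanking: 8800 x 4500 pixels, 60 Hz, 10-bit 4:2:2 (2 samples per pixel)
        raw_gbps = 8800 * 4500 * 60 * 10 * 2 / 1e9      # ~47.5 Gb/s uncompressed
        jpeg_xs_ratio = 5                               # light, low-latency compression
        print(round(raw_gbps / jpeg_xs_ratio, 1))       # ~9.5 Gb/s - under a 10 Gb/s switch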

     

    So many numbers to think about! But that’s our new world as we continue to push up pixel counts and bit depths. And we haven’t even talked about gamers and virtual reality enthusiasts, who would prefer to see frame rates well above 60 Hz to minimize motion blur. Assume 96, 100, and even 120 Hz for these specialized applications and run the calculations again. (You’ll probably want to jump out the window when you’re finished.)

     

    It’s all too easy to increase bit depth and frame rates for 4K and 8K signals and wind up with data rates that would warrant speeding tickets. But our job as integrators is to provide some sort of future-proofing in any installation…which is why our industry might want to start looking at 40 Gb network switches.

     

    Yes, they exist, although mostly as selected ports on hybrid 10 Gb / 40 Gb switches. And it goes without saying that those 40 Gb ports use quad small form-factor pluggable (QSFP) optical fiber connections. These switches aren’t insanely expensive: We found several models through an Internet search that are priced between $3,000 and $8,500, and although not “pure” 40 Gb switches, they will do the job.

     

    So – is a 10 Gb network switch “fast enough?” Or maybe we need a 40 Gb switch? Believe it or not, 100 Gb switches are in development. Is THAT fast enough for you? Better check your math…

  • 8K And the Seven-Year Itch

    Unless you were wearing a blindfold while walking around CES 2019, you could not miss the numerous displays of 8K televisions. 8K banners hung everywhere, along with signs for the other “hot” technology at the show – artificial intelligence (AI).

     

    We counted well over a dozen examples of 8K TVs in our perambulations, most of them showing a series of still images with high resolution and high dynamic range. A few demos actually showcased 8K footage, presenting incredible detail in closeups of a honeycomb with attendant bees, or ants crawling over a vegetable garden. (That last clip was so realistic that it freaked out more than a few entomophobes!)

     

    Other more practical demonstrations featured lower-resolution content scaled up to fit the 8K screen. Note that 8K TVs, when they finally arrive in any quantity, will likely start with a screen size of 65 inches and move up from there, hitting a maximum (so far) of 98 diagonal inches. Like it or not, you’ll be sitting pretty close to such a screen, so having all of those pixels plus enhancements like high dynamic range will make for a more pleasing viewing experience.

     

    And you can largely attribute this move up in resolution to the Chinese, more specifically companies like TCL that are building Generation 11 LCD panel fabrication lines. These lines will crank out larger panels that can then be cut into smaller (although still large) sizes and at lower per-panel costs. Given how the competition between Korean and Chinese panel makers has largely decimated any profitability in the Ultra HDTV space (a/k/a 4K), the move up to 8K is almost a necessity.

     

    It shouldn’t be a surprise that the average person is now asking, “Wait a minute! How is it we’re already talking about 8K video and displays? Didn’t we just start moving to 4K resolution? What’s the rush?”

     

    Turns out, the move to 8K has actually been in the works for a long time. More specifically, it started almost 25 years ago in Japan, when broadcaster NHK began researching the next step up from HDTV (which was just getting off the ground outside of Japan!). Their goal was to design and build cameras capable of capturing 8K video at high frame rates, plus the attendant infrastructure to edit, store, and play it back, along with getting it to the home.

     

    NHK’s research and development led to the demonstration of a 4K (4096x2160) camera in 2004 at the NAB show. They followed that by introducing their first 8K camera sensor at NAB 2006, followed by an improved version in 2012. Sharp also showed an 85-inch 8K LCD monitor at CES that year, but people didn’t pay as much attention to that demo as they did to the arrival of the first 4K / Ultra HDTV monitors in September of that year at the IFA show.

     

    Back then, depending on the brand name, that 84-inch Ultra HD video monitor – which required four HDMI 1.4 inputs to work – could have set you back as much as $25,000 USD. Coincidentally, this was about seven years after we saw the start of a move away from 720p/768p displays and TVs to Full HD (1920x1080) screen resolution.

     

    A year after those ground-breaking 4K TVs showed up, NHK unveiled a 4-pound 8K Steadicam rig, plus a multi-format video recorder prototype. A 13-inch 8K video monitor for cameras, using OLED technology, also took a bow. And by 2014, NHK was broadcasting selected Olympic events in 8K via satellite to locations around the globe.

     

    In our industry, we were still pushing Full HD and 2K displays and signal management products, looking over our shoulder at a 4K dot in the distance and figuring we had plenty of time. That all changed at ISE in 2018, where Ultra HD and 4K displays were everywhere, not to mention an $80,000 8K broadcast camera from Sharp. Full HD digital signage still makes plenty of sense, but the economics of LCD panel manufacturing meant that the fabs in Asia would be pulling back on Full HD and ramping up Ultra HD production.

     

    So here we are in 2019, just embracing the move to Ultra HD. Yet, pundits are already saying, “It’s time to give 8K a look.” At least one Tier 1 display brand has already showcased 8K digital signage at ISE and NAB, and will do so again at InfoComm, likely prompting competitors to show they’re at least players in this new game in Orlando. NAB featured half a dozen 8K video cameras along with recording and storage solutions, and what’s likely the first-ever 8K digital SLR camera to hit the consumer market.

     

    Is this irrational exuberance? Hardly. Clever readers will note that this summer will mark seven years since the first 4K TVs took a bow, seven years after the transition started to Full HD (which itself took place about seven years after the industry began moving away from standard definition displays to 720p/768p HD displays).

     

    Industry forecasts are for about 430,000 8K TVs to ship by the end of December, with over 2 million shipments called for in 2020. Those numbers closely track the roll-out of 4K / Ultra HDTV models from 2012 through 2014. Given that our industry really didn’t embrace 4K until 2017, we figure that you have just a couple of years to get with the 8K program.

     

    And keep in mind that we’re fast approaching a point in time when the pixel density in a display just won’t matter anymore. Because of economics, all large TVs and monitors over 65 inches will have 8K resolution, whether you need it or not. Fortunately, video scalers have gotten quite powerful and can “pull up” your lower-resolution content to fit the screen. And other metrics like HDR, color accuracy, and high frame rate support will be the important ones, not the number of pixels.

     

    Are you ready for 8K?

  • HDMI 2.1: Tired of Waiting for You…

    Ever since the HDMI 2.1 standard was announced at CES in 2017, we’ve all been waiting with bated breath for the chipsets to arrive. v2.1 offered such a speed increase over v2.0 (2013) that it sounded almost like science fiction, leaping from 18 gigabits per second (Gb/s) to an amazing 48 Gb/s, just like that!

     

    And the signaling method would change, too, falling into line with the rest of the world by adopting a digital packet structure, much the same as DisplayPort (which, incidentally, is what much of V2.1 was modeled after). Instead of three lanes for red, green, and blue, plus a separate lane for clock information, v2.1 now employs four separate data lanes, each capable of speeds as fast as 12 Gb/s. With a packetized structure, intermixing and embedding clock packets is a piece of cake.

     

    While that all sounded very impressive over two years ago, the reality still has yet to catch up with the promise. At CES 2019, 8K was a “big” thing, and the HDMI Forum booth had several demonstrations of 8K signaling, including an 8K home theater centered around a Samsung 900-series 85-inch 8K TV. (Ironically, the earlier versions of this TV shipped with the older and slower HDMI 2.0 interface.)

     

    If you dug a bit deeper and asked a few more questions, you would have learned that the testing and certification process for the v2.1 interface is still very much in progress and is not likely to wind up until the fall of this year. What’s more, only one chip manufacturer (Socionext) was cranking out v2.1 TX and RX chips in any quantities as of the end of 2018, with other fabs just getting up to speed.

     

    The hype over HDMI 2.1 reached a bit of absurdity when a prominent television manufacturer declared at their CES press conference that all of their 2019 Ultra HD televisions would have v2.1 inputs (with no mention of how many). Further questioning revealed that, although video signals could enter one of these Ultra HD televisions through a v2.1 physical interface, the signals would be processed as v2.0 inside the set after that.

     

    Why the push for v2.1? Simple. The latest enhancements to TV – high dynamic range and its associated wider color gamut – create a lot more bits per second. And v2.0 looks more and more like a giant speed bump in that context. Presently, you can push a 4K/60 signal through HDMI 2.0 IF you reduce the bit depth to 8 bits per pixel, using the RGB (4:4:4) format. Want to send a 10-bit signal at the same frame rate? Now you have to cut the color resolution to 4:2:2, not easy to do with a computer video card.
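
    A quick Python sanity check (our own sketch, using the same blanking and overhead assumptions as our earlier 4K posts) shows where those trade-offs come from:

        HDMI_2_0_GBPS = 18

        def uhd_rate_gbps(bits, samples_per_pixel, fps=60):
            return 4400 * 2250 * fps * (bits + 2) * samples_per_pixel / 1e9

        print(uhd_rate_gbps(8, 3.0))    # 17.82 Gb/s - 8-bit RGB squeaks through
        print(uhd_rate_gbps(10, 3.0))   # 21.38 Gb/s - 10-bit RGB does not
        print(uhd_rate_gbps(10, 2.0))   # 14.26 Gb/s - 10-bit 4:2:2 fits again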

     

    While 48 Gb/s may be unattainable in the near future, a data rate around 36 Gb/s could be within reach. That would allow the passage of 4K/60 content with 12-bit RGB color, a truly spectacular set of images streaming at just shy of 30 Gb/s. Or, you could generate a high frame rate (120 Hz) 4K signal for gaming purposes, using 12-bit 4:2:2 color and still get under the wire at 33.3 Gb/s.

     

    The challenge for HDMI has always been higher bit rates over copper. Unlike DisplayPort, there is no provision in the HDMI 2.1 specification for transport over optical fiber, although that shouldn’t be difficult to accomplish given the interface’s packet structure. Above 40 Gb/s, we may have to use optical fiber simply because signal losses over copper wires would be too high to maintain a workable signal-to-noise ratio.

     

    Over in the DisplayPort camp, there hasn’t been a lot of counter-punching going on. Few manufacturers support the DisplayPort v1.3/1.4 standard (v1.4 adds support for HDR metadata, plus color resolutions other than 4:4:4 / RGB) and it’s only the more exotic video cards that would require that kind of speed. Gaming is a good example of a speed-intensive application and that crowd would love to have 12-bit color refreshing at 120 Hz. Or maybe even faster.

     

    Where does that leave our industry? You’ll be hard-pressed to find many signal management products at InfoComm in June that support v2.1 – it took our industry almost four years to really get onboard with v2.0, and we still get press releases from companies boasting how they finally added HDMI 2.0 to their media players and other products. (Well, it’s been five-and-a-half years, you know!)

     

    From our perspective, we don’t expect to see much adoption of v2.1 until a year from now, and even then, things will move slowly. The ProAV industry is more obsessed with the transition from uncompressed high-bandwidth signal distribution to compressed IT-based distribution, centering on 10 Gb/s network switches. (Sorry, you 1 Gb/s fans, that’s just too slow for future-proofing.)

     

    A good “tell” will be how many 4K and 8K TVs (yep, 8K TVs, repeat as often as necessary) will start arriving in the fourth quarter of 2019 with one or more v2.1 inputs. The more TV manufacturers get with the program, the more likely you’ll see them on commercial monitors and digital signage displays next year in Las Vegas. Another “tell” will be how quickly our industry embraces 8K commercial displays (more to come in our next blog post). Without v2.1, 8K will be nigh impossible.

     

    In the meantime, we’re reminded of that classic song by The Kinks that goes, “I’m so tired, tired of waiting, tired of waiting for you….”

     

     

  • Making AV systems and services completely IT-friendly – Easy, Secure, and Managed

    In recent years, Kramer has been shifting its strategy and investing in IT-friendly Pro AV solutions, integrating hardware, software, and cloud systems such as Kramer Control, the Kramer Network Enterprise AV Management Platform, VIA wireless presentation, KronoMeet Room Booking & Scheduling, Maestro integrated automation, and more. Software-driven AV functionality running on a single device is implemented across different systems, creating Kramer Open AV Platforms. This scalable, smart approach to AV architecture enables multiple functionalities, such as integrated video conferencing coupled with powerful room control and automation, digital signage, and much more.

    The industry is in the final stages of the convergence shift. IT has indeed taken over AV, and this requires a new approach to AV in general. It requires new kinds of solutions and different ways of working. We are committed to positioning Kramer as a provider of IT-friendly AV solutions, investing heavily in meeting the challenges and motivations of the IT department. We have integrated our IT-friendly philosophy into many of our products and solutions. The Kramer AV over IT approach is aimed at making AV systems and services completely IT-friendly – Easy, Secure, and Managed.

    We believe that user experience stands at the core of any easy, simple, and accessible technology – and AV is no exception. Whether it is videoconferencing or wireless collaboration, AV must be fail-safe and user-proof, and always simple to navigate. For the end user, our focus on easy operation results in a fully transparent AV experience.

    Zero-touch automation with Kramer Maestro is our way of creating meeting space AV systems that do not require user intervention. The user can enter a room, connect his or her device to the BYOD interface – wired or wireless – and Maestro instantly activates a set of pre-configured actions to prepare the environment for collaboration.

    We also simplify the process by equipping our VIA line with fully native support for AirPlay, Miracast, and Chromecast, enabling Windows 10, Apple devices, and Android devices to connect wirelessly and present content quickly and natively. This is what we call ‘True BYOD,’ which means users don’t need to install any extra piece of software to get connected and present wirelessly.

    “Easy” in the age of AV over IT is important to both the IT department and the AV installer. For the IT department, easy solutions are designed to be fail-safe so they will not generate service calls. To meet the technology and management needs of huddle spaces and small-to-medium size rooms built for ad-hoc meetings, we deliver solutions consisting of as few hardware components as possible. This strategy reduces points of failure and overall costs.

    For AV installers, “easy” also means selecting products that are simple to integrate into a larger system. Kramer’s cloud-driven control does not require programming or coding; it offers a drag-and-drop process, making AV configuration easy and setup significantly quicker. Installers can configure a space and remotely access and update the AV system without having to leave their office to make local adjustments.

    We want to meet the expectations of IT managers that they can manage AV in the same way that they manage IT. IT managers expect the ability to view and remotely access every component in the AV installation, just as they would in IT deployments. Managing AV remotely is something that, for an IT department, is natural. 

    IT departments also expect management tools that provide an immediate overview of the organization’s AV, understanding – at a quick glance – the status of all connected devices. With the Kramer Network Enterprise AV Management Platform, we are enabling IT departments to commission, deploy, and manage AV just like an IT domain, with the power to remotely manage, access, and make remote firmware upgrades and notifications via a rich dashboard accessed from any type of device.

    Recent research has found that more than 95% of IT professionals would rate security as the primary factor when choosing any type of technology. This is why Kramer considers security a strategic goal – as essential as product functionality – and the company applies IT industry security standards across its own AV portfolio.

     

    Kramer’s products comply with top industry security standards such as IEEE 802.1x, 2048-bit SSL encryption, LDAP identity management, Common Criteria PPS 3.0 and more, and products are penetration tested against OWASP 2019 criteria. Pursuing better quality standards, Kramer is ISO 27001 certified and GDPR compliant.

     

    Kramer Open AV Platforms™ will show ISE attendees how the company is delivering on the promise of smart AV. Open AV Platforms run multiple software-driven AV functions on a single device. The license-activated nature of Open AV Platforms lets customers determine what they need, when they need it, rather than paying for unnecessary functionality. At any given point, a client can add an available feature or expand AV functionality by simply activating it as software. This software-driven method makes AV more scalable and more cost-effective; users can add features on an existing device and reduce the need to replace devices when extra functionality is needed.

     

    At Kramer, we understand that it is not enough to provide the right technology or solution. We have made a strategic decision to make customer education a key enabler of our business. With the launch of Kramer Academy, we are positioning ourselves as the bridge between two worlds – educating IT departments about AV and how to handle AV effectively, and enhancing the knowledge of traditional AV professionals to better understand how to operate in an IT-enabled world.

    To summarize, we are making AV systems and services completely IT-friendly by making them easy to integrate and install, fail-safe, secure from cyber attack, and manageable in the way an IT professional would expect.

     
  • Onward and Upward

    We recently attended the annual Society of Motion Picture & Television Engineers technology conference in Los Angeles. SMPTE has been holding this event for decades and it attracts the best and the brightest to talk about advancements in everything from television and motion picture production to human visual science, advances in audio, video compression and transmission, and lately, promoting the field to women and college graduates.

     

    One invited papers session in particular stood out this year, and its focus was on 8K television. Regular readers of this blog may remember that much of the research and development in this area is being undertaken by NHK, the national Japanese broadcasting network. NHK commenced their work way, way back in 1995, first achieving 4K resolution in camera sensors in 2004 and then introducing their first 8K camera sensor in 2006.

     

    Since then, they’ve designed and built a 4-pound 8K camera system, pushed sensor speeds to as high as 480 Hz, and created a simultaneous downconversion product that ingests 8K video and spits it out at native resolution, 4K resolution, and Full HD resolution. As you can imagine, that requires a ton of processing power!

     

    At this year’s session, one speaker from NHK detailed their work in next-generation camera sensors for 8K that incorporate a unique organic photoconductive film (OPF) layer to boost sensitivity while keeping noise to a minimum. That’s a real challenge when working with small camera sensors – Super 35 sensors for 4K (Ultra HD) production are already jammed full of tiny photosites, a design headache for camera engineers. Now, imagine you’re asked to double the number of pixels in the same size sensor, where the average pixel measures about 3 micrometers!

     

    The second paper described a new 1.25” 8K camera sensor that can record video at frame rates as high as 480 Hz, or eight times as fast as conventional sensors. Using this sensor, fast motion can be captured with minimal blurring and very fine detail. The captured video is down-converted in-camera to 120 Hz for eventual recording and playback. As you might guess, the data flowing from the camera sensor is a gusher: Uncompressed, with 10-bit 4:2:2 color sampling, it approaches 100 gigabits per second (Gb/s), or more than twice as fast as the latest version of HDMI (2.1) can handle.
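
    For a rough sense of scale, here’s a hedged back-of-the-envelope in Python (our own arithmetic, assuming the standard blanking interval; the exact figure depends on how NHK formats the stream):

        # 8K with blanking: 8800 x 4500 pixels, 120 Hz, 10-bit 4:2:2 (2 samples per pixel)
        gbps = 8800 * 4500 * 120 * 10 * 2 / 1e9
        print(round(gbps, 1))    # ~95 Gb/s - roughly twice HDMI 2.1's 48 Gb/s ceiling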

     

    The final NHK paper talked about setting up the world’s first full-time 4K/8K satellite broadcasting system, which launched in December of 2018. Aside from the technical challenges of bandwidth (both left-hand and right-hand circular polarization of the radio waves was necessary to carry all of the signal data), there was an additional obstacle: Many Japanese residents live in older apartment buildings, making the cable infrastructure upgrade process difficult. It was eventually solved by installing parallel lines of plastic optical fiber (POF, or Toslink) alongside existing coaxial cable, telephone, and power lines.

     

    Where is the relevance to our industry? Consider that, ten years ago, Ultra HD and 4K video were largely a lab experiment to many of us. In 2009, we were just getting used to managing Full HD signals over signal distribution and interfacing systems, wrestling with color bit depth and varying frame rates, not to mention the limitations and foibles of HDMI.

    Yet, three years later, the first commercial Ultra HD monitors washed up on our shores. A decade later, Ultra HD has become the default resolution for consumer televisions and most commercial AV monitors and displays. Just as we did in 2009, we’re wrestling with the same signal management issues, color bit depths, refresh rates, and a whole new version of HDMI…which isn’t even ready to handle the higher bit rates that 4K video requires for higher frame rates and higher color bit depths.

     

    So, while we fuss, argue, complain, and try to adjust to this latest jump in resolution to 4K, there is a country that is already working at TWICE that video resolution for acquisition, editing, storage, and distribution to the home. And there’s every reason to think that we’ll catch up to them eventually – the first 8K televisions already launched to the North American market earlier this year, and we’re seeing early interest in 8K displays for specialized installations like command and control, surveillance, visualization and augmented reality, and (of all things) 3D visualization using autostereo displays.

     

    Skeptics can scoff all they want, but this never-ending push upward and onward in spatial resolution isn’t going to stop. If anything, additional momentum will be provided by enhancements like high dynamic range, wider color gamuts, and high frame rate video. (Did you know that, as display screens get larger and fields of view become wider, any flicker in images created by judder and slower frame rates becomes increasingly noticeable? NHK studied this phenomenon and concluded that a minimum frame rate of 80 Hz was required for 4K and 8K on large displays.)

     

    And as usual, we’ll be expected to interface and transport these signals. The SMPTE SDI standard for UHD (12G SDI) is already inadequate for single-wire serial digital connections, having a maximum data rate of 11.88 Gb/s. This has resulted in 8K camera manufacturers employing four separate 12G SDI ports and some light compression in-camera to record 8K/60 video with 10-bit 4:2:2 color (uncompressed data rate of 47.7 Gb/s). And it’s also revived interest in the latest SMPTE standard for SDI, 24G (23.76 Gb/s, likely over optical fiber).
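
    The quad-link arithmetic works out neatly, as this short Python snippet (our own illustration, using the same blanking convention as above) shows:

        import math

        raw_gbps = 8800 * 4500 * 60 * 10 * 2 / 1e9    # ~47.5 Gb/s for 8K/60 10-bit 4:2:2
        per_link_gbps = 11.88                          # 12G SDI payload ceiling
        print(math.ceil(raw_gbps / per_link_gbps))     # 4 - hence the four-port camera outputs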

     

    This should be interesting to watch, particularly since our industry is still built around the six-year-old TMDS-based HDMI 2.0 interface (18 Gb/s), is still largely allergic to optical fiber, and is promoting an AV/IT video codec that’s barely fast enough to squeeze 4K/60 10-bit 4:4:4 video through a 10-gigabit network switch.

     

    Do you feel the need for speed yet?

     

     

  • Newer! Faster! Brighter! (And Barely Fast Enough)

    Back in September, we discussed the escalation of “Ks,” as in how many thousands of pixels the display industry is trying to stuff into next-generation LCD, OLED, and inorganic LED panels. We mentioned that the first 8K displays are now coming to market, even as our industry is still trying to come to grips with the care, feeding, and handling of 4K / Ultra HD video signals.

    Things are moving more quickly than anticipated. The HDMI Forum recently held a press conference in New York City to talk about HDMI 2.1 and where it’s headed. This newer, faster version of HDMI was first introduced at CES in 2017 and is quite the departure from previous versions.

    Instead of using transition-minimized differential signaling (TMDS), which was the foundation of digital display interfaces going back to DVI in 1999, version 2.1 has adopted a packet format very similar to that of DisplayPort. By doing so, HDMI 2.1 can now expand signal carriage to four lanes of data with an embedded clock, compared to the older three lanes with a separate clock used in all HDMI versions through 2.0.

    There are other advantages. Because the signal is now fully packetized, it can be compressed using Display Stream Compression (DSC), which will come in really handy with the massive signals needed to handle high frame rate video and 8K. Another advantage is that the clock rates and data are free to zoom far beyond the 18 Gb/s limit of version 2.0.

    Indeed, HDMI 2.1 now has a maximum data rate of 48 Gb/s (or 12 Gb/s per lane). That number is mind-boggling: We’re only starting to see network switches with that much speed come to market. But if you run the numbers, you WILL need that kind of speed for advanced high-resolution imaging.

    Consider a 4K signal with high dynamic range and a 120 Hz frame rate. The base clock rate for such a signal, using standard CTA blanking, would be 4400 pixels × 2250 pixels × 120, or 1188 MHz (1.188 GHz). Add in 10-bit color (the minimum for HDR) with 4:4:4 (RGB) color resolution, and the grand total (after shopper coupons) is 1188 × 12 × 3 = 42.77 Gb/s. Going to lower color resolution lowers the tab a little: With 4:2:2 color, the data rate is 28.51 Gb/s and with 4:2:0 color, it drops to 21.39 Gb/s.
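
    Here’s the same arithmetic as a small Python sketch (our own, using the 10-bit-plus-overhead convention from the paragraph above):

        pixel_clock_mhz = 4400 * 2250 * 120 / 1e6        # 1188 MHz
        for label, samples in [("4:4:4", 3), ("4:2:2", 2), ("4:2:0", 1.5)]:
            gbps = pixel_clock_mhz * 12 * samples / 1e3  # 10-bit color + 2 bits of overhead
            print(label, round(gbps, 2))                 # 42.77, 28.51, ~21.4 Gb/s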

    That’s still pretty fast – too fast for HDMI 2.0. And if we start talking about 8K imaging, things get even crazier. An 8K video stream (again, using standard CTA blanking) with just 10-bit RGB color at 60 Hz refresh will leave you in a cloud of dust:

    8800 × 4500 × 60 × 12 × 3 = 85.536 Gb/s.

    Zoom-zoom! We’d have to drop to 4:2:0 color resolution just to get that signal through an HDMI 2.1 connection. Even 4:2:2 color would be too fast at about 57 Gb/s. The current version of DisplayPort would also vanish in the rear-view mirror, as it is capped at 32.4 Gb/s. (We expect to hear about a new version of DP at CES next month, presumably one that’s a LOT faster.)

    This is presumably where DSC would enter the picture. It is capable of 2:1 compression with extremely low latency, and that would get our example 8K/60 signal down to earth and to a point where it could travel over HDMI 2.1 (but not DP). The only catch is, DSC requires quite a bit of computation to work correctly and is considered “CPU-hungry,” which of course adds cost to its implementation.
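
    A short Python check (our own sketch) shows why DSC rescues HDMI 2.1 but not today’s DisplayPort for that 8K/60 example:

        raw_gbps = 8800 * 4500 * 60 * 12 * 3 / 1e9    # 85.54 Gb/s for 8K/60 10-bit RGB
        dsc_gbps = raw_gbps / 2                        # ~42.8 Gb/s after 2:1 DSC
        print(dsc_gbps <= 48)                          # True  - fits inside HDMI 2.1
        print(dsc_gbps <= 32.4)                        # False - still too fast for DP 1.4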

    What’s curious about HDMI 2.1 to us is the continued lack of a native optical transport specification. Any signal running in the 40 Gb/s range should probably travel over optical fiber. Certainly, if it’s going to travel through a 40 Gb/s network switch, that transport will be as pulses of light and not electrons dancing on the outer edge of copper conductors.

    We inquired at the NYC press event if any HDMI Forum members were actually making v2.1 transmitter and receiver chipsets yet. So far, only one company in Japan (Socionext) is doing that, but you would be hard-pressed to find any commercial or consumer products that support v2.1 at present. (We’ll certainly have our eyes open at CES for one!)

    As mentioned in September, it’s expected that over 5 million 8K TVs will be shipped worldwide by the end of 2020 – just two years from now. Hand-in-hand will be a small but growing number of 8K monitors for commercial use (yes, there are customers waiting for such products, believe it or not) and the vast majority of those will come from super-sized LCD panel “fabs” in China that are currently under construction or just firing up.

    We’ve frequently used this expression in the past: “What good is a Ferrari if you live on a dirt road?” Well, that’s pretty much the situation we’re looking at with the next generation of displays. Higher resolution, high dynamic range, wider color gamuts, and high frame rates will all add up to super-sized packages of display data that dwarf what we switch and distribute today.

    New codecs like JPEG XS / TiCo will help to squeeze things through network switches, but we’ll still have a choke point at the physical display interface. And we don’t have any real solutions to the problem just yet: Do we use compression? Double up on interface connections? Skip the traditional HDMI / DP interface altogether, and use a decoder inside the display to decompress the signal?

    Stay tuned…

  • All the Buzz in L.A.

    We’ve just returned from the annual Society of Motion Picture & Television Engineers (SMPTE) technology conference in Los Angeles. This is one of the pre-eminent motion imaging and media delivery conferences in the world, attracting papers from the best and the brightest working across a diversity of disciplines. Image capture, signal distribution, storage, displays, video compression, virtual and augmented reality, streaming – you name it, there was a session about it.

    One of the more intriguing sessions covered artificial intelligence (AI) and machine learning (ML), particularly as those apply to post-production and media workflows. AI and ML are both hot-button topics right now, and more pervasive than you might think. EDID is a very rudimentary form of AI that must be programmed, but it allows displays and video sources to automatically make the best connection in terms of image resolution, frame rates, and color modes.

    Internet of Things (IoT) products for the home incorporate both AI and ML, based on predictions. Every time you use an IoT device in conjunction with other devices, or perform the same set of operations when you use that device, it can “learn” the patterns and save them as a “macro.” With enough on-board intelligence, the device can ask you if you’d like to repeat previous instructions and then execute those instructions automatically.

    A good example would be leaving the house, turning down the thermostat, and switching on selected lights along with an alarm. All of these actions can be saved and repeated automatically, and the group macro given a name (“Out for The Evening”). You just need to tell your voice recognition system to execute that command.
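
    As a thought experiment, that kind of grouped macro boils down to nothing more than a named list of device commands. Here’s a tiny, hypothetical Python sketch (the device names and commands are made up for illustration, not taken from any particular IoT platform):

        # A hypothetical "Out for The Evening" macro: one name, several device commands.
        macros = {
            "out for the evening": [
                ("thermostat", "set_temperature", 65),
                ("porch_light", "turn_on", None),
                ("living_room_lights", "turn_off", None),
                ("alarm", "arm_away", None),
            ],
        }

        def run_macro(name, send_command):
            # Replay every stored command in order; send_command talks to the IoT hub.
            for device, action, value in macros[name.lower()]:
                send_command(device, action, value)

        # Stand-in for a real hub connection: just print what would be sent.
        run_macro("Out for The Evening", lambda device, action, value: print(device, action, value))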

    In our world, the individual commands that turn on lights in a room and activate selected pieces of AV gear are already programmed into macros, accessed from a touch screen. With facial and voice recognition, you wouldn’t even need the touchscreen – the system would recognize you automatically, determine if you are authorized to use anything in the room, and ask your preferences. (You’ll know you’re in trouble if your IoT system says, “I’m sorry Dave, I can’t allow you to do that.”)

    In the SMPTE world, AI and ML can be used for more sophisticated functions. Let’s say you have a great deal of footage from a film shoot that’s been digitized. AI can search that footage automatically and sort it, based on parameters you choose. With facial recognition, it can group all takes featuring a given actor, a certain cityscape background, or daytime vs. nighttime shots. It’s conceivable that AI & ML could even look for continuity errors by rapidly scanning takes. (Did you know NASCAR has digitized over 500,000 hours of video and film from 1933 to the present in their library, searched and accessed by AI?)

    There are parallels to other industries. In the legal world, document searches that were once performed by legions of low-paid clerks are now executed by AI robots, programmed to look for specific key words. Demonstrations have been made of advertising and marketing copy written entirely by AI, based on keywords and macros previously programmed. There have even been attempts to have robots write fiction!

    Another popular session topic – one which took up an entire day – was high dynamic range (HDR). According to a session chair, HDR “is a hot mess right now” as there are multiple competing standards, no consistency in coding metadata for HDR program content, and a lot of unanswered questions about delivering HDR content to viewers and measuring the quality of their experience.

    For many attendees, there were plenty of basic questions about HDR – how does anyone define it exactly? How often is it used in current movies and television programs? Are there metrics that can be used to define the quality of the HDR experience? What are the “killer apps” for HDR? How does HDR affect emotional and perceptual responses in viewers?

    For the AV industry, both AI and HDR will be hot-button topics in 2019. With each passing year, more of the signal distribution, coding, and storage infrastructure we build and use will become automated. The day is coming when we’ll stop obsessing over display resolution and media formats and will instead search for content by name in the cloud to play back on whatever display we have on hand.

    AI will create and store multiple resolutions of the desired content and stream files to us at the highest possible resolution and frame rate that our network connection can reliably support. (That’s already happening with advanced video encoders and decoders that “talk” to the network, determine the safe maximum allowable bit rate, and change it on the fly as network conditions change.)
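
    In pseudocode terms, that encoder-to-network “conversation” is just a feedback loop: measure what the network can carry, then pick the highest encoding rate that fits with some safety margin. A hedged little Python sketch (the bitrate ladder and the 80% margin are illustrative assumptions, not any particular vendor’s algorithm):

        # Illustrative adaptive-bitrate loop: stay a safe margin below measured throughput.
        LADDER_MBPS = [4, 8, 16, 25, 40]   # hypothetical encoding ladder, lowest to highest
        SAFETY = 0.8                       # never use more than 80% of what the network reports

        def pick_bitrate(measured_mbps):
            usable = measured_mbps * SAFETY
            candidates = [b for b in LADDER_MBPS if b <= usable]
            return candidates[-1] if candidates else LADDER_MBPS[0]

        for throughput in (50, 30, 12, 6):   # simulated network conditions over time
            print(f"network {throughput} Mb/s -> encode at {pick_bitrate(throughput)} Mb/s")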

    Storage was yet another popular topic, as was blockchain. We’re not quite yet familiar with all the ins and outs of blockchain (and neither, no doubt, are many of you!), but suffice it to say that the world is moving away from scheduled media distribution to individual, on-demand content consumption from cloud servers through a myriad of distribution channels. And many of those will rely heavily on wireless connectivity, increasingly through 5G wireless networks.

    The SMPTE conference wouldn’t be complete without a look into the future. Our industry is still trying to get up to speed on 4K, yet 8K video is already on our doorstep. Movie theaters are looking into LED screens to replace the decades-old projector/screen model. We can now wrap a viewer in dozens of channels of “reach out and touch it” three-dimensional sound. (Did you know the National Hot Rod Association (NHRA) is working with Dolby to add multi-channel spatial sound to its telecasts?) And while virtual reality (VR) is still struggling to get off the ground, its counterpart augmented reality (AR) is moving ahead by leaps and bounds.

    How much of this will affect the AV industry? All of it, sooner or later…

  • Is There a Future for Projectors?

    As a company primarily focused on signal management (switching, mixing, interfacing, and format conversion), Kramer doesn’t pick sides when it comes to signal sources and “sinks” (a/k/a displays). We’re more concerned with getting the signals there intact over a variety of connections, which today could mean anything from full-bandwidth HDMI cables to AV-over-IT and fast WiFi.

    But we can’t help but observe overarching trends in the AV industry. And one that clearly stands out is a shift away from front projection to direct-view displays, a category that includes everything from flat panel LCD and OLED technology to emissive LED walls, particularly those that use fine-pitch LED arrays.

    If you like to attend concerts by popular singers and groups (and who doesn’t?), you may have noticed the extensive use of image magnification (IMAG) in the form of towers of LED cubes, or arrayed as wide walls behind the band (or even both!). It’s hard to miss these stacks, particularly if the concert is outdoors and the sun hasn’t set yet.

    Sharp-eyed viewers might also notice that just about every touring act – whether it be U2, Paul McCartney, Blake Shelton, Keith Urban, or a Broadway musical – now uses LED walls as set pieces and IMAG displays. And why not? They’re super bright, scalable, and from a staging standpoint, comparatively easy to assemble and disassemble. At least, more so than flying projectors and screens, the “old school” way it used to be done.

    Fact is, LED walls (which primarily use components made in China and are being priced very competitively) have substantially eaten into the market share of high-brightness projectors. And it’s easy to see why: there are no lamps or filters to change and no lenses to fit. You need a bigger image? You simply build a bigger LED wall. No worries about high ambient light levels, not when you’ve got upward of 3,000 nits of brightness to start with.

    To achieve that level of brightness with a projector, you’d have to start with well over 30,000 lumens. And that doesn’t even consider the size of the projected image. For those playing at home, let’s assume we want to light up a 10’ x 18’ screen area with the equivalent of 3,000 nits, or 877 foot-Lamberts.

    10 x 18 = 180 square feet; 180 x 877 = 157,860 lumens

    Yikes! That’s a LOT of lumens. Even five stacked 30,000-lumen projectors would come up short.
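
    If you want to plug in your own screen size, the arithmetic is easy to script. A minimal sketch, assuming a unity-gain screen (where one foot-Lambert of screen luminance requires one lumen per square foot of screen area):

        # Projector lumens needed to match a given nits level on a unity-gain screen.
        NITS_PER_FOOT_LAMBERT = 3.426

        def required_lumens(width_ft, height_ft, target_nits):
            area_sq_ft = width_ft * height_ft
            foot_lamberts = target_nits / NITS_PER_FOOT_LAMBERT
            return area_sq_ft * foot_lamberts

        # Roughly 158,000 lumens for our 10' x 18' example (we rounded to 877 ft-L above).
        print(round(required_lumens(18, 10, 3000)))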

    Perhaps a more dramatic example can be found closer to home. As noted in previous commentaries, the free-fall in LCD panel pricing has resulted in bargain-basement deals on televisions, specifically those with Ultra HD (4K) resolution. But you may not have noticed just how low those prices have fallen recently.

    It is now possible to purchase 75-inch Ultra HDTVs for less than $1,500, with some Chinese brands falling perilously close to $1,000. This, in turn, has led panel manufacturers to “go big” and bring out even larger panels in the 80” range. Consequently, anyone can buy 82-inch and 85-inch Ultra HD televisions for less than $4,000 – and these sets also support high dynamic range and its associated, wider color space.

    It wasn’t that many years ago that a Full HD home theater projector with about 2,000 lumens light output was priced at around $7,000. Add in a screen, brackets, and associated wiring, and you’d be well on your way to $10,000! Given that many home theater installations used screens in the 80-inch to 90-inch range, it’s almost a no-brainer to opt for the self-contained LCD television and do away with the screen, bracket, and a lot of extra wiring.

    As you can see, fine-pitch LEDs and super-sized LCD televisions and monitors are nibbling away at projector market share from both ends. And this trend is only going to continue as panel prices and LED pitches continue to drop. So, what market does that leave for projectors?

    The answer is any image that requires three-dimensional mapping, like curved walls, spheres, or unusual shapes (trapezoidal, multiple planes). LCD panels can be formed in many ways, but it’s not easy to make them into curved shapes. Their organic cousins (OLEDs) can be printed onto flexible substrates and warped into all kinds of unusual shapes – even cylinders – but have nowhere near the horsepower of inorganic LED walls. Projectors make more sense here, logistically and financially.

    Where projectors fell a bit behind but are catching up is in resolution. The move to 4K started first and foremost with projectors almost 15 years ago, but attention shifted to large LCD displays around 2012. Since then, LCDs and now OLEDs have dominated the discussion, and compared to an 85-inch Ultra HDTV with HDR, a ‘true’ 4K home theater projector is quite an expensive beast. Projectors that use lower-resolution chips and image shifting have now come to the AV marketplace to try and keep pace with the move to 4K.

    From our standpoint, all of these trends point toward two things: faster clock rates and a ton of pixel data moving from point A to point B. It doesn’t matter whether you opt for an LED videowall, a super-large LCD display, or a 4K projector! The increased refresh rates, expanded color bit depth for HDR, and new tricks like high frame rates will put greater demands on your signal switching and distribution systems.

    We’ve got your back. Our engineers already have their calculators out…

     

  • How Many K's Do You Need?

    In baseball, the letter ‘K’ is shorthand for strikeout – getting a batter to swing at or take a third strike. It’s not unusual to see fans of a particular pitcher holding up signs with large letter ‘K’s on them to signify how many strikeouts that pitcher compiles during a game. By the way, the record for a nine-inning game is 20 strikeouts, an almost-impossible feat accomplished by just a handful of major league pitchers. Who were they? (Answer at the end of this article)

    In the world of electronics, “K” stands for the more conventional value of one thousand, being derived from the “K” in “Kilo,” which according to the dictionary is “…a Greek combining form meaning “thousand,” introduced from French in the nomenclature of the metric system” and “…French, representing Greek chī́lioi or a thousand.”

    The display industry has become fixated on “Ks” lately. Until the late 1990s, we didn’t have any displays capable of “kilo” pixels of resolution: Just 20 years ago, the first plasma display monitors came to market with 1,280 horizontal imaging pixels, making them the first “kilo” displays (at least, in one axis.) After the turn of the 21st century, we started to see display panels with almost 2,000 horizontal pixels (1920, to be exact) and for the first time, more than 1,000 vertical pixels (1080 and 1200, respectively).

    Wow, that was a lot of pixels – 2,073,600 to be precise. And most of us figured that would be good for some time to come – who would need more resolution than that?

    Turns out, everyone. Aside from some unusual high-resolution displays from Apple that had 2560 horizontal pixels, we were stuck at 1920x1080 and 1920x1200 for a few years. That is, until 2012, when a new crop of so-called “4K” displays made an appearance at the IFA consumer electronics show. (To be fair, Sony had been selling high-brightness SXRD projectors for cinema applications since the mid-2000s and these models had 4096 horizontal pixels.)

    Six years later, televisions and monitors with 4K resolution (mostly 3840 horizontal and 2160 vertical pixels) have become commodities. What happened? For starters, Chinese display manufacturers made big bets on 4K LCD panels for TVs, figuring that Full HD (1920x1080) would provide diminishing returns over time. They constructed new fabrication lines to crank out tens and hundreds of thousands of panels each month.

    Korean manufacturers didn’t sit on their hands, ramping up production of 4K LCD and organic light-emitting diode (OLED) panels for televisions and commercial applications. “4K” became a buzzword for the latest and greatest in flatscreen displays. Screen sizes increased as prices continued to drop from $238 per diagonal inch for those original, limited-function 84-inch 4K LCD monitors to an amazing $9 per diagonal inch for a 55-inch 4K television today.

    While that’s a bargain price for consumers, there’s little or no profit for panel and display manufacturers when a 60-inch 4K TV can be had for less than $1,000. So, the Chinese took the lead again, deciding maybe it was time to jump to the “next” K – 8K, or more specifically, 7680 horizontal and 4320 vertical pixels. (For those keeping score at home, that represents about 33 million total pixels, or 16 times the resolution of an old-fashioned Full HDTV.)

    What – has the world gone crazy? Aside from some Ultra HD Blu-ray discs and Netflix/Amazon streaming, there’s very little 4K content to watch today. And you want me to buy an 8K TV next time around?

    Remember – it’s not about content, it’s about profitability. And it’s also about televisions and monitors getting larger and larger. 8K resolution on a 42-inch display that sits ten feet from the nearest viewer makes no sense at all. But 8K resolution on an 80-inch display that sits just a few feet away does make sense. Think of how coarse outdoor LED signs appeared in the early 2000s. Now, wander through InfoComm and notice the 20-foot and 30-foot fine-pitch LED displays that are popping up everywhere: With a fine dot pitch (say, 1.2mm and down), they’re approaching 8K resolution.

    Market research done by a few companies predicts that over 5 million 8K TVs will be produced and sold worldwide by 2020 – less than a year and a half from now. Granted, many of those sales will take place in China, but you will see 8K televisions on store shelves by Christmas of this year and certainly no later than the 2019 Super Bowl.

    Think we’re crazy? At the recent IFA Show, both LG and Samsung announced they were bringing 80-inch-class 8K televisions to retail this fall. LG’s entry, which has no model number or pricing information yet, is an 88-inch OLED TV. (For those keeping score at home, that’s a little more than seven diagonal feet.) Samsung’s answer is the Q900FN QLED 8K TV, an 85-inch LCD display that uses quantum dot backlighting to produce high dynamic range images and is supposed to arrive on these shores in October. There’s no way to predict retail prices for either product, but it’s a safe bet they’ll be more than $9 per diagonal inch.

    If you still can’t get your head around the fact that 8K TV is right around the corner, this will blow you away: Innolux, a Taiwanese display manufacturer, showed a 16K 100-inch monitor at a trade show in China in late August. Yep, you read that right – 16K, or more specifically, 15,360 horizontal x 8,640 vertical pixels. If you can’t see the pixel structure on an 8K display unless you are just 1.5 feet away, you’ll never spot it on this display without a jeweler’s loupe.

    Crazy, right? So, what does this mean for signal management and interfacing products? With Full HD, we have to move almost 2.5 million pixels in every frame (including the blanking interval). For 4K (Ultra HD), the payload jumps to 9.9 million pixels per frame, and for 8K, we’re looking at 39.6 million pixels per frame. Our 594 MHz pixel clock for 4K now accelerates to about 2.4 GHz, and a data rate of about 22 gigabits per second (Gb/s) now rockets to about 90 Gb/s for a 10-bit RGB 8K signal. Got bandwidth? We sure hope so….
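
    To see how quickly the numbers escalate, here’s a compact sketch using the total rasters (blanking included) and the same 12-bits-per-component convention as our earlier examples; treat it as a back-of-the-envelope check rather than an interface specification.

        # Pixel clock and 10-bit RGB data rate (with 20% overhead) for each format.
        rasters = {
            "Full HD": (2200, 1125),
            "4K/UHD":  (4400, 2250),
            "8K":      (8800, 4500),
        }
        for name, (h, v) in rasters.items():
            pixel_clock_mhz = h * v * 60 / 1e6
            data_rate_gbps = pixel_clock_mhz * 1e6 * 3 * 12 / 1e9   # RGB, 10 bits + 2 overhead
            print(f"{name:8s} {pixel_clock_mhz:7.1f} MHz   {data_rate_gbps:5.1f} Gb/s")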

  • On Voice Recognition Hardware for Control (and the Underlying Motives)

    It’s generally accepted that most of the product innovation in the AV industry originated in the world of consumer electronics. Blu-ray discs, tablets, big and inexpensive Ultra HD displays, faster WiFi, and streaming video all got off the starting line ten years ago. Even more recent trends like Internet of Things control systems and the migration from projected images to fine pitch LED screens are largely driven by consumer behavior.

    Look at all of the wireless collaboration systems that must continually update their operating systems to handle content from Apple hardware as each version of iOS comes to market. (It’s a wonder Apple hasn’t come out with its own wireless collaboration platform.) And incremental improvements to WiFi are also being driven by a growing demand to stream video to the home. Speaking of video, more institutions of higher learning are posting instructional videos to YouTube channels, and of course those channels must be accessible to students.

    The migration of AV signals to IT infrastructures has followed a similar trend in the CE world. Cable TV companies are already implementing delivery of video, audio, and data over fast wireless networks in the home, eliminating the need for a traditional coaxial cable connection. That’s largely because many new homes are not being built with category wire in their walls. Instead, builders assume homeowners will just rely on fast wireless to make all their connections.

    It’s not a stretch to imagine a meeting room in the not-too-distant future that relies on wireless connectivity for every device – even videoconferencing. We’re already at that point with home offices, where teleconferences use GoToMeeting, WebEx, Skype, Zoom, and other software-based codecs through 802.11ac routers. It works! So why not add in wireless voice control systems?

    Well, things don’t always work out as planned. A recent survey conducted by the Web site The Information revealed that, while about 50 million Amazon Alexa voice recognition systems have been purchased, only 100,000 or so of their users have actually ordered anything from Amazon using Alexa. It appears that Alexa users are primarily ordering up music from streaming sites, not ordering groceries, toilet paper, or pet food. (Or headphones, bicycles, smart televisions, clothing, batteries, etc.)

    It’s clear that the Alexa platform is quite popular, but still a novelty for many. Anecdotal evidence reveals that there is still a learning curve to be tackled for people to fully embrace voice control systems, whether they come from Amazon, Google (Assistant), Samsung (Bixby), or other companies. But with an increasing number of gadgets coming to market equipped with network interfaces, there will be solid growth in IoT control systems applications. Right now, things like doorbell cameras linked to televisions and lights that dim and change color seem to be the popular applications.

    One thing that may give pause is the perception that some large corporation is collecting data on you and your viewing/buying/consumption habits as you issue voice commands. Well, they probably are. But that concern could present a market opportunity for a somewhat dumber voice control system that can handle all of the control “stuff” and leave the shopping to others. Granted, such a device would come at a cost premium: Amazon eats a lot of the cost of Alexa boxes because they want you to buy things with it and figure they’ll make their profit at the back end, especially from Prime members.

    What’s intriguing is the fact that most appliances for the home that were shown at CES back in January incorporate the Google Assistant platform. (“OK, Google!”) Google is not in the retail business, but they are very much in the “big data” business. Their voice control product is obviously aimed at supporting IoT control in the home, but it can and will also gather data every time you issue a command, even if it’s just to stream music from the Google Play store.

    Samsung’s Bixby VC platform exists for yet a different reason, and that is to convince you to buy as many Samsung products for your home as possible. Televisions, washers and dryers, refrigerators, tablets, laptops, smartphones – any and all of these can easily integrate into a Bixby-controlled universe of appliances. Samsung has gone so far as to state that EVERY product they make will be “connected” by 2020.

    So, what does all of this mean to our market? First off, the more end-users become comfortable with voice recognition and control, the more they’ll request it be part of an AV installation. If you can walk into your house and tell Bixby to turn on the lights and television (and probably the oven to warm up last night’s pizza), you will logically assume that you can walk into any room and turn things on with your dulcet tones. That would include classrooms, meeting rooms, and even huddle spaces.

    Second, the smart companies will be busy developing drivers for these VC systems so they can indeed be used to control the AV gear in a given room. (Even if it’s just adjusting lights and drapes at first.) Once a potential customer learns that a high-profile installation has adopted VC, they’ll want it for their facility. There’s nothing like keeping up with the Joneses to motivate customers, especially in high-profile installations.

    Third, the adoption of VC systems will drive IT administrators crazy. The security issues alone could represent a Pandora’s Box (or Pandora’s voice control) to admins: How can you be sure the voice recognition system isn’t responding to a recording? Do you require two-factor authentication, such as facial recognition, a thumbprint, or even a spoken password before executing a command or series of commands?

    Fourth, someone will develop a generic voice control system not tied to selling groceries, collecting data, or checking to see if the spin cycle is over. This VC system will be targeted specifically at the commercial AV market (and possibly residential customers) and come with an appropriate interface to translate commands into addressable IP packets to operate just about anything with an Internet hook-up. There may even be multiple, incompatible VC products offered for sale.

    But make no mistake, voice control is coming. You have our word on that. (And “no, Alexa, I did not just order five cases of Double Stuff Oreos!”)

  • Speed is Everything

    If you’ve been paying even the slightest attention to trends in the AV industry, you know that fast wireless connectivity has become a very important part of any installation. In particular, WiFi is integral to all of the wireless collaboration and presentation-sharing devices that are overrunning classrooms and meeting rooms.


    What you may not be aware of is how hard WiFi protocol developers are working to try and stay ahead of the tidal wave of consumer and commercial products that are completely and utterly reliant on wireless connections. Those readers who’ve been around long enough to remember the crude attempts in the late 1990s to stream static images and presentations to projectors should be suitably amazed that anyone can now watch 1080p/60 video on a mobile device – without dropped packets and buffering.


    But is that benchmark good enough? Nope, and particularly with 4K video now getting a foothold. The current “fast” protocol is IEEE 802.11ac, otherwise known as channel-bonding WiFi. This protocol combines two or more 20 MHz wireless channels to boost bandwidth and connection speeds, enabling the “holy grail” of reliable 1080p/60 streaming.


    Well, that used to be the holy grail. Now, customers want to stream multiple 1080p/60 videos without buffering while simultaneously passing the usual TCP/IP traffic. So, the hard-working wireless techies have come up with an even faster standard – 802.11ax.


    What’s in an “x”? Supposedly, improved performance, extended coverage, and longer battery life. 802.11ax can deliver a single video stream at 3.5 Gb/s, and with new multiplexing technology, can deliver four simultaneous streams to a single endpoint with a theoretical bandwidth of 14 Gb/s.


    802.11ax does this by using a higher level of modulation – quadrature amplitude modulation (QAM), to be precise. Your cable TV company sends you digital TV programs using 256-QAM (256 possible symbol values). 802.11ax goes even further by employing 1024-QAM (more bits of data per symbol) and combines it with more antennas (multiple in, multiple out, or MIMO).


    A newer version of MIMO, Multiple User MIMO (MU-MIMO), can provide up to eight simultaneous streams of video from one wireless access point. Unlike 802.11ac, version “x” operates in both the 2.4 GHz and 5 GHz wireless bands to combine and format channels. And a technology called Orthogonal Frequency Division Multiple Access (OFDMA) allows each MU-MIMO stream to be split into four additional streams, boosting the effective bandwidth per user by a factor of four.
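
    Roughly speaking, those headline numbers come from multiplying a handful of factors: bits per symbol, channel width, and spatial streams. The toy calculation below is illustrative only (real 802.11ax rates also depend on coding rate, guard intervals, and OFDMA scheduling), but it shows why 1024-QAM plus more streams adds up so quickly.

        # Toy comparison: relative gain from higher-order QAM and more spatial streams.
        import math

        def relative_gain(qam_old, qam_new, streams_old, streams_new):
            # bits per symbol = log2(QAM order); everything else held equal
            return (math.log2(qam_new) / math.log2(qam_old)) * (streams_new / streams_old)

        # 256-QAM -> 1024-QAM and 4 -> 8 spatial streams:
        print(f"{relative_gain(256, 1024, 4, 8):.2f}x the raw symbol rate")   # 2.50x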


    The difference between earlier versions of the 802.11 protocol and version “x” is like the difference between a Toyota Corolla and a Ferrari. Whereas older versions of wireless lacked the ability to dynamically shape streams and antenna beaming paths, “x” basically modifies every single parameter of a wireless connection to optimize streams of packets. (Did we mention it’s also smart enough to detect on-going activity and wait until the channel is clear to begin transmissions?)


    If you’ve figured out that you’ll need all-new WiFi routers and access points to use any of these features, you win the giant stuffed teddy bear! Your in-room wireless networks will have to be upgraded to 802.11ax at some point, but the good news is that new WiFi gear is pretty inexpensive. What’s more, 802.11ax is backward-compatible with systems like 802.11ac and 802.11n.


    Hey – wait a minute. Haven’t we written about 802.11ad before and called it the best thing since waffle irons? True, we did. And for sheer speed, you can’t beat “d” as it supports streaming rates of over 3 Gb/s per channel, with six 2 GHz channels to work with. No congestion here – it’s the equivalent of having an 8-lane highway accessible during rush hour.


    But there’s a catch. (There’s ALWAYS a catch.) 802.11ad operates in the 60 GHz radio band; way, WAY above the frequencies used for 802.11ac and 802.11ax. Radio waves at this frequency are quite small – a full wave measures about 5 millimeters (two-tenths of an inch), meaning that antennas for this band can be formed right on a semiconductor layer.


    And the catch is that 60 GHz radio waves have a very limited range at maximum power levels (FCC rules allow about 1 watt), which means if you move more than about 30 feet or 10 meters from a 60 GHz WiFi access point, you’ll probably lose the signal. 60 GHz signals also won’t pass through solid objects of any kind, which does provide a backhanded form of security when you think about it.


    In contrast, 2.4 GHz wireless signals can travel through many solid surfaces, as long as they aren’t conductive. And 5 GHz radio waves can penetrate wood floors, glass, sheet rock, curtains, and plastics for quite a distance. With boosters, you can increase the range of 2.4 and 5 GHz signals by several tens of feet. (A similar trick will work at 60 GHz with steerable antennas.)


    Practically speaking, both wireless modes can co-exist nicely. “d” is perfectly suited for short-range, high-bandwidth links in rooms and “x” can pick up the slack over longer distances in larger spaces. Indeed, we’ve seen tri-band modems come to market in recent years that incorporate the older “c” version with “d,” although they have been slow to gain adoption.


    As our industry finds more ways to get rid of wires, especially when connecting devices configured for Internet of Things (IoT) control, demands for more efficient and faster WiFi will only increase. Versions “x” and “d” should satisfy those demands for a few years…we hope…

     

  • InfoComm 2018: What Did It All Mean?

     If you managed to make it out to this year’s running of InfoComm, you might have summarized your trip to colleagues with these talking points:


    (a) LED displays, and

    (b) AV-over-IT.

     

    Indeed, it was impossible to escape these two trends. LED walls and cubes were everywhere in the Las Vegas Convention Center, in many cases promoted by a phalanx of Chinese brands you’ve likely never heard of. But make no mistake about it – LEDs are the future of displays, whether they are used for massive outdoor signage or compact indoor arrays.


    With the development of micro LED technology, we’re going to see an expansion of LEDs into televisions, monitors, and even that smart watch on your wrist. (Yes, Apple is working on micro LEDs for personal electronics.)


    Projector manufacturers are understandably nervous about the inroads LEDs are making into large venues. Indeed, this author recently saw Paul Simon’s “farewell tour” performance at the Wells Fargo Center in Philadelphia, and the backdrop was an enormous widescreen LED wall that provided crystal-clear image magnification (very handy when concertgoers around you are up and dancing, blocking your view of the stage).


    As for the other talking point – well, it was impossible to avoid in conversations at InfoComm. Between manufacturers hawking their “ideal” solutions for compressing and streaming audio and video and all of the seminars in classrooms and booths, you’d think that AV-over-IT is a done deal.


    The truth is a little different. Not all installations are looking to route signals through a 10 Gb/s Cisco switch. In fact, a brand-spanking-new studio built for ESPN in lower Manhattan, overlooking the East River and the Brooklyn Bridge, relies on almost 500 circuits of 3G SDI video through an enormous router. Any network-centric signal distribution within this space is mostly for IT traffic.

    That’s not to say that installers are pooh-poohing AV-over-IT and the new SMPTE 2110 standards for network distribution of deterministic video. It’s still early in the game and sometimes tried-and-tested signal distribution methods like SDI are perfectly acceptable, especially in the case of this particular facility with its 1080p/60 backbone.


    Even so, the writing on the wall couldn’t be more distinct with respect to LEDs and network distribution of AV. But there were other concerns at the show that didn’t receive nearly as much media attention.


    At the IMCCA Emerging Trends session on Tuesday, several presentations focused on interfacing humans and technology. With “OK Google” and Alexa all the rage, discussions focused on how fast these consumer interfaces would migrate to AV control systems. An important point was made about the need for two-factor authentication – simple voice control might not be adequately secure for, say, a boardroom in a large financial institution.


    What would the second factor be? Facial recognition? (This was a popular suggestion.) Fingerprints? Retinal scans? A numeric code that could be spoken or entered on a keypad? The name of your favorite pet? Given that hackers in England recently gained access to a casino’s customer database via an Internet-connected thermometer in a fish tank, two-factor authentication for AV control systems doesn’t seem like a bad idea.


    Another topic of discussion was 8K video. With a majority of display manufacturers showing 4K LCD (and in some cases OLED) monitors in Vegas, the logical question was: Could resolutions be pushed higher? Of course, the answer is a resounding “yes!”


    Display analysts predict there will be over 5 million 8K televisions shipped by 2022 and we’re bound to see commercial monitors adapted from those products. But 8K doesn’t have to be achieved in a single, stand-alone display: With the advent of smaller 4K monitors (some as small as 43 inches), it is a simple matter to tile a 2x2 array to achieve 7680x4320 pixels. And there doesn’t appear to be a shortage of customers for such a display, especially in the command and control and process control verticals.


    The other conversations of interest revolved around the need for faster wireless. We now have 802.11ac channel bonding, with 802.11ax on the horizon. For in-room super-speed WiFi, 802.11ad provides six channels at 60 GHz, each 2 GHz wide or 100x the bandwidth of individual channels at 2.4 and 5 GHz.


    But wise voices counsel to pay attention to 5G mobile networks, which promise download speeds of 1 Gb/s. While not appropriate for in-room AV connectivity, 5G delivery of streaming video assets to classrooms and meetings is inevitable. Some purveyors of wireless connectivity services like AT&T and Verizon insist that 5G could eventually make WiFi obsolete. (That’s a bit of a stretch, but this author understands the motivation for making such a claim.)


    The point of this missive? Simply that our industry is headed for some mind-boggling changes in the next decade. Networked AV, LEDs, 8K video and displays, multi-factor authentication for control systems, and super-fast wireless connections are all in the wings.


    And if you were observant at InfoComm, you know it’s coming…and quickly.

  • “Who Are Those Guys?”

    Readers may remember Butch Cassidy and The Sundance Kid, a classic western from 1969 that featured cinema icons Paul Newman and Robert Redford in the title roles as a pair of happy-go-lucky (and often violent) train and bank robbers. They were the leaders of the infamous “Wild Bunch,” also known as the Hole-In-The-Wall Gang, so named for the remote hideaway in Wyoming that they used to evade authorities after pulling off a heist.

    The story goes that Cassidy and The Kid eventually decamped to South America to get away from relentless manhunts, but ultimately met their maker during a bloody shoot-out in Bolivia. In the movie, Butch and The Kid are constantly riding across the countryside from Argentina to Chile and Bolivia, pursued by a small but determined band of soldiers. “Who ARE those guys?” was The Kid’s constant refrain, as he looked over his shoulder with fear.

    In the AV industry, there are plenty of “those guys” manufacturing hardware and writing software code. For the display industry, “those guys” are Chinese LCD panel fabricators, who are slowly taking over the flat panel display business once dominated by Japan and later by Korea. In consumer electronics, “those guys” are companies like Hisense and TCL in televisions, Huawei and ZTE in smartphones, and Lenovo in laptops.

    You can find parallels in control systems, video encoders, network switches, and cable. But there’s one sector of the industry where “those guys” haven’t been able to catch up with the leaders – and that’s the display interface.

    For the past 16 years, the High Definition Multimedia Interface (HDMI) has ruled the roost for display connections, pushing aside VGA at first and then DVI on everything from televisions and Blu-ray players to laptop computers and camcorders. It’s evolved numerous times from a basic plug-and-play interface for televisions and AV receivers to a high-speed transport system for 4K and ultimately 8K video. Ironically, HDMI is often the input and output connection for video encoders and decoders that, in theory, could displace it from the market altogether.

    So, who are “those guys” in this sector? Why, the folks at the Video Electronics Standards Association (VESA), who developed and periodically update DisplayPort. First launched in 2006, DisplayPort was intended to replace the old analog VGA connector with a newer, 100%-digital version that could handle many times the bandwidth of an XGA (1024x768) or UXGA (1600x1200) video signal.

    Other forward-looking features included direct display drivers (no need for a video card), support for optical fiber, multiplexing with USB and other data bus formats, and even a wireless specification (it never really caught on). Like HDMI, DP had its “mini” and “micro” versions (Mini DP and Mobility DP).

    In recent years, VESA stayed current by upping the speed limit from 21.6 to 32.4 gigabits per second (Gb/s), supporting the USB 3.0 Alternate Mode, adding some cool bells and whistles like simultaneous multi-display output, adopting the first compression system for display signals (Display Stream Compression), recognizing high dynamic range metadata formats, and even accepting color formats other than RGB.

    Best of all, there continue to be no royalties associated with DP use, unlike HDMI. The specification is available to anyone who’s interested, unlike HDMI. And DP was ready to support deep color and high frame rate 4K video as far back as 2013, unlike HDMI.

    However…unlike HDMI, DisplayPort has had limited success penetrating the consumer electronics display interfacing market. While some laptop manufacturers have adopted the interface, along with commercial AV monitors and video cards for high-performance PCs, HDMI is still the undisputed king of the hill when it comes to plugging any sort of media device into a display.

    Even long-time supporters of DP have switched allegiances. Apple, known for using Mini DisplayPort on its MacBook laptops, is now adding HDMI connections. Lenovo, another DP stalwart, is doing the same thing on its newer ThinkPad laptops. Clearly, the HDMI Forum isn’t worried at all about “those guys.”

    That’s not to say “those guys” are giving up the chase. Earlier this year at CES, VESA had several stands in their booth demonstrating a new set of standards for high dynamic range and wide color gamuts on computer monitors – specifically, those using LCD technology. DisplayHDR calls out specific numbers that must be achieved to qualify for DisplayHDR 400, DisplayHDR 600, and DisplayHDR 1000 certification.

    Those numbers fall into the categories of 10% full white, full screen white “flash,” and full screen white “sustained” operation, minimum black level, minimum color gamut, minimum color bit depth, and black-to-white transition time. With interest in HDR video growing, the DisplayHDR specifications are an attempt to get around vague descriptions of things like color range (“70% of NTSC!”) and contrast ratios that don’t specify how the measurements were taken.

    And this is actually a good thing. In the CE world, the UHD Alliance has a vague set of minimum requirements for a TV to qualify as high dynamic range. Compared to the more stringent DisplayHDR requirements, the UHD Alliance specs are equivalent to asking if you can walk and chew gum at the same time. Whereas HDMI version 2.0 (currently the fastest available) can transport an Ultra HD signal with 8-bit RGB color safely at 60 Hz, that’s setting the bar kinda low in our opinion.

    In contrast, DisplayPort 1.3 and 1.4 (adds HDR metadata and support for 4:2:0 and 4:2:2 color) aren’t even breathing hard with a 12-bit RGB Ultra HD video stream refreshed at 60 Hz. And that means a computer display certified to meet one of the DisplayHDR standards can actually accept a robust HDR signal. (Note that VESA isn’t choosing sides here – DisplayHDR-certified screens can also use HDMI connections, but signal options are limited by HDMI 2.0’s top speed of 18 Gb/s.)
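
    A quick sanity check of those two claims, using the same 594 MHz Ultra HD pixel clock and 20% overhead convention from our earlier posts (a rough sketch, not a substitute for either specification):

        # Ultra HD (3840x2160) at 60 Hz, RGB, with the 20% transport-overhead convention.
        PIXEL_CLOCK = 4400 * 2250 * 60            # 594 MHz, blanking included
        LINKS = {"HDMI 2.0": 18.0, "DisplayPort 1.3/1.4": 32.4}   # Gb/s

        for bits in (8, 10, 12):
            rate = PIXEL_CLOCK * 3 * bits * 1.2 / 1e9
            fits = ", ".join(name for name, cap in LINKS.items() if rate <= cap) or "neither"
            print(f"{bits:2d}-bit RGB: {rate:4.1f} Gb/s -> fits {fits}")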

    With HDMI 2.1 looming on the horizon – a new version of the interface that liberally borrows from DisplayPort architecture – it doesn’t appear that “those guys” are going to catch up any time soon. But you never know: Even Butch and Sundance had their day of reckoning…

    You can learn more about DisplayHDR here. Check it out!

  • NAB 2018: It's All Up in the Air

    We just returned from our annual visit to the NAB Show in Las Vegas with a lot to think about – not the least of which was, where do we go from here?

     

    By “here,” we mean conventional AV signal management and distribution, using industry standard formats like HD SDI, DVI, and HDMI. “There” isn’t as clearly defined, but we’re pretty sure it will ultimately involve TCP and IP, fast switches, and optical and twisted-pair cable.

     

    Even so, there was no shortage of vendors trying to convince booth visitors that AV-over-IT is the way to go, and right now! Some NAB exhibitors have staked their entire business model on it, with flashy exhibits featuring powerful codecs, cloud media storage and retrieval, high dynamic range (HDR) imaging, and production workflows (editing, color correction, and visual effects) all interconnected via an IT infrastructure.

     

    And, of course, there is now a SMPTE standard for transporting professional media over managed AV networks (note the word “managed”), and that’s ST 2110. The pertinent documents that define the standards are (to date) SMPTE ST 2110-10/-20/-30 for addressing system concerns and uncompressed video and audio streams, and SMPTE ST 2110-21 for specifying traffic shaping and delivery timing of uncompressed video.

     

    Others at NAB weren’t so sure about this rush to IT and extolled the virtues of next-generation SDI (6G, 12G, and even 24G). Their argument is that deterministic video doesn’t always travel well with the non-real-time traffic you find on networks. And the “pro” SDI crowd may have an argument, based on all of the 12G connectivity demos we saw. 3G video, to be more specific, runs at about 2.97 Gb/s, so a 12G connection would be good for 11.88 Gb/s – fast enough to transport an uncompressed 4K/60 video signal with 8-bit 4:2:2 color or 10-bit 4:2:0 color.
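
    If you’d like to verify the “pro SDI” math yourself, a few lines will do it. This is a rough payload estimate (total raster with blanking, no audio or ancillary data), just to show that both flavors squeeze under the 11.88 Gb/s bar:

        # Does an uncompressed 4K/60 signal fit in 12G-SDI (4 x 2.97 Gb/s = 11.88 Gb/s)?
        PIXEL_CLOCK = 4400 * 2250 * 60        # 594 MHz, 4K/60 with blanking
        TWELVE_G = 4 * 2.97                   # Gb/s

        cases = {
            "8-bit 4:2:2":  (2.0, 8),         # components per pixel, bits per component
            "10-bit 4:2:0": (1.5, 10),
        }
        for name, (comps, bits) in cases.items():
            rate = PIXEL_CLOCK * comps * bits / 1e9
            verdict = "fits" if rate <= TWELVE_G else "is too fast for"
            print(f"4K/60 {name}: {rate:.1f} Gb/s {verdict} 12G-SDI")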

     

    The challenge to date has been to manufacture suitable cables for the transport of 12G SDI signals that will meet signal-to-noise (S/N) specifications. Moving to optical fiber is one way around the problem, but it appears most of the demos we saw relied on coaxial connections. To drive a 4K display, some manufacturers rely on quad 3G inputs. Ditto on getting 4K footage out of a camera, although to be fair, there were some single-wire 4K camera connections exhibited.

     

    In an earlier blog post, we talked about the quantum leap to 8K video and displays. Well, we were quite surprised – perhaps pleasantly – to see Sharp exhibiting at NAB, showing an entire acquisition, editing, production, storage, and display system for 8K video. (Yes, that Sharp, the same guys that make those huge LCD displays. And copiers. And kitchen appliances. Now owned by Hon Hai Precision Industry.)

     

    Sharp’s 8K broadcast camera, more accurately the 8C-B60A, uses a single Super 35mm sensor with effective resolution of 7680x4320 pixels arrayed in a Bayer format. That’s 16 times the resolution of a Full HD camera, which means data rates that are 16x that of 3G SDI. In case you are math challenged, we’re talking in the range of 48 Gb/s of data for a 4320p/60 video signal with 8-bit 4:2:2 color, which requires four 12G connections.

     

    Like they say at the dragstrip, “Now THAT’S fast!” It’s so fast, in fact, that you can’t even use an expensive 40 Gb/s network switch to port this signal over an IT network. Indeed, light compression must come into play to get that number down to a manageable level. TiCo (the Tiny Codec) is a good candidate – 4:1 TiCo compression would pack our 8K signal back down to 12 Gb/s. JPEG2000 4:1 would do the same thing, and both are low-latency codecs (and are similar to each other in how they work). For that matter, 4:1 compression would drop a 4K/60 signal down to 3G levels, making it a heckuva lot easier to switch.

     

    The Blue River NT technology sweeping through the AV industry uses a codec that’s adapted from VESA’s Display Stream Compression (DSC) and is even gentler, packing things down a maximum of 2:1 to get a 4K/60 10-bit 4:4:4 video stream through a 10 Gb/s switch. We haven’t seen it used in conjunction with an 8K video source yet, but be advised that DSC can actually work up to 3:1 compression levels and still remain visually lossless with very low latency.

     

    In the NHK booth, you could watch a demonstration of 8K/60 video traveling through a 10 Gb/s switch using so-called mezzanine compression based on the TiCo system. In this case, NHK was using 5:1 TiCo compression to slow down a 40 Gb/s 8K/60 video stream to 8 Gb/s. Even our 48 Gb/s example from earlier would make it under the bar at 9.6 Gb/s.
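
    The compression arithmetic here is straightforward: pick a ratio that squeezes the source rate under the pipe you have. A small, purely illustrative sketch using the numbers quoted above:

        # Minimum compression ratio needed to fit a source into a given pipe.
        def min_ratio(source_gbps, pipe_gbps):
            return source_gbps / pipe_gbps

        print(f"8K/60 at 48 Gb/s into 12G-SDI:      {min_ratio(48, 11.88):.1f}:1")  # ~4:1 (TiCo / JPEG2000)
        print(f"8K/60 at 40 Gb/s into a 10G switch: {min_ratio(40, 10):.1f}:1")     # NHK's 5:1 demo has headroom
        print(f"8K/60 at 48 Gb/s with 5:1 TiCo:     {48 / 5:.1f} Gb/s")             # 9.6 Gb/s, under the bar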

     

    So, what does this all mean? First off, SDI isn’t quite dead yet, to paraphrase Monty Python. It may not be suitable for long distance transmission of 4K video, but it’s still workable for short runs from cameras to switchers and other studio gear, and longer runs using optical connections. (The SNR hurdle still has to be cleared for long coaxial cable runs.)

     

    Second, it’s becoming clear that some degree of light compression is going to be a way of life with 4K and 8K production, especially when you factor in the additional bits for HDR and high-frame-rate video (of which there was plenty on display in Vegas). You think 48 Gb/s is fast? Try moving 8K/120 video around: Both NHK and NTT were showing exactly that, with corresponding data rates of about 96 Gb/s. Definitely “funny car” territory.

     

    Third, we’re still a long way from resolving the SDI vs. IP argument. Indeed, some recent studio projects we’re aware of have been built using both SDI and IP architectures to move data and video on separate paths. While offerings like cloud storage will require the network hookup, point-to-point 1080p video can still travel happily over SDI connections. (It should also be pointed out that, for all the ballyhoo about 4K, very little in the way of 4K video production is being undertaken at present.)

     

    NAB 2018 reflected all of this thinking, and then some. It’s almost as if everyone else at the show was waiting for the other guy to take the first step. SDI, or IP? Or both? Come on, doggone it, make up your mind…

  • Tired Of 4K TV Yet? Here Comes 8K TV…

    Yes, you read that right: 8K displays are coming. For that matter, experimental 8K broadcasting has already been underway in Japan since 2012, and several companies are developing 8K video cameras to be shown at next month’s NAB show in Las Vegas.

    “Hold on a minute!” you’re probably thinking. “I don’t even own a 4K TV yet. And now they’re already on the endangered species list?”

    Well, not exactly. But two recent press releases show just how crazy the world of display technology has become.

    The first release came from Insight Media in February and stated that, “The 2020 Tokyo Olympics will be a major driver in the development of 8K infrastructure with Japanese broadcaster NHK leading efforts to produce and broadcast Olympic programming to homes…cameras from Hitachi, Astrodesign, Ikegami, Sharp and Sony address the many challenges in capturing 8K video…the display industry plans for massive expansion of Gen 10.5 capacity, which will enable efficient production of 65" and 75" display panels for both LCD and OLED TV…. sales of 8K Flat Panel TVs are expected to increase from 0.1 million in 2018 to 5.8 million in 2022, with China leading the way representing more than 60% of the total market during this period.”

    You read that right. Almost 6 million 8K LCD and OLED TVs are expected to be sold four years from now, and over 3 million of those sales will be in China.

    But there’s more. Analyst firm IHS Markit issued their own forecasts for 8K TV earlier this month, predicting that, “While ultra-high definition (UHD) panels are estimated to account for more than 98 percent of the 60-inch and larger display market in 2017, most TV panel suppliers are planning to mass produce 8K displays in 2018. The 7680 x 4320-pixel resolution display is expected to make up about 1 percent of the 60-inch and larger display market this year and 9 percent in 2020.”

    According to IHS Markit, companies with skin in the 8K game include Innolux, which will supply 65-inch LCD panels to Sharp for use in consumer televisions and in commercial AV displays. Meanwhile, Sharp – which had previously shown an 85-inch 8K TV prototype – will ramp up production of a new 70-inch 8K LCD display (LV-70X500E) in their Sakai Gen 10 LCD plant. This display was shown in Sharp’s booth at ISE, along with their new 8K video camera.

    Sony and Samsung are also expected to launch 8K LCD TVs this year. Both companies showed prototypes at CES with Samsung’s offering measuring about 85 inches. Sony’s prototype also measured 85 inches but included micro light-emitting diodes (LEDs) in the backlight to achieve what Sony described as “full high dynamic range,” achieving peak (specular) brightness of 10,000 nits. (That’ll give you a pretty good sunburn!)

    Other players in 8K include LG Display, who already announced an 88-inch 8K OLED TV prior to CES, and panel fabricators BOE, AUO, and China Electronics Corporation (CEC). What’s even more interesting is that some of these 8K LCD and OLED panels will be equipped with indium gallium zinc oxide (IGZO) switching transistors.

    No, IGZO isn’t a cure for aging. But what it does is provide much higher pixel density in a given screen size with lower power consumption. More importantly, it will allow these 8K TVs to refresh their pictures as fast as 120 Hz – double the normal refresh rate we use today. And that will be important as High Frame Rate (HFR) video production ramps up.

    Predictably, prices for TVs and monitors using panels with 4K resolution are collapsing. In the AV channel, 4K (Ultra HD) displays are only beginning to show up in product lines, but manufacturers are well aware of pricing trends with Ultra HD vs. Full HD (1920x1080p). With some consumer models now selling for as little as $8 per diagonal inch, the move from Full HD to 4K / Ultra HD will pick up lots of steam.

    And with 8K displays now becoming a ‘premium’ product, 4K / Ultra HD will be the ‘everyday’ or mainstream display offering in screen sizes as small as 40 inches and as large as – well, you name it. We’ve already seen 84-inch, 88-inch, and 98-inch commercial displays, and prototypes as large as 120 inches – yes, 10’ of diagonal screen, wrap your head around that – have been exhibited at CES and other shows.

    We saw quite a few demonstrations of 4K commercial displays at ISE and expect to see a whole lot more at InfoComm in June, along with the inevitable price wars. And there will be the usual “my encoder handles 4K better than yours with less latency” battles, shoot-outs, and arguments. But that could ultimately turn out to be the appetizer in this full-course meal.

    For companies manufacturing signal distribution and switching equipment, 4K / Ultra HD already presents us with a full plate. 8K would be too much to bite off at present! Consider that an 8K/60 video signal using 12-bit RGB color requires a data rate approaching 100 gigabits per second (Gb/s), as compared to a 12-bit, 60 Hz Full HD signal’s rate of about 6 Gb/s, and you can see we will have some pretty steep hills to climb to manage 8K.

    Distributing 8K over a network will be equally challenging and will require switching speeds somewhere north of 40 Gb/s even for a basic form of 8K video, which (we assume) will also incorporate high dynamic range and wide color gamuts. 40 Gb/s switches do exist but are pricey and would require 8K signals to be compressed by at least 25% to be manageable. And they’d certainly use optical fiber for all their connections.
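
    Here’s the same style of sanity check for the network case, assuming the “basic” 8K/60 signal is the roughly 48 Gb/s 8-bit 4:2:2 flavor we’ve used in earlier examples (a rough sketch, not a switch specification):

        # How much do we have to squeeze a "basic" 8K/60 signal to cross a 40 Gb/s switch?
        SOURCE_GBPS = 48          # ~8K/60 with 8-bit 4:2:2 color
        SWITCH_GBPS = 40

        reduction_needed = 1 - SWITCH_GBPS / SOURCE_GBPS
        print(f"At least {reduction_needed:.0%} reduction needed")        # ~17%, so 25% leaves headroom
        print(f"After 25% compression: {SOURCE_GBPS * 0.75:.0f} Gb/s")    # 36 Gb/s, under the 40 Gb/s limit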

    To summarize, 4K / Ultra HD isn’t on the endangered species list just yet. (You can still buy Full HD monitors and TVs, if that’s any comfort.) And the pace of change in display technology is so rapid nowadays that you can’t be blamed if you feel like Rip Van Winkle sometimes!

    But whether it makes sense or not – or whether we’re ready or not – it’s “full speed ahead” for 8K displays as we head into the third decade of the 21st century…

  • CES 2018: Get Ready For The Next Wave Of Displays

    You could be forgiven if you wondered where all of the televisions disappeared to at this year’s CES. Ten years ago, the walls of booths occupied by the likes of Sharp, Sony, Panasonic, Samsung, and LG were stuffed full of LCD and plasma televisions. This was the flagship product for all of these companies and a big part of their sales.

    But this year? A very different look. With the continued emphasis on “connected everything,” TVs moved to the background as “connected solutions” for home and office grabbed center stage. And there’s a good reason why: Display panels are inexpensive to manufacture now and the TVs they wind up in have dropped dramatically in price.

    A quick check at online pre-Super Bowl TV sales showed that you can pick up a first-tier 55-inch 4K (Ultra HD) TV with “smart” functionality for about $500, spending about $100 less for a 2nd-tier brand. Want high dynamic range? Add around $300-$400 to the price. And we’re talking about Ultra HDTVs here, not Full HD sets that can be had in the same screen size for as little as $399.

    You can attribute this collapse in TV prices to large-scale manufacturing in China, where both raw material and labor costs are much lower than in older industrialized countries. Robotics (another big thing at CES) also play a part: The most up-to-date display panel fabrication lines in Asia may sit in multistory buildings, but they only require about 15 to 20 people to monitor and control everything.

    Lower production costs and increasing use of robotics have made it possible to jump to 8K (7680x4320) display resolution. Indeed, many pundits are predicting that 8K displays will replace 4K in a very short time period. But that’s a fanciful prediction at best, given that there is no commercially produced 8K video and movie content and the storage required for such content would amount to 16 times that needed for plain old Full HD (1920x1080).

    Still, large flat screen displays continue to push projectors out of the market. More AV installations are now using large LCD screens, some with 4K resolution. At the high end, light-emitting diode (LED) displays are now preferred for large indoor and outdoor electronic signs. They’re intensely bright, pushing out 3,000, 4,000, and 5,000 nits over wide viewing angles and creating images that hold up well even under full daylight.

    But now there’s a wild card, and that’s the micro LED display. Most commercial LED displays have a dot (pixel) pitch of 4-6 mm for outdoor use. In recent years, fine pitch LED displays have dropped down below 2 mm with some videowalls touting 1.8, 1.6, 1.2, and even .9mm pitches. (For some perspective, a 50-inch plasma monitor from 1999 had about a 1.2mm dot pitch and 1366x768 resolution.)

    The micro LED takes that a step further with dot pitches much smaller than 1 mm. Take that same 50-inch TV and stuff it full of 4K pixels (3840x2160), and you’ll see that a dot pitch of about .3mm is required for each pixel. (8K resolution would drop that in half again to .15mm.) It’s easy nowadays to form LCD and OLED pixels that small, but micro LEDs are a bit trickier.
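
    The pitch arithmetic is easy to reproduce for any screen size. A minimal sketch, assuming a 16:9 aspect ratio:

        # Required pixel (dot) pitch for a given diagonal and horizontal resolution, 16:9 assumed.
        def pitch_mm(diagonal_inches, horizontal_pixels):
            width_inches = diagonal_inches * 16 / (16**2 + 9**2) ** 0.5
            return width_inches * 25.4 / horizontal_pixels

        print(f"50-inch 4K: {pitch_mm(50, 3840):.2f} mm per pixel")   # about 0.29 mm
        print(f"50-inch 8K: {pitch_mm(50, 7680):.2f} mm per pixel")   # about 0.14 mm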

    Nevertheless, Samsung showed a 146-inch diagonal micro LED display with 8K resolution. Because the display uses LEDs exclusively, it’s very bright (over 2,000 nits for specular highlights) and has a wide viewing angle. This concept display was also able to show high dynamic range (HDR) video and a much wider color gamut than we usually see. Since LEDs can pulse on and off at very high speeds, this type of display is perfect for the next big revolution in imaging – high frame rate video.

    We’ve seen micro LED technology before. Sony exhibited a hand-wired TV full of micro LEDs about 6-7 years ago at CES, and conservative estimates were that it probably cost in excess of $100,000 to make. Samsung’s model surely came in a lot lower than that, and for one big reason: It’s modular. The final product is actually made up of several smaller LED tiles, which is quite a revolutionary approach to building a TV.

    Here’s what we find interesting: It may actually catch on. Tiling is a familiar concept to those in the AV and staging markets who routinely put together large displays for temporary or permanent installations. The thinking at CES is, “why not do this with televisions?” In essence, you could decide just how big a display you’d want in your home or office and then order up the correct number of tiles. Stack them together, connect all of the driver cables, and away you go.

    Okay, maybe it won’t be that simple. But building TVs out of super-thin tiles could represent a significant manufacturing revolution, just as flat screen displays kicked tube TVs out of the market 15 years ago. What we don’t know is the final pixel resolution of those tiled TVs and how we’ll interface signals to them. Newer and faster versions of HDMI and DisplayPort may be the answer. Or perhaps it will require an entirely different method of signal transport.
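    To see why interfacing is the open question, consider a rough back-of-the-envelope data rate for an uncompressed 8K/60 signal. This sketch counts active pixels only (no blanking intervals or link overhead), so a real transport link would need to carry even more:

    ```python
    # Uncompressed 8K/60 data rate, active pixels only (blanking and
    # link overhead excluded, so real link rates would be higher).
    h_pixels, v_pixels = 7680, 4320
    frame_rate = 60          # Hz
    bits_per_pixel = 3 * 10  # 10-bit RGB

    gbps = h_pixels * v_pixels * frame_rate * bits_per_pixel / 1e9
    print(round(gbps, 1))    # ~59.7 Gb/s -- well past HDMI 2.0's 18 Gb/s,
                             # and above HDMI 2.1's 48 Gb/s without compression
    ```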

    Stay tuned!

  • CES 2018: Tying It All Together

    A few weeks ago, the annual Consumer Electronics Show took place in Las Vegas. For the first time, the majority of exhibits emphasized applications over hardware, or to put it another way, “it’s not what you have, it’s what you do with it.”

    And that shouldn’t be a surprise at all. Prices of commercial and consumer gear have been steadily declining over the last decade, to the point where much of that gear is now considered consumable and disposable. Buy it, use it, and replace it over ever-shorter product life cycles.

    Attendees wandering through the LVCC couldn’t help but pick up on the “connected” vibe: Connecting and controlling everything with voice commands is all the buzz nowadays. So are faster WiFi and 5G cellular, along with smart, connected appliances and smart, connected cars. To make things even more interesting, Amazon and Google voice recognition systems were found on everything from televisions to cars.

    Speech recognition and control have come a long way since we first saw them implemented at the turn of this decade by companies like Conexant. The technology works, it’s cheap, and you can use it to control just about everything in your home that’s tied to a network, so it’s not unreasonable to assume voice recognition could also be used to operate everything in an AV installation.
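    To make that idea a little more concrete, here’s a hypothetical sketch of how a voice assistant’s recognized intent might be routed to devices in a small AV room. The intent names, device names, and commands below are invented for illustration only; they don’t reflect any particular assistant’s or manufacturer’s API:

    ```python
    # Hypothetical sketch: dispatching a recognized voice intent to AV devices.
    # All names and commands are illustrative, not a real control API.
    from dataclasses import dataclass

    @dataclass
    class Device:
        name: str

        def send(self, command: str) -> None:
            # In a real system this would talk to the device over IP, RS-232, etc.
            print(f"{self.name}: {command}")

    DEVICES = {
        "projector": Device("Boardroom projector"),
        "screen":    Device("Motorized screen"),
        "lights":    Device("Room lighting"),
    }

    # Each intent maps to an ordered list of (device, command) steps.
    INTENT_ACTIONS = {
        "start_presentation": [("screen", "lower"), ("projector", "power on"), ("lights", "dim to 30%")],
        "end_presentation":   [("projector", "power off"), ("screen", "raise"), ("lights", "full on")],
    }

    def handle_intent(intent: str) -> None:
        """Send the commands associated with a recognized voice intent."""
        for device_key, command in INTENT_ACTIONS.get(intent, []):
            DEVICES[device_key].send(command)

    handle_intent("start_presentation")
    ```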

    This isn’t fantasy. Every TV manufacturer had at least one model at CES that supported Amazon or Google Assistant. (Some models support both systems.) You can link your TV to your refrigerator, washer, dryer, and other appliances in your home and control just about anything or get status updates. Or you can just ask your assistant general questions, and depending on the question, the system can anticipate what you’re about to do and activate or deactivate devices.

    LG has this feature in their 2018 TVs (ThinQ with Cloi), while Samsung claims that every product they make will be interconnected by 2020 and voice controlled using their Bixby system. While the Chinese brands are not quite up to that level, they did show sample rooms with interconnected devices that all respond to voice prompts.

    In addition, Samsung’s purchase of Harman in 2016 gives them entry to the multi-billion-dollar car audio market. And by extension, they can support voice recognition and control in cars, linking them back to homes and offices. On the TV side, both TiVo and Comcast have had voice control and search features for some time, using adaptive intelligence to hunt down and locate programs.

    Examples were shown of voice commands through an LG OLED TV to (a) adjust room lighting, (b) adjust room temperature and humidity, (c) check where the washer and/or dryer cycles stand, (d) check to see what’s in the refrigerator and suggest a recipe for the food that was found, and (e) ultimately order takeout food from a restaurant.

    Another key part of this voice-centered control system is machine learning. As implemented in televisions, these systems can anticipate which programs you’re likely to watch. As part of a wider control system, they can remember what room temperatures you prefer, when you cycle room lights, and what combinations of lighting/temperature/humidity you like when you retire for the night. Needless to say, the system can also activate alarms and outside lighting.

    Samsung also showed an advanced in-door wide-angle camera that lets you see who’s ringing the doorbell. Not exactly a new concept, but it can be linked to your TV or to a display built into your kitchen appliances. (Yes, that’s becoming a thing now, with a large LCD screen that serves as a hub for everything from your daily schedule to recalling recipes from cloud storage.)

    Another example of machine learning was discussed at the Panasonic press conference. Their big thing is “smart cities” (also a themed area in the Westgate Convention Center), wherein everything is connected – your home, your car, the highways, you name it. Panasonic talked about getting into your car and driving to Starbucks (either with you driving or an autonomous system), with the car automatically calling ahead to order your favorite beverage.

    If it’s this easy to implement in the consumer space, why aren’t we doing more of it in the professional AV world? Kramer Control already uses an icon-oriented, drag-and-drop method of building a control system, using cloud-based drivers. All of the control systems for home use that were shown at CES work the same way. The big question is, which voice recognition system will be paired up with this next generation of control systems?

    A big concern that comes to mind is security. It sounds like a great idea to command the function of every piece of hardware in a building, but if all of that gear is interconnected through Ethernet or WiFi, then it’s open to hacking from the outside world. Google’s Nest thermostat was hacked a couple of years ago, so it’s reasonable to assume anything from a TV to a projector, lighting control system, or HVAC system could be at risk.

    Samsung announced at CES that every product they make will be connected by 2020, largely using 5G cellular networks. No doubt companies like LG, Sony, Panasonic, and the Chinese brands will follow suit. After all, it’s what consumers want – right? (At least, that’s what we heard all week long in Las Vegas…)
