Apical

Dynamic range has been a problem ever since the first attempts to record a piece of reality. What we see and hear live is still beyond what most capture methods can handle. I remember the years of the analog compact cassette: my old tape recorder could achieve a signal-to-noise ratio of 53 decibels. When I adjusted the recording level so that the loudest parts of a song reached the maximum, the quiet passages disappeared into random noise. There was circuitry called DNR (dynamic noise reduction), essentially compressing the input signal to fit in that 50-something-decibel range. Then came CDs, with digital audio delivering a dynamic range of 90 decibels. While great on paper and fantastic to listen to in a quiet, isolated room, 90 decibels is very difficult to reproduce in headphones on a train, or when watching a movie at home. That is why many recent audio systems offer dynamic range compression: the quiet parts are amplified and the louder parts attenuated.
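The idea behind that kind of compression is simple enough to sketch in a few lines of Python. This is a deliberately naive illustration - a real compressor tracks the signal envelope with attack and release times, and all the parameter values here are just assumptions for the example:

    import numpy as np

    def compress(signal, threshold_db=-20.0, ratio=4.0):
        # Naive sample-by-sample compressor: anything above the threshold is
        # attenuated, squeezing loud and quiet parts into a narrower range.
        # `signal` is a float array in [-1.0, 1.0].
        eps = 1e-12
        level_db = 20.0 * np.log10(np.abs(signal) + eps)    # per-sample level in dB
        over_db = np.maximum(level_db - threshold_db, 0.0)  # excess above threshold
        gain_db = -over_db * (1.0 - 1.0 / ratio)            # 4:1 compression above it
        return signal * 10.0 ** (gain_db / 20.0)

    # A loud tone followed by a quiet one: after compression the level gap
    # between them shrinks from roughly 25 dB to roughly 11 dB.
    t = np.linspace(0.0, 1.0, 8000)
    audio = np.concatenate([0.9 * np.sin(2 * np.pi * 440 * t),
                            0.05 * np.sin(2 * np.pi * 440 * t)])
    squeezed = compress(audio)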

The same thing applies to images. Before digital photography we had analog negative film (producing prints) and chrome film (producing slides). A projected slide could show a much wider dynamic range than a reflective print, and chrome was the choice of professional photographers. The shift to digital improved almost all aspects of image recording compared to analog film. All but the dynamic range. Digital cameras, chasing each other in the stupid megapixel race, fall short in capturing dynamic range: shadows come out uniformly black, while highlights are just white. This is why I am so happy with the FujiFilm S5 Pro I have written about here a number of times.

But even if we manage to capture a high dynamic range, the problem hits us again when playing the captured material back. Most displays simply cannot reproduce the entire range. Take a home cinema projector as an example. It projects the picture onto a white screen. Yes, the screen is white. And that white screen, when not directly lit by the projector, is what passes for the black of the projected content, while the same screen, lit, represents white. So the actual dynamic range spans from non-lit white to lit white... The screen itself makes replaying the captured reality a challenge.
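A back-of-the-envelope calculation shows how badly ambient light eats into that range. All the numbers below are illustrative assumptions, not measurements of any particular projector or room:

    # Effective contrast collapses once ambient light hits the screen.
    projector_white = 500.0   # luminance of a lit (white) patch, cd/m^2
    projector_black = 0.5     # residual leakage on a "black" patch, cd/m^2
    ambient = 50.0            # ambient light reflected by the screen, cd/m^2

    native = projector_white / projector_black
    effective = (projector_white + ambient) / (projector_black + ambient)

    print(f"dark room {native:.0f}:1, lit room {effective:.0f}:1")
    # dark room 1000:1, lit room 11:1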

Similar problems apply to mobile devices with LCD screens. These screens are backlit: a set of LEDs sitting behind the LCD panel generates light that either passes through (white) or is blocked (black). Fine in a dark room. But in sunshine? Black is no longer black, and we lose visibility and contrast. One approach is to increase the output of the LED light source: a brighter display is more visible in sunlight. But, as you would expect, it also consumes much more energy, and one of today's scarcest resources is battery life.
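To see how expensive the brute-force approach gets, here is another rough sketch. The formula and the numbers are my own simplifying assumptions, not data from any real panel:

    def required_white(ambient, contrast, leak=0.001):
        # Backlight white level (cd/m^2) needed so that
        # (white + ambient) / (white * leak + ambient) == contrast,
        # where `leak` is the fraction of backlight passing through
        # "black" pixels. Derived by solving the equation for `white`.
        assert contrast * leak < 1.0, "target contrast unreachable"
        return ambient * (contrast - 1.0) / (1.0 - contrast * leak)

    # Keeping a 100:1 contrast: the needed backlight (and hence, roughly,
    # the LED power) grows linearly with ambient light.
    for ambient in (1.0, 10.0, 100.0):      # dim room ... bright daylight
        print(ambient, required_white(ambient, contrast=100.0))
    # prints 110, 1100, 11000 cd/m^2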

Another approach is to intelligently alter the dynamic range of the displayed content. Apical is an "algorithm" company specializing in exactly such clever image transformations. Their IRIDIX engine is responsible for just that - optimizing the picture to be viewable under the constraints of ambient light and display properties. I am not going to dive into the details; there is more on their website. What matters in the end is that we get a better picture, actually a much better one, without stretching the limits of our eyes or the device's power budget. Apical's IRIDIX can be found in a lot of well-known photography equipment. Sophisticated as the algorithm is, applying it to still images is relatively easy. The problem gets much harder when a video stream has to be adjusted by IRIDIX: the algorithm has to run on every frame, and looking at the power balance, what we gain by lowering the backlight we lose by keeping the application processor fully utilized (and this kind of processing requires a lot of computing power).
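To be clear, IRIDIX itself is proprietary and far more sophisticated than anything I could show here (among other things, it adapts locally within the image). But even a toy global tone curve, sketched below in Python, shows the shape of the computation - and why running it on every frame keeps a processor busy:

    import numpy as np

    def tone_map(frame, ambient_boost=0.5):
        # Toy global tone curve: an adaptive gamma that lifts shadows more
        # as ambient light grows (gamma < 1 brightens dark regions while
        # leaving white at white). `frame` is a float image in [0, 1].
        gamma = 1.0 / (1.0 + ambient_boost)
        return np.clip(frame, 0.0, 1.0) ** gamma

    def process_stream(frames, read_ambient_sensor):
        # Every single frame must be transformed - at 30 frames per second
        # this is exactly what keeps an application processor busy.
        for frame in frames:
            yield tone_map(frame, ambient_boost=read_ambient_sensor())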

But recently a solution to this very problem has appeared - the problem of playing high dynamic range video on a portable LCD screen in extreme lighting conditions. Apical's algorithms can handle that. And we have a company called QuickLogic, mentioned here a number of times. QuickLogic has signed an exclusive agreement with Apical to implement the image enhancing algorithms on the low-power gate arrays they make. The resulting solution is called VEE (Visual Enhancement Engine): a tiny chip sitting between the graphics controller and the display driver. The VEE chip tweaks the video stream in real time, frame by frame, optimizing it to match the current lighting environment and making the output much more visible. The difference between VEE and non-VEE screens is actually dramatic. The more difficult the conditions (outdoor viewing, sunshine), the bigger the difference VEE makes. And at the same time the overall power consumption drops, thanks to the reduced backlight. A rare win-win combination, and potentially a winner for QuickLogic. So next time you shop for a cell phone, a smartphone or a personal media player, check if it has an Apical logo... It will certainly have a VEE chip inside.
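The win-win is easy to see in a rough power budget. Every figure below is a hypothetical placeholder - I have no measured numbers for the VEE chip or any particular display - but the structure of the trade-off is the point:

    # All values in milliwatts - purely hypothetical, for illustration only.
    backlight_full = 400.0    # full backlight, no picture enhancement
    backlight_dimmed = 240.0  # backlight after enhancement allows dimming
    app_processor = 300.0     # per-frame processing done in software
    vee_chip = 10.0           # dedicated low-power logic doing the same job

    no_enhancement = backlight_full                # bright but washed out
    software = backlight_dimmed + app_processor    # nice picture, net loss
    hardware = backlight_dimmed + vee_chip         # nice picture, net gain
    print(no_enhancement, software, hardware)      # 400.0 540.0 250.0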
