Publication: Research - peer-reviewed › Journal article – Annual report year: 2012
Local backlight dimming in Liquid Crystal Displays (LCD) is a technique for reducing power consumption while simultaneously increasing contrast ratio, to provide a High Dynamic Range (HDR) image reproduction. Several backlight dimming algorithms focus on reducing power consumption, while others aim at enhancing contrast, with power savings as a side effect. In our earlier work, we modeled backlight dimming as a linear programming problem, where the target is to minimize a cost function measuring the distance between the ideal and actual output. In this paper, we propose a variant of that algorithm which speeds up execution by decreasing the number of input variables. This is done by using a subset of the input pixels, selected from those experiencing leakage or clipping distortions; the optimization problem is then solved on this subset. Sample reduction can also be beneficial in conjunction with other approaches, such as an algorithm based on gradient descent, also presented here. All the proposals have been compared against other known approaches on simulated edge- and direct-lit displays, and the results show that the optimal distortion level can be reached using a subset of pixels, with significantly reduced computational load compared to the optimal algorithm operating on the full image.
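The linear-programming formulation described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the light-spread matrix `H`, the segment/pixel counts, and the slack-variable cost are all assumptions for demonstration. Each backlight segment contributes light to each pixel via `H`; a per-pixel slack variable measures how far the achievable luminance falls short of the target (clipping distortion), and the LP minimizes the total slack:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_seg, n_pix = 4, 50                                # hypothetical display sizes
H = np.abs(rng.normal(0.5, 0.2, (n_pix, n_seg)))    # assumed light-spread matrix
target = rng.uniform(0.0, 1.0, n_pix)               # desired pixel luminance

# Decision variables: [b_1..b_nseg, s_1..s_npix].
# Backlight values b cost nothing here; we minimize the clipping slacks s.
c = np.concatenate([np.zeros(n_seg), np.ones(n_pix)])

# Constraint per pixel: H @ b + s >= target, rewritten as -(H @ b) - s <= -target.
A_ub = np.hstack([-H, -np.eye(n_pix)])
b_ub = -target

# Backlight segments are limited to [0, 1]; slacks are nonnegative.
bounds = [(0.0, 1.0)] * n_seg + [(0.0, None)] * n_pix

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
backlight = res.x[:n_seg]                           # optimal segment intensities
```

The sample-reduction idea from the paper would correspond to building `A_ub` only from the rows (pixels) likely to suffer clipping or leakage, shrinking the LP before solving it.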
Journal: Proceedings of SPIE, the International Society for Optical Engineering
Conference: SPIE Photonics Europe: Optics, Photonics, and Digital Technologies for Multimedia Applications II
Period: 16-04-12 → 19-05-12
Citations: Web of Science® Times Cited: 0
Keywords: Local backlight dimming, Liquid crystal display, Light emitting diode backlight, Linear programming, Optimization, High dynamic range display, Gradient descent