A research team led by Professor Su Seok Choi and Ph.D. student Inpyo Hong from the Department of Electrical Engineering at Pohang University of Science and Technology (POSTECH) has developed the world’s first Pixel-Based Local Sound OLED Technology, fundamentally transforming the way displays and audio are integrated. This groundbreaking technology enables each pixel in an OLED panel to function as an independent sound emitter, turning the display itself into a multichannel speaker array and eliminating the need for external audio devices. The team demonstrated the technology on a 13-inch OLED panel (a size suited to laptops and tablets), and the results have been published in the international journal Advanced Science.
As display technology continues to advance, the industry is no longer focused solely on higher resolution, color accuracy, and dynamic range, but is increasingly emphasizing multisensory integration—particularly audio-visual synchronization, which studies show accounts for up to 90% of users’ sense of immersion. However, traditional systems require bulky external speakers or multiple built-in speaker units, which increase device size and pose integration challenges in confined spaces, such as inside vehicles.
The POSTECH team’s solution embeds ultra-thin piezoelectric actuators into the OLED panel in place of conventional bulky vibration units. These pixel-aligned piezoelectric elements convert electrical signals into precise sound vibrations while fitting within the slim structure of the OLED panel. The design also addresses the industry’s long-standing problem of sound crosstalk, the interference between adjacent audio sources, so that each pixel can emit sound while remaining spatially separated from its neighbors, achieving unprecedented precision in sound localization.
▲ Figure a illustrates the sound crosstalk problem in conventional designs; Figure b shows the improvement achieved with the new technology; Figure c shows the team’s demonstration on a 13-inch OLED panel.
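To make the idea of pixel-level local sound more concrete, the minimal Python sketch below models the panel as a grid of independently driven emitter zones, each carrying its own signal with no mixing between zones. The grid size, sample rate, and test tones are illustrative assumptions for this sketch, not parameters reported by the POSTECH team.

```python
import numpy as np

# Conceptual sketch only: the panel is modeled as a grid of independent
# sound-emitting zones, each driven by its own waveform. Zone counts,
# sample rate, and signal contents are illustrative assumptions.

SAMPLE_RATE = 48_000          # Hz, assumed audio sample rate
DURATION = 0.5                # seconds of audio per zone
ROWS, COLS = 4, 6             # hypothetical grid of emitter zones on a 13-inch panel

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

def tone(freq_hz: float) -> np.ndarray:
    """Generate a pure tone as a stand-in for arbitrary per-zone audio."""
    return 0.2 * np.sin(2.0 * np.pi * freq_hz * t)

# Each zone gets its own waveform; because zones are driven independently,
# a listener positioned over one zone hears only that zone's signal
# (idealized case: no acoustic crosstalk between neighboring zones).
zone_signals = np.zeros((ROWS, COLS, t.size))
for r in range(ROWS):
    for c in range(COLS):
        zone_signals[r, c] = tone(300.0 + 100.0 * (r * COLS + c))

# Sanity check: adjacent zones carry distinct, locally addressable signals.
assert not np.allclose(zone_signals[0, 0], zone_signals[0, 1])
print(f"Driving {ROWS * COLS} independent zones, "
      f"{t.size} samples each at {SAMPLE_RATE} Hz")
```

The point of the sketch is simply that each zone is addressed separately, which is what distinguishes a pixel-based speaker array from a single vibrating panel.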
This technology holds broad application potential. For instance, it enables zoned audio in vehicles, where drivers can hear navigation instructions while passengers enjoy music; VR/AR devices can dynamically adjust sound direction based on head movements; medical displays can emit directional alerts; and smart home interfaces can deliver spatially separated notification sounds. Most importantly, all of this is achieved without compromising the slim and lightweight characteristics of OLED panels.
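As one illustration of the VR/AR use case above, the short sketch below maps a virtual sound source to the nearest emitter column as the user's head turns, so the apparent sound direction follows head movement. The panel field of view and column count are hypothetical values chosen only for demonstration, not figures from the published work.

```python
# Conceptual sketch only: pick which column of a hypothetical emitter grid
# should carry a sound, given the virtual source direction and the user's
# head yaw (e.g., from a VR headset's IMU).

PANEL_COLUMNS = 6          # hypothetical number of emitter columns
HORIZONTAL_FOV_DEG = 90.0  # assumed horizontal field of view covered by the panel

def column_for_source(source_azimuth_deg: float, head_yaw_deg: float) -> int:
    """Map a virtual sound source to the emitter column nearest its apparent direction."""
    # Direction of the source relative to where the user is currently looking.
    relative_deg = source_azimuth_deg - head_yaw_deg
    # Clamp to the panel's field of view, then scale to a column index.
    half_fov = HORIZONTAL_FOV_DEG / 2.0
    clamped = max(-half_fov, min(half_fov, relative_deg))
    fraction = (clamped + half_fov) / HORIZONTAL_FOV_DEG   # 0.0 (left) .. 1.0 (right)
    return min(PANEL_COLUMNS - 1, int(fraction * PANEL_COLUMNS))

# A source fixed 30 degrees to the user's right: as the head turns toward it,
# the sound migrates across the panel's emitter columns.
for yaw in (0.0, 15.0, 30.0):
    print(f"head yaw {yaw:5.1f} deg -> emitter column {column_for_source(30.0, yaw)}")
```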
Professor Su Seok Choi stated, “Displays are evolving from purely visual output devices into fully sensory interfaces that combine sight and sound. This technology will become a core element of next-generation electronics, enabling thinner devices while delivering immersive, high-fidelity audio.”
Funded by South Korea’s Ministry of Trade, Industry and Energy, the research integrates ultra-thin piezoelectric sound emission technology with high-resolution OLED displays. It opens new design possibilities for smartphones, laptops, in-vehicle displays, and VR devices, reshaping multisensory human-machine interaction and marking a new milestone in synchronized audiovisual experiences.