We tend to think of sound as omnidirectional, and that’s because it normally is. If you drop a glass on the floor, the sound waves from the shattering are emitted in all directions. But it doesn’t have to be that way; sound is, after all, just waves vibrating through the air. Your stereo, for example, sounds very different if you stand behind the speakers than if you stand in front of them. That directionality can be narrowed and focused much further to create a “sound projector,” and researchers from the University of Sussex in England have developed the first sound projector that can track individuals.
The concept of sound projection has been around for a long time, and essentially relies on emitting a narrow beam of sound waves at a target. Anyone outside of that beam is unable to hear the sound. The audible region can be shrunk even further, down to a small pocket, by aiming two beams at a single point. Neither beam carries the complete sound “picture” on its own; instead, the waves modify each other where they intersect, creating a small space where the audio can be heard clearly. It’s also possible to steer those sound waves using techniques similar to the optical methods that have long been used with light.
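The two-beam trick is ultimately just geometry: each emitter defines a line, and the audible pocket sits where the two lines cross. As a rough illustration (not taken from the researchers’ work; the emitter positions and angles here are made up), solving for that crossing point in 2D looks like this:

```python
import numpy as np

def beam_intersection(o1, d1, o2, d2):
    """Find where two beam center-lines cross in 2-D.

    Each beam is modeled as a ray: origin o + t * direction d.
    Solving o1 + t1*d1 = o2 + t2*d2 gives the intersection point.
    """
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2), [float] * 4)
    # Rearranged as a 2x2 linear system: t1*d1 - t2*d2 = o2 - o1
    A = np.column_stack([d1, -d2])
    t1, _t2 = np.linalg.solve(A, o2 - o1)
    return o1 + t1 * d1

# Two emitters a metre apart, both angled inward toward a listener
# standing two metres in front of them (all values illustrative).
pocket = beam_intersection([-0.5, 0.0], [0.5, 2.0],
                           [0.5, 0.0], [-0.5, 2.0])
print(pocket)  # [0. 2.] -- the pocket forms 2 m straight ahead
```

In the real system, that intersection point would be swept around to follow the listener; here it is just computed once for fixed emitters.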
What makes this research interesting is how the sound projection is being directed. An inexpensive commercial webcam is used for facial tracking in order to target a specific individual. An Arduino-controlled acoustic telescope is then aimed at that person. That telescope not only points the sound waves at the person’s head, but also focuses them just like you would focus the lens on a camera. That makes it possible to create a small pocket around a person’s head with audio that only they can hear, which opens up all kinds of interesting possibilities for entertainment and even personal alerts.
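The paper doesn’t spell out the exact math linking the webcam to the telescope mount, but the usual approach is a pinhole-camera model: take the face’s pixel position from the tracker and convert the offset from frame centre into pan and tilt angles for the aiming hardware. A minimal sketch of that conversion, with an assumed 640×480 frame and made-up field-of-view values (the function name and parameters are hypothetical, not from the research):

```python
import math

def face_to_pan_tilt(cx, cy, frame_w=640, frame_h=480,
                     hfov_deg=60.0, vfov_deg=45.0):
    """Convert a detected face centre (pixels) into pan/tilt angles
    (degrees) for an aiming mount aligned with the camera's axis.

    Pinhole model: angle = atan(pixel offset / focal length in pixels),
    where the focal length follows from the field of view.
    """
    fx = (frame_w / 2) / math.tan(math.radians(hfov_deg / 2))
    fy = (frame_h / 2) / math.tan(math.radians(vfov_deg / 2))
    pan = math.degrees(math.atan((cx - frame_w / 2) / fx))   # + = right
    tilt = math.degrees(math.atan((frame_h / 2 - cy) / fy))  # + = up
    return pan, tilt

# A face detected dead centre needs no correction
print(face_to_pan_tilt(320, 240))  # (0.0, 0.0)
```

The face centre itself would come from any off-the-shelf detector (OpenCV’s bundled face detectors are a common choice with cheap webcams), and the resulting angles would be sent to the Arduino driving the mount.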