When researchers at NTT Corporation rotated an ultrasound focal point at just five cycles per second, something unexpected happened: the force people felt on their skin multiplied twentyfold. This discovery, announced in May 2025, cracked open a problem that had stalled mid-air haptics for years—how to make invisible touch feel real.
The Weakness Problem
Ultrasound haptic systems work by aiming phased arrays of transducers at a point in mid-air, creating acoustic radiation force that pushes against skin. The physics are elegant: adjust the phase of each 40 kHz transducer so that all the waves arrive in phase at a focal point, and you can create a tactile sensation without any device touching the user. The catch? Conventional focal points generate only about 0.01 newtons of force—roughly the pressure of a feather brushing your palm.
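The focusing step can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual firmware: the 4×4 array geometry, the speed of sound, and the `phase_delays` helper are all assumptions made for the sketch. Each transducer is driven with a phase offset that cancels the phase it will accumulate travelling to the focal point, so all waves arrive aligned.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C (assumed)
FREQ = 40_000.0          # 40 kHz carrier, as in the article
WAVELENGTH = SPEED_OF_SOUND / FREQ

def phase_delays(transducers, focal_point):
    """Phase offset (radians, in [0, 2*pi)) for each transducer so that
    every wave arrives in phase at the focal point."""
    fx, fy, fz = focal_point
    phases = []
    for tx, ty, tz in transducers:
        dist = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # A wave travelling distance d accumulates phase 2*pi*d/lambda;
        # emitting with the negative of that phase aligns arrivals.
        phases.append((-2 * math.pi * dist / WAVELENGTH) % (2 * math.pi))
    return phases

# Hypothetical 4x4 array on a 1 cm grid, focusing 20 cm above its center.
array = [(x * 0.01, y * 0.01, 0.0) for x in range(-2, 2) for y in range(-2, 2)]
delays = phase_delays(array, (0.0, 0.0, 0.20))
```

Real arrays contain hundreds of transducers and recompute these phases thousands of times per second to move the focal point, but the per-element arithmetic is no more than this.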
That weakness limited the technology to novelty demonstrations. You could feel something, but it didn't convince your nervous system that anything substantial was happening. The sensation registered as interesting yet faint, like touching a hologram that barely pushed back.
Motion Changes Everything
The NTT breakthrough, developed with The University of Tokyo, showed that motion—not just vibration—fundamentally alters how we perceive ultrasound haptics. Rotating the focal point at 5 Hz doesn't just add movement to the sensation; it amplifies the perceived force by a factor of twenty. The actual acoustic energy hasn't changed. What changed is how our mechanoreceptors interpret a moving stimulus versus a static one.
This isn't about vibrating the focal point at 200 Hz to stimulate skin at sensitive frequencies, a technique already common in the field. The 5 Hz rotation operates at a completely different perceptual level, one that taps into how our touch system evolved to detect objects moving across skin—something far more salient to survival than static pressure.
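Generating the rotating stimulus amounts to sweeping the focal point around a small circle at 5 Hz. The sketch below is illustrative only: the source reports the 5 Hz rotation rate, but the orbit radius, the 1 kHz update rate, and the `rotating_focus` helper are assumptions for the example.

```python
import math

def rotating_focus(center, radius, rotation_hz, t):
    """Focal-point position at time t for a circular orbit at rotation_hz."""
    cx, cy, cz = center
    angle = 2 * math.pi * rotation_hz * t
    return (cx + radius * math.cos(angle),
            cy + radius * math.sin(angle),
            cz)

# Sample one full 5 Hz revolution (period 0.2 s) at a 1 kHz update rate,
# orbiting a 1 cm radius circle 20 cm above the array (assumed values).
UPDATE_HZ = 1000
path = [rotating_focus((0.0, 0.0, 0.20), 0.01, 5.0, i / UPDATE_HZ)
        for i in range(UPDATE_HZ // 5)]
```

Each position in `path` would be fed to the phased array's focusing routine in turn; the acoustic output per sample is unchanged, and only the slow circular motion of the focus differs from a static stimulus.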
The team didn't stop at force amplification. They built a "haptic synthesizer" that combines multiple frequency vibrations to create distinct textures in mid-air: smooth, rough, slimy. The work earned nominations for both Best Paper and Best Demonstration at Eurohaptics 2024, the field's premier conference.
Beyond the Focal Point
While NTT solved the force problem, other researchers have tackled spatial precision. Mid-air ultrasound works within about 30 centimeters of the transducer array before acoustic pressure attenuates too much for reliable tactile sensing. Within that range, recent work has shown the focal area can be adjusted by 25-30% across various positions using the Energy Difference Method, which optimizes how the array concentrates acoustic energy.
This matters because early systems treated the focal point as essentially fixed in size—you could move it around, but not reshape it. Being able to adjust the focal area means rendering sensations that match the contours of virtual objects more accurately. Touch a virtual sphere, and the system can distribute the sensation across a curved area rather than poking a single point into your palm.
Spatiotemporal modulation takes this further by moving focal points over time, tracing shapes or textures across skin. Combined with amplitude modulation at 200 Hz, the technology can simulate surprisingly complex tactile experiences—all from transducer arrays that look like flat speaker panels.
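Combining the two modulations can be pictured as two independent signals sampled at the array's update rate: a spatial trajectory that decides where the focus is, and an amplitude envelope that decides how hard it presses. The 200 Hz envelope matches the article; the back-and-forth line sweep, the 10 kHz update rate, and both helper functions are assumptions for this sketch.

```python
import math

AM_HZ = 200.0        # amplitude modulation in the skin's sensitive band
UPDATE_HZ = 10_000   # focal-point command rate (assumed)

def line_scan(start, end, scan_hz, t):
    """Sweep the focal point back and forth along a line at scan_hz."""
    phase = (t * scan_hz) % 1.0
    s = 2 * phase if phase < 0.5 else 2 * (1 - phase)  # triangle wave, 0..1
    return tuple(a + s * (b - a) for a, b in zip(start, end))

def am_envelope(t):
    """Sinusoidal amplitude envelope in [0, 1] at AM_HZ."""
    return 0.5 * (1 + math.sin(2 * math.pi * AM_HZ * t))

# 10 ms of (position, amplitude) commands tracing a 4 cm line on the palm.
samples = [(line_scan((0.0, -0.02, 0.20), (0.0, 0.02, 0.20), 20.0, i / UPDATE_HZ),
            am_envelope(i / UPDATE_HZ))
           for i in range(UPDATE_HZ // 100)]
```

Varying the scan pattern, scan rate, and envelope shape independently is what gives a "haptic synthesizer" its palette: the same array can trace a sharp edge, a slow stroke, or a diffuse buzz.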
Where Physical Interfaces Disappear
Automotive designers are among the most eager adopters. Modern cars have replaced physical buttons with touchscreens, sacrificing tactile feedback for flexibility and aesthetics. Mid-air ultrasound haptics could restore that feedback without the clutter of mechanical controls. Reach toward a virtual climate control dial floating above your dashboard, and you'd feel detents and resistance as if turning a real knob—except there's nothing there.
The technology integrates naturally with augmented reality systems. Current AR experiences are primarily visual, occasionally adding spatial audio. Touch has remained stubbornly absent because wearable haptic devices—gloves, wristbands, vests—add friction that undermines the promise of seamless digital overlay. Ultrasound requires no wearables. The transducers sit in the environment or device, not on your body.
This opens applications beyond interface design. Remote social touch—feeling a handshake or hug from someone in another location—becomes technically feasible. Virtual 3D modeling gains another dimension when designers can feel the curves and edges they're sculpting in space. Head-mounted displays could incorporate ultrasound arrays to add touch to VR without the heat and weight of vibration motors.
The Biocompatible Advantage
Ultrasound brings technical benefits that competing mid-air haptic approaches struggle to match. It's biocompatible—we've used diagnostic ultrasound in medicine for decades without harm. It penetrates materials efficiently, meaning arrays can sit behind acoustically transparent displays (research shows surfaces with 0.5 mm holes and 25% open space barely affect performance). It's inaudible at 40 kHz, unlike some competing technologies that produce annoying whines. And the components are relatively cheap; 40 kHz transducers are commodity parts.
The field still faces constraints. The 30-centimeter effective range limits applications to relatively close interaction. Multiple users in the same space can interfere with each other's haptic experiences. And while the NTT discovery amplifies perceived force dramatically, the absolute force levels still can't simulate lifting heavy objects or strong impacts.
When Holograms Push Back
The trajectory from weak focal points to texture-synthesizing systems with amplified force perception happened in just over a year. That pace suggests we're past the fundamental research phase and into engineering refinement. The question isn't whether mid-air ultrasound haptics will work outside labs—commercial systems like Ultraleap's UHEV1 already exist—but how quickly they'll become common enough that touching nothing feels normal.
We've spent decades training ourselves to accept that digital interactions offer no tactile feedback, that screens are inert glass requiring only visual attention. Ultrasound haptics propose reversing that training, making the invisible tangible again. The technology finally works well enough to be convincing. Whether that's enough to change how we build interfaces depends less on the physics and more on whether designers can imagine what to do when holograms learn to push back.