
Engineers at Stanford University created an airborne method of imaging objects lying deep underwater — blending light and sound to penetrate the stubborn barrier of interference at the surface of water, according to a recent study published in the journal IEEE Access.

In essence, we may be nearing a time when flying drones map the entire ocean floor, in high resolution.


Engineers penetrate the air-water barrier with a blend of sound and light

The researchers think their new hybrid optical-acoustic system might eventually play a role in conducting drone-based biological marine surveys from the air, in addition to executing large-scale aerial searches for sunken ships and planes, and mapping the deep black depths of the ocean with resolution and speed comparable to what we achieve with above-water landscapes.

“Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth’s landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage,” said Amin Arbabian, study leader and an associate professor of electrical engineering in Stanford’s School of Engineering, in a Stanford blog post. “However, seawater is much too absorptive for imaging into water,” he added.

“Our goal is to develop a more robust system which can image even through murky water,” added Arbabian.

Energy loss at the air-water barrier

Oceans account for roughly 70% of the Earth's surface, but only a small portion of their depths has been cataloged with high-resolution imaging and mapping, reports TechXplore.

The major barrier to high-resolution imaging is physical. Sound waves, for example, can't pass from air into water, or vice versa, without losing most of their energy (more than 99.9%), because the abrupt change in medium reflects nearly all of the incident wave back.

Systems whose sound waves travel from air into water and back into air again pay this penalty twice, shedding 99.9999% of their energy in total.
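Those loss figures follow from the acoustic impedance mismatch between air and water. As an illustration (my own sketch, not code from the study), the standard normal-incidence transmission formula with approximate textbook impedance values reproduces them:

```python
# Illustrative sketch (not from the study): fraction of acoustic power
# transmitted across an air-water interface, using the standard
# normal-incidence transmission coefficient T = 4*Z1*Z2 / (Z1 + Z2)^2.
# Impedance values below are approximate textbook figures.

Z_AIR = 415.0        # characteristic acoustic impedance of air, Pa*s/m
Z_WATER = 1.48e6     # characteristic acoustic impedance of water, Pa*s/m

def transmitted_fraction(z1: float, z2: float) -> float:
    """Fraction of incident acoustic power transmitted across the boundary."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

one_way = transmitted_fraction(Z_AIR, Z_WATER)
round_trip = one_way ** 2  # conventional airborne sonar crosses the boundary twice

print(f"one crossing:  {(1 - one_way) * 100:.2f}% of energy lost")
print(f"two crossings: {(1 - round_trip) * 100:.5f}% of energy lost")
```

With these values, a single crossing loses roughly 99.9% of the energy and a round trip loses roughly 99.9999%, matching the figures cited above.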

Conventional seafloor imaging method too slow, costly

Likewise, electromagnetic radiation, which includes microwaves, radar signals, and visible light, also loses tremendous amounts of energy upon passing from one physical medium into another, and most of what remains is absorbed by the water, according to Aidan Fitzpatrick, first author of the study and Stanford graduate student in electrical engineering.

Notably, this air-to-water absorption also accounts for why sunlight doesn't penetrate to the deepest reaches of the ocean. It's also why the cellular signals of smartphones, which are a form of electromagnetic radiation, can't send or receive calls underwater.

This means we can’t map oceans from the air and from space in the same way we can map land. As of writing, most underwater mapping efforts were completed via attaching sonar systems to ships assigned to trawl specific regions of interest. But this method is expensive and slow, and lacks the efficiency needed to map large areas.

The Stanford ‘S’ was detected submerged underwater and reconstructed in 3D with ultrasound waves. Source: Aidan Fitzpatrick / Stanford University

Solving an invisible jigsaw puzzle, best of both worlds

This is the problem the Photoacoustic Airborne Sonar System (PASS) was designed to overcome. It blends light and sound to break through the interference typically experienced at air-water boundaries. The idea came from a different project, one that relied on microwaves to perform "non-contact" imaging and crisp characterization of underground plant roots.

Some instruments used in PASS were created with this surface-crossing problem in mind, via collaborative help from Stanford electrical engineering Professor Butrus Khuri-Yakub.

In short, PASS uses the strengths of light and sound to overcome their relative weaknesses: “If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds,” said Fitzpatrick in the Stanford blog post.

The system blends light and sound to compensate for energy loss. Source: Aidan Fitzpatrick / Stanford University

New system compensates for loss in signal magnitude

To accomplish this, the system first fires a laser from the air, which is absorbed at the surface of the water. That absorption generates ultrasound waves, which propagate down through the water column until they reflect off an underwater object, then travel back up to the surface.
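As in conventional sonar, the round-trip travel time of the echo tells you how deep the reflecting object sits. A minimal sketch of that conversion (my own illustration, not PASS code), assuming a nominal sound speed of about 1,480 m/s in seawater:

```python
# Illustrative sketch (not from the study): converting an ultrasound
# echo's round-trip travel time into the depth of the reflecting object.

SPEED_OF_SOUND_WATER = 1480.0  # m/s, approximate value for seawater

def echo_depth(round_trip_seconds: float) -> float:
    """Depth of the reflector; the sound covers the distance down and back."""
    return SPEED_OF_SOUND_WATER * round_trip_seconds / 2

# An echo returning 10 ms after the laser-generated pulse implies:
print(f"{echo_depth(0.010):.1f} m")  # prints "7.4 m"
```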

Sound waves still lose most of their energy when they hit the air-water barrier, but by generating the sound waves underwater with lasers, the researchers cut out the secondary energy loss mentioned above.

“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” said Arbabian.
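To put "a loss of this magnitude" in the units receiver designers use, the interface losses can be expressed in decibels. This is my own illustrative sketch, with the per-crossing transmitted fraction hardcoded from the approximate impedance-mismatch figure:

```python
import math

# Illustrative sketch (not from the study): air-water interface losses
# expressed in decibels, the usual unit for receiver sensitivity budgets.

one_way_fraction = 1.12e-3           # ~0.11% of power crosses the boundary once
two_way_fraction = one_way_fraction ** 2

one_way_db = 10 * math.log10(one_way_fraction)
two_way_db = 10 * math.log10(two_way_fraction)

print(f"one crossing:  {one_way_db:.1f} dB")   # about -29.5 dB
print(f"two crossings: {two_way_db:.1f} dB")   # about -59.0 dB
```

Because PASS generates its sound underwater, its echoes pay the roughly 30 dB boundary penalty only once on the way back up, rather than twice, which is the loss the detection hardware must be sensitive enough to absorb.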

We could build high-resolution maps of the entire ocean floor

As of writing, PASS has only been tested in the lab, in a container roughly the size of a large fish tank. "Current experiments use static water but we are currently working toward dealing with water waves," said Fitzpatrick. "This is a challenging but we think feasible problem."

The engineers at Stanford envision the technology eventually operating from a helicopter or drone, Fitzpatrick said. "We expect the system to be able to fly at tens of meters above the water," Fitzpatrick added. But once it passes tests above real-world oceans, it may be only a matter of time until the abyssal depths of every ocean can be seen in high-resolution images.
