Research

Universal Adaptive Beamformers

This project exploits universal algorithms, order statistics, and random matrix theory to address the challenges facing adaptive beamformers for large arrays operating in dynamic environments. We extend our recent research applying universal algorithms from information theory to important array processing problems. We will develop approximate implementations of optimal adaptive beamformers (ABFs) that sacrifice a very small amount of performance to achieve a substantial reduction in computational requirements. These tradeoffs between performance and computational cost will become critical in future passive sonar systems containing very large numbers of sensors. We plan to test our new algorithms on real data from our microphone array as well as on existing data sets from prior ONR-funded ocean acoustic experiments.
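For background, the optimal minimum-variance (MVDR) adaptive beamformer weights require solving a linear system with the sample covariance matrix, a cost that grows rapidly with the number of sensors and motivates the approximate implementations described above. A minimal sketch of the conventional vs. adaptive weight computation on synthetic snapshots (the array geometry, source angles, and levels below are illustrative assumptions, not parameters from this project):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16          # sensors in a uniform line array (illustrative)
d = 0.5         # element spacing in wavelengths
K = 200         # snapshots used to estimate the covariance

def steering(theta):
    """Unit-norm plane-wave steering vector for arrival angle theta (radians)."""
    n = np.arange(N)
    return np.exp(-2j * np.pi * d * n * np.sin(theta)) / np.sqrt(N)

# Synthetic snapshots: a weak source at 0 rad plus a loud interferer at 0.4 rad
v_sig, v_int = steering(0.0), steering(0.4)
X = (0.5 * v_sig[:, None] * (rng.standard_normal(K) + 1j * rng.standard_normal(K))
     + 5.0 * v_int[:, None] * (rng.standard_normal(K) + 1j * rng.standard_normal(K))
     + 0.1 * (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))))

S = X @ X.conj().T / K                      # sample covariance matrix
w_cbf = v_sig                               # conventional (non-adaptive) weights
Sinv_v = np.linalg.solve(S, v_sig)          # the expensive step for large arrays
w_mvdr = Sinv_v / (v_sig.conj() @ Sinv_v)   # MVDR adaptive weights

# The adaptive weights place a null on the interferer while keeping
# unit gain in the look direction; compare the two responses there.
print(abs(w_cbf.conj() @ v_int), abs(w_mvdr.conj() @ v_int))
```

The `np.linalg.solve` step is what approximate ABF implementations aim to cheapen for arrays with very large N.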

[Funded by the Office of Naval Research Code 321US]

Collaboration with Stony Brook University and George Mason University

 

Learning from Hearing: Neurobehavioral, physiological, and computational processes of auditory object learning

To make sense of and respond appropriately to the cacophony reaching their ears, animals must determine which sound comes from each physical source in an environment. This process of “auditory object formation” is mathematically under-determined. Solving this problem requires exploiting prior information and knowledge of natural sound structure. Learning is key to this ability: in a mixture of sounds, you are better not only at recognizing your own name, but also at understanding a familiar voice rather than that of a stranger. The importance of learning in auditory scene analysis is well established for humans, yet little is known about the cognitive processes enabling marine mammals to learn to parse complex auditory scenes. In this project, we will explore behavior in free-swimming, echolocating dolphins as they learn to represent and discriminate objects with varying acoustic backscatter. We will also conduct listening experiments with dolphins to assess their ability to learn patterns in sequences of sounds, in sequences of echoes, and in sequences of signature whistle syllables. These experiments will include EEG and behavioral measures. Both the free-swimming and listening experiments will be used to train computational models of dolphin decision making and to predict the behavior and performance of the dolphins in further experiments.

[Funded by the Office of Naval Research MURI Program]

Collaboration with Carnegie Mellon University, University of Michigan, New College of Florida, University of Washington, and Woods Hole Oceanographic Institution

 

Listening for Rain: Detection and Estimation of Rainfall from Underwater Acoustic Signals

Obtaining detailed rainfall data is one of the largest challenges for computer models of ocean dynamics. Satellite remote sensing of rainfall over the open ocean lacks sufficient accuracy. This project develops critical capabilities for estimating rainfall using inexpensive acoustic sensors implemented in low-cost drifters. The pioneering acoustic rainfall algorithms of Nystuen and collaborators relied on binary decision trees based on a few narrowband frequencies. We exploit dimension reduction techniques and modern machine learning algorithms to extract information from the broadband acoustic spectrum above 1 kHz to detect and estimate rainfall on short time scales. The project includes 1-2 month deployments of acoustic recording systems synchronized with nearby meteorological observations. We plan to develop the algorithms required for the longer-term goal of estimating rainfall over the world's oceans from a network of such drifter systems.
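As a toy illustration of the dimension-reduction-plus-regression idea (not this project's actual pipeline or data), one can project broadband spectra onto a few principal components and fit a simple regressor on the scores. Everything here is synthetic: the spectral shapes, the rain-rate relationship, and the feature sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_freq = 300, 128   # observations x frequency bins (synthetic)

# Synthetic broadband spectra (dB): rain raises levels, more so at high
# frequencies. This linear model is an assumption for the sketch only.
rain_rate = rng.gamma(shape=2.0, scale=2.0, size=n_obs)   # mm/h (invented)
slope = np.linspace(0.5, 1.5, n_freq)                     # invented sensitivity
spectra = 40 + np.outer(rain_rate, slope) + rng.standard_normal((n_obs, n_freq))

# Dimension reduction: project centered spectra onto top principal components
mu = spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(spectra - mu, full_matrices=False)
Z = (spectra - mu) @ Vt[:5].T              # 5 PCA scores per observation

# Simple learned regressor: least-squares fit of rain rate on the scores
A = np.hstack([Z, np.ones((n_obs, 1))])
coef, *_ = np.linalg.lstsq(A, rain_rate, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((pred - rain_rate) ** 2))
```

In practice the regressor would be a modern machine learning model trained against collocated meteorological measurements, but the spectral dimension reduction step looks much like this.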

[Funded by the Office of Naval Research/UMass Dartmouth Marine and UnderSea Technology (MUST) Research Program]

Collaboration with the Scripps Institution of Oceanography and Applied Ocean Sciences

 

MURI: Active Sensing in Echolocating Marine Mammals and Humans

The behavior of echolocating animals gathering information differs from that of manmade sonar systems. The animals tightly couple sensing and action in an active sensing feedback loop. Manmade sonar systems methodically sweep across the seabed when searching for an object, like a farmer tilling a field. Dolphins instead race across the space, rapidly converging on an object of interest and then exploring it in detail. Bats landing on a small object alternately aim their echolocation beam to either side of the object as they approach, but rarely ensonify it directly. Mathematical models find that this behavior paradoxically increases the information about the object’s location even while it reduces the echo energy received. A shared theme of these behaviors appears to be modifying behavior to gain information. The UMass Dartmouth component of this MURI project will develop two models for search behaviors maximizing information about the environment. The first model focuses on movement while searching for an object of interest in a possibly cluttered space. In collaboration with the University of Washington Applied Physics Lab, we will develop active sensing searches that maximize the expected gain of mutual information about the object’s unknown location. This search strategy extends prior passive sensing “infotaxis” models for insects in information-poor environments, where detections of signals of interest are rare and easily missed. The second model focuses on strategies for directing a sonar beam around a detected object to maximize the Fisher information about the object’s location in azimuth and elevation. Aiming the beam’s main response axis slightly askew of a target theoretically improves the precision of the estimated location. We will attempt to demonstrate this phenomenon with broadband signals received by a manmade sensor with a frequency-dependent beam pattern.
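The idea behind the first model can be sketched with a toy infotaxis-style search: a Bayesian posterior over a target's location on a grid, where each ping is chosen to maximize the expected reduction in posterior entropy. The grid, sensor model, and detection probabilities below are all invented for illustration and are not the project's models.

```python
import numpy as np

rng = np.random.default_rng(2)
G = 25                      # grid cells for the unknown target location
target = 17                 # true target cell (hidden from the searcher)

def p_detect(look, cell):
    """Toy detection probability when pinging cell `look` with the target
    at `cell` (an assumed sensor model for illustration)."""
    return 0.9 * np.exp(-0.5 * (look - cell) ** 2 / 2.0 ** 2) + 0.02

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

post = np.full(G, 1.0 / G)  # uniform prior over target location
for step in range(15):
    # Expected information gain (entropy reduction) of pinging each cell
    gains = np.empty(G)
    for look in range(G):
        pd = p_detect(look, np.arange(G))
        p_hit = (post * pd).sum()
        post_hit = post * pd / p_hit
        post_miss = post * (1 - pd) / (1 - p_hit)
        gains[look] = entropy(post) - (p_hit * entropy(post_hit)
                                       + (1 - p_hit) * entropy(post_miss))
    look = int(np.argmax(gains))                 # most informative ping
    hit = rng.random() < p_detect(look, target)  # simulate the echo outcome
    pd = p_detect(look, np.arange(G))
    post = post * (pd if hit else 1 - pd)        # Bayesian update
    post /= post.sum()
```

After a handful of pings the posterior concentrates, illustrating how choosing actions by expected information gain replaces an exhaustive raster sweep.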
For binaural hearing with frequency-dependent directional sensors, we will explore how much information about location is contributed by interaural time difference of the received signal, and how much information is contributed by the directional filtering of the waveform by the sensors.
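The claim above that aiming slightly askew of a target can improve localization precision can be illustrated numerically: for an amplitude measurement in additive Gaussian noise, the Fisher information about the target angle is proportional to the squared slope of the beam pattern, which vanishes on the beam axis and peaks off-axis. The Gaussian beam shape and beamwidth below are assumptions for the sketch, not a measured pattern.

```python
import numpy as np

theta = np.linspace(-0.5, 0.5, 2001)   # aim offset from target (radians)
sigma_b = 0.1                          # assumed beamwidth parameter

# Toy Gaussian beam amplitude pattern and its sensitivity to aim offset
b = np.exp(-0.5 * (theta / sigma_b) ** 2)
db = np.gradient(b, theta)

# Fisher information about the angle ~ squared pattern slope: it is
# near zero on-axis (flat beam peak) and largest off-axis.
fisher = db ** 2
best = theta[np.argmax(fisher)]        # most informative aim offset
```

For this Gaussian pattern the slope, and hence the information, peaks at an offset of one beamwidth parameter, where the echo amplitude has already dropped well below its on-axis value: more information from less echo energy, matching the paradox noted above.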

[Funded by the Office of Naval Research MURI Program]

Collaboration with Carnegie Mellon University, University of Michigan, University of Washington, the University of St Andrews, and Woods Hole Oceanographic Institution

 

Prior Research Projects

 

Information Entropy for Humpback Whale Songs and Leopard Seal Calling Bouts

This is a webinar I gave for the UNH Acoustics Center in September 2017