An echolocation visualization and interface system for dolphin research
Figures


FIG. 1.

The basic design of the EchoLocation Visualization and Interface System, ELVIS. The dolphin transmits a train of sonar pulses, focused in a narrow beam aimed at the hydrophone matrix. The relative sound pressure levels in the sonar beam are converted into light intensity variations on the PC screen. This dynamic image is projected back onto the semitransparent hydrophone screen, giving the dolphin immediate visual feedback on its sonar output. The relative sound pressure levels can also be coded into, e.g., color variations visible only to the human observer on the PC screen.
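The caption describes a linear conversion of relative sound pressure levels on the hydrophone matrix into light intensities on the screen. A minimal sketch of such a mapping, assuming a 16-channel matrix and a simple min/max rescaling (the function name, grid layout, and example values are illustrative, not from the paper):

```python
# Hypothetical sketch (not the authors' software): rescale relative sound
# pressure levels from a 4x4 hydrophone matrix into screen light intensities.
def spl_to_intensity(levels, floor=0.0, ceiling=1.0):
    """Linearly rescale relative SPL readings to [floor, ceiling]."""
    lo, hi = min(levels), max(levels)
    if hi == lo:
        return [floor] * len(levels)
    span = ceiling - floor
    return [floor + span * (v - lo) / (hi - lo) for v in levels]

# One click sampled on 16 channels; the brightest spot marks the beam axis.
levels = [0.1, 0.2, 0.1, 0.0,
          0.2, 0.9, 0.4, 0.1,
          0.1, 0.4, 0.2, 0.0,
          0.0, 0.1, 0.0, 0.0]
intensities = spl_to_intensity(levels)
beam_axis_channel = intensities.index(max(intensities))  # channel 5
```

The same intensities could equally be mapped to color variations for the observer-only display mentioned in the caption.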

FIG. 2.

Block diagram displaying the signal path through the system. The five blocks, labeled hydrophones, amplifiers, filters, peak detectors and A/D converters, represent 16 independent channels.

FIG. 3.

The peak-hold detector output as it monitors the maximum amplitude of the input signal (dolphin click) from a hydrophone. After software-controlled storage to a hard drive, the detector is reset by a control signal from an output on the acquisition board.
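The acquisition cycle in this caption — read the held maximum, store it, then reset the detector via a control signal — can be sketched as follows. This is a hypothetical illustration, not the authors' software; `read_peaks`, `reset_detectors`, and `store` stand in for the real acquisition-board I/O and hard-drive logging:

```python
# Hypothetical sketch of the acquire/store/reset cycle described in Fig. 3.
class PeakHoldAcquirer:
    def __init__(self, read_peaks, reset_detectors, store):
        self.read_peaks = read_peaks            # A/D read of the 16 held maxima
        self.reset_detectors = reset_detectors  # control-signal reset of the detectors
        self.store = store                      # software-controlled storage to disk

    def cycle(self):
        peaks = self.read_peaks()   # maxima held since the last reset
        self.store(peaks)           # write the record before clearing it
        self.reset_detectors()      # re-arm the detectors for the next click
        return peaks

# Usage with dummy hardware hooks:
log = []
acq = PeakHoldAcquirer(lambda: [0.2] * 16, lambda: None, log.append)
peaks = acq.cycle()
```

Resetting only after storage guarantees that no held maximum is lost between clicks.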

FIG. 4.

The ELVIS, as seen from the dolphin’s point of view, while being used as an acoustic “touch screen.” The dolphin can “click” on the two white symbols by aiming its sonar beam axis at them. The symbols correspond to active buttons on a normal touch screen. The operator can set the active area and the trig level to match the dolphin’s training level.
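An acoustic touch screen of this kind reduces to a hit test: a symbol is “clicked” when the beam axis lands inside its active area with a click amplitude above the trig level. A minimal sketch, assuming circular active areas in normalized screen coordinates (all names and values here are illustrative assumptions):

```python
# Hypothetical sketch of the acoustic "touch screen" hit test from Fig. 4.
def hit_symbol(beam_xy, amplitude, symbols, trig_level):
    """Return the name of the symbol hit by the sonar beam axis, or None."""
    if amplitude < trig_level:      # click too weak: below operator-set trig level
        return None
    x, y = beam_xy
    for name, (cx, cy, radius) in symbols.items():
        # Inside the circular active area around the symbol's center?
        if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
            return name
    return None

# Two symbols with operator-adjustable active areas (center x, center y, radius).
symbols = {"left": (0.3, 0.5, 0.15), "right": (0.7, 0.5, 0.15)}
```

Enlarging `radius` or lowering `trig_level` would make the “buttons” easier to hit early in training, matching the adjustability described in the caption.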

FIG. 5.

These video frames span a total time of . The picture shows the beam axis trace on the screen and a dolphin echolocating as it swims up to the right side of the screen. The video recordings were made with camera 2. The black line was added to indicate the beam axis orientation. The small black arrows illustrate the motion path of the beam axis across the screen.

FIG. 6.

These six frames show the beam axis tracing program while one dolphin explores the system for the first time. The frames are captured from a sequence filmed with video camera 2. The time between frames is . All spots drawn by the system were numbered in the order in which they appeared on the screen. The text and numbers in the pictures were added after filming.

FIG. 7.

A screen shot of the replay program interface, replaying a short sequence of the sonar beam axis trace of a dolphin searching for an object partially buried in coral sand over the hydrophone matrix. The object’s position within the “Search pattern” area is indicated by the painted gray circle. The light intensity of the spots indicates the sound pressure levels of the clicks, on a linear scale from dark red for weak clicks to white for strong clicks. The fade-out time of the trace can be set during replay. In this particular visualization mode, the intensities of clicks hitting the same location were summed. This figure has been converted to grayscale, so dark red appears as dark gray. Consecutive clicks are numbered from 1 to 43 and were recorded during .
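The visualization mode described here accumulates click intensity per screen location. A minimal sketch of that accumulation, assuming intensities are clipped at a maximum rendered as white (the cap and data structure are illustrative assumptions, not from the paper):

```python
# Hypothetical sketch of the replay accumulation mode in Fig. 7: intensities
# of clicks hitting the same screen location are summed, then clipped so the
# color ramps linearly from dark red (weak) toward white (strong).
def accumulate_trace(clicks, cap=1.0):
    """clicks: iterable of ((row, col), intensity) pairs. Returns a summed map."""
    trace = {}
    for pos, intensity in clicks:
        trace[pos] = min(cap, trace.get(pos, 0.0) + intensity)
    return trace

# Three clicks, two of them hitting the same location (2, 3).
clicks = [((2, 3), 0.4), ((2, 3), 0.5), ((1, 1), 0.3)]
trace = accumulate_trace(clicks)  # intensities at (2, 3) are summed
```

A fade-out, as mentioned in the caption, could be layered on top by decaying each stored value between replayed frames.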

FIG. 8.

Time-synchronized video frames of a dolphin operating the acoustic touch screen, where A shows the dolphin from above and B shows the acoustic touch screen from the dry side of the pool, with the trace of the dolphin’s sonar beam axis as it hits the square symbol.

