OmniCam deployed at Birch Aquarium

On Wednesday last week we deployed our new omnidirectional imaging system, the OmniCam, in the Birch Aquarium kelp tank. This deployment allowed us to test and set the buoyancy in a saltwater environment, check the operation of the instrument, and acquire video of the kelp tank environment on all six cameras.

The OmniCam is a high-resolution, high-speed imaging system designed to capture the ambient light field in the open ocean. It uses six scientific-grade HD CCD cameras, each supported by its own pico-ITX computer and solid-state drive. This lets the six cameras record in parallel, acquiring uncompressed HD video simultaneously at up to 20 fps. Fisheye lenses on all six cameras cover as wide a field of view as possible. The entire system is housed in a 44 cm diameter glass sphere.
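To put the parallel, uncompressed recording in perspective, here is a rough back-of-the-envelope data-rate estimate in Python. The frame size and bit depth are illustrative assumptions rather than published specifications of the instrument; only the frame rate and camera count come from the description above.

```python
# Rough data-rate estimate for six cameras recording uncompressed HD in parallel.
# Frame size and bit depth are assumed values for illustration only.
width, height = 1920, 1080        # assumed HD frame size (pixels)
bytes_per_pixel = 1               # assumed 8-bit sensor output
fps = 20                          # maximum frame rate from the post
n_cameras = 6

bytes_per_frame = width * height * bytes_per_pixel
rate_per_camera = bytes_per_frame * fps          # bytes per second, one camera
rate_total = rate_per_camera * n_cameras         # bytes per second, whole sphere

print(f"Per camera : {rate_per_camera / 1e6:.1f} MB/s")
print(f"All cameras: {rate_total / 1e6:.1f} MB/s "
      f"(~{rate_total * 3600 / 1e9:.0f} GB per hour)")
```

Even under these conservative assumptions, each camera must sustain tens of megabytes per second, which is why every camera gets its own computer and solid-state drive.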

The system is now packaged and ready to take out to sea this week, where it will be deployed as part of a multi-university program to study squid camouflage. Toward this effort, the OmniCam will record the light field in the squid’s natural habitat. These data will then be played back to squid in a controlled laboratory setting.

 

“Enhanced extended range underwater imaging via structured illumination” published in Optics Express

Dr. Jaffe’s work investigating the use of structured illumination for extended-range underwater imaging systems, “Enhanced extended range underwater imaging via structured illumination,” appears in Vol. 18, Issue 12 of Optics Express this month.

iPhone goes out to sea: Testing the iPhone GPS receiver for locating and tracking autonomous underwater vehicles

One of the key components of the Autonomous Underwater Explorers (AUEs) currently under development in our lab is a GPS receiver for locating, tracking, and recovering the AUEs after a deployment. Because the AUEs will be floating right at the surface of the water, there are problems acquiring satellites and getting reliable GPS fixes using conventional receivers.

On a recent cruise, we had the opportunity to test the GPS receiver of the iPhone 3G. The test consisted of putting together a small plastic housing for the iPhone, ballasted so that it floated right at the surface. Below is a photograph of the components of the housing:

[Photo: components of the iPhone housing (IMG_2729)]

The housing consisted of a set of large washers for ballast, some padding to fill the empty space in a 1 L plastic jar, electrical tape to fix a line to the jar, and the iPhone. The iPhone was then powered on, and a GPS application was started to acquire and log GPS fixes. The parts were then assembled:

 

[Photos: the assembled housing (IMG_2731, IMG_2736)]

We then deployed the iPhone off the stern of the ship, payed out about 200 feet of line, and towed the package behind the ship for a few minutes. After recovering the package, we downloaded the GPS data and plotted it with GPSVisualizer, which produced a nice GPS track:

[Image: GPS track of the towed iPhone (Trackcropped)]

Follow this link to explore the GPS track interactively on EveryTrail.com.

Note that this was recorded by the iPhone while it was partially underwater, bobbing up and down as it was towed behind a ship!

Many GPS receivers fail to work at all when used right at the surface of the ocean. Yet the iPhone not only worked, it worked well! Several factors contribute to this performance. Perhaps the most significant is that the iPhone uses the cellular network to acquire GPS almanacs, so it can rapidly initialize and start computing GPS coordinates. Of course, without the cellular network this performance would be reduced, but nonetheless the iPhone provides a solution for acquiring GPS fixes at the surface of the ocean (perhaps even slightly underwater)!
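We made the track figure above with GPSVisualizer, but for anyone who wants to plot a similar log themselves, here is a minimal Python sketch. It assumes the logging app can export a standard GPX file; the file name below is made up for the example.

```python
# Minimal sketch: read track points from a GPX log and plot the track.
# Assumes a GPX 1.1 file; "iphone_tow_track.gpx" is a hypothetical file name.
import xml.etree.ElementTree as ET
import matplotlib.pyplot as plt

GPX_NS = "{http://www.topografix.com/GPX/1/1}"
root = ET.parse("iphone_tow_track.gpx").getroot()

lats, lons = [], []
for pt in root.iter(GPX_NS + "trkpt"):
    lats.append(float(pt.attrib["lat"]))
    lons.append(float(pt.attrib["lon"]))

plt.plot(lons, lats, "-o", markersize=2)
plt.xlabel("Longitude (deg)")
plt.ylabel("Latitude (deg)")
plt.title("iPhone GPS track while towed at the surface")
plt.axis("equal")
plt.show()
```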

 

AUEs in the news

Today the Autonomous Underwater Explorers (AUEs) project was featured in an article in Scripps News and in a press release from the National Science Foundation. From the press release:

In an effort to plug gaps in knowledge about key ocean processes, the National Science Foundation (NSF)’s division of ocean sciences has awarded nearly $1 million to scientists at the Scripps Institution of Oceanography in La Jolla, Calif. The Scripps marine scientists will develop a new breed of ocean-probing instruments. Jules Jaffe and Peter Franks will spearhead an effort to design and deploy autonomous underwater explorers, or AUEs. AUEs will trace the fine details of oceanographic processes vital to tiny marine inhabitants.

 


MAZOOPS first deployment at sea

We returned yesterday from the first deployment of the new MAZOOPS zooplankton sonar at sea. We were able to deploy the system several times per day, and collected nearly 100 GB of acoustic and image data from two different stations off the coast of San Diego. Deployments of the MAZOOPS were combined with MOCNESS net tows, and the two modalities generally showed good agreement.

In total, the system was deployed eleven times during the four-day cruise. Typically, the system ran in autonomous mode during the day, profiling down to 250 meters, and in tethered mode with real-time data at night, down to 100 meters.

During the daytime, typical profiles showed that most of the acoustic scattering was concentrated around 250 meters or deeper. These data were supported by images from the low-light-level camera on the MAZOOPS. The image data showed that copepods and euphausiids were likely the most abundant animals, a finding confirmed by the MOCNESS tows.

On two occasions, while idling at 30 meters depth, the system encountered very high concentrations of euphausiids, which appeared in the acoustic data as a dramatic increase in both the number of echoes and their amplitude. The image data showed a proportional increase in the number of animals imaged. Small-scale features like these are unlikely to be seen in MOCNESS data, which averages animal concentrations over much larger scales than the MAZOOPS.
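As a toy illustration of how such a patch stands out in a time series of per-ping echo counts, here is a short Python sketch. The simulated data, window length, and threshold are invented for the example and are not the actual cruise parameters.

```python
# Flag dense patches as pings whose echo count far exceeds a running baseline.
# All numbers here are simulated; they are not MAZOOPS cruise data.
import numpy as np

rng = np.random.default_rng(0)
echo_counts = rng.poisson(5, size=600).astype(float)   # background echo counts
echo_counts[300:340] += rng.poisson(40, size=40)        # a simulated dense patch

window = 60                                              # pings in the running mean
baseline = np.convolve(echo_counts, np.ones(window) / window, mode="same")
in_patch = echo_counts > 3.0 * baseline                  # "dramatic increase" flag

print(f"Pings flagged as inside a patch: {in_patch.sum()} of {len(echo_counts)}")
```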

One of the key features of the MAZOOPS is the combination of acoustic and optical imaging sensors that share the same field of view, making the system truly multimodal. By combining these data, more information about the type of animal and its orientation becomes available to aid in interpreting the acoustic echoes. The multiview nature of the acoustic system can then be used in concert with the optical imagery to obtain improved estimates of shape and/or taxon-specific abundance.
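As a deliberately simplified sketch of the kind of fusion the shared field of view allows, one could apportion an acoustically derived abundance estimate among taxa using the fractions counted in the co-registered images. All numbers below are invented for illustration; this is not the actual MAZOOPS processing chain.

```python
# Apportion an acoustically derived abundance among taxa using image counts
# from the shared field of view. All values are invented for illustration.
acoustic_abundance = 1200.0   # animals per cubic meter inferred from echoes (assumed)
image_counts = {"copepod": 85, "euphausiid": 12, "other": 3}  # animals counted in images

total_imaged = sum(image_counts.values())
taxon_abundance = {taxon: acoustic_abundance * n / total_imaged
                   for taxon, n in image_counts.items()}

for taxon, density in taxon_abundance.items():
    print(f"{taxon:11s}: {density:7.1f} animals per cubic meter")
```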

A big thanks to Jules, Fernando, Florian, Prasanna, Justin, Christian, Gabriel, Paul, the crew of the R/V Sproul, and all of the other team members who made this cruise possible.

Below are a few images from the cruise:

Paul Roberts receives first place in student poster competition at Acoustics ’08

July 11, 2008 – After returning from the Acoustics ’08 conference in Paris, we heard the great news that Paul Roberts won first prize in the student poster competition for his poster entitled “Application of multiple-angle acoustic scatter to remote fish classification.” The poster investigates how certain key parameters in a multiple-angle system (such as array aperture and signal bandwidth) affect the ability of the system to classify fish of different sizes, shapes, and species.