Dr. Jaffe’s paper on the use of structured illumination for extended-range underwater imaging, “Enhanced extended range underwater imaging via structured illumination,” appears in Vol. 18, Issue 12 of Optics Express this month.
One of the key components of the Autonomous Underwater Explorers (AUEs) currently under development in our lab is a GPS receiver for locating, tracking, and recovering the AUEs after a deployment. Because the AUEs will be floating right at the surface of the water, conventional receivers have trouble acquiring satellites and obtaining reliable GPS fixes.
On a recent cruise, we had the opportunity to test the GPS receiver of the iPhone 3G. The test consisted of assembling a small plastic housing for the iPhone, ballasted so that it floated right at the surface. Below is a photograph of the components of the housing:
The housing consisted of a set of large washers for ballast, some padding to fill the empty space in a 1 L plastic jar, electrical tape to fix a line to the jar, and the iPhone. The iPhone was then powered on, and a GPS application was started to acquire and log GPS fixes. The parts were then assembled:
We then deployed the iPhone off the stern of the ship, paid out about 200 feet of line, and towed the package behind the ship for a few minutes. After recovering the package, downloading the GPS data, and plotting it with GPSVisualizer, we obtained a nice GPS track:
Note that this was recorded by the iPhone while it was partially underwater, bobbing up and down as it was towed behind a ship!
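For anyone who wants to work with such a track offline rather than through GPSVisualizer’s web interface, here is a minimal sketch that parses a GPX log and sums great-circle (haversine) distances between fixes. This is not the tool we used; the sample GPX content, file layout, and all names below are illustrative assumptions.

```python
import math
import xml.etree.ElementTree as ET

# Hypothetical sample of the kind of GPX log a GPS app might export;
# these coordinates are made up for illustration only.
SAMPLE_GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="example">
  <trk><trkseg>
    <trkpt lat="32.8700" lon="-117.2500"/>
    <trkpt lat="32.8705" lon="-117.2510"/>
    <trkpt lat="32.8710" lon="-117.2520"/>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def parse_track(gpx_text):
    """Return a list of (lat, lon) tuples from a GPX track log."""
    root = ET.fromstring(gpx_text)
    return [(float(p.get("lat")), float(p.get("lon")))
            for p in root.iterfind(".//gpx:trkpt", NS)]

def haversine_m(p1, p2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000.0  # mean Earth radius
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

track = parse_track(SAMPLE_GPX)
length = sum(haversine_m(a, b) for a, b in zip(track, track[1:]))
print(f"{len(track)} fixes, track length ~{length:.0f} m")
```

Dividing the summed distance by the elapsed time between the first and last fixes would also give a rough estimate of the tow speed.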
Many GPS receivers fail to work at all when used right at the surface of the ocean. Yet the iPhone not only worked, it worked well! Several factors contribute to this performance. Perhaps the most significant is that the iPhone uses the cellular network to acquire GPS almanacs, so it can initialize rapidly and start computing GPS coordinates. Without the cellular network this performance would be reduced, but nonetheless, the iPhone does provide a solution for acquiring GPS fixes at (and perhaps even slightly below) the surface of the ocean!
Follow the links below to access the story:
In an effort to plug gaps in knowledge about key ocean processes, the National Science Foundation (NSF)’s division of ocean sciences has awarded nearly $1 million to scientists at the Scripps Institution of Oceanography in La Jolla, Calif. The Scripps marine scientists will develop a new breed of ocean-probing instruments. Jules Jaffe and Peter Franks will spearhead an effort to design and deploy autonomous underwater explorers, or AUEs. AUEs will trace the fine details of oceanographic processes vital to tiny marine inhabitants.
- http://scrippsnews.ucsd.edu/Releases/?releaseID=1031 (Scripps News)
- http://www.nsf.gov/news/news_summ.jsp?cntn_id=115887&org=NSF&from=news (NSF Press Release)
- http://www.jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=901 (Jacobs School Press Release)
We returned yesterday from the first deployment of the new MAZOOPS zooplankton sonar at sea. We were able to deploy the system several times per day, and collected nearly 100 GB of acoustic and image data from two different stations off the coast of San Diego. Deployments of the MAZOOPS were combined with MOCNESS tows and the two modalities generally showed good agreement.
In total, the system was deployed eleven times during the four-day cruise. Typically, it ran in autonomous mode during the day down to 250 meters, and in tethered mode with real-time data at night down to 100 meters.
During the daytime, typical profiles showed that most of the acoustic scattering was concentrated at around 250 meters or deeper. These data were supported by images from the low-light-level camera on the MAZOOPS. Image data showed that copepods and euphausiids were likely the most abundant animals, and this was confirmed by the MOCNESS tows.
On two occasions, while idling at 30 meters depth, the system encountered very high concentrations of euphausiids, seen in the acoustic data as a dramatic increase in both the number of echoes and their amplitude. Image data showed a proportional increase in the number of animals imaged. Such small-scale features are unlikely to be seen in MOCNESS data, which averages animal concentrations over much larger scales than the MAZOOPS does.
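One simple way to flag such bursts automatically is to compare each ping’s echo count against a running baseline. The sketch below is not the MAZOOPS processing chain; the function, window size, threshold, and synthetic echo counts are all illustrative assumptions.

```python
import statistics

def flag_swarm_pings(echo_counts, window=20, k=3.0):
    """Flag pings whose echo count exceeds a running baseline by
    k standard deviations -- a simple way to spot a dense aggregation
    (e.g. a euphausiid swarm) in a per-ping echo-count series.
    All parameter names and values here are illustrative."""
    flags = []
    for i, n in enumerate(echo_counts):
        start = max(0, i - window)
        baseline = echo_counts[start:i] or [n]  # preceding pings only
        mu = statistics.mean(baseline)
        sd = statistics.pstdev(baseline) or 1.0  # avoid zero spread
        flags.append(n > mu + k * sd)
    return flags

# Synthetic series: quiet background, then a brief burst of echoes
counts = [5, 6, 4, 5, 6, 5, 4, 6, 5, 5, 40, 45, 42, 6, 5]
burst = [i for i, f in enumerate(flag_swarm_pings(counts)) if f]
print("burst pings:", burst)
```

In practice one would apply the same idea to echo amplitudes as well, since the text notes that both the number and the amplitude of echoes increased inside the swarm.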
One of the key features of MAZOOPS is the combination of acoustic and optical imaging sensors that share the same field of view, making the system truly multimodal. By combining these data, more information about the type of animal and its orientation is available to aid in interpreting acoustic echoes. The multiview nature of the acoustic system can then be used in concert with optical imagery to obtain improved estimates of shape and/or taxon-specific abundance.
A big thanks to Jules, Fernando, Florian, Prasanna, Justin, Christian, Gabriel, Paul, the crew of the R/V Sproul, and all of the other team members who made this cruise possible.
Below are a few images from the cruise:
At 12:25, Dr. Jules Jaffe was interviewed as one of the “Nifty Fifty” in the San Diego Science Festival.
July 11, 2008 – After returning from the Acoustics 08 conference in Paris, we heard the great news that Paul Roberts won first prize in the student poster contest for his poster entitled “Application of multiple-angle acoustic scatter to remote fish classification.” The poster investigates how certain key parameters in a multiple-angle system (such as array aperture and signal bandwidth) affect the ability of the system to classify fish of different size, shape, and species.
December 4, 2007 – The FAD Sonar just returned from a month-long deployment aboard the R/P Flip. It was used as a supplemental instrument to measure fish and plankton aggregating around Flip during the FLIP07 SCORE experiment conducted by John Hildebrand and Elizabeth Henderson. We’d like to thank Liz for an outstanding job of running the system throughout the 30-day deployment and recording 30 GB of data that we can’t wait to analyze!