The US Naval Research Laboratory (NRL) has conducted research to develop predictive intermediate trophic level (ITL) ecosystem models, reports Cassandra Eichner of NRL.
The research into biological ocean swarms was initiated after the US Navy found that the presence of ITL organisms, such as tiny crustaceans and jellyfish, interrupts its operations.
These animals weaken the service's ability to make tactical decisions, as their presence near or around underwater acoustic equipment distorts its output.
Readings from the equipment can be false and unreliable, which in turn hinders planning and the charting of navigation courses.
To gain a better understanding of the organisms, NRL oceanographer Brad Penta led a campaign to collect information about the dynamics of ITL ecosystems near ocean fronts and biologically active areas off the coast of Delaware.
The 14-day research campaign into the presence of marine swarms showed that a dense swarm around underwater acoustic equipment can reflect and reverberate sound, prompting false readings.
Penta said: “Many of these organisms emit light, called bioluminescence. They do not light up all the time; usually it’s when they are stimulated or disturbed.”
Penta added: “If you had enough of them (veligers), they could interfere with sonar or an optical instrument. Their presence may change the depth at which navy assets are deployed.”
The research process saw the use of multiple shipboard instruments and tools, including the In-Situ Ichthyoplankton Imaging System (ISIIS). The ISIIS was paired with other tools for measuring salinity, temperature, oxygen, chlorophyll-a, and light attenuation.
The study also involved the use of an aircraft fitted with imagers and remote sensing equipment to survey the ocean, including multiple light detection and ranging (LIDAR) systems.
To facilitate the process, NRL developed a new tool, the multi-wavelength LIDAR for the environment (MUWLE).
Deric Gray, an oceanographer in NRL’s Remote Sensing Division, said: “Blue worked better in deep water. Green worked well in algae-rich areas, and yellow worked well in turbid bays with a lot of mud.”
“We saw aerosol layers that showed up more significantly than we thought they would. The LIDAR also saw thin, broken clouds underneath the aircraft that we couldn’t otherwise see.”
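Gray’s comments about which LIDAR colour works in which water reflect how differently each wavelength is absorbed as light travels through the water column. As a rough illustration only, here is a Beer–Lambert-style attenuation sketch in Python; the attenuation coefficients are hypothetical placeholder values, not measurements from the study:

```python
import math

# Beer-Lambert style attenuation: I(z) = I0 * exp(-K * z).
# The diffuse attenuation coefficients below are illustrative only:
# clear deep water attenuates blue light least, so blue penetrates furthest,
# while turbid or algae-rich water attenuates it more strongly.
K = {"blue": 0.04, "green": 0.07, "yellow": 0.12}  # units 1/m, hypothetical

def remaining_fraction(color, depth_m):
    """Fraction of surface light intensity remaining at a given depth."""
    return math.exp(-K[color] * depth_m)

# Compare how much light survives a 30 m round trip for each colour.
for color in K:
    print(color, round(remaining_fraction(color, 30.0), 3))
```

With these illustrative coefficients, blue retains the largest fraction of its intensity at depth, which is consistent with Gray’s observation that blue worked better in deep water.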
All the information gathered will be processed and interpreted using deep neural networks (DNNs), a class of mathematical models.
A convolutional neural network (CNN) for identifying organisms in the ISIIS images is currently under development.
NRL computer scientist Christopher Wood said: “CNNs are geared toward image analysis. A human being couldn’t process these images in a lifetime. The image reels are massive and some of the organisms are very small.”
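The source does not describe the architecture of NRL’s CNN, but the convolution, activation, and pooling stages that make CNNs well suited to image analysis can be sketched in a few lines. The following pure-NumPy example is a minimal illustration of one such stage; the image, kernel values, and function names are hypothetical and not taken from the study:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Element-wise rectified linear activation."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the maximum over non-overlapping size x size windows."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# Toy 8x8 "plankton image" and a 3x3 edge-enhancing kernel (illustrative values).
rng = np.random.default_rng(0)
image = rng.random((8, 8))
kernel = np.array([[-1., -1., -1.],
                   [-1.,  8., -1.],
                   [-1., -1., -1.]])

# One convolution -> activation -> pooling stage; real CNNs stack many of these
# and learn the kernel values from labelled images.
features = max_pool(relu(conv2d(image, kernel)))
print(features.shape)  # (3, 3): the 8x8 input reduced to a small feature map
```

Stacking stages like this lets a trained network pick out small organisms across the massive image reels Wood describes, far faster than human review.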
Florida Atlantic University Harbor Branch Oceanographic Institute, University of South Alabama Dauphin Island Sea Lab, University of Southern Mississippi, Florida International University, and the University of Delaware participated in the study as research collaborators.