We’re krillin’ it with brand new deep-sea discoveries! Join us as we shell-ebrate the latest wave of ocean critter data. Click the ‘Classify’ button to make a splash and help us scale new depths!
Also note: this project recently migrated onto Zooniverse’s new architecture.
Vital seafloor habitats were injured by the 2010 Deepwater Horizon oil spill. NOAA and partners are building a network of experts and resources to restore this underexplored area in the Gulf of America (formerly Gulf of Mexico).
The sun is powerful, but its intensity dwindles as it passes through the depths of the sea. Yet even the ocean’s dim middle reaches—the “mesophotic zone”—and its deepest, sunlight-free areas—the “deep sea”—host an abundance of life.
In the dim mesophotic zone, seafloor communities include deep-sea corals and animals such as fish, sea anemones, sponges, and sea cucumbers. Sunlight-free deep benthic communities also host corals and other forms of life such as sea stars, sea urchins, fish, and crabs. These organisms colonize rocky outcroppings on the seafloor. Some deep-sea corals found in both mesophotic and deep benthic areas are slow growing and can live for more than 1,000 years.
In the Gulf of America, mesophotic and deep benthic communities are scattered across vast areas of the ocean floor. These communities are found at depths from about 150 feet to some of the Gulf’s deepest points, around 17,000 feet. While seemingly isolated, they are composed of foundational species that contribute to an interconnected food web throughout the region. For example, mesophotic and deep benthic communities harbor fish and invertebrate eggs and larvae. As they mature, some of these organisms travel to other parts of the ocean to feed and reproduce.
NOAA is working with a multidisciplinary group of partners to plan and implement restoration projects for mesophotic and deep benthic communities. NOAA collaborates with scientists and resource managers in the Department of the Interior, including the U.S. Geological Survey. The growing network of partners includes academic and research institutions, aquariums and educational institutions, and governmental and non-governmental organizations.
Restoring deep-sea habitats is a challenging task. In addition to a lack of information about the species that make up these communities, there is limited technical experience with, or precedent for, habitat restoration at these depths, either in the United States or internationally. To meet these challenges, the Deepwater Horizon Natural Resource Damage Assessment Open Ocean Trustee Implementation Group selected four long-term projects in their second restoration plan (PDF, 493 pages).
Through these projects, teams of experts and resource managers are advancing our understanding of deep-sea habitats. They are also enhancing the information available to inform restoration for impacts from the spill. These efforts are providing information to support management, protection, and restoration of natural resources on the sea floor across the entire Gulf of America.
The four projects began in 2020 and were funded at a combined total of about $126 million.
As part of the effort to map these seafloor communities, the MDBC project is exploring artificial intelligence (AI) as a method for automatically locating and labeling specific species of coral within the thousands of hours of ROV video collected. Drawing bounding boxes around corals is a crucial step in training an AI to label and identify coral species in future videos, and we're asking for your help to achieve this goal. Through Click-a-Coral, we're asking for assistance in building a training data set for an AI specific to this task. This AI can be a valuable tool for researchers and conservationists working on projects like the Mesophotic and Deep Benthic Communities (MDBC) Restoration Project, helping to analyze Big (Ocean) Data faster than any human ever could!
If you're interested, here's how the process works:
The first step is to collect a diverse dataset of images and videos showcasing various species of coral in the mesophotic and deep benthic communities. These images can come from underwater surveys conducted by remotely operated vehicles (ROVs).
For each image, volunteers and experts manually draw bounding boxes around individual coral species. These bounding boxes precisely outline the coral in the image, effectively "highlighting" the coral within the frame. These annotations help the AI understand the location and boundaries of different coral species in the images.
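To make this concrete, here is a minimal sketch of how a single drawn bounding box might be stored. The field names (a COCO-style `bbox` of `[x, y, width, height]` in pixels) and the example values are assumptions for illustration; Click-a-Coral's actual export format may differ.

```python
# Sketch of one volunteer-drawn bounding-box annotation (COCO-style
# [x, y, width, height] in pixels is an assumed convention here).

def make_annotation(image_id, species, x, y, width, height):
    """Package one drawn box with its species label."""
    return {
        "image_id": image_id,
        "category": species,
        "bbox": [x, y, width, height],  # pixels, origin at top-left
    }

def to_normalized_center(bbox, img_w, img_h):
    """Convert [x, y, w, h] in pixels to normalized [cx, cy, w, h] in 0..1,
    the center-based form many detection models expect."""
    x, y, w, h = bbox
    return [(x + w / 2) / img_w, (y + h / 2) / img_h, w / img_w, h / img_h]

# A hypothetical frame from an ROV dive, annotated with one coral:
ann = make_annotation("dive42_frame001.jpg", "stony coral", 100, 50, 200, 100)
print(to_normalized_center(ann["bbox"], img_w=1920, img_h=1080))
```

Normalizing coordinates this way makes the annotation independent of the video resolution, which is useful when footage from different ROV cameras is mixed in one dataset.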
The annotated images are then divided into training, validation, and test sets. The training set is used to teach the AI system to recognize and distinguish between different coral species based on the bounding boxes and associated metadata. The validation set is used to fine-tune the model, while the test set assesses the AI's performance.
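The split itself is simple to sketch. The 70/15/15 ratios below are illustrative defaults, not the project's actual choice; a fixed random seed keeps the split reproducible.

```python
# Minimal sketch of a train/validation/test split over annotated images.
import random

def split_dataset(items, train_frac=0.7, val_frac=0.15, seed=0):
    """Shuffle annotated items and partition them into three disjoint sets."""
    items = list(items)
    random.Random(seed).shuffle(items)  # fixed seed -> reproducible split
    n_train = int(len(items) * train_frac)
    n_val = int(len(items) * val_frac)
    train = items[:n_train]
    val = items[n_train:n_train + n_val]
    test = items[n_train + n_val:]
    return train, val, test

train, val, test = split_dataset(range(100))
print(len(train), len(val), len(test))  # 70 15 15
```

Shuffling before splitting matters: frames from the same ROV dive look alike, and a purely sequential split could leave whole species out of the training set.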
With the annotated data, machine learning models, such as convolutional neural networks (CNNs), can be trained to identify and classify coral species based on the bounding boxes and other metadata. The AI learns to recognize the visual features of different species, such as the shape, color, and patterns of the corals.
Once the AI model is trained and validated, it can be deployed to analyze new videos or images. The AI automatically detects coral species and draws bounding boxes around them, and ROV metadata can supply useful context about each detection (e.g., its location and size).
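One standard way to judge whether an AI-predicted box matches a reference box is intersection over union (IoU), the overlap score used throughout object detection. This is a generic sketch, not the project's specific evaluation code; boxes here are `[x1, y1, x2, y2]` corner coordinates.

```python
# Intersection over union (IoU) of two axis-aligned boxes [x1, y1, x2, y2].

def iou(a, b):
    """Return overlap area divided by combined area, in 0..1."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection corners
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou([0, 0, 10, 10], [5, 5, 15, 15]))  # 25 / 175 ~= 0.143
```

A detection is typically counted as correct when its IoU with a reference box exceeds some threshold (0.5 is a common convention).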
The AI dramatically speeds up the labeling and identification of coral species in future videos! Instead of manually reviewing hundreds of hours of footage, researchers can quickly review the AI-generated annotations, correct any errors, and focus on more critical tasks like conservation and data analysis.
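That review step can be sketched as a simple confidence triage: high-confidence detections are accepted, low-confidence ones are queued for a researcher to check. The 0.8 threshold and the detection fields below are assumptions for illustration, not the project's actual workflow.

```python
# Sketch of triaging AI detections by confidence score for human review.

def triage(detections, threshold=0.8):
    """Split detections into auto-accepted and needs-human-review lists."""
    accepted = [d for d in detections if d["score"] >= threshold]
    needs_review = [d for d in detections if d["score"] < threshold]
    return accepted, needs_review

# Hypothetical AI output for one frame:
dets = [
    {"species": "sea fan", "score": 0.95},
    {"species": "sea fan", "score": 0.55},
]
accepted, review = triage(dets)
print(len(accepted), len(review))  # 1 1
```

Corrections made during review can be fed back into the training set, so the model improves with each pass.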
See below how annotations can be used to train our Coral Awareness and Recognition Locator (CARL) model!
By automating the labeling and identification process using bounding boxes and AI, the Mesophotic and Deep Benthic Communities (MDBC) Restoration Project, and similar initiatives, can better manage and protect these fragile ecosystems, making the restoration efforts more efficient and data-driven. This not only contributes to the conservation of deep-sea habitats but also helps advance our understanding of these ecosystems for future generations.