Bridging the gap between the public’s knowledge and detection of marine non-indigenous species through developing automated image classification applications for marine species

Saved in:
Bibliographic Details
Published in: Frontiers in Marine Science (Jan 31, 2025)
Main Author: Zhou, Peng
Other Authors: He, Xue-Qing, Tu, Zhi-Yi, Sun, Dong, Wang, Chun-Sheng, Shen, Hong-Bin, Pan, Xiaoyong
Published:
Frontiers Research Foundation
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3162017732
003 UK-CbPIL
022 |a 2296-7745 
024 7 |a 10.3389/fmars.2025.1508851  |2 doi 
035 |a 3162017732 
045 0 |b d20250131 
100 1 |a Zhou, Peng 
245 1 |a Bridging the gap between the public’s knowledge and detection of marine non-indigenous species through developing automated image classification applications for marine species 
260 |b Frontiers Research Foundation  |c Jan 31, 2025 
513 |a Journal Article 
520 3 |a Biological invasions are impacting biodiversity, ecosystems, and socio-economies globally. Marine non-indigenous species (mNIS) can be introduced through human activities, such as maritime shipping and the careless discarding of aquarium species. Despite significant efforts to prevent the introduction of mNIS, new introductions continue to occur, involving fishes, crustaceans, ascidians, anthozoans, bryozoans, sponges, macroalgae, seagrasses and mangroves (Alidoost Salimi et al., 2021). Once mNIS establish in recipient regions, controlling and eradicating them becomes a challenging task. Early awareness of mNIS could enhance the effectiveness of early response, particularly during the introduction phase, which is critical to reducing future impacts. Therefore, it is imperative to develop reliable and cost-effective strategies for the early detection of mNIS before they establish themselves in new habitats and threaten local biodiversity.
The public plays important roles in marine conservation (Earp and Liconti, 2020), such as detecting and monitoring outbreaks of the starfish Acanthaster spp. (Dumas et al., 2020) and managing the invasive lionfish Pterois volitans (Clements et al., 2021). To monitor the presence of mNIS, actions have been taken to help the public become familiar with and effectively recognize these species, such as watch lists and guidebooks. However, because of the high biodiversity of marine species, identifying specimens accurately requires extensive domain knowledge and skills, which can be challenging even for experts. Furthermore, it is particularly difficult for the public to accurately identify newly introduced species that they are not familiar with, especially during the early stages of a biological invasion. Traditionally, recognizing unfamiliar marine organisms, which may be non-native, requires access to scientific expertise, such as training, consulting illustrated guides, or searching numerous webpages, all of which can be time-consuming. With the development of artificial intelligence (AI), the public can turn to online resources for image search and analysis; however, achieving accurate image classification of marine species still requires professional automated tools to ensure reliable identification. Therefore, with the aim of supporting people worldwide in achieving early awareness of mNIS, this opinion highlights the achievements in developing AI-based applications for automated image classification of marine species, addresses the existing gaps in mNIS detection, and suggests potential future solutions.
Image-based automated classification has long been an interdisciplinary field of research bridging marine biology and computer vision. In recent years, advances in artificial intelligence, particularly deep learning techniques, have significantly propelled the development of applications for the automated classification of marine species from images (Table 1); a minimal transfer-learning sketch of this general approach is given after this abstract. For instance, SuperFish, a smartphone application, has been tailored to identify 38 fish species commonly found in the waters of Mauritius, achieving an accuracy rate of 98% (Pudaruth et al., 2021). Similarly, WikiFish, another mobile application, focuses on identifying 89 fish species prevalent in the Mediterranean Sea, with an accuracy of 80% (Elbatsh et al., 2022). 
Addressing the pressing issue of fish fraud resulting from species mislabeling in the seafood industry and aquarium trade, Fishify, a mobile application, has been developed. Trained on a dataset of over 50,000 images across 44 classes (including a separate class for non-fish images), it achieves an accuracy rate of 95% (Dhore et al., 2024). Fishial.AI, a project leveraging AI for fish species identification, has launched a portal at https://portal.fishial.ai, where users can submit images for automated identification and receive prompt prediction results. The built-in AI detector not only identifies fish species but also draws segmentation polygons around each fish in the photo. Currently, the classification model can identify over 290 fish species by their scientific names (https://www.fishial.ai/solutions#imagecms), with accuracy rates ranging from 70% to 90% (https://docs.fishial.ai/home/features#fishial-recognition%E2%84%A2). Notably, Fishial.AI runs its model on a cloud server accessible via an API, while keeping its code open source to help other developers create AI-based applications. Moreover, to meet the growing demand for efficient recognition of marine fish species in the global oceans, FishAI, an automated web application, has been developed for hierarchical image classification at five taxonomic levels (Yang et al., 2024); a generic sketch of such multi-rank classification is given after this abstract. Using images of 808 marine fish species from the World Register of Marine Species (WoRMS), the application achieves an accuracy of 0.626 at the species level on test images. FishAI allows users to save and share search results by providing an email address. Beyond fish, deep learning approaches have been applied to developing models for image classification of marine invertebrates. Examples include iBivalves for bivalves (Maravillas et al., 2023), CoralNet for corals (Chen et al., 2021), and EchoAI for echinoderms (Zhou et al., 2023). Recently, an AI-empowered citizen science approach was introduced in biological surveys focused on marine ecological conservation at Cape Santiago on Taiwan Island. To assist citizen science participants during these surveys, a web-based application was developed to host a trained AI model designed to recognize more than 20 commonly observed species (Chen et al., 2024).
These AI-based applications have demonstrated immense power in assisting the public, including novices, divers, fishermen, and scientists, in identifying marine species. They also hold significant potential for detecting mNIS. For instance, in diving scenarios, divers may encounter unfamiliar organisms that are of great interest to them; these could potentially be mNIS. With the help of these applications, the public can save the time otherwise spent navigating numerous websites, atlases, and books and comparing images to identify unknown species. Easy-to-use applications for species identification are beneficial for early detection and awareness of mNIS.
Despite the developments mentioned above, significant gaps remain, particularly in the availability of ready-to-use automated image classification applications for marine species. Such applications are crucial for assisting the public in early awareness and monitoring of mNIS. First, the primary motivation behind the current applications is not the detection of mNIS. For instance, Fishify and WikiFish were tailored for preventing fish fraud and enhancing food safety. 
Users may not necessarily consider whether what they have observed is an mNIS. To engage the public, one possibility is for developers to incorporate questions alongside the identification results, such as "Is it an alien species?", which may stimulate users' curiosity. Furthermore, to help users compare the locations where they have sighted specimens, developers could add links to biogeographic information resources. For example, the Global Biodiversity Information Facility (https://www.gbif.org/) and the Ocean Biodiversity Information System (https://obis.org/) provide distribution maps displaying the locations of recorded occurrences; a minimal occurrence-lookup sketch is given after this abstract. These features may foster public engagement in detecting mNIS. By integrating biogeographical data, AI-empowered applications could conveniently assist the public in preventing the introduction and spread of mNIS.
Second, datasets and algorithms need attention. When dealing with a relatively small number of categories, applications such as SuperFish with 38 fish species (Pudaruth et al., 2021), WikiFish with 89 fish species (Elbatsh et al., 2022), and iBivalves with 12 bivalve species (Maravillas et al., 2023) have achieved high classification accuracies, ranging from 80% to 98%. However, the number of categories these applications can identify falls far short of the estimated number of species in the oceans. To enhance AI models' ability to accurately identify broader categories of marine species, a primary task is to establish a global, substantial dataset of labeled images for training AI architectures. For fish species, a comprehensive vision-language benchmark dataset named Wildfish++, encompassing 2,348 fish species with 103,034 images, was released in 2021 (Zhuang et al., 2021). More recently, another large-scale fish image dataset, FishNet, was created, containing 94,532 images from 17,357 aquatic species organized according to biological taxonomy (order, family, genus, and species) (Khan et al., 2023). As the number of species included in a dataset increases, accuracy, particularly in fine-grained classification, still needs to be improved; for instance, FishAI and models trained on Wildfish++ and FishNet achieve only 60-70% accuracy at the species level. Misidentification can lead to inaccurate and misleading assessments, potentially resulting in improper management decisions. To address the data gap, images from global-scale datasets with curated labels (such as WoRMS) could be used to train state-of-the-art AI architectures for automated tools, as WoRMS was established to provide an authoritative and comprehensive list of names of marine organisms and its content is curated by taxonomic experts. To enhance classification accuracy, it is recommended to increase the number of images in each category for model training and to employ state-of-the-art architectures, including those designed for few-shot learning, to construct the classification models.
Third, an imbalance exists in the development of automated image classification applications. While numerous applications have been constructed for the automated classification of fish images, significantly fewer have been developed for other categories of marine species. 
Many mNIS of considerable concern are invertebrates, algae, or plants (Alidoost Salimi et al., 2021), such as the invasive green crab Carcinus maenas (Jamieson et al., 1998), the biofouling bivalve Mytilopsis leucophaeata in Europe (Verween et al., 2006), the documented biofouling mNIS in the USA (Lord et al., 2015), the invasive biofouling cnidarian Pennaria disticha in the Mediterranean Sea (Bosch-Belmar et al., 2022) and Tubastraea spp. (sun coral) in the Southwest Atlantic (Coelho et al., 2022), and invasive echinoderm species (Ling et al., 2009; Lang et al., 2023; Ling and Keane, 2024). Encouragingly, ready-to-use applications for marine species other than fish are increasing, such as CoralNet for corals (Chen et al., 2021) and EchoAI for echinoderms (Zhou et al., 2023). However, there is still a lack of automated image classification applications for the underrepresented categories of mNIS, and more effort should be directed toward these groups to address the imbalance.
Last but not least, usability remains a crucial factor. Despite numerous academic publications reporting the advanced performance of AI models, such as those trained on datasets like Wildfish++ and FishNet, and the subsequent public release of their source code, individuals without deep learning or programming skills still face challenges in using these models. Users are often required to configure environments, retrain algorithms, or optimize hyperparameters, which can be daunting tasks; a minimal interface-wrapper sketch illustrating one way to lower this barrier is given after this abstract. Furthermore, these AI-based applications can also benefit professionals by enabling automated verification of records submitted by the public.
In conclusion, as more images of marine species in the global oceans are collected and utilized for training, AI-based tools for automated image classification can be continually improved. Despite the persistent gaps in public knowledge and in detecting mNIS, advancing toward ready-to-use automated applications would help bridge these gaps by assisting the public in recognizing marine species and would boost public participation in the detection and monitoring of mNIS. These robust tools could serve as cost-efficient solutions for biodiversity monitoring and conservation, for example by underpinning devices for real-time detection and capture of mNIS. 
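The transfer-learning sketch referenced above: applications of the kind surveyed in this opinion typically fine-tune a backbone network pretrained on a large generic image corpus using labeled photos of the target species. The minimal PyTorch sketch below illustrates only that general pattern; the folder layout, backbone choice, class count, and hyperparameters are placeholder assumptions, not the actual pipeline of SuperFish, WikiFish, or any other application cited here.

```python
# Minimal transfer-learning sketch for marine species image classification.
# Illustrative only: paths, backbone, and hyperparameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: train/<species_name>/<image>.jpg
train_set = datasets.ImageFolder("train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# ImageNet-pretrained backbone with a new classification head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a handful of fine-tuning epochs
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```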
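The multi-rank classification sketch referenced above: hierarchical outputs across taxonomic levels, as offered by FishAI, can in principle be produced by a shared feature backbone with one classification head per rank and a loss summed over ranks. The sketch below is a generic, hedged illustration of that idea rather than FishAI's published architecture; the rank names and sizes are placeholders (the species count loosely echoes the 808 species mentioned above).

```python
# Generic sketch of hierarchical (multi-rank) classification with a shared
# backbone and one head per taxonomic level. Rank sizes are placeholders and
# this is not the architecture of FishAI or any other cited application.
import torch
import torch.nn as nn
from torchvision import models

RANK_SIZES = {"class": 30, "order": 120, "family": 400, "genus": 900, "species": 808}

class HierarchicalClassifier(nn.Module):
    def __init__(self, rank_sizes):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()  # keep pooled features, drop the ImageNet head
        self.backbone = backbone
        self.heads = nn.ModuleDict(
            {rank: nn.Linear(feat_dim, n) for rank, n in rank_sizes.items()}
        )

    def forward(self, x):
        feats = self.backbone(x)
        return {rank: head(feats) for rank, head in self.heads.items()}

def hierarchical_loss(outputs, targets, criterion=nn.CrossEntropyLoss()):
    # Sum the per-rank cross-entropy losses; per-rank weighting is also possible.
    return sum(criterion(outputs[rank], targets[rank]) for rank in outputs)

model = HierarchicalClassifier(RANK_SIZES)
logits = model(torch.randn(2, 3, 224, 224))  # dummy batch of two images
```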
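The occurrence-lookup sketch referenced above: a classification application could cross-check its prediction against public biogeographic records, for example by asking GBIF's occurrence API how many records exist for the predicted species in the user's country; no prior record might prompt expert verification. The endpoint and parameters below follow GBIF's documented public v1 API, but the species name and country code are example values only, and this is just one possible way to wire in such a check.

```python
# Sketch of a biogeographic sanity check against GBIF's public occurrence API
# (https://api.gbif.org/v1/occurrence/search). Species name and country code
# below are example values only.
import requests

def occurrence_count(scientific_name: str, country_iso2: str) -> int:
    """Return the number of GBIF occurrence records for a species in a country."""
    resp = requests.get(
        "https://api.gbif.org/v1/occurrence/search",
        params={"scientificName": scientific_name, "country": country_iso2, "limit": 0},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["count"]

# Example: has the green crab been recorded in this country before?
n = occurrence_count("Carcinus maenas", "PT")
print(f"GBIF occurrence records: {n}")
if n == 0:
    print("No records here yet - the sighting may warrant expert verification.")
```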
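The interface-wrapper sketch referenced above: one low-effort way to turn a trained checkpoint into a ready-to-use tool, without asking end users to configure environments or retrain anything, is to wrap it in a small browser interface. The sketch below uses the Gradio library with a hypothetical checkpoint file and placeholder label list; it is a generic pattern, not the deployment of any application discussed in this article.

```python
# Sketch of wrapping a trained classifier in a browser-based interface with
# Gradio. "marine_classifier.pt" and the label list are hypothetical placeholders.
import gradio as gr
import torch
from torchvision import models, transforms

labels = ["Carcinus maenas", "Mytilopsis leucophaeata", "Pennaria disticha"]  # placeholders
model = models.resnet50(num_classes=len(labels))
model.load_state_dict(torch.load("marine_classifier.pt", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224), transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def classify(image):
    # Preprocess the uploaded PIL image and return per-species probabilities.
    x = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1)[0]
    return {labels[i]: float(probs[i]) for i in range(len(labels))}

gr.Interface(fn=classify,
             inputs=gr.Image(type="pil"),
             outputs=gr.Label(num_top_classes=3),
             title="Marine species classifier (demo sketch)").launch()
```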
653 |a Mollusks 
653 |a Fish 
653 |a Shellfish 
653 |a Datasets 
653 |a Marine fish 
653 |a Indigenous species 
653 |a Aquatic crustaceans 
653 |a Food safety 
653 |a Socioeconomic aspects 
653 |a Introduced species 
653 |a Fouling 
653 |a Biological invasions 
653 |a Biodiversity 
653 |a Marine ecosystems 
653 |a Training 
653 |a Image processing 
653 |a Taxonomy 
653 |a Oceans 
653 |a Mangroves 
653 |a Books 
653 |a Algae 
653 |a Biofouling 
653 |a Marine invertebrates 
653 |a Seafood 
653 |a Animal learning 
653 |a Seafoods 
653 |a Accuracy 
653 |a Atlases 
653 |a Sea grasses 
653 |a Classification 
653 |a Fraud 
653 |a Marine organisms 
653 |a Artificial intelligence 
653 |a Algorithms 
653 |a Aquariums 
653 |a Worms 
653 |a Public access 
653 |a Biological surveys 
653 |a Invertebrates 
653 |a Conservation 
653 |a Crustaceans 
653 |a Invasive species 
653 |a Corals 
653 |a Deep learning 
653 |a Aquarium fishes 
653 |a Biogeography 
653 |a Automation 
653 |a Native organisms 
653 |a Marine conservation 
653 |a Public participation 
653 |a Surveys 
653 |a Seagrasses 
653 |a New records 
653 |a Aquaria 
653 |a Marine biology 
653 |a Computer vision 
653 |a Carcinus maenas 
653 |a Bivalvia 
653 |a Environmental 
700 1 |a He, Xue-Qing 
700 1 |a Tu, Zhi-Yi 
700 1 |a Sun, Dong 
700 1 |a Wang, Chun-Sheng 
700 1 |a Shen, Hong-Bin 
700 1 |a Pan, Xiaoyong 
773 0 |t Frontiers in Marine Science  |g (Jan 31, 2025) 
786 0 |d ProQuest  |t Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3162017732/abstract/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3162017732/fulltextwithgraphics/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3162017732/fulltextPDF/embedded/H09TXR3UUZB2ISDL?source=fedsrch