Exploring the potential application of a custom deep learning model for camera trap analysis of local urban species

Somin Park, Mingyun Cho, Suryeon Kim, Jaeyeon Choi, Wonkyong Song, Wheemoon Kim, Youngkeun Song, Hyemin Park, Jonghyun Yoo, Seung Beom Seo, Chan Park

Research output: Contribution to journal › Article › peer-review

Abstract

With increasing demands for biodiversity monitoring, the integration of camera trapping (CT) and deep learning automation holds significant promise. However, few studies have addressed the application potential of this approach in urban areas of Asia. A total of 4064 CT images targeting 18 species of urban wildlife in South Korea were collected and used to fine-tune a pre-trained object detection model. To assess its applicability, the performance of the custom model was evaluated at three levels: animal filtering, mammal and bird classification, and species classification. A comparison with existing universal models was conducted to test the utility of the custom model. The custom model achieved approximately 94% accuracy in animal filtering and 85% in species classification, outperforming universal models in some respects. In addition, recommendations regarding CT installation distances and the acquisition of nighttime data are provided. These results have practical implications for terrestrial monitoring, particularly the analysis of local species: automating image filtering and species classification facilitates efficient analysis of large CT datasets and enables broader participation in wildlife monitoring.
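The abstract describes fine-tuning a pre-trained object detection model on annotated CT images. The authors' code is not reproduced here; the sketch below only illustrates this kind of transfer learning with torchvision's Faster R-CNN, where the class count follows the abstract's 18 target species and the data loader, hyperparameters, and training loop are illustrative assumptions rather than the paper's actual setup.

```python
# Minimal sketch (not the authors' implementation): fine-tune a COCO-pretrained
# detector for local camera-trap species by replacing its classification head.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 18  # background + 18 urban wildlife species (per the abstract)

# Load a detector pre-trained on COCO and swap in a new box predictor.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device).train()

optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad],
    lr=0.005, momentum=0.9, weight_decay=0.0005,
)

def train_one_epoch(model, loader):
    # `loader` is a hypothetical DataLoader yielding (images, targets) pairs,
    # where each target holds "boxes" and "labels" tensors for one CT image.
    for images, targets in loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)  # detection + classification losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In this kind of pipeline, detections below a confidence threshold are typically discarded to perform the animal-filtering step, while the predicted class labels drive the mammal/bird and species-level classifications.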

Original language: English
Journal: Landscape and Ecological Engineering
DOIs
State: Accepted/In press - 2024

Keywords

  • Biodiversity monitoring
  • Image-filtering automation
  • Transfer learning
  • Urban ecosystem
  • Urban wildlife
