MICL-UNet: Multi-Input Cross-Layer UNet Model for Classification of Diseases in Agriculture

Abdulaziz Anorboev, Javokhir Musaev, Dosam Hwang, Yeong Seok Seo, Jeongkyu Hong

Research output: Contribution to journal › Article › peer-review



Agricultural diseases severely impact productivity and cause significant economic losses in the agricultural sector. Current monitoring practices are predominantly individualized, making them inefficient and difficult for farmers to apply at scale. There is therefore an urgent need for an effective solution that identifies and classifies these diseases quickly and accurately. To address this need, we introduce the multi-input cross-layer UNet model (MICL-UNet). Enhanced with numerous residual connections, our model minimizes information loss from the input. Because a nonlinear activation function acts at every layer, we strategically incorporated the input data at multiple stages of the model and reinforced it with terminal-layer residual connections, a design that distributes critical image features evenly across all layers. MICL-UNet outperformed baseline models, achieving accuracy scores of 99.63% on the Vegetable dataset, 99.25% on the PLD dataset, and 98.7% on the Guava Disease dataset. Beyond its academic contribution, MICL-UNet offers significant practical benefits: by improving disease detection accuracy, it promises to raise agricultural product quality and reduce production uncertainty. This approach has the potential to foster more sustainable and resilient agricultural practices, paving the way for robust economic growth in the sector.
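The abstract describes three architectural ideas: re-injecting the (downsampled) input at multiple encoder stages, cross-layer skip connections between encoder and decoder, and a terminal residual connection back to the input. The toy sketch below illustrates that data flow on 1-D signals in pure Python; all function names and the affine "conv block" are hypothetical stand-ins, not the paper's actual implementation.

```python
def downsample(x, factor):
    """Average-pool a 1-D signal by `factor` (toy stand-in for 2x2 pooling)."""
    return [sum(x[i:i + factor]) / factor
            for i in range(0, len(x) - factor + 1, factor)]

def upsample(x, factor):
    """Nearest-neighbor upsampling (toy stand-in for transposed convolution)."""
    return [v for v in x for _ in range(factor)]

def conv(x, w=0.5, b=0.1):
    """Toy 'conv block': elementwise affine map followed by a ReLU."""
    return [max(0.0, w * v + b) for v in x]

def micl_forward(x, stages=2):
    """Hypothetical MICL-UNet-style forward pass on a 1-D signal.

    Shows the abstract's three ideas: multi-scale input injection,
    cross-layer skip connections, and a terminal residual connection.
    """
    skips = []
    feat = conv(x)
    inp = x
    # Encoder: at each scale, re-inject the downsampled raw input.
    for _ in range(stages):
        skips.append(feat)
        feat = downsample(feat, 2)
        inp = downsample(inp, 2)
        feat = conv([f + i for f, i in zip(feat, inp)])  # multi-input injection
    # Decoder: fuse encoder features via cross-layer skip connections.
    for skip in reversed(skips):
        feat = upsample(feat, 2)
        feat = conv([f + s for f, s in zip(feat, skip)])
    # Terminal residual connection back to the original input.
    return [f + v for f, v in zip(feat, x)]
```

With two stages the input length must be divisible by 4; the output then has the same length as the input, since every downsampling step is mirrored by an upsampling step, which is what lets the terminal residual connection add the raw input elementwise.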

Original language: English
Pages (from-to): 117685-117697
Number of pages: 13
Journal: IEEE Access
State: Published - 2023


Keywords:
  • Agricultural disease
  • cross-layer skip connection
  • image classification
  • multi-input


