Concrete crack identification using a UAV incorporating hybrid image processing

Hyunjun Kim, Junhwa Lee, Eunjong Ahn, Soojin Cho, Myoungsu Shin, Sung Han Sim

Research output: Contribution to journal › Article › peer-review

201 Scopus citations

Abstract

Crack assessment is an essential process in the maintenance of concrete structures. In general, concrete cracks are inspected by manual visual observation of the surface, which is intrinsically subjective as it depends on the experience of the inspectors. Further, it is time-consuming, expensive, and often unsafe when inaccessible structural members must be assessed. Unmanned aerial vehicle (UAV) technologies combined with digital image processing have recently been applied to crack assessment to overcome the drawbacks of manual visual inspection. However, the identification of crack information in terms of width and length has not been fully explored in UAV-based applications, owing to the absence of distance measurement and tailored image processing. This paper presents a crack identification strategy that combines hybrid image processing with UAV technology. Equipped with a camera, an ultrasonic displacement sensor, and a WiFi module, the system provides images of cracks and the associated working distance from a target structure on demand. The obtained information is subsequently processed by hybrid image binarization to estimate the crack width accurately while minimizing the loss of crack length information. The proposed system has been shown to successfully measure cracks wider than 0.1 mm with a maximum length estimation error of 7.3%.
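The pipeline sketched in the abstract (binarize the image, measure the crack in pixels, then scale by the working distance) can be illustrated with a minimal example. This sketch uses a single global Otsu threshold on a synthetic image; it is an assumption for illustration, not the paper's hybrid binarization, and the pixel-to-millimeter scale factor is a hypothetical value standing in for the calibration derived from the ultrasonic distance measurement.

```python
import numpy as np

def otsu_threshold(img):
    """Global Otsu threshold: maximize between-class variance (illustrative stand-in
    for the paper's hybrid binarization)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    probs = hist / img.size
    omega = np.cumsum(probs)                      # class-0 probability up to each level
    mu = np.cumsum(probs * np.arange(256))        # cumulative mean up to each level
    mu_t = mu[-1]                                 # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

# Synthetic grayscale patch: bright concrete surface with a 3-pixel-wide dark crack.
img = np.full((64, 64), 200, dtype=np.uint8)
img[:, 30:33] = 50

t = otsu_threshold(img)
crack = img <= t                                  # dark pixels classified as crack

# For a roughly vertical crack, the per-row count of crack pixels approximates width.
width_px = int(crack.sum(axis=1).max())

# Hypothetical calibration: mm per pixel, which in the paper's system would follow
# from the ultrasonic working distance and the camera's intrinsic parameters.
MM_PER_PX = 0.05
width_mm = width_px * MM_PER_PX
print(width_px, round(width_mm, 2))
```

A global threshold like this tends to either break thin cracks or overestimate their width under uneven lighting, which is the motivation the abstract gives for a hybrid binarization that preserves length while keeping the width estimate accurate.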

Original language: English
Article number: 2052
Journal: Sensors
Volume: 17
Issue number: 9
DOIs
State: Published - 7 Sep 2017

Keywords

  • Concrete structure
  • Crack identification
  • Digital image processing
  • Structural health monitoring
  • Unmanned aerial vehicle
