Segmentation of surface cracks based on a fully convolutional neural network and gated scale pooling

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Continual use, as well as aging, causes cracks to develop on concrete surfaces. These cracks are early indications of surface degradation. Regular inspection of surfaces is therefore an important step in preventive maintenance, allowing timely corrective measures before cracks impair the integrity of a structure. Since these inspections are usually carried out manually by trained inspectors, automating parts of the inspection process offers the potential for improved performance and more efficient resource usage. In this work we propose a fully convolutional, U-Net based neural network architecture to automatically segment cracks. Conventional pooling operations in convolutional neural networks are static operations that reduce the spatial size of an input, which may lead to loss of information as features are discarded. We introduce and incorporate a novel pooling function into our architecture, Gated Scale Pooling. This operation aims to retain features from multiple scales and to adapt to the feature map being pooled. Training and testing of our network architecture is conducted on three different public surface crack datasets. We show that employing Gated Scale Pooling instead of max pooling achieves superior results. Furthermore, our experiments indicate strongly competitive results when compared with other crack segmentation techniques.
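The abstract describes Gated Scale Pooling only at a high level: a pooling operation that, unlike static max pooling, adapts to the feature map being pooled. The paper's exact formulation is not reproduced here; the sketch below is an illustrative, simplified version of the idea, where each 2x2 window blends its max and average responses through an input-dependent sigmoid gate. The gate parameters `w_gate` and `b_gate` are stand-ins for what would be learned weights in the actual network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_scale_pool(x, w_gate=0.5, b_gate=0.0):
    """Illustrative gated 2x2 pooling on a 2-D feature map.

    Blends max pooling (retains strong crack activations) with average
    pooling (retains context) via a gate computed from each window, so
    the mixture adapts to the input rather than being static.
    NOTE: w_gate and b_gate stand in for learned gate parameters; this
    is a sketch of the concept, not the paper's exact operation.
    """
    h, w = x.shape
    out = np.empty((h // 2, w // 2))
    for i in range(0, h - 1, 2):
        for j in range(0, w - 1, 2):
            window = x[i:i + 2, j:j + 2]
            g = sigmoid(w_gate * window.mean() + b_gate)  # input-dependent gate
            # Gated blend: g -> 1 behaves like max pooling, g -> 0 like average pooling.
            out[i // 2, j // 2] = g * window.max() + (1.0 - g) * window.mean()
    return out
```

Because the gate lies in (0, 1), each pooled value falls between the window's average and its maximum, so no single static choice (max or average) is imposed across the whole map.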
Original language: English
Title of host publication: 2019 27th European Signal Processing Conference (EUSIPCO)
ISBN (Print): 9789082797039
Publication status: Published - 18 Nov 2019


  • crack segmentation
  • deep learning
  • CNN
  • pooling


