International Journal of Machine Learning and Cybernetics
https://doi.org/10.1007/s13042-024-02438-3
ORIGINAL ARTICLE
BiCrack: a bilateral network for real-time crack detection
Sailei Wang1,2 · Rongsheng Lu1,2 · Bingtao Hu1,2 · Dahang Wan1,2 · Mingtao Fang1,2
Received: 24 November 2023 / Accepted: 17 October 2024
© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024
Abstract
Crack detection is an important task for ensuring structural safety. Traditional manual inspection is extremely time-consuming
and labor-intensive, while existing deep learning-based methods commonly suffer from low inference speed and discontinuities
in the detected cracks. To address these problems, a novel bilateral crack detection network (BiCrack) is proposed
for real-time crack detection. Specifically, the network fuses two feature branches to achieve the best trade-off
between accuracy and speed. A detail branch built from shallow convolutional layers is first designed; it preserves crack detail
to the greatest extent and generates high-resolution features. Meanwhile, a semantic branch with a fast-downsampling strategy
is used to capture sufficient high-level semantic information. A simple pyramid pooling module (SPPM) is then proposed to
aggregate multi-scale context information at low computational cost. In addition, to enhance feature representation, an
attention-based feature fusion module (FFM) is introduced, which uses spatial and channel attention to generate weights and
then fuses the input features according to those weights. To demonstrate the effectiveness of the proposed method, it was evaluated on five
challenging datasets and compared with state-of-the-art crack detection methods. Extensive experiments show that BiCrack
achieves the best performance on the crack detection task among the compared methods.
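The paper does not spell out the fusion arithmetic at this point, but the bilateral idea described above (a high-resolution detail branch combined with a semantic branch via attention-generated weights) can be illustrated with a minimal sketch. The weight computation below is a hypothetical stand-in for the learned spatial/channel attention of the FFM, not the authors' exact formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse_branches(detail, semantic):
    """Fuse a high-resolution detail feature map with a semantic feature
    map (assumed already upsampled to the same resolution) using a
    per-pixel attention weight. Shapes: (C, H, W).

    The weight here is derived from the mean activation of both branches,
    a simple stand-in for learned spatial attention."""
    # per-pixel weight in (0, 1), shape (1, H, W), broadcast over channels
    w = sigmoid(detail.mean(axis=0, keepdims=True)
                + semantic.mean(axis=0, keepdims=True))
    # convex combination: larger w favours the detail branch
    return w * detail + (1.0 - w) * semantic

rng = np.random.default_rng(0)
detail = rng.standard_normal((8, 16, 16))
semantic = rng.standard_normal((8, 16, 16))
fused = fuse_branches(detail, semantic)
print(fused.shape)  # (8, 16, 16)
```

Because the fusion is a convex combination, every fused value lies between the two branch values at that location, so neither branch's response is ever amplified beyond its input range.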
Keywords: Crack detection · Bilateral network · Semantic segmentation · Real-time · Deep learning
1 Introduction
Cracks are an early form of damage in buildings, roads, bridges and
other man-made infrastructure, and they pose a serious
threat to structural safety [1–3]. If cracks are not
found and repaired in time, they may cause immeasurable
losses. Manual inspection is a very common method of crack
detection. However, owing to the subjectivity of human judgment,
manual inspection is not only extremely time-consuming
but can also be significantly inaccurate. To address
these problems, researchers have conducted in-depth research
on fast and accurate automatic crack detection algorithms.
Automatic crack detection methods based on image processing
have attracted much attention due to their low cost and high
accuracy. Most of these methods rely on threshold segmentation [4, 5],
edge detection [6–8], wavelet transform [9] and related
techniques. Compared with manual inspection, automatic crack
detection significantly improves the efficiency and accuracy
of crack detection and is unaffected by subjective judgment.
However, because of the uneven distribution of cracks, the low contrast
between cracks and the surrounding background, background
noise and the influence of shadows, traditional
automatic crack detection methods cannot adapt well to complex
environments, which greatly limits their applicable scenarios
and practical effectiveness.
In recent years, deep learning technology has made
great progress in the field of computer vision, which pro-
vides a new method to solve the traditional automatic crack
* Rongsheng Lu
rslu@hfut.edu.cn
Sailei Wang
saileiwang@mail.hfut.edu.cn
Bingtao Hu
hubingtao@mail.hfut.edu.cn
Dahang Wan
wandahang@mail.hfut.edu.cn
Mingtao Fang
mingtaofang@mail.hfut.edu.cn
1 School of Instrument Science and Opto-electronic
Engineering, Hefei University of Technology, Hefei 230009,
China
2 Anhui Province Key Laboratory of Measuring Theory
and Precision Instrument, Hefei University of Technology,
Hefei 230009, China