TY - GEN
T1 - Privacy-preserving British sign language recognition using deep learning
AU - Hameed, Hira
AU - Usman, Muhammad
AU - Khan, Muhammad Zakir
AU - Hussain, Amir
AU - Abbas, Hasan
AU - Imran, Muhammad Ali
AU - Abbasi, Qammer H.
PY - 2022/9/8
Y1 - 2022/9/8
N2 - Sign language is a means of communication between the deaf community and hearing people; it uses hand gestures, facial expressions, and body language to communicate. It is as complex as spoken language, but it does not follow the same sentence structure as English. The motions in sign language comprise a range of distinct hand and finger articulations that are occasionally synchronized with the head, face, and body. Existing sign language recognition systems are mainly camera-based and suffer from fundamental limitations: poor performance in low lighting, training challenges with longer video sequences, and serious privacy concerns. This study presents a first-of-its-kind, contactless, and privacy-preserving British Sign Language (BSL) recognition system using radar and deep learning algorithms. Six common emotions are considered in this proof-of-concept study: confused, depressed, happy, hate, lonely, and sad. The collected data are represented as spectrograms. Three state-of-the-art deep learning models, namely InceptionV3, VGG19, and VGG16, then extract spatiotemporal features from the spectrograms. Finally, BSL emotions are identified by classifying the spectrograms into the considered emotion signs. Comparative simulation results demonstrate that a maximum classification accuracy of 93.33% across all classes is obtained using the VGG16 model.
KW - RF sensing
KW - micro-Doppler signatures
KW - British sign language
KW - deep learning
U2 - 10.1109/EMBC48229.2022.9871491
DO - 10.1109/EMBC48229.2022.9871491
M3 - Conference contribution
SN - 9781728127835
SP - 4316
EP - 4319
BT - 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
PB - IEEE
ER -