Gesture recognition is a key task in human–computer interaction (HCI). To address the low accuracy and poor real-time performance common in the recognition process, this paper designs an HCI system based on gesture recognition. An Ultraleap 3Di is used to collect a dynamic gesture dataset for the defined interaction gestures, and the device's high tracking precision ensures reliable data collection. Using non-contact gesture interaction as the medium of human–computer collaboration, the paper constructs a framework that combines the strengths of convolutional neural networks (CNNs) and long short-term memory networks (LSTMs): a CNN extracts spatial features from each input frame, and the resulting feature sequences are fed into an LSTM to model the temporal information, which proves highly effective for classifying and recognising the defined dynamic gestures. Finally, an HCI system based on gesture recognition is designed: the UR5 robotic arm is modelled on the Unity3D platform, and the cyclic coordinate descent (CCD) algorithm is applied to solve its inverse kinematics, successfully realising semantic gesture control of the UR5 arm. Experiments verify that the CNN–LSTM network ensures the real-time performance of the whole system, and that the Ultraleap 3Di-based gesture interaction system is effective and reliable.
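The abstract does not give the network's layer sizes or training details, so the following is only a minimal NumPy sketch of the described data flow, CNN feature extraction per frame followed by an LSTM over the feature sequence, with made-up dimensions (8 frames of 16×16 data, a single 3×3 kernel, a 32-unit LSTM, 5 gesture classes) and random untrained weights; it illustrates the pipeline's shape handling only, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D convolution: the per-frame CNN feature extraction stage."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate pre-activations stacked as [input, forget, cell, output]."""
    n = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])
    f = sigmoid(z[n:2 * n])
    g = np.tanh(z[2 * n:3 * n])
    o = sigmoid(z[3 * n:])
    c = f * c + i * g          # update cell state
    h = o * np.tanh(c)         # emit new hidden state
    return h, c

# Hypothetical sizes (not from the paper): 8 frames, 16x16 input, 32 hidden units, 5 classes.
T, H, W_img, n_hidden, n_classes = 8, 16, 16, 32, 5
frames = rng.standard_normal((T, H, W_img))
kernel = rng.standard_normal((3, 3)) * 0.1

feat_dim = (H - 2) * (W_img - 2)  # flattened 'valid' conv output per frame
W = rng.standard_normal((4 * n_hidden, feat_dim)) * 0.05
U = rng.standard_normal((4 * n_hidden, n_hidden)) * 0.05
b = np.zeros(4 * n_hidden)
W_out = rng.standard_normal((n_classes, n_hidden)) * 0.1

h = np.zeros(n_hidden)
c = np.zeros(n_hidden)
for t in range(T):
    feat = conv2d_valid(frames[t], kernel).ravel()  # CNN stage: spatial features
    h, c = lstm_step(feat, h, c, W, U, b)           # LSTM stage: temporal modelling

# Classify the gesture from the final hidden state via a softmax
logits = W_out @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()
pred = int(np.argmax(probs))
```

In a real system the conv stack would be deeper, trained end-to-end with the LSTM, and the per-frame input would come from the Ultraleap 3Di's hand-tracking data rather than raw random arrays.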
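The CCD inverse-kinematics step can be sketched as follows. The paper applies CCD to a UR5 model in Unity3D (a 3D arm with joint limits); this is a deliberately simplified 2D planar version, with a hypothetical 3-link arm, that shows only the core idea: sweep the joints from tip to base, rotating each joint so the end effector swings toward the target, and repeat until convergence.

```python
import math

def fk(angles, lengths):
    """Forward kinematics of a planar chain: base, each joint, and the end effector."""
    pts = [(0.0, 0.0)]
    a = 0.0
    for th, l in zip(angles, lengths):
        a += th  # angles are relative to the previous link
        x, y = pts[-1]
        pts.append((x + l * math.cos(a), y + l * math.sin(a)))
    return pts

def ccd_ik(angles, lengths, target, iters=200, tol=1e-5):
    """Cyclic coordinate descent: optimise one joint angle at a time."""
    angles = list(angles)
    tx, ty = target
    for _ in range(iters):
        for j in reversed(range(len(angles))):  # sweep from tip to base
            pts = fk(angles, lengths)
            jx, jy = pts[j]
            ex, ey = pts[-1]
            # rotate joint j by the angle between (joint->effector) and (joint->target)
            angles[j] += math.atan2(ty - jy, tx - jx) - math.atan2(ey - jy, ex - jx)
        ex, ey = fk(angles, lengths)[-1]
        if math.hypot(ex - tx, ey - ty) < tol:
            break  # end effector is close enough to the target
    return angles

# Hypothetical 3-link arm (unit link lengths) reaching a point inside its workspace
sol = ccd_ik([0.1, 0.1, 0.1], [1.0, 1.0, 1.0], (1.5, 1.0))
end = fk(sol, [1.0, 1.0, 1.0])[-1]
```

CCD is popular for interactive use because each sweep is cheap (no Jacobian) and it converges quickly for reachable targets, which suits the real-time requirement the paper emphasises; a production UR5 solver would additionally enforce joint limits and work in 3D.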