Abstract:
To help elderly and disabled users who have difficulty grasping objects, this study designs a machine-vision-based bionic hand that performs grasping actions automatically by detecting changes in the user's lip posture. The mechanical structure of the bionic hand was designed and fabricated based on the structural characteristics and proportional dimensions of the human hand. A facial expression recognition algorithm was developed in Visual Studio using the Dlib model; it extracts key lip landmark points, computes the lip aspect ratio, and classifies the lip state from variations in this ratio. The control system, centered on an STM32 microcontroller, exchanges data between the vision and control modules via serial communication. Experiments demonstrate that the bionic hand can rapidly and accurately recognize facial expressions and execute stable grasping actions, laying a foundation for future bionic robotic arm development.
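The lip-state detection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes Dlib's standard 68-point face model, in which the inner-lip contour spans landmark indices 60-67, and it defines the lip aspect ratio as the mean vertical inner-lip gap divided by the mouth width. The threshold value is an illustrative assumption, not a figure from the paper.

```python
from math import dist

# Inner-lip landmark indices in Dlib's 68-point face model:
# 60 (left corner), 64 (right corner), 61/62/63 (upper), 67/66/65 (lower).
INNER_LIP = {"left": 60, "right": 64, "top": (61, 62, 63), "bottom": (67, 66, 65)}

def lip_aspect_ratio(pts):
    """Mean vertical inner-lip gap over mouth width.

    `pts` maps a Dlib landmark index to an (x, y) pixel coordinate.
    The ratio rises as the mouth opens, so thresholding it separates
    an open mouth from a closed one.
    """
    vertical = [dist(pts[t], pts[b])
                for t, b in zip(INNER_LIP["top"], INNER_LIP["bottom"])]
    horizontal = dist(pts[INNER_LIP["left"]], pts[INNER_LIP["right"]])
    return sum(vertical) / (len(vertical) * horizontal)

def lip_state(ratio, threshold=0.35):
    # Threshold is a placeholder; in practice it would be tuned per user.
    return "open" if ratio > threshold else "closed"
```

In a full pipeline, `pts` would come from `dlib.shape_predictor` applied to each camera frame, and the resulting open/closed state would be sent to the STM32 controller over the serial link to trigger or release the grasp.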