Keywords: myCobot, Robot Control, Gesture Recognition
In this video, I’m controlling myCobot-Pi with hand gestures.
Joints 1~5 are selected with number signs, and the selected joint jogs forward or backward based on a thumbs-up or thumbs-down sign. A fist sign stops the robot's movement.
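Below is a minimal sketch of how such a gesture-to-jog mapping could look with the pymycobot library. The gesture label names, the serial port/baud settings, and the use of `jog_angle`/`jog_stop` are my assumptions (method names can differ between pymycobot versions), not the exact code from the video.

```python
from pymycobot.mycobot import MyCobot

# Assumed serial port and baud rate for a myCobot Pi; adjust to your setup.
mc = MyCobot("/dev/ttyAMA0", 1000000)

JOG_SPEED = 30          # jog speed as a percentage of max
NUMBER_SIGNS = ("one", "two", "three", "four", "five")
selected_joint = None   # set by a number sign (joint 1..5)

def handle_gesture(gesture):
    """Map a recognized gesture label (hypothetical names) to a jog command."""
    global selected_joint
    if gesture in NUMBER_SIGNS:
        # A number sign selects which joint to jog.
        selected_joint = NUMBER_SIGNS.index(gesture) + 1
    elif gesture == "thumbs_up" and selected_joint:
        mc.jog_angle(selected_joint, 1, JOG_SPEED)   # jog the joint forward
    elif gesture == "thumbs_down" and selected_joint:
        mc.jog_angle(selected_joint, 0, JOG_SPEED)   # jog the joint backward
    elif gesture == "fist":
        mc.jog_stop()                                # stop any ongoing jog motion
```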
Hand gesture recognition is done with a small neural network that processes the outputs of MediaPipe's hand landmarks model. You can find the code for training your own hand gestures at this link.
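The sketch below illustrates the general idea: MediaPipe extracts 21 hand landmarks per frame, the 63 flattened (x, y, z) values are fed to a small classifier, and the most probable gesture label is returned. The label set, the MLP architecture, and the (untrained) Keras model are assumptions for illustration; the actual training code is in the linked repository.

```python
import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

# Hypothetical gesture labels matching the video: number signs, thumbs, fist.
GESTURES = ["one", "two", "three", "four", "five", "thumbs_up", "thumbs_down", "fist"]

# Small MLP over the 21 hand landmarks flattened to 63 (x, y, z) features.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(63,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(GESTURES), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

def classify_frame(bgr_frame):
    """Return the predicted gesture name for one camera frame, or None if no hand."""
    result = hands.process(cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    landmarks = result.multi_hand_landmarks[0].landmark
    features = np.array([[p.x, p.y, p.z] for p in landmarks], dtype=np.float32).flatten()
    probs = model.predict(features[None, :], verbose=0)[0]
    return GESTURES[int(np.argmax(probs))]
```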
ROS would have been the preferred way, but it currently doesn't work on myCobot, so I used ZeroMQ, a lightweight messaging library, instead.
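A minimal sketch of that link with pyzmq is shown below, assuming the gesture recognizer runs on a separate machine and publishes gesture labels to the Pi over TCP. The IP address, port, and PUB/SUB pattern are placeholders and assumptions, not necessarily the setup used in the video.

```python
import zmq

PI_ENDPOINT = "tcp://192.168.1.50:5555"  # placeholder IP/port of the myCobot Pi

ctx = zmq.Context()

# --- Recognizer side: publish each recognized gesture label ---
pub = ctx.socket(zmq.PUB)
pub.connect(PI_ENDPOINT)
pub.send_string("thumbs_up")  # note: PUB/SUB may drop the very first message (slow joiner)

# --- Pi side (separate process on the robot): receive labels and drive the joints ---
sub = ctx.socket(zmq.SUB)
sub.bind("tcp://*:5555")
sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all messages
gesture = sub.recv_string()               # e.g. pass this to handle_gesture(gesture)
```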