MIT Teaching Drones to Read Hand Gestures

Presently, the U.S. military is attempting to bring combat drones onto aircraft carriers. One of the major obstacles is how the unmanned fliers will interact with carrier personnel on flight decks. Yale Song, a Ph.D. student at the Massachusetts Institute of Technology, is working to solve this problem, as well as to improve human-robot interaction in general. Song started by programming a computer to obey hand signals.

To recognize a hand gesture, a computer must look at the positioning of the human body to discern where the hand is, and it must also figure out when the gesture begins and when it ends. Because aircraft carrier deck crews are in constant motion, Song has devised a way for computers to decode hand gestures on the fly. His system continuously calculates the probability that a sequence of body poses corresponds to each known gesture, then decides which command is most likely being conveyed.
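Song's actual system relies on a learned probabilistic sequence model, but the basic idea of scoring a continuous pose stream against a gesture library and deciding "on the fly" can be sketched roughly in Python. The gesture names, window length, and simple squared-error scoring below are illustrative assumptions, not the MIT implementation.

```python
import numpy as np

# Hypothetical sketch: score a continuous stream of body-pose frames against
# a small library of known gestures and report the most probable command.
GESTURES = ["brakes_on", "brakes_off", "move_ahead", "turn_left"]  # assumed labels
WINDOW = 30  # number of frames considered at once (assumed)

def gesture_score(window, template):
    """Score how well a window of pose vectors matches a stored gesture template.

    A simple negative squared-error score stands in here for the learned
    probabilistic model the real system uses.
    """
    diff = window - template
    return -0.5 * np.sum(diff ** 2)

def classify_stream(pose_frames, templates):
    """Slide a window over the incoming pose stream and, at each step, report
    which gesture is most likely right now, without waiting for an explicit
    start or end marker."""
    decisions = []
    for t in range(len(pose_frames) - WINDOW + 1):
        window = pose_frames[t:t + WINDOW]
        scores = {g: gesture_score(window, templates[g]) for g in GESTURES}
        decisions.append(max(scores, key=scores.get))
    return decisions

# Usage with random stand-in data (each frame is a flattened body-pose vector):
rng = np.random.default_rng(0)
templates = {g: rng.normal(size=(WINDOW, 12)) for g in GESTURES}
stream = rng.normal(size=(200, 12))
print(classify_stream(stream, templates)[:5])
```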

With a library of 24 gestures made by 20 different people, the system correctly identifies the command being given 76% of the time. It struggles with quick or erratic gestures and cannot process "slang" gestures at all: a made-up gesture that might seem obvious to a human would make no sense to a drone, especially a drone infected with a virus. And while 76% is a respectable score, it is not good enough when human lives and very expensive military equipment are at stake.

MIT had this to say on the matter:

Part of the difficulty in training the classification algorithm is that it has to consider so many possibilities for every pose it’s presented with: For every arm position there are four possible hand positions, and for every hand position there are six possible arm positions. In ongoing work, the researchers are retooling the algorithm so that it considers arm position and hand position separately, which drastically cuts down on the computational complexity of its task. As a consequence, it should learn to identify gestures from the training data much more efficiently.
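A rough way to picture the saving MIT describes: a single classifier over every arm-and-hand combination has to tell apart arm × hand labels, while separate arm and hand classifiers only deal with arm + hand labels. The counts below come from the quote above; the joint-versus-factored framing is an assumption for illustration.

```python
# Illustrative arithmetic only; counts taken from the quoted passage.
arm_positions = 6
hand_positions = 4

joint_labels = arm_positions * hand_positions      # one classifier over every combination
factored_labels = arm_positions + hand_positions   # arm and hand classified separately

print(joint_labels)     # 24 combined poses to tell apart
print(factored_labels)  # 10 labels total when arm and hand are handled separately
```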

Song also says he is working on a gesture-recognition feedback method: if a drone did not understand what was being conveyed, it could nod or shake its camera. That all seems very tricky for a robot plane coming in at roughly 150 mph to land on a 500-foot flight deck.
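One way that feedback could work is a simple confidence threshold on top of the classifier's output, acknowledging a command when it is confident and signaling confusion when it is not. The threshold value and the nod/shake signal names below are assumptions for illustration, not Song's described method.

```python
# Hypothetical sketch: accept a command only when the classifier is confident,
# otherwise signal "did not understand" (e.g. by shaking the camera).
def respond_to_gesture(scores, threshold=0.8):
    """scores: dict mapping gesture name -> probability (summing to 1)."""
    best_gesture = max(scores, key=scores.get)
    if scores[best_gesture] >= threshold:
        return ("nod", best_gesture)   # acknowledge and carry out the command
    return ("shake", None)             # ask the deck crew to repeat the gesture

print(respond_to_gesture({"brakes_on": 0.92, "move_ahead": 0.08}))
print(respond_to_gesture({"brakes_on": 0.55, "move_ahead": 0.45}))
```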
