* Goal:
The goal is to control the quadcopter with gestures so that no conventional remote control is needed. To achieve this, several interfaces are evaluated first.
* Possible Interfaces:
** MYO bracelet:
One possible interface for capturing gestures is the MYO bracelet from Thalmic Labs, which detects hand and arm gestures using electromyographic sensors. Another interesting interface is the Leap Motion controller (Fig. 2), which tracks hand and finger motion without physical contact; the hands only need to be within about 1 meter of the device. The Microsoft Kinect (Fig. 3), a depth camera that tracks whole-body gestures, can also be used. The last interface presented here is the Perception Neuron, a motion-capture system that tracks whole-body gestures using inertial measurement units. In addition, the HMD may be used to control the quadcopter.
After testing, the best-performing interfaces will be selected and combined. This requires developing a custom algorithm that fuses the output data of the interfaces and filters it to remove outliers, yielding robust and stable gesture capturing.
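The fusion step described above could be sketched roughly as follows. This is only a minimal illustration, not the project's actual algorithm: it assumes each interface emits a numeric sensor stream (smoothed with a sliding median filter to suppress outliers) plus a gesture label, and that labels from several interfaces are combined by majority vote. All function names and data shapes here are hypothetical.

```python
from collections import Counter
from statistics import median

def filter_outliers(samples, window=5):
    """Smooth a numeric sensor stream with a sliding median filter
    to suppress single-sample outliers (hypothetical helper)."""
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - window // 2)
        hi = min(len(samples), i + window // 2 + 1)
        smoothed.append(median(samples[lo:hi]))
    return smoothed

def fuse_gestures(per_interface_labels):
    """Combine the gesture labels reported by several interfaces
    by majority vote; interfaces that detected nothing report None."""
    votes = [label for label in per_interface_labels if label is not None]
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

# Example: three interfaces vote on the current gesture; the single
# disagreeing interface is outvoted by the other two.
print(fuse_gestures(["swipe_left", "swipe_left", "fist"]))  # swipe_left

# Example: a single spike in a sensor stream is removed by the median filter.
print(filter_outliers([1, 1, 100, 1, 1]))
```

A majority vote is only one possible fusion rule; weighting each interface by its measured reliability would be a natural refinement once the individual tests are done.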