In this project, different systems for direct user control of quadcopters are examined.
### 2.1. Visual stream:

* **Goal:**

Implementation of visual streaming so that the user can see what the quadcopter camera is filming. Moreover, the user should be able to change the viewing direction with head movements. To realize this, either a normal camera can be used in combination with a gimbal that is controlled by the head movements, or an omnidirectional camera is used and the viewed part of the video follows the user's head movements.
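The omnidirectional variant can be sketched as a simple mapping from head orientation to a crop of an equirectangular frame. This is a minimal illustration, not the project's implementation; the function name, frame size, and crop size are assumptions.

```python
# Hypothetical sketch: pick the viewed part of an omnidirectional
# (equirectangular) video frame from the user's head orientation.
# Frame and view sizes below are assumed values, not project specs.

def head_to_crop(yaw_deg, pitch_deg, frame_w=3840, frame_h=1920,
                 view_w=960, view_h=540):
    """Map head yaw/pitch (degrees) to the top-left corner of the
    sub-image shown to the user.

    An equirectangular frame spans 360 deg horizontally and 180 deg
    vertically, so pixel offsets are proportional to the angles.
    """
    # Horizontal: yaw wraps around the full 360-degree panorama.
    cx = ((yaw_deg % 360.0) / 360.0) * frame_w
    # Vertical: pitch 90 deg = top of frame, -90 deg = bottom.
    cy = ((90.0 - pitch_deg) / 180.0) * frame_h
    # Center the crop on (cx, cy); wrap horizontally, clamp vertically.
    x = int(cx - view_w / 2) % frame_w
    y = min(max(int(cy - view_h / 2), 0), frame_h - view_h)
    return x, y
```

With a normal camera on a gimbal, the same yaw/pitch values would instead be sent directly as gimbal angle setpoints rather than used to crop the frame.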
### 2.2. Gesture-based control:

* **Goal:**

Control the quadcopter with gestures so that no conventional remote control is needed. For this task, several interfaces are tested first.

After the testing, the best-performing devices are combined. For that, an algorithm is developed which fuses the output data of the interfaces and filters the data to remove outliers, yielding robust and stable gesture capturing.
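The fusion-and-filtering step described above could, for a single control axis, look like the following sketch. It is only an illustration under assumed names and thresholds: each interface reports a normalized command value, readings far from the median are discarded as outliers, and the remaining values are averaged.

```python
import statistics

# Hypothetical sketch of the fusion step: combine readings from
# several gesture interfaces for one control axis and reject
# outliers before averaging. The threshold is an assumed value.

def fuse_readings(readings, max_dev=0.3):
    """Fuse per-interface readings (e.g. normalized roll commands).

    A reading is treated as an outlier and dropped when it deviates
    from the median of all readings by more than `max_dev`; the
    remaining inliers are averaged to get a stable control output.
    """
    med = statistics.median(readings)
    inliers = [r for r in readings if abs(r - med) <= max_dev]
    return sum(inliers) / len(inliers)
```

For example, `fuse_readings([0.1, 0.12, 0.95, 0.11])` discards the 0.95 spike and returns the mean of the remaining readings. In practice the same idea would be applied per axis and per time step, possibly with an additional moving filter over time.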
* **Possible Interfaces:**

  * MYO bracelet: