# Intuitive Human-Computer Interface for Quadcopter Control

by **Stefan Spiss** (stefan.spiss@student.uibk.ac.at)

Supervisors:

* Prof. Dr. Matthias Harders (matthias.harders@uibk.ac.at)

## 1. Description:

In this project, different systems for direct user control of quadcopters are explored. Furthermore, a control interface is developed that combines the display of the aircraft's video stream on a head-mounted display (HMD) with gesture-based flight control using gesture-capturing devices (e.g. Leap Motion, Kinect, Myo) and tactile feedback to indicate obstacles in the surroundings of the quadcopter.

## 2. Introduction:

### 2.1. Visual stream:

* **Goal:**

Implementation of visual streaming so that the user can see what the quadcopter camera is filming. Moreover, the user should be able to change the viewing direction with head movements. To realize this, a normal camera can be combined with a gimbal that is controlled by the head movements, or an omnidirectional camera can be used, with the viewed part of the video following the user's head movements.

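For the omnidirectional option, the mapping from head pose to the displayed image region can be sketched as follows. This is a minimal illustration assuming an equirectangular panorama; the helper name and conventions (yaw 0 = frame center, pitch 0 = horizon) are assumptions, not the project's actual code:

```python
def head_pose_to_viewport(yaw_deg, pitch_deg, frame_w, frame_h, fov_deg=90.0):
    """Map head yaw/pitch (degrees) to a crop window (x, y, w, h)
    inside an equirectangular panorama of size frame_w x frame_h."""
    # Horizontally, 360 degrees span the full frame width.
    view_w = int(frame_w * fov_deg / 360.0)
    # Vertically, 180 degrees span the full frame height.
    view_h = int(frame_h * fov_deg / 180.0)
    # The crop center follows the head yaw, wrapping around at 360 degrees.
    cx = int((yaw_deg % 360.0) / 360.0 * frame_w)
    # Pitch 0 looks at the horizon (vertical frame center); clamp so the
    # crop stays inside the panorama.
    cy = int((90.0 - pitch_deg) / 180.0 * frame_h)
    cy = max(view_h // 2, min(frame_h - view_h // 2, cy))
    x = (cx - view_w // 2) % frame_w  # may wrap past the right edge
    y = cy - view_h // 2
    return x, y, view_w, view_h
```

Each frame, the HMD's orientation sensor would supply yaw and pitch, and only the returned window of the panorama is rendered, so no gimbal motion is needed.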
* **Goal:**

Develop a system for tactile feedback for obstacle avoidance. This part is important because, when using the HMD, there can be obstacles that are not visible in the current viewing direction. Tactile feedback makes it possible to be warned of obstacles whether they are visible in the HMD or not. For this purpose, a dedicated interface will be developed using vibration motors.

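One way to drive such an interface is to map each detected obstacle's bearing and distance to the intensity of the nearest motor in a ring worn by the user. The following is a sketch under assumed conventions (bearing 0 = straight ahead, linear distance-to-intensity ramp), not the project's final design:

```python
def vibration_pattern(obstacles, n_motors=8, max_range=3.0):
    """Map obstacle (bearing_deg, distance_m) pairs to per-motor
    intensities in [0.0, 1.0] for a ring of n_motors vibration motors.

    Bearing 0 is straight ahead; motors are spaced evenly clockwise.
    Closer obstacles vibrate more strongly; obstacles beyond
    max_range are ignored.
    """
    intensities = [0.0] * n_motors
    sector = 360.0 / n_motors
    for bearing_deg, distance_m in obstacles:
        if distance_m >= max_range:
            continue  # far enough away, no warning needed
        motor = int((bearing_deg % 360.0) / sector) % n_motors
        # Linear ramp: 1.0 at contact, 0.0 at max_range.
        strength = 1.0 - distance_m / max_range
        intensities[motor] = max(intensities[motor], strength)
    return intensities
```

The resulting list could then be written out, for example as PWM duty cycles, to whatever driver controls the individual vibration motors.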
## 3. Tasks:

### 3.1. Visual stream:

**Transmission of video stream:**

A Raspberry Pi and a camera are mounted on the platform. The Raspberry Pi transmits a video stream to a remote machine, which then processes the data.
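One simple way to move the frames over TCP is to length-prefix each encoded frame so the receiver can split the byte stream back into frames. This is an illustrative assumption; the transport actually used in the project may differ (e.g. GStreamer/RTP):

```python
import struct

HEADER = struct.Struct(">I")  # 4-byte big-endian frame length

def pack_frame(jpeg_bytes):
    """Prefix one encoded frame with its length for sending over TCP."""
    return HEADER.pack(len(jpeg_bytes)) + jpeg_bytes

def unpack_frames(buffer):
    """Split a receive buffer into complete frames.

    Returns (frames, remainder); the remainder holds bytes of a frame
    that has not fully arrived yet and should be kept for the next call.
    """
    frames = []
    while len(buffer) >= HEADER.size:
        (length,) = HEADER.unpack_from(buffer)
        if len(buffer) < HEADER.size + length:
            break  # wait for the rest of this frame
        frames.append(buffer[HEADER.size:HEADER.size + length])
        buffer = buffer[HEADER.size + length:]
    return frames, buffer
```

On the Raspberry Pi, each camera frame would be JPEG-encoded and sent through `pack_frame`; the remote machine feeds every received chunk into `unpack_frames` and processes the complete frames.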
### 3.2. Gesture-based control:

TBA

### 3.3. Tactile feedback:

TBA

## 4. Platform:

This Bachelor thesis is carried out using the quadcopter provided by the MCI Management Center Innsbruck.

A general wiki page about the used quadcopter can be found [here](http://mci-quadrocopterws-2014-automation-project.wikia.com/wiki/MCI_Quadrocopter%28WS_2014_Automation_Project%29_Wiki).

## 5. Change Log:

TBA

## 6. Literature:

* Teixeira et al. 2014 - Teleoperation Using Google Glass and AR.Drone for Structural Inspection, 2014 XVI Symposium on Virtual and Augmented Reality (SVR), 28-36

* Naseer et al. 2013 - FollowMe: Person Following and Gesture Recognition with a Quadrocopter, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 624-630

* Ochi et al. 2015 - Live Streaming System for Omnidirectional Video, IEEE Virtual Reality Conference 2015, 349-350