|
|
# Visual SLAM for Terrain Reconstruction and Autonomous Flight
|
|
|
# Intuitive Human-Computer Interface for Quadcopter Control
|
|
|
by Stefan Spiss (stefan.spiss@student.uibk.ac.at)
|
|
|
|
|
|
Author:
|
|
|
* Adrian Marxer (adrian.marxer@student.uibk.ac.at)
|
|
|
|
|
|
Supervised by:
|
|
|
* Justus Piater (justus.piater@uibk.ac.at)
|
|
|
Supervisors:
|
|
|
* Prof. Dr. Matthias Harders (matthias.harders@uibk.ac.at)
|
|
|
* Simon Haller (simon.haller@uibk.ac.at)
|
|
|
|
|
|
License: TBA
|
|
|
|
|
|
Bug tracker: TBA
|
|
|
|
|
|
Source: TBA
|
|
|
|
|
|
## 1. Overview
|
|
|
|
|
|
This project is a feasibility study of visual SLAM for terrain reconstruction on an unmanned aerial vehicle. The goal is to use an existing SLAM algorithm to build a 3D map of an unknown environment from a quadcopter in remote-controlled or autonomous flight. Ideally the algorithm will run on the vehicle itself; if that proves impractical, a remote machine will be used.
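To illustrate the map-building core of such a system, the following minimal sketch triangulates a single terrain point from two known camera views using linear (DLT) triangulation, one of the basic geometric building blocks of visual SLAM. All camera parameters and the point itself are made-up example values, not taken from this project.

```python
import numpy as np

# Pinhole intrinsics (illustrative values).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# Two camera poses: identity, and a 1 m baseline along x.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([2.0, 1.0, 10.0, 1.0])  # homogeneous terrain point

def project(P, X):
    """Project a homogeneous 3D point into pixel coordinates."""
    x = P @ X
    return x[:2] / x[2]

x1, x2 = project(P1, X_true), project(P2, X_true)

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: stack the constraints x × (P X) = 0
    from both views and solve the homogeneous system by SVD."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X / X[3]

X_est = triangulate(P1, P2, x1, x2)
print(np.allclose(X_est[:3], X_true[:3], atol=1e-6))
```

A real SLAM pipeline repeats this over thousands of matched feature points while jointly estimating the camera poses; this sketch only shows the reconstruction step under known poses and noiseless matches.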
|
|
|
|
|
|
## 2. Installation
|
|
|
TBA
|
|
|
## 1. Description:
|
|
|
In this project, different systems for direct user control of quadcopters are explored. Furthermore, a control interface is developed that combines display of the aircraft's video stream on a head-mounted display with gesture-based flight control using gesture-capturing devices (e.g. Leap Motion, Kinect, Myo) and tactile feedback that indicates obstacles in the quadcopter's surroundings.
|
|
|
|
|
|
## 3. Usage
|
|
|
TBA
|
|
|
## 2. Task:
|
|
|
### 2.1. Visual stream:
|
|
|
* Goal:
|
|
|
Implement video streaming so that the user can see what the quadcopter's camera is filming. Moreover, the user should be able to change the viewing direction with head movements. This can be realized either with a normal camera mounted on a gimbal that is controlled by the head movements, or with an omnidirectional camera whose displayed section of the video follows the user's head movements.
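The two options above can be sketched as simple mappings from head orientation to either a gimbal setpoint or a crop position. The angle limits, frame size, and function names below are assumptions for illustration, not a real gimbal or HMD API.

```python
def head_to_gimbal(yaw_deg, pitch_deg, yaw_limit=90.0, pitch_limit=45.0):
    """Option 1: clamp the head orientation to the gimbal's mechanical
    range so the servos are never commanded past their limits."""
    clamp = lambda v, lim: max(-lim, min(lim, v))
    return clamp(yaw_deg, yaw_limit), clamp(pitch_deg, pitch_limit)

def view_offset(yaw_deg, frame_width=3840):
    """Option 2: pick the horizontal crop position in an equirectangular
    (omnidirectional) frame that corresponds to the head's yaw angle."""
    return int((yaw_deg % 360.0) / 360.0 * frame_width)

print(head_to_gimbal(120.0, -10.0))  # yaw clamped to the gimbal limit
print(view_offset(90.0))
```

With an omnidirectional camera there is no mechanical latency, since changing the view is just a crop; the gimbal option, in contrast, keeps the full camera resolution in the viewing direction.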
|
|
|
### 2.2. Gesture-based control:
|
|
|
* Goal:
|
|
|
Control the quadcopter with gestures so that no conventional remote control is needed. For this task, several interfaces are tested first.
|
|
|
* Possible Interfaces:
|
|
|
  * MYO bracelet: one possible interface for capturing gestures is the MYO bracelet from Thalmic Labs, which detects hand and arm gestures using electromyographic sensors.

  * Leap Motion controller (Fig. 2): tracks hand and finger motion without physical contact; the hands only have to stay within about 1 meter of the device.

  * Microsoft Kinect (Fig. 3): a depth camera with which whole-body gestures can be tracked.

  * Perception Neuron: a motion-capture system based on inertial measurement units that makes whole-body gesture tracking possible.

Furthermore, the HMD itself may also be used to control the quadcopter.
|
|
|
After testing, the best-performing interfaces will be selected and combined. This requires developing a custom algorithm that fuses the output data of the interfaces and filters it to remove outliers, yielding robust and stable gesture capture.
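One common building block for the filtering step described above is a sliding median filter, which rejects single-sample tracking glitches without smearing the signal the way a mean would. The window size and the example stream are illustrative assumptions.

```python
from statistics import median

def median_filter(samples, window=5):
    """Sliding median over a 1-D stream; robust to isolated outliers
    such as a momentary tracking glitch from one of the interfaces."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(median(samples[lo:i + 1]))
    return out

stream = [0.0, 0.1, 0.1, 9.9, 0.2, 0.2, 0.3]  # 9.9 is a tracking glitch
print(median_filter(stream))
```

The glitch value never appears in the output, while the slow trend of the stream is preserved; a real fusion step would run such a filter per axis and per interface before combining the streams.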
|
|
|
### 2.3. Tactile feedback:
|
|
|
|
|
|
## 4. Platform
|
|
|
TBA
|
|
|
|
|
|
## 5. Visual SLAM Algorithm
|
|
|
TBA
|
|
|
The third task is to develop a tactile-feedback system for obstacle avoidance, building on the work of Lukas Haidacher. This part is important because, when wearing the HMD, obstacles outside the current viewing direction are not visible. Tactile feedback makes it possible to warn the user of obstacles whether or not they appear in the HMD. For this, a custom interface will be developed using vibration motors. An example of such a system is the tactile feedback belt by Edwards et al. (2009) shown in Fig. 4, which also uses vibration motors. Another option is to use the Perception Neuron for this purpose as well.
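A minimal sketch of such a belt-style interface maps an obstacle's bearing and distance to per-motor vibration intensities: the motor facing the obstacle vibrates strongest, scaled up as the obstacle gets closer. The motor count, range, and falloff curve below are assumptions, not parameters of the actual hardware.

```python
def motor_intensities(bearing_deg, distance_m, n_motors=8, max_range_m=5.0):
    """Return one intensity in [0, 1] per vibration motor arranged in a
    ring; closer obstacles produce stronger vibration."""
    closeness = max(0.0, 1.0 - distance_m / max_range_m)
    out = []
    for i in range(n_motors):
        motor_angle = i * 360.0 / n_motors
        # shortest angular distance between motor direction and bearing
        diff = abs((bearing_deg - motor_angle + 180.0) % 360.0 - 180.0)
        # linear falloff over one motor spacing
        spread = max(0.0, 1.0 - diff / (360.0 / n_motors))
        out.append(round(closeness * spread, 3))
    return out

# Obstacle to the right (90°) at 2.5 m: only the right-facing motor fires.
print(motor_intensities(bearing_deg=90.0, distance_m=2.5))
```

Because the mapping depends only on bearing relative to the body, the warning works regardless of where the user is currently looking in the HMD, which is exactly the property motivating this task.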
|
|
|
|
|
|
## 6. Autonomous exploration
|
|
|
TBA
|
|
|
### 6.1. Problems:
|
|
|
### 6.2. Goals:
|
|
|
|
|
|
## 7. Change log
|
|
|
TBA
|
|
|
|
## 8. References:
|
|