#### DISCLAIMER ####
This is very alpha.
## Introduction
This repository contains all sources for the NvidiaSpheroLynxmotionIntel MiniBot.
- [ ] Startup Scripts
- [ ] Test complete setup
## Bringup
To view the robot in rviz:
```roslaunch minibot display.launch```
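If the ROS environment has not been sourced yet, a typical sequence would look like this (a sketch assuming this repository was cloned to `~/minibot`; adjust the path to your checkout):
```
# set up the ROS and workspace environment, then launch the visualization
source /opt/ros/noetic/setup.bash
source ~/minibot/catkin_ws/devel/setup.bash
roslaunch minibot display.launch
```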
## Notes
### Connecting
To connect to the robots:
- Plug in the WiFi router and connect to it using SSID: `EDUROBOTS` PASS: `robotsarefun`
- Power on the Nvidia Jetson and SSH into a robot (you can find the IP address on the robots); see the example below
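For example, assuming a user account called `minibot` on the Jetson (the actual username may differ on your image):
```
# <robot-ip> is the address printed on the robot
ssh minibot@<robot-ip>
```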
## The Sphero RVR Base:
<img src="docs/sphero_rvr_urdf2.jpg" align="right" width="210" height="140">
<img src="docs/sphero_rvr_urdf.jpg" align="right" width="210" height="140">
- [x] This repository is built upon https://github.com/strean/rvr_ros and the free Sphero RVR C++ library from https://bitbucket.org/rmerriam/rvr-cpp.
- [x] Basic control from Nvidia Jetson (directly via ``/dev/ttyTHS1``)
- [x] Basic ROS control (no one has written a real ROS controller yet)
- [x] Python version of the basic ROS controller (`rosrun sphero_rvr_hw rvr-ros.py`)
- [x] C++ version of the basic ROS controller (`roslaunch sphero_rvr_hw rvr_ros.launch`)
- [x] C++ ROS controller (built upon the URDF; velocity control); see the velocity example below this list
- [ ] Still an issue with the https://bitbucket.org/rmerriam/rvr-cpp library: sensors_s.locator() for odometry does not stream data.
- [ ] There is no way to get the motor current or motor velocity from the Sphero RVR, so the read part of a classical ROS controller does not work.
- [x] Build URDF Model (v0.1)
- [ ] Extend rvr_ros to publish all sensor data
- [ ] Write ros service calls to set e.g. led colors
- [x] ROS Noetic does not provide ros-noetic-serial, so the serial library is built from https://github.com/wjwwood/serial (note: this is not the rosserial library); see the build sketch below this list
- [ ] At a later stage we might have a look at the ROS2 implementation from lomori: https://github.com/lomori/spherorvr-ros2
**Note:** To build rvr-cpp you need at least gcc version 9.
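One way to bring in the serial library is to clone it into the catkin workspace and rebuild; this is a sketch assuming the repository checkout at `~/minibot` (paths and build tool are assumptions, not the verified setup):
```
# add wjwwood/serial as a regular catkin package and rebuild the workspace
cd ~/minibot/catkin_ws/src
git clone https://github.com/wjwwood/serial.git
cd ~/minibot/catkin_ws
catkin_make
```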
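To smoke-test the velocity controller from the command line, something like the following should work, assuming the controller listens on a `/cmd_vel` topic of type `geometry_msgs/Twist` (the topic name is an assumption and may differ in this setup):
```
# publish a slow forward velocity at 10 Hz; Ctrl-C stops the command stream
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.1}, angular: {z: 0.0}}'
```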
## LSS Lynxmotion 4 DoF Arm
<img src="docs/arm_urdf.jpg" align="right" width="210" height="170">
<img src="docs/arm_urdf2.jpg" align="right" width="210" height="170">
- [x] Calibration with calibration tool
- [x] Driver: currently the official Python drivers
- [x] Basic control from Nvidia Jetson (directly via a USB serial converter, ``/dev/ttyUSB0``); see the sketch below this list
- [ ] Write ROS Controller
- [x] Create STL Files for URDF
- [x] Build URDF Model (v0.1)
- [ ] Write service calls for e.g. LED color and settings for the LSS servos
- [ ] Maybe switch to Rust: https://github.com/dmweis/lss_driver using Rust-based ROS control (for the moment this seems too experimental)
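Before the driver can talk to the arm, the USB serial device must be visible and accessible; a quick check on the Jetson (the `dialout` group assumes a standard Ubuntu setup):
```
# verify the USB serial converter is present and grant serial access without sudo
ls -l /dev/ttyUSB0
sudo usermod -aG dialout $USER   # takes effect after logging out and back in
```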
## Nvidia Jetson:
- [x] Build a Base Jetson Nano Ubuntu 20.04 image. This is based on https://github.com/pythops/jetson-nano-image/
- [x] ROS Noetic from the Ubuntu Repos
- [x] Nvidia, Intel Drivers from Manufacturer and/or Ubuntu Repos
- [x] Clone this repo for the Hardware Drivers and ROS Packages (Sphero RVR, Lynxmotion Arm, Intel Realsense); see the sketch after this list
- [ ] Include Vision Part of LeoBot (https://git.uibk.ac.at/RoboCupTeamTyrolics/robocupatwork)
- [ ] Test YOLOX-Nano https://arxiv.org/abs/2107.08430 ("0.91M parameters and 1.08G FLOPs, we get 25.3% AP on COCO")
**Note:** This image is "minimal" and has no GUI. You need to connect via SSH, or at a later stage maybe via a web browser.
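A possible way to set up the workspace on a freshly flashed Jetson, sketched with a placeholder repository URL and the `~/minibot` checkout location assumed in the examples above:
```
# clone the repository and build the catkin workspace it contains
git clone <this-repository-url> ~/minibot   # placeholder, use the actual repository address
cd ~/minibot/catkin_ws
catkin_make
source devel/setup.bash
```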
### Usage
- To view the robot in rviz: ```roslaunch minibot display.launch```
- For more information on the Sphero RVR base see the [Readme](/catkin_ws/src/sphero_rvr/README.md) in the ROS package.
- For more information on the LSS 4DOF Arm see the [Readme](/catkin_ws/src/lss_4dof/README.md) in the ROS package.
- For more information on the Nvidia Jetson Image see the [Readme](/jetson-nano-buildenv/README.md) in the buildenv folder. **Note:** This Nvidia Jetson image contains no GUI. You need to connect via SSH, or at a later stage maybe via a web browser.