Turtlebot3 3D-SLAM

2019-11-30 22:10:00 / Quick Start Guide & Review / Comments 0

Astra Camera Integration with Turtlebot3

We've developed a 3D-Camera Upgrade Kit for the popular Turtlebot3 research platform. The kit and the following manual are compatible with the Turtlebot3 Burger and Waffle Pi. The complete kit is available here: TB3 3D-Camera Upgrade Kit

Turtlebot setup

  • On the Turtlebot, the only step needed is to install usbip, since we don't want to process anything on the Turtlebot itself but on the remote PC.

sudo apt install usbip

  • We will make the Turtlebot the server and the remote PC the client.

  • List the connected devices along with their bus IDs (run the command without the leading !):

In [1]:
! usbip list -l
 - busid 3-1 (046d:c31c)
   Logitech, Inc. : Keyboard K120 (046d:c31c)

 - busid 3-6 (046d:c52a)
   Logitech, Inc. : unknown product (046d:c52a)

 - busid 3-7 (13d3:3394)
   IMC Networks : Bluetooth (13d3:3394)

  • Now let's set up the Turtlebot as the server for data streaming:

sudo modprobe usbip-host

sudo usbipd -D

  • Next, a link must be created from usbutils to hwdata so that the USB devices are displayed properly (first time only):

sudo mkdir /usr/share/hwdata/

sudo ln -sf /var/lib/usbutils/usb.ids /usr/share/hwdata/

  • As the last step, bind the USB port to the network:

sudo usbip bind -b "Bus-ID"

  • As an example, in my case:

sudo usbip bind -b 3-2.1


  • Everything is now set up on the robot, so let's configure the client side.

Remote PC setup

  • First, the ROS drivers for the Astra camera need to be installed on the remote PC, which can be done by following the procedure below:

sudo apt install ros-*-rgbd-launch ros-*-libuvc ros-*-libuvc-camera ros-*-libuvc-ros

1. Create a workspace if you don't already have one:

mkdir -p catkin_ws/src && cd catkin_ws/

catkin_make

2. Pull the Astra camera drivers for ROS from GitHub into the src folder of your workspace:

cd src

git clone https://github.com/orbbec/ros_astra_camera

3. Rename the package: the repository is called ros_astra_camera, but in its CMakeLists the package is astra_camera, so:

mv ros_astra_camera astra_camera

4. Create the Astra udev rules:

cd astra_camera

./scripts/create_udev_rules

5. Compile the astra_camera package:

cd ..

catkin_make --pkg astra_camera

  • Now plug in the camera and get info about it by running the following command:

rosrun astra_camera astra_list_devices

  • Note the vendor and product IDs for later usage.

Device #0: Uri: 2bc5/0608@3/6 (Vendor: Orbbec, Name: Astra, Vendor ID: 2bc5, Product ID: 608)
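If you want to pull those IDs out programmatically, here is a small sketch; the regex and variable names are my own, assuming only the listing format shown above:

```python
import re

# One line of `astra_list_devices` output, as shown above.
line = ("Device #0: Uri: 2bc5/0608@3/6 "
        "(Vendor: Orbbec, Name: Astra, Vendor ID: 2bc5, Product ID: 608)")

# Extract the hexadecimal vendor and product IDs from the parentheses.
m = re.search(r"Vendor ID:\s*([0-9a-fA-F]+),\s*Product ID:\s*([0-9a-fA-F]+)", line)
vendor_id, product_id = m.groups()
print(vendor_id, product_id)  # 2bc5 608
```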

  • The next step is to configure the network for communication between the Turtlebot3 and the remote PC. Here the usbip Linux package will be used, which can be downloaded from here or installed via the command line:

sudo apt-get install linux-tools-generic

  • After everything is set up on the robot, let's configure the PC to receive the sensor data streams from the robot.

  • Run the following command to make the remote PC the client:

sudo modprobe vhci-hcd

  • Check the list of available devices on the network:

usbip list -r "ServerIPAddress"

In [2]:
 ! usbip list -r 192.168.0.223
Exportable USB devices
======================
 - 192.168.0.223
    1-1.3.2: unknown vendor : unknown product (2bc5:0508)
           : /sys/devices/platform/soc/3f980000.usb/usb1/1-1/1-1.3/1-1.3.2
           : Miscellaneous Device / ? / Interface Association (ef/02/01)
           :  0 - Video / Video Control / unknown protocol (0e/01/00)
           :  1 - Video / Video Streaming / unknown protocol (0e/02/00)
           :  2 - Video / Video Streaming / unknown protocol (0e/02/00)

    1-1.3.1: unknown vendor : unknown product (2bc5:0608)
           : /sys/devices/platform/soc/3f980000.usb/usb1/1-1/1-1.3/1-1.3.1
           : (Defined at Interface level) (00/00/00)
           :  0 - Vendor Specific Class / unknown subclass / unknown protocol (ff/00/00)

  • Finally, attach the USB devices to the remote PC.

sudo usbip attach -r "ServerIPAddress" -b "Bus-ID"

  • As an example, in my case (attaching both Astra devices listed above):

sudo usbip attach -r 192.168.0.223 -b 1-1.3.1

sudo usbip attach -r 192.168.0.223 -b 1-1.3.2

  • Now let's check the data streams and the established connection.

  • Run the appropriate launch file from the astra_camera package. In my case I have the Astra Stereo S, so:

roslaunch astra_camera stereo_s.launch

  • From another terminal run:

rosrun rviz rviz

  • Add the RGB image topic via the Add button and you should see the camera images from the Turtlebot streamed over the network.

Monocular SLAM using Astra Camera on Turtlebot

As an example application, I will demonstrate here how we can perform monocular SLAM with our Turtlebot and Astra camera. Monocular SLAM has the big advantage that it works for large outdoor scenarios, small indoor scenarios, and variants of the two.

There are two types of visual SLAM:

  1. Feature based monocular slam

  2. Direct monocular slam

Feature-based SLAM algorithms use features, i.e. keypoints of the input image (corners, blobs, lines, etc.), to estimate the position of the camera mounted on the robot.

Direct SLAM algorithms, on the other hand, use the intensities of the image to estimate the position of the camera. They use more information and create a more detailed map of the environment, but at a higher computational cost.
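The difference between the two families can be boiled down to the cost function each one minimizes. A minimal sketch with made-up numbers (real systems minimize these errors jointly over many points and camera poses):

```python
import numpy as np

# Feature-based SLAM minimizes *reprojection error*: the pixel distance
# between a detected keypoint and the projection of its 3D landmark
# under the current camera pose estimate.
def reprojection_error(observed_px, projected_px):
    return float(np.linalg.norm(np.asarray(observed_px, dtype=float)
                                - np.asarray(projected_px, dtype=float)))

# Direct SLAM minimizes *photometric error*: intensity differences between
# corresponding pixels of two frames, using the raw image data itself.
def photometric_error(patch_ref, patch_cur):
    d = np.asarray(patch_ref, dtype=float) - np.asarray(patch_cur, dtype=float)
    return float(np.sum(d * d))

# Toy example: keypoint observed at (120, 85), landmark projects to (121, 83).
print(reprojection_error((120, 85), (121, 83)))  # sqrt(5) ≈ 2.236 px

# Toy example: two 3x3 intensity patches from consecutive frames.
ref = np.array([[10, 10, 10], [10, 200, 10], [10, 10, 10]])
cur = np.array([[10, 10, 10], [10, 190, 10], [10, 10, 10]])
print(photometric_error(ref, cur))  # 100.0
```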

Here I will show you how to set up ORB-SLAM2, a feature-based SLAM algorithm.

ORB SLAM2

  • ORB-SLAM2 is a real-time SLAM approach; it depends on the following packages:

    1. Pangolin

    2. OpenCV

    3. Eigen3

    4. DBoW2 and g2o (Already Included in Thirdparty folder)

  • Let's start installing these dependencies and ORB-SLAM2 in another workspace apart from catkin_ws (as these are non-ROS dependencies and packages).

    mkdir ~/slam && cd ~/slam

    • Pangolin

      • Installing a few dependencies of Pangolin first

        sudo apt install libgl1-mesa-dev libglew-dev cmake libpython2.7-dev ffmpeg libavcodec-dev libavutil-dev libavformat-dev libswscale-dev libavdevice-dev libuvc-dev

      • Now install Pangolin

        git clone https://github.com/stevenlovegrove/Pangolin.git

        cd Pangolin

        mkdir build && cd build

        cmake ..

        cmake --build .

    • For OpenCV installation follow the below procedure

      sudo apt-get install build-essential

      sudo apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev

      sudo apt-get install python-dev python-numpy libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libjasper-dev libdc1394-22-dev

      cd ~/slam

      git clone https://github.com/opencv/opencv.git

      git clone https://github.com/opencv/opencv_contrib.git

      cd opencv

      mkdir build

      cd build

      cmake -DOPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules ..

      make -j8

      sudo make install

    • Eigen3

      sudo apt install libeigen3-dev

  • Now lets install ORB SLAM2

cd ~/slam

git clone https://github.com/raulmur/ORB_SLAM2.git ORB_SLAM2

cd ORB_SLAM2

chmod +x build.sh

sudo ./build.sh

  • For ROS components do the following steps:

chmod +x build_ros.sh

sudo ./build_ros.sh

  • The last step to conclude the installation is to source the setup.bash of the ORB_SLAM2 package so that its ROS nodes can be run:

cd ORB_SLAM2

echo "source $PWD/Examples/ROS/ORB_SLAM2/build/devel/setup.bash" >> ~/.bashrc

  • Mono SLAM from ORB_SLAM2 can be run with the following command:

rosrun ORB_SLAM2 Mono path_to_vocabulary path_to_yaml

  • path_to_vocabulary is something specific to ORB_SLAM2 and is simply used as-is in the SLAM process.

  • path_to_yaml is a configuration file specific to the camera and the ORB_SLAM2 parameters. It includes the camera calibration parameters, projector baseline, image color channels and, for ORB_SLAM2, the number of features per image, the number of levels in the scale pyramid, etc.

  • The camera calibration parameters can be found on the /camera/rgb/camera_info rostopic after running the launch file for your specific camera, e.g. stereo_s.launch.

  • These parameters are approximations from the manufacturer; if you want to calibrate yourself, there are some good resources like here and here. An example calibration file can be seen under /ORB_SLAM2/Examples/ROS/ORB_SLAM2/Astra_Stereo_s.yaml

  • ORB_SLAM2 can be used as a library, and it also works out of the box.
  • By now everything is ready; the last thing, to save the point clouds to a specified directory, is to add the following lines of code to ros_mono.cc before the ROS node shuts down.
std::string Filename = "";
if (argc == 4) {
    Filename = std::string(argv[3]);
    std::string last = Filename.substr(Filename.length() - 1);
    if (last.compare("/") != 0) {
        Filename.push_back('/');
    }
}

// Save keyframe trajectory
std::string KeyFrameTrajectory = Filename + "KeyFrameTrajectory.txt";
SLAM.SaveKeyFrameTrajectoryTUM(KeyFrameTrajectory);

// Save pointcloud
std::string PointCloudFile = Filename + "pointcloud.pcd";
SLAM.CreatePCD(PointCloudFile);
  • Now just rebuild the package: roscd ORB_SLAM2 and execute ./build_ros.sh (do not forget to source if you haven't already).
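For orientation, the camera/ORB settings YAML described above generally looks like the sketch below. All numbers here are placeholders, not real Astra intrinsics; take fx, fy, cx, cy and the distortion coefficients from your own camera_info topic:

```yaml
%YAML:1.0

# Camera intrinsics (placeholder values -- use your camera_info data)
Camera.fx: 570.0
Camera.fy: 570.0
Camera.cx: 320.0
Camera.cy: 240.0

# Distortion coefficients
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Camera.fps: 30.0
# Color order of the images (0: BGR, 1: RGB)
Camera.RGB: 1

# ORB extractor parameters
ORBextractor.nFeatures: 1000
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
```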

Demo

  • As mentioned before, we need to give the path to the vocabulary file of ORB_SLAM2, which is in the Vocabulary folder of the package directory, so it is good to do a cd ORB_SLAM2 before running the node.

  • First run the Astra driver, and then run the following command:

rosrun ORB_SLAM2 Mono path_to_vocabulary_file path_to_yaml_file path_to_save_directory

